Digital Native: #MemeToo

Posted by Nishant Shah on Sep 09, 2018, 02:00 PM
An old meme shows the need for emotional literacy in our digitally saturated age.

The article was published in the Indian Express on September 9, 2018.


Memes, like regrettable exes, have a habit of resurfacing at regular intervals. This week saw the return of the “Qajar Princess” meme across social media and institutional news outlets alike. For those late to the viral party, Princess Qajar first appeared towards the end of 2017, when the world was riding high on pop-feminist assertions and the revelations of the #MeToo movement: a photograph of a person in a gown, with long dark hair, thick eyebrows and a moustache, posing for a portrait. The caption identified this person as Princess Qajar, a “symbol of beauty in Persia” (now Iran), and claimed that “13 young men killed themselves” because she rejected their advances.

Everything about the meme was clickbait-worthy, from its defiance of conventional standards of femininity to the possibility of a woman scripting her own narrative of beauty and empowerment. It fed perfectly into our narratives of female emancipation.

There was only one problem with this meme: it was completely made up. Its claims were quickly debunked. Excellent websites like Abitofhistory and many investigators on Reddit showed that everything about the meme was a fabrication. While it seemed to respond to the political zeitgeist, celebrate women’s bodies and desire, and offer a non-Western narrative of beauty, it was all just #FakeNews. The meme had more or less died its timely death by the time 2018 rolled in, but, surprisingly, it has returned on Instagram and Facebook feeds, where admiration and ridicule are expressed in equal parts at the expense of the person in the image.

The meme does not have any immediately harmful actions associated with it, though it carries both the Orientalist prejudice of framing the Persian region as “freaky” and the misogynist framing of a woman’s body as something available for shameless analysis and commentary. This obvious piece of disinformation lays bare the volatile nature of news and information circulation in the age of information overload. In late August, I was in Jakarta, sitting with 30 news media professionals, information activists, and policy actors from Asia, discussing the surfeit of such disinformation and our apparent incapacity to engage with it.

As we went through the workshops and talks curated by the Digital Asia Hub, one thing became increasingly clear: people do not have a rational relationship with information. Historically, the regulation of news media has focused on creating rational, evidence-based narratives, so that information consumers can be trained to develop a rational relationship with the information that reaches them. However, as patterns of information production and consumption change, with the proliferation of new sources and new forms of authorship, these old regulations are collapsing. We have tried very hard, even in artistic platforms like cinema, to distinguish between factual information and emotional information.

Especially in countries like India, where such disinformation has resulted in vigilante justice and lynch-mob violence, the question of how we manage the emotional tenor of our information consumption is critical. Information management giants like Facebook and its messaging service WhatsApp have come under severe scrutiny because they have become platforms of unfettered disinformation. With newly literate digital users engaging with this information on sites that are social rather than informational, the viral triggers and emotional responses have been quick and uncontrolled. The tech companies have started introducing a variety of solutions: limiting the number of people a message can be forwarded to, establishing filters that mark messages as possibly suspicious, restricting group broadcasting powers to moderators, and introducing forward marks to signal authorship.

These technical solutions only go so far in tackling the fundamental question of emotional information. They fall back on the management of factual information: they can provide a series of safeguards that insert a pause between the first delivery and immediate action, but this presumes that the person receiving and sharing the information is interested in that pause. What we need, and have not paid enough attention to, is how to train people to develop an emotional literacy for the age of information overload. While technology development has to continue its work of filtering and managing, what we perhaps need is a people’s movement that focuses on how to give voice to and recognise the emotional expression and manipulation that these new information regimes are ushering in.


Author

Nishant Shah

Dr. Nishant Shah is a co-founder and board member of the Centre for Internet and Society in Bangalore, India, a professor at the Institute of Culture and Aesthetics of Digital Media at Leuphana University, Germany, and Dean of Research at ArtEZ Graduate School, the Netherlands.