Digital Native: Hardly Friends Like That

Posted by Nishant Shah at Sep 30, 2018 09:00 PM
Individual effort is far from enough to fool Facebook’s grouping algorithm.

The article was published in the Indian Express on September 30, 2018


Lately, my Facebook timeline is flooded with people who are trying to “hack” Facebook’s friendship algorithm. Ever since Facebook took away its users’ option to view their posts in reverse chronological order, and made us slaves to its algorithms that pick and choose, based on opaque rules, what we see on our timeline, people have been frustrated with it. When your newsfeed is compiled by an algorithm that selects and decides what is good for you to see and what will interest you, it doesn’t just mean that you have lost control, but that you are being manipulated without even noticing it, responding to only certain kinds of information that trigger specific responses from you.

This has led to a lot of people trying to “fool” the Facebook algorithm and take their agency back. One of the most popular versions of this is a meme that announces that Facebook algorithms only show us particular kinds of information from a certain kind of people, thus creating an echo chamber where all we do is see pictures of cute cats, dancing babies and holidays. The post suggests that if we all just talk to each other more, then we will have meaningful conversations — like, you know, about dancing cats, cute babies and where we wish to go on a holiday.

It is true that, based on the nature of interaction, Facebook seems to designate some connections as strong connections. So, if we are chatting on Messenger, liking each other’s posts a lot, have many friends in common, or are tagged together in the same pictures, Facebook makes a logical deduction that we have a lot in common in real life, and that we would be interested in each other more than in other low-traffic connections. The meme asks people to leave a message on the post, start a conversation, and with this clever ploy, upset the Facebook algorithm. Now that we have chatted once, it suggests, Facebook is going to think we are the best of friends and is going to show us more diverse sources on the timeline.
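The logic described above can be imagined as an interaction-weighted score. The sketch below is a purely illustrative toy model: the signal names and weights are invented for this example, and Facebook’s actual ranking system is proprietary and draws on far more data points.

```python
# Toy illustration of interaction-weighted "connection strength".
# All signal names and weights here are hypothetical; Facebook's
# real feed-ranking model is not public.

def connection_strength(signals):
    """Combine a few interaction counts into a single score."""
    weights = {
        "messenger_chats": 3.0,   # direct conversations weigh heavily
        "post_likes": 1.0,        # lightweight reactions count for less
        "mutual_friends": 0.5,
        "photo_cotags": 2.0,      # tagged together in the same pictures
    }
    return sum(weights[k] * signals.get(k, 0) for k in weights)

# A single comment on a meme thread barely moves the score,
# compared with a connection built from sustained interaction.
casual = connection_strength({"post_likes": 1})
close = connection_strength({"messenger_chats": 20, "post_likes": 50,
                             "mutual_friends": 30, "photo_cotags": 5})
print(casual, close)  # prints 1.0 135.0
```

In a model like this, one comment left on a viral post is a drop in the bucket: it cannot outweigh months of accumulated signals, which is exactly why the meme’s “hack” fizzles out.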

This meme, and many like it, are attempts at taking agency in how we curate and consume our social media. They are romantic, human, and absolutely flawed. They seem to think that Facebook’s algorithms follow human logic, and that they work on simple principles which we can counteract with simple actions. What they fail to take into account is that in the world of big data connections, Facebook’s algorithms draw their causal and correlative powers from more than a hundred data points which create a unique profile for each of its users. They fail to recognise that this message of resistance is still subject to the same principles of “traffic-generating capacity”, and will be shown more often only for a temporary period, until people stop interacting on that thread. With time and waning interest, it will die and people will be distracted by other information. They also don’t recognise that Facebook is still going to show your post largely to the same people that it has been showing your pictures to, and even if new people engage with it, it is not going to radically change your timeline.

While these posts are fun conversation starters, they cannot possibly be taken seriously. If Facebook’s algorithms were this easy to fool, every advertiser worth their salt would be busy manipulating the stream without spending any money on the platform. More importantly, individual actions are not going to circumvent the automation of our digital collective behaviour. To pretend that there is scope for such actions in the age of extreme customisation and profiling is a fool’s paradise. It also deflects our attention from the fact that if these are critical concerns, the responsibility of changing these conditions is not on the users but on companies like Facebook and the governments that have to hold them accountable.

You and I, with all our good intentions, are not going to be able to “hack” Facebook’s algorithms or “fool” them into giving us results that we want. The only thing that can produce this change is strong regulation, robust policy, and taking the social media behemoth to task about how it addresses the questions of human agency and choice. So, the next time you want to produce real change, join the campaigns and ask our government to do something so that we can control our social media life.


Author

Nishant Shah

Dr. Nishant Shah is a co-founder and board member of the Centre for Internet and Society in Bangalore, India, a professor at the Institute of Culture and Aesthetics of Digital Media at Leuphana University in Germany, and Dean of Research at ArtEZ Graduate School, the Netherlands.