Digital Native: People Like Us

Posted by Nishant Shah on December 18, 2016
How the algorithm decides what you see on your timeline. If you have been hanging out on social media, there is one thing you cannot have escaped: the filter bubble. Be it demonetisation and its discontents, the fake news stories that seem to have ruined the US election, or the eternal conflict about the nature of Indian politics, your timeline must have been filled largely with posts from people who think like you.

The article was published in the Indian Express on December 18, 2016.


From your Facebook feed to your Twitter trends, you must have been bombarded with news sources, breaking stories, hashtags, memes, and viral videos, all more or less affirming how you already feel about the issues at hand. Even when you did come across a story you disagreed with, or a status that offended you, you would have found many others in your ever-expanding social media groups expressing the same anger or dismay at the phenomenon.

The filter bubble, our self-selecting process of making alliances, connections, friends and relationships online with people like us, resulting in a biased, one-sided, uni-dimensional view of most public events and phenomena, has long been presented as one of the most dangerous phenomena of our times. Filter bubbles mean that, based on our social, political, cultural, geographical, ethnic, racial, religious, gendered and sexual identities and affinities, social media algorithms show us material that we are more likely to click on, share and comment on, generating the traffic that increases their revenue. In other words, what you see of your friends and the people you follow on your social media apps is not organic, chronological or natural. It is at the mercy of an algorithm that continually monitors you, tracks your immense digital footprint, and constantly curates and arranges the data to make sure you stay on the site.
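To see why such a feed is neither chronological nor neutral, consider a minimal sketch of engagement-driven ranking. Everything here (the field names, the weights, the two-function structure) is a hypothetical simplification, not any platform's actual algorithm:

```python
# Toy illustration of engagement-based curation. All post and user
# fields, and all weights, are invented for illustration only.

def engagement_score(post, user):
    """Predict how likely this user is to click, share or comment."""
    score = 0.0
    # Reward topics the user has engaged with before.
    for topic in post["topics"]:
        score += user["topic_affinity"].get(topic, 0.0)
    # Reward posts from people the user interacts with most.
    score += 2.0 * user["friend_affinity"].get(post["author"], 0.0)
    # Reward posts that are already generating traffic.
    score += 0.1 * post["shares"] + 0.05 * post["comments"]
    return score

def curate_feed(posts, user, limit=20):
    """Order the feed by predicted engagement rather than by time."""
    ranked = sorted(posts, key=lambda p: engagement_score(p, user), reverse=True)
    return ranked[:limit]
```

Notice that nothing in curate_feed consults when a post was published: the ordering is driven entirely by predicted engagement, which is exactly why the resulting feed is neither organic nor chronological.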

With the accelerating pace of the digital web, the new real estate is not location but time. The more time a user spends within a particular app, the more they can be tracked. Longer tracking means that the algorithms have more data with which to predict behaviour and identify user types, and thus more opportunities to serve customised advertisements that users will click on, generating profits for these ‘free’ apps. It is in the interest of these social media sites, then, to show us material that keeps us polarised, either in a state of happiness and comfort or in moments of anger and passion. This is why filter bubbles come into being: the social media algorithms are constantly adjusting the material to keep us engaged, rewarding us with information and news that suits our own frame of mind, and increasing the chances that we spend more time on a platform.
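The feedback loop between time spent and targeting can be sketched just as simply. In this hypothetical model (the class, the method names and the counting scheme are all assumptions for illustration), every extra minute on the app logs more impressions, the per-topic click estimates tighten, and ad selection becomes correspondingly more profitable:

```python
from collections import defaultdict

class UserProfile:
    """Toy tracker: the longer the user stays, the more data accumulates."""

    def __init__(self):
        self.clicks = defaultdict(int)       # clicks per topic
        self.impressions = defaultdict(int)  # times a topic was shown

    def log_interaction(self, topic, clicked):
        """Record one impression; time spent on the app keeps feeding this."""
        self.impressions[topic] += 1
        if clicked:
            self.clicks[topic] += 1

    def predicted_ctr(self, topic):
        """Estimated click-through rate; more data means less guesswork."""
        shown = self.impressions[topic]
        return self.clicks[topic] / shown if shown else 0.0

def pick_ad(profile, candidate_topics):
    """Serve the ad the user is most likely to click on."""
    return max(candidate_topics, key=profile.predicted_ctr)
```

The more interactions log_interaction records, the better predicted_ctr approximates the user's actual behaviour, which is why time, not screen space, is the commodity these platforms compete for.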

However, there is another side to filter bubbles that we perhaps need to examine. A lot of the attention on filter bubbles is about how we hear only one side of the story. What is missing from this narrative is that we do not just hear one side of the story; we also hear very few stories. As social media becomes one of the primary sources of news consumption, the new filter bubbles ensure that we receive only the stories suited to our interests, as predicted by a big-data-driven algorithm.

So if you look at your news feed, you might see a variety of sources coming your way, but you might also realise that, for all their diversity, they are remarkably homogeneous. In their multimedia variety they pretend to deliver varied content, but what we get instead is a narrow range of perspectives on the same topics, so that there is a monopoly on what gets talked about and how. The global, the viral, the popular and the paid content thus hides and renders invisible the local, the niche, the less seductive or alarming but still important news that should inform our everyday practice and politics.

What we get, then, is the world as rendered visible by these predictive algorithms, which choose what content to show us based on the profits it generates. In the process, we enter a filter bubble we cannot even see, losing the opportunity to dive deep into the rich information landscape that the digital world offers. And as we get more and more entrenched in these bubbles, the alternative voices, the contentious questions, the movements of resistance and the calls for action get buried and forgotten under the plethora of cute cats, dancing babies, alarmist conspiracy theories and spam-like repetitive images that keep us informationally activated without allowing a deeper, more substantial engagement with the world around us.