The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 1 to 6.
Studying the Internet Discourse in India through the Prism of Human Rights
http://editors.cis-india.org/raw/blog_studying-the-internet-discourse-in-india-through-the-prism-of-human-rights
<b>This post by Deva Prasad M is part of the 'Studying Internets in India' series. Deva Prasad is Assistant Professor at the National Law School of India University (NLSIU), Bangalore. In this essay, he analyses key public discussions around Internet related issues from the human rights angle, and explores how this angle may contribute to understanding the features of the Internet discourse in India.</b>
<p> </p>
<h2>Introduction</h2>
<p>The significance of the Internet as an element and tool of day-to-day life is an established experiential fact. The intrinsic value that the Internet brings to our lives has transformed access to it into a necessity. The Internet acts as an enabling tool through which information, communication and commerce are carried forward effectively and expeditiously. It is due to this enormous intrinsic value that there is an emerging trend of exploring the Internet from the perspective of human rights. Moreover, the Internet as a medium also helps in the furtherance of human rights [1]. Social movements have attained a new lease of life with digital activism over the Internet; the Arab Spring is an epitome of this phenomenon.</p>
<p>There is an emerging positive trend of linking established human rights norms with the Internet. The Report of the Special Rapporteur on the right to freedom of opinion and expression has vividly explained the possibility and feasibility of extending the right to freedom of opinion and expression (Article 19 of the UDHR and the ICCPR) to the Internet medium [2]. The Special Rapporteur also highlights the need for Internet access for the effective enjoyment of the right to freedom of opinion and expression in the digital sphere. The UN High Commissioner for Human Rights’ report on ‘The Right to Privacy in the Digital Age’ also explicitly highlights the significance of protecting the right to privacy in the Internet medium in light of extensive “surveillance and the interception of digital communications and the collection of personal data” [3]. The extensive interception and blocking of online communication is another pertinent reason to extend human rights protection to the Internet.</p>
<p>The WSIS Declaration for Building of Information Society [4] and the Charter of Human Rights and Principles for the Internet [5] also have played a significant role in furthering the inter-linkage between human rights and Internet.</p>
<p>Internet and human rights policy developments have gathered significant relevance in international human rights law and in Internet policy fora. But it is interesting to note that the Indian government and state institutional mechanisms have not yet pro-actively accepted the relevance of applying human rights norms to the Internet medium in India.</p>
<p>As an essay in the ‘Studying Internets in India’ series, it is important to highlight how human rights act as underlying factors in many socio-political issues pertaining to the Internet in India. Analysis of these issues helps us understand that, even though the Indian state turns a blind eye to the human rights element in these issues, digitally conscious Indians have realised their rights and even fought their own battles for exercising them.</p>
<p>In recent years, the Internet discourse in India has witnessed many socio-political concerns. This essay explores pertinent socio-political issues in the Indian context and their underlying link to human rights. Globally, exploring the Internet from the perspective of human rights brings out a multitude of issues that require the application of established human rights norms: the right to privacy, freedom of expression, and access. The story in India is no different. In this regard, three socio-political issues relating to the Internet, which gained much attention in India over roughly the last year, are analysed. Interestingly, all three issues are connected by an underlying human rights thread and need pertinent deliberation from a human rights perspective.</p>
<p> </p>
<h2>Section 66A and Freedom of Speech and Expression</h2>
<p>The lack of freedom of expression on the Internet and Section 66A of the Information Technology Act, 2000 make for an interesting case study. The Indian government used Section 66A as a tool of extensive control, taking criminal legal action against Internet and social media users for posting allegedly offensive comments and posts. Section 66A was badly drafted, allowing the government to initiate criminal legal action in an arbitrary and whimsical manner; such a provision could be misused by the state to curb freedom of expression in the Internet sphere. The Indian state machinery’s rampant use of Section 66A led to a sharp reaction amongst Internet and social media users in India. Legal experts criticised the vagueness of its language and its unconstitutionality. The arrests of a cartoonist, a professor, and two young women in Maharashtra [6] (and many others) for comments and posts on social media against politicians made evident the Indian state machinery’s lack of respect for freedom of speech and expression on the Internet (most of these incidents took place during 2012). These incidents led to widespread protest by digital media users against the violation of the human right to freedom of speech and expression. When the Public Interest Litigation [7] filed by Shreya Singhal led the Supreme Court to strike down Section 66A on 24 March 2015 as unconstitutionally vague and violative of the freedom of speech and expression, it was a watershed moment for the Internet discourse in India. The significance of human rights (especially the freedom of speech and expression) in the Internet medium was asserted.</p>
<p> </p>
<h2>Net Neutrality and Internet Access Issue</h2>
<p>The recent net neutrality debate in India has also evoked deliberation about the right of equal access to the Internet and the need to maintain the Internet as a democratic space. The debate on keeping the Internet a democratic space that is equally accessible to everyone has gained much currency in India. An important point that needs to be emphasised in the Indian net neutrality debate is the question of equal access being raised. This question is largely a product of the lack of regulatory clarity regarding the capacity of TRAI (the Telecom Regulatory Authority of India) to regulate over-the-top (OTT) services, coupled with the lack of a well-stipulated right to Internet access in the Indian context.</p>
<p>Net neutrality rests on the premise that all data available on the Internet should be equally accessible to everyone; no discrimination should be allowed in access to a particular website or particular content. Tim Wu, a renowned scholar of Internet and communications law, noted in his seminal work, <em>Network Neutrality, Broadband Discrimination</em>, that network neutrality signifies “an Internet that does not favor one application” [8].</p>
<p>In this regard, there has been a constructive dialogue between the Federal Communications Commission in the United States and various stakeholders. An interesting development was the decision to classify broadband Internet access service as a public utility [9]. Such debates have much relevance in the Indian context. India also needs public participation (especially strong voices from the Internet users’ perspective) to highlight these concerns about Internet access. Human rights concerns regarding the Internet should be pro-actively brought to the attention of regulatory institutions such as TRAI. There is a need to balance the economic, for-profit interests of service providers with the larger public interest in equal access.</p>
<p>The pressure that public opinion, generated through online activism, exerted on TRAI’s proposal to regulate OTT services helps in understanding the power of public participation in pertinent human rights issues relating to the Internet [10]. The protest against OTT regulation also vividly shows the broader manner in which human rights principles in the context of the Internet medium will have to be asserted in India.</p>
<p> </p>
<h2>Right to be Forgotten in EU and Repercussions in India</h2>
<p>The repercussions of the European Union’s ‘right to be forgotten’ judgment have also led to debate about similar rights in the Indian context. <em>Google Spain v. AEPD and Mario Costeja González</em> [11] is an interesting case decided by the Court of Justice of the European Union, in which the court held that, based on the rights to privacy and data protection, persons could ask databases on the Internet (the case was against the search engine Google) to refrain from referring to certain aspects of their personal information [12]. This is what is referred to as the ‘right to be forgotten’.</p>
<p>Viktor Mayer-Schönberger, in his book <em>Delete: The Virtue of Forgetting in the Digital Age</em>, has elaborated on how the digital age, coupled with the Internet, has made it substantially easier to store, disseminate and track information, and advocates for stronger informational privacy rights [13]. With this judgment, the Court of Justice of the European Union has furthered informational privacy rights in the European Union through the ‘right to be forgotten’.</p>
<p>In the Indian context, it is important to note that informational privacy rights are yet to evolve to the extent they have in the European Union, which has definite privacy and data protection law. But, interestingly, a request was made to a news media website by a person attempting to enforce the right to be forgotten [14]. Even though the right to be forgotten is not directly applicable in the Indian context, this event throws light on the fact that Internet users in India are becoming conscious of their rights in the Internet space. The relevance the Indian news media gave to the right to be forgotten ruling is also an example of how an implicit recognition of the interlink between human rights and the Internet is slowly seeping into the Indian milieu.</p>
<p> </p>
<h2>Internet Discourse in India and Human Rights</h2>
<p>Discussion of the three issues mentioned above points to an important fact: human rights are not pro-actively applied to the Internet medium by the Indian state machinery. Even though international human rights law and various Internet policy organisations are pushing the Internet and human rights agenda, the same is yet to gain momentum in India.</p>
<p>But at the same time, an interesting development that can be witnessed from the above discussion is the manner in which Internet users are asserting their rights over the Internet and slowly paving the path towards applying the human rights perspective to the Internet. In the first instance, the freedom of speech and expression was not pro-actively applied to the digital space, even though Article 19 of the Constitution of India clearly provides for freedom of speech and expression. The second instance, net neutrality, has thrown wide open the lack of a clear policy on Internet access in the Indian context; public opinion has pointed to a public interest demand to ensure that there is no discrimination in Internet access. The third instance, looking at the ‘right to be forgotten’ from an Indian perspective, shows that Internet users are becoming conscious of their individual rights in the digital space in a more affirmative manner.</p>
<p>Further, the operationalisation of human rights in these three instances also needs to be critically looked into. The assertion of the freedom of speech and expression in the Internet medium could be made effectively because Article 19 of the Constitution of India, 1950, protects freedom of speech and expression. The vast body of precedent in constitutional litigation and allied jurisprudence on freedom of speech and expression helped in extending the right to the digital medium of the Internet. Further, using the social action tool of Public Interest Litigation, the unconstitutionality of Section 66A of the Information Technology Act, 2000 could be brought before the Supreme Court.</p>
<p>But, interestingly, the net neutrality issue, which concerns access to the Internet in a non-discriminatory manner, is yet to be perceived in the Indian context from a strong human rights perspective. The concept of Internet access as a public utility is yet to be articulated in a concrete manner in India. Further, the Indian network neutrality discourse attempts to operationalise non-discriminatory access through a free-market approach, in which such access has to be ensured by market competition under the oversight of the necessary regulatory bodies. In this sense, the human rights angle of Internet access will have to be secured by effective competition in the market along with proper oversight by regulatory bodies such as TRAI and the Competition Commission of India. It is important for these bodies to have broad goals for furthering the public interest by ensuring non-discriminatory access to the Internet. Further, given the financial and infrastructural limitations on the government’s capability to ensure Internet access for all, a market-led model with sufficient regulation might be the right way forward.</p>
<p>Looking at the issue of the right to be forgotten, it can easily be perceived that the Indian milieu is yet to articulate privacy rights to such a high standard. Even though the right to privacy is recognised in the constitutional law context through effective interpretation by the judiciary, the concept of digital privacy has not yet evolved in India. No collective understanding has emerged, till now, regarding the right to be forgotten in India. Even though individual attempts to assert the right have been witnessed, there is much room for an evolved collective understanding in the Indian context. Civil society organisations would have a crucial role to play in this regard.</p>
<p>There is an emerging consciousness amongst a set of Internet users in India who value the Internet as a democratic space, free from unwanted restriction by the government machinery or even private entities. Hence, looking at the Internet discourse of India from the perspective of human rights, there is an implicit way in which human rights are being applied to the Internet space. The lack of a pro-active approach by the state in asserting human rights in the Internet space is highlighted by the assertions being made by Internet users in India.</p>
<p> </p>
<h2>Way Forward</h2>
<p>For the Internet to remain a democratic space, there is a need for pro-active application of these human rights norms and a clear understanding of them in Internet governance. At present, the state of affairs in India regarding the application of human rights to the Internet is far from satisfactory.</p>
<p>This essay, which is part of the ‘Studying Internets in India’ series, has till now taken stock of the emerging dimension of human rights and the Internet in India. The lack of interest from the government and state machinery in furthering the human rights and Internet dimension needs to be seriously reconsidered. Intervening in Internet law and policy in India from a rights-based approach should be an important agenda for furthering digital rights in India. For this, civil society organisations have an important role to play. The public interest can be explored effectively with the participation of stakeholders. Herein, platforms such as the India Internet Governance Forum could play a crucial role.</p>
<p>Apart from civil society organisations, it is also pertinent for state and governmental institutional mechanisms to take a pro-active stance. The rights-based approach has to be duly included in Internet law and policy, and there should be an institutional mechanism that can look into areas pertaining to human rights and the Internet. It is a well-known fact that India lacks an institutional mechanism for looking into the regulation of communication and privacy issues. Further, the National Human Rights Commission (NHRC) also needs to look at the relevance of human rights to the Internet. Inspiration could be drawn from the pioneering work of the Australian Human Rights Commission on applying human rights norms and standards to the Internet medium [15]. This essay has only flagged the need to apply established human rights norms to the Internet space. Many more issues, such as Internet access for persons with disabilities and the safety of children on the Internet, are also pertinent areas.</p>
<p>Moreover, it is important for the digital rights of Internet users in India to be explicitly enshrined in a legal framework. Presently, a gap in the law and policy framework regarding human rights and the Internet is evident, as highlighted in this essay. The pertinent questions regarding access, privacy and freedom of expression have to be taken seriously by the government and state machinery, for which clear and well-defined rights relating to the Internet space have to be framed. For the Internet and human rights to be taken seriously, it is high time that legal and institutional frameworks to explore these issues are evolved.</p>
<p> </p>
<h2>Emphasizing the Right to Communication in India</h2>
<p>Further, the present understanding of the right to communication in India, which is perceived in a narrow manner, could be re-worked with the help of a pro-active application of human rights norms to Internet governance. Intrusions into the freedom of speech and expression, especially in the telecommunication context, have to be highlighted. Protection of communal harmony was used as the rationale for capping the number of SMS messages that could be sent per day during the 2012 exodus of people of Northeastern origin from Bangalore, Pune and other major cities in India.</p>
<p>This move was criticised as unreasonable, particularly for the universality of the cap on SMS messages [16]. Further, telecommunication and Internet services (especially Facebook and YouTube) were blocked in Kashmir to restrict protests [17], on the grounds of protecting national security. The reasonableness of the restrictions that may be imposed on the right to communication is a major concern in the above-mentioned instances. A blanket ban applied in a universal manner undermines the right to communication of the many genuine users of bulk messaging and social media sites.</p>
<p>The right to communication, especially in the digital and telecommunication media, needs to be emphasised. Applying a human rights perspective and human rights norms to Internet governance would help in articulating and evolving the right to communication in India. With adequate institutional oversight, human rights norms could make the digital right to communication an effective right.</p>
<p>To conclude, the Internet discourse in India has already paved the path for human rights norms to be applied to the Internet space. The seriousness that can be attributed to those rights is evident from the assertions by Internet users in India. But the state and government machinery in India should also explore the human rights and Internet agenda seriously.</p>
<p> </p>
<h2>Endnotes</h2>
<p>[1] Frank La Rue, Report Of The Special Rapporteur On The Promotion And Protection Of The Right To Freedom Of Opinion And Expression, Available at <a href="http://www2.ohchr.org/english/bodies/hrcouncil/docs/17session/A.HRC.17.27_en.pdf">http://www2.ohchr.org/english/bodies/hrcouncil/docs/17session/A.HRC.17.27_en.pdf</a> (Last accessed on 25/05/2015).</p>
<p>[2] Ibid, Special Rapporteur in the Report points out that the language of Article 19 of ICCPR is media neutral and is applicable to online media technological developments also. Para 20 and 21 of the Report.</p>
<p>[3] UN High Commissioner on Human Right, Report on ‘The Right To Privacy In The Digital Age’, Available at <a href="http://www.ohchr.org/EN/HRBodies/HRC/RegularSessions/Session27/Documents/A.HRC.27.37_en.pdf">http://www.ohchr.org/EN/HRBodies/HRC/RegularSessions/Session27/Documents/A.HRC.27.37_en.pdf</a> (Last accessed on 25/05/2015).</p>
<p>[4] WSIS Declaration for Building of Information Society, Available at <a href="http://www.itu.int/wsis/docs/geneva/official/dop.html">http://www.itu.int/wsis/docs/geneva/official/dop.html</a>. (Last accessed on 25/05/2015). Article 58, WSIS Declaration reads as follows: “The use of ICTs and content creation should respect human rights and fundamental freedoms of others, including personal privacy, and the right to freedom of thought, conscience, and religion in conformity with relevant international instruments”.</p>
<p>[5] Charter of Human Rights and Principles for the Internet Available at <a href="http://internetrightsandprinciples.org/site/wp-content/uploads/2013/10/IRP_booklet_final1.pdf">http://internetrightsandprinciples.org/site/wp-content/uploads/2013/10/IRP_booklet_final1.pdf</a>, (Last accessed on 25/05/2015).</p>
<p>[6] See Section 66A: Six Cases That Sparked Debate, Available at <a href="http://www.livemint.com/Politics/xnoW0mizd6RYbuBPY2WDnM/Six-cases-where-the-draconian-Section-66A-was-applied.html">http://www.livemint.com/Politics/xnoW0mizd6RYbuBPY2WDnM/Six-cases-where-the-draconian-Section-66A-was-applied.html</a>, (Last accessed on 25/05/2015). Also see, Facebook Trouble: 10 Cases of Arrest Under Section 66A of IT Act, Available at <a href="http://www.hindustantimes.com/india-news/facebook-trouble-people-arrested-under-sec-66a-of-it-act/article1-1329883.aspx">http://www.hindustantimes.com/india-news/facebook-trouble-people-arrested-under-sec-66a-of-it-act/article1-1329883.aspx</a> (Last accessed on 25/05/2015).</p>
<p>[7] Shreya Singhal v. Union of India, Available at <a href="http://indiankanoon.org/doc/110813550/">http://indiankanoon.org/doc/110813550/</a> (Last accessed on 25/05/2015).</p>
<p>[8] Tim Wu, Network Neutrality, Broadband Discrimination, Available at <a href="https://cdt.org/files/speech/net-neutrality/2005wu.pdf">https://cdt.org/files/speech/net-neutrality/2005wu.pdf</a> (Last accessed on 25/05/2015).</p>
<p>[9] F.C.C. Approves Net Neutrality Rules, Classifying Broadband Internet Service as a Utility, Available at <a href="http://www.nytimes.com/2015/02/27/technology/net-neutrality-fcc-vote-internet-utility.html">http://www.nytimes.com/2015/02/27/technology/net-neutrality-fcc-vote-internet-utility.html</a> (Last accessed on 25/05/2015).</p>
<p>[10] The online campaign by www.savetheinternet.in and the AIB video have played a crucial role in gathering public support.</p>
<p>[11] Court of Justice of European Union, Case C-131/12.</p>
<p>[12] Rising like a Phoenix: The ‘Right to be Forgotten’ before the ECJ, Available at <a href="http://europeanlawblog.eu/?p=2351">http://europeanlawblog.eu/?p=2351</a> (Last accessed on 25/05/2015).</p>
<p>[13] Viktor Mayer-Schönberger, Delete: The Virtue of Forgetting in the Digital Age, Princeton University Press (2009).</p>
<p>[14] Right to be Forgotten Poses A Legal Dilemma in India, Available at <a href="http://www.livemint.com/Industry/5jmbcpuHqO7UwX3IBsiGCM/Right-to-be-forgotten-poses-a-legal-dilemma-in-India.html">http://www.livemint.com/Industry/5jmbcpuHqO7UwX3IBsiGCM/Right-to-be-forgotten-poses-a-legal-dilemma-in-India.html</a>, (Last accessed on 25/05/2015). Also see We received a Right to be Forgotten request from an Indian user, Available at <a href="http://www.medianama.com/2014/06/223-right-to-be-forgotten-india/">http://www.medianama.com/2014/06/223-right-to-be-forgotten-india/</a> (Last accessed on 25/05/2015).</p>
<p>[15] Human Rights and Internet, Available at <a href="https://www.humanrights.gov.au/our-work/rights-and-freedoms/projects/human-rights-and-internet">https://www.humanrights.gov.au/our-work/rights-and-freedoms/projects/human-rights-and-internet</a> (Last accessed on 25/05/2015).</p>
<p>[16] Chinmayi Arun, SMS Block as Threat to Free Speech, Available at <a href="http://cis-india.org/internet-governance/www-the-hindubusinessline-op-ed-sep-1-2012-chinmayi-arun-sms-block-as-threat-to-free-speech">http://cis-india.org/internet-governance/www-the-hindubusinessline-op-ed-sep-1-2012-chinmayi-arun-sms-block-as-threat-to-free-speech</a> (Last accessed on 15/07/2015).</p>
<p>[17] Pamposh Raina and Betwa Sharma, Telecom Services Blocked to Curb Protests in Kashmir, Available at <a href="http://india.blogs.nytimes.com/2012/09/21/telecom-services-blocked-to-curb-protests-in-kashmir/?_r=0">http://india.blogs.nytimes.com/2012/09/21/telecom-services-blocked-to-curb-protests-in-kashmir/?_r=0</a> (Last accessed on 15/07/2015).</p>
<p> </p>
<p><em>Author's Note: All the views expressed are my own and in no way are linked to the opinion of my employers. I thank CIS for this opportunity to explore Internet and Human Rights interface in India as part of the Studying Internet in India essay series.</em></p>
<p><em>Note: The post is published under <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank">Creative Commons Attribution 4.0 International</a> license, and copyright is retained by the author.</em></p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/blog_studying-the-internet-discourse-in-india-through-the-prism-of-human-rights'>http://editors.cis-india.org/raw/blog_studying-the-internet-discourse-in-india-through-the-prism-of-human-rights</a>
</p>
Deva Prasad M | Human Rights | Internet Studies | RAW Blog | Human Rights Online | Researchers at Work | 2015-07-22T04:18:37Z | Blog Entry

Privacy after Big Data: Compilation of Early Research
http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research
<b>Evolving data science, technologies, techniques, and practices, including big data, are enabling shifts in how the public and private sectors carry out their functions and responsibilities, deliver services, and facilitate innovative production and service models to emerge. In this compilation we have put together a series of articles that we have developed as we explore the impacts – positive and negative – of big data. This is a growing body of research that we are exploring and is relevant to multiple areas of our work including privacy and surveillance. Feedback and comments on the compilation are welcome and appreciated.</b>
<p> </p>
<h4><a href="https://github.com/cis-india/website/raw/master/docs/CIS_PrivacyAfterBigData_CompilationOfEarlyResearch_2016.11.pdf">Download the Compilation</a> (PDF)</h4>
<hr />
<h3><strong>Privacy after Big Data</strong></h3>
<p>Evolving data science, technologies, techniques, and practices, including big data, are enabling shifts in how the public and private sectors carry out their functions and responsibilities, deliver services, and facilitate the emergence of innovative production and service models. For example, in the public sector, the Indian government has considered replacing the traditional poverty line with targeted subsidies based on individual household income and assets. The my.gov.in platform aims to enable the participation of connected citizens, pulling in online public opinion in a structured manner on key governance topics in the country. The 100 Smart Cities Mission seeks to leverage big data analytics and techniques to deliver services and govern citizens within city sub-systems. In the private sector, emerging financial technology companies are developing credit scoring models using big, small, social, and fragmented data so that people with no formal credit history can be offered loans. These models promote efficiency and cost reduction through personalization, and are powered by a wide variety of data sources including mobile data, social media data, web usage data, and passively collected data from IoT or connected devices.</p>
<p>These data technologies and solutions are enabling business models based on the ideals of ‘less’: cash-less, presence-less, and paper-less. This push towards an economy premised upon a foundational digital ID, in the prevailing absence of legal frameworks, leads to a substantive loss of anonymity and privacy for individual citizens and consumers vis-a-vis both the state and the private sector. Indeed, the present use of these techniques runs contrary to the notion of the ‘sunlight effect’: the individual is made fully transparent (often without their knowledge) to the state and the private sector, while the algorithms and means of reaching a decision remain opaque and inaccessible to the individual.</p>
<p>These techniques, characterized by the volume of data processed, the variety of sources data is processed from, and the ability both to contextualize (learning new insights from disconnected data points) and to de-contextualize (finding correlation rather than causation), have also increased the value of all forms of data. In some ways, big data has put all data on an equal playing field as far as monetisation and joining-up are concerned: metadata can be just as valuable to an entity as content data. As data science techniques evolve to find new ways of collecting, processing, and analyzing data, the benefits are clear and tangible, while the harms are less clear, but significantly present.</p>
<p>Is it possible for an algorithm to discriminate? Will incorrect decisions be made based on the data collected? Will populations be excluded from necessary services if they do not engage with certain models, and do emerging models overlook certain populations? Can such tools be used to surveil individuals at a level of granularity that was formerly not possible, and before a crime occurs? Can such tools be used to violate rights, for example to target certain types of speech or groups online? And, importantly, when these practices are opaque to the individual, how can one seek appropriate and effective remedy?</p>
<p>Traditionally, data protection standards have defined and established protections for certain categories of data. Yet data science techniques have evolved beyond data protection principles. It is now far harder to obtain informed consent from an individual when the data collected can be used for multiple purposes by multiple bodies. Providing notice for every use is also more difficult, as is fulfilling requirements of data minimization. Some say privacy is dead in the era of big data. Others say privacy needs to be re-conceptualized, while still others say that protecting privacy now, more than ever, requires a ‘regulatory sandbox’ that brings together technical design, markets, legislative reforms, self-regulation, and innovative regulatory frameworks. It also demands an expansion of the narrative around privacy – one that has largely been focused on harms such as misuse of data or unauthorized collection – to include discrimination, marginalization, and competition harms.</p>
<p>In this compilation we have put together a series of articles that we have developed as we explore the impacts – positive and negative – of big data. This includes looking at India’s data protection regime in the context of big data, reviewing literature on the benefits and harms of big data, studying emerging predictive policing techniques that rely on big data, and closely analyzing the impact of big data on specific privacy principles such as consent. This is a growing body of research that we are exploring, and it is relevant to multiple areas of our work, including privacy and surveillance. Feedback and comments on the compilation are welcome and appreciated.</p>
<p><em>Elonnai Hickok</em><br />Director - Internet Governance</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research'>http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-compilation-of-early-research</a>
</p>
No publisher · Saumyaa Naidu · Human Rights · IT Act · Big Data · Privacy · Internet Governance · Smart Cities · Data Protection · Information Technology · Publications · 2016-11-12T01:37:03Z · Blog Entry
New Media, personalisation and the role of algorithms
http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms
<b>In his much acclaimed book, The Filter Bubble, Eli Pariser explains how personalisation of services on the web works and laments that they are creating individual bubbles for each user, which run counter to the idea of the Internet as an inherently open place. While Pariser’s book looks at the practices of various large companies providing online services, he briefly touches upon the role of new media such as search engines and social media portals in new curation. Building upon Pariser’s unexplored argument, this article looks at the impact of algorithmic decision-making and Big Data in the context of news reporting and curation.</b>
<em><br /></em>
<blockquote>
<div>
<div><em>Everything which bars freedom and fullness of communication sets up barriers that divide human beings into sets and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life. </em>—John Dewey</div>
</div>
</blockquote>
<p>Eli Pariser, in his book, The Filter Bubble,[1] refers to the scholarship of Walter Lippmann and John Dewey as integral to the evolution of our understanding of the democratic and ethical duties of the Fourth Estate. Lippmann was disillusioned by the role of newspapers in propaganda for the First World War. He responded with three books in quick succession — Liberty and the News,[2] Public Opinion[3] and The Phantom Public.[4] Lippmann brought attention to the fact that the process of news-reporting was conducted through privately determined and unexamined standards. The failure of the Fourth Estate to perform its democratic functions was, in the opinion of Lippmann, one of the prime factors responsible for the public not being an informed and rational entity. John Dewey, while rejecting Lippmann’s argument that matters of public policy can only be determined by inside experts with training and education, did acknowledge his critique of the media.</p>
<p>Pariser points to the creation of a wall between editorial decisionmaking and advertiser interests as the eventual result of the Lippmann and Dewey debate. While accepting that this division between the financial and reporting sides of media houses has not always been observed, Pariser emphasises that the fact that the standard exists is important.[5] Unlike traditional media, the new media that relies on algorithmic decision-making for personalisation is not subject to the same standards that try to mitigate the influence of commercial interests on editorial decisions, even while it performs many of the same functions as the traditional media.[6]</p>
<h3>How personalisation algorithms work</h3>
<p dir="ltr">Kevin Slavin, in his famous talk at the TEDGlobal conference, characterised algorithms as “maths that computers use to decide stuff” and observed that they were infiltrating every aspect of our lives.[7] In Slavin’s view, algorithms can be seen as control technologies that constantly shape our world through media and information systems, dynamically modifying content and function through programmed routines. Search engines and social media platforms perpetually rank user-generated content through algorithms.[8]</p>
<p>Personalisation technologies have various advantages. They translate into more relevant content, which for service providers means more clicks and revenue, and for consumers, less time spent finding content.[9] However, they also lead to compromised privacy, lack of control and reduced individual capability.[10] Search engines like Google use the famous PageRank algorithm, which, combined with geographical location and previous searches, yields the most relevant search results.[11] The PageRank algorithm uses various real-time variables dependent on both voluntary and involuntary user inputs, including the number of clicks, the number of occurrences of the key terms and the number of references by other credible pages. This data in turn determines the order of pages in search results and influences the way we perceive, understand and analyse information.[12] Maps showing real-time traffic information retrieve data from laser and infrared sensors alongside the road and from users’ devices. Once this real-time data is combined with historical trends, these maps recommend routes to every user, hence influencing traffic patterns.[13]</p>
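The link-analysis core of PageRank described above can be sketched as a simple power iteration over a link graph. This is an illustrative simplification only; the graph, function name and parameters below are hypothetical, and real search ranking combines this score with location, query terms and many other signals.

```python
def pagerank(links, d=0.85, iterations=50):
    """Minimal PageRank sketch. `links` maps each page to the pages it
    links to. Each round, every page passes a d-fraction of its score
    along its outgoing links; the remainder is spread evenly."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - d) / n for p in pages}
        for page, outgoing in links.items():
            targets = outgoing or pages  # dangling page: spread evenly
            for target in targets:
                new_rank[target] += d * rank[page] / len(targets)
        rank = new_rank
    return rank

# Tiny hypothetical web: page "c" is linked to by both "a" and "b",
# so it accumulates the highest score.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

Because a page passes on its own score, a reference from an already high-ranked page counts for more than one from an obscure page, which is why the text above speaks of references "by other credible pages".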
<p>Even though this phenomenon of personalisation may appear to be new, it has been prevalent in society for ages.[14] The history of mass media culture clearly shows that personalisation has always been a method to increase market reach and customer satisfaction.[15] Newspapers have sections dedicated to special topics; radio and TV have channels dedicated to different interest groups, age groups and consumers.[16] These personalised sections in a newspaper and personalised channels on radio and television don’t just provide greater satisfaction to readers, listeners and viewers; they also provide targeted advertisement space for advertisers and content developers. However, digital footprints and the mass collection of data have made this phenomenon much more granular and detailed. The geographical location of an individual can tell a lot about their community, their culture and other important traits local to a community.[17] This data further assists in personalisation. Current developments in technology not only help in better collection of data about personal preferences but also help in better personalisation.</p>
<p>Pariser mentions three ways in which the personalisation technologies of this day are different from those of the past. First, for the very first time, individuals are alone in the filter bubble. While in traditional forms of personalisation there were various individuals who shared the same frame of reference, now there is a separate set of filters governing the dissemination of content to each individual.[18] Second, the personalisation technologies are now entirely invisible, and there is little that consumers can do to control or modify them.[19] Third, the decision to be subject to these personalisation technologies is often not an informed choice. A good example of this is an individual’s geographical location.[20]</p>
<h3>The neutrality of New Media?</h3>
<p dir="ltr">More and more, personalisation technologies are shaping how we consume news on the Internet. Google News, Facebook’s News Feed, which tries to put together a dynamic feed of both personal and global stories, and Twitter’s trending hashtag feature have emerged as key drivers of a new news ecosystem. Initially, this new media was hailed as a natural consequence of the Internet that would enable greater public participation, allow journalists to find more stories and let them engage with readers directly. An illustration of this could be seen in the way Internet-based news media and social networking websites behaved in the aftermath of Israel’s attacks on a United Nations-run school in the Gaza Strip. While much of the international Internet media covered the story, Israel’s home media did not; the only exception was the liberal Israeli news website Ha’aretz.[21] Network graphs of Twitter activity for a few days immediately after the incident clearly show the social media manifestation of the event in the personalised cyberspace: while most of the world was re-tweeting news of the attack, Israelis hardly re-tweeted it. They were, in fact, busy re-tweeting news of rocket attacks on Israel.[22]</p>
<p>The use of social media in newsmaking was hailed by many scholars as symptomatic of the decentralisation characteristic of the Internet. It has been seen as a movement towards greater grassroots participation that negates the ‘gatekeeping’ role traditionally played by editors. Thomas Poell and José van Dijck punch holes in the theory that social media and other online technologies are mere facilitators of user participation and translators of user preferences through Big Data analytics.[23] They quote T. Gillespie’s work, which describes the narrative of these online services as platforms offering “open, neutral, egalitarian and progressive support for activity.”[24]</p>
<p>Pedro Domingos calls the overwhelming number of choices the defining problem of the information age, and machine learning and data analytics the largest part of its solution.[25] The primary function of algorithmic decision-making in the context of consumption of content is to narrow down the choices. Domingos is more optimistic about the impact of these technologies: he says the “last step of the decision is usually still for humans to make, but learners intelligently reduce the choices to something a human can manage.”[26] On the other hand, Pariser is more circumspect about the coercive result of machine learning algorithms. Whichever way we lean, we have to accept that a large part of what personalisation algorithms do is select and prioritise content by categorising it on the basis of relevance and popularity.</p>
<p>Poell and van Dijck call this a new knowledge logic which, in effect, replaces human judgement (as earlier exercised by editors) with a kind of proxy decisionmaking based on data. Their main thesis is that there is little evidence to suggest that the latter is more democratic than the former, and that it creates new problems of its own. They go on to compare the practices of various services, including Facebook’s new graph and Twitter’s trending topics, and conclude that these prioritise breaking news stories over other kinds of content.[27] For instance, the algorithm for trending topics depends not on the volume but on the velocity of the tweets carrying the hashtag or term. It could be argued that, given this predilection, the algorithms will rarely prefer complex content. If we go by Lippmann and Dewey’s idea that the role of the Fourth Estate is to inform public debate and ensure the accountability of those in positions of power, this aspect of Big Data algorithms does not correspond with that role.</p>
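The volume-versus-velocity distinction can be made concrete with a toy sketch. All data, function names and the scoring formula here are hypothetical (Twitter’s actual trending algorithm is proprietary); the point is only that a tag spiking right now outranks a tag with far greater total volume.

```python
from collections import Counter

def trending(tweets, now, window=60, baseline_window=3600):
    """Rank hashtags by velocity: the rate of tweets in the last
    `window` seconds relative to each tag's longer-term baseline rate.
    `tweets` is a list of (timestamp, hashtag) pairs."""
    recent = Counter(tag for t, tag in tweets if now - t <= window)
    baseline = Counter(tag for t, tag in tweets if now - t <= baseline_window)
    scores = {}
    for tag, count in recent.items():
        recent_rate = count / window
        baseline_rate = baseline[tag] / baseline_window
        # Smoothing term keeps brand-new tags from dividing by ~zero.
        scores[tag] = recent_rate / (baseline_rate + 1.0 / baseline_window)
    return sorted(scores, key=scores.get, reverse=True)

# "#steady" has many times the volume (360 tweets spread over an hour),
# but "#spike" (12 tweets, all in the last minute) has the velocity.
tweets = ([(t, "#steady") for t in range(0, 3600, 10)]
          + [(3600 - i, "#spike") for i in range(0, 60, 5)])
ranked = trending(tweets, now=3600)  # "#spike" outranks "#steady"
```

A velocity metric of this kind rewards sudden bursts, which is why slowly building, complex stories rarely trend even when their cumulative volume is large.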
<h3>Quantified Audience</h3>
<p dir="ltr">Another aspect of the use of Big Data and algorithms in New Media that requires attention is that the networked infrastructure enables a quantified audience. C. W. Anderson, who has studied newsroom practices in the US, looked at the role played by audience quantification and rationalisation in shifting newswork practices. He concluded that, more and more, journalists are less autonomous in their news decisions and increasingly reliant on audience metrics as a supplement to news judgment.[28] Poell and van Dijck review the practices of some leading publications such as the New York Times, the L.A. Times and the Huffington Post, and the degree to which audience metrics dictate editorial decisions. While the New York Times seems to prioritise content on its social media portals based on expected spikes in user traffic, the L.A. Times goes one step further by developing content specifically aimed at promoting greater social participation. Neither of these practices, though, compares to the reliance on SEO and SMO strategies of web-born news providers like the Huffington Post, which has traffic editors who trawl the Internet for trending topics and popular search terms; their feedback dictates content creation.[29]</p>
<h3>Conclusion</h3>
<p dir="ltr">The above factors demonstrate that the idea of New Media enabling the Fourth Estate to perform its democratic functions does not take actual practices into account. This idea is based on the erroneous assumption that technology in general, and algorithms in particular, are neutral. While the emergence of New Media might have reduced the gatekeeping role played by editors, its strong prioritisation of content that will be popular reduces the validity of arguments that it leads to more informed public discussion. As Pariser notes, traditional media scores over New Media inasmuch as there exists a standard of division between editorial decisionmaking and advertiser interests. While this standard is flouted by media houses all the time, it exists as a metric to aspire to and to measure service providers against. The New Media performs many of the same functions, and maybe it is time to evolve principles and ethical standards that take into account the need for it to perform these democratic functions.</p>
<h3>Endnotes </h3>
<p class="normal"><sup><sup>[1]</sup></sup> Eli Pariser, The Filter Bubble: What the Internet is
hiding from you (The Penguin Press, New York, 2011) </p>
<p dir="ltr"><span class="MsoFootnoteReference"><span class="MsoFootnoteReference">[2]</span></span> Walter Lippmann, Liberty and the News (Harcourt, Brace
and Howe, New York, 1920) available at <a href="https://archive.org/details/libertyandnews01lippgoog">https://archive.org/details/libertyandnews01lippgoog</a></p>
<p class="normal"><sup><sup>[3]</sup></sup> Walter Lippmann, Public Opinion (Harcourt, Brace and
Company, New York, 1922) available at <a href="http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html">http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html</a></p>
<p class="normal"><sup><sup>[4]</sup></sup> Walter Lippmann, The Phantom Public (Transaction
Publishers, New York, 1925)</p>
<p class="normal"><sup><sup>[5]</sup></sup> <em>Supra</em> Note
1 at 35.</p>
<p class="normal"><sup><sup>[6]</sup></sup> <em>Supra</em> Note
1 at 36.</p>
<p class="normal"><sup><sup>[7]</sup></sup> <a href="https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en">https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en</a></p>
<p class="normal"><sup><sup>[8]</sup></sup> Fenwick McKelvey, “Algorithmic Media Need Democratic
Methods: Why Publics Matter”, available at <a href="http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf">http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf</a>.</p>
<p class="normal"><sup><sup>[9]</sup></sup> <a href="http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1">http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1</a></p>
<p class="normal"><sup><sup>[10]</sup></sup> Helen Ashman, Tim Brailsford, Alexandra Cristea, Quan
Z Sheng, Craig Stewart, Elaine Torns and Vincent Wade, “The ethical and social
implications of personalization technologies for e-learning” available at <a href="http://www.sciencedirect.com/science/article/pii/S0378720614000524">http://www.sciencedirect.com/science/article/pii/S0378720614000524</a>.</p>
<p class="normal"><sup><sup>[11]</sup></sup> Sergey Brin and Lawrence Page, “The Anatomy of a
Large-Scale Hypertextual Web Search Engine” available at <a href="http://infolab.stanford.edu/pub/papers/google.pdf">http://infolab.stanford.edu/pub/papers/google.pdf</a>.</p>
<p class="normal"><sup><sup>[12]</sup></sup> Ian Rogers, “The Google Pagerank Algorithm and How It
Works” available at <a href="http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm">http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm</a>.</p>
<p class="normal"><sup><sup>[13]</sup></sup> Trygve Olson and Terry Nelson, “The Internet’s Impact
on Political Parties and Campaigns”, available at <a href="http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942">http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942</a>.</p>
<p class="normal"><sup><sup>[14]</sup></sup> Ian Witten, “Bias, privacy and personalisation on
the web”, available at <a href="http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf">http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf</a>.</p>
<p class="normal"><sup><sup>[15]</sup></sup> <em>Supra</em> Note
1 at 10.</p>
<p class="normal"><sup><sup>[16]</sup></sup> <a href="https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/">https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/</a></p>
<p class="normal"><sup><sup>[17]</sup></sup> Charles Heatwole, “Culture: A Geographical Perspective”
available at <a href="http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html">http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html</a>.</p>
<p class="normal"><sup><sup>[18]</sup></sup> <em>Supra</em> Note
1 at 10.</p>
<p class="normal"><sup><sup>[19]</sup></sup> <em>Id</em>.</p>
<p class="normal"><sup><sup>[20]</sup></sup> <em>Supra</em> Note
1 at 11.</p>
<p class="normal"><sup><sup>[21]</sup></sup> Paul Mason, “Why Israel is losing the social media
war over Gaza?” available at <a href="http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182">http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182</a>.</p>
<p class="normal"><sup><sup>[22]</sup></sup> Gilad Lotan, Israel, Gaza, War & Data: Social
Networks and the Art of Personalizing Propaganda available at <a href="http://www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html">www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html</a></p>
<p class="normal"><sup><sup>[23]</sup></sup> Thomas Poell and José van Dijck, “Social Media and
Journalistic Independence” in Media Independence: Working with Freedom or
Working for Free?, edited by James Bennett & Niki Strange. (Routledge,
London, 2015)</p>
<p class="normal"><sup><sup>[24]</sup></sup> T Gillespie, “The politics of ‘platforms,” in New
Media & Society (Volume 12, Issue 3).</p>
<p class="normal"><sup><sup>[25]</sup></sup> Pedro Domingos, The Master Algorithm: How the quest
for the ultimate learning machine will re-make the world (Basic Books, New
York, 2015) at 38.</p>
<p class="normal"><sup><sup>[26]</sup></sup> <em>Ibid</em> at 40.</p>
<p class="normal"><sup><sup>[27]</sup></sup> <em>Supra</em> Note
23.</p>
<p class="normal"><sup><sup>[28]</sup></sup> C W Anderson, Between creative and quantified
audiences: Web metrics and changing patterns of newswork in local US newsrooms,
available at <a href="https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms">https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms</a></p>
<p dir="ltr">
<sup><sup>[29]</sup></sup> <em>Supra </em>Note 23.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms'>http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms</a>
</p>
No publisher · amber · Human Rights · Big Data · Internet Governance · Machine Learning · Algorithms · New Media · 2017-01-16T07:20:52Z · Blog Entry
CIS Submission to the UN Special Rapporteur on Freedom of Speech and Expression: Surveillance Industry and Human Rights
http://editors.cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights
<b>CIS responded to the call for submissions from the UN Special Rapporteur on Freedom of Speech and Expression. The submission was on the Surveillance Industry and Human Rights.</b>
<p>CIS is grateful for the opportunity to respond to the United Nations (UN) Special Rapporteur's call for submissions on the surveillance industry and human rights. Over the last decade, CIS has worked extensively on research around state and private surveillance around the world. In this response, individuals working at CIS wish to highlight these programmes, with a special focus on India.</p>
<p>The response can be accessed <a href="https://cis-india.org/internet-governance/resources/the-surveillance-industry-and-human-rights.pdf">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights'>http://editors.cis-india.org/internet-governance/blog/cis-submission-to-the-un-special-rapporteur-on-freedom-of-speech-and-expression-surveillance-industry-and-human-rights</a>
</p>
No publisher · Elonnai Hickok, Arindrajit Basu, Gurshabad Grover, Akriti Bopanna, Shweta Mohandas, Martyna Kalvaityte · Human Rights · Internet Governance · Surveillance · 2019-02-20T10:48:24Z · Blog Entry
Centre for Internet and Society joins the Dynamic Coalition for Platform Responsibility
http://editors.cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility
<b>The Centre for Internet and Society (CIS) has joined the multistakeholder cooperative engagement towards creating Due Diligence Recommendations for online platforms and Model Contractual Provisions to be enshrined in ToS. This blog provides a brief background of the role of dynamic coalitions within the IGF structure, establishes the need for the coalition, and provides an update on the action plan and next steps for interested stakeholders.</b>
<p class="callout" style="text-align: justify; ">"Identify emerging issues, bring them to the attention of the relevant bodies and the general public, and, where appropriate, make recommendations."<br />Tunis Agenda (Para 72.g)</p>
<p style="text-align: justify; ">The first United Nations Internet Governance Forum (IGF), in 2006, saw the emergence of the concept of Dynamic Coalitions, and a number of coalitions have been established over the years. The IGF is structured to bring together multistakeholder groups to,</p>
<p class="callout" style="text-align: justify; ">"Discuss public policy issues related to key elements of Internet governance in order to foster the sustainability, robustness, security, stability and development of the Internet."<br />Tunis Agenda (Para 72.a)</p>
<p style="text-align: justify; ">While IGF workshops allow various stakeholders to jointly analyse "hot topics" or to examine the progress such issues have made since the previous IGF, dynamic coalitions are informal, issue-specific groups comprising members of various stakeholder groups. With no strictures upon the objects, structure or processes of dynamic coalitions claiming association with the IGF, no formal institutional affiliation, and no access to the resources of the IGF Secretariat, dynamic coalitions allow anyone interested to contribute to their discussions. Currently, there are eleven active dynamic coalitions at the IGF, and they can be divided into three distinct types—networks, working groups and Birds of a Feather (BoFs).</p>
<p style="text-align: justify; ">Workshops at the IGF are content-specific events that, though valuable in informing participants, are limited in their impact by being confined to the launch of a report or to the issues raised within the conference room. The coalitions, on the other hand, are expected to have a broader function, acting as a coalescing point for interested stakeholders to gather, analyse progress around identified issues and plan next steps. The coalitions can also make recommendations around issues; however, no mechanism has been developed so far by which the recommendations can be considered by the plenary body. The long-term nature of coalitions is perhaps best suited to engaging stakeholders in heterogeneous groups towards understanding and cooperating around emerging issues, and to making recommendations that inform policy making.</p>
<h3 style="text-align: justify; ">Platform Responsibility</h3>
<p style="text-align: justify; ">Social networks and other interactive online services give rise to 'cyber-spaces' where individuals gather, express their personalities and exchange information and ideas. The transnational and private nature of such platforms means that they are regulated through contractual provisions enshrined in the platforms' Terms of Service (ToS). The provisions delineated in the ToS not only extend to users irrespective of their geographical location; the private decisions undertaken by platform providers in implementing the ToS are also not subject to constitutional guarantees framed under national jurisdictions.</p>
<p style="text-align: justify; ">While ToS serve as binding agreements online, the absence of binding international rules in this area, despite the universal nature of the human rights involved, is a real challenge, and makes it necessary to engage in a multistakeholder effort to produce model contractual provisions that can be incorporated in ToS. The concept of 'platform responsibility' aims to stimulate platform providers to build intelligible and solid mechanisms, in line with the principles laid out by the UN Guiding Principles on Business and Human Rights, and to equip platform users with common and easy-to-grasp tools to guarantee the full enjoyment of their human rights online. The utilisation of model contractual provisions in ToS may prove instrumental in fostering trust in online services for content production, use and dissemination, increasing demand for such services; ultimately, consumer demand may drive the market towards human rights compliant solutions.</p>
<h3 style="text-align: justify; ">The Dynamic Coalition on Platform Responsibility</h3>
<p style="text-align: justify; ">To nurture a multistakeholder endeavour aimed at the elaboration of model contractual provisions, Mr. Luca Belli (Council of Europe / Université Paris II), Ms. Primavera De Filippi (CNRS / Berkman Center for Internet and Society) and Mr. Nicolo Zingales (Tilburg University / Center for Technology and Society Rio) initiated and facilitated the creation of the Dynamic Coalition on Platform Responsibility (DCPR). DCPR has over fifty individual and organisational members from civil society organisations, academia, private sector organisations and intergovernmental organisations, and held its first meeting at the IGF in Istanbul. The meeting began with an overview of the concept of platform responsibility, highlighting relevant initiatives that the Council of Europe, the Global Network Initiative, Ranking Digital Rights and the Center for Democracy and Technology have undertaken in this regard. Existing issues, such as the difficulty of comprehension and the lack of standardisation of redress across rights, were raised, along with the fundamental lack of due process and transparency in existing mechanisms.</p>
<p style="text-align: justify; ">Online platforms' compliance with human rights is often framed around the duty of States to protect human rights; often, Internet companies do not give sufficient consideration to the effects of their business practices on users' fundamental rights, undermining trust.</p>
<p style="text-align: justify; ">The meeting focused its efforts through a call to identify issues of process and substance, and specific rights and challenges, to be addressed by the DCPR. The procedural issues raised concerned 'responsibility' in decision-making, e.g., giving users the right to be heard and to an effective remedy before an impartial decision-making body, and obtaining their consent for changes in the contractual terms. The substantive concerns involved rights such as privacy and freedom of expression, e.g., disclosure of personal information and content removal, and the need to promote 'responsibility' by establishing concrete mechanisms to deal with such issues.</p>
<p style="text-align: justify; ">It was suggested that the concept of responsibility, including in cases of conflict between different rights, could be grounded in human rights case law, e.g., the jurisprudence of the European Court of Human Rights. It was also established that any framework evolving from this coalition would consider distinctions between users (e.g., adults, children, and people with or without continuous access to the Internet) and platforms (e.g., in terms of size and functionality).</p>
<h3 style="text-align: justify; ">Action Plan</h3>
<p style="text-align: justify; ">The participants at the DCPR meeting agreed to establish a multistakeholder cooperative engagement that will go beyond dialogue and produce concrete proposals. In particular, participants suggested developing:</p>
<ol>
<li style="text-align: justify; ">Due Diligence Recommendations: Recommendations to online platforms with regard to processes of compliance with internationally agreed human rights standards.</li>
<li style="text-align: justify; ">Model Contractual Provisions: Elaboration of a set of principles and provisions protecting platform users’ rights and guaranteeing transparent mechanisms to seek redress in case of violations.</li>
</ol>
<p style="text-align: justify; ">DCPR will ground the development of these frameworks in the preliminary step of compilation of existing projects and initiatives dealing with the analysis of ToS compatibility with human rights standards. Members, participants and interested stakeholders are invited to highlight and share relevant initiatives by 10th October regarding:</p>
<ol>
<li>Processes of due diligence for human rights compliance;</li>
<li>The evaluation of ToS compliance with human rights standards;</li>
</ol>
<p style="text-align: justify; ">Further to this compilation, a first recommendation draft regarding online platforms' due diligence will be circulated on the mailing list by 30th October 2014. CIS will be contributing to the drafting which will be led and elaborated by the DCPR coordinators. This draft will be open for comments via the DCPR mailing list until 30th November 2014 and we encourage you to sign up to the mailing list (<a class="external-link" href="http://lists.platformresponsibility.info/listinfo/dcpr">http://lists.platformresponsibility.info/listinfo/dcpr</a>).<br /><br />A second draft will be developed compiling the comments expressed via the mailing-list and shared for comments by 10 December 2014. The final version of the recommendation will be drafted by 30 December. Subsequently, the first set of model contractual provisions will be elaborated building upon such recommendation. A call for inputs will be issued in order to gather suggestions on the content of these provisions.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility'>http://editors.cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility</a>
</p>
No publisher · jyoti · Human Rights · Privacy · Internet Governance Forum · Data Protection · Terms of Service · Internet Governance · Platform Responsibility · Intermediary Liability · 2014-10-07T10:54:03Z · Blog Entry
Big Data in India: Benefits, Harms, and Human Rights - Workshop Report
http://editors.cis-india.org/internet-governance/big-data-in-india-benefits-harms-and-human-rights-a-report
<b>The Centre for Internet and Society held a one-day workshop on “Big Data in India: Benefits, Harms and Human Rights” at India Habitat Centre, New Delhi on the 1st of October, 2016. This report is a compilation of the issues discussed, ideas exchanged and challenges recognized during the workshop. The objective of the workshop was to discuss aspects of big data technologies in terms of harms, opportunities and human rights. The discussion was designed around an extensive study of current and potential future uses of big data for governance in India, which CIS has undertaken over the last year with support from the MacArthur Foundation.</b>
<p> </p>
<p><strong>Contents</strong></p>
<p><a href="#1"><strong>Big Data: Definitions and Global South Perspectives</strong></a></p>
<p><a href="#2"><strong>Aadhaar as Big Data</strong></a></p>
<p><a href="#3"><strong>Seeding</strong></a></p>
<p><a href="#4"><strong>Aadhaar and Data Security</strong></a></p>
<p><a href="#5"><strong>Aadhaar’s Relational Arrangement with Big Data Scheme</strong></a></p>
<p><a href="#6"><strong>The Myths surrounding Aadhaar</strong></a></p>
<p><a href="#7"><strong>IndiaStack and FinTech Apps</strong></a></p>
<p><a href="#8"><strong>Problems with UID</strong></a></p>
<hr />
<h2 id="1">Big Data: Definitions and Global South Perspectives</h2>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">“Big Data” has been defined by multiple scholars to date. The first consideration at the workshop was to discuss various definitions of big data, and to understand what could be considered Big Data in terms of governance, especially in the absence of academic consensus. One of the most basic definitions, given by the National Institute of Standards and Technology, USA, takes it to be data that is beyond the computational capacity of current systems. This definition has been accepted by the UIDAI in India. Another participant pointed out that Big Data is indicative not only of size but of the nature of the data, which is unstructured and continuously flowing. The Gartner definition of Big Data relies on the three Vs, i.e. Volume (the size of the data), Velocity (the speed at which data is continuously generated and collected) and Variety (the range of forms, structured and unstructured, in which data is collected).</p>
<p style="text-align: justify;" dir="ltr">The presentation also looked at ways in which Big Data differs from traditional data. It was pointed out that it can accommodate diverse unstructured datasets, and that it is ‘relational’, i.e. it needs the presence of common field(s) across datasets which allows those datasets to be conjoined. For example, the UID in India is being linked to many different datasets; they do not constitute Big Data separately, but do so together. An increasingly popular approach is to define data as “Big Data” based on what can be achieved through it: authors have described it as the ability to harness new kinds of insight that can inform decision making. It was pointed out that CIS does not subscribe to any particular definition, and is still in the process of arriving at a comprehensive definition of Big Data.</p>
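<p style="text-align: justify;" dir="ltr">The ‘relational’ property described above can be illustrated with a minimal sketch. The datasets, field names and values here are entirely hypothetical; the point is only that two separate databases that share a common identifier can be conjoined into a single profile.</p>

```python
# Minimal sketch of 'relationality': two separate datasets become
# analytically richer when joined on a shared identifier.
# All records and field names here are hypothetical examples.

ration_records = [
    {"uid": "1001", "ration_card": "RC-44", "grain_kg": 25},
    {"uid": "1002", "ration_card": "RC-97", "grain_kg": 30},
]

bank_records = [
    {"uid": "1001", "account": "SB-555", "balance": 1200},
    {"uid": "1002", "account": "SB-612", "balance": 90},
]

def join_on(common_field, *datasets):
    """Conjoin datasets that share a common field (e.g. a UID number)."""
    merged = {}
    for dataset in datasets:
        for record in dataset:
            key = record[common_field]
            merged.setdefault(key, {}).update(record)
    return merged

profiles = join_on("uid", ration_records, bank_records)
# Each UID now carries fields from every linked dataset.
print(profiles["1001"])
```

<p style="text-align: justify;" dir="ltr">Neither dataset is ‘big’ on its own; it is the shared field that makes the combined profile far more revealing than its parts.</p>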
<p style="text-align: justify;" dir="ltr">Further, the discussion touched upon the approach to Big Data in the Global South. It was pointed out that most discussions about Big Data in the Global South are about the kind of value it can have and the ways in which it can change our society. The Global North, on the other hand, has moved on to discussing the ethics and privacy issues associated with Big Data.</p>
<p style="text-align: justify;" dir="ltr">After this, the presentation focussed on case studies surrounding key Central Government initiatives and projects like Aadhaar, Predictive Policing, and Financial Technology (FinTech).</p>
<h2 id="2">Aadhaar as Big Data</h2>
<p style="text-align: justify;" dir="ltr">In presenting CIS’ case study on Aadhaar, it was pointed out that Aadhaar, with its enrollment dataset, was initially seen as Big Data in itself. However, upon careful consideration in light of the definitions discussed above, it is better seen as something that enables Big Data: the different e-governance projects within Digital India, together with Aadhaar, constitute Big Data. The case study discussed the Big Data implications of Aadhaar, and in particular looked at ‘cradle to grave’ identity mapping through various e-government projects and the datafication of transaction-generated data.</p>
<h2 id="3">Seeding</h2>
<p style="text-align: justify;" dir="ltr">Any digital identity like Aadhaar typically has three features: 1. Identification, i.e. a number or card used to identify yourself; 2. Authentication, which is based on your number or card and any other digital attributes that you might have; 3. Authorisation: as bearers of the digital identity, we can authorise service providers to take some steps on our behalf. The case study discussed ‘seeding’, which enables the Big Data aspects of Digital India. In the process of seeding, different government databases can be seeded with the UID number using a platform called Ginger. As a result, other databases can be connected to the UIDAI, and through it, data from those databases can be queried using your Aadhaar identity itself. This is an example of relationality, where fractured data is brought together. At the moment, it is not clear whether this access means that an actual physical copy of such data will be transferred to UIDAI’s servers, or whether the UIDAI will simply access it over the Internet while the data remains on the host government agency’s server. An example of even private parties becoming part of this infrastructure was raised by a participant, who pointed out that Reliance Jio is now asking for fingerprints, which can then be connected to the relational infrastructure being created by the UIDAI. The discussion then focused on how such a structure will function; it was mentioned that, as of now, it cannot be said with certainty that the UIDAI will be the agency managing this relational infrastructure in the long run, even though it is the one building it.</p>
<h2 id="4">Aadhaar and Data Security</h2>
<p style="text-align: justify;" dir="ltr">This case study also dealt with the sheer lack of data protection legislation in India, except for S.43A of the IT Act. The section does not provide adequate protection, as the constitutionality of the rules and regulations under S.43A is uncertain. More importantly, it only refers to private bodies; hence, any seeding done by the government is outside the scope of data protection legislation. Thus, at the moment, no legal framework covers the processes and structures being used for these datasets. Given the inapplicability of S.43A to public bodies, questions were raised as to the existence of a comprehensive data protection policy for government institutions. Participants answered the question in the negative. They pointed out that if a government department starts collecting data, it develops its own privacy policy; there are no set guidelines for such policies, and they do not address concerns related to consent, data minimisation and purpose limitation at all. Questions were also raised about access to and control over Big Data held by government institutions. A tentative answer from a participant was that such data will remain under the control of the domain-specific government ministry or department (for example, MNREGA data with the Ministry of Rural Development), because the focus is not on data centralisation but rather on data linking. As long as such fractured data is linked and there is an agency responsible for linking it, this data can be brought together. Such data is primarily for government agencies, but the government is opening up certain aspects of it for public consumption for research and entrepreneurial purposes. The UIDAI provides you access to your own data after payment of a minimal fee; the procedure for such access is still developing.</p>
<h2 id="5">Aadhaar’s Relational Arrangement with Big Data Scheme</h2>
<p style="text-align: justify;" dir="ltr">The various Digital India schemes brought in by the government were elucidated during the workshop. It was pointed out that these schemes extend to myriad aspects of a citizen’s daily life and cover all the essential public services like health, education etc. This makes Aadhaar imperative, even though the Supreme Court has observed that it is not mandatory for every citizen to have a unique identity number. The benefits of such identity mapping and the ecosystem being generated by it were also enumerated during the discourse. But the complete absence of any data ethics or data confidentiality principles leaves us unaware of the costs at which these benefits are being conferred on us. Apart from surveillance concerns, the knowledge gap being created between citizens and the government was also flagged. Three main benefits touted to be provided by Aadhaar were then analysed. The first is the efficient delivery of services. This appears to be an overblown claim, as Aadhaar-specific digitisation and automation does not affect the way in which employment will be provided to citizens through MNREGA or how wage payment delays will be overcome. These are administrative problems that Aadhaar and associated technologies cannot solve. The second is convenience to citizens. The fallacies in this assertion were also identified. Before the Aadhaar scheme was rolled out, ration cards were issued based on certain exclusion and inclusion criteria. Those criteria remain the same, while another hurdle in the form of Aadhaar has been created. As India is still lacking in supporting infrastructure, such as electricity and server connectivity, Aadhaar is acting as a barrier rather than making it convenient for citizens to enroll in such schemes. The third benefit is fraud management. Here, a participant pointed out that this benefit was due to digitisation in the form of GPS chips in food delivery trucks and electronic payment, and not the relational nature of Aadhaar; Aadhaar is only concerned with the linking-up, or relational, part. On deduplication, it was pointed out that various government agencies have tackled it quite successfully using technology other than biometrics, which is unreliable at the best of times.</p>
<h2 id="6">The Myths surrounding Aadhaar</h2>
<p style="text-align: justify;" dir="ltr">The discussion also reflected on the fact that Aadhaar is often considered to be a panacea that subsumes all kinds of technologies to tackle leakages. However, this does not take into account the fact that leakages happen in many ways. A system should have been built to tackle those specific kinds of leakages, but the focus is solely on Aadhaar as the cure for all. Notably, participants who have been a part of the government pointed out that this framing is misleading: Aadhaar should instead be seen as the first step towards a more digitally enhanced country, combining different technologies through one medium.</p>
<h2 id="7">IndiaStack and FinTech Apps</h2>
<h3 id="71">What is India Stack?</h3>
<p style="text-align: justify;" dir="ltr">The focus then shifted to another extremely important Big Data project, India Stack, being conceptualised and developed for the NPCI by a team of private developers called iStack. It builds on the trinity of the UID project, the Jan Dhan Yojana and mobile services to propagate and develop a cashless, presence-less, paperless and granular-consent layer, based on UID infrastructure, to digitise India.</p>
<p style="text-align: justify;" dir="ltr">A participant pointed out that the idea of India Stack is to use UID as a platform and keep stacking things on it, such that more and more applications are developed. This in turn will help us move from being a ‘data poor’ country to a ‘data rich’ one. The economic benefits of this data, though, as evidenced by the TAGUP report (a report about the creation of National Information Utilities to manage the data present with the government), accrue to corporations and not the common man. The TAGUP report openly talks about the privatisation of data.</p>
<h3 id="72">Problems with India Stack</h3>
<p style="text-align: justify;" dir="ltr">The granular consent layer of India Stack has not been developed yet, but it is proposed to be based on MIT Media Lab’s OpenPDS system. The idea is that, on the basis of the choices made by the person concerned, access to that person’s personal information may be granted to an agency like a bank. What is more revolutionary is that India Stack might even revoke this access if the person expresses a wish to do so, or if the surrounding circumstances signal to India Stack that it would be prudent to do so. It should be pointed out that the technology required for OpenPDS is extremely complex and is not available in India; moreover, it is not clear how this system would work. Apart from this, even the paperless layer has its faults and has been criticised by many since its inception, because an actual government-signed and stamped paper has been the basis of a claim. In the paperless system, you are provided a Digilocker in which all your papers are stored electronically, on the basis of your UID number. However, it was brought to light that this does not take into account those who either do not want a Digilocker or UID number, or cases where they do not have access to their digital records. How will people make claims in such cases?</p>
<h3 id="73">A Digital Post-Dated Cheque: Its Ramifications</h3>
<p style="text-align: justify;" dir="ltr">A key change that FinTech apps and the surrounding ecosystem want to make is to create a digital post-dated cheque, so as to allow individuals to get loans from their mobiles, especially in remote areas. This will potentially cut out the need to construct new banks, reducing capital expenditure while allowing credit services to grow. The direct transfer of money between UID numbers, without the involvement of banks, is a step to further help this ecosystem grow. Once an individual consents to such a system, however, automatic transfer of money from one’s bank accounts will be effected, regardless of the reason for payment. This is different from the auto debit deductions done by banks presently, as in the present system banks hold other forms of collateral as well, and automatic deduction is only effected if those other forms are defaulted upon. There is no knowledge as to whether this consent will be reversible or irreversible. As Jan Dhan Yojana accounts are zero balance accounts, the account holder could be bled dry. The implications of schemes such as “Loan in under 8 minutes” were also discussed. The advantage of such schemes is that transaction costs are reduced; the financial institution can thus grant loans for the minimum amount without any additional enquiries. It was pointed out that this new system is based on living on future income, much like the US housing bubble crash. Interestingly, in Public Distribution Systems, biometrics are insisted upon even though they disrupt the system. This can be seen as part of the larger infrastructure to ensure that digital post-dated cheques become a success.</p>
<h3 id="74">The Role of FinTech Apps</h3>
<p style="text-align: justify;" dir="ltr">FinTech ‘apps’ are being presented with the aim of propagating financial inclusion. The Technology Advisory Group for Unique Projects (TAGUP) report stated that, as managing such information sources is a big task, National Information Utilities (NIUs) should be set up for data sources, just like electricity utilities. These NIUs, as per the report, will follow a fee-based model, charging for their services to government schemes. The report identified two key NIUs, namely the National Payments Corporation of India (NPCI) and the Goods and Services Tax Network (GSTN). The key use that FinTech applications will serve is credit scoring. Traditional credit scoring data sources comprised only a thin file of records for an individual, but the data that FinTech apps collect (a person’s UID number, mobile number and bank account number, all linked up) allows for a far more comprehensive credit rating. Government departments are willing to share this data with FinTech apps as they get analysis in return. Thus, by using UID and the varied data sources that have been linked together through it, a ‘thick file’ is now being created by FinTech apps. Banking apps have not yet gone down the route of FinTech apps in utilising Big Data for credit scoring purposes.</p>
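<p style="text-align: justify;" dir="ltr">The ‘thin file’ versus ‘thick file’ distinction can be sketched in a few lines. The fields, weights and scoring function below are invented for illustration and do not reflect any actual FinTech scoring model; the point is only that linking more data sources gives the scorer more signals per person.</p>

```python
# Illustrative sketch of 'thin file' vs 'thick file' credit scoring.
# All fields and values below are hypothetical assumptions, not any
# real scoring model.

thin_file = {"repayment_history": 0.6}  # the only traditional signal

thick_file = {
    "repayment_history": 0.6,
    "mobile_recharge_regularity": 0.8,  # from telecom data linked via UID
    "bank_balance_stability": 0.7,      # from bank data linked via UID
}

def credit_score(profile):
    """Average the available signals; more linked data means more signals."""
    return sum(profile.values()) / len(profile)

print(credit_score(thin_file))   # score based on a single signal
print(credit_score(thick_file))  # score based on several linked signals
```

<p style="text-align: justify;" dir="ltr">A thin file makes the score hostage to one signal, while a linked thick file both smooths the score and, as the discussion below notes, penalises anyone who refuses to share a data source.</p>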
<p style="text-align: justify;" dir="ltr"> </p>
<p style="text-align: justify;" dir="ltr">The two main problems with such apps are that there is no uniform way of credit scoring, which distorts the rate of interest a person has to pay, and that the consent layer adds another layer of complication: refusal to share mobile data with a FinTech app may lead to the app declaring one a risky investment, thus subjecting that individual to a higher rate of interest.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<h3 id="75">Regulation of FinTech Apps and the UID Infrastructure</h3>
<p style="text-align: justify;" dir="ltr">India Stack and the applications being built on it generate a lot of transaction metadata that is very intimate in nature. The privacy provisions of the UID legislation do not cover such data, and the granular consent layer that has been touted to cover it has yet to come into existence. Moreover, Big Data is based on the sharing and linking of data; here, privacy concerns and Big Data objectives clash, as Big Data by its very nature challenges privacy principles like data minimisation and purpose limitation. The need for regulation to cover the various new apps and infrastructure being developed was pointed out.</p>
<h2 id="8">Problems with UID</h2>
<p style="text-align: justify;" dir="ltr">It has been observed that any problem with Aadhaar is usually labelled a teething problem, with the claim that it will be solved in the next 10 years. But this raises the question: why is the system online right now?</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">Aadhaar is essentially a new data condition and a new exclusion or inclusion criterion. In Rajasthan, after the introduction of biometric Point of Service (POS) machines at ration shops, data-driven exclusion was found to affect 45% of the population availing PDS services. This number also includes those who were excluded from the database by being included in the wrong dataset. There is no information to tell us how many actual duplicates, and how many genuine ration card holders, were weeded out or excluded by POS.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">It was also mentioned that any attempt to question Aadhaar is considered an attempt to go back to the manual system, and this binary thinking needs to change. Big Data has the potential to benefit people, as has been evidenced by the scholarship and pension portals. However, Big Data’s problems arise in systems like the PDS, where there is centralised exclusion at the level of the cloud. Moreover, the quantity problem present in the PDS and MNREGA systems persists: people can still receive less grain or lower wages even with the analysis of biometrics, which proves that there are better technologies to tackle these problems. Presently, accountability mechanisms are being weakened, as the poor do not know where to go for redressal, and there are no mechanisms to check whether the people excluded are duplicates or not. At the time of UID enrollment, out of 90 crore applicants, 9 crore were rejected. There was no feedback or follow-up mechanism to figure out why people were being rejected; it was just assumed that they might have been duplicates.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">Another problem is the rolling out of software without checking for inefficiencies or problems at a beta testing phase. The control of developers over this software is so extensive that it can be changed easily, without any accountability. The decision-making components of the software are all proprietary, like the de-duplication algorithm being used by the UIDAI. This leads to a loss of accountability, because the system itself is in flux, none of it is in the public domain, and there are no means to analyse it in a transparent fashion.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">These schemes are also being pushed through due to database politics. In a field study of the NPR, another Big Data scheme, it was found that you are assumed to be an alien if you do not have the documents to prove that you are a citizen. Hence, unless you fulfill certain conditions of a database, you are excluded and are not eligible for the benefits that being on the database affords you.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<h3>Why is the private sector pushing for UIDAI and the surrounding ecosystem?</h3>
<p style="text-align: justify;" dir="ltr">Financial institutions stand to gain from encouraging the UID, as it encourages the credit culture and reduces transaction costs. Another advantage for the private sector is perhaps the more obvious one: it allows for efficient marketing of products and services.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">The above-mentioned fears and challenges were actually observed on the ground, as shown through a case study in West Bengal on the smart meters being installed there by the state electricity utility. While the data coming in from these smart meters is being used to ensure that a more efficient system is developed, it is also being used as a surrogate for income mapping on the basis of the electricity bills being paid. This helps companies profile neighbourhoods. The technical officer who first receives the data has complete control over it and can easily misuse it. This case study again shows that instruments like Aadhaar and India Stack are limited in their application and are not the panacea they are portrayed to be.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">A participant pointed out that, in light of the above discussions, the aim appears to be to get all kinds of data through any source, link all of it to the UID number, and then use it in the various corporate schemes being started. Most of the problems associated with Big Data are being described as teething problems. India Stack and the FinTech schemes are coming in when we already know about the problems being faced by UID; the same problems will be faced by India Stack as well.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<h3>Can you opt out of the Aadhaar system and the surrounding ecosystem?</h3>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">The discussion then turned towards whether there can be voluntary opting out of Aadhaar. It was pointed out that the government has stated that you cannot opt out of Aadhaar. Further, the privacy principles in the UIDAI bill are ambiguously worded, and individuals only have recourse for basic things like the correction of their personal information. The enforcement mechanism in the UIDAI Act is also severely deficient: there is no notification procedure if a data breach occurs, and the appellate body, the Cyber Appellate Tribunal, has not been set up in three years.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<h2>CCTNS: Big Data and its Predictive Uses</h2>
<div style="text-align: justify;" dir="ltr"> </div>
<h3>What is Predictive Policing?</h3>
<p style="text-align: justify;" dir="ltr">The next big Big Data case study was on the Crime and Criminal Tracking Network & Systems (CCTNS). Originally, it was supposed to be a digitisation and interconnection scheme under which police records would be digitised and police stations across the length and breadth of the country would be interconnected. But in the last few years, some police departments, such as those of Chandigarh, Delhi and Jharkhand, have mooted the idea of moving on to predictive policing techniques. This envisages the use of existing statistical and actuarial techniques along with many other types of data. It works in four ways: 1. predicting the places and times at which crimes might occur; 2. predicting potential future offenders; 3. creating profiles of past crimes in order to predict future crimes; and 4. predicting groups of individuals who are likely to be victims of future crimes.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<h3>How is Predictive Policing done?</h3>
<p style="text-align: justify;" dir="ltr">To achieve this, the following process is followed: 1. data collection from various sources, including structured data like FIRs and unstructured data like call detail records, neighbourhood data, seasonal crime patterns, etc.; 2. analysis, using approaches like the near-repeat theory and regression models based on risk factors; and 3. intervention based on the analysis.</p>
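<p style="text-align: justify;" dir="ltr">The simplest version of the prediction step above is hotspot identification: aggregate past incidents by location and flag the most frequent locations for intervention. The incident records below are invented for illustration and do not come from any real CCTNS data.</p>

```python
# Hedged sketch of the hotspot step of predictive policing: count past
# incidents per location and flag the most frequent ones for intervention.
# Incident data is invented for illustration.
from collections import Counter

incidents = [
    {"location": "ward_3", "type": "theft"},
    {"location": "ward_3", "type": "burglary"},
    {"location": "ward_7", "type": "theft"},
    {"location": "ward_3", "type": "theft"},
]

def predict_hotspots(incidents, top_n=1):
    """Return the top_n locations with the most recorded incidents."""
    counts = Counter(i["location"] for i in incidents)
    return [loc for loc, _ in counts.most_common(top_n)]

hotspots = predict_hotspots(incidents)
print(hotspots)  # the ward with the most recorded incidents
```

<p style="text-align: justify;" dir="ltr">Note that the model only sees <em>recorded</em> incidents, which is precisely why the quality and bias of the input data, discussed below, dominates the quality of the prediction.</p>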
<div style="text-align: justify;" dir="ltr"> </div>
<h3>Flaws in Predictive Policing and questions of bias</h3>
<p style="text-align: justify;" dir="ltr">An obvious weak point in the system is that if the initial data going into the system is wrong or biased, the analysis will also be wrong. Efforts are being made to detect such biases; an important way to do so will be to build data collection practices into the system that protect its accuracy. The historical data being entered into the system carries forward prejudices inherited from the British Raj, along with biases based on religion, caste, socio-economic background, etc.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">One participant brought up the issue of haphazard data digitisation in police stations, and the impact of such unreliable data on a Big Data system. This, coupled with the paucity of data, is bound to lead to arbitrary results. An effective example was that of black neighbourhoods in the USA: these are considered problematic and are thus policed more, leading to a higher recorded crime rate, as residents are arrested for doing things that white people in affluent neighbourhoods get away with. This in turn further perpetuates the recorded crime rate, and it becomes a self-fulfilling prophecy. In India, such a phenomenon might easily develop in the case of migrants, de-notified tribes, Muslims, etc. A counter-view on bias and discrimination was offered here. One participant pointed out that haphazard or poor-quality data is not a colossal issue, as private companies are willing to fill this void, and are actually doing so, in exchange for access to the raw data. It was also pointed out that bias is being used as an all-encompassing term. There are multiplicities of biases, and while analysing the data, care should be taken to keep in mind that one person’s bias and analysis might, and usually does, differ from another’s. Even after a computer has analysed the data, the data still falls into human hands for implementation.</p>
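<p style="text-align: justify;" dir="ltr">The self-fulfilling prophecy described above can be made concrete with a toy simulation. All numbers here are illustrative assumptions: two areas have identical underlying offence rates, but one starts out with more patrols, records more arrests, and therefore keeps attracting patrols.</p>

```python
# Toy simulation of the biased-policing feedback loop: equal true
# offence rates, unequal patrol allocation. Recorded crime tracks
# patrol presence, not actual offending. All numbers are invented.

true_offence_rate = {"area_A": 0.10, "area_B": 0.10}  # identical rates
patrol_share = {"area_A": 0.8, "area_B": 0.2}          # biased starting point

recorded = {"area_A": 0, "area_B": 0}
for _ in range(5):  # five patrol cycles
    for area in recorded:
        # arrests recorded are proportional to patrol presence,
        # not to the (equal) underlying offence rate
        recorded[area] += round(100 * patrol_share[area] * true_offence_rate[area])
    total = sum(recorded.values())
    # patrols are reallocated toward where more crime was *recorded*
    patrol_share = {a: recorded[a] / total for a in recorded}

print(recorded)  # area_A accumulates far more recorded crime than area_B
```

<p style="text-align: justify;" dir="ltr">Although both areas offend at the same rate, the data-driven reallocation locks in the initial bias: the heavily patrolled area ends up with several times the recorded crime of the other, and the statistics then appear to justify the patrol pattern.</p>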
<p style="text-align: justify;" dir="ltr">The issue of such databases being used to target particular communities on the basis of religion, race, caste, ethnicity and other parameters was raised. Questions about control and analysis of data were also discussed, i.e. whether analysis will be top-down, conducted in state capitals, or carried out at the village and thana levels as well. It was pointed out that this could play a major role in the success of the system and in the possible persecutory treatment of citizens, as policemen at these different levels will have different perceptions of what the data is saying. It was further pointed out that, at the moment, there is no clarity on the mode of implementation of Big Data policing systems. Police in the USA have been seen to rely on Big Data so much that they become ‘data myopic’. For those who are on the bad side of Big Data, laws like preventive detention can, in the Indian context, be heavily misused. There is a very high chance that predictive policing, due to the inherent biases in the system and the prejudices and inefficiency of the legal system, will further suppress already-targeted sections of society. A counterpoint was raised, suggesting that, contrary to our fears, CCTNS might lead to changes in our understanding and help us overcome longstanding biases.</p>
<h3>Open Knowledge Architecture as a solution to Big Data biases?</h3>
<p style="text-align: justify;" dir="ltr">The conference then mulled over the use of an ‘Open Knowledge’ architecture, to see whether, given enough eyes, it can rid Big Data of its biases and inaccuracies. It was pointed out that Open Knowledge itself cannot provide foolproof protection against these biases, as the people who make up the eyes are predominantly male and drawn from the affluent sections of society, and themselves suffer from these biases.</p>
<h3>Who exactly is Big Data supposed to serve?</h3>
<p style="text-align: justify;" dir="ltr">The discussion also looked at questions such as: who is this data for? The Janata Information System (JIS) is a concept developed by the MKSS in which the data collected and generated by the government is taken to be for the common citizens. For example, MNREGA data should be used to serve the purposes of the labourers. The raw data, as available at the moment, usually cannot be used by the common man, as it is vast and full of information that is of no use to them. It was pointed out that, when Big Data was used for policy planning purposes, the actual string of information that turned out to be needed was very small, but the task of unravelling the data for civil society purposes is humongous. By presenting the data in the right manner, the individual can be empowered; the importance of data presentation was also flagged. It was agreed that the content of the data should be for the labourer and not an MNC, as the MNC has the capability to utilise the raw data on its own regardless.</p>
<h3>Concerns about Big Data usage</h3>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Participants pointed out that privacy concerns are usually swept under the carpet, due to a belief that the law is sufficient or that the privacy battle has already been lost.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">In the absence of knowledge of domain and context, Big Data analysis is quite limited. Big Data’s accuracy and potential to solve problems needs to be factually backed.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">The narrative of Big Data often rests on the assumption that descriptive statistics supplant inferential statistics, thus eliminating the need for domain-specific knowledge. It is claimed that the data is so big that it will describe everything we need to know.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Big Data is creating a shift from a deductive model of scientific rigour to an inductive one. In response to this, a participant offered the idea that troves of good data allow us to frame informed questions on the basis of which the deductive model can be formed; a hybrid approach combining the deductive and the inductive might serve us best.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">The need to collect the right data, in the correct format and in the right place, was also expressed.</p>
</li></ol>
<div style="text-align: justify;" dir="ltr"> </div>
<h2>Potential Research Questions &amp; Participants’ Areas of Research</h2>
<p style="text-align: justify;" dir="ltr">Following this discussion, participants brainstormed to come up with potential areas of research and research questions. They have been captured below:</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">Big Data, Aadhaar and India Stack:</p>
<div style="text-align: justify;" dir="ltr"> </div>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Has Aadhaar been able to tackle illegal ways of claiming services or are local negotiations and other methods still prevalent?</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Is the consent layer of India Stack being developed in a way that gives the UID holder an opportunity to give informed consent? OpenPDS and its EU counterpart, the MyData framework, were designed for countries with strong privacy laws. Importantly, they were meant for information shared on social media, not for an individual’s health or credit history. India is using the model in a completely different sphere, without strong data protection laws. What were the granular consent layer structures in the West designed for, and what were they supposed to protect?</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">The question of ownership of data needs to be studied, especially in the context of a globalised world where MNCs are collecting copious amounts of data about Indian citizens. What is the interaction of private parties in this regard?</p>
</li></ol>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">Big Data and Predictive Policing:</p>
<div style="text-align: justify;" dir="ltr"> </div>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">How are inequalities being created through Big Data systems? Lessons should be taken from the Western experience with predictive policing and other Big Data techniques: they tend to perpetuate biases already ingrained in the system.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">It was also pointed out that while studying these topics, and anything related to technology generally, we become aware of a divide between the computational sciences and the social sciences. This divide needs to be bridged if Big Data, or any kind of data, is to be used effectively. There should be cross-pollination between different groups of academics; an example is the computational social science departments that have come up in the last 3-4 years.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Why are so many of the interim promises made by Big Data failing? This phenomenon needs to be studied from a social science perspective, which will allow one to look at it from a different angle.</p>
</li></ol>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">Studying Big Data:</p>
<div style="text-align: justify;" dir="ltr"> </div>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">What is the historical context of the terms of reference being used for Big Data? The current Big Data debate in India is based on parameters set by the West. For a better understanding of Big Data, it was suggested that P.C. Mahalanobis’s experience conducting the Indian census (the Big Data of its time) could be examined for a historical perspective. This comparison might allow us to discover questions that are important in the Indian context. It was also suggested that rather than using ‘Big Data’ as a catchphrase for these new technological innovations, we need to be more discerning.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">What ideological aspects must be considered while studying Big Data? What does the dialectical promise of technology mean? It was contended that every time there is a shift in technology, the zeitgeist of the period becomes extremely excited and claims arise that the new technology will solve everything. There is a need to study this dialectical promise and the social promise surrounding it.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Apart from the legitimate fear that Big Data might lead to exclusion, in what ways might it improve inclusion too?</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">The diminishing barrier between the public and the private self, a tangent to the larger public-private debate, was also mentioned.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">How does one distinguish between technology failure and process failure while studying Big Data? </p>
</li></ol>
<div style="text-align: justify;" dir="ltr"> </div>
<h2>Big Data: A Friend?</h2>
<p style="text-align: justify;" dir="ltr">In the concluding session, it was acknowledged that the Big Data moment cannot be wished away. The use of analytics and predictive modelling by the private sector is now commonplace, and India has moved towards a database state through UID and Digital India. A nuanced debate, one that does away with the false binary of being either a Big Data enthusiast or a Luddite, is crucial.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">A participant offered two approaches to solving a Big Data problem. The first was a Big Data due process framework, which holds that if a decision impacting the rights of a citizen has been taken, that decision must be open to cross-examination. The efficacy and practicality of such an approach are still not clear. The second, slightly paternalistic in nature, would solve Big Data problems at the data science level itself. This is much like the affirmative algorithmic approach, which says that if the data for a minority community is missing or under-represented in a dataset, it should be artificially introduced. It was also suggested that carefully calibrated free market competition could be used to regulate Big Data: for example, a private digital wallet company that charges more but does not share user data at all.</p>
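<p style="text-align: justify;" dir="ltr">As a rough illustration of the affirmative algorithmic approach described above, the sketch below oversamples an under-represented group until it is as large as the biggest one. The records, field names, and duplication strategy are all hypothetical: this is one naive way to rebalance a dataset, not a method endorsed at the workshop.</p>

```python
import random

# Hypothetical records; the "community" field marks the group each row belongs to.
records = [
    {"community": "majority", "income": 52000},
    {"community": "majority", "income": 61000},
    {"community": "majority", "income": 48000},
    {"community": "minority", "income": 45000},
]

def rebalance(rows, key, seed=0):
    """Oversample smaller groups (by duplicating random rows) until every
    group is as large as the largest one."""
    rng = random.Random(seed)
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row)
    largest = max(len(g) for g in groups.values())
    balanced = []
    for g in groups.values():
        balanced.extend(g)
        # Close the gap by copying randomly chosen rows from this group.
        balanced.extend(rng.choice(g) for _ in range(largest - len(g)))
    return balanced

balanced = rebalance(records, key="community")
counts = {}
for row in balanced:
    counts[row["community"]] = counts.get(row["community"], 0) + 1
print(counts)  # {'majority': 3, 'minority': 3}
```

<p style="text-align: justify;" dir="ltr">Naive duplication like this amplifies whatever noise the few minority records carry, which hints at why the approach was called paternalistic; more careful techniques synthesise new rows rather than copying existing ones.</p>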
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">Another important observation was the need to understand Big Data in a Global South context and to account for the unique challenges that arise there. While the convenience of Big Data is promising, its actual manifestation depends on external factors, such as connectivity and the availability of accurate, adequate data, that must be studied in the Global South.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p style="text-align: justify;" dir="ltr">While the promises of Big Data are encouraging, it is also important to examine its impacts and its interaction with people's rights. Regulatory solutions that mitigate the harms of Big Data while also reaping its benefits need to evolve.</p>
<div style="text-align: justify;" dir="ltr"> </div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/big-data-in-india-benefits-harms-and-human-rights-a-report'>http://editors.cis-india.org/internet-governance/big-data-in-india-benefits-harms-and-human-rights-a-report</a>
</p>
By Vidushi Marda, Akash Deep Singh and Geethanjali Jujjavarapu. Tags: Human Rights, UID, Big Data, Privacy, Artificial Intelligence, Internet Governance, Machine Learning, Featured, Digital India, Aadhaar, Information Technology, E-Governance. Published: 2016-11-18T12:58:19Z. Blog Entry.