<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="http://editors.cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>http://editors.cis-india.org</link>
  
  <description>These are the search results for the query, showing results 21 to 35.</description>
  
  
  
  
  <image rdf:resource="http://editors.cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/openness/design-public-conclave-6th-edition"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/files/data-protection-submission"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/files/data-for-the-benefit-of-people"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/counter-comments-on-trais-consultation-paper-on-privacy-security-and-ownership-of-data-in-telecom-sector"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report">
    <title>Privacy after Big Data - Workshop Report</title>
    <link>http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS) and the Sarai programme, CSDS, organised a workshop on 'Privacy after Big Data: What Changes? What should Change?' on Saturday, November 12, 2016 at Centre for the Study of Developing Societies in New Delhi. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;This workshop aimed to build a dialogue around some of the key government-led big data initiatives in India and elsewhere that are contributing significant new challenges and concerns to the ongoing debates on the right to privacy. It was an open event.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this age of big data, discussions about privacy are intertwined with the use of technology and the data deluge. Though big data possesses enormous value for driving innovation and contributing to productivity and efficiency, privacy concerns have gained significance in the dialogue around regulated use of data and the means by which individual privacy might be compromised through means such as surveillance, or protected. The tremendous opportunities big data creates in varied sectors ranges from financial technology, governance, education, health, welfare schemes, smart cities to name a few. With the UID project re-animating the Right to Privacy debate in India, and the financial technology ecosystem growing rapidly, striking a balance between benefits of big data and privacy concerns is a critical policy question that demands public dialogue and research to inform an evidence based decision. Also, with the advent of potential big data initiatives like the ambitious Smart Cities Mission under the Digital India Scheme, which would rely on harvesting large data sets and the use of analytics in city subsystems to make public utilities and services efficient, the tasks of ensuring data security on one hand and protecting individual privacy on the other become harder.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This workshop sought to discuss some of the emerging problems due to the advent of big data and possible ways to address these problems. The workshop began with Amber Sinha of CIS and Sandeep Mertia of Sarai introducing the topic of big data and implications for privacy. Both speakers tried to define big data and brief history of the evolution of the term and raised questions about how we understand it. Dr. Usha Ramanathan spoke on the right to privacy in the context of the ongoing Aadhaar case and Vipul Kharbanda introduced the concept of Habeas Data as a possible solution to the privacy problems posed by big data.  Amelia Andersotter discussed national centralised digital ID systems and their evolution in Europe, often operating at a cross-functional scale, and highlighted its implications for discussions on data protection, welfare governance, and exclusion from public and private services. Srikanth Lakshmanan spoke of the issues with technology and privacy, and possible technological solutions.  Dr. Anupam Saraph discussed the rise of digital banking and Aadhaar based payments and its potential use for corrupt practices. Astha Kapoor of Microsave spoke about her experience of implementation of digital money solution in rural India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Post lunch, Dr. Anja Kovacs and Mathew Rice spoke on the rise of mass communication surveillance across the world, and the evolving challenges of regulating surveillance by government agencies. Mathew also spoke of privacy movements by citizens and civil society in regions. In the final speaking session, Apar Gupta and Kritika Bhardwaj traced the history of jurisprudence on the right to privacy and the existing regulations and procedures. In the final session, the participants discussed various possible solutions to privacy threats from big data and identity projects including better regulation, new approached such as harms based regulation and privacy risk assessments, and conceiving privacy as a horizontal right. The workshop ended with vote of thanks from the organizers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The agenda for the event can be accessed &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS-Sarai_PrivacyAfterBigData_ConceptAgenda.pdf"&gt;here&lt;/a&gt;, and the transcript is available &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/privacy-after-big-data/"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report'&gt;http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-01-27T01:09:17Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-">
    <title>New Recommendations to Regulate Online Hate Speech Could Pose More Problems Than Solutions</title>
    <link>http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-</link>
    <description>
        &lt;b&gt;The T.K. Viswanathan committee’s recommendations could prove to be dangerous for free speech if acted upon without resolving their flaws.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published by &lt;a class="external-link" href="https://thewire.in/187381/new-recommendations-regulate-online-hate-speech-problems/"&gt;Wire&lt;/a&gt; on October 14, 2017&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a title="It was reported last week" href="https://thewire.in/184920/post-section-66a-central-panel-tells-government-to-amend-ipc-crpc-it-act-to-punish-online-hate-speech/" rel="noopener
        noreferrer" target="_blank"&gt;&lt;span&gt;It was reported last week&lt;/span&gt;&lt;/a&gt; that an expert       committee headed by T.K. Viswanathan, former secretary general of       Lok Sabha, recommended that the Indian Penal Code (IPC), the Code       of Criminal Procedure and the Information Technology Act be       amended to include stringent penal provisions regarding online       hate speech. While this report has not been made public, &lt;a title="the Indian
        Express reported" href="http://indianexpress.com/article/india/hate-speech-online-punishment-supreme-court-section-66a-information-technology-act-narendra-modi-4876648/" rel="external nofollow" target="_blank"&gt;&lt;span&gt;the&lt;em&gt; Indian Express&lt;/em&gt; reported&lt;/span&gt;&lt;/a&gt; that       the committee’s recommendations include, among other things,       insertion and expansion of penal provisions in the IPC on       ‘incitement to hatred’ (Section 153C) and ‘causing fear, alarm or       provocation of violence’ (Section 505A) to include online speech,       and creation of the offices of state cyber crime coordinator and       district cyber crime cell.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Online hate speech has been among the more complex issues with       regard to the regulation of technology. The complexity of       restricting hate speech has to do with a number of factors,       including the ubiquity of strong opinions in online speech, often       offensive to certain groups, the interplay between individual and       group rights, and the tensions between the values of dignity,       liberty and equality. Siddharth Narrain has &lt;a title="pointed out" href="http://jmi.ac.in/upload/menuupload/16_ccmg_epwsedition.pdf" rel="external nofollow" target="_blank"&gt;&lt;span&gt;pointed         out&lt;/span&gt;&lt;/a&gt; in his thesis on hate speech law that the use of law to       curb offensive or hurtful speech has been done by religious       groups, caste based groups, occupation based groups with strong       caste associations, language groups and gender based groups. The       range of actions arising from such uses of the law include the       banning of books, criminal proceedings for political satire, or       even ‘liking’ political posts on social media.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The relationship between speech acts and acts of violence is a       complicated issue with little consensus on appropriate ways to       regulate it. Scholars such as Jonathan Maynard have advocated       greater reliance on non-legal responses such as counter speech, as       the use of criminal law to tackle speech often has the effect of       chilling forms of dissent. The f&lt;span&gt;&lt;span&gt;ormulation and application of legal           tests in criminal law with respect to hate speech is also hard           as hate speech has much to do with the content of speech as it           has to do with the context, including factors such as power           structures.&lt;/span&gt; &lt;span&gt;Speech by a           figure in a position of power also has a greater likelihood to           result in a call for violence. &lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Before looking at the specific recommendations made by the T.K.       Viswanathan committee, it would be worthwhile to also look at the       background of this committee. The committee notes with approval       the &lt;a title="Law Commission of
        India’s 267th report on the issue of hate speech" href="http://lawcommissionofindia.nic.in/reports/Report267.pdf" rel="external nofollow" target="_blank"&gt;&lt;span&gt;Law Commission         of India’s 267th report on the issue of hate speech&lt;/span&gt;&lt;/a&gt;. The Law       Commission, in turn, was acting at the behest of observations made       by the Supreme Court in &lt;a title="Pravasi Bhalai
        Sangathan v. Union of India" href="https://indiankanoon.org/docfragment/61854231/?formInput=ramesh%20union%20india%20" rel="external nofollow" target="_blank"&gt;&lt;span&gt;&lt;i&gt;Pravasi Bhalai Sangathan&lt;/i&gt; v.         &lt;i&gt;Union of India&lt;/i&gt;&lt;/span&gt;&lt;/a&gt; in 2014. In this case, the Supreme       Court exhibited judicial restraint and refused to frame guidelines       prohibiting political hate speech, and had instead requested the       Law Commission to look into it. However, the court noted with       approval international case law on the issues, particularly the       observations in the Canadian case &lt;a title="Saskatchewan v. Whatcott" href="https://scc-csc.lexum.com/scc-csc/scc-csc/en/item/12876/index.do" rel="external nofollow" target="_blank"&gt;&lt;span&gt;&lt;i&gt;Saskatchewan&lt;/i&gt; v. &lt;i&gt;Whatcott&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;.       Relying on &lt;i&gt;Whatcott&lt;/i&gt;, the Supreme Court provides a       definition of hate speech that includes the following statements:&lt;/p&gt;
&lt;blockquote style="text-align: justify; "&gt;
&lt;p&gt;“Hate speech is an effort to marginalise individuals based on their membership in a group. Using expression that exposes the group to hatred, hate speech seeks to delegitimise group members in the eyes of the majority, reducing their social standing and acceptance within society. Hate speech, therefore, rises beyond causing distress to individual group members...[and] lays the groundwork for later, broad attacks on vulnerable groups that can range from discrimination, to ostracism, segregation, deportation, violence and, in the most extreme cases, to genocide. Hate speech also impacts a protected group’s ability to respond to the substantive ideas under debate, thereby placing a serious barrier to their full participation in our democracy.”&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;Thus, it is evident that the Supreme Court itself clearly states       that hate speech must be viewed through the lens of the right to       equality, and relates to speech not merely offensive or hurtful to       specific individuals, but also inciting discrimination or violence       on the basis of inclusion of individuals within certain groups. It       is important to note that it is the consequence of speech that is       the determinative factor in interpreting hate speech, more so than       even perhaps the content of the speech. This is also broadly       reflected in the Law Commission’s report that identifies the       status of the author of the speech, the status of victims of the       speech, the potential impact of the speech and whether it amounts       to incitement as key identifying criteria of hate speech.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, in the commission’s recommendations, these principles       are not fairly represented in the suggested new Sections 153C and       505A, as per a &lt;a title="draft released" href="https://internetfreedom.in/government-committee-wants-to-bring-back-section-66a/" rel="external nofollow" target="_blank"&gt;&lt;span&gt;draft         released&lt;/span&gt;&lt;/a&gt; by the Internet Freedom Foundation. Section 505A,       for instance, refers to “highly disparaging, indecent, abusive,       inflammatory, false or grossly offensive information” and       “derogatory information.” These are extremely broad terms, not       having any guiding jurisprudence within Indian or international       law, which may be helpful in restrictively interpreting them. It       is important to note the similarities between this provision and       the repealed Section 66A of the Information Technology Act, which       sought to criminalise speech that was “grossly offensive,” having       “menacing character,” or “causing       annoyance..danger..insult..enmity, hatred or ill will.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These terms in the recommended Section 505A also run foul of the       observations of Justice Nariman in &lt;em&gt;&lt;a title="Shreya
          Singhal v. Union of India" href="https://cis-india.org/internet-governance/blog/shreya-singhal-judgment.pdf" rel="external nofollow" target="_blank"&gt;&lt;span&gt;Shreya Singhal v. Union of India&lt;/span&gt;&lt;/a&gt;,&lt;/em&gt; where       he took exception to the nature of the terms in Section 66A by       stating that, “Information that may be grossly offensive or which       causes annoyance or inconvenience are undefined terms which take       into the net a very large amount of protected and innocent       speech.” While these terms are somewhat tempered in this provision       with a requirement to show intent to “cause fear of injury or       alarm,” they remain exceedingly broad and contrary to the       requirement that restrictions on speech must be couched in the       narrowest possible terms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The T.K. Viswanathan committee, in addition, seeks to bring,       within the scope of the prospective Sections 153C and 505A,       electronic speech. As per its recommendations, ‘means of       communication’ would include “any words either spoken or written,       signs, visible representations, information, audio, video or       combination of both transmitted, retransmitted or sent through any       telecommunication service, communication device or computer       resource.” This could have the impact of bringing in a provision       that has some similar effects as that of the now defunct Section       66A of the Information Technology Act. The lack of regard for the       Supreme Court’s observations on hate speech, the need to look at       it through the lens of equality and the over-broadness of       restrictions on speech are likely to be dangerous for free speech       if the recommendations of this committee are acted upon.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-'&gt;http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Hate Speech</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-01-02T03:06:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms">
    <title>New Media, personalisation and the role of algorithms</title>
    <link>http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms</link>
    <description>
        &lt;b&gt;In his much acclaimed book, The Filter Bubble, Eli Pariser explains how the personalisation of services on the web works and laments that it is creating individual bubbles for each user, which run counter to the idea of the Internet as an inherently open place. While Pariser’s book looks at the practices of various large companies providing online services, he briefly touches upon the role of new media such as search engines and social media portals in news curation. Building upon Pariser’s unexplored argument, this article looks at the impact of algorithmic decision-making and Big Data in the context of news reporting and curation.&lt;/b&gt;
&lt;blockquote&gt;
&lt;div&gt;
&lt;div&gt;&lt;em&gt;Everything which bars freedom and fullness of communication sets up barriers that divide human beings into sets and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life. &lt;/em&gt;—John Dewey&lt;/div&gt;
&lt;/div&gt;
&lt;/blockquote&gt;
&lt;p&gt;Eli Pariser, in his book, The Filter Bubble,[1] refers to the scholarship of Walter Lippmann and John Dewey as integral to the evolution of the understanding of the democratic and ethical duties of the Fourth Estate. Lippmann was disillusioned by the role of newspapers in propaganda for the First World War. He responded with three books in quick succession: Liberty and the News,[2] Public Opinion[3] and The Phantom Public.[4] Lippmann brought attention to the fact that the process of news reporting was conducted through privately determined and unexamined standards. The failure of the Fourth Estate to perform its democratic functions was, in the opinion of Lippmann, one of the prime factors responsible for the public not being an informed and rational entity. John Dewey, while rejecting Lippmann’s argument that matters of public policy can only be determined by inside experts with training and education, did acknowledge his critique of the media.&lt;/p&gt;
&lt;p&gt;Pariser points to the creation of a wall between editorial decision-making and advertiser interests as the eventual result of the Lippmann and Dewey debate. While accepting that this division between the financial and reporting sides of media houses has not always been observed, Pariser emphasises that the fact that the standard exists is important.[5] Unlike traditional media, the new media, which relies on algorithmic decision-making for personalisation, is not subject to the same standards that try to mitigate the influence of commercial interests on editorial decisions, while performing many of the same functions as the traditional media.[6]&lt;/p&gt;
&lt;h3&gt;How personalisation algorithms work&lt;/h3&gt;
&lt;p dir="ltr"&gt;Kevin Slavin, at his famous talk in the TEDGLobal Conference, characterised algorithms as “maths that computers use to decide stuff” and that it was infiltrating every aspect of our lives.[7] According to Slavin’s view, algorithms can be seen as control technologies and shape our world constantly through media and information systems, dynamically modifying content and function through these programmed routines. Search engines and social media platforms perpetually rank user-generated content through algorithms.[8]&lt;/p&gt;
&lt;p&gt;Personalisation technologies have various advantages. They translate into more relevant content, which for service providers means more clicks and revenue, and for consumers, less time spent on finding content.[9] However, they also lead to compromised privacy, lack of control and reduced individual capability.[10] Search engines like Google use the famous PageRank algorithm, which, combined with geographical location and previous searches, yields the most relevant search results.[11] The PageRank algorithm uses various real-time variables dependent on both voluntary and involuntary user inputs. These variables include the number of clicks, the number of occurrences of the key terms, the number of references by other credible pages, and so on. This data in turn determines the order of pages in search results and influences the way we perceive, understand and analyse information.[12] Maps showing real-time traffic information retrieve data from laser and infrared sensors alongside the road and from information from users’ devices. Once this real-time data is combined with historical trends, these maps recommend routes to every user, hence influencing traffic patterns.[13]&lt;/p&gt;
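&lt;p&gt;As a loose sketch of the link-analysis idea behind PageRank (a hypothetical toy example, not Google’s production algorithm, which also folds in the behavioural signals mentioned above): each page repeatedly distributes its score to the pages it links to, so pages referenced by other well-scored pages accumulate higher ranks.&lt;/p&gt;

```python
# Toy PageRank sketch: iterate the score distribution until it settles.
# "links" maps each page to the pages it links out to.
def pagerank(links, damping=0.85, iters=50):
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform score
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            outs = links.get(page, [])
            if outs:
                # Split this page's score evenly among the pages it cites.
                for target in outs:
                    new[target] += damping * rank[page] / len(outs)
            else:
                # Dangling page: spread its score over everything.
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# "c" is cited by both of the other pages, so it ends up ranked highest.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

&lt;p&gt;The ordering that falls out of this iteration, combined with per-user signals such as location and search history, is what determines the order of results each individual sees.&lt;/p&gt;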
&lt;p&gt;Even though this phenomenon of personalisation may appear to be new, it has been prevalent in society for ages.[14] The history of mass media culture clearly shows that personalisation has always been a method to increase market share, market reach and customer satisfaction.[15] Newspapers have sections dedicated to special topics; radio and TV have channels dedicated to different interest groups, age groups and consumers.[16] These personalised sections in a newspaper and personalised channels on radio and television don’t just provide greater satisfaction to readers, listeners or viewers; they also provide targeted advertisement space for advertisers and content developers. However, digital footprints and the mass collection of data have made this phenomenon much more granular and detailed. The geographical location of an individual can tell a lot about their community, their culture and other important traits local to a community.[17] This data further assists in personalisation. Current developments in technology not only help in better collection of data about personal preferences but also help in better personalisation.&lt;/p&gt;
&lt;p&gt;Pariser mentions three ways in which the personalisation technologies of this day differ from those of the past. First, for the very first time, individuals are alone in the filter bubble. While in traditional forms of personalisation there were various individuals who shared the same frame of reference, now there is a separate set of filters governing the dissemination of content to each individual.[18] Second, personalisation technologies are entirely invisible now, and there is little that consumers can do to control or modify them.[19] Third, the decision to be subject to these personalisation technologies is often not an informed choice. A good example of this is an individual’s geographical location.[20]&lt;/p&gt;
&lt;h3&gt;The neutrality of New Media?&lt;/h3&gt;
&lt;p dir="ltr"&gt;More and more, we have noticed personalisation technologies having an impact on how we consume news on the Internet. Google News, Facebook’s News Feed which tries to put together a dynamic feed for both personal and global stories, and Twitter’s trending hashtag feature, have brought forward these services are key drivers of an emerging news ecosystem. Initially, this new media was hailed as a natural consequence of the Internet which would enable greater public participation, allow journalists to find more stories and engage with the readers directly. &amp;nbsp;An illustration of the same could be seen in the way Internet based news media and social networking websites behaved in the aftermath of Israel’s attacks on a United Nations run school in Gaza strip. While much of the international Internet media covered the story, Israel’s home media did not cover the story. The only exception to this was the liberal Israeli news website Ha’aretz.[21] Network graph details of Twitter, for a few days immediately after the incident clearly show the social media manifestation of the event in the personalised cyberspace. It is clearly visible that when most of the word was re-tweeting news of this heinous act of Israel, Israeli’s hardly re-tweeted this news. In fact they were busty re-tweeting the news of rocket attacks on Israel.[22]&lt;/p&gt;
&lt;p&gt;The use of social media in newsmaking was hailed by many scholars as symptomatic of the decentralisation characteristic of the Internet. It has been seen as a movement towards greater grassroots participation, negating the ‘gatekeeping’ role traditionally played by editors. Thomas Poell and José van Dijck punch holes in the theory of social media and other online technologies as mere facilitators of user participation and translators of user preferences through Big Data analytics.[23] They quote T. Gillespie’s work, which describes the narrative of these online services as platforms offering “open, neutral, egalitarian and progressive support for activity.”[24]&lt;/p&gt;
&lt;p&gt;Pedro Domingos calls the overwhelming number of choices the defining problem of the information age, and machine learning and data analytics the largest part of its solution.[25] The primary function of algorithmic decision-making in the context of consumption of content is to narrow down the choices. Domingos is more optimistic about the impact of these technologies: the “last step of the decision is usually still for humans to make, but learners intelligently reduce the choices to something a human can manage.”[26] On the other hand, Pariser is more circumspect about the coercive result of machine learning algorithms. Whichever way we lean, we have to accept that a large part of what personalisation algorithms do is to select and prioritise content by categorising it on the basis of relevance and popularity.&lt;/p&gt;
&lt;p&gt;Poell and van Dijck call this a new knowledge logic which in effect replaces human judgement (as earlier exercised by editors) with some kind of proxy decision-making based on data. Their main thesis is that there is little evidence to suggest that the latter is more democratic than the former, and that it creates new problems of its own. They go on to compare the practices of various services, including Facebook’s News Feed and Twitter’s trending topics, and conclude that they prioritise breaking news stories over other kinds of content.[27] For instance, the algorithm for trending topics depends not on the volume but on the velocity of the tweets with the hashtag or term. It could be argued that given this predilection, the algorithms will rarely prefer complex content. If we go by Lippmann and Dewey’s idea that the role of the Fourth Estate is to inform public debate and ensure the accountability of those in positions of power, this aspect of Big Data algorithms does not correspond with that role.&lt;/p&gt;
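&lt;p&gt;The velocity-over-volume point can be made concrete with a small sketch (a hypothetical illustration, not Twitter’s actual trending algorithm): a hashtag with a modest but sudden burst of mentions outranks one with a larger, steady volume.&lt;/p&gt;

```python
from collections import Counter

# Score tags by acceleration: mentions in the current window relative
# to mentions in a historical baseline window (velocity, not volume).
def trending(current_window, baseline_window, min_count=2):
    now = Counter(current_window)
    before = Counter(baseline_window)
    scores = {}
    for tag, count in now.items():
        if count < min_count:
            continue  # ignore tags too rare to call a trend
        # +1 smoothing so brand-new tags don't divide by zero.
        scores[tag] = count / (before[tag] + 1)
    return sorted(scores, key=scores.get, reverse=True)

# "#news" has more total mentions, but "#storm" is accelerating faster,
# so it tops the list.
ranked = trending(
    ["#storm"] * 6 + ["#news"] * 10,  # current window
    ["#news"] * 9,                    # baseline window
)
```

&lt;p&gt;Under such a scoring rule, sustained or slow-building stories, however substantial, rarely surface.&lt;/p&gt;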
&lt;h3&gt;Quantified Audience&lt;/h3&gt;
&lt;p dir="ltr"&gt;Another aspect of use of Big Data and algorithms in New Media that requires attention is that the networked infrastructure enables a quantified audience. C W Anderson who has studied newsroom practices in the US looked at role played by audience quantification and rationalization in shifting newswork practices. He concluded that more and more, journalists are less autonomous in their news decisions and increasingly reliant on audience metrics as a supplement to news &amp;nbsp;judgment.[28] Poell and van Dijck review the the practices by some leading publications such a New York Times, L.A. Times and Huffington Post, and degree to which audience metrics &amp;nbsp;dictates editorial decisions. While New York Times seems to prioritise content on their social media portals based on expectation of spike in user traffic, L.A. Times goes one step further by developing content specifically aimed towards promoting greater social participation. Neither of these practices though compare to the reliance on SEO and SMO strategies of web-born news providers like Huffington Post. They have traffic editors who trawl the Internet for trending topics and popular search terms, the feedback from them dictates the content creation.[29]&lt;/p&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p dir="ltr"&gt;The above factors demonstrate that the idea of New Media leading to the Fourth Estate performing its democratic functions does not take into account the actual practices. This idea is based on the erroneous assumption that technology, in general and algorithms, in particular are neutral. While the emergence of New Media might have reduced the gatekeeping role played by the editors, its strong prioritisation of content that will be popular reduce the validity of arguments that it leads to more informed public discussion. As Pariser said, the traditional media scores over the New Media inasmuch as there is an existence of a standard of division between editorial decisionmaking and advertiser interest. While this standard is flouted by media houses all the time, it exists as a metric to aspire to and measure service providers against. The New Media performs many of the same functions and maybe it is time to evolve some principles and ethical standards that take into account the need for it to perform these democratic functions.&lt;/p&gt;
&lt;h3&gt;Endnotes&amp;nbsp;&lt;/h3&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt; Eli Pariser, The Filter Bubble: What the Internet is
hiding from you (The Penguin Press, New York, 2011)&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span class="MsoFootnoteReference"&gt;&lt;span class="MsoFootnoteReference"&gt;[2]&lt;/span&gt;&lt;/span&gt;&amp;nbsp;Walter Lippmann, Liberty and News (Harcourt, Brace
and Howe, New York 1920) available at&lt;a href="https://archive.org/details/libertyandnews01lippgoog"&gt;https://archive.org/details/libertyandnews01lippgoog&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt; Walter Lippmann, Public Opinion (Harcourt, Brace and
Howe, New York 1920) available at &lt;a href="http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html"&gt;http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt; Walter Lippmann, The Phantom Public (Transaction
Publishers, New York, 1925)&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 35.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 36.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en"&gt;https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt; Fenwick McKelvey, “Algorithmic Media Need Democratic
Methods: Why Publics Matter”, available at &lt;a href="http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf"&gt;http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1"&gt;http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt; Helen Ashman, Tim Brailsford, Alexandra Cristea, Quan
Z Sheng, Craig Stewart, Elaine Torns and Vincent Wade, “The ethical and social
implications of personalization technologies for e-learning” available at &lt;a href="http://www.sciencedirect.com/science/article/pii/S0378720614000524"&gt;http://www.sciencedirect.com/science/article/pii/S0378720614000524&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt; Sergey Brin and Lawrence Page, “The Anatomy of a
Large-Scale Hypertextual Web Search Engine” available at &lt;a href="http://infolab.stanford.edu/pub/papers/google.pdf"&gt;http://infolab.stanford.edu/pub/papers/google.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt; Ian Rogers, “The Google Pagerank Algorithm and How It
Works” available at &lt;a href="http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm"&gt;http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt; Trygve Olson and Terry Nelson, “The Internet’s Impact
on Political Parties and Campaigns”, available at &lt;a href="http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942"&gt;http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt; Ian Witten, “Bias, privacy and and personalisation on
the web”, available at &lt;a href="http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf"&gt;http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 10.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/"&gt;https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt; Charles Heatwole, “Culture: A Geographical Perspective”
available at &lt;a href="http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html"&gt;http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 10.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Id&lt;/em&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 11.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt; Paul Mason, “Why Israel is losing the social media
war over Gaza?” available at &lt;a href="http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182"&gt;http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt; Gilad Lotan, Israel, Gaza, War &amp;amp; Data: Social
Networks and the Art of Personalizing Propaganda available at &lt;a href="http://www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html"&gt;www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt; Thomas Poell and José van Dijck, “Social Media and
Journalistic Independence” in Media Independence: Working with Freedom or
Working for Free?, edited by James Bennett &amp;amp; Niki Strange. (Routledge,
London, 2015)&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt; T Gillespie, “The politics of ‘platforms,” in New
Media &amp;amp; Society (Volume 12, Issue 3).&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt; Pedro Domingos, The Master Algorithm: How the quest
for the ultimate learning machine will re-make the world (Basic Books, New
York, 2015) at 38.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Ibid&lt;/em&gt; at 40.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
23.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt; C W Anderson, Between creative and quantified
audiences: Web metrics and changing patterns of newswork in local US newsrooms,
available at &lt;a href="https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms"&gt;https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms&lt;/a&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;
&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra &lt;/em&gt;Note 23.&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span id="docs-internal-guid-24b4db2a-a606-d425-16ff-1d76b980367d"&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms'&gt;http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Human Rights</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Machine Learning</dc:subject>
    
    
        <dc:subject>Algorithms</dc:subject>
    
    
        <dc:subject>New Media</dc:subject>
    

   <dc:date>2017-01-16T07:20:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle">
    <title>New Approaches to Information Privacy – Revisiting the Purpose Limitation Principle</title>
    <link>http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle</link>
    <description>
        &lt;b&gt;Article on Aadhaar throwing light on privacy and data protection.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;This was &lt;a class="external-link" href="http://www.digitalpolicy.org/revisiting-the-principles-of-purpose-limitation-under-existing-data-protection-norms/"&gt;published in Digital Policy Portal&lt;/a&gt; on July 13, 2016.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Introduction&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Last year, Mukul Rohatgi, the Attorney General of India, called into question existing jurisprudence of the last 50 years on the constitutional validity of the right to privacy.&lt;sup&gt;1&lt;/sup&gt; Mohatgi was rebutting the arguments on privacy made against Aadhaar, the unique identity project initiated and implemented in the country without any legislative mandate.&lt;sup&gt;2&lt;/sup&gt; The question of the right to privacy becomes all the more relevant in the context of events over the last few years—among them, the significant rise in data collection by the state through various e-governance schemes,&lt;sup&gt;3&lt;/sup&gt; systematic access to personal data by various wings of the state through a host of surveillance and law enforcement initiatives launched in the last decade,&lt;sup&gt;4&lt;/sup&gt; the multifold increase in the number of Indians online, and the ubiquitous collection of personal data by private parties.&lt;sup&gt;5&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;These developments have led to a call for a comprehensive privacy legislation in India and the adoption of the National Privacy Principles as laid down by the Expert Committee led by Justice AP Shah.&lt;sup&gt;6&lt;/sup&gt; There are privacy-protection legislation currently in place such as the Information Technology Act, 2000 (IT Act), which was enacted to govern digital content and communication and provide legal recognition to electronic transactions. This legislation has provisions that can safeguard—and dilute—online privacy. At the heart of the data protection provisions in the IT Act lies section 43A and the rules framed under it, i.e., Reasonable security practices and procedures and sensitive personal data information.&lt;sup&gt;7&lt;/sup&gt;Section 43A mandates that body corporates who receive, possess, store, deal, or handle any personal data to implement and maintain ‘reasonable security practices’, failing which, they are held liable to compensate those affected. Rules drafted under this provision also mandated a number of data protection obligations on corporations such the need to seek consent before collection, specifying the purposes of data collection, and restricting the use of data to such purposes only. There have been questions raised about the validity of the Section 43A Rules as they seek to do much more than mandate in the parent provisions, Section 43A— requiring entities to maintain reasonable security practices.&lt;/p&gt;
&lt;h3&gt;Privacy as control?&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Even setting aside the issue of legal validity, the kind of data protection framework envisioned by Section 43A rules is proving to be outdated in the context of how data is now being collected and processed. The focus of Section 43 A Rules—as well as that of draft privacy legislations in India&lt;sup&gt;8&lt;/sup&gt;—is based on the idea of individual control. Most apt is Alan Westin’s definition of privacy: “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to other.”&lt;sup&gt;9&lt;/sup&gt; Westin and his followers rely on the normative idea of “informational self- determination”, the notion of a pure, disembodied, and atomistic self, capable of making rational and isolated choices in order to assert complete control over personal information. More and more this has proved to be a fiction especially in a networked society.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Much before the need for governance of information technologies had reached a critical mass in India, Western countries were already dealing with the implications of the use of these technologies on personal data. In 1973, the US Department of Health, Education and Welfare appointed a committee to address this issue, leading to a report called ‘Records, Computers and Rights of Citizens.’&lt;sup&gt;10&lt;/sup&gt; The Committee’s mandate was to “explore the impact of computers on record keeping about individuals and, in addition, to inquire into, and make recommendations regarding, the use of the Social Security number.” The Report articulated five principles which were to be the basis of fair information practices: transparency; use limitation; access and correction; data quality; and security. Building upon these principles, the Committee of Ministers of the Organization for Economic Cooperation and Development (OECD) arrived at the Guidelines on the Protection of Privacy and Transborder Flows of Personal Data in 1980.&lt;sup&gt;11&lt;/sup&gt; These principles— Collection Limitation, Data Quality, Purpose Specification, Use Limitation, Security Safeguards, Openness, Individual Participation and Accountability—are what inform most data protection regulations today including the APEC Framework, the EU Data Protection Directive, and the Section 43A Rules and Justice AP Shah Principles in India.&lt;/p&gt;
&lt;p&gt;Fred Cate describes the import of these privacy regimes thus:&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;“All of these data protection instruments reflect the same approach: tell individuals what data you wish to collect or use, give them a choice, grant them access, secure those data with appropriate technologies and procedures, and be subject to third-party enforcement if you fail to comply with these requirements or individuals’ expressed preferences”&lt;sup&gt;12&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This is in line with Alan Westin’s idea of privacy exercised through individual control. Therefore the focus of these principles is on empowering the individuals to exercise choice, but not on protecting individuals from harmful or unnecessary practices of data collection and processing. The author of this article has earlier written&lt;sup&gt;13&lt;/sup&gt; about the sheer inefficacy of this framework which places the responsibility on individuals. Other scholars like Daniel Solove,&lt;sup&gt;14&lt;/sup&gt; Jonathan Obar&lt;sup&gt;15&lt;/sup&gt; and Fred Cate&lt;sup&gt;16&lt;/sup&gt; have also written about the failure of traditional data protection practices of notice and consent. While these essays dealt with the privacy principles of choice and informed consent, this paper will focus on the principles of purpose limitation.&lt;/p&gt;
&lt;h3&gt;Purpose Limitation and Impact of Big Data&lt;/h3&gt;
&lt;p&gt;The principle of purpose limitation, or purpose specification, seeks to ensure the following four objectives:&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li&gt;Personal information collected and processed should be adequate and relevant to the purposes for which they are processed.&lt;/li&gt;
&lt;li&gt;Entities should collect, process, disclose, make available, or otherwise use personal information only for the stated purposes.&lt;/li&gt;
&lt;li&gt;In case of a change in purpose, the data subject needs to be informed and their consent obtained.&lt;/li&gt;
&lt;li&gt;After personal information has been used in accordance with the identified purpose, it has to be destroyed as per the identified procedures.&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;The purpose limitation along with the data minimisation principle—which requires that no more data may be processed than is necessary for the stated purpose—aim to limit the use of data to what is agreed to by the data subject. These principles are in direct conflict with new technology which relies on ubiquitous collection and indiscriminate uses of data. The main import of Big Data technologies on the inherent value in data which can be harvested not by the primary purposes of data collection but through various secondary purposes which involve processing of the data repeatedly.&lt;sup&gt;17&lt;/sup&gt;Further, instead to destroying the data when its purpose has been achieved, the intent is to retain as much data as possible for secondary uses. Importantly, as these secondary uses are of an inherently unanticipated nature, it becomes impossible to account for it at the stage of collection and providing the choice to the data subject.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Followers of the discourse on Big Data would be well aware of its potential impacts on privacy. De-identification techniques to protect the identities of individuals in dataset face a threat from an increase in the amount of data available either publicly or otherwise to a party seeking to reverse-engineer an anonymised dataset to re-identify individuals. &lt;sup&gt;18&lt;/sup&gt; Further, Big Data analytics promise to find patterns and connections that can contribute to the knowledge available to the public to make decisions. What is also likely is that it will lead to revealing insights about people that they would have preferred to keep private.&lt;sup&gt;19&lt;/sup&gt;In turn, as people become more aware of being constantly profiled by their actions, they will self-regulate and ‘discipline’ their behaviour. This can lead to a chilling effect.&lt;sup&gt;20&lt;/sup&gt; Meanwhile, Big Data is also fuelling an industry that incentivises businesses to collect more data, as it has a high and growing monetary value. However, Big Data also promises a completely new kind of knowledge that can prove to be revolutionary in fields as diverse as medicine, disaster-management, governance, agriculture, transport, service delivery, and decision-making.&lt;sup&gt;21&lt;/sup&gt; As long as there is a sufficiently large and diverse amount of data, there could be invaluable insights locked in it, accessing which can provide solutions to a number of problems. In light of this, it is important to consider what kind of regulatory framework is most suitable which could facilitate some of the promised benefits of Big Data and at the same time mitigate its potential harm. This, coupled with the fact that the existing data protection principles have, by most accounts, run their course, makes the examination of alternative frameworks even more important. 
This article will examine some alternate proposals made to the existing framework of purpose limitation below.&lt;/p&gt;
&lt;h3&gt;Harms-based approach&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Some scholars like Fred Cate&lt;sup&gt;22&lt;/sup&gt; and Daniel Solove&lt;sup&gt;23&lt;/sup&gt; have argued that there is a need for the primary focus of data protection law to move from control at the stage of data collection to actual use cases. In his article on the failure of Fair Information Practice Principles,&lt;sup&gt;24&lt;/sup&gt;Cate puts forth a proposal for ‘Consumer Privacy Protection Principles.’ Cate envisions a more interventionist role of the data protection authorities by regulating information flows when required, in order to protect individuals from risky or harmful uses of information. Cate’s attempt is to extend the principles of consumer protection law of prevention and remedy of harms.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In a re-examination of the OECD Privacy Principles, Cate and Viktor Mayer Schöemberger attempt to discard the use of personal data to only purposes specified. They felt that restricting the use of personal to only specified purposes could significantly threaten various research and beneficial uses of Big Data. Instead of articulating a positive obligations of what personal data collected could be used for, they attempt to arrive at a negative obligation of use-cases prevented by law. Their working definition of the Use specification principle broaden the scope of use cases by only preventing use of data “if the use is fraudulent, unlawful, deceptive or discriminatory; society has deemed the use inappropriate through a standard of unfairness; the use is likely to cause unjustified harm to the individual; or the use is over the well-founded objection of the individual, unless necessary to serve an over-riding public interest, or unless required by law.”&lt;sup&gt;25&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;While most standards in the above definition have established understanding in jurisprudence, the concept of unjustifiable harm is what we are interested in. Any theory of harms-based approach goes back to John Stuart Mill’s dictum that the only justifiable purpose to exert power over the will of an individual is to prevent harm to others. Therefore, any regulation that seeks to control or prevent autonomy of individuals (in this case, the ability of individuals to allow data collectors to use their personal data, and the ability of data collectors to do so, without any limitation) must clearly demonstrate the harm to the individuals in question.&lt;/p&gt;
&lt;p&gt;Fred Cate articulates the following steps to identify tangible harm and respond to its presence:&lt;sup&gt;26&lt;/sup&gt;&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li&gt;Focus on Use — Actual use of the data should be considered, not mere possession. The assumption is that the collection, possession, or transfer of information does not significantly harm people; rather, harm arises from the use of information following such collection, possession, or transfer.&lt;/li&gt;
&lt;li&gt;Proportionality — Any regulatory measure must be proportional to the likelihood and severity of the harm identified.&lt;/li&gt;
&lt;li&gt;Per se Harmful Uses — Uses which are always harmful must be prohibited by law.&lt;/li&gt;
&lt;li&gt;Per se not Harmful Uses — If uses can be considered inherently not harmful, they should not be regulated.&lt;/li&gt;
&lt;li&gt;Sensitive Uses — In cases where a use is neither per se harmful nor per se not harmful, individual consent must be sought for using the data for that purpose.&lt;/li&gt;&lt;/ol&gt;
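The steps above amount to a simple decision procedure over uses rather than collection. The sketch below is a hypothetical illustration of that logic only; the category sets and names are invented for the example and do not come from Cate or any actual regulation.

```python
# Hypothetical use categories, invented purely for illustration.
PER_SE_HARMFUL = {"identity_theft", "unlawful_discrimination"}
PER_SE_NOT_HARMFUL = {"aggregate_statistics", "spam_filtering"}

def may_use(use, consent_given=False):
    """Decision procedure following the steps above: regulate actual
    uses, not collection. Returns 'prohibited', 'permitted' or
    'needs_consent'."""
    if use in PER_SE_HARMFUL:        # step (c): always-harmful uses are barred
        return "prohibited"
    if use in PER_SE_NOT_HARMFUL:    # step (d): inherently harmless uses go unregulated
        return "permitted"
    # step (e): the sensitive middle ground requires individual consent
    return "permitted" if consent_given else "needs_consent"
```

Note how consent, the centrepiece of the purpose limitation framework, only enters at the last step here: for the two per se categories the outcome is fixed by law regardless of what the data subject agreed to.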
&lt;p style="text-align: justify;"&gt;The proposal by Cate argues for what is called a ‘use based system’, which is extremely popular with American scholars. Under this system, data collection itself is not subject to restrictions; rather, only the use of data is regulated. This argument has great appeal for both businesses who can reduce their overheads significantly if consent obligations are done away with as long as they use the data in ways which are not harmful, as well as critics of the current data protection framework which relies on informed consent. Lokke Moerel explains the philosophy of ‘harms based approach’ or ‘use based system’ in United States by juxtaposing it against the ‘rights based approach’ in Europe.&lt;sup&gt;27&lt;/sup&gt; In Europe, rights of individuals with regard to processing of their personal data is a fundamental human right and therefore, a precautionary principle is followed with much greater top-down control upon data collection. However, in the United States, there is a far greater reliance on market mechanisms and self-regulating organisations to check inappropriate processing activities, and government intervention is limited to cases where a clear harm is demonstrable.&lt;sup&gt;28&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Continuing research by the Centre for Information Policy Leadership under its Privacy Risk Framework Project looks at a system of articulating what harms and risks arising from use of collected data. They have arrived a matrix of threats and harms. Threats are categorised as —a) inappropriate use of personal information and b) personal information in the wrong hands. More importantly for our purposes, harms are divided into: a) tangible harms which are physical or economic in nature (bodily harm, loss of liberty, damage to earning power and economic interests); b) intangible harms which can be demonstrated (chilling effects, reputational harm, detriment from surveillance, discrimination and intrusion into private life); and c) societal harm (damage to democratic institutions and loss of social trust).&lt;sup&gt;29&lt;/sup&gt;For any harms-based system, a matrix like above needs to emerge clearly so that regulation can focus on mitigating practices leading to the harms.&lt;/p&gt;
&lt;h3&gt;Legitimate interests&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Lokke Moerel and Corien Prins, in their article “Privacy for Homo Digitalis – Proposal for a new regulatory framework for data protection in the light of Big Data and Internet of Things”&lt;sup&gt;30&lt;/sup&gt; use the ideal of responsive regulation which considers empirically observable practices and institutions while determining the regulation and enforcement required. They state that current data protection frameworks—which rely on mandating some principles of how data has to be processed—is exercised through merely procedural notification and consent requirements. Further, Moerel and Prins feel that data protection law cannot only involve a consideration of individual interest but also needs to take into account collective interest. Therefore, the test must be a broader assessment than merely the purpose limitation articulating the interests of the parties directly involved, but whether a legitimate interest is achieved.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Legitimate interest has been put forth as an alternative to the purpose limitation. Legitimate is not a new concept and has been a part of the EU Data Protection Directive and also finds a place in the new General Data Protection Regulation. Article 7 (f) of the EU Directive&lt;sup&gt;31&lt;/sup&gt; provided for legitimate interest balanced against the interests or fundamental rights and freedoms of the data subject as the last justifiable reason for use of data. Due to confusion in its interpretation, the Article 29 Working Party, in 2014,&lt;sup&gt;32&lt;/sup&gt;looked into the role of legitimate interest and arrived at the following factors to determine the presence of a legitimate interest— a) the status of the individual (employee, consumer, patient) and the controller (employer, company in a dominant position, healthcare service); b) the circumstances surrounding the data processing (contract relationship of data subject and processor); c) the legitimate expectations of the individual.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Federico Ferretti has criticised the legitimate interest principle as vague and ambiguous. The balancing of legitimate interest in using the data against fundamental rights and freedoms of the data subject gives the data controllers some degree of flexibility in determining whether data may be processed; however, this also reduces the legal certainty that data subject have of their data not being used for purposes they have not agreed to.&lt;sup&gt;33&lt;/sup&gt;However, it is this paper’s contention that it is not the intent of the legitimate interest criteria but the lack of consensus on its application which creates an ambiguity. Moerel and Prins articulate a test for using legitimate interest which is cognizant of the need to use data for the purpose of Big Data processing, as well as ensuring that the rights of data subjects are not harmed.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;As demonstrated earlier, the processing of data and its underlying purposes have become exceedingly complex and the conventional tool to describe these processes ‘privacy notices’ are too lengthy, too complex and too profuse in numbers to have any meaningful impact.&lt;sup&gt;34&lt;/sup&gt;The idea of information self-determination, as contemplated by Westin in American jurisprudence, is not achieved under the current framework. Moerel and Prins recommend five factors&lt;sup&gt;35&lt;/sup&gt; as relevant in determining the legitimate interest. Of the five, the following three are relevant to the present discussion:&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li style="text-align: justify;"&gt;Collective Interest — A cost-benefit analysis should be conducted, which examines the implications for privacy for the data subjects as well as the society, as a whole.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;The nature of the data — Rather than having specific categories of data, the nature of data needs to be assessed contextually to determine legitimate interest.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Contractual relationship and consent not independent grounds — This test has two parts. First, in case of contractual relationship between data subject and data controller: the more specific the contractual relationship, the more restrictions apply to the use of the data. Second, consent does not function as a separate principle which, once satisfied, need not be revisited. The nature of the consent (opportunities made available to data subject, opt in/opt out, and others) will continue to play a role in determining legitimate interest.&lt;/li&gt;&lt;/ol&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Replacing the purpose limitation principles with a use-based system as articulated above poses the danger of allowing governments and the private sector to carry out indiscriminate data collection under the blanket guise that any and all data may be of some use in the future. The harms-based approach has many merits and there is a stark need for more use of risk assessments techniques and privacy impact assessments in data governance. However, it is important that it merely adds to the existing controls imposed at data collection, and not replace them in their entirety. On the other hand, the legitimate interests principle, especially as put forth by Moerel and Prins, is more cognizant of the different factors at play — the inefficacy of existing purpose limitation principles, the need for businesses to use data for purposes unidentified at the stage of collection, and the need to ensure that it is not misused for indiscriminate collection and purposes. However, it also poses a much heavier burden on data controllers to take into account various factors before determining legitimate interest. If legitimate interest has to emerge as a realistic alternative to purpose limitation, there needs to be greater clarity on how data controllers must apply this principle.&lt;/p&gt;
&lt;h3&gt;Endnotes&lt;/h3&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Prachi Shrivastava, “Privacy not a fundamental right, argues Mukul Rohatgi for Govt as Govt affidavit says otherwise,” Legally India, Jyly 23, 2015, http://www.legallyindia.com/Constitutional-law/privacy-not-a-fundamental-right-argues-mukul-rohatgi-for-govt-as-govt-affidavit-says-otherwise.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt; Rebecca Bowe, “Growing Mistrust of India’s Biometric ID Scheme,” Electronic Frontier Foundation, May 4, 2012, https://www.eff.org/deeplinks/2012/05/growing-mistrust-india-biometric-id-scheme.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Lisa Hayes, “Digital India’s Impact on Privacy: Aadhaar numbers, biometrics, and more,” Centre for Democracy and Technology, January 20, 2015, https://cdt.org/blog/digital-indias-impact-on-privacy-aadhaar-numbers-biometrics-and-more/.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;“India’s Surveillance State,” Software Freedom Law Centre, http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/.&lt;/li&gt;
&lt;li&gt;“Internet Privacy in India,” Centre for Internet and Society, http://cis-india.org/telecom/knowledge-repository-on-internet-access/internet-privacy-in-india.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Vivek Pai, “Indian Government says it is still drafting privacy law, but doesn’t give timelines,” Medianama, May 4, 2016, http://www.medianama.com/2016/05/223-government-privacy-draft-policy/.&lt;/li&gt;
&lt;li&gt;Information Technology (Intermediaries Guidelines) Rules, 2011,&lt;br /&gt; http://deity.gov.in/sites/upload_files/dit/files/GSR314E_10511%281%29.pdf.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Discussion Points for the Meeting to be taken by Home Secretary at 2:30 pm on 7-10-11 to discuss the drat Privacy Bill, http://cis-india.org/internet-governance/draft-bill-on-right-to-privacy.&lt;/li&gt;
&lt;li&gt;Alan Westin, Privacy and Freedom (New York: Atheneum, 2015).&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;US Secretary’s Advisory Committee on Automated Personal Data Systems, Records, Computers and the Rights of Citizens, http://www.justice.gov/opcl/docs/rec-com-rights.pdf.&lt;/li&gt;
&lt;li&gt;OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Fred Cate, “The Failure of Information Practice Principles,” in Consumer Protection in the Age of the Information Economy, ed. Jane K. Winn (Burlington: Aldershot, Hants, England, 2006) http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Amber Sinha and Scott Mason, “A Critique of Consent in Informational Privacy,” Centre for Internet and Society, January 11, 2016, http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy.&lt;/li&gt;
&lt;li&gt;Daniel Solove, “Privacy Self-Management and the Consent Dilemma,” Harvard Law Review 126, (2013): 1880.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Jonathan Obar, “Big Data and the Phantom Public: Walter Lippmann and the fallacy of data privacy self management,” Big Data and Society 2(2), (2015), doi: 10.1177/2053951715608876.&lt;/li&gt;
&lt;li&gt;Supra Note 12.&lt;/li&gt;
&lt;li&gt;Supra Note 14.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1450006; Arvind Narayanan and Vitaly Shmatikov, “Robust De-anonymization of Large Sparse Datasets” available at https://www.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;D. Hirsch, “That’s Unfair! Or is it? Big Data, Discrimination and the FTC’s Unfairness Authority,” Kentucky Law Journal, Vol. 103, available at: http://www.kentuckylawjournal.org/wp-content/uploads/2015/02/103KyLJ345.pdf&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;A Marthews and C Tucker, “Government Surveillance and Internet Search Behavior”, available at http://ssrn.com/abstract=2412564; Danah Boyd and Kate Crawford, “Critical Questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon”, Information, Communication &amp;amp; Society, Vol. 15, Issue 5, (2012).&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Scott Mason, “Benefits and Harms of Big Data”, Centre for Internet and Society, available at http://cis-india.org/internet-governance/blog/benefits-and-harms-of-big-data#_ftn37.&lt;/li&gt;
&lt;li&gt;Cate, “The Failure of Information Practice Principles.”&lt;/li&gt;
&lt;li&gt;Solove, “Privacy Self-Management and the Consent Dilemma,” 1882.&lt;/li&gt;
&lt;li&gt;Cate, “The Failure of Information Practice Principles.”&lt;/li&gt;
&lt;li&gt;Fred Cate and Viktor Mayer-Schönberger, “Notice and Consent in a World of Big Data,” International Data Privacy Law 3(2), (2013): 69.&lt;/li&gt;
&lt;li&gt;Solove, “Privacy Self-Management and the Consent Dilemma,” 1883.&lt;/li&gt;
&lt;li&gt;Lokke Moerel, “Netherlands: Big Data Protection: How To Make The Draft EU Regulation On Data Protection Future Proof”, Mondaq, March 11, 2014, http://www.mondaq.com/x/298416/data+protection/Big+Data+Protection+How+To+Make+The+Dra%20ft+EU+Regulation+On+Data+Protection+Future+Proof%20al%20Lecture.&lt;/li&gt;
&lt;li&gt;Moerel, “Netherlands: Big Data Protection.”&lt;/li&gt;
&lt;li&gt;Centre for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice,” Hunton and Williams LLP, June 19, 2014, https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/white_paper_1-a_risk_based_approach_to_privacy_improving_effectiveness_in_practice.pdf.&lt;/li&gt;
&lt;li&gt;Lokke Moerel and Corien Prins, “Privacy for Homo Digitalis: Proposal for a new regulatory framework for data protection in the light of Big Data and Internet of Things”, Social Science Research Network, May 25, 2016, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2784123.&lt;/li&gt;
&lt;li&gt;EU Directive 95/46/EC – The Data Protection Directive, https://www.dataprotection.ie/docs/EU-Directive-95-46-EC-Chapter-2/93.htm.&lt;/li&gt;
&lt;li&gt;Article 29 Data Protection Working Party, “Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC,” http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf.&lt;/li&gt;
&lt;li&gt;Federico Ferretti, “Data protection and the legitimate interest of data controllers: Much ado about nothing or the winter of rights?,” Common Market Law Review 51 (2014): 1-26, http://bura.brunel.ac.uk/bitstream/2438/9724/1/Fulltext.pdf.&lt;/li&gt;
&lt;li&gt;Sinha and Mason, “A Critique of Consent in Informational Privacy.”&lt;/li&gt;
&lt;li&gt;Moerel and Prins, “Privacy for Homo Digitalis.”&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle'&gt;http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-11-09T13:54:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime">
    <title>India’s Data Protection Regime Must Be Built Through an Inclusive and Truly Co-Regulatory Approach</title>
    <link>http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime</link>
    <description>
&lt;b&gt;We must move India past its existing consultative processes for rule-making, which often prompt stakeholders to take adversarial and extremely one-sided positions.
&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="https://thewire.in/201123/inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime/"&gt;Wire&lt;/a&gt; on December 1, 2017.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Earlier this week, the Ministry of Electronics and Information Technology released &lt;a title="a white paper" href="http://meity.gov.in/white-paper-data-protection-framework-india-public-comments-invited" target="_blank"&gt;&lt;span style="text-decoration: underline;"&gt;a white paper&lt;/span&gt;&lt;/a&gt; by a “committee of experts” appointed a few months back led by former Supreme Court judge, Justice B.N. Srikrishna, on a data protection framework for India. The other members of the committee are Aruna Sundararajan, Ajay Bhushan Pandey, Ajay Kumar, Rajat Moona, Gulshan Rai, Rishikesha Krishnan, Arghya Sengupta and Rama Vedashree.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With the exception of Justice Srikrishna and Krishnan, the rest of the committee members are either part of the government or part of organisations that have worked closely with the government on separate issues relating to technology, with some of them also having taken positions against the fundamental right to privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Refreshingly, the committee and the ministry has opted for a consultative process outlining the issues they felt relevant to a data protection law, and espousing provisional views on each of the issues and seeking public responses on them. The paper states that on the basis of the response received, the committee will conduct public consultations with citizens and stakeholders. Legitimate concerns &lt;a title="were raised earlier" href="http://indianexpress.com/article/india/citizens-group-questions-data-privacy-panel-composition-aadhaar-4924220/" target="_blank"&gt;&lt;span style="text-decoration: underline;"&gt;were raised earlier&lt;/span&gt;&lt;/a&gt; about the constitution of the committee and the lack of inclusion of different voices on it. However, if the committee follows an inclusive, transparent and consultative process in the drafting of the data protection legislation, it would go a long way in addressing these concerns.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The paper seeks response to as many as 231 questions covering a broad spectrum of issues relating to data protection – including definitions of terms such as personal data, sensitive personal data, processing, data controller and processor – the purposes for which exemptions should be available, cross border flow of data, data localisation and the right to be forgotten.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While a thorough analysis of all the issues up for discussion would require a more detailed evaluation, at this point, the process of rule-making and the kind of governance model envisaged in this paper are extremely important issues to consider.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In part IV of the paper on ‘Regulation and Enforcement’, there is a discussion on a co-regulatory approach for the governance of data protection in India. The paper goes so far as to provisionally take a view that it may be appropriate to pursue a co-regulatory approach which involves “a spectrum of frameworks involving varying levels of government involvement and industry participation”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the discussion on co-regulation in the white paper is limited to the section on regulation and enforcement. A truly inclusive and co-regulatory approach ought to involve active participation from non-governmental stakeholders in the rule-making process itself. In India, unfortunately, we lack a strong tradition of lawmakers engaging in public consultations and participation of other stakeholders in the process of drafting laws and regulation. One notable exception has been the Telecom Regulatory Authority of India (TRAI), which periodically seeks public responses on consultation papers it releases and also holds open houses occasionally. It is heartening to see the committee of experts and the ministry follow a similar process in this case.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, these are essentially examples of ‘notice and comment’ rulemaking where the government actors stand as neutral arbiters who must decide on written briefs submitted to it in response to consultation papers or draft regulations that it notifies to the public.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This process is, by its very nature, adversarial, and often means that different stakeholders do not reveal their true priorities but must take extreme one-sided positions, as parties tend to at the beginning of a negotiation.This also prevents the stakeholders from sharing an honest assessment of the actual regulatory challenge they may face, lest it undermine their position.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This often pits industry and public interest proponents against each other, sometimes also leading to different kinds of industry actors in adversarial positions. An excellent example of this kind of posturing, also relevant to this paper, is visible in the responses submitted to the TRAI on the its recent consultation paper on ‘Privacy, Security and Ownership of data in Telecom Sector’. One of the more contentious issue raised by the TRAI was about the adequacy of the existing data protection framework under the license agreement with telecom companies, and if there was a need to bring about greater parity in regulation between telecom companies and over-the-top (OTT) service providers. Rather than facilitating an actual discussion on what is a complex regulatory issues, and the real practical challenges it poses for the stakeholders, this form of consultation simply led to the telecom companies and OTT services providers submitting contrasting extreme positions without much scope for engagement between two polar arguments.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A truly co-regulatory approach which also extends to rulemaking would involve collaborative processes which are far less adversarial in their design and facilitate joint problem solving through multiple face to face meetings. Such processes are also more likely to lead to better rule making by using the more specialised knowledge of the different stakeholders about technology, domain-specific issues, industry realities and low cost solutions. Further, by bringing the regulated parties into the rulemaking process, the ownership of the policy is shared, often leading to better compliance.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Within the domain of data protection law itself, we have a few existing models of robust co-regulation which entail the involvement of stakeholders not just at the level of enforcement but also at the level of drafting. The oldest and most developed form of this kind of privacy governance can be seen in the study of the Dutch privacy statute. It involved a central privacy legislations with broad principles, sectoral industry-drafted “codes of conduct”, government evaluations and certifications of these codes; and a legal safe harbour for those companies that follow the approved code for their sector. Over a period of 20 years, the Dutch experience saw the approval of 20 sectoral codes across a variety of sectors such as banking, insurance, pharmaceuticals, recruitment and medical research.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other examples of policies espousing this approach include two documents from the US – first, a draft bill titled ‘Commercial Privacy Bill of Rights Act of 2011’ introduced before the Congress by John McCain and John Kerry, and second, a White House Paper titled ‘Consumer Data Privacy In A Networked World: A Framework For Protecting Privacy And Promoting Innovation In The Global Digital Economy’ released by the Obama administration. Neither of these documents have so far led to a concrete policy. Both of these policies envisioned broadly worded privacy requirements to be passed by the Congress, followed by the detailed rules to be&lt;span&gt; drafted&lt;/span&gt;. The Obama administration white paper is more inclusive in mandating that ‘multi-stakeholder groups’ draft the codes that include not only industry representatives but also privacy advocates, consumer groups, crime victims, academics, international partners, federal and state civil and criminal law enforcement representatives and other relevant groups.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The principles that emerge out this consultative process are likely to guide the data protection law in India for a long time to come. Among democratic regimes with a significant data-driven market, India is extremely late in arriving at a data protection law. The least that it can do at this point is to learn from the international experience and scholarship which has shown that merits of a co-regulatory approach which entails active participation of the government, industry, civil society and academia in the drafting and enforcement of a robust data protection law.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime'&gt;http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-01-01T16:18:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good">
    <title>India's Data Protection Framework Will Need to Treat Privacy as a Social and Not Just an Individual Good</title>
    <link>http://editors.cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good</link>
    <description>
&lt;b&gt;The idea that technological innovations may compete with the privacy of individuals assumes that there is social and/or economic good in allowing unrestricted access to data. However, it must be remembered that data is potentially a toxic asset if it is not collected, processed, secured and shared in the appropriate way.&lt;/b&gt;
        &lt;div class="field-label-hidden      field-type-text-with-summary field-name-body field" style="text-align: justify; "&gt;
&lt;div class="field-items"&gt;
&lt;div class="even field-item"&gt;
&lt;p&gt;Published in Economic &amp;amp; Political Weekly, Volume 53, Issue No. 18, 05 May, 2018. Article can be &lt;a class="external-link" href="http://www.epw.in/engage/article/for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good"&gt;accessed online here&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;In             July 2017, the Ministry of Electronics and Information             Technology (MeITy) in India set up a committee headed by a             former judge, B N Srikrishna, to address the growing clamour             for privacy protections at a time when both private             collection of data and public projects like Aadhaar are             reported to pose major privacy risks (Maheshwari 2017). The             Srikrishna Committee is in the process of providing its             input, which will go on to inform India’s data-protection             law.&lt;/p&gt;
&lt;p&gt;While the committee released a white paper with provisional views, seeking feedback a few months ago, it may be discussing a data protection framework without due consideration of how data practices have evolved.&lt;/p&gt;
&lt;p&gt;In early 2018, a series of stories based on investigative journalism by the &lt;em&gt;Guardian&lt;/em&gt; and the &lt;em&gt;Observer&lt;/em&gt; revealed that the data of 87 million Facebook users was used for the Trump campaign by a political consulting firm, Cambridge Analytica, without their permission. Aleksandr Kogan, a psychology researcher at the University of Cambridge, created an application called “thisisyourdigitallife” and collected data from 270,000 participants through a personality test using Facebook’s application programming interface (API), which allows developers to integrate with various parts of the Facebook platform (Fruchter et al 2018). This data was collected purportedly for academic research purposes only. Kogan’s application also collected profile data from each of the participants’ friends, roughly 87 million people.&lt;/p&gt;
&lt;p&gt;The             kinds of practices concerning the sharing and processing of             data exhibited in this case are not unique. These are, in             fact, common to the data economy in India as well. It can be             argued that the Facebook–Cambridge Analytica incident is             representative of data practices in the data-driven digital             economy. These new practices pose important questions for             data protection laws globally, and how these may need to             evolve to address data protection, particularly for India,             which is in the process of drafting its own data protection             law.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Privacy as Control&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Most             modern data protection laws focus on individual control. In             this context, the definition by the late Alan Westin             (2015) characterises privacy as:&lt;/p&gt;
&lt;blockquote style="padding-left: 20px; "&gt;
&lt;p&gt;The claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The idea of “privacy as control” is what finds articulation in data protection policies across jurisdictions, beginning with the Fair Information Practice Principles (FIPPs) from the United States (US) (Dixon 2006). These FIPPs are the building blocks of modern information privacy law (Schwartz 1999) and not only play a significant role in the development of privacy laws in the US, but also inform data protection laws in most privacy regimes internationally (Rotenberg 2001), including the nine “National Privacy Principles” articulated by the Justice A P Shah Committee in India. Much of this approach is also reflected in the white paper released by the committee, led by Justice Srikrishna, towards the creation of data protection laws in India (Srikrishna 2017).&lt;/p&gt;
&lt;p&gt;This             approach essentially involves the following steps (Cate             2006):&lt;/p&gt;
&lt;p&gt;(i)             Data controllers are required to tell individuals what data             they wish to collect and use and give them a choice to share             the data. &lt;br /&gt; (ii) Upon sharing, the individuals have rights such as being             granted access, and data controllers have obligations such             as securing the data with appropriate technologies and             procedures, and only using it for the purposes identified.&lt;/p&gt;
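The two steps above can be sketched as a toy access-control model. This is purely illustrative: the class and method names (ConsentRecord, DataController, collect, use, access) are invented for this sketch and do not come from any real privacy framework or library.

```python
from dataclasses import dataclass, field

# Minimal sketch of the two steps described above; all names are invented.

@dataclass
class ConsentRecord:
    subject_id: str
    purposes: set = field(default_factory=set)  # purposes disclosed in the notice
    opted_in: bool = False                      # the choice exercised by the subject

class DataController:
    def __init__(self):
        self._consents = {}
        self._store = {}

    def collect(self, record: ConsentRecord, data: dict):
        # Step (i): data may be collected only after notice and an opt-in choice.
        if not record.opted_in:
            raise PermissionError("no consent: data may not be collected")
        self._consents[record.subject_id] = record
        self._store[record.subject_id] = dict(data)

    def use(self, subject_id: str, purpose: str) -> dict:
        # Step (ii): use is limited to the purposes identified at collection.
        consent = self._consents.get(subject_id)
        if consent is None or purpose not in consent.purposes:
            raise PermissionError("purpose was never consented to: " + purpose)
        return self._store[subject_id]

    def access(self, subject_id: str) -> dict:
        # Step (ii): the subject retains a right of access to their own data.
        return self._store.get(subject_id, {})
```

A controller built this way can serve a “billing” query for a subject who opted in to billing, but raises PermissionError if the same data is requested for “advertising”.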
&lt;p&gt;The             objective in this approach is to make the individual             empowered and allow them to weigh their own interests in             exercising their consent. The allure of this paradigm is             that, in one elegant stroke, it seeks to “ensure that             consent is informed and free and thereby also (seeks) to             implement an acceptable tradeoff between privacy and             competing concerns.” (Sloan and Warner 2014). This approach             is also easy to enforce for both regulators and businesses.             Data collectors and processors only need to ensure that they             comply with their privacy policies, and can thus reduce             their liability while, theoretically, consumers have the             information required to exercise choice. In recent years,             however, the emergence of big data, the “Internet of             Things,” and algorithmic decision-making has significantly             compromised the notice and consent model (Solove 2013).&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Limitations of Consent &lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Some             cognitive problems, such as long and difficult to understand             privacy notices, have always existed with regard to the             issue of informed consent, but lately these problems have             become aggravated. Privacy notices often come in the form of             long legal documents, much to the detriment of the readers’             ability to understand them. These policies are “long,             complicated, full of jargon and change frequently” (Cranor             2012).&lt;/p&gt;
&lt;p&gt;Kent             Walker (2001) lists five problems that privacy notices             typically suffer from:&lt;/p&gt;
&lt;p&gt;(i)             Overkill: Long and repetitive text in small print.&lt;br /&gt; (ii) Irrelevance: Describing situations of little concern to             most consumers.&lt;br /&gt; (iii) Opacity: Broad terms that reflect limited truth, and             are unhelpful to track and control the information collected             and stored.&lt;br /&gt; (iv) Non-comparability: Simplification required to achieve             comparability will lead to compromising of accuracy.&lt;br /&gt; (v) Inflexibility: Failure to keep pace with new business             models.&lt;/p&gt;
&lt;p&gt;Today,             data is collected continuously with every use of online             services, making it humanly impossible to exercise             meaningful consent. &lt;br /&gt; The quantity of data being generated is expanding at an             exponential rate. With connected devices, smartphones,             appliances transmitting data about our usage, and even the             smart cities themselves, data now streams constantly from             almost every sector and function of daily life, “creating             countless new digital puddles, lakes, tributaries and oceans             of information” (Bollier 2010).&lt;/p&gt;
&lt;p&gt;The infinitely complex nature of the data ecosystem renders consent of little value even in cases where individuals may be able to read and comprehend privacy notices. As the uses of data are so diverse, and often not limited by a purpose identified at the beginning, individuals cannot conceptualise how their data will be aggregated and possibly used or reused.&lt;/p&gt;
&lt;p&gt;Seemingly             innocuous bits of data revealed at different stages could be             combined to reveal sensitive information about the             individual. While the regulatory framework is designed such             that individuals are expected to engage in cost–benefit             analysis of trading their data to avail services, this             ecosystem makes such individual analysis impossible.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Conflicts Between Big Data               and Individual Control&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;The             thrust of big data technologies is that the value of data             resides not in its primary purposes, but in its numerous             secondary purposes, where data is reused many times over             (Schoenberger and Cukier 2013).&lt;/p&gt;
&lt;p&gt;On the other hand, the idea of privacy as control draws from the “data minimisation” principle, which requires organisations to limit the collection of personal data to the minimum extent necessary to achieve their legitimate purpose and to delete data no longer required. Control is exercised and privacy is enhanced by ensuring data minimisation. These two concepts are in direct conflict: modern data-driven businesses want to retain as much data as possible for secondary uses. Since these secondary uses are, by their nature, unanticipated, their practices run counter to the very principle of purpose limitation (Tene and Polonetsky 2012).&lt;/p&gt;
&lt;p&gt;It             is evident from such data-sharing practices, as demonstrated             by the Cambridge Analytica–Facebook story, that platform             architectures are designed with a clear view to collect as             much data as possible. This is amply demonstrated by the             provision of a “friends permission” feature by Facebook on             its platform to allow individuals to share information not             just about themselves, but also about their friends. For the             principle of informed consent to be meaningfully             implemented, it is necessary for users to have access to             information about intended data practices, purposes and             usage, so they consciously share data about themselves.&lt;/p&gt;
&lt;p&gt;In reality, however, privacy policies are more likely to serve as liability disclaimers for companies than any kind of guarantee of privacy for consumers. A case in point is Mark Zuckerberg’s facile claim that there was no “data breach” in the Cambridge Analytica–Facebook incident. Instead of asking each of the 87 million users whether they wanted their data to be collected and shared further, Facebook designed a platform that required consent in any form only from 270,000 users. Not only were users denied the opportunity to give consent, their consent was assumed through a feature that was on by default. This is representative of how privacy trade-offs are conceived by current data-driven business models. Participation in a digital ecosystem is by itself deemed as users’ consent to relinquish control over how their data is collected, who may have access to it, and what purposes it may be used for.&lt;/p&gt;
&lt;p&gt;Yet, Zuckerberg would have us believe that the primary privacy issue of concern is not how his platform enabled the collection of users’ data without their explicit consent, but the subsequent unauthorised sharing of the data by Kogan. Zuckerberg’s insistence that collection of people’s data without their consent is not a data breach is reminiscent of the UIDAI’s recent claims in India that publication of Aadhaar numbers and related information by several government websites is not a data breach, so long as its central biometric database is secure (Sharma 2018). In such cases too, the intended architecture ensured the seeding of other databases with Aadhaar numbers, thus creating multiple potential points of failure through disclosure. Similarly, the design flaws in direct benefit transfers enabled Airtel to create payments bank accounts without the customers’ knowledge (&lt;em&gt;Hindu Business Line&lt;/em&gt; 2017). Such claims clearly suggest the very limited responsibility data controllers (both public and private) are willing to take for the personal data they collect, while wilfully facilitating and encouraging data practices which may lead to greater risk to data.&lt;/p&gt;
&lt;p&gt;On             this note, it is also relevant to point out that the             Srikrishna committee white paper begins with identifying             informational privacy and data innovation as its two key             objectives. It states that “a firm legal framework for data             protection is the foundation on which data-driven innovation             and entrepreneurship can flourish in India.”&lt;span&gt; &lt;/span&gt;&lt;/p&gt;
&lt;p&gt;Conversations             around privacy and data have become inevitably linked to the             idea of technological innovation as a competing interest.             Before engaging in such conversations, it is important to             acknowledge that the value of innovation as a competing             interest itself is questionable. It is not a competing             right, nor a legitimate public interest endeavour, nor a             proven social good.&lt;/p&gt;
&lt;p&gt;The idea that in policymaking, technological innovations may compete with the privacy of individuals assumes that there is social and/or economic good in allowing unrestricted access to data. The social argument is premised on the promise that mathematical models and computational capacity are capable of identifying key insights from data. In turn, these insights may be useful in public and private decision-making. However, it must be remembered that data is potentially a toxic asset if it is not collected, processed, secured and shared in the appropriate way. Sufficient research suggests that indiscriminate data collection greatly increases the ratio of noise to signal, and can lead to erroneous insights. Further, the greater the amount of data collected, the greater the attack surface exposed to cybersecurity risks. Incidents such as Facebook–Cambridge Analytica demonstrate the toxicity of data in various ways and underscore the need for data regulation at every stage of the data lifecycle (Schneier 2016). These are important tempering factors that need to be kept in mind while evaluating data innovation as a key mover of policy or regulation.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;Privacy as Social Good&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;As             long as privacy is framed as arising primarily from             individual control, data controllers will continue to engage             in practices that compromise the ability to exercise choice.             There is a need to view privacy as a social good, and             policymaking should ensure its preservation and enhancement.             Contractual protections and legal sanctions can themselves             do little if platform architectures are designed to do the             exact opposite.&lt;/p&gt;
&lt;p&gt;More importantly, policymaking needs to recognise privacy not merely as an individual right, available for individuals to forgo when engaging with data-driven business models, but also as a social good. The recognition of something as a social good deems it desirable by definition, and a legitimate goal of law and policy, rather than something left entirely to market forces for its achievement.&lt;/p&gt;
&lt;p&gt;The             Puttaswamy judgment (K Puttaswamy v Union of India             2017) lends sufficient weight to privacy’s social value by             identifying it as fundamental to any individual development             through its dependence on solitude, anonymity, and temporary             releases from social duties.&lt;/p&gt;
&lt;p&gt;Sociological scholarship demonstrates that different types of social relationships, be they Gesellschaft (interest groups and acquaintances) or Gemeinschaft (friendship, love, and marriage), and the nature of these relationships, depend on the ability to conceal certain things (Simmel 1906). Demonstrating this in the context of friendships, it has been stated that such relationships “present a very peculiar synthesis in regard to the question of discretion, of reciprocal revelation and concealment.” Friendships, much like most other social relationships, are very much dependent on our ability to selectively present ourselves to others. Contrast this with Zuckerberg’s stated aim of making the world more “open,” where information about people flows freely and effectively without any individual control. Contrast this also with government projects such as Aadhaar, which is intended to act as one universal identity that can provide a 360-degree view of citizens.&lt;/p&gt;
&lt;p&gt;Other scholars such as Julie Cohen (2012) and Anita Allen (2011) have demonstrated that the data a person produces or has control over concerns both herself and others. Individuals can be exposed not only because of their own actions and choices, but can also be made vulnerable merely because others have been careless with their data. This point is amply demonstrated in the Facebook–Cambridge Analytica incident. What this means is that the protection of privacy requires not just individual action but, in a sense, group co-ordination. It is my argument that this group interest in privacy as a social good must be the basis of policymaking and regulation of data in the future, in addition to the idea of privacy as an individual right. In the absence of attention to the social good aspect of privacy, individual consumers are left to their own devices to negotiate their privacy trade-offs with large companies and governments, and are significantly compromised.&lt;/p&gt;
&lt;p&gt;What this translates into is that regulatory and data protection frameworks should not be value-neutral in their conception of privacy as a facet of individual control. The complete reliance of data regulation on the data subject to make an informed choice is, in my opinion, an idea that has run its course. If privacy is viewed as a social good, then the data protection framework, including both the laws and the architecture, must be designed with a view to protecting it, rather than leaving it entirely to market forces.&lt;/p&gt;
&lt;h2&gt;&lt;strong&gt;The Way Forward&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;Data protection laws need to be re-evaluated, and policymakers must recognise Lawrence Lessig’s dictum that “code is law.” Like laws, architecture and norms can play a fundamental role in regulation. Regulatory intervention for technology need not mean regulation of technology only, but also how technology itself may be leveraged for regulation (Lessig 2006; Reidenberg 1998). It is key that the latter is not left only in the hands of private players. &lt;br /&gt; Zuckerberg, in his testimony (&lt;em&gt;Washington Post&lt;/em&gt; 2018) before the United States Senate’s Commerce and Judiciary committees, asserted that “AI tools” are central to any strategy for addressing hate speech, fake news, and manipulations that use data ecosystems for targeting.&lt;/p&gt;
&lt;p&gt;What             is most concerning in his testimony is the complete lack of             mention of standards, public scrutiny and peer-review             processes, which “AI tools” and regulatory technologies need             to be subject to. Further, it cannot be expected that             data-driven businesses will view privacy as a social good or             be publicly accountable.&lt;/p&gt;
&lt;p&gt;As             policymakers in India gear up for writing the country’s data             protection law, they must acknowledge that their             responsibility extends to creating norms and principles that             will inform future data-driven platforms and regulatory             technologies.&lt;/p&gt;
&lt;p&gt;Since             issues of privacy and data protection will have to be             increasingly addressed at the level of how architectures             enable data collection, and more importantly how data is             used after collection, policymakers must recognise that             being neutral about these practices is no longer enough.             They must take normative positions on data collection,             processing and sharing practices. These positions cannot be             implemented through laws only, but need to be translated             into technological solutions and norms.  Unless a             multipronged approach comprising laws, architecture and             norms is adopted, India’s new data protection regime may end             up with limited efficacy.&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good'&gt;http://editors.cis-india.org/internet-governance/blog/epw-amber-sinha-may-18-2018-for-indias-data-protection-regime-to-be-efficient-policymakers-should-treat-privacy-as-a-social-good&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-18T06:22:57Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function">
    <title>How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill</title>
    <link>http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</link>
    <description>
        &lt;b&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.&lt;/b&gt;
        &lt;p&gt;The blog post was &lt;a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/"&gt;published in Medianama&lt;/a&gt; on February 18, 2022. This is the first of a two-part series by Amber Sinha.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In 2018, hours after the Committee of Experts led by Justice Srikrishna released their report and draft bill, I wrote &lt;a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html"&gt;an opinion piece&lt;/a&gt; providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13), which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash, &lt;a href="https://twitter.com/pranesh/status/1023116679440621568"&gt;pointed out&lt;/a&gt; that this was not a correct interpretation of the provision, as I had missed the significance of the word ‘necessary’, which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction this provision is equivalent to the position in the European General Data Protection Regulation (Article 6(1)(e)), and is perhaps even more restrictive.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack here, why.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —&lt;/p&gt;
&lt;p&gt;a)  where it is necessary to respond to a medical emergency&lt;br /&gt;b)  where it is necessary for the state to provide a service or benefit to the individual&lt;br /&gt;c)  where it is necessary for the state to issue any certification, licence or permit&lt;br /&gt;d)  where it is necessary under any central or state legislation, or to comply with a judicial order&lt;br /&gt;e)  where it is necessary for any measure during an epidemic, outbreak of disease or other threat to public health&lt;br /&gt;f)  where it is necessary for safety measures during a disaster or breakdown of public order&lt;/p&gt;
&lt;p&gt;In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.&lt;/p&gt;
&lt;h2&gt;Twin restrictions in Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be done so. Therefore, while acting under this provision, the state should only process my data if it needs to do so, to provide me with the service or benefit. The second restriction means that this would apply to only those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Scope of privacy under Puttaswamy&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It would be worthwhile, at this point, to delve into the nature of restrictions that, as the landmark Puttaswamy judgement discussed, the state can impose on privacy. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench was not required to provide a legal test to determine the extent and scope of the right to privacy, but it does provide sufficient guidance for us to contemplate how the limits and scope of the constitutional right to privacy could be determined in future cases.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —&lt;br /&gt;a) the existence of a “law”&lt;br /&gt;b) a “legitimate State interest”&lt;br /&gt;c) the requirement of “proportionality”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground of functions of the state thus satisfies the test of legitimate state interest. I do not dispute this claim.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proportionality and Clause 12&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement which is most operative in this context. Unlike Clauses 42 and 43, which include the twin tests of necessity and proportionality, the committee has chosen to employ only the test of necessity in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and in common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —&lt;/p&gt;
&lt;p&gt;a)  the limiting measures must be carefully designed, or rationally connected, to the objective&lt;br /&gt;b)  they must impair the right as little as possible&lt;br /&gt;c)  the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited only to situations where it may not be possible to obtain consent while providing benefits. My reservations about the sufficiency of this standard stem from observations made in the report, as well as the relatively small body of jurisprudence on this term in Indian law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness. In cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at a conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the ECHR in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor does it have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that involves minimal non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the function of state, state authorities, and other bodies authorised by them, do not need to bother with obtaining consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, the third test of proportionality is also not represented in this provision. It provides a test between the abridgement of individual rights and legitimate state interest in question, and it requires that the first must not outweigh the second. The absence of the proportionality test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, it need not evaluate the denial of consent against the service or benefit that is being provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'&gt;http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2022-03-01T14:56:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme">
    <title>Governing ID: Kenya’s Huduma Namba Programme</title>
    <link>http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
&lt;p&gt;In our fourth case-study, we use our Evaluation Framework for Digital ID to examine the use of Digital ID in Kenya.&lt;/p&gt;
&lt;p&gt;Read the &lt;a class="external-link" href="https://digitalid.design/evaluation-framework-case-studies/kenya.html"&gt;case-study&lt;/a&gt; or download as &lt;a href="http://editors.cis-india.org/internet-governance/digital-id-kenya-case-study" class="internal-link" title="Digital ID Kenya Case Study"&gt;PDF&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme'&gt;http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>internet governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital ID</dc:subject>
    
    
        <dc:subject>Digital Identity</dc:subject>
    

   <dc:date>2020-03-02T13:19:15Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017">
    <title>Discussion on Ranking Digital Rights in India (Delhi, January 07)</title>
    <link>http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017</link>
    <description>
        &lt;b&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding digital rights of their users, and to raise public awareness about the same, the Center for Internet and Society (CIS), with the support of Privacy International, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues. Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the study.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Download: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_RDRIndia-Discussion_07012017_Invitation.pdf"&gt;Invitation and agenda&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="https://rankingdigitalrights.org/"&gt;Ranking Digital Rights Corporate Responsibility Index&lt;/a&gt; is a project hosted by the Open Technology Institute at New America Foundation that aims to rank Information and Communications Technology (ICTs) companies with respect to their Governance, Freedom of Expression, and Privacy practices. The inaugural Corporate Accountability Index, released in November 2015, evaluated 16 companies based on the project’s methodology that included 31 indicators in total.&lt;/p&gt;
&lt;p&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding digital rights of their users, and to raise public awareness about the same, the Center for Internet and Society (CIS), with the support of &lt;a href="https://privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues.&lt;/p&gt;
&lt;p&gt;Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the Ranking Digital Rights study. We will begin at 10:30 am with a round of tea and coffee.&lt;/p&gt;
&lt;p&gt;The event is open to all but the venue has limited space. The participants are requested to RSVP by sending an email to &lt;a href="mailto:nisha@cis-india.org?subject=RSVP: Ranking Digital Rights Discussion"&gt;nisha@cis-india.org&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;To further encourage programmers, researchers, journalists, students, and users in general to use and contribute to the findings of the Ranking Digital Rights study, and critique the underlying methodology, we are also organising a “rankathon” on Sunday, January 08, at the CIS office in Delhi. More details can be found &lt;a href="http://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We look forward to your participation and contribution to the discussion. Please support us by sharing this invitation with your colleagues and networks.&lt;/p&gt;
&lt;h2&gt;Agenda&lt;/h2&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;10:30-11:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:00-11:15&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:15-13:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Presentation of the Findings and Discussion&lt;/strong&gt; &lt;em&gt;Divij Joshi and Aditya Singh Chawla&lt;/em&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;13:00-14:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Lunch&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;14:00-15:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #1: Parameters of Evaluation&lt;/strong&gt;&lt;br /&gt;The RDR methodology was based upon evaluating commitments to uphold human rights through their services – in particular towards their commitment to users’ freedom of expression and privacy. Are there other parameters that may be considered in the Indian context?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;15:00-16:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #2: Towards Protecting Digital Rights&lt;/strong&gt;&lt;br /&gt;What steps can be taken by the government, civil society, and industry in India to create an environment that recognizes and protects users’ digital rights? What are the relevant legal, political, and economic factors to take into consideration? What steps have other, multinational ICT companies taken? Would these be realistic for Indian companies to implement?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:00-16:30&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:30-17:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017'&gt;http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Ranking Digital Rights</dc:subject>
    
    
        <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2016-12-29T07:07:34Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/openness/design-public-conclave-6th-edition">
    <title>Design Public Conclave, 6th Edition</title>
    <link>http://editors.cis-india.org/openness/design-public-conclave-6th-edition</link>
    <description>
        &lt;b&gt;The 6th edition of the Design Public Conclave was hosted by Civic Labs, an initiative of the Center for Knowledge Studies, and part of the Vihara Innovation Network, in partnership with Social Innovation Exchange, Okapi, Business World, Business World for Smart Cities, and the Delhi Jal Board.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;This &lt;a href="http://designpublic.in/"&gt;edition of the conclave&lt;/a&gt; was focused on the challenges and opportunities faced by Indian cities. It sought to explore new mechanisms for integrating collaborative dialogue and problem solving into processes of government and citizen interaction. Participants included individuals from organisations such as Okapi, Hyderabad Urban Labs, Fields of View, Innovation Academy, Hewlett Packard, LIRNEasia, among others.&lt;/p&gt;
&lt;p&gt;The conclave began with a round of light yoga before moving into the introductory session. Namit Arora, a member of the Delhi Dialogue Commission, gave the opening remarks, introducing some of the subjects to be discussed and raising citizen engagement, massive migration, pollution, unplanned growth, housing, water and power shortages, and social problems like sectarianism and crime as some of the challenges faced in civic innovation. He stressed the lack of engagement between public and private parties and the absence of a sense of the commons in civic life in India.&lt;/p&gt;
&lt;h2&gt;What is Civic Innovation?&lt;/h2&gt;
&lt;p&gt;The first panel, titled “What is civic innovation?”, comprised Diastika Rahwidiati from Pulse Lab, Pavan Srinath from Takshashila Institution, Sriganesh Lokanathan from LIRNEasia and Aditya Dev Sood from Vihara Innovation Network. Pavan raised questions about how more people can be involved in civic issues, and spoke about the training program for public governance run by the Takshashila Institution as a means towards that. He also shared the example of the Bangalore Political Action Committee, a citizens’ collective including several eminent personalities that aims to improve the quality of life in the city. The panel went on to discuss how technology can be harnessed for social activism, and how the data revolution and data sciences can be used for civic innovation. Questions were asked about whether digital activism, such as civic hackathons, is just a passing fad. Solutions that are purely technological in nature can be misinformed, so it is essential that other actors are involved along with technologists.&lt;/p&gt;
&lt;h2&gt;The Vision of a Smart City&lt;/h2&gt;
&lt;p&gt;Next, Sumit D. Chowdhury from the Ministry of Urban Development, Karuna Gopal from Foundation for Futuristic Cities, Parvathi Menon from Innovation Alchemy, Debashish Rao from HP, Bharath Palavalli from Fields of View and Namrata Mehta from CivicLabs spoke about how smart cities can be built. Parvathi Menon kicked off the conversation by saying that while it is impossible to design smart cities, it is possible to design smart communities. Sumit Chowdhury shared some of the factors that, in his opinion, make a smart city: the creation of scalable infrastructure, transparency in governance, velocity of business and quality of life. A city that can measure itself and use that knowledge to improve itself is a true smart city. Bharath Palavalli chimed in that while technology can make cities more efficient, efficiency can be dangerous: it can become easy to forget who the city is becoming more efficient for. Here, Sumit brought up the example of Shivpur in Maharashtra, where there are water meters in every village, public consciousness about planning and services, and timely payment of taxes by citizens, to drive home the point that smart cities are driven by communities, with technology playing a role in enabling processes and the State in institutionalizing successful solutions. Finally, it was pointed out that under the 100 Smart Cities Initiative, the MoUD does not have a consistent understanding of what smart cities should be.&lt;/p&gt;
&lt;h2&gt;Dialogue between Society and State&lt;/h2&gt;
&lt;p&gt;This panel was followed by Elizabeth Elson’s keynote talk, “The dialogue between society and the state.” She spoke about the power struggle between citizens and the government over who brings about change, even in the case of technological applications. She shared her experiences from the MAMPU programme, pointing out some issues faced during it, such as too much focus on symptoms without really understanding the underlying causes, the use of intermediaries, and creating mutually empowering coalitions. She noted that the terms innovation and technology are used interchangeably, which is problematic, as not all technological solutions are innovative. Another important issue she raised was the need for technological interventions to make the media more accountable to society. This session was followed by lunch.&lt;/p&gt;
&lt;h2&gt;Changing Society and Governments&lt;/h2&gt;
&lt;p&gt;The next session was moderated by Sumadro Chattapadhyay of the Centre for Internet and Society. The panel included Garima Agarwal from Ashoka Innovators, Bangalore, and Maesy Angelina from the MAMPU programme, Jakarta. The session focussed on the appropriate modes of dialogue between civil society, the private sector and government. Maesy Angelina focussed on design thinking as one of the key methodologies for social innovation. Garima Agarwal emphasised the importance of developing empathy as an institution. The panel noted that while civil society and the private sector could continue to point out issues to the government, very often there is a failure of the government apparatus in that it does not know how to respond to these issues.&lt;/p&gt;
&lt;h2&gt;Civic Tech Demos&lt;/h2&gt;
&lt;p&gt;After lunch, there was a small session of brief pitches of examples of civic technological innovation. These included Local Circles, Meri Awaaz, SocialCops, On Track Media and BusBud. The issues that these solutions sought to address ranged from citizen engagement and awareness about reproductive issues to MNREGA, public transport and parking. I was reminded of the words of Pia Mancini, who felt that she had failed in leveraging technology to solve governance issues, as those problems were not technological but cultural. Having said that, a number of the ideas, and the desire to use technology to solve social problems, were laudable, and one hopes to see more applications like these in future.&lt;/p&gt;
&lt;h2&gt;Breakout Sessions&lt;/h2&gt;
&lt;p&gt;This was followed by three simultaneous breakout sessions on the following topics – 1) Form and Function: Data Protocols for Civic Innovation, 2) Water Management for Improved Urban Health, and 3) Gaming for Decentralized Waste Management. I was part of the group discussing data protocols for civic innovation. Various questions were raised about the implications of open data. One of the recurring themes was the question of ownership of data and who had a rightful claim over it. We broke the discussion down into two heads – risks of data, and opportunities for governance and solutions. Among risks, we discussed issues such as privacy risks, chilling effects on free speech, reliability of data, profusion of data without clear insights, social profiling and re-identification of anonymised data. We looked at different forms and opportunities for governance, including licensing and control, cross-linking of data silos, and clear guidelines on who controls and owns data. The failure of conventional data protection principles like the collection limitation and data minimisation principles was also considered, and alternative models, which involve hierarchies of different kinds of data based on potential harm through misuse, were discussed. After the breakout sessions, each group presented its observations.&lt;/p&gt;
&lt;h2&gt;Concluding&lt;/h2&gt;
&lt;p&gt;The final session was on accelerating civic innovation. The panel comprised Kartik Desai from ASHA Impact, Delhi, Nishesh Mehta from Water Co-Lab, Ahmedabad, AIyong Paul Seong from USAID, Delhi, Santosh Singh from World Bank, Delhi and Aditya Dev Sood from Vihara Innovation Network. The discussion was focussed on what kinds of services can have an impact on the way citizens interact with the state. Elizabeth Elson’s keynote on the dialogues between the state and the citizens is also relevant with regard to this discussion. Different actors including citizens, civil society actors, government institutions and industry were discussed as agents who may create the new platforms for interaction. The conclave concluded with dinner and drinks in the lawns of the Vihara Innovation Campus.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/openness/design-public-conclave-6th-edition'&gt;http://editors.cis-india.org/openness/design-public-conclave-6th-edition&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Open Innovation</dc:subject>
    
    
        <dc:subject>Openness</dc:subject>
    

   <dc:date>2016-06-18T16:45:05Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy">
    <title>Deep Packet Inspection: How it Works and its Impact on Privacy</title>
    <link>http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy</link>
    <description>
&lt;b&gt;In the last few years, there has been extensive debate and discussion around network neutrality in India. The online campaign in favor of network neutrality was led by Savetheinternet.in. The campaign was a spectacular success: it facilitated the sending of over a million emails supporting the cause of network neutrality, eventually leading to a ban on differential pricing. Following in the footsteps of the Shreya Singhal judgement, the fact that the issue of net neutrality has managed to attract wide public attention is an encouraging sign for a free and open Internet in India. Since the debate has focused largely on zero rating, other kinds of network practices that affect network neutrality, and their impact on other values, have yet to be comprehensively explored in the Indian context. In this article, the author focuses on network management in general, and deep packet inspection in particular, and how it impacts the privacy of users.&lt;/b&gt;
        &lt;h3 style="text-align: justify; "&gt;&lt;a name="_ek69t4linon1"&gt;&lt;/a&gt; Background&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In the last few years, there has been extensive debate and discussion around network neutrality in India. The online campaign in favor of Network Neutrality was led by Savetheinternet.in in India. The campaign, captured in detail by an article in Mint,	&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; was a spectacular success and facilitated sending over a million emails supporting 	the cause of network neutrality, eventually leading to ban on differential pricing. Following in the footsteps of the Shreya Singhal judgement, the fact 	that the issue of net neutrality has managed to attract wide public attention is an encouraging sign for a free and open Internet in India. Since the 	debate has been focused largely on zero rating, other kinds of network practices impacting network neutrality have yet to be comprehensively explored in 	the Indian context, nor their impact on other values. In this article, I focus on network management, in general, and deep packet inspection, in particular 	and how it impacts the privacy of users.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_ft3wpj7p1jf1"&gt;&lt;/a&gt; The Architecture of the Internet&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Internet exists as a network acting as an intermediary between providers of content and it users.	&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Traditionally, the network did not distinguish between those who provided content 	and those who were recipients of this service, in fact often, the users also functioned as content providers. The architectural design of the Internet 	mandated that all content be broken down into data packets which were transmitted through nodes in the network transparently from the source machine to the 	destination machine.&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As discussed in detail later, as per the OSI model, the network 	consists of 7 layers. We will go into each of these layers in detail below, however is important to understand that at the base is the physical layer of 	cables and wires, while at the top is application layer which contains all the functions that people want to perform on the Internet and the content 	associated with it. The layers in the middle can be characterised as the protocol layers for the purpose of this discussion. What makes the architecture of 	the Internet remarkable is that these layers are completely independent of each other, and in most cases, indifferent to the other layers. The protocol 	layer is what impacts net neutrality. It is this layer which provides the standards for the manner in which the data must flow through the network. The 	idea was for the it to be as simple and feature free as possible such that it is only concerned with the transmission data as fast as possible ('best 	efforts principle') while innovations are pushed to the layers above or below it.&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This aspect of the Internet's architectural design, which mandates that network features are implemented as the end points only (destination and source 	machine), i.e. at the application level, is called the 'end to end principle'.&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This 	means that the intermediate nodes do not differentiate between the data packets in any way based on source, application or any other feature and are only concerned with transmitting data as fast as possible, thus creating what has been described as a 'dumb' or neutral network.	&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This feature of the Internet architecture was also considered essential to what 	Jonathan Zittrain has termed as the 'generative' model of the Internet.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Since, the 	Internet Protocol remains a simple layer incapable of discrimination of any form, it meant that no additional criteria could be established for what kind 	of application would access the Internet. Thus, the network remained truly open and ensured that the Internet does not privilege or become the preserve of 	a class of applications, nor does it differentiate between the different kinds of technologies that comprise the physical layer below.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the above model speaks of a dumb network not differentiating between the data packets that travel through it, in truth, the network operators engage 	in various kinds of practices that priorities, throttle or discount certain kinds of data packets. In her thesis essay at the Oxford Internet Institute, 	Alissa Cooper&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; states that traffic management involves three different set of 	criteria- a) Some subsets of traffic needs to be managed, and arriving at a criteria to identify those subsets the criteria can be based on source, 	destination, application or users, b) Trigger for the traffic management measure which - could be based upon time of the day, usage threshold or a specific 	network condition, and c) the traffic treatment put into practice when the trigger is met. The traffic treatment can be of three kinds. The first is 	Blocking, in which traffic is prevented from being delivered. The second is Prioritization under which identified traffic is sent sooner or later. This is 	usually done in cases of congestion and one kind of traffic needs to be prioritized. The third kind of treatment is Rate limiting where identified traffic 	is limited to a defined sending rate.&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The dumb network does not interfere with an 	application's operation, nor is it sensitive to the needs of an application, and in this way it treats all information sent over it as equal. In such a 	network, the content of the packets is not examined, and Internet providers act according to the destination of the data as opposed to any other factor. 	However, in order to perform traffic management in various circumstances, Deep packet Inspection technology, which does look at the content of data packets 	is commonly used by service providers.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_r7ojhgh467u5"&gt;&lt;/a&gt; Deep Packet Inspection&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Deep packet inspection (DPI) enables the examination of the content of a data packets being sent over the Internet. Christopher Parsons explains the header 	and the payload of a data packet with respect to the OSI model. In order to understand this better, it is more useful to speak of network in terms of the 	seven layers in the OSI model as opposed to the three layers discussed above.&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the OSI model, the top layer, the Application Layer is in contact with the software making a data request. For instance, if the activity in question 	is accessing a webpage, the web-browser makes a request to access a page which is then passed on to the lower layers. The next layer is the Presentation 	Layer which deals with the format in which the data is presented. This lateral performs encryption and compression of the data. In the above example, this 	would involve asking for the HTML file. Next comes the Session Layer which initiates, manages and ends communication between the sender and receiver. In 	the above example, this would involve transmitting and regulating the data of the webpage including its text, images or any other media. These three layers 	are part of the 'payload' of the data packet.&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The next four layers are part of the 'header' of the data packet. It begins with the Transport Layer which collects data from the Payload and creates a 	connection between the point of origin and the point of receipt, and assembles the packets in the correct order. In terms of accessing a webpage, this 	involves connecting the requesting computer system with the server hosting the data, and ensuring the data packets are put together in an arrangement which 	is cohesive when they are received. The next layer is the Data Link Layer. This layer formats the data packets in such a way that that they are compatible 	with the medium being used for their transmission. The final layer is the Physical Layer which determines the actual media used for transmitting the 	packets.&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The transmission of the data packet occurs between the client and server, and packet inspect occurs through some equipment placed between the client and 	the server. There are various ways in which packet inspection has been classified and the level of depth that the inspection needs to qualify in order to 	be categorized as Deep Packet Inspection. We rely on Parson's classification system in this article. According to him, there are three broad categories of 	packet inspection - shallow, medium and deep.&lt;a href="#_ftn13" name="_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Shallow packet inspection involves the inspection of the only the header, and usually checking it against a blacklist. The focus in this form of inspection 	is on the source and destination (IP address and packet;s port number). This form of inspection primarily deals with the Data Link Layer and Network Layer 	information of the packet. Shallow Packet Inspection is used by firewalls.&lt;a href="#_ftn14" name="_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Medium Packet Inspection involves equipment existing between computers running the applications and the ISP or Internet gateways. They use application 	proxies where the header information is inspected against their loaded parse-list and used to look at a specific flows. These kinds of inspections 	technologies are used to look for specific kinds of traffic flows and take pre-defined actions upon identifying it. In this case, the header and a small 	part of the payload is also being examined.&lt;a href="#_ftn15" name="_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, Deep Packet Inspection (DPI) enables networks to examine the origin, destination as well the content of data packets (header and payload). These 	technologies look for protocol non-compliance, spam, harmful code or any specific kinds of data that the network wants to monitor. The feature of the DPI 	technology that makes it an important subject of study is the different uses it can be put to. The use cases vary from real time analysis of the packets to 	interception, storage and analysis of contents of a packets.&lt;a href="#_ftn16" name="_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_pi28w1745j15"&gt;&lt;/a&gt; The different purposes of DPI&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Network Management and QoS&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary justification for DPI presented is network management, and as a means to guarantee and ensure a certain minimum level of QoS (Quality of 	Service). Quality of Service (QoS) as a value conflicting with the objectives of Network Neutrality, has emerged as a significant discussion point in this 	topic. Much like network neutrality, QoS is also a term thrown around in vague, general and non-definitive references. The factors that come into play in 	QoS are network imposed delay, jitter, bandwidth and reliability. Delay, as the name suggests, is the time taken for a packet to be passed by the sender to the receiver. Higher levels of delay are characterized by more data packets held 'in transit' in the network.	&lt;a href="#_ftn17" name="_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; A paper by Paul Ferguson and Geoff Huston described the TCP as a 'self clocking' 	protocol.&lt;a href="#_ftn18" name="_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This enables the transmission rate of the sender to be adjusted as per 	the rate of reception by the receiver. As the delay and consequent stress on the protocol increases, this feedback ability begins to lose its sensitivity. 	This becomes most problematic in cases of VoIP and video applications. The idea of QoS generally entails consistent service quality with low delay, low 	jitter and high reliability through a system of preferential treatment provided to some traffic on a criteria formulated around the need of such traffic to 	have greater latency sensitivity and low delay and jitter. This is where Deep Packet Inspection comes into play. In 1991, Cisco pioneered the use of a new 	kind of router that could inspect data packets flowing through the network. DPI is able to look inside the packets and its content, enabling it to classify 	packets according to a formulated policy. 
DPI, which was used a security tool, to begin with, is a powerful tool as it allows ISPs to limit or block 	specific applications or improve performances of applications in telephony, streaming and real-time gaming. Very few scholars believe in an all-or-nothing approach to network neutrality and QoS and debate often comes down to what forms of differentiations are reasonable for service providers to practice.	&lt;a href="#_ftn19" name="_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
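&lt;p&gt;The QoS factors mentioned above can be made concrete with a small calculation. The sketch below estimates one-way delay and jitter from hypothetical per-packet send and arrival timestamps; real measurements would come from the network itself, and the jitter formula used here (mean absolute variation between consecutive delays) is one common convention, not the only one.&lt;/p&gt;

```python
# Hypothetical timestamps (seconds) for five packets: when sent, when received.
sent     = [0.00, 0.02, 0.04, 0.06, 0.08]
received = [0.05, 0.08, 0.09, 0.13, 0.14]

# Per-packet one-way delay.
delays = [r - s for s, r in zip(sent, received)]

# Mean delay: how long packets spend 'in transit' on average.
mean_delay = sum(delays) / len(delays)

# Jitter: mean absolute variation between consecutive delays.
jitter = sum(abs(b - a) for a, b in zip(delays, delays[1:])) / (len(delays) - 1)

print(f"mean delay = {mean_delay*1000:.1f} ms, jitter = {jitter*1000:.1f} ms")
# mean delay = 58.0 ms, jitter = 12.5 ms
```

&lt;p&gt;A latency-sensitive application such as VoIP cares about both numbers: even with a modest mean delay, high jitter forces receivers to buffer, which is why QoS schemes prioritise such traffic.&lt;/p&gt;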
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Security&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Deep Packet inspection was initially intended as a measure to manage the network and protect it from transmitting malicious programs . As mentioned above, Shallow Packet Inspection was used to secure LANs and keep out certain kinds of unwanted traffic.	&lt;a href="#_ftn20" name="_ftnref20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Similarly, DPI is used for identical purposes, where it is felt useful to 	enhance security and complete a 'deeper' inspection that also examines the payload along with the header information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Surveillance&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The third purpose of DPI is what concerns privacy theorists the most. The fact that DPI technologies enable the network operators to have access to the actual content of the data packets puts them a position of great power as well as making them susceptible to significant pressure from the state.	&lt;a href="#_ftn21" name="_ftnref21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; For instance, in US, the ISPs are required to conform to the provisions of the 	Communications Assistance for Law Enforcement Act (CALEA) which means they need to have some surveillance capacities designed into their systems. What is 	more disturbing for privacy theorists compared to the use of DPI for surveillance under legislation like CALEA, are the other alleged uses by organisation 	like the National Security Agency through back end access to the information via the ISPs. Aside from the US government, there have been various reports of use of DPI by governments in countries like China,&lt;a href="#_ftn22" name="_ftnref22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Malaysia&lt;a href="#_ftn23" name="_ftnref23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and Singapore.	&lt;a href="#_ftn24" name="_ftnref24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Behavioral targeting&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;DPI also enables very granular tracking of the online activities of Internet users. This information is invaluable for the purposes of behavioral targeting 	of content and advertising. Traditionally, this has been done through cookies and other tracking software. DPI allows new way to do this, so far exercised 	only through web-based tools to ISPs and their advertising partners. DPI will enable the ISPs to monitor contents of data packets and use this to create profiles of users which can later be employed for purposes such as targeted advertising.	&lt;a href="#_ftn25" name="_ftnref25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_gn60r7ifwcge"&gt;&lt;/a&gt; Impact on Privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Each of the above use-cases has significant implications for the privacy of Internet users as the technology in question involves access, tracking or 	retention of their online communication and usage activity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Alyssa Cooper compares DPI with other technologies carrying out content inspection such as caching services and individual users employing firewalls or packet sniffers. She argues that one of the most distinguishing feature of DPI is the potential for "mission-creep."	&lt;a href="#_ftn26" name="_ftnref26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kevin Werbach writes that while networks may deploy DPI for implementation under 	CALEA or traffic peer-to-peer shaping, once deployed DPI techniques can be used for completely different purposes such as pattern matching of intercepted 	content and storage of raw data or conclusions drawn from the data.&lt;a href="#_ftn27" name="_ftnref27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This scope of 	mission creep is even more problematic as it is completely invisible. As opposed to other technologies which rely on cookies or other web-based services, 	the inspection occurs not at the end points, but somewhere in the middle of the network, often without leaving any traces on the user's system, thus 	rendering them virtually undiscoverable.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Much like other forms of surveillance, DPI threatens the sense that the web is a space where people can engage freely with a wide range of people and 	services. For such a space to continue to exist, it is important for people to feel secure about their communication and transaction on medium. This notion 	of trust is severely harmed by a sense that users are being surveilled and their communication intercepted. This has obvious chilling effect on free speech 	and could also impact electronic commerce.&lt;a href="#_ftn28" name="_ftnref28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Allyssa Cooper also points out another way in which DPI differs from other content tracking technologies. As the DPI is deployed by the ISPs, it creates a 	greater barrier to opting out and choosing another service. There are only limited options available to individuals as far as ISPs are concerned. 	Christopher Parsons does a review of ISPs using DPI technology in UK, US and Canada and offers that various ISPs do provide in their terms of services that 	they use DPI for network management purposes. However, this information is often not as easily accessible as the terms and conditions of online services. 	A;so, As opposed to online services, where it is relatively easier to migrate to another service, due to both presence of more options and the ease of 	migration, it is a much longer and more difficult process to change one's ISP.&lt;a href="#_ftn29" name="_ftnref29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_n5w8euzb4xhb"&gt;&lt;/a&gt; Measures to mitigate risk&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Currently, there are no existing regulatory frameworks in India which deal govern DPI technology in any way. The International Telecommunications Union 	(ITU) prescribes a standard for DPI&lt;a href="#_ftn30" name="_ftnref30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; however, the standard does not engage with 	any questions of privacy and requires all DPI technologies to be capable of identifying payload data, and prescribing classification rules for specific 	applications, thus, conflicting with notions of application agnosticism in network management. More importantly, the requirements to identify, decrypt and 	analyse tunneled and encrypted data threaten the reasonable expectation of privacy when sending and receiving encrypted communication. In this final 	section, I look at some possible principles and practices that may be evolved in order to mitigate privacy risks caused due to DPI technology.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Limiting 'depth' and breadth&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It has been argued that inherently what DPI technology intends to do is matching of patterns in the inspected content against a pre-defined list which is 	relevant to the purpose how which DPI is employed. Much like data minimization principles applicable to data controllers and data processors, it is 	possible for network operators to minimize the depth of the inspection (restrict it to header information only or limited payload information) so as to 	serve the purpose at hand. For instance, in cases where the ISP is looking to identify peer-to-peer traffic, there are protocols which declare their names 	in the application header itself. Similarly, a network operators looking to generate usage data about email traffic can do so simply by looking at port 	number and checking them against common email ports.&lt;a href="#_ftn31" name="_ftnref31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, this mitigation 	strategy may not work well for other use-cases such as blocking malicious software or prohibited content or monitoring for the sake of behavioral 	advertising.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While depth referred to the degree of inspection within data packets, breadth refers to the volume of packets being inspected. Alyssa Cooper argues that 	for many DPI use cases, it may be possible to rely on pattern matching on only the first few data packets in a flow, in order to arrive at sufficient data 	to take appropriate response. Cooper uses the same example about peer-to-peer traffic. In some cases, the protocol name may appear on the header file of 	only the first packet of a flow between two peers. In such circumstances, the network operators need not look beyond the header files of the first packet 	in a flow, and can apply the network management rule to the entire flow.&lt;a href="#_ftn32" name="_ftnref32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Data retention&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Aside from the depth and breadth of inspection, another important question whether and for along is there a need for data retention. All use cases may not 	require any kind of data retention and even in case where DPI is used for behavioral advertising, only the conclusions drawn may be retained instead of 	retaining the payload data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Transparency&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One of the issues is that DPI technology is developed and deployed outside the purview of standard organizations like ISO. Hence, there has been a lack of 	open, transparent standards development process in which participants have deliberated the impact of the technology. It is important for DPI to undergo 	these process which are inclusive, in that there is participation by non-engineering stakeholders to highlight the public policy issues such as privacy. Further, aside from the technology, the practices by networks need to be more transparent.	&lt;a href="#_ftn33" name="_ftnref33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Disclosure of the presence of DPI, the level of detail being inspected or retained and the purpose for deployment of DPI can be done. Some ISPs provide some of these details in their terms of service and website notices.	&lt;a href="#_ftn34" name="_ftnref34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, as opposed to web-based services, users have limited interaction with 	their ISP. It would be useful for ISPs to enable greater engagement with their users and make their practices more transparent.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The very nature of of the DPI technology renders some aspects of recognized privacy principles like notice and consent obsolete. The current privacy frameworks under FIPP&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and OECD	&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; rely on the idea of empowering the individual by providing them with knowledge 	and this knowledge enables them to make informed choices. However, for this liberal conception of privacy to function meaningfully, it is necessary that 	there are real and genuine choices presented to the alternatives. While some principles like data minimisation, necessity and proportionality and purpose 	limitation can be instrumental in ensuring that DPI technology is used only for legitimate purposes, however, without effective opt-out mechanisms and 	limited capacity of individual to assess the risks, the efficacy of privacy principles may be far from satisfactory.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ongoing Aadhaar case and a host of surveillance projects like CMS, NATGRID, NETRA&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and NMAC	&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; have raised concerns about the state conducting mass-surveillance, particularly 	of online content. In this regard, it is all the more important to recognise the potential of Deep Packet Inspection technologies for impact on privacy 	rights of individuals. Earlier, the Centre for Internet and Society had filed Right to Information applications with the Department of Telecommunications, Government of India regarding the use of DPI, and the government had responded that there was no direction/reference to the ISPs to employ DPI technology.	&lt;a href="#_ftn39" name="_ftnref39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Similarly, MTNL also responded to the RTI Applications and denied using the 	technology.&lt;a href="#_ftn40" name="_ftnref40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; It is notable though, that they did not respond to the questions 	about the traffic management policies they follow. Thus, so far there has been little clarity on actual usage of DPI technology by the ISPs.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ashish Mishra, "India's Net Neutrality Crusaders", available at 			&lt;a href="http://mintonsunday.livemint.com/news/indias-net-neutrality-crusaders/2.3.2289565628.html"&gt; http://mintonsunday.livemint.com/news/indias-net-neutrality-crusaders/2.3.2289565628.html &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.livinginternet.com/i/iw_arch.htm"&gt;http://www.livinginternet.com/i/iw_arch.htm&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Vinton Cerf and Robert Kahn, "A protocol for packet network intercommunication", available at 			&lt;a href="https://www.semanticscholar.org/paper/A-protocol-for-packet-network-intercommunication-Cerf-Kahn/7b2fdcdfeb5ad8a4adf688eb02ce18b2c38fed7a"&gt; https://www.semanticscholar.org/paper/A-protocol-for-packet-network-intercommunication-Cerf-Kahn/7b2fdcdfeb5ad8a4adf688eb02ce18b2c38fed7a &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ganley and Ben Algove, "Network Neutrality-A User's Guide", available at			&lt;a href="http://wiki.commres.org/pds/NetworkNeutrality/NetNeutrality.pdf"&gt;http://wiki.commres.org/pds/NetworkNeutrality/NetNeutrality.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; J H Saltzer, D D Clark and D P Reed, "End-to-End arguments in System Design", available at			&lt;a href="http://web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf"&gt;http://web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 4.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jonathan Zittrain, The future of Internet - and how to stop it, (Yale University Press and Penguin UK, 2008) available at 			&lt;a href="https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future%20of%20the%20Internet.pdf?sequence=1"&gt; https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future%20of%20the%20Internet.pdf?sequence=1 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alissa Cooper, How Regulation and Competition Influence Discrimination in Broadband Traffic Management: A Comparative Study of Net Neutrality in 			the United States and the United Kingdom available at 			&lt;a href="http://ora.ox.ac.uk/objects/uuid:757d85af-ec4d-4d8a-86ab-4dec86dab568"&gt; http://ora.ox.ac.uk/objects/uuid:757d85af-ec4d-4d8a-86ab-4dec86dab568 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Christopher Parsons, "The Politics of Deep Packet Inspection: What Drives Surveillance by Internet Service Providers?", available at 			&lt;a href="https://www.christopher-parsons.com/the-politics-of-deep-packet-inspection-what-drives-surveillance-by-internet-service-providers/"&gt; https://www.christopher-parsons.com/the-politics-of-deep-packet-inspection-what-drives-surveillance-by-internet-service-providers/ &lt;/a&gt; at 15.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 16.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 19.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jay Klein, "Digging Deeper Into Deep Packet Inspection (DPI)", available at			&lt;a href="http://spi.unob.cz/papers/2007/2007-06.pdf"&gt;http://spi.unob.cz/papers/2007/2007-06.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Tim Wu, "Network Neutrality: Broadband Discrimination", available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=388863"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=388863&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ferguson and Geoff Huston, "Quality of Service on the Internet: Fact, Fiction,&lt;/p&gt;
&lt;p&gt;or Compromise?", available at &lt;a href="http://www.potaroo.net/papers/1998-6-qos/qos.pdf"&gt;http://www.potaroo.net/papers/1998-6-qos/qos.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Barbara van Schewick, "Network Neutrality and Quality of Service: What a non-discrimination Rule should look like", available at 			&lt;a href="http://cyberlaw.stanford.edu/downloads/20120611-NetworkNeutrality.pdf"&gt; http://cyberlaw.stanford.edu/downloads/20120611-NetworkNeutrality.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 14.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ohm, "The Rise and Fall of Invasive ISP Surveillance," available at 			&lt;a href="http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf"&gt; http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ben Elgin and Bruce Einhorn, "The great firewall of China", available at 			&lt;a href="http://www.bloomberg.com/news/articles/2006-01-22/the-great-firewall-of-china"&gt; http://www.bloomberg.com/news/articles/2006-01-22/the-great-firewall-of-china &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Mike Wheatley, "Malaysia's Web Heavily Censored Before Controversial Elections", available at 			&lt;a href="http://siliconangle.com/blog/2013/05/06/malaysias-web-heavily-censored-before-controversial-elections/"&gt; http://siliconangle.com/blog/2013/05/06/malaysias-web-heavily-censored-before-controversial-elections/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Fazal Majid, "Deep packet inspection rears it ugly head" available at			&lt;a href="https://majid.info/blog/telco-snooping/"&gt;https://majid.info/blog/telco-snooping/&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alissa Cooper, "Doing the DPI Dance: Assessing the Privacy Impact of Deep Packet Inspection," in W. Aspray and P. Doty (Eds.), Privacy in America: 			Interdisciplinary Perspectives, Plymouth, UK: Scarecrow Press, 2011 at 151.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 148.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kevin Werbach, "Breaking the Ice: Rethinking Telecommunications Law for the Digital Age", Journal of Telecommunications and High Technology, 			available at &lt;a href="http://www.jthtl.org/articles.php?volume=4"&gt;http://www.jthtl.org/articles.php?volume=4&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 149.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 147.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; International Telecommunications Union, Recommendation ITU-T.Y.2770, Requirements for Deep Packet Inspection in next generation networks, available 			at &lt;a href="https://www.itu.int/rec/T-REC-Y.2770-201211-I/en"&gt;https://www.itu.int/rec/T-REC-Y.2770-201211-I/en&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 154.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 156.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 10.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ohm, "The Rise and Fall of Invasive ISP Surveillance", available at 			&lt;a href="http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf"&gt; http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.nist.gov/nstic/NSTIC-FIPPs.pdf"&gt;http://www.nist.gov/nstic/NSTIC-FIPPs.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm"&gt; https://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; "India's Surveillance State" Software Freedom Law Centre, available at 			&lt;a href="http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/"&gt; http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Amber Sinha, "Are we losing our right to privacy and freedom on speech on Indian Internet", DNA, available at 			&lt;a href="http://www.dnaindia.com/scitech/column-are-we-losing-the-right-to-privacy-and-freedom-of-speech-on-indian-internet-2187527"&gt; http://www.dnaindia.com/scitech/column-are-we-losing-the-right-to-privacy-and-freedom-of-speech-on-indian-internet-2187527 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://cis-india.org/telecom/use-of-dpi-technology-by-isps.pdf"&gt;http://cis-india.org/telecom/use-of-dpi-technology-by-isps.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Smita Mujumdar, "Use of DPI Technology by ISPs - Response by the Department of Telecommunications" available at 			&lt;a href="http://cis-india.org/telecom/dot-response-to-rti-on-use-of-dpi-technology-by-isps"&gt; http://cis-india.org/telecom/dot-response-to-rti-on-use-of-dpi-technology-by-isps &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy'&gt;http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-12-16T23:14:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/files/data-protection-submission">
    <title>Data Protection Submission</title>
    <link>http://editors.cis-india.org/internet-governance/files/data-protection-submission</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/files/data-protection-submission'&gt;http://editors.cis-india.org/internet-governance/files/data-protection-submission&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-04-18T16:37:05Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/files/data-for-the-benefit-of-people">
    <title>Data for the Benefit of People</title>
    <link>http://editors.cis-india.org/internet-governance/files/data-for-the-benefit-of-people</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/files/data-for-the-benefit-of-people'&gt;http://editors.cis-india.org/internet-governance/files/data-for-the-benefit-of-people&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-12-01T04:21:32Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/counter-comments-on-trais-consultation-paper-on-privacy-security-and-ownership-of-data-in-telecom-sector">
    <title>Counter Comments on TRAI's Consultation Paper on Privacy, Security and Ownership of Data in Telecom Sector</title>
    <link>http://editors.cis-india.org/internet-governance/blog/counter-comments-on-trais-consultation-paper-on-privacy-security-and-ownership-of-data-in-telecom-sector</link>
    <description>
        &lt;b&gt;The Centre for Internet &amp; Society (CIS) has commented on the Consultation Paper on Privacy, Security and Ownership of Data in Telecom Sector published by the Telecom Regulatory Authority of India on August 9, 2017.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The submission is divided in three main parts. The first part 'Preliminary' introduces the document. The second part 'About CIS' is an overview of the organization. The third part contains the 'Counter Comments' on the Consultation Paper taking into account the submission made by other stakeholders.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Download the &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/counter-comments.pdf"&gt;full submission here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/counter-comments-on-trais-consultation-paper-on-privacy-security-and-ownership-of-data-in-telecom-sector'&gt;http://editors.cis-india.org/internet-governance/blog/counter-comments-on-trais-consultation-paper-on-privacy-security-and-ownership-of-data-in-telecom-sector&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-11-23T14:29:06Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017">
    <title>Comments on the Right to Information Rules, 2017</title>
    <link>http://editors.cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017</link>
    <description>
&lt;b&gt;On March 31st, 2017, the Ministry of Personnel, Public Grievances and Pensions, Department of Personnel and Training released a Circular framing rules under the Right to Information Act, 2005 (“RTI Rules”). The Ministry invited comments on the RTI Rules. CIS submitted its comments on April 25, 2017.&lt;/b&gt;
        
&lt;h3 dir="ltr"&gt;1. Preliminary&lt;/h3&gt;
&lt;p dir="ltr"&gt;1.1 On March 31st, 2017, the Ministry of Personnel, Public Grievances and Pensions, Department of Personnel and Training released a Circular framing rules under the Right to Information Act, 2005 (“RTI Rules”). The Ministry invited comments on on the RTI Rules.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;2. The Centre for Internet and Society&lt;/h3&gt;
&lt;p dir="ltr"&gt;2.1. The Centre for Internet and Society, (“CIS”), is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with diverse abilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, and open access), internet governance, telecommunication reform, digital privacy, and cyber-security.&lt;/p&gt;
&lt;h3 dir="ltr"&gt;3. Comments&lt;/h3&gt;
&lt;p dir="ltr"&gt;3.1 General Comments&lt;/p&gt;
&lt;p dir="ltr"&gt;The new RTI Rules introduce various procedural hurdles and provides a great deal of discretionary power to the CIC in dealing with RTI applications and appeals. One of the provisions which has attracted attention in the past also is the abatement of appeals upon the death of the RTI applications. This provision, explored in more detail is especially objectionable in light of the threats that RTI activists face.&lt;/p&gt;
&lt;p&gt;&lt;strong id="docs-internal-guid-f3638231-aeb5-9d2f-4329-a2fd7d07f81a"&gt;&lt;br /&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2 Specific Comments&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.1 Rule 4 of the RTI Rules states that the fees for providing information under the RTI Act would be ‘as notified by Central Government from time to time’. While the RTI Rules also prescribe the fee for filing RTI applications, this phrase provides a window to increase the fees through subsequent notifications. We recommend that the phrase “or as notified by Central Government from time to time” be deleted in order prevent prohibitive increase in the fees in future.&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.2 Rule 4 of the RTI Rules also specifies the fees for provision of information via floppies and diskettes. There is no plausible reason to engage in continued rulemaking applicable to outdated modes of data storage. It would be of much more help if the rules were to prescribe fees for CDs, DVDs and email. We also submit that no fees need be charged for information provided through emails, and this mode of communication must be adopted where possible.&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.3 Rule 8 (1)(viii) states that every appellant must affirm that they have not filed an appeal pertaining to similar matters before the Commission or any court. However, the same matter can lead to multiple counts of causes of actions, and the principle of res judicata barring further action should not apply in these cases. Therefore, it is recommended that this requirement is deleted.&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.4 &amp;nbsp;Rule 12 permits the withdrawal of an appeal on the request of the appellant and &amp;nbsp;the &amp;nbsp;abatement &amp;nbsp;of &amp;nbsp;an &amp;nbsp;appeal &amp;nbsp;on &amp;nbsp;the &amp;nbsp;death &amp;nbsp;of &amp;nbsp;the &amp;nbsp;appellant. This provisions needs to be evaluated in light of the increasing number of cases of threats received by RTI activists. There have been close to 400 documented cases of attacks on RTI applicants,[1] including cases of murder and physical assault. This provision will serve to enable withdrawal of RTI appeals through harassment and other means of coercion.&lt;/p&gt;
&lt;p dir="ltr"&gt;Further, the abatement of an appeal upon death of an RTI appellant is a clause without any merit and could translate into murders of appellants to cause abatement of the appeal. Additionally, the Supreme Court’s judgment in the matter of Union of India v. Namit Sharma[2] must be kept in mind which clarified the position that RTI applications and appeals are not in the nature of lis and deal with the question of whether requested information ought to be disclosed. Therefore, there is no reason why appeals should abate upon the demise of the appellant.&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.5 &amp;nbsp;Rule 14 permits the CIC to return complaints due to non-compliance with the procedural rules in Rule 13. Such rules[3] have been used in the past to return complaints on unreasonable or artificial grounds. This is an example of additional procedural hurdles introduced by through the rulemaking process instead of making the process more citizen friendly.&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.6 Rule 15 (iii) of the RTI Rules gives the CIC the discretion to close a case without even allowing a hearing to the applicant. Nor is there any requirement for the CIC to provide detailed reasoning for its determination. This rule violates the right to be heard before adjudication under the principles of natural justice.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;3.2.7 The redressal mechanism under Rule 16 of the RTI Rules leaves much to be desired. Beginning with the use of the term ‘communication’ to refer to a complaint regarding non-compliance with the CIC’s order, the rule takes a cavalier approach to addressing the significant number of cases of non-compliance with the CIC’s orders. Further, no clear procedure is spelt out for how the CIC will deal with such matters, or whether parties may be heard before an adjudication is made. There is also an inconsistency: a communication may be rejected if not submitted in the prescribed format, whereas in the case of appeals it is clearly stated that they may not be returned or rejected only on the ground of non-compliance with the format.&lt;/p&gt;
&lt;p dir="ltr"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;[1]  http://attacksonrtiusers.org&lt;/p&gt;
&lt;p dir="ltr"&gt;[2]  https://indiankanoon.org/doc/47938967/&lt;/p&gt;
&lt;p dir="ltr"&gt;[3]  Rule 9 of the RTI Rules, 2012.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017'&gt;http://editors.cis-india.org/openness/blog-old/comments-on-the-right-to-information-rules-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Openness</dc:subject>
    
    
        <dc:subject>RTI</dc:subject>
    
    
        <dc:subject>Call for Comments</dc:subject>
    

   <dc:date>2017-04-27T09:25:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
