<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="http://editors.cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>http://editors.cis-india.org</link>
  
  <description>
    
            These are the search results for the query, showing results 41 to 52.
        
  </description>
  
  
  
  
  <image rdf:resource="http://editors.cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/comments-on-information-technology-security-of-prepaid-payment-instruments-rules-2017"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/social-media-monitoring"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/aadhaar-bill-fails-to-incorporate-suggestions-by-the-standing-committee"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/openness/design-public-conclave-6th-edition"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations">
    <title>Analysis of Key Provisions of the Aadhaar Act Regulations</title>
    <link>http://editors.cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations</link>
    <description>
        &lt;b&gt;In exercise of the powers conferred by the Aadhaar (Targeted Delivery of Financial and other Subsidies, Benefits and Services) Act, 2016 (Aadhaar Act), the UIDAI came out with a set of five regulations in late 2016. In this policy brief, we look at the five regulations and their key provisions, and highlight the issues left unresolved, unaddressed, or created as a result of these regulations. &lt;/b&gt;
        &lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;This blog post was edited by Elonnai Hickok.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h3 style="text-align: justify; "&gt;Introduction&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;At the outset it is important to note that a concerning feature of these regulations is that they intend to govern the processes of a body which has been in existence for over six years, and has engaged in all the activities sought to be governed by these policies at a massive scale, considering the claims of over one billion Aadhaar number holders. However, the regulations do not acknowledge, let alone address, past processes, practices, enrolments, authentications, use of technology, etc., and there are no provisions that effectively address the past operations of the UIDAI. Below is an analysis of the five regulations issued thus far by the UIDAI.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Unique Identification Authority of India (Transactions of Business at Meetings of the Authority) Regulations&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;These regulations framed under clause (h) of sub-section (2) of section 54 read with sub-section (1) of section 19 of the Aadhaar Act, deal with the meetings of the UIDAI, the process following up to each meeting, and the manner in which all meetings are to be conducted.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 3.&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Meetings of the Authority– (1) There shall be no less than three meetings of the Authority in a financial year on such dates and at such places as the Chairperson may direct and the interval between any two meetings shall not in any case, be longer than five months&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The number of times that the UIDAI would meet in a year is far too low, taking into account the significance of its responsibilities as the sole policy-making body for all issues related to Aadhaar. In contrast, the Telecom Regulatory Authority of India is required to meet at least once a month. Other bodies such as SEBI and IRDAI are also required to meet at least four times&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and six times&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; in a year respectively.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 8 (5)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Decisions taken at every meeting of the Authority shall be published on the website of Authority unless the Chairperson determines otherwise on grounds of ensuring confidentiality.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The Chairperson has the power to determine withholding publication of the decisions of the meeting on the broad grounds of ‘confidentiality’. Given the fact that the decisions taken by UIDAI as a public body can have very real implications for the rights of residents, the ground of confidentiality is not sufficient to warrant withholding publication. It is curious that instead of referring to the clearly defined exceptions laid down in other similar provisions such as the exceptions in Section 8 of the Right to Information Act, 2005, the rules merely refer to vague and undefined criteria of ‘confidentiality’.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 14 (4)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Members of the Authority and invitees shall sign an initial Declaration at the first meeting of the Authority for maintaining the confidentiality of the business transacted at meetings of the Authority in Schedule II.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The above provision, combined with the fact that there is no provision regarding publication of the minutes of the meetings of the UIDAI, raises serious questions about the transparency of its functioning.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Unique Identification Authority of India (Enrolment and Update) Regulations&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;These regulations, framed under sub-section (1), and sub-clauses (a), (b), (d), (e), (j), (k), (l), (n), (r), (s), and (v) of sub-section (2), of Section 54 of the Aadhaar Act, deal with the enrolment process, the generation of an Aadhaar number, and the updation of information, and govern the conduct of enrolment agencies and associated third parties.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provisions:&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 8 (2), (3) and (4)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The standard enrolment/update software shall have the security features as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;All equipment used in enrolment, such as computers, printers, biometric devices and other accessories shall be as per the specifications issued by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The biometric devices used for enrolment shall meet the specifications, and shall be certified as per the procedure, as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 3 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The standards for collecting the biometric information shall be as specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 4 (5)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The standards of the above demographic information shall be as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 6 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For residents who are unable to provide any biometric information contemplated by these regulations, the Authority shall provide for handling of such exceptions in the enrolment and update software, and such enrolment shall be carried out as per the procedure as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 14 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In case of rejection due to duplicate enrolment, resident may be informed about the enrolment against which his Aadhaar number has been generated in the manner as may be specified by the Authority.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;Though in February 2017 the UIDAI published technical specifications for registered devices&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;, the regulations leave unaddressed issues such as the lack of appropriately defined security safeguards in the Aadhaar project. There is a general trend of continued deferrals in the regulations, which state that matters will be specified later on important aspects such as rejection of applications, uploading of the enrolment packet to the CIDR, the procedure for enrolling residents with biometric exceptions, the procedure for informing residents about acceptance/rejection of enrolment applications, the convenience fee for updation of residents’ information, the procedure for authenticating individuals across services, etc. There is a clear failure to exercise the mandate delegated to the UIDAI, leaving key matters to be determined at an unspecified future date. The delay and ambiguity around when regulations will be defined is all the more problematic in light of the fact that the project has been implemented since 2010 and the Aadhaar number is now mandatory for availing a number of services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Further, it is important to note that a number of policies put out by the UIDAI predate these regulations, on which the regulations are completely silent, thus neither endorsing previous policies nor suggesting that they may be revisited. The regulations also choose not to engage with the question of the operation of the Aadhaar project, enrolment and storage of data, etc., prior to their notification, or with the policies which these regulations may regularise. For instance, the regulations do not specify any measures to deal with issues arising out of enrolment devices used prior to the development of the February 2017 specifications.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 32&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;The Authority shall set up a contact centre to act as a central point of contact for resolution of queries and grievances of residents, accessible to residents through toll free number(s) and/ or e-mail, as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(2) The contact centre shall:&lt;/p&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Provide a mechanism to log queries or grievances and provide residents with a unique reference number for further tracking till closure of the matter;&lt;/li&gt;
&lt;li&gt;Provide regional language support to the extent possible;&lt;/li&gt;
&lt;li&gt;Ensure safety of any information received from residents in relation to their identity information;&lt;/li&gt;
&lt;li&gt;Comply with the procedures and processes as may be specified by the Authority for this purpose.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;(3) Residents may also raise grievances by visiting the regional offices of the Authority or through any other officers or channels as may be specified by the Authority.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;While the setting up of a grievance redressal mechanism under the regulations is a welcome move, there is little clarity about the procedure to be followed, nor is a timeline specified. The chapter on grievance redressal is in fact one of the shortest chapters in the regulations. The only provision in this chapter deals with the setting up of a contact centre, a curious choice of term for what is supposed to be the primary quasi-judicial grievance redressal body for the Aadhaar project. In line with the indifferent and insouciant terminology of ‘contact centre’, the chapter is restricted to the logging of queries and grievances by the contact centre, and does not address procedure or timelines, let alone substantive provisions about the nature of redress available. Furthermore, the obligation on the contact centre to protect information received is limited to ‘ensuring safety’, an ambiguous standard that does not speak to any other standards in Indian law.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Aadhaar (Authentication) Regulations, 2016&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;These regulations, framed under  sub-section (1), and sub-clauses (f) and (w) of sub-section (2) of Section 54 of the Aadhaar Act deals with the authentication framework for Aadhaar numbers, the governance of authentication agencies and the procedure for collection, storage of authentication data and records.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provisions:&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 5 (1)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At the time of authentication, a requesting entity shall inform the Aadhaar number holder of the following details:—&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(a) the nature of information that will be shared by the Authority upon authentication;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(b) the uses to which the information received during authentication may be put; and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(c) alternatives to submission of identity information&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sub-Regulation 6 (2)&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A requesting entity shall obtain the consent referred to in sub-regulation (1) above in physical or preferably in electronic form and maintain logs or records of the consent obtained in the manner and form as may be specified by the Authority for this purpose.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;Sub-regulation 5 mentions that at the time of authentication, requesting entities shall inform the Aadhaar number holder of alternatives to submission of identity information for the purpose of authentication. Similarly, sub-regulation 6 mentions that the requesting entity shall obtain the consent of the Aadhaar number holder for the authentication. However, in neither of the above circumstances do the regulations specify the clearly defined options that must be made available to the Aadhaar number holder in case they do not wish to submit identity information, nor do the regulations specify the procedure to be followed in case the Aadhaar number holder does not provide consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most significantly, this provision does little by way of allaying the fears raised by the language in Section 8 (4) of the Aadhaar Act which states that UIDAI “shall respond to an authentication query with a positive, negative or any other appropriate response sharing such identity information.” This section gives a very wide discretion to UIDAI to share personal identity information with third parties, and the regulations do not temper or qualify this power in any way.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Sub-Regulation 11 (1) and (4)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;The Authority may enable an Aadhaar number holder to permanently lock his biometrics and temporarily unlock it when needed for biometric authentication.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Authority may make provisions for Aadhaar number holders to remove such permanent locks at any point in a secure manner.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;A welcome provision in the regulations is that of biometric locking, which allows Aadhaar number holders to permanently lock their biometrics and temporarily unlock them only when needed for biometric authentication. However, in the same breath, the regulations also provide for the UIDAI to make provisions to remove such locks without any specified grounds for doing so.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 18 (2), (3) and (4)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;The logs of authentication transactions shall be maintained by the requesting entity for a period of 2 (two) years, during which period an Aadhaar number holder shall have the right to access such logs, in accordance with the procedure as may be specified.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Upon expiry of the period specified in sub-regulation (2), the logs shall be archived for a period of five years or the number of years as required by the laws or regulations governing the entity, whichever is later, and upon expiry of the said period, the logs shall be deleted except those records required to be retained by a court or required to be retained for any pending disputes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The requesting entity shall not share the authentication logs with any person other than the concerned Aadhaar number holder upon his request or for grievance redressal and resolution of disputes or with the Authority for audit purposes. The authentication logs shall not be used for any purpose other than stated in this sub-regulation.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;While it is specified that the authentication logs collected by the requesting entities shall not be shared with any person other than the concerned Aadhaar number holder upon their request, or for grievance redressal and resolution of disputes, or with the Authority for audit purposes, and that the authentication logs may not be used for any other purpose, the maintenance of the logs for a period of seven years seems excessive. Similarly, the UIDAI is also supposed to store authentication transaction data for over five years. This is in violation of the widely recognised data minimisation principle, which requires that data collectors and data processors delete personal data records once the purpose for which they were collected is fulfilled. While retention of data for audit and dispute-resolution purposes is legitimate, the lack of specified security standards, the overall lack of transparency, and the inadequate grievance redressal mechanism greatly exacerbate the risks associated with data retention.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Aadhaar (Sharing of Information) Regulations, 2016 and Aadhaar (Data security) Regulations, 2016&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Framed under the powers conferred by sub-section (1), and sub-clause (o) of sub-section (2), of Section 54 read with sub-clause (k) of sub-section (2) of Section 23, and sub-sections&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(2) and (4) of Section 29, of the Aadhaar Act, the Sharing of Information regulations look at the restrictions on sharing of identity information collected by the UIDAI and requesting entities. The Data Security regulation, framed under powers conferred by clause (p) of subsection (2) of section 54 of the Aadhaar Act, looks at security obligations of all service providers engaged by the UIDAI.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;Provision: Sub-Regulation 6 (1)&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;All agencies, consultants, advisors and other service providers engaged by the Authority, and ecosystem partners such as registrars, requesting entities, Authentication User Agencies and Authentication Service Agencies shall get their operations audited by an information systems auditor certified by a recognised body under the Information Technology Act, 2000 and furnish certified audit reports to the Authority, upon request or at time periods specified by the Authority.&lt;/p&gt;
&lt;h5 style="text-align: justify; "&gt;Observations:&lt;/h5&gt;
&lt;p style="text-align: justify; "&gt;The regulation states that audits shall be conducted by an information systems auditor certified by a recognised body under the Information Technology Act, 2000. However, there is no such certifying body under the Information Technology Act. This suggests a lack of diligence in framing the rules, and will inevitably lead to inordinate delays, or alternatively, to the absence of a clear procedure for the appointment of an auditor. Further, instead of prescribing a regular and proactive process of audits, the regulation limits audits to when requested or as deemed appropriate by the UIDAI. This is another in a long line of provisions whose effect is to concentrate power in the hands of the UIDAI, with little scope for accountability and transparency.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In conclusion, it must be stated that the regulations promulgated by the UIDAI leave a lot to be desired. Some of the most important issues raised against the Aadhaar Act, which were delegated to the UIDAI’s rule-making powers, have not been addressed at all. Matters such as data security policies, the right of Aadhaar number holders to access their records, the procedure to be followed by the grievance redressal bodies, the uploading of the enrolment packet to the CIDR, the procedure for enrolling residents with biometric exceptions, and the procedure for informing residents about acceptance/rejection of enrolment applications have been left unaddressed, to be ‘specified’ at a later date. These failures leave a gaping hole, especially in light of the absence of comprehensive data protection legislation in India, the speed and haste with which enrolment and seeding have been carried out by the UIDAI, and the number of services, both private and public, which are using or planning to use the Aadhaar number and the authentication process as a primary identifier for residents.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://www.irda.gov.in/ADMINCMS/cms/frmGeneral_Layout.aspx?page=PageNo62&amp;amp;flag=1"&gt;https://www.irda.gov.in/ADMINCMS/cms/frmGeneral_Layout.aspx?page=PageNo62&amp;amp;flag=1&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.sebi.gov.in/acts/boardregu.html"&gt;http://www.sebi.gov.in/acts/boardregu.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/images/resource/aadhaar_registered_devices_2_0_09112016.pdf"&gt;https://uidai.gov.in/images/resource/aadhaar_registered_devices_2_0_09112016.pdf&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Available at &lt;a href="https://uidai.gov.in/legal-framework/acts/regulations.html"&gt;https://uidai.gov.in/legal-framework/acts/regulations.html&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations'&gt;http://editors.cis-india.org/internet-governance/blog/analysis-of-key-provisions-of-aadhaar-act-regulations&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>UID</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>UIDAI</dc:subject>
    
    
        <dc:subject>Biometrics</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    

   <dc:date>2017-04-03T14:05:01Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/comments-on-information-technology-security-of-prepaid-payment-instruments-rules-2017">
    <title>Comments on Information Technology (Security of Prepaid Payment Instruments) Rules, 2017</title>
    <link>http://editors.cis-india.org/internet-governance/blog/comments-on-information-technology-security-of-prepaid-payment-instruments-rules-2017</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society submitted comments on the Information Technology (Security of Prepaid Payment Instruments) Rules, 2017. The comments were prepared by Udbhav Tiwari, Pranesh Prakash, Abhay Rana, Amber Sinha and Sunil Abraham. &lt;/b&gt;
        &lt;h3 style="text-align: justify; "&gt;1. Preliminary&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;1.1. This submission presents comments by the Centre for Internet and Society&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt; in response to the Information Technology (Security of Prepaid Payment Instruments) Rules 2017 (“the Rules”).&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt; The Ministry of Electronics and Information Technology (MEIT) issued a consultation paper on March 08, 2017, calling for the development of a framework for the security of digital wallets operating in the country. These proposed rules have been drafted under the provisions of the Information Technology Act, 2000, and comments have been invited from the general public and stakeholders before their enactment.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;2. The Centre for Internet and Society&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;2.1. The Centre for Internet and Society, (“CIS”), is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with diverse abilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, and open access), internet governance, telecommunication reform, digital privacy, and cyber-security.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;2.2. This submission is consistent with CIS’ commitment to safeguarding general public interest, and the interests and rights of various stakeholders involved, especially the privacy and data security of citizens. CIS is thankful to the MEIT for this opportunity to provide feedback to the draft rules.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;3. Comments&lt;/h3&gt;
&lt;h4 style="text-align: justify; "&gt;3.1  General Comments&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Penalty&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There is no penalty for non-compliance with these rules. Even the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 do not prescribe penalties. Under Section 43A of the Information Technology Act (under which the 2011 Rules were promulgated), a wrongful gain or a wrongful loss needs to be demonstrated. This should not be a requirement for the financial sector.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Expansion to Contractual Parties&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A majority of these rules, in order to be effective and realistically protect consumer interest, should also be extended to third parties, agents, contractual relationships, and any other relevant parties to whom an e-PPI issuer may delegate functions as part of its operations.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.2  Rule 2: Definitions&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Certain key words relevant to the field of e-PPI based digital payments such as authorisation, metadata, etc. are not defined in the rules and should both be defined and accounted for in the rules to ensure modern developments such as big data and machine learning, digital surveillance, etc. do not violate human rights and consumer interest.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.3  Rule 7: Definition of personal information&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Rule 7 provides an exhaustive list of data that will be deemed to be personal information for the purposes of the Rules. While &lt;b&gt;information collected&lt;/b&gt; at the time of issuance of the pre-paid payment instrument and during its use is included within the scope of Rule 7, it makes no reference to metadata generated and collected by the e-PPI issuer.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.3 Rule 4: Inadequate privacy protections&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Rule 4(2) specifies the details that the privacy policies of each e-PPI issuer must contain. However, these specifications are highly inadequate and fall well below the recommendations under the National Privacy Principles in Report of the Group of Experts on Privacy chaired by Justice A P Shah.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Suggestions: The Rules should include include clearly specified rights to access, correction and opt in/opt out, continuing obligations to seek consent in case of change in policy or purpose and deletion of data after purpose is achieved. Additionally, it must be required that a log of each version of past privacy policies be maintained along with the relevant period of applicability.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.4 Rule 10: Reasonable security practices&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Problem: Financial information (“such as bank account or credit card or debit card or other payment instrument details”) is already invoked in an inclusive manner in the definition of ‘personal information’ in Rule 7.  Given this there is no need to make the Reasonable Security Practices Rules applicable to financial data through this provisions: it already is, and it is best to avoid unnecessary redundancy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Solution: This entire rule should be removed.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.5  Rule 12: Traceability&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Problem: There is a requirement created under this rule that payment-related interactions with customers or other service providers be “appropriately trace[able]”.  But it is unclear what that would practically mean: would IP logging suffice? would IMEI need to be captured for mobile transactions? what is “appropriately” traceable? — none of those questions are answered.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Suggestion: The NPCI’s practices and RBI regulations, for instance, seek to limit the amount of information that entities like e-PPI providers have.  These rules need to be brought in line with those practices and regulations.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.6 Rule 5: Risk Assessment&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Rule 5 requires e-PPI issuers to carry out risk assessments associated with the security of the payments systems at least once a year and after any major security incident. However, there are no transparency requirements such as publications of details of such review, a summary of the analysis, any security vulnerabilities discovered etc.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Suggestion:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt;Broaden the scope of this provision to include not just risk assessments but also security audits.&lt;/li&gt;
&lt;li&gt;Mandate publication of risk assessment and security audit reports.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt; &lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.7 Rule 11: End-to-End Encryption&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;The rule concerning end-to-end encryption (E2E) needs significantly greater detailing to be effective in ensuring the the protection of information at both storage and transit.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Suggestions: Elements such as Secure Element or a Secured Server and Trusted User Interface, both concepts to enable secure payments, can be detailed in the rule and a timeline can be established to require hardware, e-PPI practices and security standards to realistically account for such best practices to ensure modern, secure and industry accepted implementation of the rule.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.8 Rule 13: Retention of Information&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Problem: Rule 13 leaves the question of retention entirely unanswered by deferring the future rulemaking to the Central Government.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Suggestions: Rule 13 should be expanded to include the various categories of information that can be stored, guidelines for the short-term (fast access) and long-term storage of the information retained under the rule and other relevant details. The rule should also include the security standards that should be followed in the storage of such information, require access logs be maintained for whenever this information is accessed by individuals, detail secure destruction practices at the end of the retention period  and finally mandate that end users be notified by the e-PPI issuer of when such retained information is accessed in all situations bar exceptional circumstances such as national security, compromising an ongoing criminal investigations, etc.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.9 Rule 14: Reporting of Cyber Incidents&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Rule 14 is an excellent opportunity to uphold transparency, accountability and consumer rights by mandating time- and information-bound notification of cyber incidents to customers, including intrusions, database breaches and any other compromise of the integrity of the financial system. While the requirement of reporting such incidents to CERT-In is already present in the Rule 12 of the CERT Rules, the rule retains the optional nature of notifying customers. The rule should include an exhaustive list of categories or kinds of cyber incidents that should be reported to affected end users without compromising the investigation of such breaches by private organisations and public authorities. Further, the rule should also include penalties for non-compliance of this requirement (both to CERT-In and the consumer) to serve as an incentive for e-PPI issuers to uphold consumer public interest. The rule should be expanded to include a detailed mechanism for such reporting, including when e-PPI issuers and the CERT-In can withhold information from consumers as well as requiring the withheld information be disclosed when the investigation has been completed. Finally, the rule should also require that such disclosures be public in nature and consumers not be required to not disseminate such information to enable informed choice by the end user community.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Suggestion:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(1) In Rule 14(3) “may” should be substituted by “shall”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;(2) Penalties of up to 5 lakh rupees may be imposed for each day that the e-PPI issuer fails to report any severe vulnerability that could likely result in harm to customers.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.10 Rule 15: Customer Awareness and Education&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Problem: Rule 15 on Customer Awareness and Education by e-PPI issuers does not take into account the vast lingual diversity and varied socio-economic demographic that makes up the end users of e-PPI providers in India, by mandating the actions under the rule must account for these factors prior to be propagated.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Solutions: The rule must ensure that e-PPI issuers track record in carrying out awareness is regularly held accountable by both the government and public disclosures on their websites. Further, the rule can be made more concrete and effective by including mobile operating systems in their scope (along with equipments), mandating awareness for best practices for inclusive technologies like USSD banking, specifying notifications to include SMS reports of financial transactions, etc.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.11 Rule 16: Grievance Redressal&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Problem: Rule 16 lays down the requirement of grievance redressal, without specifying appellate mechanisms (both within the organisation and at the regulatory level), accountability (via penalties) for non-compliance of the rule nor requiring a clear hierarchy of responsibility within the e-PPI organisation. These factors seriously compromise the efficacy of a grievance redressal framework.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt; &lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Solutions: Similar rules for grievance redressal that have been enacted by the Insurance Regulatory and Development Authority for the insurance sector and the Telecom Regulatory Authority of India for the telecom sector can and should serve as a reference point for this rule. Their effectiveness and real world operation should also be monitored by the relevant authorities while ensuring sufficient flexibility exists in the rule to uphold consumer rights and the public interest. Proper appellate mechanisms at the regulatory level are essential along with penalties for non-compliance.&lt;/p&gt;
&lt;h4 style="text-align: justify; "&gt;3.12 Rule 17: Security Standards&lt;/h4&gt;
&lt;p style="text-align: justify; "&gt;Problem: Rule 17 empowers the Central Government to mandate security standards to be followed by e-PPI issuers operating in India. While appreciable in its overall outlook on ensuring a minimum standard of security, the Rule needs be improved upon to make it more effective. This can be in done by specifying certain minimum security standards to ensure all e-PPI issuers have a minimal level of security, instead of leaving them open to being intimated at a later date.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Solutions: Standards that can either be made mandatory or be used as a reference point to create a new standard under Rule 17(2) are ISO/IEC 14443, IS 14202, ISO/IEC 7816, PCI DSS, etc. Further, the Rule should include penalties for non-compliance of these standards, to make them effectively enforceable by both the government and end users alike. Additional details like the maximum time period in which such security standards should be implemented post their notification, requiring regular third party audits to ensure continuing compliance and effectiveness and requiring updated standards be used upon their release will go a long way in ensuring e-PPI issuers fulfil their mandate under these Rules.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://cis-india.org/"&gt;http://cis-india.org/&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://meity.gov.in/sites/upload_files/dit/files/draft-rules-security%20of%20PPI-for%20public%20comments.pdf"&gt;http://meity.gov.in/sites/upload_files/dit/files/draft-rules-security%20of%20PPI-for%20public%20comments.pdf&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/comments-on-information-technology-security-of-prepaid-payment-instruments-rules-2017'&gt;http://editors.cis-india.org/internet-governance/blog/comments-on-information-technology-security-of-prepaid-payment-instruments-rules-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Information Technology</dc:subject>
    

   <dc:date>2017-03-23T01:54:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar">
    <title>Can the Judiciary Upturn the Lok Sabha Speaker’s Decision on Aadhaar?</title>
    <link>http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar</link>
    <description>
        &lt;b&gt;When ruling on the petition filed by Jairam Ramesh challenging the passage of the Aadhaar Act as a money Bill, the court has differing precedents to look at.&lt;/b&gt;
        &lt;p&gt;The article was &lt;a class="external-link" href="https://thewire.in/110795/aadhaar-money-bill-judiciary/"&gt;published in the Wire&lt;/a&gt; on February 21, 2017.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;In &lt;a href="http://thewire.in/2016/04/24/the-aadhaar-act-is-not-a-money-bill-31297/" target="_blank" title="an earlier article"&gt;an earlier article&lt;/a&gt;, I had argued that the characterisation of the &lt;a href="https://www.google.co.in/url?sa=t&amp;amp;rct=j&amp;amp;q=&amp;amp;esrc=s&amp;amp;source=web&amp;amp;cd=5&amp;amp;cad=rja&amp;amp;uact=8&amp;amp;ved=0ahUKEwj0xo6U_KDSAhVHLo8KHcygCVEQFggvMAQ&amp;amp;url=https%3A%2F%2Fuidai.gov.in%2Fimages%2Fthe_aadhaar_act_2016.pdf&amp;amp;usg=AFQjCNHDmJKdO8jdfGZJKLKRJQpHdf1Frw&amp;amp;sig2=B_YbWncu6eyZHJ1MFTD0NA" rel="external nofollow" target="_blank" title="Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act"&gt;Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act&lt;/a&gt;,  as a money Bill by Sumitra Mahajan, speaker of the Lok Sabha, was  erroneous. Specifically, I had argued that upon perusal of Article 110  (1) of the constitution, the Aadhaar Act does not satisfy the conditions  required of a money Bill. For a legislation to be classified as a money  Bill, it must comprise of ‘only’ provisions dealing with the following  matters: (a) imposition, regulation and abolition of any tax, (b)  borrowing or other financial obligations of the government of India, (c)  custody, withdrawal from or payment into the Consolidated Fund of India  (CFI) or Contingent Fund of India, (d) appropriation of money out of  CFI, (e) expenditure charged on the CFI or (f) receipt or custody or  audit of money into CFI or public account of India; or (g) any matter  incidental to any of the matters specified in sub-clauses (a) to (f).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Article 110 is modelled on Section 1(2) of the UK’s Parliament Act, 1911, which also defines money Bills as those only dealing with certain enumerated matters. The use of the word ‘only’ was brought up by Ghanshyam Singh Gupta during the constituent assembly debates. He pointed out that the use of the word ‘only’ limits the scope money Bills to only those legislations which did not deal with other matters. His amendment to delete the word ‘only’ was rejected, clearly establishing the intent of the framers of the constitution to keep the ambit of money Bills extremely narrow. G.V. Mavalankar, the first speaker of Lok Sabha, had stated that the word ‘only’ must not be construed so as to give an overly restrictive meaning. For instance, a Bill which deals with taxation could have provisions which deal with the administration of the tax. The finance minister, Arun Jaitley, referred to these words by Mavalankar, justifying the classification of the Aadhaar Act as a money Bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the Aadhaar Bill does makes references to benefits, subsidies and services funded by the CFI, even a cursory reading of the Bill reveals its main objectives as creating a right to obtain a unique identification number and providing for a statutory apparatus to regulate the entire process. Any reasonable reading of the legislation would be hard pressed to view all provisions in the Aadhaar Act, aside from the one creating a charge on the CFI, as merely administrative provisions incidental to the creation such charge. The mere fact of establishing the Aadhaar number as the identification mechanism for benefits and subsidies funded by the CFI does not give it the character of a money Bill. The Bill merely speaks of facilitating access to unspecified subsidies and benefits rather than their creation and provision being the primary object of the legislation. Erskine May’s seminal textbook, Parliamentary Practice, is instructive in this respect and makes it clear that a legislation which simply makes a charge on the consolidated fund does not becomes a money Bill if otherwise its character is not that of one. Further, the subordinate regulations notified under the Aadhaar Act deal almost entirely with matters to do with enrolment, updation, authentication of the Aadhaar number and related matters such as data security regulations and sharing of information collected, rather than the provision of benefits or subsidies or disbursal of funds otherwise from the CFI.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, in the context of the petition filed by former Union minister Jairam Ramesh challenging the passage of the law on Aadhaar as a money Bill, the more important question is whether the judiciary has a right to question the speaker’s decision in such a matter. If not, any other questions about whether the legislation is a money Bill will remain merely academic in nature.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Irregularity vs illegality&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Article 110 (3) clearly states that with regard to the question whether a legislation is a money Bill or not, the decision of the speaker is final and binding. The question is whether such a clause completely excludes any judicial review. Further, Article 122 prohibits the courts from questioning the validity of any proceedings in parliament on the ground of any alleged irregularity of procedure.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;During the arguments in the court, the attorney general questioned the locus standi of Ramesh. The petition has been made under Article 32 of the constitution and the government argued that no fundamental rights of Ramesh were violated. However, the court has asked Ramesh to make his submission and adjourned the hearing to July. The petition by Ramesh would hinge largely on the powers of the judiciary to question the decision of the speaker of the Lok Sabha.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The powers of privilege that parliamentarians enjoy are integral to the principle of separation of powers. The rationale behind parliamentary privilege is to prevent interference in the lawmakers’ powers to perform essential functions. The ability to speak and vote inside the legislature without the fear of punishment is certainly essential to the role of a lawmaker. However, the extent of this protection lies at the centre of this discussion. During the constituent assembly debates, H.V. Kamath and others had argued for a schedule to exhaustively codify the existing privileges. However, B.R. Ambedkar pointed to the difficulty of doing so and parliamentary privilege on the lines of the British parliamentary practice was retained in the constitution. In the last few decades, a judicial position has emerged that courts could exercise a limited degree of scrutiny over privileges, as they are primarily responsible for interpreting the constitution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the matter of &lt;a href="https://indiankanoon.org/doc/1757390/" rel="external nofollow" target="_blank" title="Raja Ram Pal vs The Hon’ble Speaker, Lok Sabha"&gt;&lt;i&gt;Raja Ram Pal vs The Hon’ble Speaker, Lok Sabh&lt;/i&gt;a&lt;/a&gt;,  it had been clarified that proceedings of the legislature were immune  from questioning by courts in the case of procedural irregularity but  not in the case of illegality. In this case, the Supreme Court while  dealing with Article 122 stated that it does not oust review by the  judiciary in cases of “gross illegality, irrationality, violation of  constitutional mandate, mala fides, non-compliance with rules of natural  justice and perversity.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In 1968, the speaker of the Punjab legislative assembly adjourned the  proceedings for a period of two months following rowdy behaviour.  Subsequently, an ordinance preventing such a suspension was promulgated  and the legislature was summoned by the governor to consider some  expedient financial matters. The speaker disagreed with the decision and  after some confusion, the deputy speaker passed a few Bills as money  Bills. While looking into the question of what was protected from  judicial review, the &lt;a href="https://indiankanoon.org/doc/36589/" rel="external nofollow" target="_blank" title="court stated"&gt;court stated&lt;/a&gt; that the protection did not extend to breaches of mandatory provisions  of the constitution, only to directory provisions. By that logic, if  Article 110 (1) is seen as a mandatory provision, a breach of its  provisions could lead to an interpretation that the Supreme Court may  well question an erroneous decision by the speaker of the Lok Sabha to  certify a legislation as a money Bill. The use of the word “shall” in  Article 110 (1), the nature and design of the provision, its overriding  impact on the other constitutional provisions granting the Rajya Sabha  powers are ample evidence of its mandatory nature. Based on the above,  Anup Surendranath has &lt;a href="http://ccgdelhi.org/doc/%28CCG-NLU%29%20Aadhaar%20Money%20Bill.pdf" rel="external nofollow" target="_blank" title="argued"&gt;argued&lt;/a&gt; that  the passage of the Aadhaar Act as a money Bill when it does not satisfy  the constitutional conditions for it does amount to a gross illegality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The judicial precedent in &lt;i&gt;&lt;a href="https://indiankanoon.org/doc/60568976/" rel="external nofollow" target="_blank" title="Mohd. Saeed Siddiqui vs State of Uttar Pradesh"&gt;Mohd. Saeed Siddiqui vs State of Uttar Pradesh&lt;/a&gt;&lt;/i&gt; where the matter of the court’s power to question the decision of a  speaker was considered, though, leans in the other direction. In 2012,  the &lt;a href="https://www.google.co.in/url?sa=t&amp;amp;rct=j&amp;amp;q=&amp;amp;esrc=s&amp;amp;source=web&amp;amp;cd=1&amp;amp;cad=rja&amp;amp;uact=8&amp;amp;ved=0ahUKEwiRtov_iKHSAhVLuo8KHYhsClcQFggbMAA&amp;amp;url=http%3A%2F%2Fwww.lawsofindia.org%2Fdownloadfile.php%3Flawid%3D7834%26file%3Duttar_pradesh%2F1981%2F1981UP7.pdf%26pageurl%3D%252Fsingle%252Falpha%252F7.html&amp;amp;usg=AFQjCNGRW8-NChXALunaUbjZRrlM4IvCkA&amp;amp;sig2=rg6YCMf7qRqNw08NnctuhQ" rel="external nofollow" target="_blank" title="Uttar Pradesh Lokayukta and Up-Lokayuktas (Amendment) Act"&gt;Uttar Pradesh Lokayukta and Up-Lokayuktas (Amendment) Act&lt;/a&gt;,  2012 was passed as money Bill by the Uttar Pradesh state legislature.  Subsequently, a writ petition was filed challenging its constitutional  validity. A three-judge bench of the Supreme Court looked into the  application of Article 212. It is the provision corresponding to Article  122, dealing with the power of the courts to inquire into the  proceedings of the state legislature. The court held that Article 212  makes “it clear that the finality of the decision of the Speaker and the  proceedings of the State Legislature being important privilege of the  State Legislature, viz., freedom of speech, debate and proceedings are  not to be inquired by the Courts.” Importantly, ‘proceedings of the  legislature’ were deemed to include within its scope everything done in  transacting parliamentary business, including the passage of the Bill.  
While the court did acknowledge the limitations of parliamentary  privilege as established in the &lt;i&gt;Raja Ram Pal&lt;/i&gt; case, it did not adequately take into account the reasoning in it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Aadhaar Act is a legislation which makes it mandatory of all  residents to enrol for a biometric identification system in order to  avail certain subsidies, benefits and services. It has huge potential  risks for individual privacy and national security and has been the  subject of an extremely high profile Public Interest Litigation. Its  passage as a money Bill, without any oversight from the Rajya Sabha and  an opportunity for substantial debate and discussion, is a fraud on the  Constitution. Whether or not the court chooses to see it that way  remains to be seen.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar'&gt;http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-february-21-2017-can-the-judiciary-upturn-the-lok-sabha-speakers-decision-on-aadhaar&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-02-27T15:44:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report">
    <title>Privacy after Big Data - Workshop Report</title>
    <link>http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS) and the Sarai programme, CSDS, organised a workshop on 'Privacy after Big Data: What Changes? What should Change?' on Saturday, November 12, 2016 at Centre for the Study of Developing Societies in New Delhi. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;This workshop aimed to build a dialogue around some of the key government-led big data initiatives in India and elsewhere that are contributing significant new challenges and concerns to the ongoing debates on the right to privacy. It was an open event.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this age of big data, discussions about privacy are intertwined with the use of technology and the data deluge. Though big data possesses enormous value for driving innovation and contributing to productivity and efficiency, privacy concerns have gained significance in the dialogue around regulated use of data and the means by which individual privacy might be compromised through means such as surveillance, or protected. The tremendous opportunities big data creates in varied sectors ranges from financial technology, governance, education, health, welfare schemes, smart cities to name a few. With the UID project re-animating the Right to Privacy debate in India, and the financial technology ecosystem growing rapidly, striking a balance between benefits of big data and privacy concerns is a critical policy question that demands public dialogue and research to inform an evidence based decision. Also, with the advent of potential big data initiatives like the ambitious Smart Cities Mission under the Digital India Scheme, which would rely on harvesting large data sets and the use of analytics in city subsystems to make public utilities and services efficient, the tasks of ensuring data security on one hand and protecting individual privacy on the other become harder.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This workshop sought to discuss some of the emerging problems due to the advent of big data and possible ways to address these problems. The workshop began with Amber Sinha of CIS and Sandeep Mertia of Sarai introducing the topic of big data and implications for privacy. Both speakers tried to define big data and brief history of the evolution of the term and raised questions about how we understand it. Dr. Usha Ramanathan spoke on the right to privacy in the context of the ongoing Aadhaar case and Vipul Kharbanda introduced the concept of Habeas Data as a possible solution to the privacy problems posed by big data.  Amelia Andersotter discussed national centralised digital ID systems and their evolution in Europe, often operating at a cross-functional scale, and highlighted its implications for discussions on data protection, welfare governance, and exclusion from public and private services. Srikanth Lakshmanan spoke of the issues with technology and privacy, and possible technological solutions.  Dr. Anupam Saraph discussed the rise of digital banking and Aadhaar based payments and its potential use for corrupt practices. Astha Kapoor of Microsave spoke about her experience of implementation of digital money solution in rural India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Post lunch, Dr. Anja Kovacs and Mathew Rice spoke on the rise of mass communication surveillance across the world, and the evolving challenges of regulating surveillance by government agencies. Mathew also spoke of privacy movements by citizens and civil society in regions. In the final speaking session, Apar Gupta and Kritika Bhardwaj traced the history of jurisprudence on the right to privacy and the existing regulations and procedures. In the final session, the participants discussed various possible solutions to privacy threats from big data and identity projects including better regulation, new approached such as harms based regulation and privacy risk assessments, and conceiving privacy as a horizontal right. The workshop ended with vote of thanks from the organizers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The agenda for the event can be accessed &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS-Sarai_PrivacyAfterBigData_ConceptAgenda.pdf"&gt;here&lt;/a&gt;, and the transcript is available &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/privacy-after-big-data/"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report'&gt;http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-01-27T01:09:17Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/social-media-monitoring">
    <title>Social Media Monitoring</title>
    <link>http://editors.cis-india.org/internet-governance/blog/social-media-monitoring</link>
    <description>
        &lt;b&gt;We see a trend of social media and communication monitoring and surveillance initiatives in India which have the potential to create a chilling effect on free speech online and raise questions about the privacy of individuals. In this paper, Amber Sinha looks at social media monitoring as a tool for surveillance, the current state of social media surveillance in India, and evaluates how the existing regulatory framework in India may deal with such practices in future.&lt;/b&gt;
        
&lt;h4&gt;Social Media Monitoring: &lt;a href="http://cis-india.org/internet-governance/files/social-media-monitoring/at_download/file"&gt;Download&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/h3&gt;
&lt;p&gt;In 2014, the Government of India launched the much-lauded and popular citizen outreach website called MyGov.in. A press release by the government announced that it had roped in the global consulting firm PwC to assist in a data mining exercise to process and filter key points emerging from debates on MyGov.in. While this was a welcome move, the release also mentioned that the government intended to monitor social media sites in order to gauge popular opinion. Further, earlier this year, the government set up the National Media Analytics Centre (NMAC) to monitor blogs, media channels, news outlets and social media platforms. The tracking software used by NMAC will generate tags to classify posts and comments on social media into negative, positive and neutral categories, paying special attention to “belligerent” comments, and also look at past patterns of posts. A few years earlier, a project called NETRA was reported in the media, which would intercept and analyse internet traffic using pre-defined filters. Alongside these, we see other initiatives which intend to use social media data for predictive policing purposes, such as CCTNS and Social Media Labs.&lt;/p&gt;
&lt;p&gt;Thus, we see a trend of social media and communication monitoring and surveillance initiatives announced by the government which have the potential to create a chilling effect on free speech online and raise questions about the privacy of individuals. Various commentators have raised concerns about the legal validity of such programmes and whether they violate the fundamental rights to privacy and free expression, and the existing surveillance laws in India. The lack of legislation governing these programmes often translates into an absence of transparency and due procedure. Further, a lot of personal communication now exists in the public domain, which renders futile the traditional principles that govern the interception and monitoring of personal communications. In the last few years, the blogosphere and social media websites in India have also changed and become platforms for greater dissemination of political content, often accompanied by significant vitriol, ‘trolling’ and abuse. Thus, we see greater policing of public or semi-public spaces online. In this paper, we look at social media monitoring as a tool for surveillance and the current state of social media surveillance in India, and evaluate how the existing regulatory framework in India may deal with such practices in the future.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/social-media-monitoring'&gt;http://editors.cis-india.org/internet-governance/blog/social-media-monitoring&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Surveillance</dc:subject>
    

   <dc:date>2017-01-16T14:23:13Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms">
    <title>New Media, personalisation and the role of algorithms</title>
    <link>http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms</link>
    <description>
        &lt;b&gt;In his much acclaimed book, The Filter Bubble, Eli Pariser explains how personalisation of services on the web works and laments that they are creating individual bubbles for each user, which run counter to the idea of the Internet as an inherently open place. While Pariser’s book looks at the practices of various large companies providing online services, he briefly touches upon the role of new media such as search engines and social media portals in news curation. Building upon Pariser’s unexplored argument, this article looks at the impact of algorithmic decision-making and Big Data in the context of news reporting and curation.&lt;/b&gt;
        &lt;em&gt;&lt;br /&gt;&lt;/em&gt;
&lt;blockquote&gt;
&lt;div&gt;
&lt;div&gt;&lt;em&gt;Everything which bars freedom and fullness of communication sets up barriers that divide human beings into sets and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life. &lt;/em&gt;—John Dewey&lt;/div&gt;
&lt;/div&gt;
&lt;/blockquote&gt;
&lt;p&gt;&amp;nbsp;Eli Pariser, in his book, The Filter Bubble,[1] refers to the scholarship of Walter Lippmann and John Dewey as integral to the evolution of the understanding of the democratic and ethical duties of the Fourth Estate. Lippmann was disillusioned by the role of newspapers in propaganda for the First World War. He responded with three books in quick succession — Liberty and the News,[2] Public Opinion[3] and The Phantom Public.[4] Lippmann brought attention to the fact that the process of news-reporting was conducted through privately determined and unexamined standards. The failure of the Fourth Estate to perform its democratic functions was, in the opinion of Lippmann, one of the prime factors responsible for the public not being an informed and rational entity. John Dewey, while rejecting Lippmann’s argument that matters of public policy can only be determined by inside experts with training and education, did acknowledge his critique of the media.&lt;/p&gt;
&lt;p&gt;Pariser points to the creation of a wall between editorial decision-making and advertiser interests as the eventual result of the Lippmann and Dewey debate. While accepting that this division between the financial and reporting sides of media houses has not always been observed, Pariser emphasises that the fact that the standard exists is important.[5] Unlike traditional media, the new media, which relies on algorithmic decision-making for personalisation, is not subject to the same standards which try to mitigate the influence of commercial interests on editorial decisions, even while performing many of the same functions as the traditional media.[6]&lt;/p&gt;
&lt;h3&gt;How personalisation algorithms work&lt;/h3&gt;
&lt;p dir="ltr"&gt;Kevin Slavin, at his famous talk in the TEDGLobal Conference, characterised algorithms as “maths that computers use to decide stuff” and that it was infiltrating every aspect of our lives.[7] According to Slavin’s view, algorithms can be seen as control technologies and shape our world constantly through media and information systems, dynamically modifying content and function through these programmed routines. Search engines and social media platforms perpetually rank user-generated content through algorithms.[8]&lt;/p&gt;
&lt;p&gt;Personalisation technologies have various advantages. They translate into more relevant content, which for service providers means more clicks and revenue, and for consumers, less time spent on finding content.[9] However, they also lead to compromised privacy, lack of control and reduced individual capability.[10] Search engines like Google use the famous PageRank algorithm, which, combined with geographical location and previous searches, yields the most relevant search results.[11] The PageRank algorithm uses various real-time variables dependent on both voluntary and involuntary user inputs. These variables include the number of clicks, the number of occurrences of the key terms, and the number of references by other credible pages. This data in turn determines the order of pages in search results and influences the way we perceive, understand and analyse information.[12] Maps showing real-time traffic information retrieve data from laser and infrared sensors alongside the road and from the devices of users. Once this real-time data is combined with historical trends, these maps recommend routes to every user, hence influencing traffic patterns.[13]&lt;/p&gt;
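The idea that "references by other credible pages" drive ranking can be made concrete: at its core, PageRank models a page's importance as the likelihood of landing on it while following links at random. A minimal sketch of that iteration (illustrative only; the `pagerank` function and the toy link graph below are assumptions for demonstration, and production search ranking combines this with many other signals):

```python
# Illustrative sketch of the core PageRank iteration (not Google's
# production ranking, which layers many other signals on top).
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform importance
    for _ in range(iterations):
        # Base probability of "teleporting" to any page at random.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # Each page passes its current importance to the pages it cites.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] = new_rank.get(target, 0.0) + share
        rank = new_rank
    return rank

# Toy graph: C is cited by both A and B, so it ends up ranked highest.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

The sketch captures the key property the article relies on: a page's rank depends on who links to it, not on the page's own say-so.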
&lt;p&gt;Even though this phenomenon of personalisation may appear to be new, it has been prevalent in society for ages.[14] The history of mass media culture clearly shows that personalisation has always been a method to increase market reach and customer satisfaction.[15] Newspapers have sections dedicated to special topics; radio and TV have channels dedicated to different interest groups, age groups and consumers.[16] These personalised sections in a newspaper and personalised channels on radio and television don’t just provide greater satisfaction to readers, listeners and consumers; they also provide targeted advertisement space for advertisers and content developers. However, digital footprints and the mass collection of data have made this phenomenon much more granular and detailed. The geographical location of an individual can tell a lot about their community, their culture and other important traits local to a community.[17] This data further assists in personalisation. Current developments in technology not only help in better collection of data about personal preferences but also help in better personalisation.&lt;/p&gt;
&lt;p&gt;Pariser mentions three ways in which the personalisation technologies of today differ from those of the past. First, for the very first time, individuals are alone in the filter bubble. While in traditional forms of personalisation there were various individuals who shared the same frame of reference, now there is a separate set of filters governing the dissemination of content to each individual.[18] Second, the personalisation technologies are entirely invisible now, and there is little that consumers can do to control or modify them.[19] Third, the decision to be subject to these personalisation technologies is often not an informed choice. A good example of this is an individual’s geographical location.[20]&lt;/p&gt;
&lt;h3&gt;The neutrality of New Media?&lt;/h3&gt;
&lt;p dir="ltr"&gt;More and more, we have noticed personalisation technologies having an impact on how we consume news on the Internet. Google News, Facebook’s News Feed which tries to put together a dynamic feed for both personal and global stories, and Twitter’s trending hashtag feature, have brought forward these services are key drivers of an emerging news ecosystem. Initially, this new media was hailed as a natural consequence of the Internet which would enable greater public participation, allow journalists to find more stories and engage with the readers directly. &amp;nbsp;An illustration of the same could be seen in the way Internet based news media and social networking websites behaved in the aftermath of Israel’s attacks on a United Nations run school in Gaza strip. While much of the international Internet media covered the story, Israel’s home media did not cover the story. The only exception to this was the liberal Israeli news website Ha’aretz.[21] Network graph details of Twitter, for a few days immediately after the incident clearly show the social media manifestation of the event in the personalised cyberspace. It is clearly visible that when most of the word was re-tweeting news of this heinous act of Israel, Israeli’s hardly re-tweeted this news. In fact they were busty re-tweeting the news of rocket attacks on Israel.[22]&lt;/p&gt;
&lt;p&gt;The use of social media in newsmaking was hailed by many scholars as symptomatic of the decentralisation characteristic of the Internet. It has been seen as a movement towards greater grassroots participation, negating the ‘gatekeeping’ role traditionally played by editors. Thomas Poell and José van Dijck punch holes in the theory of social media and other online technologies as mere facilitators of user participation and translators of user preferences through Big Data analytics.[23] They quote T. Gillespie’s work, which discusses the narrative of these online services as platforms offering “open, neutral, egalitarian and progressive support for activity.”[24]&lt;/p&gt;
&lt;p&gt;Pedro Domingos calls the overwhelming number of choices the defining problem of the information age, and machine learning and data analytics the largest part of its solution.[25] The primary function of algorithmic decision-making in the context of content consumption is to narrow down the choices. Domingos is more optimistic about the impact of these technologies: he says the “last step of the decision is usually still for humans to make, but learners intelligently reduce the choices to something a human can manage.”[26] On the other hand, Pariser is more circumspect about the coercive result of machine learning algorithms. Whichever way we lean, we have to accept that a large part of what personalisation algorithms do is select and prioritise content by categorising it on the basis of relevance and popularity.&lt;/p&gt;
&lt;p&gt;Poell and van Dijck call this a new knowledge logic which, in effect, replaces human judgement (as earlier exercised by editors) with a kind of proxy decision-making based on data. Their main thesis is that there is little evidence to suggest that the latter is more democratic than the former, and that it creates new problems of its own. They go on to compare the practices of various services, including Facebook’s news feed and Twitter’s trending topics, and conclude that they prioritise breaking news stories over other kinds of content.[27] For instance, the algorithm for trending topics depends not on the volume but on the velocity of the tweets with a hashtag or term. It could be argued that, given this predilection, the algorithms will rarely prefer complex content. If we go by Lippmann and Dewey’s idea that the role of the Fourth Estate is to inform public debate and ensure the accountability of those in positions of power, this aspect of Big Data algorithms does not correspond with that role.&lt;/p&gt;
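The volume-versus-velocity distinction above can be made concrete with a toy scoring function. This is a hypothetical sketch only: Twitter's actual trending algorithm is proprietary, and the `trending_score` function and sample counts below are assumptions for illustration.

```python
# Hypothetical sketch: score a hashtag by velocity (relative growth in
# mentions between time windows) rather than by raw volume.
def trending_score(counts_per_window):
    """counts_per_window: mention counts per time window, oldest first."""
    if len(counts_per_window) < 2:
        return 0.0
    prev, curr = counts_per_window[-2], counts_per_window[-1]
    # Relative growth: a jump from 20 to 100 mentions outranks a topic
    # stuck at 10,000 mentions per window, however large that number is.
    return (curr - prev) / max(prev, 1)

steady = [10_000, 10_000, 10_000]  # enormous but constant volume
spike = [10, 20, 100]              # small volume, rapid acceleration
```

Under any scoring of this shape, a sudden small-scale spike outranks a perennially high-volume topic, which is why, as the article argues, such algorithms favour breaking news over sustained, complex discussion.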
&lt;h3&gt;Quantified Audience&lt;/h3&gt;
&lt;p dir="ltr"&gt;Another aspect of use of Big Data and algorithms in New Media that requires attention is that the networked infrastructure enables a quantified audience. C W Anderson who has studied newsroom practices in the US looked at role played by audience quantification and rationalization in shifting newswork practices. He concluded that more and more, journalists are less autonomous in their news decisions and increasingly reliant on audience metrics as a supplement to news &amp;nbsp;judgment.[28] Poell and van Dijck review the the practices by some leading publications such a New York Times, L.A. Times and Huffington Post, and degree to which audience metrics &amp;nbsp;dictates editorial decisions. While New York Times seems to prioritise content on their social media portals based on expectation of spike in user traffic, L.A. Times goes one step further by developing content specifically aimed towards promoting greater social participation. Neither of these practices though compare to the reliance on SEO and SMO strategies of web-born news providers like Huffington Post. They have traffic editors who trawl the Internet for trending topics and popular search terms, the feedback from them dictates the content creation.[29]&lt;/p&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p dir="ltr"&gt;The above factors demonstrate that the idea of New Media leading to the Fourth Estate performing its democratic functions does not take into account the actual practices. This idea is based on the erroneous assumption that technology, in general and algorithms, in particular are neutral. While the emergence of New Media might have reduced the gatekeeping role played by the editors, its strong prioritisation of content that will be popular reduce the validity of arguments that it leads to more informed public discussion. As Pariser said, the traditional media scores over the New Media inasmuch as there is an existence of a standard of division between editorial decisionmaking and advertiser interest. While this standard is flouted by media houses all the time, it exists as a metric to aspire to and measure service providers against. The New Media performs many of the same functions and maybe it is time to evolve some principles and ethical standards that take into account the need for it to perform these democratic functions.&lt;/p&gt;
&lt;h3&gt;Endnotes&amp;nbsp;&lt;/h3&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt; Eli Pariser, The Filter Bubble: What the Internet is
hiding from you (The Penguin Press, New York, 2011)&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span class="MsoFootnoteReference"&gt;&lt;span class="MsoFootnoteReference"&gt;[2]&lt;/span&gt;&lt;/span&gt;&amp;nbsp;Walter Lippmann, Liberty and News (Harcourt, Brace
and Howe, New York 1920) available at&lt;a href="https://archive.org/details/libertyandnews01lippgoog"&gt;https://archive.org/details/libertyandnews01lippgoog&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt; Walter Lippmann, Public Opinion (Harcourt, Brace and
Howe, New York 1920) available at &lt;a href="http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html"&gt;http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt; Walter Lippmann, The Phantom Public (Transaction
Publishers, New York, 1925)&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 35.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 36.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en"&gt;https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt; Fenwick McKelvey, “Algorithmic Media Need Democratic
Methods: Why Publics Matter”, available at &lt;a href="http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf"&gt;http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1"&gt;http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt; Helen Ashman, Tim Brailsford, Alexandra Cristea, Quan
Z Sheng, Craig Stewart, Elaine Torns and Vincent Wade, “The ethical and social
implications of personalization technologies for e-learning” available at &lt;a href="http://www.sciencedirect.com/science/article/pii/S0378720614000524"&gt;http://www.sciencedirect.com/science/article/pii/S0378720614000524&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt; Sergey Brin and Lawrence Page, “The Anatomy of a
Large-Scale Hypertextual Web Search Engine” available at &lt;a href="http://infolab.stanford.edu/pub/papers/google.pdf"&gt;http://infolab.stanford.edu/pub/papers/google.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt; Ian Rogers, “The Google Pagerank Algorithm and How It
Works” available at &lt;a href="http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm"&gt;http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt; Trygve Olson and Terry Nelson, “The Internet’s Impact
on Political Parties and Campaigns”, available at &lt;a href="http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942"&gt;http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt; Ian Witten, “Bias, privacy and and personalisation on
the web”, available at &lt;a href="http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf"&gt;http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 10.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/"&gt;https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt; Charles Heatwole, “Culture: A Geographical Perspective”
available at &lt;a href="http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html"&gt;http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 10.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Id&lt;/em&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 11.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt; Paul Mason, “Why Israel is losing the social media
war over Gaza?” available at &lt;a href="http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182"&gt;http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt; Gilad Lotan, Israel, Gaza, War &amp;amp; Data: Social
Networks and the Art of Personalizing Propaganda available at &lt;a href="http://www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html"&gt;www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt; Thomas Poell and José van Dijck, “Social Media and
Journalistic Independence” in Media Independence: Working with Freedom or
Working for Free?, edited by James Bennett &amp;amp; Niki Strange. (Routledge,
London, 2015)&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt; T Gillespie, “The politics of ‘platforms,” in New
Media &amp;amp; Society (Volume 12, Issue 3).&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt; Pedro Domingos, The Master Algorithm: How the quest
for the ultimate learning machine will re-make the world (Basic Books, New
York, 2015) at 38.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Ibid&lt;/em&gt; at 40.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
23.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt; C W Anderson, Between creative and quantified
audiences: Web metrics and changing patterns of newswork in local US newsrooms,
available at &lt;a href="https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms"&gt;https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms&lt;/a&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;
&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra &lt;/em&gt;Note 23.&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span id="docs-internal-guid-24b4db2a-a606-d425-16ff-1d76b980367d"&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms'&gt;http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Human Rights</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Machine Learning</dc:subject>
    
    
        <dc:subject>Algorithms</dc:subject>
    
    
        <dc:subject>New Media</dc:subject>
    

   <dc:date>2017-01-16T07:20:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017">
    <title>Rankathon on Digital Rights (Delhi, January 08)</title>
    <link>http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017</link>
    <description>
        &lt;b&gt;Please join us on Sunday, January 08, at the CIS office in Hauz Khas, Delhi, for a rankathon to visualise and contribute to the findings of the Ranking Digital Rights study, and to critique the underlying methodology. The event will begin at 10:00 in the morning, and participants can focus on one or more of three kinds of tasks: 1) visualising the CIS and Ranking Digital Rights data, 2) evaluating additional companies using the RDR methodology, and 3) evaluating the RDR methodology and its suitability for independent use.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Download: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_RDRIndia-Rankathon_08012017_Invitation.pdf"&gt;Invitation&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="https://rankingdigitalrights.org/"&gt;Ranking Digital Rights Corporate Responsibility Index&lt;/a&gt; is a project hosted by the Open Technology Institute at New America Foundation that aims to rank Information and Communications Technology (ICTs) companies with respect to their Governance, Freedom of Expression, and Privacy practices. The inaugural Corporate Accountability Index, released in November 2015, evaluated 16 companies based on the project’s methodology that included 31 indicators in total.&lt;/p&gt;
&lt;p&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding the digital rights of their users, and to raise public awareness about the same, the Centre for Internet and Society (CIS), with the support of &lt;a href="https://privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues.&lt;/p&gt;
&lt;p&gt;Please join us on Sunday, January 08, at the CIS office in Hauz Khas, Delhi, for a rankathon to visualise and contribute to the findings of the Ranking Digital Rights study, and to critique the underlying methodology. The event will begin at 10:00 in the morning, and participants can focus on one or more of three kinds of tasks:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;
&lt;p&gt;visualising the CIS and Ranking Digital Rights data,&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;evaluating additional companies using the RDR methodology, and&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;evaluating the RDR methodology and its suitability for independent use.&lt;/p&gt;
&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;The event is open to all but the venue has limited space. The participants are requested to RSVP by sending an email to &lt;a href="mailto:nisha@cis-india.org?subject=RSVP: Rankathon on Digital Rights"&gt;nisha@cis-india.org&lt;/a&gt;. The final date for registering for the event is &lt;strong&gt;January 04&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;All visualisations and other outputs produced at the event will be published under open licenses. All participants are expected to bring their own laptop or any other items needed for their work. CIS will offer data, help with understanding how the Ranking Digital Rights methodology works, refreshments, and any other support as needed.&lt;/p&gt;
&lt;p&gt;We are also organising a discussion event on Saturday, January 07, at the India Islamic Cultural Centre, Delhi, to present our findings on digital rights practices of 8 Indian ICT companies, followed by an open structured discussion on the methodology of the Ranking Digital Rights study. Please find more details about this &lt;a href="http://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We look forward to your participation and contribution to the discussion. Please support us by sharing this invitation with your colleagues and networks.&lt;/p&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017'&gt;http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2016-12-29T07:10:09Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017">
    <title>Discussion on Ranking Digital Rights in India (Delhi, January 07)</title>
    <link>http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017</link>
    <description>
        &lt;b&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding the digital rights of their users, and to raise public awareness about the same, the Centre for Internet and Society (CIS), with the support of Privacy International, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues. Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the study.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Download: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_RDRIndia-Discussion_07012017_Invitation.pdf"&gt;Invitation and agenda&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="https://rankingdigitalrights.org/"&gt;Ranking Digital Rights Corporate Responsibility Index&lt;/a&gt; is a project hosted by the Open Technology Institute at New America Foundation that aims to rank Information and Communications Technology (ICTs) companies with respect to their Governance, Freedom of Expression, and Privacy practices. The inaugural Corporate Accountability Index, released in November 2015, evaluated 16 companies based on the project’s methodology that included 31 indicators in total.&lt;/p&gt;
&lt;p&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding the digital rights of their users, and to raise public awareness about the same, the Centre for Internet and Society (CIS), with the support of &lt;a href="https://privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues.&lt;/p&gt;
&lt;p&gt;Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the Ranking Digital Rights study. We will begin at 10:30 am with a round of tea and coffee.&lt;/p&gt;
&lt;p&gt;The event is open to all, but the venue has limited space. Participants are requested to RSVP by sending an email to &lt;a href="mailto:nisha@cis-india.org?subject=RSVP: Ranking Digital Rights Discussion"&gt;nisha@cis-india.org&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;To further encourage programmers, researchers, journalists, students, and users in general to use and contribute to the findings of the Ranking Digital Rights study, and critique the underlying methodology, we are also organising a “rankathon” on Sunday, January 08, at the CIS office in Delhi. More details can be found &lt;a href="http://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We look forward to your participation and contribution to the discussion. Please support us by sharing this invitation with your colleagues and networks.&lt;/p&gt;
&lt;h2&gt;Agenda&lt;/h2&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;10:30-11:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:00-11:15&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:15-13:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Presentation of the Findings and Discussion&lt;/strong&gt; &lt;em&gt;Divij Joshi and Aditya Singh Chawla&lt;/em&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;13:00-14:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Lunch&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;14:00-15:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #1: Parameters of Evaluation&lt;/strong&gt;&lt;br /&gt;The RDR methodology was based upon evaluating companies’ commitments to uphold human rights through their services – in particular their commitment to users’ freedom of expression and privacy. Are there other parameters that may be considered in the Indian context?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;15:00-16:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #2: Towards Protecting Digital Rights&lt;/strong&gt;&lt;br /&gt;What steps can be taken by the government, civil society, and industry in India to create an environment that recognizes and protects users’ digital rights? What are the relevant legal, political, and economic factors to take into consideration? What steps have other, multinational ICT companies taken? Would these be realistic for Indian companies to implement?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:00-16:30&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:30-17:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017'&gt;http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Ranking Digital Rights</dc:subject>
    
    
        <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2016-12-29T07:07:34Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy">
    <title>Deep Packet Inspection: How it Works and its Impact on Privacy</title>
    <link>http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy</link>
    <description>
        &lt;b&gt;In the last few years, there has been extensive debate and discussion around network neutrality in India. The online campaign in favor of network neutrality in India was led by Savetheinternet.in. The campaign was a spectacular success, facilitating over a million emails in support of the cause and eventually leading to a ban on differential pricing. Following in the footsteps of the Shreya Singhal judgement, the fact that the issue of net neutrality has managed to attract wide public attention is an encouraging sign for a free and open Internet in India. Since the debate has focused largely on zero rating, other network practices that affect network neutrality, and their impact on other values, have yet to be comprehensively explored in the Indian context. In this article, the author focuses on network management in general, and deep packet inspection in particular, and how it impacts the privacy of users.&lt;/b&gt;
        &lt;h3 style="text-align: justify; "&gt;&lt;a name="_ek69t4linon1"&gt;&lt;/a&gt; Background&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In the last few years, there has been extensive debate and discussion around network neutrality in India. The online campaign in favor of network neutrality in India was led by Savetheinternet.in. The campaign, captured in detail by an article in Mint,&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; was a spectacular success, facilitating over a million emails in support of the cause and eventually leading to a ban on differential pricing. Following in the footsteps of the Shreya Singhal judgement, the fact that the issue of net neutrality has managed to attract wide public attention is an encouraging sign for a free and open Internet in India. Since the debate has focused largely on zero rating, other network practices that affect network neutrality, and their impact on other values, have yet to be comprehensively explored in the Indian context. In this article, I focus on network management in general, and deep packet inspection in particular, and how it impacts the privacy of users.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_ft3wpj7p1jf1"&gt;&lt;/a&gt; The Architecture of the Internet&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Internet exists as a network acting as an intermediary between providers of content and its users.&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Traditionally, the network did not distinguish between those who provided content and those who were recipients of this service; in fact, users often also functioned as content providers. The architectural design of the Internet mandated that all content be broken down into data packets which were transmitted through nodes in the network transparently from the source machine to the destination machine.&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; As discussed in detail later, as per the OSI model, the network consists of 7 layers. We will go into each of these layers in detail below; however, it is important to understand that at the base is the physical layer of cables and wires, while at the top is the application layer, which contains all the functions that people want to perform on the Internet and the content associated with them. The layers in the middle can be characterised as the protocol layers for the purpose of this discussion. What makes the architecture of the Internet remarkable is that these layers are completely independent of each other, and in most cases, indifferent to the other layers. The protocol layer is what impacts net neutrality. It is this layer which provides the standards for the manner in which data must flow through the network. The idea was for it to be as simple and feature-free as possible, such that it is only concerned with transmitting data as fast as possible (the 'best efforts' principle) while innovations are pushed to the layers above or below it.&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This aspect of the Internet's architectural design, which mandates that network features are implemented at the end points only (destination and source machine), i.e. at the application level, is called the 'end to end principle'.&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This means that the intermediate nodes do not differentiate between the data packets in any way based on source, application or any other feature, and are only concerned with transmitting data as fast as possible, thus creating what has been described as a 'dumb' or neutral network.&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This feature of the Internet architecture was also considered essential to what Jonathan Zittrain has termed the 'generative' model of the Internet.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Since the Internet Protocol remains a simple layer incapable of discrimination of any form, no additional criteria could be established for what kind of application would access the Internet. Thus, the network remained truly open and ensured that the Internet does not privilege or become the preserve of a class of applications, nor does it differentiate between the different kinds of technologies that comprise the physical layer below.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the above model speaks of a dumb network not differentiating between the data packets that travel through it, in truth, network operators engage in various kinds of practices that prioritize, throttle or discount certain kinds of data packets. In her thesis at the Oxford Internet Institute, Alissa Cooper&lt;a href="#_ftn8" name="_ftnref8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; states that traffic management involves three different sets of criteria: a) the subset of traffic to be managed, identified by criteria based on source, destination, application or users; b) the trigger for the traffic management measure, which could be based upon time of day, a usage threshold or a specific network condition; and c) the traffic treatment put into practice when the trigger is met. The traffic treatment can be of three kinds. The first is blocking, in which traffic is prevented from being delivered. The second is prioritization, under which identified traffic is sent sooner or later; this is usually done in cases of congestion, when one kind of traffic needs to be prioritized. The third is rate limiting, where identified traffic is limited to a defined sending rate.&lt;a href="#_ftn9" name="_ftnref9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; The dumb network does not interfere with an application's operation, nor is it sensitive to the needs of an application; in this way it treats all information sent over it as equal. In such a network, the content of the packets is not examined, and Internet providers act according to the destination of the data rather than any other factor. However, in order to perform traffic management in various circumstances, Deep Packet Inspection technology, which does look at the content of data packets, is commonly used by service providers.&lt;/p&gt;
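Cooper's three-part scheme – a matching criterion, a trigger, and a treatment – can be sketched in a few lines of code. The sketch below is purely illustrative: the Packet and Rule types and the example policy are hypothetical, not drawn from any real traffic-management system.

```python
from dataclasses import dataclass

# Illustrative sketch of Cooper's three-part traffic-management scheme:
# (a) a criterion identifying a traffic subset, (b) a trigger condition on
# network state, (c) a treatment applied when the trigger fires.

@dataclass
class Packet:
    src: str
    dst: str
    app: str    # application guessed for the flow, e.g. "p2p", "voip"
    size: int   # bytes

@dataclass
class Rule:
    matches: callable    # (a) criterion: Packet to bool
    triggered: callable  # (b) trigger: state dict to bool
    treatment: str       # (c) "block", "prioritize", or "rate_limit"

def classify(packet, rules, state):
    """Return the treatment for a packet, or 'forward' if no rule applies."""
    for rule in rules:
        if rule.matches(packet) and rule.triggered(state):
            return rule.treatment
    return "forward"

# Example policy: rate-limit peer-to-peer traffic during congestion.
rules = [Rule(matches=lambda p: p.app == "p2p",
              triggered=lambda s: s["utilization"] > 0.9,
              treatment="rate_limit")]

print(classify(Packet("10.0.0.1", "10.0.0.2", "p2p", 1500),
               rules, {"utilization": 0.95}))   # rate_limit
print(classify(Packet("10.0.0.1", "10.0.0.2", "web", 1500),
               rules, {"utilization": 0.95}))   # forward
```

Note how the criterion and the trigger are independent: the same "p2p" rule applies only when the hypothetical utilization threshold is crossed, which mirrors congestion-based prioritization as described above.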
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_r7ojhgh467u5"&gt;&lt;/a&gt; Deep Packet Inspection&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Deep packet inspection (DPI) enables the examination of the content of data packets being sent over the Internet. Christopher Parsons explains the header and the payload of a data packet with respect to the OSI model. In order to understand this better, it is more useful to speak of the network in terms of the seven layers of the OSI model, as opposed to the three layers discussed above.&lt;a href="#_ftn10" name="_ftnref10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the OSI model, the top layer, the Application Layer, is in contact with the software making a data request. For instance, if the activity in question is accessing a webpage, the web browser makes a request to access a page, which is then passed on to the lower layers. The next layer is the Presentation Layer, which deals with the format in which the data is presented; this layer performs encryption and compression of the data. In the above example, this would involve asking for the HTML file. Next comes the Session Layer, which initiates, manages and ends communication between the sender and receiver. In the above example, this would involve transmitting and regulating the data of the webpage, including its text, images or any other media. These three layers are part of the 'payload' of the data packet.&lt;a href="#_ftn11" name="_ftnref11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The next four layers are part of the 'header' of the data packet. It begins with the Transport Layer, which collects data from the payload, creates a connection between the point of origin and the point of receipt, and assembles the packets in the correct order. In terms of accessing a webpage, this involves connecting the requesting computer system with the server hosting the data, and ensuring the data packets are put together in an arrangement which is cohesive when they are received. The next layer is the Network Layer, which addresses and routes the packets from the source machine to the destination machine. Then comes the Data Link Layer, which formats the data packets in such a way that they are compatible with the medium being used for their transmission. The final layer is the Physical Layer, which determines the actual media used for transmitting the packets.&lt;a href="#_ftn12" name="_ftnref12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The transmission of the data packet occurs between the client and server, and packet inspection occurs through equipment placed between the client and the server. There are various ways in which packet inspection has been classified, differing on the depth of inspection required for it to be categorized as Deep Packet Inspection. We rely on Parsons' classification system in this article. According to him, there are three broad categories of packet inspection: shallow, medium and deep.&lt;a href="#_ftn13" name="_ftnref13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Shallow packet inspection involves inspection of only the header, usually checking it against a blacklist. The focus in this form of inspection is on the source and destination (IP address and the packet's port number). This form of inspection primarily deals with the Data Link Layer and Network Layer information of the packet. Shallow packet inspection is used by firewalls.&lt;a href="#_ftn14" name="_ftnref14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
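To make the header-only nature of shallow inspection concrete, here is a minimal Python sketch that reads nothing but the IPv4 addresses and transport-layer ports of a raw packet and checks the source against a blacklist. The blacklisted address and the hand-built test packet are hypothetical, and the parser ignores everything a real firewall would also have to handle (IPv6, options, fragmentation).

```python
import struct
import ipaddress

BLACKLIST = {"192.0.2.13"}   # illustrative blocked source address

def shallow_inspect(ip_packet):
    """Look only at header fields: source/destination IP and ports."""
    # IPv4: the low nibble of byte 0 is the header length in 32-bit words;
    # source and destination addresses sit at byte offsets 12 and 16.
    ihl = (ip_packet[0] % 16) * 4
    src = str(ipaddress.IPv4Address(ip_packet[12:16]))
    dst = str(ipaddress.IPv4Address(ip_packet[16:20]))
    # The first 4 bytes after the IP header are the TCP/UDP ports.
    sport, dport = struct.unpack("!HH", ip_packet[ihl:ihl + 4])
    verdict = "drop" if src in BLACKLIST else "pass"
    return src, dst, sport, dport, verdict

# Hand-built 24-byte test packet: version/IHL byte, padding, addresses, ports.
pkt = (bytes([0x45]) + bytes(11)
       + ipaddress.IPv4Address("192.0.2.13").packed
       + ipaddress.IPv4Address("198.51.100.7").packed
       + struct.pack("!HH", 51000, 443))

print(shallow_inspect(pkt))   # ('192.0.2.13', '198.51.100.7', 51000, 443, 'drop')
```

The point of the sketch is what it never touches: nothing past byte `ihl + 4` is read, so the payload remains opaque to the inspector.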
&lt;p style="text-align: justify; "&gt;Medium packet inspection involves equipment placed between the computers running the applications and the ISP or Internet gateways. It uses application proxies, where the header information is inspected against a loaded parse-list and used to look at specific flows. These kinds of inspection technologies are used to look for specific kinds of traffic flows and take pre-defined actions upon identifying them. In this case, the header and a small part of the payload are examined.&lt;a href="#_ftn15" name="_ftnref15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, deep packet inspection (DPI) enables networks to examine the origin and destination as well as the content of data packets (header and payload). These technologies look for protocol non-compliance, spam, harmful code or any specific kinds of data that the network wants to monitor. The feature of DPI technology that makes it an important subject of study is the range of uses it can be put to: the use cases vary from real-time analysis of packets to interception, storage and analysis of the contents of packets.&lt;a href="#_ftn16" name="_ftnref16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
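A toy illustration of the contrast: where shallow inspection stops at the header, DPI searches the payload bytes themselves for known patterns. The two signatures below are real protocol markers (the BitTorrent handshake string and the start of a plaintext HTTP request), but the matcher itself is a deliberately simplified sketch of how signature-based DPI classifies traffic.

```python
# Simplified DPI signature scan: unlike shallow inspection, the payload
# itself is searched for byte patterns associated with protocols.
SIGNATURES = {
    b"BitTorrent protocol": "p2p",   # appears in the BitTorrent handshake
    b"GET ": "http",                 # start of a plaintext HTTP request
}

def deep_inspect(payload):
    """Return the first matching traffic class, or None."""
    for pattern, label in SIGNATURES.items():
        if pattern in payload:
            return label
    return None

print(deep_inspect(b"\x13BitTorrent protocol..."))   # p2p
print(deep_inspect(b"GET /index.html HTTP/1.1"))     # http
```

Real DPI engines also reassemble flows and match across packet boundaries, but the privacy-relevant step is the same: the user's content, not just the envelope, is read.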
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_pi28w1745j15"&gt;&lt;/a&gt; The different purposes of DPI&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Network Management and QoS&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary justification presented for DPI is network management, and as a means to guarantee and ensure a certain minimum level of Quality of Service (QoS). QoS, as a value conflicting with the objectives of network neutrality, has emerged as a significant discussion point in this topic. Much like network neutrality, QoS is also a term often used in vague, general and non-definitive ways. The factors that come into play in QoS are network-imposed delay, jitter, bandwidth and reliability. Delay, as the name suggests, is the time taken for a packet to pass from the sender to the receiver. Higher levels of delay are characterized by more data packets held 'in transit' in the network.&lt;a href="#_ftn17" name="_ftnref17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; A paper by Paul Ferguson and Geoff Huston described TCP as a 'self-clocking' protocol:&lt;a href="#_ftn18" name="_ftnref18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; it enables the transmission rate of the sender to be adjusted to the rate of reception by the receiver. As the delay and consequent stress on the protocol increase, this feedback ability begins to lose its sensitivity. This becomes most problematic for VoIP and video applications. The idea of QoS generally entails consistent service quality with low delay, low jitter and high reliability, achieved through preferential treatment of some traffic on criteria formulated around the need of such traffic for greater latency sensitivity and low delay and jitter. This is where deep packet inspection comes into play. In 1991, Cisco pioneered a new kind of router that could inspect data packets flowing through the network. DPI is able to look inside packets at their content, enabling it to classify packets according to a formulated policy. DPI, which began as a security tool, is powerful because it allows ISPs to limit or block specific applications or improve the performance of applications in telephony, streaming and real-time gaming. Very few scholars believe in an all-or-nothing approach to network neutrality and QoS, and the debate often comes down to what forms of differentiation are reasonable for service providers to practice.&lt;a href="#_ftn19" name="_ftnref19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Security&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Deep packet inspection was initially intended as a measure to manage the network and protect it from transmitting malicious programs. As mentioned above, shallow packet inspection was used to secure LANs and keep out certain kinds of unwanted traffic.&lt;a href="#_ftn20" name="_ftnref20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; DPI is used for similar purposes where it is considered useful to enhance security with a 'deeper' inspection that examines the payload along with the header information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Surveillance&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The third purpose of DPI is what concerns privacy theorists the most. The fact that DPI technologies give network operators access to the actual content of data packets puts them in a position of great power, as well as making them susceptible to significant pressure from the state.&lt;a href="#_ftn21" name="_ftnref21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; For instance, in the US, ISPs are required to conform to the provisions of the Communications Assistance for Law Enforcement Act (CALEA), which means they need to have some surveillance capacities designed into their systems. What is more disturbing for privacy theorists than the use of DPI for surveillance under legislation like CALEA are the other alleged uses by organisations like the National Security Agency, through back-end access to the information via the ISPs. Aside from the US government, there have been various reports of the use of DPI by governments in countries like China,&lt;a href="#_ftn22" name="_ftnref22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Malaysia&lt;a href="#_ftn23" name="_ftnref23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and Singapore.&lt;a href="#_ftn24" name="_ftnref24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Behavioral targeting&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;DPI also enables very granular tracking of the online activities of Internet users. This information is invaluable for the behavioral targeting of content and advertising. Traditionally, this has been done through cookies and other tracking software; DPI offers ISPs and their advertising partners a new way of doing what has so far been possible only through web-based tools. DPI enables ISPs to monitor the contents of data packets and use them to create profiles of users, which can later be employed for purposes such as targeted advertising.&lt;a href="#_ftn25" name="_ftnref25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_gn60r7ifwcge"&gt;&lt;/a&gt; Impact on Privacy&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Each of the above use-cases has significant implications for the privacy of Internet users as the technology in question involves access, tracking or 	retention of their online communication and usage activity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Alissa Cooper compares DPI with other technologies carrying out content inspection, such as caching services and individual users employing firewalls or packet sniffers. She argues that one of the most distinguishing features of DPI is the potential for "mission creep."&lt;a href="#_ftn26" name="_ftnref26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kevin Werbach writes that while networks may deploy DPI for implementation under CALEA or for peer-to-peer traffic shaping, once deployed, DPI techniques can be used for completely different purposes, such as pattern matching of intercepted content and storage of raw data or conclusions drawn from the data.&lt;a href="#_ftn27" name="_ftnref27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; This scope for mission creep is even more problematic because it is completely invisible. As opposed to other technologies which rely on cookies or other web-based services, the inspection occurs not at the end points, but somewhere in the middle of the network, often without leaving any traces on the user's system, thus rendering it virtually undiscoverable.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Much like other forms of surveillance, DPI threatens the sense that the web is a space where people can engage freely with a wide range of people and services. For such a space to continue to exist, it is important for people to feel secure about their communications and transactions on the medium. This notion of trust is severely harmed by a sense that users are being surveilled and their communication intercepted. This has an obvious chilling effect on free speech and could also impact electronic commerce.&lt;a href="#_ftn28" name="_ftnref28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Alissa Cooper also points out another way in which DPI differs from other content tracking technologies. As DPI is deployed by ISPs, it creates a greater barrier to opting out and choosing another service, since individuals have only limited options as far as ISPs are concerned. Christopher Parsons reviews ISPs using DPI technology in the UK, US and Canada, and notes that various ISPs do state in their terms of service that they use DPI for network management purposes. However, this information is often not as easily accessible as the terms and conditions of online services. Also, as opposed to online services, where it is relatively easy to migrate to another service due to both the presence of more options and the ease of migration, it is a much longer and more difficult process to change one's ISP.&lt;a href="#_ftn29" name="_ftnref29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;&lt;a name="_n5w8euzb4xhb"&gt;&lt;/a&gt; Measures to mitigate risk&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Currently, there are no regulatory frameworks in India which govern DPI technology in any way. The International Telecommunications Union (ITU) prescribes a standard for DPI;&lt;a href="#_ftn30" name="_ftnref30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; however, the standard does not engage with any questions of privacy, and requires all DPI technologies to be capable of identifying payload data and prescribing classification rules for specific applications, thus conflicting with notions of application agnosticism in network management. More importantly, the requirements to identify, decrypt and analyse tunneled and encrypted data threaten the reasonable expectation of privacy when sending and receiving encrypted communication. In this final section, I look at some possible principles and practices that may be evolved in order to mitigate the privacy risks posed by DPI technology.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Limiting 'depth' and breadth&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It has been argued that what DPI technology inherently does is match patterns in the inspected content against a pre-defined list relevant to the purpose for which DPI is employed. Much like the data minimization principles applicable to data controllers and data processors, it is possible for network operators to minimize the depth of the inspection (restricting it to header information only, or limited payload information) so as to serve the purpose at hand. For instance, where an ISP is looking to identify peer-to-peer traffic, there are protocols which declare their names in the application header itself. Similarly, a network operator looking to generate usage data about email traffic can do so simply by looking at port numbers and checking them against common email ports.&lt;a href="#_ftn31" name="_ftnref31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, this mitigation strategy may not work well for other use cases, such as blocking malicious software or prohibited content, or monitoring for the sake of behavioral advertising.&lt;/p&gt;
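The port-number shortcut mentioned above can be stated in two lines: classify email traffic from the destination port alone, touching no payload. The port-to-protocol table lists the standard well-known email ports; the function itself is an illustrative sketch, not any operator's actual practice.

```python
# Depth-limited sketch: classify email traffic from the destination port
# alone, without inspecting any payload bytes.
EMAIL_PORTS = {25: "smtp", 110: "pop3", 143: "imap",
               587: "submission", 993: "imaps", 995: "pop3s"}

def is_email_traffic(dst_port):
    return dst_port in EMAIL_PORTS

print(is_email_traffic(993))   # True
print(is_email_traffic(443))   # False
```

The privacy gain is structural: a check that consumes only a header field can never reveal message content, whatever pressure the operator later faces.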
&lt;p style="text-align: justify; "&gt;While depth refers to the degree of inspection within data packets, breadth refers to the volume of packets being inspected. Alissa Cooper argues that for many DPI use cases, it may be possible to rely on pattern matching on only the first few data packets in a flow in order to gather sufficient data to take the appropriate action. Cooper uses the same example about peer-to-peer traffic: in some cases, the protocol name may appear in the header of only the first packet of a flow between two peers. In such circumstances, the network operator need not look beyond the header of the first packet in a flow, and can apply the network management rule to the entire flow.&lt;a href="#_ftn32" name="_ftnref32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
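Cooper's breadth-limiting idea can be sketched as a flow cache: inspect the first packet of a flow, remember the verdict, and apply it to every later packet without looking inside. The flow key and the single BitTorrent signature below are illustrative assumptions, not a description of any deployed system.

```python
# Breadth-limited sketch: only the first packet of each flow is inspected;
# the cached verdict is applied to the rest of the flow unexamined.
flow_cache = {}   # (src, dst, sport, dport) -> traffic class

def classify_flow(flow_id, payload):
    if flow_id not in flow_cache:
        # Only the first packet of a flow is actually inspected.
        flow_cache[flow_id] = "p2p" if b"BitTorrent" in payload else "other"
    return flow_cache[flow_id]

fid = ("10.0.0.1", "10.0.0.2", 6881, 6881)
print(classify_flow(fid, b"\x13BitTorrent protocol"))  # p2p (inspected)
print(classify_flow(fid, b"opaque later packet"))      # p2p (from cache)
```

After the first packet, the payloads of the remaining packets in the flow are never read, which is exactly the privacy benefit the paragraph describes.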
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Data retention&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Aside from the depth and breadth of inspection, another important question is whether, and for how long, data needs to be retained. Not all use cases require data retention, and even where DPI is used for behavioral advertising, only the conclusions drawn may be retained instead of the payload data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Transparency&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;One of the issues is that DPI technology is developed and deployed outside the purview of standards organizations like ISO. Hence, there has been a lack of an open, transparent standards development process in which participants have deliberated the impact of the technology. It is important for DPI to undergo such processes, which should be inclusive, with participation by non-engineering stakeholders to highlight public policy issues such as privacy. Further, aside from the technology itself, the practices of networks need to be more transparent.&lt;a href="#_ftn33" name="_ftnref33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Networks can disclose the presence of DPI, the level of detail being inspected or retained, and the purpose for which DPI is deployed. Some ISPs provide some of these details in their terms of service and website notices.&lt;a href="#_ftn34" name="_ftnref34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; However, as opposed to web-based services, users have limited interaction with their ISP. It would be useful for ISPs to enable greater engagement with their users and make their practices more transparent.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The very nature of DPI technology renders some aspects of recognized privacy principles, like notice and consent, obsolete. The current privacy frameworks under FIPP&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and OECD&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; rely on the idea of empowering individuals by providing them with knowledge that enables them to make informed choices. However, for this liberal conception of privacy to function meaningfully, it is necessary that individuals are presented with real and genuine alternatives. While principles like data minimisation, necessity and proportionality, and purpose limitation can be instrumental in ensuring that DPI technology is used only for legitimate purposes, without effective opt-out mechanisms, and given the limited capacity of individuals to assess the risks, the efficacy of privacy principles may be far from satisfactory.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ongoing Aadhaar case and a host of surveillance projects like CMS, NATGRID, NETRA&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; and NMAC	&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; have raised concerns about the state conducting mass-surveillance, particularly 	of online content. In this regard, it is all the more important to recognise the potential of Deep Packet Inspection technologies for impact on privacy 	rights of individuals. Earlier, the Centre for Internet and Society had filed Right to Information applications with the Department of Telecommunications, Government of India regarding the use of DPI, and the government had responded that there was no direction/reference to the ISPs to employ DPI technology.	&lt;a href="#_ftn39" name="_ftnref39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Similarly, MTNL also responded to the RTI Applications and denied using the 	technology.&lt;a href="#_ftn40" name="_ftnref40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; It is notable though, that they did not respond to the questions 	about the traffic management policies they follow. Thus, so far there has been little clarity on actual usage of DPI technology by the ISPs.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ashish Mishra, "India's Net Neutrality Crusaders", available at 			&lt;a href="http://mintonsunday.livemint.com/news/indias-net-neutrality-crusaders/2.3.2289565628.html"&gt; http://mintonsunday.livemint.com/news/indias-net-neutrality-crusaders/2.3.2289565628.html &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.livinginternet.com/i/iw_arch.htm"&gt;http://www.livinginternet.com/i/iw_arch.htm&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Vinton Cerf and Robert Kahn, "A protocol for packet network intercommunication", available at 			&lt;a href="https://www.semanticscholar.org/paper/A-protocol-for-packet-network-intercommunication-Cerf-Kahn/7b2fdcdfeb5ad8a4adf688eb02ce18b2c38fed7a"&gt; https://www.semanticscholar.org/paper/A-protocol-for-packet-network-intercommunication-Cerf-Kahn/7b2fdcdfeb5ad8a4adf688eb02ce18b2c38fed7a &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ganley and Ben Algove, "Network Neutrality-A User's Guide", available at			&lt;a href="http://wiki.commres.org/pds/NetworkNeutrality/NetNeutrality.pdf"&gt;http://wiki.commres.org/pds/NetworkNeutrality/NetNeutrality.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; J H Saltzer, D D Clark and D P Reed, "End-to-End arguments in System Design", available at			&lt;a href="http://web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf"&gt;http://web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 4.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jonathan Zittrain, The future of Internet - and how to stop it, (Yale University Press and Penguin UK, 2008) available at 			&lt;a href="https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future%20of%20the%20Internet.pdf?sequence=1"&gt; https://dash.harvard.edu/bitstream/handle/1/4455262/Zittrain_Future%20of%20the%20Internet.pdf?sequence=1 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alissa Cooper, How Regulation and Competition Influence Discrimination in Broadband Traffic Management: A Comparative Study of Net Neutrality in 			the United States and the United Kingdom available at 			&lt;a href="http://ora.ox.ac.uk/objects/uuid:757d85af-ec4d-4d8a-86ab-4dec86dab568"&gt; http://ora.ox.ac.uk/objects/uuid:757d85af-ec4d-4d8a-86ab-4dec86dab568 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Christopher Parsons, "The Politics of Deep Packet Inspection: What Drives Surveillance by Internet Service Providers?", available at 			&lt;a href="https://www.christopher-parsons.com/the-politics-of-deep-packet-inspection-what-drives-surveillance-by-internet-service-providers/"&gt; https://www.christopher-parsons.com/the-politics-of-deep-packet-inspection-what-drives-surveillance-by-internet-service-providers/ &lt;/a&gt; at 15.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 16.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 19.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Id&lt;/i&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Jay Klein, "Digging Deeper Into Deep Packet Inspection (DPI)", available at			&lt;a href="http://spi.unob.cz/papers/2007/2007-06.pdf"&gt;http://spi.unob.cz/papers/2007/2007-06.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Tim Wu, "Network Neutrality: Broadband Discrimination", available at			&lt;a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=388863"&gt;http://papers.ssrn.com/sol3/papers.cfm?abstract_id=388863&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ferguson and Geoff Huston, "Quality of Service on the Internet: Fact, Fiction,&lt;/p&gt;
&lt;p&gt;or Compromise?", available at &lt;a href="http://www.potaroo.net/papers/1998-6-qos/qos.pdf"&gt;http://www.potaroo.net/papers/1998-6-qos/qos.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Barbara van Schewick, "Network Neutrality and Quality of Service: What a non-discrimination Rule should look like", available at 			&lt;a href="http://cyberlaw.stanford.edu/downloads/20120611-NetworkNeutrality.pdf"&gt; http://cyberlaw.stanford.edu/downloads/20120611-NetworkNeutrality.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 14.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ohm, "The Rise and Fall of Invasive ISP Surveillance," available at 			&lt;a href="http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf"&gt; http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Ben Elgin and Bruce Einhorn, "The great firewall of China", available at 			&lt;a href="http://www.bloomberg.com/news/articles/2006-01-22/the-great-firewall-of-china"&gt; http://www.bloomberg.com/news/articles/2006-01-22/the-great-firewall-of-china &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Mike Wheatley, "Malaysia's Web Heavily Censored Before Controversial Elections", available at 			&lt;a href="http://siliconangle.com/blog/2013/05/06/malaysias-web-heavily-censored-before-controversial-elections/"&gt; http://siliconangle.com/blog/2013/05/06/malaysias-web-heavily-censored-before-controversial-elections/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Fazal Majid, "Deep packet inspection rears it ugly head" available at			&lt;a href="https://majid.info/blog/telco-snooping/"&gt;https://majid.info/blog/telco-snooping/&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Alissa Cooper, "Doing the DPI Dance: Assessing the Privacy Impact of Deep Packet Inspection," in W. Aspray and P. Doty (Eds.), Privacy in America: 			Interdisciplinary Perspectives, Plymouth, UK: Scarecrow Press, 2011 at 151.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 148.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Kevin Werbach, "Breaking the Ice: Rethinking Telecommunications Law for the Digital Age", Journal of Telecommunications and High Technology, 			available at &lt;a href="http://www.jthtl.org/articles.php?volume=4"&gt;http://www.jthtl.org/articles.php?volume=4&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 149.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 147.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; International Telecommunications Union, Recommendation ITU-T.Y.2770, Requirements for Deep Packet Inspection in next generation networks, available 			at &lt;a href="https://www.itu.int/rec/T-REC-Y.2770-201211-I/en"&gt;https://www.itu.int/rec/T-REC-Y.2770-201211-I/en&lt;/a&gt;.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra &lt;/i&gt; Note 25 at 154.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Ibid&lt;/i&gt; at 156.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;i&gt;Supra&lt;/i&gt; Note 10.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Paul Ohm, "The Rise and Fall of Invasive ISP Surveillance", available at 			&lt;a href="http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf"&gt; http://paulohm.com/classes/infopriv10/files/ExcerptOhmISPSurveillance.pdf &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.nist.gov/nstic/NSTIC-FIPPs.pdf"&gt;http://www.nist.gov/nstic/NSTIC-FIPPs.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="https://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm"&gt; https://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; "India's Surveillance State" Software Freedom Law Centre, available at 			&lt;a href="http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/"&gt; http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Amber Sinha, "Are we losing our right to privacy and freedom on speech on Indian Internet", DNA, available at 			&lt;a href="http://www.dnaindia.com/scitech/column-are-we-losing-the-right-to-privacy-and-freedom-of-speech-on-indian-internet-2187527"&gt; http://www.dnaindia.com/scitech/column-are-we-losing-the-right-to-privacy-and-freedom-of-speech-on-indian-internet-2187527 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;&lt;sup&gt;&lt;sup&gt;[39]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://cis-india.org/telecom/use-of-dpi-technology-by-isps.pdf"&gt;http://cis-india.org/telecom/use-of-dpi-technology-by-isps.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;&lt;sup&gt;&lt;sup&gt;[40]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Smita Mujumdar, "Use of DPI Technology by ISPs - Response by the Department of Telecommunications" available at 			&lt;a href="http://cis-india.org/telecom/dot-response-to-rti-on-use-of-dpi-technology-by-isps"&gt; http://cis-india.org/telecom/dot-response-to-rti-on-use-of-dpi-technology-by-isps &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy'&gt;http://editors.cis-india.org/internet-governance/blog/deep-packet-inspection-how-it-works-and-its-impact-on-privacy&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-12-16T23:14:49Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle">
    <title>New Approaches to Information Privacy – Revisiting the Purpose Limitation Principle</title>
    <link>http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle</link>
    <description>
        &lt;b&gt;Article on Aadhaar throwing light on privacy and data protection.&lt;/b&gt;
        
&lt;p&gt;This was &lt;a class="external-link" href="http://www.digitalpolicy.org/revisiting-the-principles-of-purpose-limitation-under-existing-data-protection-norms/"&gt;published in Digital Policy Portal&lt;/a&gt; on July 13, 2016.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Introduction&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Last year, Mukul Rohatgi, the Attorney General of India, called into question existing jurisprudence of the last 50 years on the constitutional validity of the right to privacy.&lt;sup&gt;1&lt;/sup&gt; Mohatgi was rebutting the arguments on privacy made against Aadhaar, the unique identity project initiated and implemented in the country without any legislative mandate.&lt;sup&gt;2&lt;/sup&gt; The question of the right to privacy becomes all the more relevant in the context of events over the last few years—among them, the significant rise in data collection by the state through various e-governance schemes,&lt;sup&gt;3&lt;/sup&gt; systematic access to personal data by various wings of the state through a host of surveillance and law enforcement initiatives launched in the last decade,&lt;sup&gt;4&lt;/sup&gt; the multifold increase in the number of Indians online, and the ubiquitous collection of personal data by private parties.&lt;sup&gt;5&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;These developments have led to a call for a comprehensive privacy legislation in India and the adoption of the National Privacy Principles as laid down by the Expert Committee led by Justice AP Shah.&lt;sup&gt;6&lt;/sup&gt; There are privacy-protection legislation currently in place such as the Information Technology Act, 2000 (IT Act), which was enacted to govern digital content and communication and provide legal recognition to electronic transactions. This legislation has provisions that can safeguard—and dilute—online privacy. At the heart of the data protection provisions in the IT Act lies section 43A and the rules framed under it, i.e., Reasonable security practices and procedures and sensitive personal data information.&lt;sup&gt;7&lt;/sup&gt;Section 43A mandates that body corporates who receive, possess, store, deal, or handle any personal data to implement and maintain ‘reasonable security practices’, failing which, they are held liable to compensate those affected. Rules drafted under this provision also mandated a number of data protection obligations on corporations such the need to seek consent before collection, specifying the purposes of data collection, and restricting the use of data to such purposes only. There have been questions raised about the validity of the Section 43A Rules as they seek to do much more than mandate in the parent provisions, Section 43A— requiring entities to maintain reasonable security practices.&lt;/p&gt;
&lt;h3&gt;Privacy as control?&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Even setting aside the issue of legal validity, the kind of data protection framework envisioned by Section 43A rules is proving to be outdated in the context of how data is now being collected and processed. The focus of Section 43 A Rules—as well as that of draft privacy legislations in India&lt;sup&gt;8&lt;/sup&gt;—is based on the idea of individual control. Most apt is Alan Westin’s definition of privacy: “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to other.”&lt;sup&gt;9&lt;/sup&gt; Westin and his followers rely on the normative idea of “informational self- determination”, the notion of a pure, disembodied, and atomistic self, capable of making rational and isolated choices in order to assert complete control over personal information. More and more this has proved to be a fiction especially in a networked society.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Much before the need for governance of information technologies had reached a critical mass in India, Western countries were already dealing with the implications of the use of these technologies on personal data. In 1973, the US Department of Health, Education and Welfare appointed a committee to address this issue, leading to a report called ‘Records, Computers and Rights of Citizens.’&lt;sup&gt;10&lt;/sup&gt; The Committee’s mandate was to “explore the impact of computers on record keeping about individuals and, in addition, to inquire into, and make recommendations regarding, the use of the Social Security number.” The Report articulated five principles which were to be the basis of fair information practices: transparency; use limitation; access and correction; data quality; and security. Building upon these principles, the Committee of Ministers of the Organization for Economic Cooperation and Development (OECD) arrived at the Guidelines on the Protection of Privacy and Transborder Flows of Personal Data in 1980.&lt;sup&gt;11&lt;/sup&gt; These principles— Collection Limitation, Data Quality, Purpose Specification, Use Limitation, Security Safeguards, Openness, Individual Participation and Accountability—are what inform most data protection regulations today including the APEC Framework, the EU Data Protection Directive, and the Section 43A Rules and Justice AP Shah Principles in India.&lt;/p&gt;
&lt;p&gt;Fred Cate describes the import of these privacy regimes as follows:&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;“All of these data protection instruments reflect the same approach: tell individuals what data you wish to collect or use, give them a choice, grant them access, secure those data with appropriate technologies and procedures, and be subject to third-party enforcement if you fail to comply with these requirements or individuals’ expressed preferences.”&lt;sup&gt;12&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This is in line with Alan Westin’s idea of privacy exercised through individual control. Therefore the focus of these principles is on empowering the individuals to exercise choice, but not on protecting individuals from harmful or unnecessary practices of data collection and processing. The author of this article has earlier written&lt;sup&gt;13&lt;/sup&gt; about the sheer inefficacy of this framework which places the responsibility on individuals. Other scholars like Daniel Solove,&lt;sup&gt;14&lt;/sup&gt; Jonathan Obar&lt;sup&gt;15&lt;/sup&gt; and Fred Cate&lt;sup&gt;16&lt;/sup&gt; have also written about the failure of traditional data protection practices of notice and consent. While these essays dealt with the privacy principles of choice and informed consent, this paper will focus on the principles of purpose limitation.&lt;/p&gt;
&lt;h3&gt;Purpose Limitation and Impact of Big Data&lt;/h3&gt;
&lt;p&gt;The principle of purpose limitation, or purpose specification, seeks to ensure the following four objectives:&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li&gt;Personal information collected and processed should be adequate and relevant to the purposes for which they are processed.&lt;/li&gt;
&lt;li&gt;Entities should collect, process, disclose, make available, or otherwise use personal information only for the stated purposes.&lt;/li&gt;
&lt;li&gt;In case of a change in purpose, the data subject needs to be informed and their consent has to be obtained.&lt;/li&gt;
&lt;li&gt;After personal information has been used in accordance with the identified purpose, it has to be destroyed as per the identified procedures.&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;The purpose limitation along with the data minimisation principle—which requires that no more data may be processed than is necessary for the stated purpose—aim to limit the use of data to what is agreed to by the data subject. These principles are in direct conflict with new technology which relies on ubiquitous collection and indiscriminate uses of data. The main import of Big Data technologies on the inherent value in data which can be harvested not by the primary purposes of data collection but through various secondary purposes which involve processing of the data repeatedly.&lt;sup&gt;17&lt;/sup&gt;Further, instead to destroying the data when its purpose has been achieved, the intent is to retain as much data as possible for secondary uses. Importantly, as these secondary uses are of an inherently unanticipated nature, it becomes impossible to account for it at the stage of collection and providing the choice to the data subject.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Followers of the discourse on Big Data would be well aware of its potential impacts on privacy. De-identification techniques to protect the identities of individuals in dataset face a threat from an increase in the amount of data available either publicly or otherwise to a party seeking to reverse-engineer an anonymised dataset to re-identify individuals. &lt;sup&gt;18&lt;/sup&gt; Further, Big Data analytics promise to find patterns and connections that can contribute to the knowledge available to the public to make decisions. What is also likely is that it will lead to revealing insights about people that they would have preferred to keep private.&lt;sup&gt;19&lt;/sup&gt;In turn, as people become more aware of being constantly profiled by their actions, they will self-regulate and ‘discipline’ their behaviour. This can lead to a chilling effect.&lt;sup&gt;20&lt;/sup&gt; Meanwhile, Big Data is also fuelling an industry that incentivises businesses to collect more data, as it has a high and growing monetary value. However, Big Data also promises a completely new kind of knowledge that can prove to be revolutionary in fields as diverse as medicine, disaster-management, governance, agriculture, transport, service delivery, and decision-making.&lt;sup&gt;21&lt;/sup&gt; As long as there is a sufficiently large and diverse amount of data, there could be invaluable insights locked in it, accessing which can provide solutions to a number of problems. In light of this, it is important to consider what kind of regulatory framework is most suitable which could facilitate some of the promised benefits of Big Data and at the same time mitigate its potential harm. This, coupled with the fact that the existing data protection principles have, by most accounts, run their course, makes the examination of alternative frameworks even more important. 
This article will examine some alternate proposals made to the existing framework of purpose limitation below.&lt;/p&gt;
&lt;h3&gt;Harms-based approach&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Some scholars like Fred Cate&lt;sup&gt;22&lt;/sup&gt; and Daniel Solove&lt;sup&gt;23&lt;/sup&gt; have argued that there is a need for the primary focus of data protection law to move from control at the stage of data collection to actual use cases. In his article on the failure of Fair Information Practice Principles,&lt;sup&gt;24&lt;/sup&gt;Cate puts forth a proposal for ‘Consumer Privacy Protection Principles.’ Cate envisions a more interventionist role of the data protection authorities by regulating information flows when required, in order to protect individuals from risky or harmful uses of information. Cate’s attempt is to extend the principles of consumer protection law of prevention and remedy of harms.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In a re-examination of the OECD Privacy Principles, Cate and Viktor Mayer Schöemberger attempt to discard the use of personal data to only purposes specified. They felt that restricting the use of personal to only specified purposes could significantly threaten various research and beneficial uses of Big Data. Instead of articulating a positive obligations of what personal data collected could be used for, they attempt to arrive at a negative obligation of use-cases prevented by law. Their working definition of the Use specification principle broaden the scope of use cases by only preventing use of data “if the use is fraudulent, unlawful, deceptive or discriminatory; society has deemed the use inappropriate through a standard of unfairness; the use is likely to cause unjustified harm to the individual; or the use is over the well-founded objection of the individual, unless necessary to serve an over-riding public interest, or unless required by law.”&lt;sup&gt;25&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;While most standards in the above definition have established understanding in jurisprudence, the concept of unjustifiable harm is what we are interested in. Any theory of harms-based approach goes back to John Stuart Mill’s dictum that the only justifiable purpose to exert power over the will of an individual is to prevent harm to others. Therefore, any regulation that seeks to control or prevent autonomy of individuals (in this case, the ability of individuals to allow data collectors to use their personal data, and the ability of data collectors to do so, without any limitation) must clearly demonstrate the harm to the individuals in question.&lt;/p&gt;
&lt;p&gt;Fred Cate articulates the following steps to identify tangible harm and respond to its presence:&lt;sup&gt;26&lt;/sup&gt;&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li&gt;Focus on Use — Actual use of the data should be considered, not mere possession. The assumption is that the collection, possession, or transfer of information does not significantly harm people; rather, harm arises from the use of information following such collection, possession, or transfer.&lt;/li&gt;
&lt;li&gt;Proportionality — Any regulatory measure must be proportional to the likelihood and severity of the harm identified.&lt;/li&gt;
&lt;li&gt;Per se Harmful Uses — Uses which are always harmful must be prohibited by law.&lt;/li&gt;
&lt;li&gt;Per se not Harmful Uses — If uses can be considered inherently not harmful, they should not be regulated.&lt;/li&gt;
&lt;li&gt;Sensitive Uses — In cases where uses are neither per se harmful nor per se not harmful, individual consent must be sought for using the data for those purposes.&lt;/li&gt;&lt;/ol&gt;
&lt;p style="text-align: justify;"&gt;The proposal by Cate argues for what is called a ‘use based system’, which is extremely popular with American scholars. Under this system, data collection itself is not subject to restrictions; rather, only the use of data is regulated. This argument has great appeal for both businesses who can reduce their overheads significantly if consent obligations are done away with as long as they use the data in ways which are not harmful, as well as critics of the current data protection framework which relies on informed consent. Lokke Moerel explains the philosophy of ‘harms based approach’ or ‘use based system’ in United States by juxtaposing it against the ‘rights based approach’ in Europe.&lt;sup&gt;27&lt;/sup&gt; In Europe, rights of individuals with regard to processing of their personal data is a fundamental human right and therefore, a precautionary principle is followed with much greater top-down control upon data collection. However, in the United States, there is a far greater reliance on market mechanisms and self-regulating organisations to check inappropriate processing activities, and government intervention is limited to cases where a clear harm is demonstrable.&lt;sup&gt;28&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Continuing research by the Centre for Information Policy Leadership under its Privacy Risk Framework Project looks at a system of articulating what harms and risks arising from use of collected data. They have arrived a matrix of threats and harms. Threats are categorised as —a) inappropriate use of personal information and b) personal information in the wrong hands. More importantly for our purposes, harms are divided into: a) tangible harms which are physical or economic in nature (bodily harm, loss of liberty, damage to earning power and economic interests); b) intangible harms which can be demonstrated (chilling effects, reputational harm, detriment from surveillance, discrimination and intrusion into private life); and c) societal harm (damage to democratic institutions and loss of social trust).&lt;sup&gt;29&lt;/sup&gt;For any harms-based system, a matrix like above needs to emerge clearly so that regulation can focus on mitigating practices leading to the harms.&lt;/p&gt;
&lt;h3&gt;Legitimate interests&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Lokke Moerel and Corien Prins, in their article “Privacy for Homo Digitalis – Proposal for a new regulatory framework for data protection in the light of Big Data and Internet of Things”&lt;sup&gt;30&lt;/sup&gt; use the ideal of responsive regulation which considers empirically observable practices and institutions while determining the regulation and enforcement required. They state that current data protection frameworks—which rely on mandating some principles of how data has to be processed—is exercised through merely procedural notification and consent requirements. Further, Moerel and Prins feel that data protection law cannot only involve a consideration of individual interest but also needs to take into account collective interest. Therefore, the test must be a broader assessment than merely the purpose limitation articulating the interests of the parties directly involved, but whether a legitimate interest is achieved.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Legitimate interest has been put forth as an alternative to the purpose limitation. Legitimate is not a new concept and has been a part of the EU Data Protection Directive and also finds a place in the new General Data Protection Regulation. Article 7 (f) of the EU Directive&lt;sup&gt;31&lt;/sup&gt; provided for legitimate interest balanced against the interests or fundamental rights and freedoms of the data subject as the last justifiable reason for use of data. Due to confusion in its interpretation, the Article 29 Working Party, in 2014,&lt;sup&gt;32&lt;/sup&gt;looked into the role of legitimate interest and arrived at the following factors to determine the presence of a legitimate interest— a) the status of the individual (employee, consumer, patient) and the controller (employer, company in a dominant position, healthcare service); b) the circumstances surrounding the data processing (contract relationship of data subject and processor); c) the legitimate expectations of the individual.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Federico Ferretti has criticised the legitimate interest principle as vague and ambiguous. The balancing of legitimate interest in using the data against fundamental rights and freedoms of the data subject gives the data controllers some degree of flexibility in determining whether data may be processed; however, this also reduces the legal certainty that data subject have of their data not being used for purposes they have not agreed to.&lt;sup&gt;33&lt;/sup&gt;However, it is this paper’s contention that it is not the intent of the legitimate interest criteria but the lack of consensus on its application which creates an ambiguity. Moerel and Prins articulate a test for using legitimate interest which is cognizant of the need to use data for the purpose of Big Data processing, as well as ensuring that the rights of data subjects are not harmed.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;As demonstrated earlier, the processing of data and its underlying purposes have become exceedingly complex and the conventional tool to describe these processes ‘privacy notices’ are too lengthy, too complex and too profuse in numbers to have any meaningful impact.&lt;sup&gt;34&lt;/sup&gt;The idea of information self-determination, as contemplated by Westin in American jurisprudence, is not achieved under the current framework. Moerel and Prins recommend five factors&lt;sup&gt;35&lt;/sup&gt; as relevant in determining the legitimate interest. Of the five, the following three are relevant to the present discussion:&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li style="text-align: justify;"&gt;Collective Interest — A cost-benefit analysis should be conducted, which examines the implications for privacy for the data subjects as well as the society, as a whole.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;The nature of the data — Rather than having specific categories of data, the nature of data needs to be assessed contextually to determine legitimate interest.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Contractual relationship and consent not independent grounds — This test has two parts. First, in case of contractual relationship between data subject and data controller: the more specific the contractual relationship, the more restrictions apply to the use of the data. Second, consent does not function as a separate principle which, once satisfied, need not be revisited. The nature of the consent (opportunities made available to data subject, opt in/opt out, and others) will continue to play a role in determining legitimate interest.&lt;/li&gt;&lt;/ol&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Replacing the purpose limitation principles with a use-based system as articulated above poses the danger of allowing governments and the private sector to carry out indiscriminate data collection under the blanket guise that any and all data may be of some use in the future. The harms-based approach has many merits and there is a stark need for more use of risk assessments techniques and privacy impact assessments in data governance. However, it is important that it merely adds to the existing controls imposed at data collection, and not replace them in their entirety. On the other hand, the legitimate interests principle, especially as put forth by Moerel and Prins, is more cognizant of the different factors at play — the inefficacy of existing purpose limitation principles, the need for businesses to use data for purposes unidentified at the stage of collection, and the need to ensure that it is not misused for indiscriminate collection and purposes. However, it also poses a much heavier burden on data controllers to take into account various factors before determining legitimate interest. If legitimate interest has to emerge as a realistic alternative to purpose limitation, there needs to be greater clarity on how data controllers must apply this principle.&lt;/p&gt;
&lt;h3&gt;Endnotes&lt;/h3&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Prachi Shrivastava, “Privacy not a fundamental right, argues Mukul Rohatgi for Govt as Govt affidavit says otherwise,” Legally India, Jyly 23, 2015, http://www.legallyindia.com/Constitutional-law/privacy-not-a-fundamental-right-argues-mukul-rohatgi-for-govt-as-govt-affidavit-says-otherwise.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt; Rebecca Bowe, “Growing Mistrust of India’s Biometric ID Scheme,” Electronic Frontier Foundation, May 4, 2012, https://www.eff.org/deeplinks/2012/05/growing-mistrust-india-biometric-id-scheme.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Lisa Hayes, “Digital India’s Impact on Privacy: Aadhaar numbers, biometrics, and more,” Centre for Democracy and Technology, January 20, 2015, https://cdt.org/blog/digital-indias-impact-on-privacy-aadhaar-numbers-biometrics-and-more/.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;“India’s Surveillance State,” Software Freedom Law Centre, http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/.&lt;/li&gt;
&lt;li&gt;“Internet Privacy in India,” Centre for Internet and Society, http://cis-india.org/telecom/knowledge-repository-on-internet-access/internet-privacy-in-india.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Vivek Pai, “Indian Government says it is still drafting privacy law, but doesn’t give timelines,” Medianama, May 4, 2016, http://www.medianama.com/2016/05/223-government-privacy-draft-policy/.&lt;/li&gt;
&lt;li&gt;Information Technology (Intermediaries Guidelines) Rules, 2011,&lt;br /&gt; http://deity.gov.in/sites/upload_files/dit/files/GSR314E_10511%281%29.pdf.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Discussion Points for the Meeting to be taken by Home Secretary at 2:30 pm on 7-10-11 to discuss the drat Privacy Bill, http://cis-india.org/internet-governance/draft-bill-on-right-to-privacy.&lt;/li&gt;
&lt;li&gt;Alan Westin, Privacy and Freedom (New York: Atheneum, 1967).&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;US Secretary’s Advisory Committee on Automated Personal Data Systems, Records, Computers and the Rights of Citizens, http://www.justice.gov/opcl/docs/rec-com-rights.pdf.&lt;/li&gt;
&lt;li&gt;OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Fred Cate, “The Failure of Information Practice Principles,” in Consumer Protection in the Age of the Information Economy, ed. Jane K. Winn (Burlington: Aldershot, Hants, England, 2006) http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Amber Sinha and Scott Mason, “A Critique of Consent in Informational Privacy,” Centre for Internet and Society, January 11, 2016, http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy.&lt;/li&gt;
&lt;li&gt;Daniel Solove, “Privacy Self-Management and the Consent Dilemma,” Harvard Law Review 126, (2013): 1880.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Jonathan Obar, “Big Data and the Phantom Public: Walter Lippmann and the fallacy of data privacy self management,” Big Data and Society 2(2), (2015), doi: 10.1177/2053951715608876.&lt;/li&gt;
&lt;li&gt;Supra Note 12.&lt;/li&gt;
&lt;li&gt;Supra Note 14.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1450006; Arvind Narayanan and Vitaly Shmatikov, “Robust De-anonymization of Large Sparse Datasets” available at https://www.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;D. Hirsch, “That’s Unfair! Or is it? Big Data, Discrimination and the FTC’s Unfairness Authority,” Kentucky Law Journal, Vol. 103, available at: http://www.kentuckylawjournal.org/wp-content/uploads/2015/02/103KyLJ345.pdf&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;A Marthews and C Tucker, “Government Surveillance and Internet Search Behavior”, available at http://ssrn.com/abstract=2412564; Danah Boyd and Kate Crawford, “Critical Questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon”, Information, Communication &amp;amp; Society, Vol. 15, Issue 5, (2012).&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Scott Mason, “Benefits and Harms of Big Data”, Centre for Internet and Society, available at http://cis-india.org/internet-governance/blog/benefits-and-harms-of-big-data#_ftn37.&lt;/li&gt;
&lt;li&gt;Cate, “The Failure of Information Practice Principles.”&lt;/li&gt;
&lt;li&gt;Solove, “Privacy Self-Management and the Consent Dilemma,” 1882.&lt;/li&gt;
&lt;li&gt;Cate, “The Failure of Information Practice Principles.”&lt;/li&gt;
&lt;li&gt;Fred Cate and Viktor Mayer-Schönberger, “Notice and Consent in a World of Big Data,” International Data Privacy Law 3(2), (2013): 69.&lt;/li&gt;
&lt;li&gt;Solove, “Privacy Self-Management and the Consent Dilemma,” 1883.&lt;/li&gt;
&lt;li&gt;Lokke Moerel, “Netherlands: Big Data Protection: How To Make The Draft EU Regulation On Data Protection Future Proof”, Mondaq, March 11, 2014, http://www.mondaq.com/x/298416/data+protection/Big+Data+Protection+How+To+Make+The+Dra%20ft+EU+Regulation+On+Data+Protection+Future+Proof%20al%20Lecture.&lt;/li&gt;
&lt;li&gt;Moerel, “Netherlands: Big Data Protection.”&lt;/li&gt;
&lt;li&gt;Centre for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice,” Hunton and Williams LLP, June 19, 2014, https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/white_paper_1-a_risk_based_approach_to_privacy_improving_effectiveness_in_practice.pdf.&lt;/li&gt;
&lt;li&gt;Lokke Moerel and Corien Prins, “Privacy for Homo Digitalis: Proposal for a new regulatory framework for data protection in the light of Big Data and Internet of Things”, Social Science Research Network, May 25, 2016, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2784123.&lt;/li&gt;
&lt;li&gt;EU Directive 95/46/EC – The Data Protection Directive, https://www.dataprotection.ie/docs/EU-Directive-95-46-EC-Chapter-2/93.htm.&lt;/li&gt;
&lt;li&gt;Article 29 Data Protection Working Party, “Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC,” http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf.&lt;/li&gt;
&lt;li&gt;Federico Ferretti, “Data protection and the legitimate interest of data controllers: Much ado about nothing or the winter of rights?,” Common Market Law Review 51(2014): 1-26. http://bura.brunel.ac.uk/bitstream/2438/9724/1/Fulltext.pdf.&lt;/li&gt;
&lt;li&gt;Sinha and Mason, “A Critique of Consent in Informational Privacy.”&lt;/li&gt;
&lt;li&gt;Moerel and Prins, “Privacy for Homo Digitalis.”&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle'&gt;http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-11-09T13:54:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/aadhaar-bill-fails-to-incorporate-suggestions-by-the-standing-committee">
    <title>Aadhaar Bill fails to incorporate suggestions by the Standing Committee</title>
    <link>http://editors.cis-india.org/internet-governance/blog/aadhaar-bill-fails-to-incorporate-suggestions-by-the-standing-committee</link>
    <description>
        &lt;b&gt;In 2011, a standing committee report led by Yashwant Sinha had been scathing in its indictment of the Aadhaar Bill introduced by the UPA government. Five years later, the NDA government has introduced a new bill which is a rehash of the same. I look at the concerns raised by the committee report, none of which have been addressed by the new bill.
&lt;/b&gt;
        
&lt;p id="docs-internal-guid-0c1d0148-5959-8221-80f0-984c1f109411" dir="ltr"&gt;The article was published by &lt;a class="external-link" href="http://thewire.in/2016/03/10/aadhaar-bill-fails-to-incorporate-standing-committees-suggestions-24433/"&gt;The Wire&lt;/a&gt;&lt;a class="external-link" href="https://globalvoices.org/2016/02/09/a-good-day-for-the-internet-everywhere-india-bans-differential-data-pricing/"&gt; &lt;/a&gt;on March 10, 2016&lt;/p&gt;
&lt;p dir="ltr"&gt;In December, 2010, the UPA Government introduced the National Identification Authority of India Bill, 2010 in the Parliament. It was subsequently referred to a Standing Committee on Finance by the Speaker of Lok Sabha under Rule 331E of the the Rules of Procedure and Conduct of Business in Lok Sabha. This Committee, headed by BJP leader Yashwant Sinha took evidence from the Minister of Planning and the UIDAI from the government, as well as seeking the view of parties such as the National Human Rights Commission, Indian Banks Association and researchers like Dr Reetika Khera and Dr. Usha Ramanathan. In 2011, having heard from various parties and considering the concerns and apprehensions about the UID scheme, the Committee deemed the bill unacceptable and suggested a re-consideration of the the UID scheme as well as the draft legislation.&lt;/p&gt;
&lt;p dir="ltr"&gt;The Aadhaar programme has so far been implemented under the Unique Identification Authority of India, a Central Government agency created through an executive order. This programme has been shrouded in controversy over issues of privacy and security resulting in a Public Interest Litigation filed by Judge Puttaswamy in the Supreme Court. While the BJP had criticised the project as well as the draft legislation &amp;nbsp;when it was in opposition, once it came to power and particularly, after it launched various welfare schemes like Digital India and Jan Dhan Yojna, it decided to continue with it and use Aadhaar as the identification technology for these projects. In the last year, there have been orders passed by the Supreme Court which prohibited making Aadhaar mandatory for availing services. One of the questions that the government has had to answer both inside and outside the court on the UID project is the lack of a legislative mandate for a project of this size. About five years later, the new BJP led government has come back with a rehash of the same old draft, and no comments made by the standing committee have been taken into account.&lt;/p&gt;
&lt;p dir="ltr"&gt;The Standing Committee on the old bill had taken great exception to the continued collection of data and issuance of Aadhaar numbers, while the Bill was pending in the Parliament. The report said that the implementation of the provisions of the Bill and continuing to incur expenditure from the exchequer was a circumvention of the prerogative powers of the Parliament. However, the project has continued without abeyance since its inception in 2009. I am listing below some of the issues that the Committee identified with the UID project and draft legislation, none of which have been addressed in current Bill.&lt;/p&gt;
&lt;p dir="ltr"&gt;One of the primary arguments made by proponents of Aadhaar has been that it would be useful in providing services to marginalized sections of the society who currently do not have identification cards and consequently, are not able to receive state sponsored services, benefits and subsidies. The report points that the project would not be able to achieve this as no statistical data on the marginalized sections of the society are being used to by UIDAI to provide coverage to them. The introducer systems which was supposed to provide Aadhaar numbers to those without any form of identification, has been used to enroll only 0.03% of the total number of people registered. Further, the &lt;a href="http://uidai.gov.in/UID_PDF/Committees/Biometrics_Standards_Committee_report.pdf"&gt;Biometrics Standards Committee of UIDAI&lt;/a&gt; has itself acknowledged the issues caused due to a high number of manual laborers in India which would lead to sub-optimal fingerprint scans. A &lt;a href="http://www.4gid.com/De-dup-complexity%20unique%20ID%20context.pdf"&gt;report by 4G Identity Solutions&lt;/a&gt; estimates that while in any population, approximately 5% of the people have unreadable fingerprints, in India it could lead to a failure to enroll up to 15% of the population. In this manner, the project could actually end up excluding more people.&lt;/p&gt;
&lt;p dir="ltr"&gt;The Report also pointed to a lack of cost-benefit analysis done before going ahead with scheme of this scale. It makes a reference to the &lt;a href="http://eprints.lse.ac.uk/684/1/identityreport.pdf"&gt;report&lt;/a&gt; by the London School of Economics on the UK Identity Project which was shelved due to a) huge costs involved in the project, b) the complexity of the exercise and unavailability of reliable, safe and tested technology, c) risks to security and safety of registrants, d) security measures at a scale that will result in substantially higher implementation and operational costs and e) extreme dangers to rights of registrants and public interest. The Committee Report insisted that such global experiences remained relevant to the UID project and need to be considered. However, the new Bill has not been drafted with a view to address any of these issues.&lt;/p&gt;
&lt;p dir="ltr"&gt;The Committee comes down heavily on the irregularities in data collection by the UIDAI. They raise doubts about the ability of the Registrars to effectively verify the registrants and a lack of any security audit mechanisms that could identify issues in enrollment. Pointing to the news reports about irregularities in the process being followed by the Registrars appointed by the UIDAI, the Committee deems the MoUs signed between the UIDAI and the Registrars as toothless. The involvement of private parties has been under question already with many questions being raised over the lack of appropriate safeguards in the contracts with the private contractors.&lt;/p&gt;
&lt;span id="docs-internal-guid-0c1d0148-595b-32fa-49d2-8f6a347a4c00"&gt;Perhaps the most significant observation of the Committee was that any scheme that facilitates creation of such a massive database of personal information of the people of the country and its linkage with other databases should be preceded by a comprehensive data protection law. By stating this, the Committee has acknowledged that in the absence of a privacy law which governs the collection, use and storage of the personal data, the UID project will lead to abuse, surveillance and profiling of individuals. It makes a reference to the Privacy Bill which is still at only the draft stage. The current data protection framework in the Section 43A rules under the Information Technology Act, 2000 are woefully inadequate and far too limited in their scope. While there are some protection built into Chapter VI of the new bill, these are nowhere as comprehensive as the ones articulated in the Privacy Bill. Additionally, these protections are subject to broad exceptions which could significantly dilute their impact.&lt;/span&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/aadhaar-bill-fails-to-incorporate-suggestions-by-the-standing-committee'&gt;http://editors.cis-india.org/internet-governance/blog/aadhaar-bill-fails-to-incorporate-suggestions-by-the-standing-committee&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>UID</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-03-10T15:58:57Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/openness/design-public-conclave-6th-edition">
    <title>Design Public Conclave, 6th Edition</title>
    <link>http://editors.cis-india.org/openness/design-public-conclave-6th-edition</link>
    <description>
        &lt;b&gt;The 6th edition of the Design Public Conclave was hosted by Civic Labs, an initiative of the Center for Knowledge Studies, and part of the Vihara Innovation Network, in partnership with Social Innovation Exchange, Okapi, Business World, Business World for Smart Cities, and the Delhi Jal Board.&lt;/b&gt;
        
&lt;p&gt;This &lt;a href="http://designpublic.in/"&gt;edition of the conclave&lt;/a&gt; was focused on the challenges and opportunities faced by Indian cities. It sought to explore new mechanisms for integrating collaborative dialogue and problem solving into processes of government and citizen interaction. Participants included individuals from organisations such as Okapi, Hyderabad Urban Labs, Fields of View, Innovation Academy, Hewlett Packard, LIRNEasia, among others.&lt;/p&gt;
&lt;p&gt;The conclave began with a round of light yoga before moving into the introductory session. Namit Arora, a member of the Delhi Dialogue Commission, gave the opening remarks, introducing some of the subjects to be discussed and raising citizen engagement, massive migration, pollution, unplanned growth, housing, water and power shortages, and social problems like sectarianism and crime as some of the challenges faced in civic innovation. He stressed the lack of engagement between public and private parties and the absence of a sense of the commons in civic life in India.&lt;/p&gt;
&lt;h2&gt;What is Civic Innovation?&lt;/h2&gt;
&lt;p&gt;The first panel, titled “What is civic innovation?”, comprised Diastika Rahwidiati from Pulse Lab, Pavan Srinath from the Takshashila Institution, Sriganesh Lokanathan from LIRNEasia and Aditya Dev Sood from the Vihara Innovation Network. Pavan raised questions about how more people can be involved in civic issues, and spoke about the training program for public governance run by the Takshashila Institution as a means towards that. He also shared the example of the Bangalore Political Action Committee, a citizens’ collective that includes several eminent personalities from the city and aims to improve the quality of life there. The panel went on to discuss how technology can be harnessed for social activism, and how the data revolution and data sciences can be used for civic innovation. Questions were asked about whether digital activism, such as civic hackathons, is just a passing fad. Solutions that are purely technological in nature can be misinformed, so it is essential that other actors are involved along with technologists.&lt;/p&gt;
&lt;h2&gt;The Vision of a Smart City&lt;/h2&gt;
&lt;p&gt;Next, Sumit D. Chowdhury from the Ministry of Urban Development, Karuna Gopal from the Foundation for Futuristic Cities, Parvathi Menon from Innovation Alchemy, Debashish Rao from HP, Bharath Palavalli from Fields of View and Namrata Mehta from CivicLabs spoke about how smart cities can be built. Parvathi Menon kicked off the conversation by saying that while it is impossible to design smart cities, it is possible to design smart communities. Sumit Chowdhury shared some of the factors that, in his opinion, make a smart city: the creation of scalable infrastructure, transparency in governance, velocity of business and quality of life. A city that can measure itself and use that knowledge to improve itself is a true smart city. Bharath Palavalli chimed in that while technology can make cities more efficient, efficiency can be dangerous: it becomes easy to forget who the city is becoming more efficient for. Here, Sumit brought up the example of Shivpur in Maharashtra, where there are water meters in every village, public consciousness about planning and services, and timely payment of taxes by citizens, to drive home the point that smart cities are driven by communities, with technology playing a role in enabling processes and the State in institutionalizing successful solutions. Finally, it was pointed out that under the 100 Smart Cities Initiative, the MoUD does not have a consistent understanding of what smart cities should be.&lt;/p&gt;
&lt;h2&gt;Dialogue between Society and State&lt;/h2&gt;
&lt;p&gt;This panel was followed by Elizabeth Elson’s keynote talk, “The dialogue between society and the state.” She spoke about the power struggle between citizens and the government, even in the case of technological interventions, over who brings about change. She shared her experiences from the MAMPU programme, pointing out some of the issues faced during it, such as too much focus on symptoms without understanding the underlying causes, the use of intermediaries, and the difficulty of creating mutually empowering coalitions. Elson noted that the terms ‘innovation’ and ‘technology’ are used interchangeably, which is problematic, as not all technological solutions are innovative. Another important issue she raised was the need for technological intervention to make the media more accountable to society. This session was followed by lunch.&lt;/p&gt;
&lt;h2&gt;Changing Society and Governments&lt;/h2&gt;
&lt;p&gt;The next session was moderated by Sumandro Chattapadhyay of the Centre for Internet and Society. The panel included Garima Agarwal from Ashoka Innovators, Bangalore, and Maesy Angelina from the MAMPU programme, Jakarta. The session focussed on the appropriate modes of dialogue between civil society, the private sector and government. Maesy Angelina focussed on design thinking as one of the key methodologies for social innovation. Garima Agarwal emphasised the importance of developing empathy as an institution. The panel observed that while civil society and the private sector can continue to point out issues to the government, very often the government apparatus fails because it does not know how to respond to these issues.&lt;/p&gt;
&lt;h2&gt;Civic Tech Demos&lt;/h2&gt;
&lt;p&gt;After lunch, there was a small session of brief pitches of examples of civic technological innovations. These included Local Circles, Meri Awaaz, SocialCops, On Track Media and BusBud. The issues these solutions sought to address ranged from citizen engagement and awareness about reproductive issues to MNREGA, public transport and parking. I was reminded of the words of Pia Mancini, who felt that she had failed in leveraging technology to solve governance issues because those problems were not technological but cultural. Having said that, a number of the ideas, and the desire to use technology to solve social problems, were laudable, and one hopes to see more applications like these in the future.&lt;/p&gt;
&lt;h2&gt;Breakout Sessions&lt;/h2&gt;
&lt;p&gt;This was followed by three simultaneous breakout sessions on the following topics: 1) Form and Function: Data Protocols for Civic Innovation, 2) Water Management for Improved Urban Health, and 3) Gaming for Decentralized Waste Management. I was part of the group discussing data protocols for civic innovation. Various questions were raised about the implications of open data. One of the recurring themes was the question of ownership of data and who has a rightful claim over it. We broke the discussion down into two heads: risks of data, and opportunities for governance and solutions. Among the risks, we discussed issues such as privacy risks, chilling effects on free speech, reliability of data, profusion of data without clear insights, social profiling, and re-identification of anonymised data. We looked at different forms of and opportunities for governance, including licensing and control, cross-linking of data silos, and clear guidelines on who controls and owns data. The failure of conventional data protection principles like the collection limitation and data minimisation principles was also considered, and alternative models, involving hierarchies of different kinds of data based on the potential harm through misuse, were discussed. After the breakout sessions, each group made a presentation of its observations.&lt;/p&gt;
&lt;h2&gt;Concluding&lt;/h2&gt;
&lt;p&gt;The final session was on accelerating civic innovation. The panel comprised Kartik Desai from ASHA Impact, Delhi, Nishesh Mehta from Water Co-Lab, Ahmedabad, Aiyong Paul Seong from USAID, Delhi, Santosh Singh from the World Bank, Delhi, and Aditya Dev Sood from the Vihara Innovation Network. The discussion focussed on the kinds of services that can have an impact on the way citizens interact with the state. Elizabeth Elson’s keynote on the dialogue between the state and citizens was also relevant to this discussion. Different actors, including citizens, civil society actors, government institutions and industry, were discussed as agents who may create new platforms for interaction. The conclave concluded with dinner and drinks in the lawns of the Vihara Innovation Campus.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/openness/design-public-conclave-6th-edition'&gt;http://editors.cis-india.org/openness/design-public-conclave-6th-edition&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Open Innovation</dc:subject>
    
    
        <dc:subject>Openness</dc:subject>
    

   <dc:date>2016-06-18T16:45:05Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
