<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="http://editors.cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>http://editors.cis-india.org</link>
  
  <description>
    
            These are the search results for the query, showing results 1 to 15.
        
  </description>
  
  
  
  
  <image rdf:resource="http://editors.cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/artificial-intelligence-a-full-spectrum-regulatory-challenge-working-draft"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/prime-time-august-26-2019-sunil-abraham-linking-aadhaar-with-social-media-or-ending-encryption-is-counterproductive"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/openness/sunil-abraham-key-listener-speech-at-wikimedia-summit-2019"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/business-standard-january-2-2019-registering-for-aadhaar-in-2019"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/files/indias-contribution-to-internet-governance-debates"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/business-standard-march-28-2018-sunil-abraham-cambridge-analytica-scandal-how-india-can-save-democracy-from-facebook"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/files/icann-analysis"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/deccan-herald-january-20-2018-sunil-abraham-data-protection-we-can-innovate-leapfrog"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/business-standard-sunil-abraham-january-10-fixing-aadhaar"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/the-hindu-businessline-march-31-2017-sunil-abraham-its-the-technology-stupid"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="http://editors.cis-india.org/internet-governance/artificial-intelligence-a-full-spectrum-regulatory-challenge-working-draft">
    <title>Artificial Intelligence: a Full-Spectrum Regulatory Challenge [Working Draft]</title>
    <link>http://editors.cis-india.org/internet-governance/artificial-intelligence-a-full-spectrum-regulatory-challenge-working-draft</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
&lt;p&gt;Today, there are certain misconceptions regarding the regulation of AI. Some corporations would like us to believe that AI is being developed and used in a regulatory vacuum. Others, in civil society organisations, believe that AI is a regulatory circumvention strategy deployed by corporations, and as a result call for onerous regulations targeting corporations. However, some uses of AI by corporations can be completely benign, and some uses of AI by the state can result in the most egregious human rights violations. Therefore, policy makers need to deploy every regulatory tool in their arsenal to unlock the benefits of AI and mitigate its harms.&lt;/p&gt;
&lt;p&gt;This policy brief proposes a granular, full-spectrum approach to the regulation of AI, depending on who is using AI, who is affected by that use, and which human rights are at stake. Everything from deregulation, to forbearance, to updated regulations, to absolute and blanket prohibitions needs to be considered, depending on the specifics. This approach stands in contrast to approaches based on ethics, omnibus law, homogeneous principles, or human rights alone, which will result in inappropriate under-regulation or over-regulation of the sector.&lt;/p&gt;
&lt;p&gt;Find a copy of the working draft &lt;a href="http://editors.cis-india.org/internet-governance/artificial-intelligence-a-full-spectrum-regulatory-challenge-working-draft-pdf" class="internal-link" title="Artificial Intelligence: A Full-Spectrum Regulatory Challenge (Working Draft) PDF"&gt;here&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/artificial-intelligence-a-full-spectrum-regulatory-challenge-working-draft'&gt;http://editors.cis-india.org/internet-governance/artificial-intelligence-a-full-spectrum-regulatory-challenge-working-draft&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Regulatory Practices Lab</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    

   <dc:date>2020-08-04T06:10:13Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/prime-time-august-26-2019-sunil-abraham-linking-aadhaar-with-social-media-or-ending-encryption-is-counterproductive">
    <title>Linking Aadhaar with social media or ending encryption is counterproductive</title>
    <link>http://editors.cis-india.org/internet-governance/blog/prime-time-august-26-2019-sunil-abraham-linking-aadhaar-with-social-media-or-ending-encryption-is-counterproductive</link>
    <description>
        &lt;b&gt;Should Aadhaar be used as KYC for social media accounts? We have recently seen a debate on this question with even the courts hearing arguments in favour and against such a move. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="https://theprimetime.in/linking-aadhaar-with-social-media-or-ending-encryption-is-counterproductive/"&gt;Prime Time&lt;/a&gt; on August 26, 2019.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The case began in Madras High Court and later Facebook moved the SC seeking transfer of the petition to the Apex court. The original petition was filed in July, 2018 and sought linking of Aadhaar numbers with user accounts to further traceability of messages.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Before we try and answer this question, we need to first understand the differences between the different types of data on social media and messaging platforms. If a crime happens on an end to end cryptographically secure channel like WhatsApp the police may request the following from the provider to help solve the case:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Identity data: Phone numbers, names and addresses of the accused.&lt;/li&gt;
&lt;li&gt;Metadata: Sender, receiver(s), time, size of message, a flag identifying forwarded messages, delivery status, read status, etc.&lt;/li&gt;
&lt;li&gt;Payload Data: Actual content of the text and multimedia messages.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Different countries have taken different approaches to solving different layers of the surveillance problem. Let us start with identity data. Some like India require KYC for sale of SIM cards while others like the UK allow anonymous purchases. Corporations also have policies when it comes to anonymous speech on their platforms – Facebook for instance enforces a soft real ID policy while Twitter does not crack down on anonymous speech. The trouble with KYC the old fashioned way is that it exposes citizens to further risk. Every possessor of your identity documents is a potential attack surface. Indian regulation should not result in Indian identity documents being available in the millions to foreign corporations. Technical innovations are possible, like tokenisation, Aadhaar paperless local e-KYC or Aadhaar offline QR code along with one time passwords. These privacy protective alternatives must be mandatory for all and the Aadhaar numbers must be deleted from previously seeded databases. Countries that don’t require KYC have an alternative approach to security and law enforcement. They know that if someone like me commits a crime, it would be easy to catch me because I have been using the same telecom provider for the last fifteen years. This is true of long term customers regardless if they are pre-paid or post-paid. The security risk lies in the new numbers without this history that confirms identity. These countries use targeted big data analytics to determine risk and direct surveillance operations to target new SIM cards. My current understanding is that when it comes to basic user data – all the internet giants in India comply with what they consider as legitimate law enforcement requests. Some proprietary and free and open source [FOSS] alternatives to services offered by the giants don’t provide such direct cooperation in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When it comes to payload data – it is almost impossible (meaning you will need supercomputers) to access the data unless the service/software provider breaks end-to-end cryptography. It is unwise, like some policy-makers are proposing, to prohibit end-to-end cryptography or mandate back doors because our national sovereignty and our capacity for technological self-determination depends on strong cryptography. A targeted ban or prohibition against proprietary providers might have a counterproductive consequence with users migrating to FOSS alternatives like Signal which won’t even give the police identity data. As a supporter of the free software movement, I would see this as a positive development but as a citizen I am aware that the fight against crime and terror will become harder. So government must pursue other strategies to getting payload data such as a comprehensive government hacking programme.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Meta-data is critical when it comes to separating the guilty from the innocent and apportioning blame during an investigation. For example, who was the originator of a message? Who got it and read it last? WhatsApp claims that it has implemented the Signal protocol faithfully meaning that they hold no meta-data when it comes to the messages and calls. Currently there is no regulation which mandates data retention for over the top providers but such requirements do exist for telecom providers. Just like access to meta-data provides some visibility into illegal activities it also provides visibility into legal activities. Therefore those using end-to-end cryptography on platforms with comprehensive meta-data retention policies will have their privacy compromised even though the payload data remains secure. Here is a parallel example to understand why this is important. Early last year, the Internet Engineering Task Force chose a version of TLS 1.3 that revealed less meta-data over one that provided greater visibility into the communications. This hardening of global open standards, through the elimination of availability of meta-data for middle-boxes, makes it harder for foreign governments to intercept Indian military and diplomatic communications via imported telecom infrastructure. Courts and policy makers across the world have to grapple with the following question: Are meta-data retention mandates for the entire population of users a “necessary and proportionate” legal measure to combat crime and terror. For me, it should not be illegal for a provider who voluntarily wishes to retain data, provided it is within legally sanctioned limits but it should not be requirement under law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are technical solutions that are yet to be properly discussed and developed as an alternative to blanket meta-data retention measures. For example, Dr. V Kamakoti has made a traceability proposal at the Madras High Court. This proposal has been critiqued by Anand Venkatanarayanan as being violative in spirit of the principles of end-to-end cryptography. Other technical solutions are required for those seeking justice and for those who wish to serve as informers for terror plots. I have proposed client side metadata retention. If a person who has been subjected to financial fraud wishes to provide all the evidence from their client, it should be possible for them to create a digital signed archive of messages for the police. This could be signed by the sender, the provider and also the receiver so that technical non-repudiation raises the evidentiary quality of the digital evidence. However, there may be other legal requirements such as the provision of notice to the sender so that they know that client side data retention has been turned on.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The need of the hour is sustained research and development of privacy protecting surveillance mechanisms. These solutions need to be debated thoroughly amongst mathematicians, cryptographers, scientists, technologists, lawyers, social scientists and designers so that solutions with the least negative impact can be rolled out either voluntarily by providers or as a result of regulation.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/prime-time-august-26-2019-sunil-abraham-linking-aadhaar-with-social-media-or-ending-encryption-is-counterproductive'&gt;http://editors.cis-india.org/internet-governance/blog/prime-time-august-26-2019-sunil-abraham-linking-aadhaar-with-social-media-or-ending-encryption-is-counterproductive&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-08-28T01:39:47Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/openness/sunil-abraham-key-listener-speech-at-wikimedia-summit-2019">
    <title>Sunil Abraham - Key Listener Speech at Wikimedia Summit 2019</title>
    <link>http://editors.cis-india.org/openness/sunil-abraham-key-listener-speech-at-wikimedia-summit-2019</link>
    <description>
&lt;b&gt;The Wikimedia Summit 2019 – formerly known as the "Wikimedia Conference" or "Chapters Meeting" – took place on 29–31 March 2019 in Berlin, where Sunil Abraham spoke as a key listener. &lt;/b&gt;
        &lt;p&gt;Sunil answers a series of questions at &lt;span&gt;the closing session of the Wikimedia Summit 2019&lt;/span&gt;:&lt;/p&gt;
&lt;h3&gt;What stands out?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Money. Creative Commons revenues are pegged at 2.4 million dollars. Mozilla Foundation gets 24 million dollars. Wikimedia Foundation gets 91 million dollars. So the job of pulling off the "Big Open" or the "creation of the meta movement" or "the movement of movements" is primarily the responsibility of the Wikimedia community given the scale of resources it is able to mobilize. For example, the Open Access movement has lost funding as its key donor Open Society Foundation after supporting the movement for 17 years is unable to support any further. The Wikipedia movement can easily save the global access movement by just allocating 1 million dollar for it.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;What concerns me?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Homogenization. Homogenization of time frames, homogenization of process. Should we, for example, stagger the time period for online community consultation on the draft recommendations, so that there is less 'consultation fatigue' By homogenizing the processes at the Summit, it would be risking infantilizing the community. Would this meeting have been more exciting and useful, if Working Groups had the freedom to fork the process, and do what works for them.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;What have I learned from my own journey and work?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Working with lawyers for the last 10 years, has led me to appreciate tests over principles. For example, in the open standards movement there is a constant question: is this particular standard an open standard? &lt;span&gt;There, free software acts as the canary in the coal mine:  If we cannot implement a standard using free software, then it is not an open standard. &lt;/span&gt;&lt;span&gt;Working with lawyers for the last 10 years, has led me to appreciate tests over principles. For example, in the open standards movement there is a constant question: is this particular standard an open standard?There, free software acts as the canary in the coal mine:  If we cannot implement a standard using free software, then it is not an open standard.&lt;/span&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;What have you learned that could be useful for the strategy process?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;From the process architect I have learned that we shouldn't focus on solving /this/ particular instance of the problem, we should focus on working on developing processes that solve these problems in the future. So, the emphasis is on process fixes. This is really the bleeding edge of regulatory theory these days. Since we are in Germany, I must mention the name of the German academic Gunther Teubner who developed this concept of reflexive regulation 26 years ago in his article 'Substantive and Reflexive Elements in Modern Law.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;What would you suggest to improve the strategy process?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The core of responsive regulation is community consultation processes. However, closing the loop on the consultation process is critical, otherwise participants feel that they have wasted time providing feedback. For example, the Indian telecom regulator first issues a consultation paper. Then solicits the first round of feedback, then solicits a second round of counter comments then they hold round tables, and, finally, they issue the recommendation or the regulation. But when they do that, they make sure they close the loop.They provide reasoned explanations for why suggestions were rejected. This might have to happen at both stages for this strategy development process. The working groups will have to say why they rejected certain pieces of feedback, and also the board will have to explain why they rejected certain recommendations from the working groups.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;What would be your wish for this movement?&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;As we enter adulthood as a movement,  it is important that we do not lose our youthful idealism. Idealism at two levels: ambition and vocabulary.  Global civil society is broadly divided into two groups. Those who work on tractable problems, like getting rid of polio.  And those who work on intractable problems, like saving and developing democracy. When monitoring and evaluation becomes a primary management lens for our movement, it shouldn't make us more and more risk-averse. &lt;span&gt;Let us not focus on the easy problems let us always focus, as a movement, on the hard problems. When it comes to vocabulary, I am not totally sure that phrases like 'product experience', 'target markets', and 'Knowledge as a Service' is the vocabulary of the movement. &lt;/span&gt;&lt;span&gt;Maybe, we need to think of two types of vocabulary, External facing vocabulary and internal facing vocabulary.&lt;/span&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Watch the Video&lt;/h3&gt;
&lt;p&gt;&lt;iframe frameborder="0" height="288" src="https://commons.wikimedia.org/wiki/File:Wikimedia_Summit_2019_-_Key_listener_Sunil_Abraham.webm?embedplayer=yes" width="512"&gt;&lt;/iframe&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Video, via Wikimedia Commons, source: &lt;/span&gt;&lt;a href="https://commons.wikimedia.org/wiki/File:Wikimedia_Summit_2019_-_Key_listener_Sunil_Abraham.webm" target="_blank"&gt;https://commons.wikimedia.org/wiki/File:Wikimedia_Summit_2019_-_Key_listener_Sunil_Abraham.webm&lt;/a&gt;. &lt;br /&gt;&lt;span&gt;Author, &lt;/span&gt;&lt;a class="gmail-m_-4889359088796478559gmail-new" href="https://commons.wikimedia.org/w/index.php?title=User:Anna_Rees_(WMDE)&amp;amp;action=edit&amp;amp;redlink=1" target="_blank" title="User:Anna Rees (WMDE) (page does not exist)"&gt;Anna Rees (WMDE)&lt;/a&gt;&lt;span&gt;: Uploader: &lt;/span&gt;&lt;a class="gmail-m_-4889359088796478559gmail-mw-userlink" href="https://commons.wikimedia.org/wiki/User:Cornelius_Kibelka_(WMDE)" target="_blank" title="User:Cornelius Kibelka (WMDE)"&gt;Cornelius Kibelka (WMDE)&lt;/a&gt;&lt;span&gt;, This file is licensed under the &lt;a class="gmail-m_-4889359088796478559extiw" href="https://en.wikipedia.org/wiki/en:Creative_Commons" target="_blank" title="w:en:Creative Commons"&gt;Creative Commons&lt;/a&gt; &lt;a class="gmail-m_-4889359088796478559gmail-text gmail-m_-4889359088796478559external" href="https://creativecommons.org/licenses/by-sa/4.0/deed.en" rel="nofollow" target="_blank"&gt;Attribution-Share Alike 4.0 International&lt;/a&gt; license.&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/openness/sunil-abraham-key-listener-speech-at-wikimedia-summit-2019'&gt;http://editors.cis-india.org/openness/sunil-abraham-key-listener-speech-at-wikimedia-summit-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Openness</dc:subject>
    
    
        <dc:subject>Wikipedia</dc:subject>
    

   <dc:date>2019-05-04T03:34:15Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating">
    <title>Intermediary liability law needs updating</title>
    <link>http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating</link>
    <description>
        &lt;b&gt;The time has come for India to exert its foreign policy muscle. There is a less charitable name for intermediary liability regimes like Sec 79 of the IT Act — private censorship regimes. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="https://www.business-standard.com/article/opinion/intermediary-liability-law-needs-updating-119020900705_1.html"&gt;Business Standard&lt;/a&gt; on February 9, 2019.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Intermediaries get immunity from liability emerging from user-generated and third-party content because they have no “actual knowledge” until it is brought to their notice using “take down” requests or orders.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Since some of the harm caused is immediate, irreparable and irreversible, it is the preferred alternative to approaching courts for each case. When intermediary liability regimes were first enacted, most intermediaries were acting as common carriers — ie they did not curate the experience of users in a substantial fashion. While some intermediaries like Wikipedia continue this common carrier tradition, others driven by advertising revenue no longer treat all parties and all pieces of content neutrally. Facebook, Google and Twitter do everything they can to raise advertising revenues. They make you depressed. And if they like you, they get you to go out and vote. There is an urgent need to update intermediary liability law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In response to being summoned by multiple governments, Facebook has announced the establishment of an independent oversight board. A global free speech court for the world’s biggest online country. The time has come for India to exert its foreign policy muscle. The amendments to our intermediary liability regime can have global repercussions, and shape the structure and functioning of this and other global courts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While with one hand Facebook dealt the oversight board, with the other hand it took down APIs that would enable press and civil society to monitor political advertising in real time. How could they do that with no legal consequences? The answer is simple — those APIs were provided on a voluntary basis. There was no law requiring them to do so.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are two approaches that could be followed. One, as scholar of regulatory theory Amba Kak puts it, is to “disincentivise the black box”. Most transparency reports produced by intermediaries today are on a voluntary basis; there is no requirement for this under law. Our new law could require a extensive transparency with appropriate privacy safeguards for the government, affected parties and the general public in terms of revenues, content production and consumption, policy development, contracts, service-level agreements, enforcement, adjudication and appeal. User empowerment measures in the user interface and algorithm explainability could be required. The key word in this approach is transparency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The alternative is to incentivise the black box. Here faith is placed in technological solutions like artificial intelligence. To be fair, technological solutions may be desirable for battling child pornography, where pre-censorship (or deletion before content is published) is required. Fingerprinting technology is used to determine if the content exists in a global database maintained by organisations like the Internet Watch Foundation. A similar technology called Content ID is used pre-censor copyright infringement. Unfortunately, this is done by ignoring the flexibilities that exist in Indian copyright law to promote education, protect access knowledge by the disabled, etc. Even within such narrow application of technologies, there have been false positives. Recently, a video of a blogger testing his microphone was identified as a pre-existing copyrighted work.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The goal of a policy-maker working on this amendment should be to prevent repeats of the Shreya Singhal judgment where sections of the IT Act were read down or struck down. To avoid similar constitution challenges in the future, the rules should not specify any new categories of illegal content, because that would be outside the scope of the parent clause. The fifth ground in the list is sufficient — “violates any law for the time being in force”. Additional grounds, such as “harms minors in anyway”, is vague and cannot apply to all categories of intermediaries — for example, a dating site for sexual minorities. The rights of children need to be protected. But that is best done within the ongoing amendment to the POCSO Act.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As an engineer, I vote to eliminate redundancy. If there are specific offences that cannot fit in other parts of the law, those offences can be added as separate sections in the IT Act. For example, even though voyeurism is criminalised in the IT Act, the non-consensual distribution of intimate content could be criminalised, as it has been done in the Philippines.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Provisions that have to do with data retention and government access to that data for the purposes of national security, law enforcement and also anonymised datasets for the public interest should be in the upcoming Data Protection law. The rules for intermediary liability is not the correct place to deal with it, because data retention may also be required of those intermediaries that don’t handle any third-party information or user generated content. Finally, there have to be clear procedures in place for reinstatement of content that has been taken down.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;Disclosure: The Centre for Internet and Society receives grants from Facebook, Google and Wikimedia Foundation&lt;/i&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating'&gt;http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    

   <dc:date>2019-02-13T00:05:30Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/business-standard-january-2-2019-registering-for-aadhaar-in-2019">
    <title>Registering for Aadhaar in 2019</title>
    <link>http://editors.cis-india.org/internet-governance/blog/business-standard-january-2-2019-registering-for-aadhaar-in-2019</link>
    <description>
        &lt;b&gt;It is a lot less scary registering for Aadhaar in 2019 than it was in 2010, given how the authentication modalities have since evolved.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="https://www.business-standard.com/article/opinion/registering-for-aadhaar-in-2019-119010201018_1.html"&gt;Business Standard&lt;/a&gt; on January 2, 2019.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Last November, a global committee of lawmakers from nine countries the UK, Canada, Ireland, Brazil, Argentina, Singapore, Belgium, France and Latvia summoned Mark Zuckerberg to what they called an “international grand committee” in London. Mr. Zuckerberg was too spooked to show up, but Ashkan Soltani, former CTO of the FTC was among those who testified against Facebook. He said “in the US, a lot of the reticence to pass strong policy has been about killing the golden goose” referring to the innovative technology sector. Mr. Soltani went on to argue that “smart legislation will incentivise innovation”. This could be done either intentionally or unintentionally by governments. For example, a poorly thought through blocking of pornography can result in innovative censorship circumvention technologies. On other occasions, this can happen intentionally. I hope to use my inaugural column in these pages to provide an Indian example of such intentional regulatory innovation.&lt;br /&gt;&lt;br /&gt;Eight years ago, almost to this date, my colleague Elonnai Hickok wrote an open letter to the Parliamentary Finance Committee on what was then called the UID or Unique Identity. She compared Aadhaar to the digital identity project started by the National Democratic Alliance (NDA) government in 2001. Like the Vajpayee administration which was working in response to the Kargil War, she advocated a decentralised authentication architecture using smart cards based on public key cryptography. Last year, even before the five-judge constitutional bench struck down Section 57 of the Aadhaar Act, the UIDAI preemptively responded to this regulatory development by launching offline Aadhaar cards. This was to be expected especially since from the A.P. Shah Committee report, the Puttaswamy Judgment, the B.N. 
Srikrishna Committee consultation paper, report and bill, the principle of “privacy by design” was emerging as a key Indian regulatory principle in the domain of data protection.&lt;br /&gt;&lt;br /&gt;The introduction of the offline Aadhaar mechanism has several benefits. First, it eliminates the need for biometrics during authentication. I have previously provided 11 reasons why biometrics is inappropriate technology for e-governance applications by democratic governments, so this comes as a massive relief for both human rights activists and security researchers. Second, it decentralises authentication, meaning that there is no longer a central database that holds a 360-degree view of all incidents of identification and authentication. Third, it dramatically reduces the attack surface for Aadhaar numbers, since only the last four digits remain unmasked on the card. Each data controller using Aadhaar will have to generate its own series of unique identifiers to distinguish between residents. If those databases leak or get breached, it won’t tarnish the credibility of Aadhaar or the UIDAI to the same degree. Fourth, it increases the probability of attribution in case a data breach were to occur; if the breached or leaked data contains identifiers issued by a particular data controller, it becomes easier to hold that controller accountable and liable for the associated harms. Fifth, unlike the previous iteration of the Aadhaar “card”, on which the QR code was easy to forge and alter, this mechanism provides for integrity and tamper detection because the demographic information contained within the QR code is digitally signed by the UIDAI. Finally, it retains the earlier benefit of being very cheap to issue, unlike smart cards.&lt;br /&gt;&lt;br /&gt;Thanks to the UIDAI, the private sector is also being forced to implement privacy by design. Previously, since everyone was responsible for protecting Aadhaar numbers, nobody was. 
Data controllers would gladly share the Aadhaar number with their contractors, that is, data processors, since nobody could be held responsible. Now, since their own unique identifiers could be used to trace liability back to them, data controllers will start using tokenisation when they outsource any work that involves processing of the collected data. Skin in the game immediately breeds more responsible behaviour in the ecosystem.&lt;br /&gt;&lt;br /&gt;The fintech sector has been rightfully complaining about the regulatory and technological uncertainty created by last year’s developments. This should be addressed by developing open standards and free software to allow for rapid yet secure implementation of these changes. The QR code standard itself should be an open standard developed by the UIDAI using some of the best practices common to international standard setting organisations like the World Wide Web Consortium, the Internet Engineering Task Force and the Institute of Electrical and Electronics Engineers. While the UIDAI might still take the final decision on various technological choices, it should allow stakeholders to make contributions through comments, mailing lists, wikis and face-to-face meetings. Once a standard has been approved, a reference implementation must be developed by the UIDAI under a liberal licence, like the BSD licence, which allows for both free software and proprietary derivative works. For example, software that can read the QR code as well as send and receive the OTP to authenticate the resident. This would ensure that smaller fintech companies with limited resources can develop secure systems.&lt;br /&gt;&lt;br /&gt;Since Justice Dhananjaya Y. Chandrachud’s excellent dissent had no other takers on the bench, holdouts like me must finally register for an Aadhaar number since we cannot delay filing taxes any further. 
While I would still have preferred a physical digital artefact like a smart card (built on an open standard), I must say it is a lot less scary registering for Aadhaar in 2019 than it was in 2010, given how the authentication modalities have since evolved.&lt;/p&gt;
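The tokenisation described above, in which each data controller derives its own series of identifiers so that a leak can be attributed to that controller without exposing the Aadhaar number itself, can be sketched roughly as follows. This is a minimal illustration using a keyed hash; the function and key names are hypothetical and not part of any UIDAI specification.

```python
# Illustrative sketch only: each data controller derives its own series of
# unique identifiers with a secret key, so a leaked token neither reveals the
# underlying resident reference nor matches another controller's identifiers.
# Names here are hypothetical, not drawn from any UIDAI standard.
import hashlib
import hmac


def derive_token(controller_key: bytes, resident_reference: str) -> str:
    """Derive a deterministic, controller-specific token for a resident."""
    mac = hmac.new(controller_key, resident_reference.encode("utf-8"),
                   hashlib.sha256)
    return mac.hexdigest()


# Two controllers tokenising the same resident get unlinkable identifiers,
# while each controller's own records remain joinable across its systems.
token_a = derive_token(b"controller-A-secret", "resident-42")
token_b = derive_token(b"controller-B-secret", "resident-42")
assert token_a != token_b
assert token_a == derive_token(b"controller-A-secret", "resident-42")
```

If a batch of such tokens leaks, the issuing controller's key series identifies which controller was breached, which is the attribution property the article describes.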
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/business-standard-january-2-2019-registering-for-aadhaar-in-2019'&gt;http://editors.cis-india.org/internet-governance/blog/business-standard-january-2-2019-registering-for-aadhaar-in-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-01-03T14:59:04Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news">
    <title>A trust deficit between advertisers and publishers is leading to fake news</title>
    <link>http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news</link>
    <description>
        &lt;b&gt;Transparency regulation is the need of the hour, and urgently so for election and political advertising. What do the ads look like? Who paid for them? Who was the target? How many people saw these advertisements? How many times? Transparency around viral content is also required.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="https://www.hindustantimes.com/analysis/a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news/story-SVNH9ot3KD50XRltbwOyEO.html"&gt;Hindustan Times&lt;/a&gt; on September 24, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Traditionally, we have depended on the private censorship that  intermediaries conduct on their platforms. They enforce, with some  degree of success, their own community guidelines and terms of services  (TOS). Traditionally, these guidelines and TOS have been drafted keeping  in mind US laws since historically most intermediaries, including  non-profits like Wikimedia Foundation were founded in the US.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Across  the world, this private censorship regime was accepted by governments  when they enacted intermediary liability laws (in India we have Section  79A of the IT Act). These laws gave intermediaries immunity from  liability emerging from third party content about which they have no  “actual knowledge” unless they were informed using takedown notices.  Intermediaries set up offices in countries like India, complied with  some lawful interception requests, and also conducted geo-blocking to  comply with local speech regulation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For years, the Indian  government has been frustrated since policy reforms that it has pursued  with the US have yielded little fruit. American policy makers keep  citing shortcomings in the Indian justice systems to avoid expediting  the MLAT (Mutual Legal Assistance Treaties) process and the signing of  an executive agreement under the US Clout Act. This agreement would  compel intermediaries to comply with lawful interception and data  requests from Indian law enforcement agencies no matter where the data  was located.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The data localisation requirement in the draft  national data protection law is a result of that frustration. As with  the US, a quickly enacted data localisation policy is absolutely  non-negotiable when it comes to Indian military, intelligence, law  enforcement and e-governance data. For India, it also makes sense in the  cases of health and financial data with exceptions under certain   circumstances. However, it does not make sense for social media  platforms since they, by definition, host international networks of  people. Recently an inter ministerial committee recommended that  “criminal proceedings against Indian heads of social media giants” also  be considered. However, raiding Google’s local servers when a lawful  interception request is turned down or arresting Facebook executives  will result in retaliatory trade actions from the US.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the consequences of online recruitment, disinformation in  elections and fake news to undermine public order are indeed serious,  are there alternatives to such extreme measures for Indian policy  makers? Updating intermediary liability law is one place to begin. These  social media companies increasingly exercise editorial control, albeit  indirectly, via algorithms to claim that they have no “actual  knowledge”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;But they are no longer mere conduits or dumb pipes as  they are now publishers who collect payments to promote content.  Germany passed a law called NetzDG in 2017 which requires expedited  compliance with government takedown orders. Unfortunately, this law does  not have sufficient safeguards to prevent overzealous private  censorship. India should not repeat this mistake, especially given what  the Supreme Court said in the Shreya Singhal judgment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Transparency  regulations are imperative. And they are needed urgently for election  and political advertising. What do the ads look like? Who paid for them?  Who was the target? How many people saw these advertisements? How many  times? Transparency around viral content is also required. Anyone should  be able to see all public content that has been shared with more than a  certain percentage of the population over a historical timeline for any  geographic area. This will prevent algorithmic filter bubbles and echo  chambers, and also help public and civil society monitor  unconstitutional and hate speech that violates terms of service of these  platforms. So far the intermediaries have benefitted from surveillance —  watching from above. It is time to subject them to sousveillance —  watched by the citizens from below.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Data portability mandates and  interoperability mandates will allow competition to enter these monopoly  markets. Artificial intelligence regulations for algorithms that  significantly impact the global networked public sphere could require –  one, a right to an explanation and two, a right to influence automated  decision making that influences the consumers experience on the  platform.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The real solution lies elsewhere. Google and Facebook  are primarily advertising networks. They have successfully managed to  destroy the business model for real news and replace it with a business  model for fake news by taking away most of the advertising revenues from  traditional and new news media companies. They were able to do this  because there was a trust deficit between advertisers and publishers.  Perhaps this trust deficit could be solved by a commons-based solutions  based on free software, open standards and collective action by all  Indian new media companies.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news'&gt;http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Intermediary Liability</dc:subject>
    
    
        <dc:subject>Censorship</dc:subject>
    

   <dc:date>2018-10-02T06:44:55Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/files/indias-contribution-to-internet-governance-debates">
    <title>India's Contribution to Internet Governance Debates</title>
    <link>http://editors.cis-india.org/internet-governance/files/indias-contribution-to-internet-governance-debates</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/files/indias-contribution-to-internet-governance-debates'&gt;http://editors.cis-india.org/internet-governance/files/indias-contribution-to-internet-governance-debates&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-08-16T13:32:54Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around">
    <title>Spreading unhappiness equally around</title>
    <link>http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around</link>
    <description>
        &lt;b&gt;The section of civil society opposed to Aadhaar is unhappy because the UIDAI and all other state agencies that wish to can process data non-consensually.&lt;/b&gt;
        &lt;p&gt;The article was published in &lt;a class="external-link" href="https://www.business-standard.com/article/opinion/spreading-unhappiness-equally-around-118073100008_1.html"&gt;Business Standard&lt;/a&gt; on July 31, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;There is a joke in policy-making circles — you know you have reached a good compromise if all the relevant stakeholders are equally unhappy. By that measure, the B N Srikrishna committee has done a commendable job since there are many with complaints.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some in the private sector are unhappy because their demonisation of the European Union’s General Data Protection Regulation (GDPR) has failed. The committee’s draft data protection Bill is closely modelled upon the GDPR in terms of rights, principles, design of the regulator and the design of the regulatory tools like impact assessments. With 4 per cent of global turnover as maximum fine, there is a clear signal that privacy infringements by transnational corporations will be reigned in by the regulator. Getting a law that has copied many elements of the European regulation is good news for us because the GDPR is recognised by leading human rights organisations as the global gold standard. But the bad news for us is that the Bill also has unnecessarily broad data localisation mandates for the private sector.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some in the fintech sector are unhappy because the committee rejected the suggestion that privacy be regulated as a property right. This is a positive from the human rights perspective, especially because this approach has been rejected across the globe, including the European Union. Property rights are inappropriate because a natural law framing of the enclosure of the commons into private property through labour does not translate to personal data. Also in comparison to patents — or “intellectual property” — the scale of possible discreet property holdings in personal information is several orders higher, posing unimaginable complexity for regulation, possibly creating a gridlock economy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The section of civil society opposed to Aadhaar is unhappy because the UIDAI and all other state agencies that wish to can process data non-consensually. A similar loophole exists in the GDPR. Remember the definition of processing includes “operations such as collection, recording, organisation, structuring, storage, adaptation, alteration, retrieval, use, alignment or combination, indexing, disclosure by transmission, dissemination or otherwise making available, restriction, erasure or destruction”. This means the UIDAI can collect data from you without your consent and does not have to establish consent for the data it has collected in the past. There is a “necessary” test which is supposed to constrain data collection. But for the last 10 odd years, the UIDAI has deemed it “necessary” to collect biometrics to give the poor subsidised grain. Will those forms of disproportionate non-consensual data collection continue? Most probably because the report recommends that the UIDAI continue to play the role of the regulator with heightened powers. Which is like trusting the fox with&lt;br /&gt;the henhouse.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Employees should be unhappy because the Bill has an expansive ground under which employers can nonconsensually harvest their data. The Bill allows for non-consensual processing of any data “necessary” for recruitment, termination, providing any benefit or service, verifying the attendance or any other activity related to the assessment of the performance”. This is permitted when consent is not an appropriate basis or would involve disproportionate effort on the part of the employer. This is basically a surveillance provision for employers. Either this ground should be removed like in the GDPR or a “proportionate” test should also be introduced otherwise disproportionate mechanisms like spyware on work computers will be installed by employees without providing notice.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Some free speech activists are unhappy because the law contains a “right to be forgotten” provision. They are concerned that this will be used by the rich and powerful to censor mainstream and alternative media. On the face of the “right to be forgotten” in the GDPR is a much more expansive “right to erasure”, whilst the Bill only provides for a more limited "right to restrict or prevent continuing disclosure”. However, the GDPR has a clear exception for “archiving purposes in the public interest, scientific or historical research purposes or statistical purposes”. The Bill like the GDPR does identify the two competing human rights imperatives — freedom of expression and the right to information. However, by missing the “public interest” test it does not sufficiently social power asymmetries.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Privacy and security researchers are unhappy because re-identification has been made an offence without a public interest or research exception. It is indeed a positive that the committee has made re-identification a criminal offence. This is because the de-identification standards notified by the regulator would always be catching up with the latest mathematical development. However, in order to protect the very research that the regulator needs to protect the rights of individuals, the Bill should have granted the formal and non-formal academic community immunity from liability and criminal prosecution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Lastly but also most importantly, human rights activists are unhappy because the committee again like the GDPR did not include sufficiently specific surveillance law fixes. The European Union has historically handled this separately in the ePrivacy Regulation. Maybe that is the approach we must also follow or maybe this was a missed opportunity. Overall, the B N Srikrishna committee must be commended for producing a good data protection Bill. The task before us is to make it great and to have it enacted by Parliament at the earliest.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around'&gt;http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-31T14:49:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill">
    <title>Lining up the data on the Srikrishna Privacy Draft Bill</title>
    <link>http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill</link>
    <description>
        &lt;b&gt;In the run-up to the Justice BN Srikrishna committee report, some stakeholders advocated that consent be eliminated and replaced with stronger accountability obligations. This was rejected, and the committee has released a draft bill that has consent as its bedrock, just like the GDPR. And, like the GDPR, there exists a legal basis for non-consensual processing of data for the “functions of the state”. What does this mean for law-abiding persons?&lt;/b&gt;
        &lt;p&gt;The article was published in &lt;a class="external-link" href="https://economictimes.indiatimes.com/small-biz/startups/newsbuzz/lining-up-the-data-on-the-srikrishna-privacy-draft-bill/articleshow/65192296.cms"&gt;Economic Times&lt;/a&gt; on July 30, 2018&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Non-consensual processing is permitted in the bill as long it is “necessary for any function of the” Parliament or any state legislature. These functions need not be authorised by law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Or alternatively “necessary for any function of the state authorised by law” for the provision of a service or benefit, issuance of any certification, licence or permit.&lt;br /&gt;Fortunately, however, the state remains bound by the eight obligations in chapter two i.e., fair and reasonable processing, purpose limitation, collection limitation, lawful processing, notice and data quality and data storage limitations and accountability. This ground in the GDPR has two sub-clauses: one, the task passes the public interest test and two, the loophole like the Indian bill that possibly includes all interactions the state has with all persons.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The “necessary” test appears both on the grounds for non-consensual processing, and in the “collection limitation” obligation in chapter two of the bill. For sensitive personal data, the test is raised to “strictly necessary”. But the difference is not clarified and the word “necessary” is used in multiple senses.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Under the “collection limitation” obligation the bill says “necessary for the purposes of processing” which indicates a connection to the “purpose limitation” obligation. The “purpose limitation” obligation, however, only requires the state to have a purpose that is “clear, specific and lawful” and processing limited to the “specific purpose” and “any other incidental purpose that the data principal would reasonably expect the personal data to be used for”. It is perhaps important at this point to note that the phrase “data minimisation” does not appear anywhere in the bill.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Therefore “necessary” could broadly understood to mean data Parliament or the state legislature requires to perform some function unauthorised by law, and data the citizen might reasonably expect a state authority to consider incidental to the provision of a service or benefit, issuance of a certificate, licence or permit.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Or alternatively more conservatively understood to mean data without which it would be impossible for Parliament and state legislature to carry out functions mandated by the law, and data without it would be impossible for the state to provide the specific service or benefit or issue certificates, licences and permits. It is completely unclear like with the GDPR why an additional test of “strictly necessary” is — if you will forgive the redundancy — necessary.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After 10 years of Aadhaar, the average citizen “reasonably expects” the state to ask for biometric data to provide subsidised grain. But it is not impossible to provide subsidised grain in a corruption-free manner without using surveillance technology that can be used to remotely, covertly and non-consensually identify persons. Smart cards, for example, implement privacy by design. Therefore a “reasonable expectation” test is not inappropriate since this is not a question about changing social mores.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;When it comes to persons that are not law abiding the bill has two exceptions — “security of the state” and “prevention, detection, investigation and prosecution of contraventions of law”. Here the “necessary” test is combined with the “proportionate” test.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The proportionate test further constrains processing. For example, GPS data may be necessary for detecting someone has jumped a traffic signal but it might not be a proportionate response for a minor violation. Along with the requirement for “procedure established by law”, this is indeed a well carved out exception if the “necessary” test is interpreted conservatively. The only points of concern here is that the infringement of a fundamental right for minor offences and also the “prevention” of offences which implies processing of personal data of innocent persons.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Ideally consent should be introduced for law-abiding citizens even if it is merely tokenism because you cannot revoke consent if you have not granted it in the first place. Or alternatively, a less protective option would be to admit that all egovernance in India will be based on surveillance, therefore “necessary” should be conservatively defined and the “proportionate” test should be introduced as an additional safeguard.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill'&gt;http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-07-31T02:52:23Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention">
    <title>Why NPCI and Facebook need urgent regulatory attention </title>
    <link>http://editors.cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention</link>
    <description>
        &lt;b&gt;The world’s oldest networked infrastructure, money, is increasingly dematerialising and fusing with the world’s latest networked infrastructure, the Internet. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="https://economictimes.indiatimes.com/industry/banking/finance/banking/why-npci-and-facebook-need-urgent-regulatory-attention/articleshow/64522587.cms"&gt;Economic Times&lt;/a&gt; on June 10, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;As the network effects compound, disruptive acceleration hurtle us towards financial utopia, or dystopia. Our fate depends on what we get right and what we get wrong with the law, code and architecture, and the market.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Internet, unfortunately, has completely transformed from how it was first architected. From a federated, generative network based on free software and open standards, into a centralised, environment with an increasing dependency on proprietary technologies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In countries like Myanmar, some citizens misconstrue a single social media website, Facebook, for the internet, according to LirneAsia research. India is another market where Facebook could still get its brand mistaken for access itself by some users coming online. This is Facebook put so many resources into the battle over Basics, in the run-up to India’s network neutrality regulation. an odd corporation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On hand, its business model is what some term surveillance capitalism. On the other hand, by acquiring WhatsApp and by keeping end-toend (E2E) encryption “on”, it has ensured that one and a half billion users can concretely exercise their right to privacy. At the time of the acquisition, WhatsApp founders believed Facebook’s promise that it would never compromise on their high standards of privacy and security. But 18 months later, Facebook started harvesting data and diluting E2E.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In April this year, my colleague Ayush Rathi and I wrote in Asia Times that WhatsApp no longer deletes multimedia on download but continues to store it on its servers. Theoretically, using the very same mechanism, Facebook could also be retaining encrypted text messages and comprehensive metadata from WhatsApp users indefinitely without making this obvious.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;My friend, Srikanth Lakshmanan, founder of the CashlessConsumer collective, is a keen observer of this space. He says in India, “we are seeing an increasing push towards a bank-led model, thanks to National Payments Corporation of India (NPCI) and its control over Unified Payments Interface (UPI), which is also known as the cashless layer of the India Stack.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;NPCI is best understood as a shape shifter. Arundhati Ramanathan puts it best when she says “depending on the time and context, NPCI is a competitor. It is a platform. It is a regulator. It is an industry association. It is a profitable non-profit. It is a rule maker. It is a judge. It is a bystander.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This results in UPI becoming, what Lakshmanan calls, a NPCI-club-good rather than a new generation digital public good. He also points out that NPCI has an additional challenge of opacity — “it doesn’t provide any metrics on transaction failures, and being a private body, is not subject to proactive or reactive disclosure requirements under the RTI.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Technically, he says, UPI increases fragility in our financial ecosystem since it “is a centralised data maximisation network where NPCI will always have the superset of data.” Given that NPCI has opted for a bank-led model in India, it is very unlikely that Facebook able to leverage its monopoly the social media market duopoly it shares with in the digital advertising market to become a digital payments monopoly.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, NCPI and Facebook both share the following traits — one, an insatiable appetite for personal information; two, a fetish for hypercentralisation; three, a marginal commitment to transparency, and four, poor track record as a custodian of consumer trust. The marriage between these like-minded entities has already had a dubious beginning.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Previously, every financial technology wanting direct access to the NPCI infrastructure had to have a tie-up with a bank. But for Facebook and Google, as they are large players, it was decided to introduce a multi-bank model. This was definitely the right thing to do from a competition perspective. But, unfortunately, the marriage between the banks and the internet giant was arranged by NPCI in an opaque process and WhatsApp was exempted from the full NPCI certification process for its beta launch.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Both NPCI and Facebook need urgent regulatory attention. A modern data protection law and a more proactive competition regulator is required for Facebook. The NPCI will hopefully also be subjected to the upcoming data protection law. But it also requires a range of design, policy and governance fixes to ensure greater privacy and security via data minimisation and decentralisation; greater accountability and transparency to the public; separation of powers for better governance and open access policies to prevent anti-competitive behaviour.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention'&gt;http://editors.cis-india.org/internet-governance/blog/economic-times-june-10-2018-sunil-abraham-why-npci-and-facebook-need-urgent-regulatory-attention&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-12T02:07:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/business-standard-march-28-2018-sunil-abraham-cambridge-analytica-scandal-how-india-can-save-democracy-from-facebook">
    <title>Cambridge Analytica scandal: How India can save democracy from Facebook</title>
    <link>http://editors.cis-india.org/internet-governance/blog/business-standard-march-28-2018-sunil-abraham-cambridge-analytica-scandal-how-india-can-save-democracy-from-facebook</link>
    <description>
        &lt;b&gt;Hegemonic incumbents like Google and Facebook need to be tackled with regulation; govt should use procurement power to fund open source alternatives.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="http://www.business-standard.com/article/economy-policy/cambridge-analytica-scandal-how-india-can-save-democracy-from-facebook-118032800146_1.html"&gt;Business Standard&lt;/a&gt; on March 28, 2018&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;&lt;em&gt;The Cambridge Analytica scandal came to light when &lt;a class="storyTags" href="http://www.business-standard.com/search?type=news&amp;amp;q=whistleblower" target="_blank"&gt;whistleblower &lt;/a&gt;Wylie accused Cambridge Analytica of gathering details of 50 million Facebook users. Cambridge Analytica used this data to psychologically profile these users and manipulated their opinion in favour of Donald Trump. BJP and Congress have accused each other of using the services of Cambridge Analytica in India as well. How can India safeguard the democratic process against such intervention? The author tries to answer this question in this Business Standard Special.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;&lt;em&gt;&lt;/em&gt;&lt;/strong&gt;Those that celebrate the big data/artificial intelligence moment claim that traditional approaches to data protection are no longer relevant and therefore must be abandoned. The Cambridge Analytica episode, if anything, demonstrates how wrong they are. The principles of data protection need to be reinvented and weaponized, not discarded. In this article I shall discuss the reinvention of three such data protection principles. Apart from this I shall also briefly explore competition law solutions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;&lt;em&gt;Collect data only if mandated by regulation&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;&lt;em&gt;&lt;/em&gt;&lt;/strong&gt;One, data minimization is the principle that requires the data controller to collect data only if mandated to do so by regulation or because it is a prerequisite for providing a functionality. For example, Facebook’s messenger app on Android harvests call records and meta-data, without any consumer facing feature on the app that justifies such collection. Therefore, this is a clear violation of the data minimization principle. One of the ways to reinvent this principle is by borrowing from the best practices around warnings and labels on packaging introduced by the global anti-tobacco campaign. A permanent bar could be required in all apps, stating ‘Facebook holds W number of records across X databases over the time period Y, which totals Z Gb’. Each of these alphabets could be a hyperlink, allowing the user to easily drill down to the individual data record.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;Consent must be explicit, informed and voluntary&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;&lt;/strong&gt;&lt;/em&gt;Two, the principle of consent requires that the data controller secure explicit, informed and voluntary consent from the data subject unless there are exceptional circumstances. Unfortunately, consent has been reduced to a mockery today through obfuscation by lawyers in verbose “privacy notices” and “terms of services”. To reinvent consent we need to bring ‘Do Not Dial’ registries into the era of big data. A website maintained by the future Indian data protection regulator could allow individuals to check against their unique identifiers (email, phone number, Aadhaar). The website would provide a list of all data controllers that are holding personal information against a particular unique identifier. The data subject should then be able to revoke consent with one-click. Once consent is revoked, the data controller would have to delete all personal information that they hold, unless retention of such information is required under law (for example, in banking law). One-click revocation of consent will make data controllers like Facebook treat data subjects with greater respect.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;There must be a right to &lt;/strong&gt;&lt;/em&gt;&lt;em&gt;&lt;strong&gt;explanation&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;&lt;/strong&gt;&lt;/em&gt;Three, the right to explanation, most commonly associated with the General Data Protection Directive from the EU, is a principle that requires the data controller to make transparent the automated decision-making process when personal information is implicated. So far it has been seen as a reactive measure for user empowerment. In other words, the explanation is provided only when there is a demand for it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Facebook feeds that were used for manipulation through micro-targeting of content is an example of such automated decision making. Regulation in India should require a user empowerment panel accessible through a prominent icon that appears repeatedly in the feed. On clicking the icon the user will be able to modify the objectives that the algorithm is maximizing for. She can then choose to see content that targets a bisexual rather than a heterosexual, a Muslim rather than a Hindu, a conservative rather a liberal, etc. At the moment, Facebook only allows the user to stop being targeted for advertisements based on certain categories. However, to be less susceptible to psychological manipulation, the user should be allowed to define these categories, for both content and advertisements.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;How to fix the business model?&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;&lt;/strong&gt;&lt;/em&gt;From a competition perspective, Google and Facebook have destroyed the business model for real news, and replaced it with a business model for fake news, by monopolizing digital advertising revenues. Their algorithms are designed to maximize the amount of time that users spend on their platforms, and therefore, don’t have any incentive to distinguish between truth and falsehood. This contemporary crisis requires three types of interventions: one, appropriate taxation and transparency to the public, so that the revenue streams for fake news factories can be ended; two, the construction of a common infrastructure that can be shared by all traditional and new media companies in order to recapture digital advertising revenues; and three, immediate action by the competition regulator to protect competition between advertising networks operating in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;The Google challenge&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;em&gt;&lt;strong&gt;&lt;/strong&gt;&lt;/em&gt;With Google, the situation is even worse, since Google has dominance in both the ad network market and in the operating system market. During the birth of competition law, policy-makers and decision-makers acted to protect competition per se. This is because they saw competition as an essential component of democracy, open society, innovation, and a functioning market. When the economists from the Chicago school began to influence competition policy in the USA, they advocated for a singular focus on the maximization of consumer interest. The adoption of this ideology has resulted in competition regulators standing powerlessly by while internet giants wreck our economy and polity. We need to return to the foundational principles of competition law, which might even mean breaking Google into two companies. The operating system should be divorced from other services and products to prevent them from taking advantage of vertical integration. We as a nation need to start discussing the possible end stages of such a breakup.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In conclusion, all the fixes that have been listed above require either the enactment of a data protection law, or the amendment of our existing competition law. This, as we all know, can take many years. However, there is an opportunity for the government to act immediately if it wishes to. By utilizing procurement power, the central and state governments of India could support free and open source software alternatives to Google’s products especially in the education sector. The government could also stop using Facebook, Google and Twitter for e-governance, and thereby stop providing free advertising for these companies for print and broadcast media. This will make it easier for emerging firms to dislodge hegemonic incumbents.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/business-standard-march-28-2018-sunil-abraham-cambridge-analytica-scandal-how-india-can-save-democracy-from-facebook'&gt;http://editors.cis-india.org/internet-governance/blog/business-standard-march-28-2018-sunil-abraham-cambridge-analytica-scandal-how-india-can-save-democracy-from-facebook&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Facebook</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-03-28T15:44:00Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/files/icann-analysis">
    <title>ICANN Analysis</title>
    <link>http://editors.cis-india.org/internet-governance/files/icann-analysis</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/files/icann-analysis'&gt;http://editors.cis-india.org/internet-governance/files/icann-analysis&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2018-03-15T06:35:45Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/deccan-herald-january-20-2018-sunil-abraham-data-protection-we-can-innovate-leapfrog">
    <title>Data Protection: We can innovate, leapfrog</title>
    <link>http://editors.cis-india.org/internet-governance/blog/deccan-herald-january-20-2018-sunil-abraham-data-protection-we-can-innovate-leapfrog</link>
    <description>
        &lt;b&gt;About 27% of India's population is still illiterate or barely literate. Most privacy policies and terms of services for web and mobile applications are in English and therefore it is only 10% of us who can actually read them before we provide our consent.&lt;/b&gt;
        &lt;p&gt;The article was published in the &lt;a class="external-link" href="http://www.deccanherald.com/content/655018/data-protection-we-can-innovate.html"&gt;Deccan Herald&lt;/a&gt; on January 20, 2018.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Even if we can read them, we may not have the necessary legal training to understand them. According to a tweet thread by Pat Walshe (@privacymatters), the Tetris app, a popular video game, has a privacy policy that details the third-party advertising companies that they share data with. These third-parties include "123 Ad Networks; 13 Online Analytics companies; 62 Mobile Advertising Networks; 14 Mobile Analytics companies. The linked privacy policies for Tetris run to 407,000 words, compared to 450,000 words for the entire 'Lord of the Rings trilogy'." The child aged four and above that plays the game and her parents need an intermediary to deal with the corporations hiding behind Tetris.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Unlike the European Union, which has more than 37 years of history when it comes to data protection law, India is starting with a near blank slate after the Supreme Court confirmed that privacy is a constitutionally-guaranteed fundamental right in the Puttaswamy case judgement. While we would want to maintain adequacy and compatibility with the EU General Data Protection Regulation (GDPR) because it has become the global standard, we must realise that there is an opportunity for leapfrogging. This article attempts to introduce the reader to three different visions for intermediaries that have emerged within the Indian data protection debate around the accountability principle. I will also provide a brief sketch of an idea that we are developing at the Centre for Internet and Society. This is an incomplete list as there must be more proposals for regulatory innovation around the accountability principle that I am currently unaware of.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;n Account Aggregators: The 'India Stack' ecosystem that has been built around the Aadhaar programme first proposed intermediaries called Account Aggregators. Account Aggregators manage consent artifacts. India Stack has traditionally been described as having four layers -- presenceless, paperless, cashless and consent. The consent layer is supposed to feature Account Aggregators. If, for example, a data subject wanting an insurance policy visits an insurance portal, the portal would collect personal information and a consent artifact from her and pass it on to multiple insurance companies. These insurance companies would send personalised bids to the portal, which would be displayed on a comparative grid to enable empowered selection.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The data structure consent artifact has been provided in the Master Direction from RBI titled "Non-Banking Financial Company Account Aggregator Directions," published in September 2016. How does this work? The fields includes (i) identity and optional contact information; (ii) nature of the financial information requested; (iii) purpose; (iv) the identity of the recipients, if any; (v) URL/address for notifications when the consent artifact is used; (vi) consent artifact creation date, expiry date, identity and signature/digital signature of the Account Aggregator; and (vii) any other attribute as may be prescribed by the RBI. While Account Aggregators make it frictionless for the grant of consent and also for the harvesting of consent by data controllers, it does not make it easy for you to manage and revoke your consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;n Data Trusts: Most recently, Na.Vijayashankar, a Bengaluru-based cybersecurity and cyberlaw expert, has proposed intermediaries called 'Data Trusts' registered with the regulator and who (i) will work as escrow agents for the personal data (which would be classified by type for different degrees of protection); (ii) will make privacy notices accessible by translating them into accessible language and formats; (iii) disclose data minimally to different data controllers based on the purpose limitation; (iv) issue tokens or pseudonymous identifiers and monetise the data for the benefit of the data subject. To ensure that Data Trusts truly protect the interests of the data subject, Vijayashankar proposes three requirements: (a) public performance reviews (b) audits by the regulator and (c) "an arms-length relationship with the data collectors." In his proposal, Data Trusts are firms with "the ability to process a real-time request from the data subject to supply appropriate data to the data collector."&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;n Learned Intermediaries: The Takshashila Institution published a paper titled Beyond Consent: A New Paradigm for Data Protection, authored by Rahul Matthan, partner at the law firm Trilegal. Learned Intermediaries would perform mandatory audits on all data controllers above a particular threshold. Like Vijayashankar, Matthan also requires these intermediaries to be certified by an appropriate authority. The main harm that he focuses on is, bias or discrimination. He proposes three stages of audit which are designed for the age of Big Data and Artificial Intelligence: "(i) Database Query Review; (ii) Black Box Audits; and (iii) Algorithm Review". Matthan also tentatively considers a rating system. Learned Intermediaries are a means to address information asymmetry in the market by making data subjects more aware. The impact of churn on their bottom-lines, it is hoped, will force data controllers to behave in an accountable manner, protecting rights and mitigating harms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;n Consent Brokers: Finally, I have proposed the model of a 'Consent Broker' by modifying the concept of the Account Aggregator. Like the Account Aggregator proposal, we would want a competitive set of consent brokers who will manage consent artifacts for data subjects. However, I believe there should be a 1:1 relationship between data subjects and consent brokers so that the latter compete for the business of data subjects. Like Vijayashankar, I believe that the consent broker must have an "arms-length distance" from data controllers and must be prohibited from making any money from them. Consent brokers could also be trusted to take proactive actions for the data subjects, such as access and correction.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The need of the hour is the production of regulatory innovations and robust discussions around them for all the nine privacy principles in the Justice AP Shah committee report -- notice, choice and consent, collection limitation, purpose limitation, access and correction, disclosure of information, security, openness and accountability.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/deccan-herald-january-20-2018-sunil-abraham-data-protection-we-can-innovate-leapfrog'&gt;http://editors.cis-india.org/internet-governance/blog/deccan-herald-january-20-2018-sunil-abraham-data-protection-we-can-innovate-leapfrog&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-01-22T01:45:46Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/business-standard-sunil-abraham-january-10-fixing-aadhaar">
    <title>Fixing Aadhaar: Security developers' task is to trim chances of data breach</title>
    <link>http://editors.cis-india.org/internet-governance/blog/business-standard-sunil-abraham-january-10-fixing-aadhaar</link>
    <description>
        &lt;b&gt;The task before a security developer is not only to reduce the probability of identity breach but to eliminate certain occurrences.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in &lt;a class="external-link" href="http://www.business-standard.com/article/opinion/fixing-aadhaar-security-developers-task-is-to-trim-chances-of-data-breach-118010901281_1.html"&gt;Business Standard&lt;/a&gt; on January 10, 2017&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;I feel no joy when my prophecies about digital identity systems come true. This is because from a Popperian perspective these are low-risk prophecies. I had said that that all centralised identity databases will be breached in the future. That may or may not happen within my lifetime so I can go to my grave without worries about being proven wrong. Therefore, the task before a security developer is not only to reduce the probability but more importantly to eliminate the possibility of certain occurrences.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The blame for fragility in digital identity systems today can be partially laid on a World Bank document titled “Ten Principles on Identification for Sustainable Development” which has contributed to the harmonisation of approaches across jurisdictions. Principle three says, “Establishing a robust — unique, secure, and accurate — identity”. The keyword here is “a”. Like The Lord of the Rings, the World Bank wants “one digital ID to rule them all”. For Indians, this approach must be epistemologically repugnant as ours is a land which has recognised the multiplicity of truth since ancient times.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;In “Identities Research Project: Final Report” funded by Omidyar Network and published by Caribou Digital — the number one finding is “people have always had, and managed, multiple personal identities”. And the fourth finding is “people select and combine identity elements for transactions during the course of everyday life”. As researchers they have employed indirect language, for layman the key takeaway is a single national ID for all persons and all purposes is an ahistorical and unworkable solution.&lt;/span&gt;&lt;/p&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;p&gt;&lt;img src="http://editors.cis-india.org/home-images/AadhaarBS.png" style="text-align: justify; " title="Aadhaar BS" class="image-inline" alt="Aadhaar BS" /&gt;&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;&lt;span style="float: left; "&gt;&lt;span style="float: left; "&gt;&lt;i&gt;Revoke all &lt;a class="storyTags" href="http://www.business-standard.com/search?type=news&amp;amp;q=aadhaar" target="_blank"&gt;Aadhaar &lt;/a&gt;numbers that have been compromised, breached, leaked, illegally published or inadvertently disclosed and regenerate new global identifiers. Photo: Reuters&lt;/i&gt;&lt;/span&gt;&lt;/span&gt;&lt;/div&gt;
&lt;div style="text-align: justify; "&gt;&lt;span style="float: left; "&gt;&lt;br /&gt;&lt;/span&gt;&lt;/div&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;div style="text-align: justify; "&gt;&lt;span&gt;monoculture can be prevented. The traditional approach is followed in the US - you could have multiple documents that are accepted as valid ID. Or you could have multiple identity providers providing ID artifacts using an interoperable framework as they do in the UK. Another approach is tokenisation. The first time tokenisation was suggested in the Aadhaar context was in an academic paper published in August 2016 by Shweta Agrawal, Subhashis Banerjee and Subodh Sharma from IIT Delhi titled “Privacy and Security of Aadhaar: A Computer Science Perspective”.&lt;/span&gt;&lt;/div&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The paper in its fourth key recommendation says “cryptographically embed Aadhaar ID into Authentication User Agency (AUAs) and KYC User Agency (aka KUAs) — specific IDs making correlation impossible”. The paper considers several designs for such local identifier where — 1) no linking is possible, 2) only unidirectional linking is possible, and 3) bidirectional linking is possible referring to a similar scheme in the LSE identity report.&lt;/span&gt;&lt;/p&gt;
&lt;p id="_mcePaste" style="text-align: justify; "&gt;Though I had spoken about tokenisation as a fix for Aadhaar earlier, I wrote about it for the first time on the 31st of March, 2017, in The Hindu. The steps would be required are as follows. First, revoke all Aadhaar numbers that have been compromised, breached, leaked, illegally published or inadvertently disclosed and regenerate new global identifiers aka Aadhaar Numbers. Second, reduce the number of KYC transactions by eliminating all use cases that don’t result in corresponding transparency or security benefits. For example, most developed economies don’t have KYC for mobile phone connections. Three, the UIDAI should issue only tokens to those government entities and private sector service providers that absolutely must have KYC. When the NATGRID wants to combine subsets of 20 different databases for up to 12 different intelligence/law enforcement agencies they will have to approach the UIDAI with the token or Aadhaar number of the suspect. The UIDAI will then be able to release corresponding tokens and/or the Aadhaar number to the NATGRID. Implementing tokenisation introduces both technical and institutional checks and balances in our surveillance systems.&lt;/p&gt;
&lt;p id="_mcePaste" style="text-align: justify; "&gt;On 25th of July 2017, UIDAI published the first document providing implementation details for tokenisation wherein KUAs and AUAs were asked to generate the tokens. But this approach assumed that KYC user agencies could be trusted. This is because the digital identity solution for the nation as conceived by Aadhaar architects is based on the problem statement of digital identity within a firm. Within a firm all internal entities can be trusted. But in a nation state you cannot make this assumption. Airtel, a KUA, diverted 190 crores of LPG subsidy to more than 30 lakh payment bank accounts that were opened without informed consent. Axis Bank Limited, Suvidha Infoserve (a business correspondent) and eMudhra (an e-sign provider or AUA) have been accused of using replay attacks to perform unauthorised transactions. In November last year, the UIDAI indicated to the media that they were working on the next version of tokenisation — this time called dummy numbers or virtual numbers. This work needs to be accelerated to mitigate some of the risks in the current system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The paper in its fourth key recommendation says “cryptographically embed Aadhaar ID into Authentication User Agency (AUAs) and KYC User Agency (aka KUAs) — specific IDs making correlation impossible”. The paper considers several designs for such local identifier where — 1) no linking is possible, 2) only unidirectional linking is possible, and 3) bidirectional linking is possible referring to a similar scheme in the LSE identity report.Though I had spoken about tokenisation as a fix for Aadhaar earlier, I wrote about it for the first time on the 31st of March, 2017, in The Hindu. The steps would be required are as follows. First, revoke all Aadhaar numbers that have been compromised, breached, leaked, illegally published or inadvertently disclosed and regenerate new global identifiers aka Aadhaar Numbers. Second, reduce the number of KYC transactions by eliminating all use cases that don’t result in corresponding transparency or security benefits. For example, most developed economies don’t have KYC for mobile phone connections. Three, the UIDAI should issue only tokens to those government entities and private sector service providers that absolutely must have KYC. When the NATGRID wants to combine subsets of 20 different databases for up to 12 different intelligence/law enforcement agencies they will have to approach the UIDAI with the token or Aadhaar number of the suspect. The UIDAI will then be able to release corresponding tokens and/or the Aadhaar number to the NATGRID. Implementing tokenisation introduces both technical and institutional checks and balances in our surveillance systems.On 25th of July 2017, UIDAI published the first document providing implementation details for tokenisation wherein KUAs and AUAs were asked to generate the tokens. But this approach assumed that KYC user agencies could be trusted. 
This is because the digital identity solution for the nation as conceived by Aadhaar architects is based on the problem statement of digital identity within a firm. Within a firm all internal entities can be trusted. But in a nation state you cannot make this assumption. Airtel, a KUA, diverted 190 crores of LPG subsidy to more than 30 lakh payment bank accounts that were opened without informed consent. Axis Bank Limited, Suvidha Infoserve (a business correspondent) and eMudhra (an e-sign provider or AUA) have been accused of using replay attacks to perform unauthorised transactions. In November last year, the UIDAI indicated to the media that they were working on the next version of tokenisation — this time called dummy numbers or virtual numbers. This work needs to be accelerated to mitigate some of the risks in the current system.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/business-standard-sunil-abraham-january-10-fixing-aadhaar'&gt;http://editors.cis-india.org/internet-governance/blog/business-standard-sunil-abraham-january-10-fixing-aadhaar&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-01-10T16:47:59Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/the-hindu-businessline-march-31-2017-sunil-abraham-its-the-technology-stupid">
    <title>It’s the technology, stupid</title>
    <link>http://editors.cis-india.org/internet-governance/blog/the-hindu-businessline-march-31-2017-sunil-abraham-its-the-technology-stupid</link>
    <description>
        &lt;b&gt;Eleven reasons why Aadhaar is not just non-smart but also insecure.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="http://www.thehindubusinessline.com/blink/cover/11-reasons-why-aadhaar-is-not-just-nonsmart-but-also-insecure/article9608225.ece"&gt;published in Hindu Businessline&lt;/a&gt; on March 31, 2017.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Aadhaar is insecure because it is based on biometrics. Biometrics is surveillance technology, a necessity for any State. However, surveillance is much like salt in cooking: essential in tiny quantities, but counterproductive even if slightly in excess. Biometrics should be used for targeted surveillance, but this technology should not be used in e-governance for the following reasons:&lt;br /&gt;&lt;br /&gt;One, biometrics is becoming a remote technology. High-resolution cameras allow malicious actors to steal fingerprints and iris images from unsuspecting people. In a couple of years, governments will be able to identify citizens more accurately in a crowd with iris recognition than the current generation of facial recognition technology.&lt;br /&gt;&lt;br /&gt;Two, biometrics is covert technology. Thanks to sophisticated remote sensors, biometrics can be harvested without the knowledge of the citizen. This increases effectiveness from a surveillance perspective, but diminishes it from an e-governance perspective.&lt;br /&gt;&lt;br /&gt;Three, biometrics is non-consensual technology. There is a big difference between the State identifying citizens and citizens identifying themselves to the state. With biometrics, the State can identify citizens without seeking their consent. With a smart card, the citizen has to allow the State to identify them. Once you discard your smart card the State cannot easily identify you, but you cannot discard your biometrics.&lt;br /&gt;&lt;br /&gt;Four, biometrics is very similar to symmetric cryptography. Modern cryptography is asymmetric. Where there is both a public and a private key, the user always has the private key, which is never in transit and, therefore, intermediaries cannot intercept it. Biometrics, on the other hand, needs to be secured during transit. 
The UIDAI’s (the Unique Identification Authority of India, which oversees the rollout of Aadhaar) current fix for its erroneous choice of technology is the use of “registered devices”; but, unfortunately, the encryption is only at the software layer and cannot prevent hardware interception.&lt;br /&gt;&lt;br /&gt;Five, biometrics requires a centralised network; in contrast, cryptography for smart cards does not require a centralised store for all private keys. All centralised stores are honeypots, targeted by criminals, foreign States and terrorists.&lt;br /&gt;&lt;br /&gt;Six, biometrics is irrevocable. Once compromised, it cannot be secured again. Smart cards are based on asymmetric cryptography, which even the UIDAI uses to secure its servers from attacks. If cryptography is good for the State, then surely it is good for the citizen too.&lt;br /&gt;&lt;br /&gt;Seven, biometrics is based on probability. Cryptography in smart cards, on the other hand, allows for exact matching. Every biometric device comes with ratios for false positives and false negatives. These ratios are determined in near-perfect lab conditions. Going by press reports and even the UIDAI’s claims, the field reality is, unsurprisingly, different from the lab. Imagine going to an ATM and not being sure if your debit card will match your bank’s records.&lt;br /&gt;&lt;br /&gt;Eight, biometric technology is proprietary and opaque. You cannot independently audit the proprietary technology used by the UIDAI for effectiveness and security. On the other hand, open smart card standards like SCOSTA (Smart Card Operating System for Transport Applications) are based on globally accepted cryptographic standards and allow researchers, scientists and mathematicians to independently confirm the claims of the government.&lt;br /&gt;&lt;br /&gt;Nine, biometrics is cheap and easy to defeat. Any Indian citizen, even a child, can make gummy fingers at home using Fevicol and wax. 
You can buy fingerprint-lifting kits from a toy store. To clone a smart card, on the other hand, you need a skimmer, a printer and knowledge of cryptography.&lt;br /&gt;&lt;br /&gt;Ten, biometrics undermines human dignity. In many media photographs — even on the @UIDAI Twitter stream — you can see the biometric device operator pressing the applicant’s fingers, especially in the case of underprivileged citizens, against the reader. Imagine service providers — say, a shopkeeper or a restaurant waiter — having to touch you every time you want to pay. Smart cards offer a more dignified user experience.&lt;br /&gt;&lt;br /&gt;Eleven, biometrics enables the shirking of responsibility, while cryptography requires a chain of trust. Each legitimate transaction carries the non-repudiable signatures of all parties responsible. With biometrics, the buck will be passed to an inscrutable black box every time things go wrong, and citizens or courts will have nobody to hold to account.&lt;/p&gt;
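The asymmetric model behind smart cards (points four and six above) can be sketched as a challenge-response exchange. This is a toy textbook-RSA illustration with deliberately tiny primes, not a real card protocol; it shows the two properties the article relies on: the private key never leaves the card, and a compromised key pair can simply be reissued, whereas a leaked fingerprint cannot.

```python
import secrets

# Toy key pair. Real cards use standardised algorithms and key sizes;
# these tiny primes exist only to keep the arithmetic visible.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent, stays on the "card"

def card_sign(challenge: int) -> int:
    # Runs on the card: d is used locally and never transmitted,
    # so there is nothing secret in transit for an intermediary to steal.
    return pow(challenge, d, n)

def verifier_check(challenge: int, signature: int) -> bool:
    # Runs at the service: only the public pair (e, n) is needed.
    return pow(signature, e, n) == challenge

challenge = secrets.randbelow(n - 2) + 2   # fresh random challenge
sig = card_sign(challenge)

assert verifier_check(challenge, sig)                 # genuine card accepted
assert not verifier_check(challenge, (sig + 1) % n)   # forgery rejected
```

If the key pair is ever compromised, the authority revokes it and issues a new one; that revocability is exactly what a biometric, once leaked, can never offer.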
&lt;p style="text-align: justify; "&gt;The precursor to Aadhaar was called MNIC (Multipurpose National Identification Card). Initiated by the NDA government headed by Atal Bihari Vajpayee, it was based on the open SCOSTA standard. This was the correct technological choice.&lt;br /&gt;&lt;br /&gt;Unfortunately, the promoters of Aadhaar chose biometrics in their belief that newer, costlier and complex technology is superior to an older, cheaper and simpler alternative.&lt;br /&gt;&lt;br /&gt;This erroneous technological choice is not a glitch or teething problem that can be dealt with legislative fixes such as an improved Aadhaar Act or an omnibus Privacy Act. It can only be fixed by destroying the centralised biometric database, like the UK did, and shifting to smart cards.&lt;br /&gt;&lt;br /&gt;In other words, you cannot fix using the law what you have broken using technology.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/the-hindu-businessline-march-31-2017-sunil-abraham-its-the-technology-stupid'&gt;http://editors.cis-india.org/internet-governance/blog/the-hindu-businessline-march-31-2017-sunil-abraham-its-the-technology-stupid&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sunil</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Biometrics</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-04-07T12:53:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
