<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="http://editors.cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>http://editors.cis-india.org</link>
  
  <description>These are the search results for the query, showing results 721 to 735.</description>
  <image rdf:resource="http://editors.cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/digital-id-forum-2019"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/events/why-cyber-security-and-online-privacy-are-vital-for-success-of-democracy-and-freedom-of-expression"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/major-security-flaw-namo-app"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/hindustan-times-may-2-2017-details-of-135-million-aadhaar-card-holders-may-have-leaked-claims-cis-report"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/news/hindu-january-6-2014-deepa-kurup-despite-apex-court-order-ioc-proceeds-with-aadhar-linked-dbt"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/desisec-episode-1-film-release-and-screening"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/desi-sec-cybersecurity-and-civil-society-in-india"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices"/>
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/digtial-identities-research-plan"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019">
    <title>Divergence between the General Data Protection Regulation and the Personal Data Protection Bill, 2019</title>
    <link>http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019</link>
    <description>
        &lt;b&gt;&lt;/b&gt;
        
&lt;p&gt;Our note on the divergence between the General Data Protection Regulation and the Personal Data Protection Bill can be downloaded as a PDF &lt;a href="http://editors.cis-india.org/internet-governance/divergence-between-the-gdpr-and-pdp-bill-2019" class="internal-link" title="Divergence between the GDPR and PDP Bill 2019"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The European Union’s General Data
Protection Regulation (GDPR), replacing the 1995 EU Data Protection Directive,
came into effect in May 2018. It harmonises the data protection regulations
across the European Union. In India, the Ministry of Electronics and
Information Technology had constituted a Committee of Experts (chaired by
Justice Srikrishna) to frame recommendations for a data protection framework in
India. The Committee submitted its report and a draft Personal Data Protection
Bill in July 2018 (2018 Bill). Public comments were sought on the bill till
October 2018. The Central Government revised the Bill and introduced the
revised version of the Personal Data Protection Bill (PDP Bill) on December 11,
2019 in the Lok Sabha.&lt;/p&gt;
&lt;p&gt;The PDP Bill has incorporated certain
aspects of the GDPR, such as requirements for notice to be given to the data
principal, consent for processing of data, establishment of a data protection
authority, etc. However, there are some differences, and in this note we have highlighted
the areas of divergence between the two. The note covers only
provisions that are common to the GDPR and the PDP Bill. It does not include
the provisions on (i) the Appellate Tribunal; (ii) Finance, Account and Audit; and
(iii) Non-Personal Data.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019'&gt;http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Pallavi Bedi</dc:creator>
    <dc:rights></dc:rights>

    
    <dc:subject>Internet Governance</dc:subject>
    <dc:subject>Data Protection</dc:subject>
    <dc:subject>Privacy</dc:subject>
    

   <dc:date>2020-02-21T11:08:50Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017">
    <title>Discussion on Ranking Digital Rights in India (Delhi, January 07)</title>
    <link>http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017</link>
    <description>
        &lt;b&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding the digital rights of their users, and to raise public awareness about the same, the Centre for Internet and Society (CIS), with the support of Privacy International, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues. Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the study.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;h4&gt;Download: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_RDRIndia-Discussion_07012017_Invitation.pdf"&gt;Invitation and agenda&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="https://rankingdigitalrights.org/"&gt;Ranking Digital Rights Corporate Responsibility Index&lt;/a&gt; is a project hosted by the Open Technology Institute at New America Foundation that aims to rank Information and Communications Technology (ICTs) companies with respect to their Governance, Freedom of Expression, and Privacy practices. The inaugural Corporate Accountability Index, released in November 2015, evaluated 16 companies based on the project’s methodology that included 31 indicators in total.&lt;/p&gt;
&lt;p&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding the digital rights of their users, and to raise public awareness about the same, the Centre for Internet and Society (CIS), with the support of &lt;a href="https://privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues.&lt;/p&gt;
&lt;p&gt;Please join us on Saturday, January 07, at the India Islamic Cultural Centre, New Delhi, for a presentation of our findings followed by an open structured discussion on the methodology and implications of the Ranking Digital Rights study. We will begin at 10:30 am with a round of tea and coffee.&lt;/p&gt;
&lt;p&gt;The event is open to all but the venue has limited space. The participants are requested to RSVP by sending an email to &lt;a href="mailto:nisha@cis-india.org?subject=RSVP: Ranking Digital Rights Discussion"&gt;nisha@cis-india.org&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;To further encourage programmers, researchers, journalists, students, and users in general to use and contribute to the findings of the Ranking Digital Rights study, and critique the underlying methodology, we are also organising a “rankathon” on Sunday, January 08, at the CIS office in Delhi. More details can be found &lt;a href="http://cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We look forward to your participation and contribution to the discussion. Please support us by sharing this invitation with your colleagues and networks.&lt;/p&gt;
&lt;h2&gt;Agenda&lt;/h2&gt;
&lt;table class="plain"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;10:30-11:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:00-11:15&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;11:15-13:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Presentation of the Findings and Discussion&lt;/strong&gt; &lt;em&gt;Divij Joshi and Aditya Singh Chawla&lt;/em&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;13:00-14:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Lunch&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;14:00-15:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #1: Parameters of Evaluation&lt;/strong&gt;&lt;br /&gt;The RDR methodology was based upon evaluating companies’ commitments to uphold human rights through their services – in particular their commitment to users’ freedom of expression and privacy. Are there other parameters that may be considered in the Indian context?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;15:00-16:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Open Discussion #2: Towards Protecting Digital Rights&lt;/strong&gt;&lt;br /&gt;What steps can be taken by the government, civil society, and industry in India to create an environment that recognizes and protects users’ digital rights? What are the relevant legal, political, and economic factors to take into consideration? What steps have other, multinational ICT companies taken? Would these be realistic for Indian companies to implement?&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:00-16:30&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;16:30-17:00&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Coffee and Tea&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017'&gt;http://editors.cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
    <dc:subject>Privacy</dc:subject>
    <dc:subject>Freedom of Speech and Expression</dc:subject>
    <dc:subject>Internet Governance</dc:subject>
    <dc:subject>Ranking Digital Rights</dc:subject>
    <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2016-12-29T07:07:34Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence">
    <title>Discrimination in the Age of Artificial Intelligence </title>
    <link>http://editors.cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence</link>
    <description>
        &lt;b&gt;The dawn of Artificial Intelligence (AI) has been celebrated by both government and industry across the globe. AI offers the potential to augment many existing bureaucratic processes and improve human capacity, if implemented in accordance with principles of the rule of law and international human rights norms. Unfortunately, AI-powered solutions have often been implemented in ways that have resulted in the automation, rather than mitigation, of existing societal inequalities.&lt;/b&gt;
        &lt;p&gt;This was originally published by &lt;a class="external-link" href="http://ohrh.law.ox.ac.uk/discrimination-in-the-age-of-artificial-intelligence/"&gt;Oxford Human Rights Hub&lt;/a&gt; on October 23, 2018&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/ArtificialIntelligence.jpg/@@images/3b551d39-e419-442c-8c9d-7916a2d39378.jpeg" alt="Artificial Intelligence" class="image-inline" title="Artificial Intelligence" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Image Credit: Sarla Catt via Flickr, used under a Creative Commons license available at https://creativecommons.org/licenses/by/2.0/&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the international human rights law context, AI solutions pose a threat to norms which prohibit discrimination. International Human Rights Law &lt;a href="https://books.google.co.in/books/about/International_Human_Rights_Law.html?id=YkcXAgAAQBAJ&amp;amp;redir_esc=y"&gt;recognizes that discrimination&lt;/a&gt; may take place in two possible ways, directly or indirectly. Direct discrimination occurs when an individual is treated less favourably than someone else similarly situated on one of the grounds prohibited in international law, which, as per the &lt;a href="http://www.equalrightstrust.org/ertdocumentbank/Human%20Rights%20Committee,%20General%20Comment%2018.pdf"&gt;Human Rights Committee,&lt;/a&gt; includes race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status. Indirect discrimination occurs when a policy, rule or requirement is ‘outwardly neutral’ but has a disproportionate impact on certain groups that are meant to be protected by one of the prohibited grounds of discrimination. A clear example of indirect discrimination recognized by the European Court of Human Rights arose in the case of &lt;a href="http://www.errc.org/cikk.php?cikk=3559"&gt;&lt;i&gt;DH &amp;amp; Ors v Czech Republic&lt;/i&gt;&lt;/a&gt;. The ECtHR struck down an apparently neutral set of statutory rules, which implemented a set of tests designed to evaluate the intellectual capability of children but which resulted in an excessively high proportion of minority Roma children scoring poorly and consequently being sent to special schools, possibly because the tests were blind to cultural and linguistic differences. This case acts as a useful analogy for the potential disparate impacts of AI and should serve as a useful precedent for future litigation against AI-driven solutions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Indirect discrimination by AI may occur &lt;a href="https://cis-india.org/internet-governance/ai-and-governance-case-study-pdf"&gt;at two stages&lt;/a&gt;. First is the &lt;b&gt;usage of incomplete or inaccurate training data&lt;/b&gt; that results in the algorithm processing data that may not accurately reflect reality. Cathy O’Neil explains this &lt;a href="https://weaponsofmathdestructionbook.com/"&gt;using a simple example&lt;/a&gt;. There are two types of crimes: those that are ‘reported’ and others that are only ‘found’ if a policeman is patrolling the area. The first category includes serious crimes such as murder or rape, while the second includes petty crimes such as vandalism or possession of illicit drugs in small quantities. Increased police surveillance in areas of US cities where Black or Hispanic people reside leads to more crimes being ‘found’ there. Thus, the data is likely to suggest that these communities commit a higher proportion of crimes than they actually do – indirect discrimination that has been empirically shown through research published by &lt;a href="https://www.propublica.org/article/bias-in-criminal-risk-scores-is-mathematically-inevitable-researchers-say"&gt;Pro Publica&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Discrimination may also occur at the stage of &lt;b&gt;data processing&lt;/b&gt;, which is done through a metaphorical &lt;a href="https://www.sentient.ai/blog/understanding-black-box-artificial-intelligence/"&gt;‘black-box’&lt;/a&gt; that accepts inputs and generates outputs without revealing to the human developer how the data was processed. This conundrum is compounded by the fact that the algorithms are often utilised to solve an amorphous problem, attempting to break down a complex question into a simple answer. An example is the development of ‘risk profiles’ of individuals for the &lt;a href="http://fortune.com/longform/ai-bias-problem/"&gt;determination of insurance premiums.&lt;/a&gt; Data might show that an accident is more likely to take place in inner cities due to more densely packed populations in these areas. Racial and ethnic minorities tend to reside more in these areas, which means that algorithms could learn that minorities are more likely to get into accidents, thereby generating an outcome (‘risk profile’) that indirectly discriminates on grounds of race or ethnicity.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It would be wrong to ignore discrimination, both direct and indirect, that occurs as a result of human prejudice. The key difference between that and discrimination by AI lies in the ability of other individuals to compel the decision-maker to explain the factors that led to the outcome in question and to test its validity against principles of human rights. The increasing amounts of discretion and, consequently, power being delegated to autonomous systems mean that principles of accountability which audit and check indirect discrimination need to be built into the design of these systems. In the absence of these principles, we risk surrendering core tenets of human rights law to the whims of an algorithmically crafted reality.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence'&gt;http://editors.cis-india.org/internet-governance/blog/oxford-human-rights-hub-arindrajit-basu-october-23-2018-discrimination-in-the-age-of-artificial-intelligence&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Arindrajit Basu</dc:creator>
    <dc:rights></dc:rights>

    
    <dc:subject>Internet Governance</dc:subject>
    <dc:subject>Artificial Intelligence</dc:subject>
    <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-10-26T14:47:57Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook">
    <title>Digital Native: Delete Facebook?</title>
    <link>http://editors.cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook</link>
    <description>
        &lt;b&gt;You can check out any time you like, but you can never leave.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="http://indianexpress.com/article/technology/social/digital-native-delete-facebook-5127198/"&gt;published in Indian Express&lt;/a&gt; on April 8, 2018.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;One fine day, we all woke up and were told that &lt;/span&gt;&lt;a href="http://indianexpress.com/about/facebook/"&gt;Facebook&lt;/a&gt;&lt;span&gt; sold our data to Cambridge Analytica and then they made dastardly profiles of us to target us with advertisements and political propaganda, so we made a beeline for #DeleteFacebook. The most surprising part about the exposé is how much of a non-event it is. We have been warned, at least since the Edward Snowden revelations, if not earlier, that our data is the new oil, coal and gold. It is being used as a resource, it is being mined from our everyday digital transactions, and it is precious because it can result in massive social engineering without our consent or knowledge. Ever since Facebook started expanding its domain from being a friends-poke-friends-with-livestock website, we have been warned that the ambition of Facebook was never to connect you with your friends but to be your friend.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;Time and again, we have been told that the sapient Facebook algorithm remembers everything you say and do, anticipates all your future needs, and listens to the most banal litany of your life. More than your mom, your partner or your shrink, it’s the Facebook algorithm which is interested in all your quotidian uselessness. It is not the stranger who accesses your post that should worry you. The biggest perpetrator of privacy violations on Facebook is Facebook itself. There is good reason why a company that offers its prime products for free is valued as one of the richest corporations in the world. The product of Facebook – it has always been known – is us.&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;Why, then, are we suddenly taken aback at the fact that Facebook sold us? And while we are sharing our thoughts (ironically on Facebook) about deleting our profiles, the question that remains is this: How much of your digital life are you willing to erase? Because, and I am sorry if this pricks your filter bubble, Facebook’s problem is not really a Facebook problem. It is almost the entire World Wide Web, where we lost the battle for data ownership and platform openness more than two decades ago. Name one privately owned free service that you use on the internet and I will show you the section in its “terms and services” where you have surrendered your data. In fact, you can’t even find government services, tied up with their private partners, where your data is safe and stored in privacy vaults where it won’t be abused.&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;It is time to realise that the popular ’90s meme “All your base are belong to us” is the lived reality of our digital lives. As we forego ownership for convenience, as our governments sold our sovereignty for profits, and as digital corporations became behemoths that now have the capacity to challenge and write our constitutional and fundamental rights, we are waking up to a battle that has already been fought and resolved. A large part of our physical hardware to access the internet is privately owned. This means that almost all our PCs, tablets, phones, servers are owned and open to exploitation by private companies. Every time your phone does an automatic update or your PC goes into house-cleaning mode, you have to realise that you are being stored, somewhere in the cloud in ways that you cannot imagine.&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;&lt;span&gt;It is tiring to hear this alarm and panic around Facebook’s data trading. Not only is it legal, it is something that has been happening for a while, most of us have been aware of it, and we have resolutely ignored it because, you know, cute cats. If somebody tells you that they are against privately owned physical property and are going to start a revolution to take away all private property and make it equally shared with the public, you would laugh at them because they are arriving at the battle scene after the war is over. This digital wokeness trend to #DeleteFacebook is the digital equivalent of that moment. If you want to fight, fight the governments and nations who can still protect us. Participate in conversations around Internet governance. Take responsibility to educate yourself about the politics of how the digital world operates. But stop trying to feel virtuous because you pulled out of a social media network, pretending that that is the end of the problem.&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook'&gt;http://editors.cis-india.org/raw/indian-express-nishant-shah-april-8-2018-digital-native-delete-facebook&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>nishant</dc:creator>
    <dc:rights></dc:rights>

    
    <dc:subject>Social Media</dc:subject>
    <dc:subject>Privacy</dc:subject>
    <dc:subject>Internet Governance</dc:subject>
    <dc:subject>Facebook</dc:subject>
    <dc:subject>Researchers at Work</dc:subject>
    

   <dc:date>2018-05-06T03:08:25Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions">
    <title>Digital illusions</title>
    <link>http://editors.cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions</link>
    <description>
        &lt;b&gt;The Watal Committee’s report presents the government with an impossible road map to a cashless nirvana. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by V. Sridhar was &lt;a class="external-link" href="http://www.frontline.in/the-nation/digital-illusions/article9541506.ece?homepage=true"&gt;published in Frontline&lt;/a&gt;, Print edition: March 3, 2017&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;MORE than two months after demonetising an overwhelming proportion of the currency in circulation, the Narendra Modi government now appears to have settled on its key objective for setting out on the unprecedented economic adventure. After shifting the goalposts several times—initially it was a means of combating terrorism and fake currency, later it was a war on black money and still later it was to forcibly march the country towards a “cashless” future, which was then modified to a more reasonable “less cash” society—the government now ostensibly has the road map to undertake the hazardous journey to an age when cash will no longer be king.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There is no better and time-tested means for a government bent on carrying out its whims than to appoint a committee headed by a former bureaucrat to give it the report that would justify what it has already decided to do. In August 2016, months before demonetisation, it constituted the Committee on Digital Payments, chaired by Ratan P. Watal, Principal Adviser, NITI Aayog, and former Secretary, Ministry of Finance. The committee dutifully submitted its report in double quick time on December 9, which was approved by the Finance Ministry on December 27.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The haste with which the committee has gone about its business is evident throughout the report. The committee’s slant is also evident in its approach, especially the reverence with which it welcomes the demonetisation move, even though it was commissioned before November 8, and its recourse to suspect data from private industry and multinational companies even when better quality data were available from official sources such as the Reserve Bank of India (RBI). The report’s rigour, especially in tackling the substantive issues pertaining to monetary policy, was also hindered by the fact that not a single economist of worth, not even a specialist in monetary economics, was present on the committee.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Reckless rush&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;However, to blame the committee alone would be futile. The government, by pursuing an ambitious and reckless push towards “less cash” before setting out a regulatory framework governing digital payments, in effect, placed the cart before the horse.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The report reveals not just the evangelical zeal with which the Watal Committee has pursued its mission but also its utter lack of respect for conceptual issues. Nowhere is this more evident than in its recommendation that the regulatory responsibilities for governing the digital payments system be distanced from the RBI. This is not only out of tune with global practices, but also reveals the committee’s sheer inability to understand that although payments account for just a small fraction of what a banking system does, they impinge on modern banking and monetary policy in crucial ways.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a modern economy, currency creation by the central bank through fiat money is not the only means by which money is created. Deposits with banks, for instance, which provide the base for credit creation, are a means by which banks “create” money. From this perspective, a mobile wallet service provider also acts like a bank, even if the users’ monies are held only for a brief period until transactions happen.&lt;br /&gt;&lt;br /&gt;Thus, it appears fit and proper that such services are also governed by the central bank. However, the Watal Committee has recommended that they be supervised by an entity that has a measure of independence from the RBI. This suggestion is dangerous because such entities can potentially pose a systemic risk, the containment of which is a key responsibility of a central bank. There is also the risk of regulatory capture of the suggested body, the Payments Regulatory Board (PRB), if sections of the payments industry exercise their newly acquired clout.&lt;br /&gt;&lt;br /&gt;The committee’s enthusiastic acceptance of the “go cashless” mantra is also evident in the data it has sourced. A good example of how it cherry-picked data is its use of a highly dubious (or at the very least, utterly misplaced) dataset to make the point that India is far too dependent on cash. It points to data sourced from the International Monetary Fund (IMF) and other sources to claim that India’s cash-GDP (gross domestic product) ratio is 12.04 per cent, much higher than that of countries such as Brazil, Mexico and South Africa.&lt;br /&gt;&lt;br /&gt;However, this much-abused dataset, quoted widely by advocates of demonetisation, is an inaccurate measure because it only captures the extent of physical currency in circulation and ignores short-term deposits, which form part of “broad money”. Logically, these deposits must be included because they are virtually on call by depositors and are, therefore, liquid. 
Secondly, the fact that such deposits have been increasing as a proportion of the currency in circulation, aided by the spread of banking in India, makes them particularly relevant in the Indian context. The committee, in its bid to justify sending the nation on a cashless path, proceeds to evaluate the “high” costs that cash imposes on the Indian economy. It quotes from McKinsey and Visa, both of which may have a vested interest in India’s mission to go cashless, to drive home the point that going digital would result in huge savings. It quotes McKinsey to claim that “transitioning to an electronic platform for government payments itself could save approximately Rs.100,000 crore annually, with the cost of the transition being estimated at Rs.60,000-70,000 crore” and a Visa report that claims a total investment of Rs.60,000 crore over five years towards creating a digital payments ecosystem could reduce the country’s cost of cash from 1.7 per cent of the GDP to 1.3 per cent.&lt;br /&gt;&lt;br /&gt;Even while pushing the benefits of going cashless, the committee does admit that the transition to digital payments “cannot be agnostic to the actual costs incurred by the end customers, the reasons for preferring cash, and the factors inhibiting the uptake of existent channels of digital payments”.&lt;br /&gt;&lt;br /&gt;A large part of the Indian economy is its “black” counterpart, estimated at about 60 per cent of the legitimate part of India’s national income. Since a significant portion of the currency in circulation caters to the demand from the shadow economy, apart from the huge segment that is engaged in legitimate but informal economic activity, these estimates miss a significant chunk of the economy and its need for cash. 
Conceptually, to that extent, they significantly overstate the extent of cash relative to the real size of the economy, a large portion of which is missing from official GDP data.&lt;br /&gt;&lt;br /&gt;The naive assumption that digitalised financial transactions are scale-neutral and costless, painless and efficient lies at the heart of the Watal Committee’s report. This has obvious implications for India’s large informal economy, which the Modi government is pushing, under pain of death, towards formality through digital channels. For instance, basic data on the usage of debit cards show how skewed the demand for cards is in India. In August 2016, cash withdrawals at ATMs accounted for 92.28 per cent of the value of all debit card transactions in the country. Thus, less than 8 per cent of the total value was transacted at point-of-sale (PoS) terminals.&lt;br /&gt;&lt;br /&gt;This statistic is a clear indication of a divide that mirrors the income and consumption divide in Indian society. When banks issue cards (debit, credit or any other), card payment system companies such as Mastercard and Visa provide an interface with the customer, for which the issuer pays a fee that is, in any case, recovered from customers. According to a recent study by Visa, the penetration of PoS terminals has slowed down significantly since 2012, when the RBI set limits on what the card companies could charge as merchant discount rate (MDR), the amount charged from sellers. This suggests that card companies may have been slowing down penetration in order to bargain for a bigger slice of the transaction fee. Although the rates apply not just to card-based purchases but to cash withdrawals too (and have been waived or lowered in the wake of demonetisation on a purely temporary basis), there is no guarantee that they will not increase once the situation returns to normal. 
This is aggravated by the fact that the government may have neither the means nor the will to prevent banks and card issuers from charging higher rates later. This has been demonstrated in the past with, for example, ATM-based withdrawals, for which customers have to pay a fee after a minimum number of transactions.&lt;br /&gt;&lt;br /&gt;The flat fee (as a percentage) is regressive, especially because it punishes smaller sellers. It is in this sense that finance, digital or otherwise, is never scale-neutral. The fact that the immediate victims of demonetisation are small-scale producers and retailers implies that the balance has been tilted against them and in favour of larger producers and retailers after November 8. By skewing the field against small and tiny enterprises, demonetisation has been the vehicle for a massive and unprecedented transfer of incomes and wealth from the poor to the rich.&lt;br /&gt;&lt;br /&gt;There is also a fundamental asymmetry in the use of technology in the financial services industry. ATMs, which have been around for decades, were originally touted as a technology that increases efficiency in the use of cash; you only need to withdraw as much as you need, so there is no motive to hoard cash. But that was not the motive for introducing ATMs; the real reason was that they enabled banks to reduce their workforce and cut costs. As ATMs became more ubiquitous, banks started moving from cost-cutting to profit-seeking by levying a fee for every transaction above a minimum threshold. 
In effect, the gains from technology are boosting the profitability of banks while the wider systemic benefits made possible by the same technology have been sacrificed, as the imposition of fees above a minimum threshold actually drives people to hoard cash.&lt;br /&gt;&lt;br /&gt;A study by Visa in October 2016, titled Accelerating The Growth of Digital Payments in India: A Five-Year Outlook, reveals that a one percentage point reduction in cash in circulation as a percentage of GDP would require digital transactions of personal consumption expenditure to multiply ninefold. In other words, Visa suggested that digital transactions as a percentage of personal consumption expenditure would need to increase from 4 per cent to 36 per cent if the cash-GDP ratio is to fall from 11 per cent to 10 per cent.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Security concerns&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Apart from these weighty economic issues, which are central to the move towards digital financial transactions, there are other critically important issues that the committee has either ignored or swept under the carpet. The question of privacy and security was a central issue at a recent conference on digital payments organised by HasGeek, a platform for software developers, in Bengaluru. Several experts, including some from the payments industry, pointed out the serious security and privacy issues that are being ignored in the rush to go digital. For example, an expert on data security warned that the mindless rush to mobile-based transactions was especially scary because most Android phones are vulnerable because they leak data. In fact, he noted that it may be safer for Android mobile users to perform digital transactions using desktop browsers.&lt;br /&gt;&lt;br /&gt;But what is more scary is the manner in which Aadhaar is being touted by the committee as the magic wand by which the digital era can be ushered in quickly. It recommends that mobile number-based and Aadhaar-based “fully interoperable payments” be prioritised within 60 days and that the National Payments Corporation of India (NPCI) be responsible for ensuring this.&lt;br /&gt;&lt;br /&gt;There has been significant resistance to the idea of an Aadhaar-enabled service for digital transactions, primarily because of security and privacy concerns. Entities such as the Centre for Internet and Society have warned against linking Aadhaar to the financial inclusion project because it violates the Supreme Court stricture against making Aadhaar mandatory. Kiran Jonnalagadda of HasGeek pointed out that the Aadhaar system offered only “single factor authorisation”. 
He said in a recent tweet that Aadhaar involved only a permanent login ID without “a changeable password”, which, from a systemic point of view, made it open to abuse.&lt;br /&gt;&lt;br /&gt;Longstanding critics of the Aadhaar project have pointed out that the launch of such a countrywide programme, at a time when a regulatory regime is not even in place and when India does not have privacy protection laws, is dangerously misplaced. They have pointed to the fact that unlike a debit or credit card, which can be replaced when its integrity has been compromised, the biometric characteristics of a user, once stolen, are compromised forever. This is not science fiction but a very real possibility, as has been demonstrated across the world.&lt;br /&gt;&lt;br /&gt;There are also serious worries that the high failure rate of biometric verification would hurt the poor, supposedly the main target group of the Aadhaar project; the large-scale denial of services such as access to the public distribution system has already been documented across the country. Extending a failed system to real-time financial transactions, thus, appears to be dangerously misplaced. The fundamental issue is this: can a digital mode of payment effectively provide the same level of trust between the transacting parties that is central to a cash-based transaction? The answer depends critically on whether the digital mode provides the same level of convenience, cost, predictability and certainty.&lt;br /&gt;&lt;br /&gt;The Watal Committee has produced the report that its political masters sought. Its lack of appreciation of the economic issues underpinning financial transactions and of the wider economic processes in the Indian economy is obvious. Effectively, it has delivered what the Modi government asked for—an impossible road map to a cashless nirvana for a people already suffering the effects of demonetisation.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions'&gt;http://editors.cis-india.org/internet-governance/news/frontline-v-sridhar-march-3-2017-digital-illusions&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-02-16T14:53:39Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/digital-id-forum-2019">
    <title>Digital ID Forum 2019</title>
    <link>http://editors.cis-india.org/internet-governance/news/digital-id-forum-2019</link>
    <description>
        &lt;b&gt;Sunil Abraham was one of the panelists at this event at Chulalongkorn University on July 3, 2019.&lt;/b&gt;
        &lt;p&gt;&lt;img src="http://editors.cis-india.org/home-images/DigitalID.png" alt="Digital ID" class="image-inline" title="Digital ID" /&gt;&lt;/p&gt;
&lt;p&gt;&lt;span&gt;Click to &lt;/span&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/digital-id-forum"&gt;view the agenda&lt;/a&gt;&lt;span&gt;. Also see the &lt;/span&gt;&lt;a class="external-link" href="https://en.wikipedia.org/wiki/Asia_Source"&gt;Wikipedia page&lt;/a&gt;&lt;span&gt;.&lt;/span&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/digital-id-forum-2019'&gt;http://editors.cis-india.org/internet-governance/news/digital-id-forum-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital ID</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Appropriate Use of Digital ID</dc:subject>
    
    
        <dc:subject>Digital Identity</dc:subject>
    

   <dc:date>2019-08-07T14:09:16Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support">
    <title>Digital Delivery and Data System for Farmer Income Support</title>
    <link>http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support</link>
    <description>
        &lt;b&gt;This report, jointly published by the Centre for Internet &amp; Society and Privacy International, highlights the digital systems deployed by the government to augment farmer income. It analyses the PM-Kisan and Kalia schemes in Odisha and Andhra Pradesh. &lt;/b&gt;
        &lt;h2&gt;Executive Summary&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;This study provides an in-depth analysis of two direct cash transfer schemes in India – Krushak Assistance for Livelihood and Income Augmentation (KALIA) and Pradhan Mantri Kisan Samman Nidhi (PM-KISAN) – which aim to provide income support to farmers. The paper examines the role of data systems in the delivery and transfer of funds to the beneficiaries of these schemes, and analyses their technological framework and processes.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We find that the use of digital technologies, such as direct benefit transfer (DBT) systems, can improve the efficiency and ensure timely transfer of funds. However, we observe that the technology-only system is not designed with the last beneficiaries in mind; these people not only have no or minimal digital literacy but are also faced with a lack of technological infrastructure, including internet connectivity and access to the system that is largely digital.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Necessary processes need to be implemented and personnel on the ground enhanced in the existing system, to promptly address the grievances of farmers and other challenges.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This study critically analyses the direct cash transfer scheme and its impact on the beneficiaries. We find that despite the benefits of direct benefit transfer (DBT) systems, there have been many instances of failures, such as the exclusion of several eligible households from the database.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The study also looks at gender as one of the components shaping the impact of digitisation on beneficiaries. We also identify infrastructural and policy constraints, in sync with the technological framework adopted and implemented, that impact the implementation of digital systems for the delivery of welfare. These include a lack of reliable internet connectivity in rural areas and low digital literacy among farmers. We analyse policy frameworks at the central and state levels and find discrepancies between the discourse of these schemes and their implementation on the ground.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We conclude the study by discussing the implications of datafication, which is the process of collecting, analysing, and managing data through the lens of data justice. Datafication can play a crucial role in improving the efficiency and transparency of income support schemes for farmers. However, it is important to ensure that the interests of primary beneficiaries are considered – the system should work as an enabling, not a disabling, factor. This appears to be the case in many instances since the current system does not give primacy to the interests of farmers. We offer recommendations for policymakers and other stakeholders to strengthen these schemes and improve the welfare of farmers and end users.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="http://editors.cis-india.org/internet-governance/files/digital-tools-farmers-report/at_download/file" class="external-link"&gt;&lt;b&gt;Click to download the full report&lt;/b&gt;&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support'&gt;http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>sameet</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Technologies</dc:subject>
    
    
        <dc:subject>Data Governance</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2023-10-18T23:40:25Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/events/why-cyber-security-and-online-privacy-are-vital-for-success-of-democracy-and-freedom-of-expression">
    <title>Digital Citizens: Why Cyber Security and Online Privacy are Vital to the Success of Democracy and Freedom of Expression</title>
    <link>http://editors.cis-india.org/events/why-cyber-security-and-online-privacy-are-vital-for-success-of-democracy-and-freedom-of-expression</link>
    <description>
        &lt;b&gt;Michael Oghia will give a presentation which will show why cyber security and online privacy are vital for democracy and freedom of expression.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;In the time when Edward Snowden is fighting for both clemency and to be known as a brave whistle blower that exposed government wrongdoing, cyber security and online privacy have never been more important. As &lt;a class="external-link" href="https://www.youtube.com/watch?feature=player_embedded&amp;amp;v=H0I7wi3ZLG8&amp;amp;noredirect=1"&gt;Jacob Applebaum discussed in May last year&lt;/a&gt;, and CIS’ Maria Xynou &lt;a href="http://editors.cis-india.org/internet-governance/events/big-democracy-big-surveillance-a-talk-by-maria-xynou" class="external-link"&gt;presented recently in December&lt;/a&gt;, surveillance throughout the world is increasing. With security apparatus’ likethe NSA and now India’s Central Monitoring System, coupled with corporate data centers around the world storing our e–mails, address books, preferences, and passwords, it is easy to see how our online privacy is increasingly being threatened and often, violated.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Indeed, online privacy is inextricably linked to freedom of expression, and freedom of expression is a fundamental civil liberty imperative to democracy. Moreover, online security and privacy are essential to good, transparent, and accountable democratic governance. This is largely because surveillance, censorship, and monitoring ultimately create environments where self-censorship is the norm, as is the fear of the government instead of spaces that allow for freedom of expression and democratic dialogue and dissent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What I would like to accomplish my speaking at CIS is not to merely educate about the dangers posed to Internet security or to world democracy, but rather to:&lt;/p&gt;
&lt;ol&gt;
&lt;li style="text-align: justify; "&gt;Reiterate the importance of digital privacy and cyber security to the success of democracy and the continued protection of free expression.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Encourage citizens, technology specialists, Internet and privacy advocates, and others to see themselves as part of a larger system of democratic governance and civic participation. This means understanding how technical capabilities intersect with civil society, and then use them to advocate for a more open, accessible, and private cyberspace.&lt;/li&gt;
&lt;li style="text-align: justify; "&gt;Reinforce that digital media literacy education is vital to ensuring a free, open, accessible, and democratic Internet.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Additionally, I want to present ideas and recommendations for what you can do to engage with these problems, and how we can collaborate together to address them.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;About the Public Intelligence Project&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The Public Intelligence Project is an independent, non-partisan, not-for-profit think tank conducting research, education, and advocacy on the importance of diversity, critical thinking, dialogue, and freedom of expression. We seek to promote more robust systems of participatory democracy, civic engagement, and conflict prevention in order to create a culture of democracy.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Michael Oghia&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Michael is responsible for a new project at Meta-Culture called the Public Intelligence Project, which focuses on expanding participatory democracy, civic engagement, and conflict prevention by conducting research, education, and advocacy on the intersections between diversity, dialogue, critical thinking, and freedom of expression. While new to the conflict resolution field, as a poet, musician, editor, writer, blogger, and activist, he is well-versed in the importance of freedom of expression and participating in the democratic process. He was born in Kentucky to Lebanese-Syrian parents, and after graduating with a BS in sociology from the University of Louisville, he moved to Lebanon to pursue an MA in sociology from the American University of Beirut. There, he had the opportunity to witness the Arab Revolutions first-hand while research about topics such as Internet ownership in the Middle East, social movements, Arab media, globalization, Arab youth and family, and his thesis subject, romantic love in the Arab world. Michael enjoys engaging Twitter conversations, and has an unnatural affinity for crunchy peanut butter.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Date: Tuesday, January 14, 2014&lt;br /&gt;Time: 6.30 p.m. to 8.00 p.m.&lt;br /&gt;Talk by: Michael Oghia&lt;br /&gt;Title: Research &amp;amp; Advocacy Consultant, and Project Manager&lt;br /&gt;Organisation: Meta-Culture / Public Intelligence Project&lt;br /&gt;Websites: &lt;a class="moz-txt-link-abbreviated" href="http://www.meta-culture.in"&gt;www.meta-culture.in&lt;/a&gt; &amp;amp; &lt;a class="moz-txt-link-abbreviated" href="http://www.publicintelligenceproject.org"&gt;www.publicintelligenceproject.org&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/events/why-cyber-security-and-online-privacy-are-vital-for-success-of-democracy-and-freedom-of-expression'&gt;http://editors.cis-india.org/events/why-cyber-security-and-online-privacy-are-vital-for-success-of-democracy-and-freedom-of-expression&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Social Media</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Event</dc:subject>
    

   <dc:date>2014-01-08T04:59:10Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/major-security-flaw-namo-app">
    <title>Developer team fixed vulnerabilities in Honorable PM's app and API</title>
    <link>http://editors.cis-india.org/internet-governance/blog/major-security-flaw-namo-app</link>
    <description>
        &lt;b&gt;The official app of Narendra Modi, the Indian Prime Minister, was found to contain a security flaw in 2015 that exposed millions of people's personal data.  A few days ago a very similar flaw was reported again.  This post by Bhavyanshu Parasher, who found the flaw and sought to get it fixed last year, explains the technical details behind the security vulnerability.&lt;/b&gt;
&lt;p&gt;&lt;strong&gt;This blog post has been authored by Bhavyanshu Parasher&lt;/strong&gt;. The original post can be &lt;a class="external-link" href="https://bhavyanshu.me/major-security-flaw-pm-app/09/29/2015"&gt;read here&lt;/a&gt;.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2 style="text-align: justify; "&gt;What were the issues?&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;The main issue was how the app was communicating with the API served by narendramodi.in.&lt;/span&gt;&lt;/p&gt;
&lt;div id="_mcePaste" style="text-align: justify; "&gt;&lt;ol&gt;
&lt;li&gt;I was able to extract private data, like email addresses, of each registered user just by iterating over user IDs.&lt;/li&gt;
&lt;li&gt;There was no authentication check for API endpoints. Like, I was able to comment as any xyz user just by hand-crafting the requests.&lt;/li&gt;
&lt;li&gt;The API was still being served over HTTP instead of HTTPS.&lt;/li&gt;
&lt;/ol&gt;&lt;/div&gt;
&lt;h3 style="text-align: justify; "&gt;Fixed&lt;/h3&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;The most important issue of all: unauthorized access to personal info, such as email addresses, is fixed. I have tested it and can confirm it.&lt;/li&gt;
&lt;li&gt;A check to verify that a valid user is making the request to the API endpoint has been added. I have tested it and can confirm it.&lt;/li&gt;
&lt;li&gt;Blocked HTTP: every response is now served over HTTPS. People on older versions (which were served over HTTP) will get a message regarding this. I have tested it. It says something like “Please update to the latest version of the Narendra Modi App to use this feature and access the latest news and exciting new features”. It’s good that they have figured out a way to deal with people running older versions of the app. At least now they will update the app.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2 style="text-align: justify; "&gt;Detailed Vulnerability Disclosure&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Found major security loophole in how the app accesses the “api.narendramodi.in/api/” API. At the time of disclosure, API was being served over “HTTP” as well as “HTTPS”. People who were still using the older version of the app were accessing endpoints over HTTP. This was an issue because data (passwords, email addresses) was being transmitted as plain text. In simple terms, your login credentials could easily be intercepted. MITM attack could easily fetch passwords and email addresses. Also, if your ISP keeps log of data, which it probably does, then they might already have your email address, passwords etc in plain text. So if you were using this app,&lt;strong&gt; I would suggest you to change your password immediately&lt;/strong&gt;. Can’t leave out a possibility of it being compromised.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Another major problem was that the token needed to access API was giving a false sense of security to developers. The access token could easily be fetched &amp;amp; anyone could send hand-crafted HTTP requests to the server. It would result in a valid JSON response without authenticating the user making the request. This included accessing user-data (primarily email address, fb profile pictures of those registered via fb) for any user and posting comments as any registered user of the app. There was no authentication check on the API endpoint. Let me explain you with a demo.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The API endpoint to fetch user profile information (email address) was getprofile. Before the vulnerability was fixed, the endpoint was accessible via “http://www.narendramodi.in/api/getprofile?userid=useridvalue&amp;amp;token=sometokenvalue”. As you can see, it only required two parameters. userid, which we could easily iterate on starting from 1 &amp;amp; token which was a fixed value. There was no authentication check on API access layer. Hand-crafting such requests resulted in a valid JSON response which exposed critical data like email addresses of each and every user. I quickly wrote a very simply script to fetch some data to demonstrate. Here is the sample output for xrange(1,10).&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/App.png/@@images/7bec3ca6-0808-4d19-9711-bc084b507f61.png" alt="App" class="image-inline" title="App" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Not just email addresses, using this method you could spam on any article pretending to be any user of the app. There was no authentication check as to who was making what requests to the API. See,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/copy_of_App.png/@@images/2e499adb-b621-4bc4-a490-f8957c9ac1d7.png" alt="App" class="image-inline" title="App" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;They have fixed all these vulnerabilities. I still believe it wouldn’t have taken so long if I would have been able to get in touch with team of engineers directly right from the beginning. In future, I hope they figure out an easier way to communicate. Such issues must be addressed as soon as they are found but the communication gap cost us lot of time. The team did a great job by fixing the issues and that’s what matters.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h2 style="text-align: justify; "&gt;Disclosure to officials&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The email address provided on Google play store returned a response stating “The email account that you tried to reach is over quota”. Had to get in touch with authorities via twitter.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Vulnerability disclosed to authorities on 30th sep, 2015 around 5:30 AM&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/Tweet1.png" alt="Tweet 1" class="image-inline" title="Tweet 1" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After about 30 hours of reporting the vulnerabillity&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/Tweet2.png" alt="Tweet 2" class="image-inline" title="Tweet 2" /&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Proposed Solution&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;Consulted &lt;/span&gt;&lt;a href="https://twitter.com/pranesh_prakash"&gt;@pranesh_prakash&lt;/a&gt;&lt;span&gt; as well regarding the issue.&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;span&gt;&lt;img src="http://editors.cis-india.org/home-images/Tweet3.png" alt="Tweet 3" class="image-inline" title="Tweet 3" /&gt;&lt;/span&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;After this, I mailed them a solution regarding the issues.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h2 style="text-align: justify; "&gt;Discussion with developer&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Received &lt;strong&gt;phone call&lt;/strong&gt; from a developer. Discussed possible solutions to fix it.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;strong&gt;The solution that I proposed could not be implemented &lt;/strong&gt;since the vulnerability is caused by a design flaw that should have been thought about right from the beginning when they started developing the app. It just proved how difficult it is to fix such issues for mobile apps. For web apps, it’s lot easier. Why? Because for mobile apps, you need to consider backward compatibility. If they applied my proposed solution, it would crash app for people running the older versions. Main problem is that &lt;strong&gt;people don’t upgrade to latest versions leaving themselves vulnerable to security flaws&lt;/strong&gt;. The one I proposed is a better way of doing it I think but it will break for people using older versions as stated by the developer. Though, they (developers) have come up with solutions that I think would fix most of the issues and can be considered an alternative.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/Tweet4.png" alt="Tweet 4" class="image-inline" title="Tweet 4" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On Oct 3rd, I received mail from one of the developers who informed me they have fixed it. I could not check it out at that time as I was busy but I checked it around 5 PM. &lt;strong&gt;I can now confirm they have fixed all three issues&lt;/strong&gt;.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;h2 style="text-align: justify; "&gt;Update 12/02/2016&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://www.dailyo.in/variety/narendra-modi-namo-app-hacker-security-concerns-javed-khatri-demonetisation-survey-bjp-voter-data/story/1/14347.html"&gt;This vulnerability&lt;/a&gt; in NM app is similar to the one I got fixed last year. Like I said before also, the vulnerability is because of how the API has been designed. They released the same patch which they did back then. Removing email addresses from the JSON output is not really a patch. I wonder why would they introduce personal information in JSON output again if they knew that’s a privacy problem and has been reported by me a year back. He showed how he was able to follow any user being any user. Similarly, I was able to comment on any post using account of any user of the app. When I talked to the developer back then he mentioned it will be difficult to migrate users to a newer/secure version of the app so they are releasing this patch for the meantime. It was more of a backward compatibility issue because of how API was designed. The only solution to this problem is to rewrite the API from scratch and add standard auth methods for API. That should take care of most of vulnerabilities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Also read:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a class="external-link" href="http://www.newindianexpress.com/nation/2016/dec/02/narendra-modi-app-hacked-by-youngster-points-out-risk-to-7-million-users-data-1544933--1.html"&gt;Narendra Modi app hacked by youngster, points out risk to 7 million users’ data&lt;/a&gt; (New Indian Express; December 2, 2016)&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://indiatoday.intoday.in/story/security-22-year-old-hacks-modi-app-private-data-7-million/1/825661.html"&gt;Security flaw: 22-year-old hacks Modi app and accesses private data of 7 million people&lt;/a&gt; (India Today; December 2, 2016)&lt;/li&gt;
&lt;li&gt;&lt;a class="external-link" href="http://thewire.in/84148/tech-security-namo-api/"&gt;The NaMo App Non-Hack is Small Fry – the Tech Security on Government Apps Is Worse&lt;/a&gt; (The Wire; December 3, 2016)&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/major-security-flaw-namo-app'&gt;http://editors.cis-india.org/internet-governance/blog/major-security-flaw-namo-app&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>pranesh</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Security</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Hacking</dc:subject>
    
    
        <dc:subject>Mobile Apps</dc:subject>
    
    
        <dc:subject>Data Management</dc:subject>
    

   <dc:date>2016-12-04T19:08:56Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/hindustan-times-may-2-2017-details-of-135-million-aadhaar-card-holders-may-have-leaked-claims-cis-report">
    <title>Details of 135 million Aadhaar card holders may have leaked, claims CIS report</title>
    <link>http://editors.cis-india.org/internet-governance/news/hindustan-times-may-2-2017-details-of-135-million-aadhaar-card-holders-may-have-leaked-claims-cis-report</link>
    <description>
        &lt;b&gt;The disclosure came as part of a CIS report titled ‘Information Security Practices of Aadhaar (or lack thereof): A Documentation of Public Availability of Aadhaar Numbers with Sensitive Personal Financial Information’.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The news from the Press Trust of India was published in the &lt;a class="external-link" href="http://www.hindustantimes.com/india-news/details-of-135-million-aadhaar-card-holders-may-have-leaked-claims-cis-report/story-39nojShtnAmr3EruCKbdrL.html"&gt;Hindustan Times&lt;/a&gt; on May 2, 2017.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Aadhaar numbers and personal information of as many as 135 million Indians could have been leaked from four government portals due to lack of IT security practices, the Centre for Internet and Society has claimed.&lt;br /&gt;&lt;br /&gt;“Based on the numbers available on the websites looked at, estimated number of Aadhaar numbers leaked through these four portals could be around 130-135 million,” the report by CIS said.&lt;br /&gt;&lt;br /&gt;Further, as many as 100 million bank account numbers could have been “leaked” from the four portals, it added.&lt;br /&gt;&lt;br /&gt;The portals where the purported leaks happened were those of National Social Assistance Programme, National Rural Employment Guarantee Scheme, as well as two websites of the Andhra Pradesh government.&lt;br /&gt;&lt;br /&gt;“Over 23 crore beneficiaries have been brought under Aadhaar programme for DBT (Direct Benefit Transfer), and if a significant number of schemes have mishandled data in a similar way, we could be looking at a data leak closer to that number,” it cautioned.&lt;br /&gt;&lt;br /&gt;The disclosure came as part of a CIS report titled ‘Information Security Practices of Aadhaar (or lack thereof): A Documentation of Public Availability of Aadhaar Numbers with Sensitive Personal Financial Information’.&lt;br /&gt;&lt;br /&gt;When contaced, a senior official of the Unique Identification Authority of India (UIDAI) said that there was no breach in its own database. 
The UIDAI issues Aadhaar to citizens.&lt;br /&gt;&lt;br /&gt;The CIS report claimed that the absence of “proper controls” in populating the databases could have disastrous results as it may divulge sensitive information about individuals, including details about address, photographs and financial data.&lt;br /&gt;&lt;br /&gt;“The lack of consistency of data masking and de- identification standard is an issue of great concern...the masking of Aadhaar numbers does not follow a consistent pattern,” the report added.&lt;/p&gt;
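The consistent masking the report finds lacking can be illustrated with a short sketch. This is purely illustrative, not the code of UIDAI or any portal: every published 12-digit number is reduced to one fixed shape, with only the last four digits left visible:

```python
# Illustrative sketch of a single, consistent masking rule for
# Aadhaar-like 12-digit numbers, as the report's critique implies.
def mask_aadhaar(number):
    """Return the number with all but the last four digits masked."""
    digits = "".join(ch for ch in number if ch.isdigit())
    if len(digits) != 12:
        raise ValueError("expected a 12-digit number")
    return "XXXX-XXXX-" + digits[-4:]
```

Applying one such rule uniformly across portals would remove the inconsistency the report highlights, though true de-identification requires more than masking the displayed number.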
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/hindustan-times-may-2-2017-details-of-135-million-aadhaar-card-holders-may-have-leaked-claims-cis-report'&gt;http://editors.cis-india.org/internet-governance/news/hindustan-times-may-2-2017-details-of-135-million-aadhaar-card-holders-may-have-leaked-claims-cis-report&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-05-20T08:42:57Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/news/hindu-january-6-2014-deepa-kurup-despite-apex-court-order-ioc-proceeds-with-aadhar-linked-dbt">
    <title>Despite apex court order, IOC proceeds with Aadhaar-linked DBT</title>
    <link>http://editors.cis-india.org/news/hindu-january-6-2014-deepa-kurup-despite-apex-court-order-ioc-proceeds-with-aadhar-linked-dbt</link>
    <description>
        &lt;b&gt;Once DBT starts, there is no other method to avail of subsidy: IOC official.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Deepa Kurup was &lt;a class="external-link" href="http://www.thehindu.com/news/cities/bangalore/despite-apex-court-order-ioc-proceeds-with-aadhaar-seeding/article5542193.ece"&gt;published in the Hindu&lt;/a&gt; on January 6, 2014. Sunil Abraham is quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Despite an interim order by the Supreme Court disallowing the government from making the Aadhaar number mandatory for accessing State subsidies and benefits, Indian Oil Corporation (IOC) Ltd. continues to inform consumers that they will not get their LPG subsidy if they do not seed their Aadhaar-linked bank accounts to the IOC database.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;SMSes and publicity material released by IOC in the past week indicate that the company is going ahead with the Union government’s deadlines for the Direct Benefit Transfer scheme for LPG. While the deadline for Udupi and Dharwad districts has been extended till January-end, the “grace period” for Bangalore Urban will expire on March 1.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Over the past week, LPG consumers have been receiving frequent SMSes requesting them to submit their Aadhaar number to their LPG distributor and their bank, with “no further delay”. Though the SMS does not state whether or not this is mandatory, frequent messages have been instilling a sense of urgency and panic among consumers. Further, several consumers told &lt;i&gt;The Hindu&lt;/i&gt; that, upon enquiry, distributors had been telling them that they would have to forego their subsidy amount (for nine cylinders a year) if they failed to register their details with the IOC database. Once the DBT scheme is enforced, the IOC will migrate customers entirely to the new system — that is, consumers will have to pay the market price, and the subsidy amount will be credited to their bank accounts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;‘&lt;b&gt;No other method’&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Senior IOC officials said that while the oil manufacturing company was desisting from making statements on whether or not this was mandatory, in effect those whose details would not be seeded to the database would not be able to avail of the benefit. “Basically, once the DBT scheme starts there is no other method to receive or avail of the subsidy. As of now, there is no alternative method,” said R.K. Arora, executive director, Karnataka State office. He pointed out that in rural areas several other subsidies were already linked to Aadhaar, and the DBT scheme was at 100 per cent in Tumkur and Mysore districts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As of January 1, an IOC official said, only 30 per cent of LPG consumers in the Bangalore Circle had ‘seeded’ their accounts to the IOC database, while in Udupi and Dharwad it was roughly around 50 per cent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“We are not claiming it’s mandatory, and currently all companies have submitted an affidavit seeking the order be reconsidered. Meanwhile, we have just asked people to submit the details to the distributor as soon as they can,” the official said. He added that IOC was likely to keep extending the deadline to “be on the safe side”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Meanwhile, there is confusion among consumers on the issue. Krishnan Pillai, a resident of R.T. Nagar here, said Aadhaar numbers were being delayed, and there was huge anxiety among people. “Last week, I saw an advertisement that implied that I will lose subsidy if I don’t submit my number. Is the Supreme Court verdict not applicable?” he said. Sumitra Gupta, a charted accountant from Majestic, said distributors were telling them to “ignore news report on the Supreme Court verdict”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“This is arm twisting,” she said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;‘&lt;b&gt;So-called voluntary’&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sunil Abraham of the Centre for Internet and Society, a Bangalore-based NGO that has been part of the anti-Aadhaar campaign, said IOC was “pushing the boundary”. “From the very beginning, people have been objecting to the so-called voluntary nature of the scheme. It’s unfortunate that the will of the Supreme Court in its interim order on such as a critical component of our citizenship is also being ignored,” he said.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/news/hindu-january-6-2014-deepa-kurup-despite-apex-court-order-ioc-proceeds-with-aadhar-linked-dbt'&gt;http://editors.cis-india.org/news/hindu-january-6-2014-deepa-kurup-despite-apex-court-order-ioc-proceeds-with-aadhar-linked-dbt&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>UID</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-01-31T06:50:33Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/desisec-episode-1-film-release-and-screening">
    <title>DesiSec: Episode 1 - Film Release and Screening</title>
    <link>http://editors.cis-india.org/internet-governance/desisec-episode-1-film-release-and-screening</link>
    <description>
&lt;b&gt;The Centre for Internet and Society is pleased to announce the release of the first documentary film on cybersecurity in India - DesiSec. 
We hope you can join us for a special screening of the first episode of DesiSec, on 11th December, at CIS!&lt;/b&gt;
        
&lt;div&gt;In early 2013, the Centre for Internet and Society began shooting its first documentary film project.&amp;nbsp;After months of researching and interviewing activists and experts, CIS is thrilled to announce the release of the first documentary film on cybersecurity in India - &lt;strong&gt;DesiSec: Cybersecurity and Civil Society in India&lt;/strong&gt;.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;Trailer link:&amp;nbsp;&lt;a href="http://editors.cis-india.org/internet-governance/blog/cis-cybersecurity-series-film-trailer"&gt;http://cis-india.org/internet-governance/blog/cis-cybersecurity-series-film-trailer&lt;/a&gt;&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;CIS is hosting a special screening of &lt;strong&gt;DesiSec: Episode 1&lt;/strong&gt; on &lt;strong&gt;11th December, 2013, 6 pm&lt;/strong&gt; and invites you to this event. The first episode is centered around the issue of privacy and surveillance in cyber space and how it affects Indian society.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;We look forward to seeing you there!&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;RSVP:&amp;nbsp;&lt;a href="mailto:purba@cis-india.org" target="_blank"&gt;purba@cis-india.org&lt;/a&gt;&lt;/div&gt;
&lt;div&gt;Venue:&amp;nbsp;http://osm.org/go/yy4fIjrQL?m=&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;&lt;strong&gt;&lt;em&gt;This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.&lt;/em&gt;&lt;/strong&gt;&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/desisec-episode-1-film-release-and-screening'&gt;http://editors.cis-india.org/internet-governance/desisec-episode-1-film-release-and-screening&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>purba</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyberspace</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Surveillance</dc:subject>
    
    
        <dc:subject>Cyber Security Film</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Event</dc:subject>
    

   <dc:date>2013-12-17T08:13:32Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/desi-sec-cybersecurity-and-civil-society-in-india">
    <title>DesiSec: Cybersecurity and Civil Society in India</title>
    <link>http://editors.cis-india.org/internet-governance/blog/desi-sec-cybersecurity-and-civil-society-in-india</link>
    <description>
&lt;b&gt;As part of its project on mapping cyber security actors in South Asia and South East Asia, the Centre for Internet &amp; Society conducted a series of interviews with cyber security actors. The interviews were compiled and edited into one documentary. The film, produced by Purba Sarkar, edited by Aaron Joseph, and directed by Oxblood Ruffin, features Malavika Jayaram, Nitin Pai, Namita Malhotra, Saikat Datta, Nishant Shah, Lawrence Liang, Anja Kovacs, Sikyong Lobsang Sangay, and Ravi Sharada Prasad.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;Originally the idea was to do 24 interviews with an array of international experts: Technical, political, policy, legal, and activist. The project was initiated at the University of Toronto and over time a possibility emerged. Why not shape these interviews into a documentary about cybersecurity and civil society? And why not focus on the world’s largest democracy, India? Whether in India or the rest of the world there are several issues that are fundamental to life online: Privacy, surveillance, anonymity and, free speech. DesiSec includes all of these, and it examines the legal frameworks that shape how India deals with these  challenges.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;From the time it was shot till the final edit there has only been one change in the juridical topography: the dreaded 66A of the IT Act has been struck down. Otherwise, all else is in tact. DesiSec was produced by Purba Sarkar, shot and edited by Aaron Joseph, and directed by Oxblood Ruffin. It took our team from Bangalore to Delhi and, Dharamsala. We had the honour of interviewing: Malavika Jayaram, Nitin Pai, Namita Malhotra, Saikat Datta, Nishant Shah, Lawrence Liang, Anja Kovacs, Sikyong Lobsang Sangay and, Ravi Sharada Prasad. Everyone brought something special to the discussion and we are grateful for their insights. Also, we are particularly pleased to include the music of Charanjit Singh for the intro/outro of DesiSec. Mr. Singh is the inventor of acid house music, predating the Wikipedia entry for that category by five years. Someone should correct that.&lt;/p&gt;
&lt;p&gt;DesiSec is released under the Creative Commons Attribution 3.0 Unported licence (CC BY 3.0). You can watch it on Vimeo: &lt;a href="https://vimeo.com/123722680" target="_blank"&gt;https://vimeo.com/123722680&lt;/a&gt; or download it legally and free of charge via torrent. Feel free to show, remix, and share it with your friends. And let us know what you think!&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;Video&lt;/h2&gt;
&lt;p&gt;&lt;iframe frameborder="0" height="315" src="https://www.youtube.com/embed/8N3JUqRRvys" width="560"&gt;&lt;/iframe&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/desi-sec-cybersecurity-and-civil-society-in-india'&gt;http://editors.cis-india.org/internet-governance/blog/desi-sec-cybersecurity-and-civil-society-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Laird Brown</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Censorship</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Cyber Security Film</dc:subject>
    
    
        <dc:subject>Featured</dc:subject>
    
    
        <dc:subject>Chilling Effect</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Homepage</dc:subject>
    
    
        <dc:subject>Cyber Security Interview</dc:subject>
    

   <dc:date>2015-06-29T16:25:43Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices">
    <title>Design Concerns in Creating Privacy Notices</title>
    <link>http://editors.cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices</link>
    <description>
        &lt;b&gt;The purpose of privacy notices and choice mechanisms is to notify users of the data practices of a system, so they can make informed privacy decisions. &lt;/b&gt;
        
&lt;p&gt;This blog post was edited by Elonnai Hickok.&lt;/p&gt;
&lt;hr /&gt;
&lt;h2&gt;The Role of Design in Enabling Informed Consent&lt;/h2&gt;
&lt;p align="left"&gt;Currently, privacy notices and choice mechanisms, are largely ineffective. Privacy and security researchers have concluded that privacy notices not only fail to help consumers make informed privacy decisions but are mostly ignored by them. [1] They have been reduced to being a mere necessity to ensure legal compliance for companies. The design of privacy systems has an essential role in determining whether the users read the notices and understand them. While it is important to assess the data practices of a company, the communication of privacy policies to users is also a key factor in ensuring that the users are protected from privacy threats. If they do not read or understand the privacy policy, they are not protected by it at all.&lt;/p&gt;
&lt;p align="left"&gt;The visual communication of a privacy notice is determined by the User Interface (UI) and User Experience (UX) design of that online platform. User experience design is broadly about creating the logical flow from one step to the next in any digital system, and user interface design ensures that each screen or page that the user interacts with has a consistent visual language and styling. This compliments the path created by the user experience designer. [2] UI/UX design still follows the basic principles of visual communication where information is made understandable, usable and interesting with the use of elements such as colours, typography, scale, and spacing.&lt;/p&gt;
&lt;p align="left"&gt;In order to facilitate informed consent, the design principles are to be applied to ensure that the privacy policy is presented clearly, and in the most accessible form. A paper by Batya Friedman, Peyina Lin, and Jessica K. Miller, ‘Informed Consent By Design’, presents a model of informed consent for information systems. [3] It mentions the six components of the model; Disclosure, Comprehension, Voluntariness, Competence, Agreement, Minimal Distraction. The design of a notice should achieve these components to enable informed consent. Disclosure and comprehension lead to the user being ‘informed’ while ‘consent’ encompasses voluntariness, competence, and agreement. Finally, The tasks of being informed and giving consentshould happen with minimal distraction, without diverting users from their primary taskor overwhelming them with unnecessary noise.[4]&lt;/p&gt;
&lt;p align="left"&gt;UI/UX design builds upon user behaviour to anticipate their interaction with the platform. It has led to practices where the UI/UX design is directed at influencing the user to respond in a way that is desired by the system. For instance, the design of default options prompts users to allow the system to collect their data when the ‘Allow’ button is checked by default. Such practices where the interface design is used to push users in a particular direction are called “dark patterns”.[5] These are tricks used in websites and apps that make users buy or sign up for things that they did not intend to. [6] Dark patterns are often followed as UI/UX trends without the consequences on users being questioned. This has had implications on the design of privacy systems as well. Privacy notices are currently being designed to be invisible instead of drawing attention towards them.&lt;/p&gt;
&lt;p align="left"&gt;Moreover, most communication designers believe that privacy notices are beyond their scope of expertise. They do not consider themselves accountable for how a notice comes across to the user. Designers also believe that they have limited agency when it comes to designing privacy notices as most of the decisions have been already taken by the company or the service. They can play a major role in communicating privacy concerns at an interface level, but the issues of privacy are much deeper. Designers tend to find ways of informing the user without compromising the user experience, and in the process choose aesthetic decisions over informed consent.&lt;/p&gt;
&lt;p align="left"&gt;&amp;nbsp;&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;Issues with Visual Communication of Privacy Notices&lt;/h2&gt;
&lt;p align="left"&gt;The ineffectiveness of privacy notices can be attributed to several broad issues such as the complex language and length, their timing, and location. In 2015, the Center for Plain Language [7] published a privacy-policy analysis report [8] for TIME.com [9], evaluating internet-based companies’ privacy policies to determine how well they followed plain language guidelines. The report concluded that among the most popular companies, Google and Facebook had the more accessible notices, while Apple, Uber, and Twitter were ranked as less accessible. The timing of notices is also crucial in ensuring that it is read by the users. The primary task for the user is to avail the service being offered. The goals of security and privacy are valued but are only secondary in this process. [10] Notices are presented at a time when they are seen as a barrier between the user and the service. People thus, choose to ignore the notices and move on to their primary task. Another concern is disassociated notices or notices which are presented on a separate website or manual. The added effort of going to an external website also gets in the way of the users which leads to them not reading the notice. While most of these issues can be dealt with at the strategic level of designing the notice, there are also specific visual communication design issues that are required to be addressed.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;Invisible Structure and Organisation of Information&lt;/h3&gt;
&lt;p align="left"&gt;Long spells of text with no visible structure or content organisation is the lowest form of privacy notices. These are the blocks of text where the information is flattened with no visual markers such as a section separator, or contrasting colour and typography to distinguish between the types of content. In such notices, the headings and subheadings are also not easy to locate and comprehend. For a user, the large block of text appears to be pointless and irrelevant, and they begin to dismiss or ignore it. Further, the amount of time it would take for the user to read the entire text and comprehend it successfully, is simply impractical, considering the number of websites they visit regularly.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/CollectionandUseofPersonalInformation.jpg" alt="null" class="image-inline" title="Collection and Use of Personal Information" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;The privacy policy notice by Apple [11] with no use of colours or visuals.&lt;/em&gt;&lt;/p&gt;
&lt;p align="center"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/PrivacyPolicyTwitter.jpg" alt="null" class="image-inline" title="Privacy Policy Twitter" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;The privacy policy notice by Twitter [12] no visual segregator&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Visual Contrast Between Front Interface and Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;The front facing interface of an app or website is designed to be far more engaging than the privacy notice pages. There is a visible difference in the UI/UX design of the pages, almost as if the privacy notices were not designed at all. In case of Uber’s mobile app, the process of adding a destination, selecting the type of cab and confirming a ride has been made simple to do for any user. This interface has been thought through keeping in mind the users’ behaviour and needs. It allows for quick and efficient use of the service. As opposed to the process of buying into the service, the privacy notice on the app is complex and unclear.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img class="image-inline image-inline" src="UberApp.jpg" alt="Uber App Interface 2" height="397" width="224" /&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; &lt;img class="image-inline image-inline" src="UberApp_PrivacyNotice.jpg" alt="Uber App Interface" height="397" width="224" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Uber mobile app screenshots of the front interface (left) and the policy notice page (right)&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Gaining Trust Through the Initial Pitch&lt;/h3&gt;
&lt;p align="left"&gt;A pattern in the privacy notices of most companies is that they attempt to establish credibility and gain confidence by stating that they respect the users’ privacy. This can be seen in the introductory text of the privacy notices of Apple and LinkedIn. The underlying intent seems to be that since the company understands that the users’ privacy is important, the users can rely on them and not read the full notice.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/ApplePrivacyNote.jpg" alt="null" class="image-inline" title="Apple Privacy Note" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Introduction text to Apple’s privacy policy notice [13]&lt;/em&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;&amp;nbsp;&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/LinkedInPrivacyNote.jpg" alt="null" class="image-inline" title="LinkedIn Privacy Note" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Introduction text to LinkedIn’s privacy policy notice [14]&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Low Navigability&lt;/h3&gt;
&lt;p align="left"&gt;The text heavy notices need clear content pockets which can be navigated through easily using mechanisms such as menu bar. Navigability of a document allows for quick locating of sections, and moving between them. Several companies miss to follow this. Apple and Twitter privacy notices (shown above), have low navigability as the reader has no prior indication of how many sections there are in the notice. The reader could have summarised the content based on the titles of the sections if it were available in a table of contents or a menu. Lack of a navigation system leads to endless scrolling to reach the end of the page.&lt;/p&gt;
&lt;p align="left"&gt;Facebook privacy notice, on the other hand is an example of good navigability. It uses typography and colour to build a clear structure of information that can be navigated through easily using the side menu. The menu doubles up as a table of contents for the reader. The side menu however, does not remain visible while scrolling down the page. This means while the user is reading through a section, they cannot switch to a different section from the menu directly. They will need to click on the ‘Return to top’ button and then select the section from the menu.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/DataPolicy.jpg" alt="null" class="image-inline" title="Data Policy" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Navigation menu in the Facebook Data Policy page [15]&lt;/em&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Lack of Visual Support&lt;/h3&gt;
&lt;p align="left"&gt;Privacy notices can rely heavily on visuals to convey the policies more efficiently. These could be visual summaries or supporting infographics. The data flow on the platform and how it would affect the users can be clearly visualised using infographics. But, most notices fail to adopt them. The Linkedin privacy notice [16] page shows a video at the beginning of its privacy policy. Although this could have been an opportunity to explain the policy in the video, LinkedIn only gives an introduction to the notice and follows it with a pitch to use the platform. The only visual used in notices currently are icons. Facebook uses icons to identify the different sections so that they can be located easily. But, apart from being identifiers of sections, these icons do not contribute to the communication of the policy. It does not make reading of the full policy any easier.&lt;/p&gt;
&lt;h3 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h3&gt;
&lt;h3 style="text-align: justify;"&gt;Icon Heavy ‘Visual’ Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;The complexity of privacy notices has led to the advent of online tools and generators that create short notices or summaries for apps and websites to supplement the full text versions of policies. Most of these short notices use icons as a way of visually depicting the categories of data that is being collected and shared. iubenda [17], an online tool, generates policy notice summary and full text based on the inputs given by the client. It asks for the services offered by the site or app, and the type of data collection. Icons are used alongside the text headings to make the summary seem more ‘visual’ and hence more easily consumable. It makes the summary more inviting to read, but does not reduce the time for reading.&lt;/p&gt;
&lt;p align="left"&gt;Another icon-based policy summary generator was created by KnowPrivacy. [18] They developed a policy coding methodology by creating icon sets for types of data collected, general data practices, and data sharing. The use of icons in these short notices is more meaningful as they show which type of data is collected or not collected, shared or not shared at a glance without any text. This facilitates comparison between data practices of different apps.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/Google.jpg" alt="null" class="image-inline" title="Google" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Icon based short policy notice created for Google by KnowPrivacy [19]&lt;/em&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2 style="text-align: justify;"&gt;Initiatives to Counter Issues with the Design of Privacy Notices&lt;/h2&gt;
&lt;p align="left"&gt;Several initiatives have called out the issues with privacy notices and some have even countered them with tools and resources. The TIME.com ranking of internet-based companies’ privacy policies brought attention to the fact that some of the most popular platforms have ineffective policy notices. A user rights initiative called Terms of Services; Didn’t Read [20] rates and labels websites’ terms &amp;amp; privacy policies.&amp;nbsp;There is also the Usable Privacy Policy Project which develops techniques to semi-automatically analyze privacy policies with crowdsourcing, natural language processing, and machine learning. [21] It uses artificial intelligence to sift through the most popular sites on the Internet, including Facebook, Reddit, and Twitter, and annotate their privacy policies. They realise that it is not practical for people to read privacy policies. Thus, their aim is to use technology to extract statements from the notices and match them with things that people care about. However, even AI has not been fully successful in making sense of the dense documents and missed out some important context. [22]&lt;/p&gt;
&lt;p align="left"&gt;One of the more provocative initiatives is the Me and My Shadow ‘Lost in Small Print’ [23] project. It shows the text for the privacy notices of companies like LinkedIn, Facebook, WhatsApp, etc. and then ‘reveals’ the data collection and use information that would closely affect the users.&lt;/p&gt;
&lt;p align="left"&gt;Issues with notices have also been addressed by standardising their format, so people can interpret the information faster. The Platform for Privacy Preferences Project (P3P) [24] was one of the initial efforts in enabling websites to share their privacy practices in a standard format. Similar to KnowPrivacy’s policy coding, there are more design initiatives that are focusing on short privacy notice design. An organisation offering services in Privacy Compliance and Risk Management Solutions called TrustArc, [25] is also in the process of designing an interactive icon-based privacy short notice.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/PrivacySummary.jpg" alt="null" class="image-inline" title="Privacy Summary" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;TrustArc’s proposed design [26] for the short notice for a sample site&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;Most efforts have been done in simplifying the notices so as to decode the complex terminology. But, there have been very few evaluations and initiatives to improve the design of these notices.&lt;/p&gt;
&lt;h2&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2&gt;Recommendations&lt;/h2&gt;
&lt;h3&gt;Multilayered Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;One of the existing suggestions on increasing usability of privacy notices are multilayered privacy notices. [27] Multilayered privacy notices comprise a very short notice designed for use on portable digital devices where there is limited space, condensed notice that contains all the key factors in an easy to understand way, and a complete notice with all the legal requirements. [28] Some of the examples above use this in the form of short notices and summaries. The very short notice layer consists of who is collecting the information, primary uses of information, and contact details of the organisation.[29] Condensed notice layer covers scope or who does the notice apply to, personal information collected, uses and sharing, choices, specific legal requirements if any, and contact information. [30] In order to maintain consistency, the sequence of topics in the condensed and the full notice must be same. Words and phrases should also be consistent in both layers. Although an effective way of simplifying information, multi-layered notices must be reconsidered along with the timing of notices. For instance, it could be more suitable to show very short notices at the time of collection or sharing of user data.&lt;/p&gt;
&lt;h3 align="left"&gt;Supporting Infographics&lt;/h3&gt;
&lt;p align="left"&gt;Based on their visual design, the currently available privacy notices can be broadly classified into 4 categories; (i) the text only notices which do not have a clearly visible structure, (ii) the text notices with a contents menu that helps in informing of the structure and in navigating, (iii) the notices with basic use of visual elements such as icons used only to identify sections or headings, (iv) multilayered notices or notices with short summary before giving out the full text. There is still a lack of visual aid in all these formats. The use of visuals in the form of infographics to depict data flows could be more helpful for the users both in short summaries and complete text of policy notices.&lt;/p&gt;
&lt;h3 align="left"&gt;Integrating the Privacy Notices with the Rest of the System&lt;/h3&gt;
&lt;p align="left"&gt;The design of privacy notices usually seems disconnected to the rest of the app or website. The UI/UX design of privacy notices requires as much attention as the consumer-facing interface of a system. The contribution of the designer has to be more than creating a clean layout for the text of the notice. The integration of privacy notices with the rest of the system is also related to the early involvement of the designer in the project. The designer needs to understand the information flows and data practices of a system in order to determine whether privacy notices are needed, who should be notified, and about what. This means that decisions such as selecting the categories to be represented in the short or condensed notice, the datasets within these categories, and the ways of representing them would all be part of the design process. The design interventions cannot be purely visual or UI/UX based. They need to be worked out keeping in mind the information architecture, content design, and research. By integrating the notices, strategic decisions on the timing and layering of content can be made as well, apart from the aesthetic decisions. Just as the aim of the front face of the interface in a system makes it easier for the user to avail the service, the policy notice should also help the user in understanding the consequences, by giving them clear notice of the unexpected collection or uses of their data.&lt;/p&gt;
&lt;h3 align="left"&gt;Practice Based Frameworks on Designing Privacy Notices&lt;/h3&gt;
&lt;p align="left"&gt;There is little guidance available to communication designers for the actual design of privacy notices which is specific to the requirements and characteristics of a system. [31] The UI/UX practice needs to be expanded to include ethical ways of designing privacy notices online. The paper published by Florian Schaub, Rebecca Balebako, Adam L. Durity, and Lorrie Faith Cranor, called, ‘A Design Space for Effective Privacy Notice’ in 2015 offers a comprehensive design frame­work and standardised vocabulary for describing privacy notice options. [32] The objective of the paper is to allow designers to use this framework and vocabulary in creating effective privacy notices. The design space suggested has four key dimensions, ‘timing’, ‘channel’, ‘modality’ and ‘control’. [33] It also provides options for each of these dimensions. For example, ‘timing’ options are ‘at setup’, ‘just in time’, ‘context-dependent’, ‘periodic’, ‘persistent’, and ‘on demand’. The dimensions and options in the design space can be expanded to accommodate new systems and interaction methods.&lt;/p&gt;
&lt;h3 align="left"&gt;Considering the Diversity of Audiences&lt;/h3&gt;
&lt;p align="left"&gt;For the various mobile apps and services, there are multiple user groups who use them. The privacy notices are hence not targeted to one kind of an audience. There are diverse audiences who have different privacy preferences for the same system. [34] The privacy preferences of these diverse groups of users’ must be accommodated. In a typical design process for any system, multiple user personas are identified. The needs and behaviour of each persona is used to determine the design of the interface. Privacy preferences must also be observed as part of these considerations for personas, especially while designing the privacy notices. Different users may need different kinds of notices based on which data practices affect them.[35] Thus, rather than mandating a single mechanism for obtaining informed consent for all users in all situations, designers need to provide users with a range of mechanisms and levels of control. [36]&lt;/p&gt;
&lt;h3 align="left"&gt;Ethical Framework for Design Practitioners&lt;/h3&gt;
&lt;p align="left"&gt;An ethical framework is required for design practitioners that can be followed at the level of both deciding the information flow and the experience design. With the prevalence of ‘dark patterns’, the visual design of notices is used to trick users into accepting it. Design ethics can play a huge role in countering such practices. Will Dayable, co-director at Squareweave, [37] a developer of web and mobile apps, suggests that UI/UX designers should “Design Like They’re (Users are) Drunk”. [38]&amp;nbsp;&amp;nbsp;He asks designers to imagine the user to be in a hurry and still allow them access to all the information necessary for making a decision. He concludes that good privacy UX and UI is about actually trying to communicate with users rather than trying to slip one past them. In principle, an ethical design practice would respect the rights of the users and proactively design to facilitate informed consent.&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2 style="text-align: justify;"&gt;Reconceptualising Privacy Notices&lt;/h2&gt;
&lt;p align="left"&gt;Based on the above recommendations, a guiding sample for multilayered privacy notices has been created. Each system would need its own structure and mechanisms for notices, which are integrated with its data practice, audiences, and medium, but this sample notice provides basic guidelines for creating effective and accessible privacy notices. The aesthetic decisions would also vary based on the interface design of a system.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/SampleEye.jpg" alt="null" class="image-inline" title="Sample Eye" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Fixed Icon for Privacy Notifications&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;A fixed icon can appear along with all privacy notifications on the system, so that the users can immediately know that the notification is about a privacy concern. This icon should capture attention instantly and suggest a sense of caution. Besides its use as a call to attention, the icon can also lead to a side panel for privacy implications from all actions that the user takes.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/SampleVeryShortNotice.jpg" alt="null" class="image-inline" title="Sample Very Short Notice" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Very Short Notice on Desktop and Mobile Platforms&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;The very short notices can be shown when an action from the user would lead to data collection or sharing. The notice mechanism should be designed to provide notices at different times tailored to a user’s needs in that context. The styling and placement of the ‘Allow’ and ‘Don’t Allow’ buttons should not be biased towards the ‘Allow’ option. The text used in very short and condensed notice layers should be engaging yet honest in its communication.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/DataCollected.jpg" alt="null" class="image-inline" title="Data Collected" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Summary Notice&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;The summary or the condensed notice layer should allow the user to gauge at a glance, how the data policy is going to affect them. This can be combined with a menu that lists the topics covered in the full notice. The menu would double up as a navigation mechanism for users. It should be visible to users even as they scroll down to the full notice. The condensed notice can also be supported by an infographic depicting the flow of data in the system.&lt;/p&gt;
&lt;p align="center"&gt;&lt;img src="http://editors.cis-india.org/home-images/DataCollection.jpg" alt="null" class="image-inline" title="Data Collection" /&gt;&lt;/p&gt;
&lt;p align="center"&gt;&lt;em&gt;Sample Navigation Menu&lt;/em&gt;&lt;/p&gt;
&lt;p align="left"&gt;All the images in this section use sample text for the purpose of illustrating the structure and layout&lt;/p&gt;
&lt;p align="left"&gt;The full notice can be made accessible by creating a clear information hierarchy in the text. The menu which is available on the side while scrolling down the text would facilitate navigation and familiarity with the structure of the notice.&lt;/p&gt;
&lt;h2 style="text-align: justify;"&gt;&lt;br /&gt;&lt;/h2&gt;
&lt;h2 style="text-align: justify;"&gt;Conclusion&lt;/h2&gt;
&lt;p align="left"&gt;The presentation of privacy notices directly influences the decisions of users online and ineffective notices make users vulnerable to their data being misused. But currently, there is little conversation about privacy and data protection among designers. Design practice has to become sensitive to privacy and security requirements. Designers need to take the accountability of creating accessible notices which are beneficial to the users, rather than to the companies issuing them. They must prioritise the well-being of users over aesthetics and user experience even. The aesthetics of a platform must be directed at achieving transparency in the privacy notice by making it easily readable.&lt;/p&gt;
&lt;p align="left"&gt;The design community in India has a more urgent task at hand of building a design practice that is informed by privacy. Comparing the privacy notices of Indian and global companies, Indian companies have an even longer way to go in terms of communicating the notices effectively. Most Indian companies such as Swiggy, [39] 99acres, [40] and Paytm [41] have completely textual privacy policy notices with no clear information hierarchy or navigation. Ola Cabs [42]&amp;nbsp; provides an external link to their privacy notice, which opens as a pdf, making it even more inaccessible. Thus, there is a complete lack of design input in the layout of these notices.&lt;/p&gt;
&lt;p align="left"&gt;Designers must engage in conversations with technologists and researchers, and include privacy and other user rights in design education in order to prepare practitioners for creating more valuable digital platforms.&lt;/p&gt;
&lt;hr /&gt;
&lt;ol&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.fastcodesign.com/3032719/ui-ux-who-does-what-a-designers-guide-to-the-tech-industry"&gt;https://www.fastcodesign.com/3032719/ui-ux-who-does-what-a-designers-guide-to-the-tech-industry&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf"&gt;https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf"&gt;https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://fieldguide.gizmodo.com/dark-patterns-how-websites-are-tricking-you-into-givin-1794734134"&gt;https://fieldguide.gizmodo.com/dark-patterns-how-websites-are-tricking-you-into-givin-1794734134&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://darkpatterns.org/"&gt;https://darkpatterns.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://centerforplainlanguage.org/"&gt;https://centerforplainlanguage.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://centerforplainlanguage.org/wp-content/uploads/2016/11/TIME-privacy-policy-analysis-report.pdf"&gt;https://centerforplainlanguage.org/wp-content/uploads/2016/11/TIME-privacy-policy-analysis-report.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://time.com/3986016/google-facebook-twitter-privacy-policies/"&gt;http://time.com/3986016/google-facebook-twitter-privacy-policies/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html"&gt;https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com"&gt;https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://twitter.com/privacy?lang=en"&gt;https://twitter.com/privacy?lang=en&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com"&gt;https://www.apple.com/legal/privacy/en-ww/?cid=wwa-us-kwg-features-com&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/legal/privacy-policy"&gt;https://www.linkedin.com/legal/privacy-policy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.facebook.com/privacy/explanation"&gt;https://www.facebook.com/privacy/explanation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/legal/privacy-policy"&gt;https://www.linkedin.com/legal/privacy-policy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.iubenda.com/blog/2013/06/13/privacy%C2%ADpolicy%C2%ADfor%C2%ADandroid%C2%ADapp/"&gt;http://www.iubenda.com/blog/2013/06/13/privacy­policy­for­android­app/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://knowprivacy.org/policies_methodology.html"&gt;http://knowprivacy.org/policies_methodology.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://knowprivacy.org/profiles/google"&gt;http://knowprivacy.org/profiles/google&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://tosdr.org/"&gt;https://tosdr.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://explore.usableprivacy.org/"&gt;https://explore.usableprivacy.org/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://motherboard.vice.com/en_us/article/a3yz4p/browser-plugin-to-read-privacy-policy-carnegie-mellon"&gt;https://motherboard.vice.com/en_us/article/a3yz4p/browser-plugin-to-read-privacy-policy-carnegie-mellon&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://myshadow.org/lost-in-small-print"&gt;https://myshadow.org/lost-in-small-print&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.w3.org/P3P/"&gt;https://www.w3.org/P3P/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.trustarc.com/blog/2011/02/17/privacy-short-notice-designpart-i-background/"&gt;http://www.trustarc.com/blog/2011/02/17/privacy-short-notice-designpart-i-background/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.trustarc.com/blog/?p=1253"&gt;http://www.trustarc.com/blog/?p=1253&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf"&gt;https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf"&gt;https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf"&gt;https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html"&gt;https://www.safaribooksonline.com/library/view/security-and-usability/0596008279/ch04.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf"&gt;https://www.ftc.gov/system/files/documents/public_comments/2015/10/00038-97832.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf"&gt;https://vsdesign.org/publications/pdf/Security_and_Usability_ch24.pdf&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.squareweave.com.au/"&gt;https://www.squareweave.com.au/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://iapp.org/news/a/how-ui-and-ux-can-ko-privacy/"&gt;https://iapp.org/news/a/how-ui-and-ux-can-ko-privacy/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.swiggy.com/privacy-policy"&gt;https://www.swiggy.com/privacy-policy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.99acres.com/load/Company/privacy"&gt;https://www.99acres.com/load/Company/privacy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://pages.paytm.com/privacy.html"&gt;https://pages.paytm.com/privacy.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://s3-ap-southeast-1.amazonaws.com/ola-prod-website/privacy_policy.pdf"&gt;https://s3-ap-southeast-1.amazonaws.com/ola-prod-website/privacy_policy.pdf&lt;/a&gt;&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices'&gt;http://editors.cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>saumyaa</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-06T13:45:40Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/digtial-identities-research-plan">
    <title>Design and Uses of Digital Identities - Research Plan</title>
    <link>http://editors.cis-india.org/internet-governance/blog/digtial-identities-research-plan</link>
    <description>
        &lt;b&gt;In our research project on the uses and design of digital identity systems, we ask two core questions: a) What are appropriate uses of ID? and b) How should we think about the technological design of ID? Towards the first question, we have worked on first principles and will further develop definitions, legal tests, and applications of these principles. Towards the second, we have identified a set of existing and planned digital identity systems, each representing a paradigm of how such a system can be envisioned and implemented, and will look to identify the key design choices that cause these paradigms to diverge.&lt;/b&gt;
        
&lt;h4&gt;Read the research plan &lt;a class="external-link" href="https://digitalid.design/research-plan.html"&gt;here&lt;/a&gt;.&lt;/h4&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/digtial-identities-research-plan'&gt;http://editors.cis-india.org/internet-governance/blog/digtial-identities-research-plan&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha and Pooja Saxena</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital ID</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Appropriate Use of Digital ID</dc:subject>
    
    
        <dc:subject>Digital Identity</dc:subject>
    

   <dc:date>2019-08-17T07:58:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
