<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="http://editors.cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>http://editors.cis-india.org</link>
  
  <description>These are the search results for the query, showing results 691 to 705.</description>
  
  
  
  
  <image rdf:resource="http://editors.cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/ethical-issues-in-open-data"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/enlarging-the-small-print"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/news/livemint-january-31-2014-anuja-moulishree-srivastava-election-panel-rejects-google-proposal-for-electoral-services-tie-up"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/observer-research-foundation-shashidhar-kj-and-kashish-parpiani-july-22-2019-easing-the-us-india-divergence-on-data-localisation"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/e-governance-identity-privacy.pdf"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/dsci-infosys-roundtable"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/ethical-issues-in-open-data">
    <title>Ethical Issues in Open Data</title>
    <link>http://editors.cis-india.org/internet-governance/blog/ethical-issues-in-open-data</link>
    <description>
        &lt;b&gt;On August 1, 2013, I took part in a web meeting, organized and hosted by Tim Davies of the World Wide Web Foundation. The meeting, titled “Ethical issues in Open Data,” had an agenda focused on privacy considerations in the context of the open data movement.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The main panelists, Carly Nyst and Sam Smith from &lt;a class="external-link" href="https://www.privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, as well as Steve Song from the &lt;a class="external-link" href="http://www.idrc.ca/EN/Pages/default.aspx"&gt;International Development Research Centre&lt;/a&gt;, were joined by roughly a dozen other privacy and development researchers from around the globe in the hour-long session.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary issue of the meeting was the concern over modern capabilities of cross-analytics for de-anonymizing data sets and revealing personally identifiable information (PII) in open data. Open data can constitute publicly available information such as budgets, infrastructures, and population statistics, as long as the data meets the three open data characteristics: accessibility, machine readability, and availability for re-use. “Historically,” said Tim Davies, “public registers have been protected through obscurity.” However, both the capabilities of data analysts and the definition of personal data have continued to expand in recent years. This concern thus presents a conflict between researchers who advocate governments releasing open data reports, and researchers who emphasize privacy in the developing world.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Steve Song, advisor to IDRC Information &amp;amp; Networks program, spoke of the potential collateral damage that comes with publishing more and more types of information. Song addressed the imperative of the meeting in saying, “privacy needs to be a core part of open data conversation.” In his presentation, he gave a particularly interesting example of the tensions between public and private information implications. Following the infamous &lt;a class="external-link" href="http://en.wikipedia.org/wiki/Sandy_Hook_Elementary_School_shooting"&gt;2012 school shooting in Newtown, Connecticut&lt;/a&gt;, the information on Newtown’s gun permit owning citizens (made publicly available through America’s &lt;a class="external-link" href="http://foia.state.gov/"&gt;Freedom of Information Act&lt;/a&gt;) was aggregated into an interactive map which revealed the citizens’ addresses. This obviously became problematic for the Newtown community, as the map not only singled out homes which exercised their right to bear arms but also indirectly revealed which homes were without firearm protection and thereby more vulnerable to theft and crime. The Newtown example clearly demonstrates the relationship (and conflict) between open data and privacy; it resolves to the conflict between the right to information and the right to privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An apparent issue surrounding open data is its perceived binary nature. Many advocates either view data as being open, or not; any intermediary boundaries are only forms of governments limiting data accessibility. Therefore, a point raised by meeting attendee Raed Sharif aptly presented an open data counter-argument. Sarif noted how, inversely, privacy conceptions may form a threat to open data. He mentioned how governments could take advantage of privacy arguments to justify their refusal to publish open reports. &lt;br /&gt;&lt;br /&gt;However, Carly Nyst summarized the privacy concern and argument in her remarks near the end of the meeting. Namely, she reasoned that the open data mission is viable, if only limited to generic data, i.e., data about infrastructure, or other information that is in no way personal. Doing so will avoid obstructions of individual privacy. Until more advanced anonymization techniques can be achieved, which can overcome modern re-identification methods, publicly publishing PII may prove too risky. It was generally agreed upon during the meeting that open data is not inherently bad, and in fact its analysis and availability can be beneficial, but the threat of its misuse makes it dangerous. For the future of open data, researchers and advocates should perhaps consider more nuanced approaches to the concept in order to respect considerations for other ethical issues, such as privacy.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/ethical-issues-in-open-data'&gt;http://editors.cis-india.org/internet-governance/blog/ethical-issues-in-open-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>kovey</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Open Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-08-07T09:19:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age">
    <title>Ethical Data Design Practices in the AI (Artificial Intelligence) Age</title>
    <link>http://editors.cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age</link>
    <description>
        &lt;b&gt;Shweta Mohandas was a panelist at a discussion on Ethical Data Design Practices in the AI (Artificial Intelligence) Age, organised by Startup Grind, Bangalore, on July 28, 2018 at NUMA Bangalore.&lt;/b&gt;
        &lt;h2&gt;Agenda&lt;/h2&gt;
&lt;p&gt;&lt;b&gt;Ethical Data Design Practices in the AI Age&lt;/b&gt;&lt;/p&gt;
&lt;p dir="ltr" style="text-align: justify; "&gt;The panel discussion is intended to explore the challenges we face when designing the user experiences of the complex behavioral agents that increasingly run our lives.&lt;/p&gt;
&lt;p dir="ltr"&gt;Discussion centred around how to:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Understand current thinking by the AI community on ethics and morality in computing and the challenges it presents. &lt;/li&gt;
&lt;li&gt;Explore examples of the ethical choices that products make now and will make in the near future.&lt;/li&gt;
&lt;li&gt;Learn how designers might approach designing experiences that face moral dilemmas.&lt;/li&gt;
&lt;/ul&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age'&gt;http://editors.cis-india.org/internet-governance/news/ethical-data-design-practices-in-the-ai-artificial-intelligence-age&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Artificial Intelligence</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-08-01T23:14:21Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/enlarging-the-small-print">
    <title>Enlarging the Small Print: A Study on Designing Effective Privacy Notices for Mobile Applications</title>
    <link>http://editors.cis-india.org/internet-governance/blog/enlarging-the-small-print</link>
    <description>
        &lt;b&gt;The world’s biggest modern lie is often said to be the sentence “I have read and agreed to the Terms and Conditions.” It is a well-known fact, backed by empirical research, that consumers often skip reading cumbersome privacy notices. The reasons range from their length and complicated legal jargon to the inopportune moments at which these notices are displayed. This paper seeks to compile and analyse the different simplified designs of privacy notices that have been proposed for mobile applications to encourage consumers to make informed privacy decisions.&lt;/b&gt;
        &lt;h2 style="text-align: justify; "&gt;Introduction: Ideas of Privacy and Consent Linked with Notices&lt;/h2&gt;
&lt;h3 style="text-align: justify; "&gt;The Notice and Choice Model&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Most modern laws and data privacy principles seek to focus on individual control. As Alan Westin of Columbia University characterises privacy, "it is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to other,"	&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; Or simply put, personal information privacy is "the ability of the individual to personally control 	information about himself."&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The preferred mechanism for protecting online privacy that has emerged is that of Notice and Choice.&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt; The model, identified as "the most fundamental principle" in online privacy,&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt; refers to&lt;a href="http://itlaw.wikia.com/wiki/Post" title="Post"&gt;consumers&lt;/a&gt; consenting to privacy policies before availing of an online service.	&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following 3 standards of expectations of privacy in electronic communications have emerged in the United States courts:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;KATZ TEST: Katz v. United States,&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; a wiretap case, established the expectation of privacy as one that society is prepared to recognize as "reasonable".&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt; This concept is critical to a court's understanding of a new technology because there is no established precedent to guide its analysis.&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;KYLLO/KYLLO-KATZ HYBRID TEST: Society's reasonable expectation of privacy is higher when dealing with a new technology that is not "generally available to the public".&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt; This follows the logic that it is reasonable to expect common data collection practices to be used, but not rare ones.&lt;a href="#_ftn10" name="_ftnref10"&gt;[10]&lt;/a&gt; In Kyllo v. United States&lt;a href="#_ftn11" name="_ftnref11"&gt;[11]&lt;/a&gt; law enforcement used a thermal imaging device to observe the relative heat levels inside a house. Though under Katz the publicly available thermal radiation technology is reasonable, the uncommon means of collection was not. This modification to the Katz standard is extremely important in the context of mobile privacy. Mobile communications may be subdivided into smaller parts: audio from a phone call, e-mail, and data related to a user's current location. Following an application of the hybrid Katz/Kyllo test, the reasonable expectation of privacy in each of those communications would be determined separately,&lt;a href="#_ftn12" name="_ftnref12"&gt;[12]&lt;/a&gt; by evaluating the general accessibility of the technology required to capture each stream.&lt;a href="#_ftn13" name="_ftnref13"&gt;[13]&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;DOUBLECLICK TEST: DoubleClick&lt;a href="#_ftn14" name="_ftnref14"&gt;[14]&lt;/a&gt; illustrates the potential problems of transferring consent to a third party, one to whom the user never provided direct consent or of whom the user is not even aware. The court held that for DoubleClick, an online advertising network, to collect information from a user it needed only to obtain permission from the website that user accessed, and not from the user himself. The court reasoned that the information the user disclosed to the website was analogous to information one discloses to another person during a conversation. Just as the other party to the conversation would be free to tell his friends about anything that was said, a website should be free to disclose any information it receives from a user's visit after the user has consented to use the website's services.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;These interpretations have weakened the standards of online privacy. While the Katz test vaguely hinges on societal expectations, the Kyllo Test to an 	extent strengthens privacy rights by disallowing uncommon methods of collection, but as the DoubleClick Test illustrates, once the user has consented to 	such practices he cannot object to the same. There have been sugestions to consider personal information as property when it shares features of property 	like location data.&lt;a href="#_ftn15" name="_ftnref15"&gt;[15]&lt;/a&gt; It is fixed when it is in storage, it has a monetary value, and it is sold and traded on a regular basis. This would create a standard where consent is required for third-party access.	&lt;a href="#_ftn16" name="_ftnref16"&gt;[16]&lt;/a&gt; Consent will then play a more pivotal role in affixing liability.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The notice and choice mechanism is designed to put individuals in charge of the collection and use of their personal information. In theory, the regime preserves user autonomy by putting the individual in charge of decisions about the collection and use of personal information.	&lt;a href="#_ftn17" name="_ftnref17"&gt;[17]&lt;/a&gt; Notice and choice is asserted as a substitute for regulation because it is thought to be more 	flexible, inexpensive to implement, and easy to enforce.&lt;a href="#_ftn18" name="_ftnref18"&gt;[18]&lt;/a&gt; Additionally, notice and choice can legitimize an information practice, whatever it may be, by obtaining an individual's consent and suit individual privacy preferences.	&lt;a href="#_ftn19" name="_ftnref19"&gt;[19]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the notice and choice mechanism is often criticized for leaving users uninformed-or misinformed, at least-as people rarely see, read, or understand 	privacy notices. &lt;a href="#_ftn20" name="_ftnref20"&gt;[20]&lt;/a&gt; Moreover, few people opt out of the collection, use, or disclosure of their data when 	presented with the choice to do so.&lt;a href="#_ftn21" name="_ftnref21"&gt;[21]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Amber Sinha of the Centre for Internet and Society argues that consent in these scenarios Is rarely meaningful as consumers fail to read/access privacy 	policies, understand the consequences and developers do not provide them the choice to opt out of a particular data practice while still being allowed to 	use their services. &lt;a href="#_ftn22" name="_ftnref22"&gt;[22]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Of particular concern is the use of software applications (apps) designed to work on mobile devices. Estimates place the current number of apps available 	for download at more than 1.5 million, and that number is growing daily.&lt;a href="#_ftn23" name="_ftnref23"&gt;[23]&lt;/a&gt; A 2011 Google study, "The 	Mobile Movement," identified that mobile devices are viewed as extensions of ourselves that we share with deeply personal relations with, raising 	fundamental questions of how apps and other mobile communications influence our privacy decision-making.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Recent research indicates that mobile device users have concerns about the privacy implications of using apps.	&lt;a href="#_ftn24" name="_ftnref24"&gt;[24]&lt;/a&gt; The research finds that almost 60 percent of respondents ages 50 and older decided not to install an 	app because of privacy concerns (see figure 1).&lt;a href="#_ftn25" name="_ftnref25"&gt;[25]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/ConsumerReactions.png" alt="Consumer Reactions" class="image-inline" title="Consumer Reactions" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Because no standards currently exist for providing privacy notice disclosure for apps, consumers may find it difficult to understand what data the app is 	collecting, how those data will be used, and what rights users have in limiting the collection and use of their data. Many apps do not provide users with privacy policy statements, making it impossible for app users to know the privacy implications of using a particular app.	&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt;Apps can make use of any or all of the device's functions, including contact lists, calendars, phone 	and messaging logs, locational information, Internet searches and usage, video and photo galleries, and other possibly sensitive information. For example, 	an app that allows the device to function as a scientific calculator may be accessing contact lists, locational data, and phone records even though such 	access is unnecessary for the app to function properly. &lt;a href="#_ftn27" name="_ftnref27"&gt;[27]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other apps may have privacy policies that are confusing or misleading. For example, an analysis of health and fitness apps found that more than 30 percent 	of the apps studied shared data with someone not disclosed in the app's privacy policy.&lt;a href="#_ftn28" name="_ftnref28"&gt;[28]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Types of E-Contracts&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Margaret Radin distinguishes two models of direct e-contracts based on consent as -"contract-as-consent" and "contract-as-product."	&lt;a href="#_ftn29" name="_ftnref29"&gt;[29]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The contract-as-consent model is the traditional picture of how binding commitment is arrived at between two humans. It involves a meeting of the minds 	which implies that terms be understood, alternatives be available, and probably that bargaining be possible.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the contract-as-product model, the terms are part of the product, not a conceptually separate bargain; physical product plus terms are a package deal. 	For example the fact that a chip inside an electronics item will wear out after a year is an unseen contract creating a take-it-or-leave-it choice not to 	buy the package.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The product-as-consent model defies traditional ideas of consent and raises questions of whether consent is meaningful. Modern day e-contracts such as 	click wrap, shrink wrap, viral contracts and machine-made contracts which form the privacy policy of several apps have a product-as-consent approach where 	consumers are given the take-it-or-leave-it option.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mobile application privacy notices fall into the product-as-consent model. Consumers often have to click "I agree" to all the innumerable Terms and 	Conditions in order to install the app. For instance terms that the fitness app will collect biometric data is a feature of the product that is 	non-negotiable. It is a classic take-it-or-leave-it approach where consumers compromise on privacy to avail services.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Contracts that facilitate these transactions are generally long and complicated and often agreed to by consumers without reading them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Craswell strikes a balance in applying the liability rule to point out that as explaining the meaning of extensive fine print would be very costly to point 	out it could be efficient to affix the liability rule not as a written contract but rather on "reasonable" terms. This means that if a fitness app collects 	sensitive financial information, which is unreasonable given its core activities, then even if the user has consented to the same in the privacy policy's 	fine print the contract should be capable of being challenged.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;h2&gt;The Concept of Privacy by Design&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Privacy needs to be considered from the very beginning of system development. For this reason, Dr. Anne Cavoukian	&lt;a href="#_ftn30" name="_ftnref30"&gt;[30]&lt;/a&gt; coined the term "Privacy by Design", that is, privacy should be taken into account throughout the 	entire engineering process from the earliest design stages to the operation of the productive system. This holistic approach is promising, but it does not 	come with mechanisms to integrate privacy in the development processes of a system. The privacy-by-design approach, i.e. that data protection safeguards 	should be built into products and services from the earliest stage of development, has been addressed by the European Commission in their proposal for a 	General Data Protection Regulation. This proposal uses the terms "privacy by design" and "data protection by design" synonymously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The 7 Foundational Principles&lt;a href="#_ftn31" name="_ftnref31"&gt;[31]&lt;/a&gt; of Privacy by Design are:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Proactive not Reactive; Preventative not Remedial&lt;/li&gt;
&lt;li&gt;Privacy as the Default Setting&lt;/li&gt;
&lt;li&gt;Privacy Embedded into Design&lt;/li&gt;
&lt;li&gt;Full Functionality - Positive-Sum, not Zero-Sum&lt;/li&gt;
&lt;li&gt;End-to-End Security - Full Lifecycle Protection&lt;/li&gt;
&lt;li&gt;Visibility and Transparency - Keep it Open&lt;/li&gt;
&lt;li&gt;Respect for User Privacy - Keep it User-Centric&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;Several terms have been introduced to describe types of data that need to be protected. A term very prominently used by industry is "personally 	identifiable information (PII)", i.e., data that can be related to an individual. Similarly, the European data protection framework centres on "personal 	data". However, some authors argue that this falls short since also data that is not related to a single individual might still have an impact on the 	privacy of groups, e.g., an entire group might be discriminated with the help of certain information. For data of this category the term "privacy-relevant 	data" has been used. &lt;a href="#_ftn32" name="_ftnref32"&gt;[32]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;An essential part of Privacy by Design is that data subjects should be adequately informed whenever personal data is processed. Whenever data subjects use 	a system, they should be informed about which information is processed, for what purpose, by which means and who it is shared is with. They should be 	informed about their data access rights and how to exercise them.&lt;a href="#_ftn33" name="_ftnref33"&gt;[33]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Whereas system design very often does not or barely consider the end-users' interests, but primarily focuses on owners and operators of the system, it is 	essential to account the privacy and security interests of all parties involved by informing them about associated advantages (e.g. security gains) and 	disadvantages (e.g. costs, use of resources, less personalisation). By creating this system of "multilateral security" the demands of all parties must be 	realized.&lt;a href="#_ftn34" name="_ftnref34"&gt;[34]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The Concept of Data Minimization&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The most basic privacy design strategy is MINIMISE, which states that the amount of personal data that is processed should be restricted to the minimal 	amount possible. By ensuring that no, or no unnecessary, data is collected, the possible privacy impact of a system is limited. Applying the MINIMISE 	strategy means one has to answer whether the processing of personal data is proportional (with respect to the purpose) and whether no other, less invasive, 	means exist to achieve the same purpose. The decision to collect personal data can be made at design time and at run time, and can take various forms. For 	example, one can decide not to collect any information about a particular data subject at all. Alternatively, one can decide to collect only a limited set 	of attributes.&lt;a href="#_ftn35" name="_ftnref35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;If a company collects and retains large amounts of data, there is an increased risk that the data will be used in a way that departs from consumers' 	reasonable expectations.&lt;a href="#_ftn36" name="_ftnref36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are three privacy protection goals&lt;a href="#_ftn37" name="_ftnref37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; that data minimization and privacy by 	design seek to achieve. These privacy protection goals are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Unlinkability - To prevent data from being linked to an identifiable entity.&lt;/li&gt;
&lt;li&gt;Transparency - The information has to be available before, during and after the processing takes place.&lt;/li&gt;
&lt;li&gt;Intervenability - Those who provide their data must have means of intervening in all ongoing or planned privacy-relevant data processing.&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Spiekermann and Cranor raised an intriguing point in their paper, they argued that those companies that employ privacy by design and data minimization practices in their applications should be allowed to skip the need for privacy policies and forgo need for notice and choice features.	&lt;a href="#_ftn38" name="_ftnref38"&gt;&lt;sup&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt;&lt;b&gt;&lt;span&gt; &lt;/span&gt;&lt;/b&gt;&lt;/p&gt;
&lt;table style="text-align: justify; "&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;
&lt;div&gt;
&lt;p&gt;&lt;b&gt;To Summarise: &lt;i&gt;The emerging model and legal dialogue that regulates online privacy is that of Notice and Choice, which has been severely criticised for not creating informed choice-making processes. E-contracts such as agreements to privacy notices follow the contract-as-product model. When there is extensive fine print, liability must be affixed on the basis of reasonable terms. Privacy notices must incorporate the concepts of Privacy by Design by providing complete information and collecting minimum data.&lt;/i&gt;&lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;h2 style="text-align: justify; "&gt;Features of Privacy Notices in the Current Mobile Ecosystem&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;A privacy notice inform a system's users or a company's customers of data practices involving personal information. Internal practices with regard to the 	collection, processing, retention, and sharing of personal information should be made transparent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Each app a user chooses to install on his smartphone can access different information stored on that device. There is no automatic access to user 	information. Each application has access only to the data that it pulls into its own 'sandbox'.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The sandbox is a set of fine-grained controls limiting an application's access to files, preferences, network resources, hardware etc. Applications cannot 	access each other's sandboxes.&lt;a href="#_ftn39" name="_ftnref39"&gt;[39]&lt;/a&gt; The data that makes it into the sandbox is normally defined by user permissions.&lt;a href="#_ftn40" name="_ftnref40"&gt;[40]&lt;/a&gt; These are a set of user defined controls&lt;a href="#_ftn41" name="_ftnref41"&gt;[41]&lt;/a&gt;and evidence that a user consents to the application accessing that data.	&lt;a href="#_ftn42" name="_ftnref42"&gt;[42]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;To gain permission mobile apps generally display privacy notices that explicitly seek consent. These can leverage different channels, including a privacy 	policy document posted on a website or linked to from mobile app stores or mobile apps. For example, Google Maps uses a traditional clickwrap structure that requires the user to agree to a list of terms and conditions when the program is initially launched.	&lt;a href="#_ftn43" name="_ftnref43"&gt;[43]&lt;/a&gt; Foursquare, on the other hand, embeds its terms in a privacy policy posted on its website, and not 	within the app. &lt;a href="#_ftn44" name="_ftnref44"&gt;[44]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This section explains the features of current privacy notices on the 4 parameters of stage (at which the notice is given), content, length and user 	comprehension. Under each of these parameters the associated problems are identified and alternatives are suggested.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;(1) &lt;/b&gt; &lt;b&gt;Timing and Frequency of Notice: &lt;br /&gt;&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt; This sub-section identifies the various stages that notices are given and highlights their advantages, disadvantages and makes recommendations. It 		concludes with the findings of a study on what the ideal stage to provide notice is. This is supplemented with 2 critical models to address the common 		problems of habituation and contextualization. &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; Studies indicate that timing of notices or the stage at which they are given impact how consumer's recall and comprehend them and make choices 		accordingly. &lt;/b&gt; &lt;a href="#_ftn45" name="_ftnref45"&gt;[45]&lt;/a&gt; &lt;b&gt; I&lt;/b&gt; ntroducing only a 15-second delay between the presentation of privacy notices and privacy relevant choices can be enough to render notices ineffective at 	driving user behaviour.&lt;a href="#_ftn46" name="_ftnref46"&gt;[46]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Google Android and Apple iOS provide notices at different times. At the time of writing, Android users are shown a list of requested permissions while the 	app is being installed, i.e., after the user has chosen to install the app. In contrast, iOS shows a dialog during app use, the first time a permission is 	requested by an app. This is also referred to as a "just-in-time" notification. &lt;a href="#_ftn47" name="_ftnref47"&gt;[47]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The following are the stages in which a notice can be given:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;1) NOTICE AT SETUP: Notice can be provided when a system is used for the first time&lt;a href="#_ftn48" name="_ftnref48"&gt;[48]&lt;/a&gt;. For instance, as 	part of a software installation process users are shown and have to accept the system's terms of use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) &lt;span&gt;Advantages&lt;/span&gt;: Users can inspect a system's data practices before using or purchasing it. The system developer is benefitted due to liability and 	transparency reasons that gain user trust. It provides the opportunity to explain unexpected data practices that may have a benign purpose in the context 	of the system&lt;a href="#_ftn49" name="_ftnref49"&gt;[49]&lt;/a&gt;. It can even impact purchase decisions. Egelman et al. found that participants were more 	likely to pay a premium at a privacy-protective website when they saw privacy information in search results, as opposed to on the website after selecting a 	search result&lt;a href="#_ftn50" name="_ftnref50"&gt;[50]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Users have become largely habituated to install time notices and ignore them&lt;a href="#_ftn51" name="_ftnref51"&gt;[51]&lt;/a&gt;. Users 	may have difficulty making informed decisions because they have not used the system yet and cannot fully assess its utility or weigh privacy trade-offs. They may also be focused on the primary task, namely completing the setup process to be able to use the system, and fail to pay attention to notices	&lt;a href="#_ftn52" name="_ftnref52"&gt;[52]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Privacy notices provided at setup time should be concise and focus on data practices immediately relevant to the primary user rather 	than presenting extensive terms of service. Integrating privacy information into other materials that explain the functionality of the system may further 	increase the chance that users do not ignore it.&lt;a href="#_ftn53" name="_ftnref53"&gt;[53]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;2) JUST IN TIME NOTICE: A privacy notice can be shown when a data practice is active, for example when information is being collected, used, or shared. 	Such notices are referred to as "contextualized" or "just-in-time" notices&lt;a href="#_ftn54" name="_ftnref54"&gt;[54]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: They enhance transparency and enable users to make privacy decisions in context. Users have also been shown to more freely share information 	if they are given relevant explanations at the time of data collection&lt;a href="#_ftn55" name="_ftnref55"&gt;[55]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Habituation can occur if these are shown too frequently. Moreover in apps such as gaming apps users generally tend to ignore notices 	displayed during usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Consumers can be given notice the first time a particular type of information is accessed such as email and then be given the option to 	opt out of further notifications. A Consumer may then seek to opt out of notices on email but choose to view all notices on health information that is 	accessed depending on his privacy priorities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;3) CONTEXT-DEPENDENT NOTICES: The user's and system's context can also be considered to show additional notices or controls if deemed necessary	&lt;a href="#_ftn56" name="_ftnref56"&gt;[56]&lt;/a&gt;. Relevant context may be determined by a change of location, additional users included in or receiving 	the data, and other situational parameters. Some locations may be particularly sensitive, therefore users may appreciate being reminded that they are 	sharing their location when they are in a new place, or when they are sharing other information that may be sensitive in a specific context. Facebook introduced a privacy checkup message in 2014 that is displayed under certain conditions before posting publicly. It acts as a "nudge"	&lt;a href="#_ftn57" name="_ftnref57"&gt;[57]&lt;/a&gt; to make users aware that the post will be public and to help them manage who can see their posts.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: It may help users make privacy decisions that are more aligned with their desired level of privacy in the respective situation and thus 	foster trust in the system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Challenges in providing context-dependent notices are detecting relevant situations and context changes. Furthermore, determining whether a context is relevant to an individual's privacy concerns could in itself require access to that person's sensitive data and privacy preferences.	&lt;a href="#_ftn58" name="_ftnref58"&gt;[58]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Standards must be evolved to determine a contextual model based on user preferences.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;4) PERIODIC NOTICES: These are shown the first couple of times a data practice occurs, or every time. The sensitivity of the data practice may determine 	the appropriate frequency.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: It can further help users maintain awareness of privacy-sensitive information flows especially when data practices are largely invisible	&lt;a href="#_ftn59" name="_ftnref59"&gt;[59]&lt;/a&gt;such as in patient monitoring apps. This helps provide better control options.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: Repeating notices can lead to notice fatigue and habituation&lt;a href="#_ftn60" name="_ftnref60"&gt;[60]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Frequency of these notices needs to be balanced with user needs. &lt;a href="#_ftn61" name="_ftnref61"&gt;[61]&lt;/a&gt; Data practices 	that are reasonably expected as part of the system may require only a single notice, whereas practices falling outside the expected context of use which 	the user is potentially unaware of may warrant repeated notices. Periodic notices should be relevant to users in order to be not perceived as annoying. A combined notice can remind about multiple ongoing data practices. Rotating warnings or changing their look can also further reduce habituation effects	&lt;a href="#_ftn62" name="_ftnref62"&gt;[62]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;5) PERSISTENT NOTICES: A persistent indicator is typically non-blocking and may be shown whenever a data practices is active, for instance when information 	is being collected continuously or when information is being transmitted&lt;a href="#_ftn63" name="_ftnref63"&gt;[63]&lt;/a&gt;. When inactive or not shown, 	persistent notices also indicate that the respective data practice is currently not active. For instance, Android and iOS display a small icon in the 	status bar whenever an application accesses the user's location.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: These are easy to understand and not annoying increasing their functionality.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: These ambient indicators often go unnoticed.&lt;a href="#_ftn64" name="_ftnref64"&gt;[64]&lt;/a&gt; Most systems can only accommodate such 	indicators for a small number of data practices.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Persistent indicators should be designed to be noticeable when they are active. A system should only provide a small set of persistent 	indicators to indicate activity of especially critical data practices which the user can also specify.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;6) NOTICE ON DEMAND: Users may also actively seek privacy information and request a privacy notice. A typical example is posting a privacy policy at a persistent location&lt;a href="#_ftn65" name="_ftnref65"&gt;[65]&lt;/a&gt; and providing links to it from the app.	&lt;a href="#_ftn66" name="_ftnref66"&gt;[66]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) Advantages: Privacy sensitive users are given the option to better explore policies and make informed decisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) Disadvantages: The current model of a link to a long privacy policy on a website will discourage users from requesting for information that they cannot 	fully understand and do not have time to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Recommendations: Better option are privacy settings interfaces or privacy dashboards within the system that provide information about data practices; 	controls to manage consent; summary reports of what information has been collected, used, and shared by the system; as well as options to manage or delete 	collected information. Contact information for a privacy office should be provided to enable users to make written requests.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Which of these Stages is the Most Ideal?&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In a series of experiments, Rebecca Balekabo and others &lt;a href="#_ftn67" name="_ftnref67"&gt;[67]&lt;/a&gt; have identified the impact of timing on 	smartphone privacy notices. The following 5 conditions were imposed on participants who were later tested on their levels of recall of the notices through 	questions:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt; Not Shown: The participants installed and used the app without being shown a privacy notice&lt;/li&gt;
&lt;li&gt;App Store: Notice was shown at the time of installation at the app store&lt;/li&gt;
&lt;li&gt;App store Big: A large notice occupying more screen space was shown at the app store&lt;/li&gt;
&lt;li&gt;App Store Popup: A smaller popup was displayed at the app Store&lt;/li&gt;
&lt;li&gt;During use: Notice was shown during usage of the app&lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;The results (Figure) suggest that even if a notice contains information users care about, it is unlikely to be recalled if only shown in the app store and 	more effective when shown during app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Seeing the app notice during app usage resulted in better recall. Although participants remembered the notice shown after app use as well as in other 	points of app use, they found that it was not a good point for them to make decisions about the app because they had already used it, and participants 	preferred when the notice was shown during or before app usage.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Hence depending on the app there are optimal times to show smartphone privacy notices to maximize attention and recall with preference being given to the 	beginning of or during app use.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However several of these stages as outlined baove face the disadvantages of habituation and uncertainty on contextualization. The following 2 models have 	been proposed to address this:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;h2&gt;Habituation&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;When notices are shown too frequently, users may become habituated. Habituation may lead to users disregarding warnings, often without reading or 	comprehending the notice&lt;a href="#_ftn68" name="_ftnref68"&gt;[68]&lt;/a&gt;. To reduce habituation from app permission notices, Felt et al. identified a 	tested method to determine which permission requests should be emphasized &lt;a href="#_ftn69" name="_ftnref69"&gt;[69]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;They categorized actions on the basis of revertibility, severability, initiation, alterable and approval nature (Explained in figure) and applied the 	following permission granting mechanisms :&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt; Automatic Grant: It must be requested by the developer, but it is granted without user involvement.&lt;/li&gt;
&lt;li&gt;Trusted UI elements: They appear as part of an application's workflow, but clicking on them imbues the application with a new permission. To ensure 	that applications cannot trick users, trusted UI elements can be controlled only by the platform. For example, a user who is sending an SMS message from a 	third-party application will ultimately need to press a button; using trusted UI means the platform provides the button.&lt;/li&gt;
&lt;li&gt;Confirmation Dialog: Runtime consent dialogs interrupt the user's flow by prompting them to allow or deny a permission and often contain 	descriptions of the risk or an option to remember the decision.&lt;/li&gt;
&lt;li&gt;Install-time warning: These integrate permission granting into the installation flow. Installation screens list the application's requested 	permissions. In some platforms (e.g., Facebook), the user can reject some install-time permissions. In other platforms (e.g., Android and Windows 8 Metro), 	the user must approve all requested permissions or abort installation.&lt;a href="#_ftn70" name="_ftnref70"&gt;[70]&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
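&lt;p style="text-align: justify; "&gt;Read as a decision procedure, the four mechanisms above could be selected from an action's properties roughly as follows; this is an illustrative sketch, not Felt et al.'s exact algorithm:&lt;/p&gt;

```python
# Illustrative sketch: mapping an action's properties to one of the four
# permission-granting mechanisms listed above.

def choose_mechanism(revertible, severe, user_initiated):
    if revertible and not severe:
        return "automatic grant"      # low risk: no user involvement needed
    if user_initiated:
        return "trusted UI element"   # permission flows from the user's click
    if severe:
        return "confirmation dialog"  # interrupt and ask at runtime
    return "install-time warning"     # fall back to the installation screen

assert choose_mechanism(True, False, False) == "automatic grant"
assert choose_mechanism(False, True, True) == "trusted UI element"
assert choose_mechanism(False, True, False) == "confirmation dialog"
assert choose_mechanism(False, False, False) == "install-time warning"
```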
&lt;p style="text-align: justify; "&gt;Based on these conditions the following sequential model that the system must adopt was proposed to determine frequency of displaying notices:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/SequentialModel.png/@@images/6a94f50d-4bd0-4566-bc30-32d5ef3f53d3.png" alt="Sequential Model" class="image-inline" title="Sequential Model" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Initial tests have proven to be successful in reducing habituation effects and it is an important step towards designing and displaying privacy notices.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Contextualization&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Bastian Koning and others, in their paper "Towards Context Adaptive Privacy Decisions in Ubiquitous Computing"	&lt;b&gt; &lt;a href="#_ftn71" name="_ftnref71"&gt;&lt;b&gt;[71]&lt;/b&gt;&lt;/a&gt;&lt;/b&gt; propose a system for supporting a user's privacy decisions in situ, 	i.e., in the context they are required in, following the notion of contextual integrity. It approximates the user's privacy preferences and adapts them to 	the current context. The system can then either recommend sharing decisions and actions or autonomously reconfigure privacy settings. It is divided into 	the following stages:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/PrivacyDecisionProcess.png/@@images/4dd72aef-1bb1-42d9-ae59-9592b2a36b9f.png" alt="Privacy Decision Process" class="image-inline" title="Privacy Decision Process" /&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Context Model:&lt;/b&gt; A distinction is created between the decision level and system level. The system level enables context awareness but also filters context information and 	maps it to semantic concepts required for decisions. Semantic mappings can be derived from a pre-defined or learnt world model. On the decision level, the 	context model only contains components relevant for privacy decision making. For example: An activity involves the user, is assigned a type, i.e., a 	semantic label, such as home or work, based on system level input.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Privacy Decision Engine&lt;/b&gt; : The context model allows to reason about which context items are affected by a context transition. When a transition occurs, the privacy decision engine 	(PDE) evaluates which protection worthy context items are affected. Protection worthiness (or privacy relevance) of context items for a given context are 	determined by the user's privacy preferences that are This serves as a basis for adapting privacy preferences and is subsequently further adjusted to the 	user by learning from the user's explicit decisions, behaviour, and reaction to system actions. &lt;a href="#_ftn72" name="_ftnref72"&gt;[72]&lt;/a&gt; approximated by the system from the knowledge base.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The user's personality type is determined before initial system use&lt;/i&gt; to select a basic privacy profile.&lt;i&gt; &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It may also be possible that the privacy preference cannot be realized in the current context. In that case, the privacy policy would suggest terminating 	the activity. For each privacy policy variant a confidence score is calculated based on how well it fits the adapted privacy preference. Based on the 	confidence scores, the PDE selects the most appropriate policy candidate or triggers user involvement if the confidence is below a certain threshold 	determined by the user's personality and previous privacy decisions.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Realization and Enforcement:&lt;/b&gt; The selected privacy policy must be realized on the system level. This is by combining territorial privacy and information privacy aspects. The private 	territory is defined by a territorial privacy boundary that separates desired and undesired entities.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Granularity adjustments for specific Information items is defined. For example, instead of the user's exact position only the street address or city can be 	provided.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ADVANTAGES: The personalization to a specific user has the advantage of better emulating that user's privacy decision process. It also helps to decide when 	to involve the user in the decision process by providing recommendations only and when privacy decisions can be realized autonomously.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;DISADVANTAGES: The entire model hinges on the ability of the system to accurately determine user profile before the user starts using it and not after, 	when preferences can be more accurately determined. There is no provision for the user to pick his own privacy profile, it is all system determined taking 	away an element of consent in the very beginning. As all further preferences are adapted on this base, it is possible that the system may not deliver. The 	use of confident scores is an approximation that can compromise privacy by a small numerical margin of difference.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However it is a useful insight on techniques of contextualization. Depending on the environment, different strategies for policy realization and varying 	degrees of enforcement are possible&lt;a href="#_ftn73" name="_ftnref73"&gt;[73]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Length&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The length of privacy policies is often cited as one reason they are so commonly ignored. Studies show privacy policies are hard to read, read 	infrequently, and do not support rational decision making. &lt;a href="#_ftn74" name="_ftnref74"&gt;[74]&lt;/a&gt; Aleecia M. McDonald and Lorrie Faith Cranor 	in their seminal study, "The Cost of Reading Privacy Policies" estimated that the the average length of privacy policies is 2,500 words. Using the reading 	speed of 250 words per minute which is typical for those who have completed secondary education, the average policy would take 10 minutes to read.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers also investigated how quickly people could read privacy policies when they were just skimming it for pertinent details. They timed 93 	people as they skimmed a 934-word privacy policy and answered multiple choice questions on its content.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Though some people took under a minute and others up to 42 minutes, the bulk of the subjects of the research took between three and six minutes to skim the 	policy, which itself was just over a third of the size of the average policy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The researchers used their data to estimate how much it costs to read the privacy policy of every site they visit once a year if their time was charged for 	and arrived at a mind boggling figure of $652 billion.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;img src="http://editors.cis-india.org/home-images/ProbabilityDensityFunction.png" alt="Probability Density Function" class="image-inline" title="Probability Density Function" /&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Though the figure of $652 billion has limited usefulness, because people rarely read whole policies and cannot charge anyone for the time it takes to do 	this, the researchers concluded that readers who do conduct a cost-benefit analysis might decide not to read any policies.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;"Preliminary work from a small pilot study in our laboratory revealed that some Internet users believe their only serious risk online is they may lose up 	to $50 if their credit card information is stolen. For people who think that is their primary risk, our point estimates show the value of their time to 	read policies far exceeds this risk. Even for our lower bound estimates of the value of time, it is not worth reading privacy policies though it may be 	worth skimming them," said the research. This implies that seeing their only risk as credit card fraud suggests Internet users likely do not understand the 	risks to their privacy. As an FTC report recently stated, "it is unclear whether consumers even understand that their information is being collected, 	aggregated, and used to deliver advertising."&lt;a href="#_ftn75" name="_ftnref75"&gt;[75]&lt;/a&gt;"&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;If the privacy community can find ways to reduce the time cost of reading policies, it may be easier to convince Internet users to do so. For example, if 	consumers can move from needing to read policies word-for-word and only skim policies by providing useful headings, or with ways to hide all but relevant information in a layered format and thus reduce the effective length of the policies, more people may be willing to read them.	&lt;a href="#_ftn76" name="_ftnref76"&gt;[76]&lt;/a&gt; Apps can also adopt short form notices that summarize and link to the larger more complete notice 	displayed elsewhere. These short form notices need not be legally binding and must candidate that it does not cover all types of data collection but only 	the most relevant ones. &lt;a href="#_ftn77" name="_ftnref77"&gt;[77]&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;Content&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;In an attempt to gain permission most privacy policies inform users about: (1) the type of information collected; and (2) the purpose for collecting that 	information.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Standard privacy notices generally cover the points of:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;b&gt;Methods Of Collection And Usage Of Personal Information&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;The Cookie Policy&lt;/b&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt; &lt;b&gt;Sharing Of Customer Information&lt;/b&gt; &lt;a href="#_ftn78" name="_ftnref78"&gt;&lt;b&gt;[78]&lt;/b&gt;&lt;/a&gt; &lt;/li&gt;
&lt;/ul&gt;
&lt;p style="text-align: justify; "&gt;Certified Information Privacy Professionals divide notices into the following sequential sections&lt;a href="#_ftn79" name="_ftnref79"&gt;[79]&lt;/a&gt;:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;i. &lt;b&gt;Policy Identification Details: D&lt;/b&gt;efines the policy name, version and description.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;ii. &lt;b&gt;P3P-Based Components: &lt;/b&gt;Defines policy attributes that would apply if the policy is exported to a P3P format.	&lt;a href="#_ftn80" name="_ftnref80"&gt;[80]&lt;/a&gt; Such attributes would include: policy URLs, organization information, P&lt;span&gt;II&lt;/span&gt; access and dispute 	resolution procedures.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;iii. &lt;b&gt;Policy Statements and Related Elements: Groups, Purposes and PII Types-&lt;/b&gt;Policy statements define the individuals able to access 	certain types of information, for certain pre-defined purposes.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Applications tend to define the type of data broadly in an attempt to strike a balance between providing enough information so that application may gain 	consent to access a user's data and being broad enough to avoid ruling out specific information.&lt;a href="#_ftn81" name="_ftnref81"&gt;[81]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This leads to usage of vague terms like "information collected &lt;i&gt;may &lt;/i&gt;include."&lt;a href="#_ftn82" name="_ftnref82"&gt;[82]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly the purpose of the data acquisition is also very broad. For example, a privacy policy may state that user data can be collected for anything 	related to ―"improving the content of the Service." As the scope of ―improving the content of the Service is never defined, any usage could 	conceivably fall within that category.&lt;a href="#_ftn83" name="_ftnref83"&gt;[83]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several apps create user social profiles based on their online preferences to promote targeted marketing which is cleverly concealed in phrases like "we may also draw upon this Personal Information in order to adapt the Services of our community to your needs".	&lt;a href="#_ftn84" name="_ftnref84"&gt;[84]&lt;/a&gt; For instance Bees &amp;amp; Pollen is a "predictive personalization" platform for games and apps that 	"uses advanced predictive algorithms to detect complex, non-trivial correlations between conversion patterns and users' DNA signatures, thus enabling it to 	automatically serve each user a personalized best-fit game options, in real-time." In reality it analyses over 100 user attributes, including activity on 	Facebook, spending behaviours, marital status, and location.&lt;a href="#_ftn85" name="_ftnref85"&gt;[85]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notices also often mislead consumers into believing that their information will not be shared with third parties using the terms "unaffiliated third 	parties." Other affiliated companies within the corporate structure of the service provider may have access to user's data for marketing and other 	purposes. &lt;a href="#_ftn86" name="_ftnref86"&gt;[86]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are very few choices to opt-out of certain practices, such as sharing data for marketing purposes. Thus, users are effectively left with a 	take-it-or-leave-it choice - give up your privacy or go elsewhere.&lt;a href="#_ftn87" name="_ftnref87"&gt;[87]&lt;/a&gt;Users almost always grant consent if 	it is required to receive the service they want which raises the query if this consent is meaningful&lt;a href="#_ftn88" name="_ftnref88"&gt;[88]&lt;/a&gt;.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The following recommendations have emerged:&lt;/p&gt;
&lt;ul style="text-align: justify; "&gt;
&lt;li&gt; &lt;b&gt;Notice&lt;/b&gt; - Companies should provide consumers with clear, conspicuous notices that accurately describe their information practices. &lt;/li&gt;
&lt;/ul&gt;
&lt;ul style="text-align: justify; " type="disc"&gt;
&lt;li&gt; &lt;b&gt;Consumer Choice&lt;/b&gt; - Companies should provide consumers with the opportunity to decide (in the form of opting out) whether the company may disclose personal information to unaffiliated third parties. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Access and Correction&lt;/b&gt; - Companies should provide consumers with the opportunity to access and correct personal information collected about the consumer. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Security&lt;/b&gt; - Companies must adopt reasonable security measures to protect the privacy of personal information. Possible measures include administrative, physical, and technical security. &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Enforcement&lt;/b&gt; - Companies should have systems through which they can enforce the privacy policy. This may be managed by the company or by an independent third party to ensure compliance. Examples of popular third parties include &lt;a href="https://www.cippguide.org/tag/bbbonline/"&gt;BBBOnLine&lt;/a&gt; and &lt;a href="https://www.cippguide.org/tag/truste/"&gt;TRUSTe&lt;/a&gt;.&lt;a href="#_ftn89" name="_ftnref89"&gt;[89]&lt;/a&gt; &lt;/li&gt;
&lt;li&gt; &lt;b&gt;Standardization&lt;/b&gt; - Several researchers and organizations have recommended a standardized privacy notice format that covers certain essential points.&lt;a href="#_ftn90" name="_ftnref90"&gt;[90]&lt;/a&gt; However, since displaying a privacy notice is itself voluntary, it is uncertain whether companies would willingly adopt a standardized model. Moreover, with the app market burgeoning with innovations, a standard format may not cover all emergent data practices. &lt;/li&gt;
&lt;/ul&gt;
&lt;h2 style="text-align: justify; "&gt;Comprehension&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;The FTC states that &lt;/b&gt;"the notice-and-choice model, as implemented, has led to long, incomprehensible privacy policies that consumers typically do not read, let alone understand. [T]he question is not whether consumers should be given a say over unexpected uses of their data; rather, the question is how to provide simplified notice and choice"&lt;a href="#_ftn91" name="_ftnref91"&gt;[91]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Notably, in a survey conducted by Zogby International, 93% of adults - and 81% of teens - indicated they would take more time to read terms and conditions 	for websites if they were written in clearer language.&lt;a href="#_ftn92" name="_ftnref92"&gt;[92]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Most privacy policies are in natural language format: companies explain their practices in prose. One noted disadvantage of current natural language policies is that companies can choose which information to present, which does not necessarily solve the problem of information asymmetry between companies and consumers. Further, companies use what have been termed "weasel words" - legalistic, ambiguous, or slanted phrases - to describe their practices&lt;a href="#_ftn93" name="_ftnref93"&gt;[93]&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A study by Aleecia M. McDonald and others&lt;a href="#_ftn94" name="_ftnref94"&gt;[94]&lt;/a&gt; found that the accuracy of users' comprehension spans a wide range. An average of 91% of participants answered correctly when asked about cookies, 61% answered correctly about opt-out links, 60% understood when their email address would be "shared" with a third party, and only 46% answered correctly regarding telemarketing. Participants found questions harder when policies substituted vague or complicated terms for practices, for example describing telemarketing as "the information you provide may be used for marketing services." Overall accuracy was a mere 33%.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Problems&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Natural language policies are often long and require college-level reading skills. Furthermore, there are no standards for which information is disclosed, 	no standard place to find particular information, and data practices are not described using consistent language. These policies are "long, complicated, 	and full of jargon and change frequently."&lt;a href="#_ftn95" name="_ftnref95"&gt;[95]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Kent Walker lists five problems that privacy notices typically suffer from:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) overkill - long and repetitive text in small print,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) irrelevance - describing situations of little concern to most consumers,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) opacity - broad terms that reflect the truth that it is impossible to track and control all the information collected and stored,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;d) non-comparability - the simplification required to achieve comparability compromises accuracy, and&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;e) inflexibility - failure to keep pace with new business models. &lt;a href="#_ftn96" name="_ftnref96"&gt;[96]&lt;/a&gt;&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Recommendations&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;Researchers advocate a more succinct and simpler standard for privacy notices,&lt;a name="_ftnref34"&gt;&lt;/a&gt;&lt;a href="#_ftn97" name="_ftnref97"&gt;[97]&lt;/a&gt; such as representing the information in the form of a table.&lt;a href="#_ftn98" name="_ftnref98"&gt;[98]&lt;/a&gt; However, studies show only an insignificant improvement in consumer understanding when privacy policies are presented in graphic formats like tables and labels.&lt;a href="#_ftn99" name="_ftnref99"&gt;[99]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;There are also recommendations to adopt a multi-layered approach in which the relevant information is summarized in a short notice.&lt;a href="#_ftn100" name="_ftnref100"&gt;[100]&lt;/a&gt; This is backed by studies showing that consumers find layered policies easier to understand.&lt;a href="#_ftn101" name="_ftnref101"&gt;[101]&lt;/a&gt; However, readers were less accurate with the layered format, especially on parts that were not summarized. This suggests that participants did not continue to the full policy when the information they sought was not available in the short notice. Unless it is possible to identify all of the topics users care about and summarize them in one page, the layered notice effectively hides information and reduces transparency. It has also been pointed out that it is impossible to convey complex data policies in simple and clear language.&lt;a href="#_ftn102" name="_ftnref102"&gt;[102]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Consumers often struggle to map concepts such as third-party access to the terms used in policies. This is partly because companies with identical practices often convey different information, and these differences are reflected in consumers' ability to understand the policies. Policies may therefore need an educational component so that readers understand what it means for a site to engage in a given practice&lt;a href="#_ftn103" name="_ftnref103"&gt;[103]&lt;/a&gt;. However, readers who do not take the time to read the policy itself are unlikely to read additional educational material.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;[1]&lt;/a&gt; Amber Sinha http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;[2]&lt;/a&gt; Wang, &lt;i&gt;et al.&lt;/i&gt;, 1998) Milberg, &lt;i&gt;et al.&lt;/i&gt; (1995)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;[3]&lt;/a&gt; See e.g., White House, Consumer Privacy Bill of Rights (2012) 			http://www.whitehouse.gov/the-pressoffice/2012/02/23/we-can-t-wait-obama-administration-unveils-blueprint-privacy-bill-rights; Fed. Trade Comm'n, 			Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Business and Policy Makers (2012) 			http://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commissionreport-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;[4]&lt;/a&gt; Fed. Trade Comm'n, Privacy Online: A Report to Congress 7 (June 1998), available at www.ftc.gov/reports/privacy3/priv-23a.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; &lt;a href="http://itlaw.wikia.com/wiki/U.S._Department_of_Commerce" title="U.S. Department of Commerce"&gt;U.S. Department of Commerce&lt;/a&gt; , &lt;a href="http://itlaw.wikia.com/wiki/Internet_Policy_Task_Force" title="Internet Policy Task Force"&gt;Internet Policy Task Force&lt;/a&gt;, 			&lt;a href="http://itlaw.wikia.com/wiki/Commercial_Data_Privacy_and_Innovation_in_the_Internet_Economy:_A_Dynamic_Policy_Framework" title="Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework"&gt; Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework &lt;/a&gt; 20 (Dec. 16, 2010) (&lt;a href="http://www.ntia.doc.gov/reports/2010/IPTF_Privacy_GreenPaper_12162010.pdf"&gt;full-text&lt;/a&gt;).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;[6]&lt;/a&gt; 389 U.S. 347 (1967).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;[7]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;[8]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;[9]&lt;/a&gt; Dow Chem. Co. v. United States, 476 U.S. 227, 241 (1986)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;[10]&lt;/a&gt; Kyllo, 533 U.S. at 34 (―[T]he technology enabling human flight has exposed to public view (and hence, we have said, to official observation) 			uncovered portions of the house and its curtilage that once were private.‖).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;[11]&lt;/a&gt; Kyllo v. United States, 533 U.S. 27&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;[12]&lt;/a&gt; See Katz, 389 U.S. at 352 (―But what he sought to exclude when he entered the booth was not the intruding eye-it was the uninvited ear. He 			did not shed his right to do so simply because he made his calls from a place where he might be seen.‖).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;[13]&lt;/a&gt; See United States v. Ahrndt, No. 08-468-KI, 2010 WL 3773994, at *4 (D. Or. Jan. 8, 2010).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;[14]&lt;/a&gt; In re DoubleClick Inc. Privacy Litig., 154 F. Supp. 2d 497 (S.D.N.Y. 2001).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;[15]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;[16]&lt;/a&gt; See Michael A. Carrier, Against Cyberproperty, 22 BERKELEY TECH. L.J. 1485, 1486 (2007) (arguing against creating a right to exclude users from 			making electronic contact to their network as one that exceeds traditional property notions).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;[17]&lt;/a&gt; See M. Ryan Calo, Against Notice Skepticism in Privacy (and Elsewhere), 87 NOTRE DAME L. REV. 1027, 1049 (2012) (citing Paula J. Dalley, The Use 			and Misuse of Disclosure as a Regulatory System, 34 FLA. ST. U. L. REV. 1089, 1093 (2007) ("[D]isclosure schemes comport with the prevailing 			political philosophy in that disclosure preserves individual choice while avoiding direct governmental interference.")).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;[18]&lt;/a&gt; See Calo, supra note 10, at 1048; see also Omri Ben-Shahar &amp;amp; Carl E. Schneider, The Failure of Mandated Disclosure, 159 U. PA. L. REV. 647, 682 			(noting that notice "looks cheap" and "looks easy").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;[19]&lt;/a&gt; Mark MacCarthy, New Directions in Privacy: Disclosure, Unfairness and Externalities, 6 I/S J. L. &amp;amp; POL'Y FOR INFO. SOC'Y 425, 440 (2011) 			(citing M. Ryan Calo, A Hybrid Conception of Privacy Harm Draft-Privacy Law Scholars Conference 2010, p. 28).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;[20]&lt;/a&gt; Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 HARV. L. REV. 1879, 1885 (2013) (citing Jon Leibowitz, Fed. 			Trade Comm'n, So Private, So Public: Individuals, the Internet &amp;amp; the Paradox of Behavioral Marketing, Remarks at the FTC Town Hall Meeting on 			Behavioral Advertising: Tracking, Targeting, &amp;amp; Technology (Nov. 1, 2007), available at 			http://www.ftc.gov/speeches/leibowitz/071031ehavior/pdf). Paul Ohm refers to these issues as "information-quality problems." See Paul Ohm, Branding 			Privacy, 97 MINN. L. REV. 907, 930 (2013). Daniel J. Solove refers to this as "the problem of the uninformed individual." See Solove, supra note 17&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;[21]&lt;/a&gt; See Edward J. Janger &amp;amp; Paul M. Schwartz, The Gramm-Leach-Bliley Act, Information Privacy, and the Limits of Default Rules, 86 MINN. L. REV. 			1219, 1230 (2002) (stating that according to one survey, "only 0.5% of banking customers had exercised their opt-out rights").&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;[22]&lt;/a&gt; See Amber Sinha A Critique of Consent in Information Privacy 			http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;[23]&lt;/a&gt; Leigh Shevchik, "Mobile App Industry to Reach Record Revenue in 2013," New Relic (blog), April 1, 2013, 			http://blog.newrelic.com/2013/04/01/mobile-apps-industry-to-reach-record-revenue-in-2013/.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;[24]&lt;/a&gt; Jan Lauren Boyles, Aaron Smith, and Mary Madden, "Privacy and Data Management on Mobile Devices," Pew Internet &amp;amp; American Life Project, 			Washington, DC, September 5, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;[25]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;[26]&lt;/a&gt; "Mobile Apps for Kids: Disclosures Still Not Making the Grade," Federal Trade Commission, Washington, DC, December 2012&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;[27]&lt;/a&gt; http://www.aarp.org/content/dam/aarp/research/public_policy_institute/cons_prot/2014/improving-mobile-device-privacy-disclosures-AARP-ppi-cons-prot.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;[28]&lt;/a&gt; Linda Ackerman, "Mobile Health and Fitness Applications and Information Privacy," Privacy Rights Clearinghouse, San Diego, CA, July 15, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;[29]&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999). 			&lt;a href="http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj"&gt; http://www.repository.law.indiana.edu/cgi/viewcontent.cgi?article=2199&amp;amp;context=ilj &lt;/a&gt; &lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;[30]&lt;/a&gt; William Aiello, Steven M. Bellovin, Matt Blaze, Ran Canetti, John Ioannidis, Angelos D. Keromytis, and Omer Reingold. Just fast keying: Key 			agreement in a hostile internet. ACM Trans. Inf. Syst. Secur., 7(2):242-273, 2004.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;[31]&lt;/a&gt; Privacy By Design The 7 Foundational Principles by Anne Cavoukian https://www.ipc.on.ca/images/resources/7foundationalprinciples.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;[32]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;[33]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;[34]&lt;/a&gt; G. Danezis, J. Domingo-Ferrer, M. Hansen, J.-H. Hoepman, D. Le M´etayer, R. Tirtea, and S. Schiffner. Privacy and Data Protection by Design - 			from policy to engineering. report, ENISA, Dec. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn35"&gt;
&lt;p&gt;&lt;a href="#_ftnref35" name="_ftn35"&gt;&lt;sup&gt;&lt;sup&gt;[35]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; John Frank Weaver, We Need to Pass Legislation on Artificial Intelligence Early and Often, SLATE FUTURE TENSE (Sept. 12, 			2014),http://www.slate.com/blogs/future_tense/2014/09/12/we_need_to_pass_artificial_intelligence_laws_early_and_often.html&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn36"&gt;
&lt;p&gt;&lt;a href="#_ftnref36" name="_ftn36"&gt;&lt;sup&gt;&lt;sup&gt;[36]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Margaret Jane Radin, Humans, Computers, and Binding Commitment, 75 IND. L.J. 1125, 1126 (1999).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn37"&gt;
&lt;p&gt;&lt;a href="#_ftnref37" name="_ftn37"&gt;&lt;sup&gt;&lt;sup&gt;[37]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; Richard Warner &amp;amp; Robert Sloan, Beyond Notice and Choice: Privacy, Norms, and Consent, J. High Tech. L. (2013). Available at: 			http://scholarship.kentlaw.iit.edu/fac_schol/568&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn38"&gt;
&lt;p&gt;&lt;a href="#_ftnref38" name="_ftn38"&gt;&lt;b&gt;&lt;sup&gt;&lt;b&gt;&lt;sup&gt;[38]&lt;/sup&gt;&lt;/b&gt;&lt;/sup&gt;&lt;/b&gt;&lt;/a&gt; &lt;a href="http://ssrn.com/abstract=1085333"&gt;&lt;b&gt;Engineering Privacy by Sarah Spiekermann, Lorrie Faith Cranor :: SSRN&lt;/b&gt;&lt;/a&gt; &lt;b&gt; &lt;/b&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn39"&gt;
&lt;p&gt;&lt;a href="#_ftnref39" name="_ftn39"&gt;[39]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ 			ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment /RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn40"&gt;
&lt;p&gt;&lt;a href="#_ftnref40" name="_ftn40"&gt;[40]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn41"&gt;
&lt;p&gt;&lt;a href="#_ftnref41" name="_ftn41"&gt;[41]&lt;/a&gt; iOS Application Programming Guide: The Application Runtime Environment, APPLE, http://developer.apple.com/library/ 			ios/#documentation/iphone/conceptual/iphoneosprogrammingguide/RuntimeEnvironment /RuntimeEnvironment.html (last updated Feb. 24, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn42"&gt;
&lt;p&gt;&lt;a href="#_ftnref42" name="_ftn42"&gt;[42]&lt;/a&gt; See Katherine Noyes, Why Android App Security is Better Than for the iPhone, PC WORLD BUS. CTR. (Aug. 6, 2010, 4:20 PM), 			http://www.pcworld.com/businesscenter/article/202758/why_android_app_security_is_be tter_than_for_the_iphone.html; see also About Permissions for 			Third-Party Applications, BLACKBERRY, http://docs.blackberry.com/en/smartphone_users/deliverables/22178/ 			About_permissions_for_third-party_apps_50_778147_11.jsp (last visited Sept. 29, 2011); Security and Permissions, supra note 76.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn43"&gt;
&lt;p&gt;&lt;a href="#_ftnref43" name="_ftn43"&gt;[43]&lt;/a&gt; Peter S. Vogel, A Worrisome Truth: Internet Privacy is Impossible, TECHNEWSWORLD (June 8, 2011, 5:00 AM), http://www.technewsworld.com/ 			story/72610.html.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn44"&gt;
&lt;p&gt;&lt;a href="#_ftnref44" name="_ftn44"&gt;[44]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn45"&gt;
&lt;p&gt;&lt;a href="#_ftnref45" name="_ftn45"&gt;[45]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing Notice: A Large-scale Experiment on the Timing of Software License 			Agreements. In Proc. of CHI. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn46"&gt;
&lt;p&gt;&lt;a href="#_ftnref46" name="_ftn46"&gt;[46]&lt;/a&gt; I. Adjerid, A. Acquisti, L. Brandimarte, and G. Loewenstein. Sleights of Privacy: Framing, Disclosures, and the Limits of Transparency. In Proc. of 			SOUPS. ACM, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn47"&gt;
&lt;p&gt;&lt;a href="#_ftnref47" name="_ftn47"&gt;[47]&lt;/a&gt; http://delivery.acm.org/10.1145/2810000/2808119/p63-balebako.pdf?ip=106.51.36.200&amp;amp;id=2808119&amp;amp;acc=OA&amp;amp;key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E35B5BCE80D07AAD9&amp;amp;CFID=801296199&amp;amp;CFTOKEN=33661544&amp;amp;__acm__=1466052980_2f265a2442ea3394aa1ebab7e6449933&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn48"&gt;
&lt;p&gt;&lt;a href="#_ftnref48" name="_ftn48"&gt;[48]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn49"&gt;
&lt;p&gt;&lt;a href="#_ftnref49" name="_ftn49"&gt;[49]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn50"&gt;
&lt;p&gt;&lt;a href="#_ftnref50" name="_ftn50"&gt;[50]&lt;/a&gt; S. Egelman, J. Tsai, L. F. Cranor, and A. Acquisti. Timing is everything?: the effects of timing and placement of online privacy indicators. In 			Proc. CHI '09. ACM, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn51"&gt;
&lt;p&gt;&lt;a href="#_ftnref51" name="_ftn51"&gt;[51]&lt;/a&gt; R. B¨ohme and S. K¨opsell. Trained to accept?: A field experiment on consent dialogs. In Proc. CHI '10. ACM, 2010&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn52"&gt;
&lt;p&gt;&lt;a href="#_ftnref52" name="_ftn52"&gt;[52]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn53"&gt;
&lt;p&gt;&lt;a href="#_ftnref53" name="_ftn53"&gt;[53]&lt;/a&gt; N. S. Good, J. Grossklags, D. K. Mulligan, and J. A. Konstan. Noticing notice: a large-scale experiment on the timing of software license 			agreements. In Proc. CHI '07. ACM, 2007.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn54"&gt;
&lt;p&gt;&lt;a href="#_ftnref54" name="_ftn54"&gt;[54]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn55"&gt;
&lt;p&gt;&lt;a href="#_ftnref55" name="_ftn55"&gt;[55]&lt;/a&gt; A. Kobsa and M. Teltzrow. Contextualized communication of privacy practices and personalization benefits: Impacts on users' data sharing and 			purchase behavior. In Proc. PETS '05. Springer, 2005.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn56"&gt;
&lt;p&gt;&lt;a href="#_ftnref56" name="_ftn56"&gt;[56]&lt;/a&gt; F. Schaub, B. K¨onings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE 			Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn57"&gt;
&lt;p&gt;&lt;a href="#_ftnref57" name="_ftn57"&gt;[57]&lt;/a&gt; E. Choe, J. Jung, B. Lee, and K. Fisher. Nudging people away from privacy-invasive mobile apps through visual framing. In Proc. INTERACT '13. 			Springer, 2013.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn58"&gt;
&lt;p&gt;&lt;a href="#_ftnref58" name="_ftn58"&gt;[58]&lt;/a&gt; F. Schaub, B. K¨onings, and M. Weber. Context-adaptive privacy: Leveraging context awareness to support privacy decision making. IEEE 			Pervasive Computing, 14(1):34-43, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn59"&gt;
&lt;p&gt;&lt;a href="#_ftnref59" name="_ftn59"&gt;[59]&lt;/a&gt; Article 29 Data Protection Working Party. Opinion 8/2014 on the Recent Developments on the Internet of Things. WP 223, Sept. 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn60"&gt;
&lt;p&gt;&lt;a href="#_ftnref60" name="_ftn60"&gt;[60]&lt;/a&gt; B. Anderson, A. Vance, B. Kirwan, E. D., and S. Howard. Users aren't (necessarily) lazy: Using NeuroIS to explain habituation to security warnings. 			In Proc. ICIS '14, 2014.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn61"&gt;
&lt;p&gt;&lt;a href="#_ftnref61" name="_ftn61"&gt;[61]&lt;/a&gt; B. Anderson, B. Kirwan, D. Eargle, S. Howard, and A. Vance. How polymorphic warnings reduce habituation in the brain - insights from an fMRI study. 			In Proc. CHI '15. ACM, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn62"&gt;
&lt;p&gt;&lt;a href="#_ftnref62" name="_ftn62"&gt;[62]&lt;/a&gt; M. S. Wogalter, V. C. Conzola, and T. L. Smith-Jackson. Research-based guidelines for warning design and evaluation. Applied Ergonomics, 16 USENIX 			Association 2015 Symposium on Usable Privacy and Security 17 33(3):219-230, 2002.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn63"&gt;
&lt;p&gt;&lt;a href="#_ftnref63" name="_ftn63"&gt;[63]&lt;/a&gt; L. F. Cranor, P. Guduru, and M. Arjula. User interfaces for privacy agents. ACM TOCHI, 13(2):135-178, 2006.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn64"&gt;
&lt;p&gt;&lt;a href="#_ftnref64" name="_ftn64"&gt;[64]&lt;/a&gt; R. S. Portnoff, L. N. Lee, S. Egelman, P. Mishra, D. Leung, and D. Wagner. Somebody's watching me? assessing the effectiveness of webcam indicator 			lights. In Proc. CHI '15, 2015&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn65"&gt;
&lt;p&gt;&lt;a href="#_ftnref65" name="_ftn65"&gt;[65]&lt;/a&gt; M. Langheinrich. Privacy by design - principles of privacy-aware ubiquitous systems. In Proc. UbiComp '01. Springer, 2001&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn66"&gt;
&lt;p&gt;&lt;a href="#_ftnref66" name="_ftn66"&gt;[66]&lt;/a&gt; Microsoft. Privacy Guidelines for Developing Software Products and Services. Technical Report version 3.1, 2008.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn67"&gt;
&lt;p&gt;&lt;a href="#_ftnref67" name="_ftn67"&gt;[67]&lt;/a&gt; The Impact of Timing on the Salience of Smartphone App Privacy Notices, Rebecca Balebako , Florian Schaub, Idris Adjerid , Alessandro Acquist 			,Lorrie Faith Cranor&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn68"&gt;
&lt;p&gt;&lt;a href="#_ftnref68" name="_ftn68"&gt;[68]&lt;/a&gt; R. Böhme and J. Grossklags. The Security Cost of Cheap User Interaction. In Workshop on New Security Paradigms, pages 67-82. ACM, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn69"&gt;
&lt;p&gt;&lt;a href="#_ftnref69" name="_ftn69"&gt;[69]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn70"&gt;
&lt;p&gt;&lt;a href="#_ftnref70" name="_ftn70"&gt;[70]&lt;/a&gt; A. Felt, S. Egelman, M. Finifter, D. Akhawe, and D. Wagner. How to Ask For Permission. HOTSEC 2012, 2012.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn71"&gt;
&lt;p&gt;&lt;a href="#_ftnref71" name="_ftn71"&gt;[71]&lt;/a&gt; Towards Context Adaptive Privacy Decisions in Ubiquitous Computing Florian Schaub∗ , Bastian Könings∗ , Michael Weber∗ , 			Frank Kargl† ∗ Institute of Media Informatics, Ulm University, Germany Email: { florian.schaub | bastian.koenings | michael.weber 			}@uni-ulm.d&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn72"&gt;
&lt;p&gt;&lt;a href="#_ftnref72" name="_ftn72"&gt;[72]&lt;/a&gt; M. Korzaan and N. Brooks, "Demystifying Personality and Privacy: An Empirical Investigation into Antecedents of Concerns for Information Privacy," 			Journal of Behavioral Studies in Business, pp. 1-17, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn73"&gt;
&lt;p&gt;&lt;a href="#_ftnref73" name="_ftn73"&gt;[73]&lt;/a&gt; B. Könings and F. Schaub, "Territorial Privacy in Ubiquitous Computing," in WONS'11. IEEE, 2011, pp. 104-108.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn74"&gt;
&lt;p&gt;&lt;a href="#_ftnref74" name="_ftn74"&gt;[74]&lt;/a&gt; The Cost of Reading Privacy Policies Aleecia M. McDonald and Lorrie Faith Cranor&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn75"&gt;
&lt;p&gt;&lt;a href="#_ftnref75" name="_ftn75"&gt;[75]&lt;/a&gt; 5 Federal Trade Commission, "Protecting Consumers in the Next Tech-ade: A Report by the Staff of the Federal Trade Commission," March 2008, 11, 			http://www.ftc.gov/os/2008/03/P064101tech.pdf.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn76"&gt;
&lt;p&gt;&lt;a href="#_ftnref76" name="_ftn76"&gt;[76]&lt;/a&gt; The Cost of Reading Privacy Policies Aleecia M. McDonald and Lorrie Faith Cranor&lt;/p&gt;
&lt;p&gt;I/S: A Journal of Law and Policy for the Information Society 2008 Privacy Year in Review issue http://www.is-journal.org/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn77"&gt;
&lt;p&gt;&lt;a href="#_ftnref77" name="_ftn77"&gt;[77]&lt;/a&gt; IS YOUR INSEAM YOUR BIOMETRIC? Evaluating the Understandability of Mobile Privacy Notice Categories Rebecca Balebako, Richard Shay, and Lorrie 			Faith Cranor July 17, 2013 https://www.cylab.cmu.edu/files/pdfs/tech_reports/CMUCyLab13011.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn78"&gt;
&lt;p&gt;&lt;a href="#_ftnref78" name="_ftn78"&gt;[78]&lt;/a&gt; https://www.sba.gov/blogs/7-considerations-crafting-online-privacy-policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn79"&gt;
&lt;p&gt;&lt;a href="#_ftnref79" name="_ftn79"&gt;[79]&lt;/a&gt; https://www.cippguide.org&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn80"&gt;
&lt;p&gt;&lt;a href="#_ftnref80" name="_ftn80"&gt;[80]&lt;/a&gt; The Platform for Privacy Preferences Project, more commonly known as P3P was designed by the World Wide Web Consortium aka W3C in response to the 			increased use of the Internet for sales transactions and subsequent collection of personal information. P3P is a special protocol that allows a 			website's policies to be machine readable, granting web users' greater control over the use and disclosure of their information while browsing the 			internet.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn81"&gt;
&lt;p&gt;&lt;a href="#_ftnref81" name="_ftn81"&gt;[81]&lt;/a&gt; Security and Permissions, ANDROID DEVELOPERS, http://developer.android.com/guide/topics/security/security.html (last updated Sept. 13, 2011).&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn82"&gt;
&lt;p&gt;&lt;a href="#_ftnref82" name="_ftn82"&gt;[82]&lt;/a&gt; See Foursqaure Privacy Policy&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn83"&gt;
&lt;p&gt;&lt;a href="#_ftnref83" name="_ftn83"&gt;[83]&lt;/a&gt; http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1600&amp;amp;context=iplj&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn84"&gt;
&lt;p&gt;&lt;a href="#_ftnref84" name="_ftn84"&gt;[84]&lt;/a&gt; Privacy Policy, FOURSQUARE, http://foursquare.com/legal/privacy (last updated Jan. 12, 2011)&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn85"&gt;
&lt;p&gt;&lt;a href="#_ftnref85" name="_ftn85"&gt;[85]&lt;/a&gt; Bees and Pollen, "Bees and Pollen Personalization Platform," http://www.beesandpollen.com/TheProduct. aspx; Bees and Pollen, "Sense6-Social Casino 			Games Personalization Solution," http://www.beesandpollen. com/sense6.aspx; Bees and Pollen, "About Us," http://www.beesandpollen.com/About.aspx.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn86"&gt;
&lt;p&gt;&lt;a href="#_ftnref86" name="_ftn86"&gt;[86]&lt;/a&gt; CFA on the NTIA Short Form Notice Code of Conduct to Promote Transparency in Mobile Applications July 26, 2013 | Press Release&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn87"&gt;
&lt;p&gt;&lt;a href="#_ftnref87" name="_ftn87"&gt;[87]&lt;/a&gt; P. M. Schwartz and D. Solove. Notice &amp;amp; Choice. In The Second NPLAN/BMSG Meeting on Digital Media and Marketing to Children, 2009.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn88"&gt;
&lt;p&gt;&lt;a href="#_ftnref88" name="_ftn88"&gt;[88]&lt;/a&gt; F. Cate. The Limits of Notice and Choice. IEEE Security Privacy, 8(2):59-62, Mar. 2010.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn89"&gt;
&lt;p&gt;&lt;a href="#_ftnref89" name="_ftn89"&gt;[89]&lt;/a&gt; https://www.cippguide.org/2011/08/09/components-of-a-privacy-policy/&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn90"&gt;
&lt;p&gt;&lt;a href="#_ftnref90" name="_ftn90"&gt;[90]&lt;/a&gt; https://www.ftc.gov/public-statements/2001/07/case-standardization-privacy-policy-formats&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn91"&gt;
&lt;p&gt;&lt;a href="#_ftnref91" name="_ftn91"&gt;[91]&lt;/a&gt; Protecting Consumer Privacy in an Era of Rapid Change. Preliminary FTC Staff Report.December 2010&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn92"&gt;
&lt;p&gt;&lt;a href="#_ftnref92" name="_ftn92"&gt;[92]&lt;/a&gt; . See Comment of Common Sense Media, cmt. #00457, at 1.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn93"&gt;
&lt;p&gt;&lt;a href="#_ftnref93" name="_ftn93"&gt;[93]&lt;/a&gt; Pollach, I. What's wrong with online privacy policies? Communications of the ACM 30, 5 (September 2007), 103-108&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn94"&gt;
&lt;p&gt;&lt;a href="#_ftnref94" name="_ftn94"&gt;[94]&lt;/a&gt; A Comparative Study of Online Privacy Policies and Formats Aleecia M. McDonald,1 Robert W. Reeder,2 Patrick Gage Kelley, 1 Lorrie Faith Cranor1 1 			Carnegie Mellon, Pittsburgh, PA 2 Microsoft, Redmond, WA&lt;/p&gt;
&lt;p&gt;http://lorrie.cranor.org/pubs/authors-version-PETS-formats.pdf&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn95"&gt;
&lt;p&gt;&lt;a href="#_ftnref95" name="_ftn95"&gt;[95]&lt;/a&gt; Amber Sinha Critique&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn96"&gt;
&lt;p&gt;&lt;a href="#_ftnref96" name="_ftn96"&gt;[96]&lt;/a&gt; Kent Walker, The Costs of Privacy, 2001 available at 			&lt;a href="https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy"&gt; https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn97"&gt;
&lt;p&gt;&lt;a href="#_ftnref97" name="_ftn97"&gt;[97]&lt;/a&gt; Annie I. Anton et al., Financial Privacy Policies and the Need for Standardization, 2004 available at			&lt;a href="https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf"&gt;https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf&lt;/a&gt;; Florian Schaub, R. 			Balebako et al, "A Design Space for effective privacy notices" available at 			&lt;a href="https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf"&gt; https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn98"&gt;
&lt;p&gt;&lt;a href="#_ftnref98" name="_ftn98"&gt;[98]&lt;/a&gt; Allen Levy and Manoj Hastak, Consumer Comprehension of Financial Privacy Notices, Interagency Notice Project, available at			&lt;a href="https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf"&gt;https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn99"&gt;
&lt;p&gt;&lt;a href="#_ftnref99" name="_ftn99"&gt;[99]&lt;/a&gt; Patrick Gage Kelly et al., Standardizing Privacy Notices: An Online Study of the Nutrition Label Approach available at 			&lt;a href="https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf"&gt; https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn100"&gt;
&lt;p&gt;&lt;a href="#_ftnref100" name="_ftn100"&gt;[100]&lt;/a&gt; The Center for Information Policy Leadership, Hunton &amp;amp; Williams LLP, "Ten Steps To Develop A Multi-Layered Privacy Notice" available at 			&lt;a href="https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf"&gt; https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn101"&gt;
&lt;p&gt;&lt;a href="#_ftnref101" name="_ftn101"&gt;[101]&lt;/a&gt; A Comparative Study of Online Privacy Policies and Formats Aleecia M. McDonald,1 Robert W. Reeder,2 Patrick Gage Kelley, 1 Lorrie Faith Cranor1 1 			Carnegie Mellon, Pittsburgh, PA 2 Microsoft, Redmond, WA&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn102"&gt;
&lt;p&gt;&lt;a href="#_ftnref102" name="_ftn102"&gt;[102]&lt;/a&gt; Howard Latin, "Good" Warnings, Bad Products, and Cognitive Limitations, 41 UCLA Law Review available at 			&lt;a href="https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5"&gt; https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&amp;amp;crawlid=1&amp;amp;srctype=smi&amp;amp;srcid=3B15&amp;amp;doctype=cite&amp;amp;docid=41+UCLA+L.+Rev.+1193&amp;amp;key=1c15e064a97759f3f03fb51db62a79a5 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn103"&gt;
&lt;p&gt;&lt;a href="#_ftnref103" name="_ftn103"&gt;[103]&lt;/a&gt; Report by Kleimann Communication Group for the FTC. Evolution of a prototype financial privacy notice, 2006. http://www.ftc.gov/privacy/ 			privacyinitiatives/ftcfinalreport060228.pdf Accessed 2 Mar 2007&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/enlarging-the-small-print'&gt;http://editors.cis-india.org/internet-governance/blog/enlarging-the-small-print&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Meera Manoj</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-12-14T16:27:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public">
    <title>En Inde, le biométrique version très grand public </title>
    <link>http://editors.cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public</link>
    <description>
        &lt;b&gt;Launched in 2010, Aadhaar is now the world's largest database of fingerprints and iris scans. An identity card intended for India's 1.25 billion people, it also serves as a means of payment. But the security of the system and its possible use for surveillance raise questions.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was &lt;a class="external-link" href="http://www.liberation.fr/futurs/2017/04/27/en-inde-le-biometrique-version-tres-grand-public_1565815"&gt;published by Liberation&lt;/a&gt; on April 27, 2017. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Le front barré d’un signe religieux hindou rouge, Vivek  Kumar se tient droit derrière le comptoir de son étroite papeterie  située dans une allée obscure d’un quartier populaire du sud-est de New  Delhi. Sous le regard bienveillant d’une idole de Ganesh - le dieu qui  efface les obstacles -, le commerçant à la fine moustache et à la  chemise bleu-gris au col Nehru réalise des photocopies, fournit des  tampons ou des stylos à des dizaines de chalands.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Gaurav, un vendeur de légumes de la halle d’à côté, entre  acheter du crédit de communication mobile. Au moment de payer, il sort  son portefeuille, mais pas pour chercher de la monnaie. Il y prend sa  carte d’identité Aadhaar et fournit ses douze chiffres au commerçant.  Qui les entre dans un smartphone, sélectionne la banque de Gaurav et  indique le montant de l’achat. Le client n’a plus qu’à poser son pouce  sur un lecteur biométrique relié au combiné, connecté à Internet. Une  lumière rouge s’allume et un son retentit : la transaction est bien  passée.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Depuis mars, 32 banques indiennes fournissent ce service  novateur de paiement par empreinte digitale. Appelé Aadhaar Pay, il  utilise les informations biométriques, à savoir les dix empreintes  digitales et celle de l’iris, recueillies par le gouvernement depuis  septembre 2010 pour créer la première carte d’identité du pays. Toute  personne résidant en Inde depuis plus de six mois, y compris les  étrangers, peut s’inscrire et l’obtenir gratuitement.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;«Renverser le système»&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;L’Aadhaar («la fondation» en hindi) représente aujourd’hui  la plus grande base de données biométriques au monde, avec 1,13 milliard  de personnes enregistrées sur 1,25 milliard, soit 99 % de la population  adulte indienne.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;L’objectif initial était double : identifier la population -  10% des Indiens n’avaient jusqu’ici aucun papier, et donc aucun droit -  et se servir de ces moyens biométriques pour sécuriser l’attribution de  nombreuses subventions alimentaires ou énergétiques, dont le  détournement coûte plusieurs milliards d’euros chaque année à l’Etat  fédéral.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A partir de 2014, la nouvelle majorité nationaliste hindoue  du BJP a étendu les usages de l’Aadhaar pour transformer cet outil de  reconnaissance en un vrai «passe-partout» de la vie quotidienne indienne  : depuis l’ouverture d’une ligne téléphonique à la déclaration de ses  impôts, en passant surtout par la création d’un compte en banque, le  numéro Aadhaar sera à présent requis. Dans ce dernier cas, l’Aadhaar  permet en prime d’utiliser le paiement bancaire par biométrie pour  réduire le recours au liquide, qui représente encore plus de 90 % des  transactions dans le pays.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Le Premier ministre, Narendra Modi, a fait de cette  inclusion financière l’un de ses principaux chevaux de bataille :  en 2014, son gouvernement a lancé un énorme programme qui a permis la  création de 213 millions de comptes bancaires en deux ans - aujourd’hui,  quasiment tous les foyers en possèdent au moins un. Il a continué dans  cette voie énergique en démonétisant, en novembre, les principales  coupures. But de la manœuvre : convaincre les Indiens de se défaire, au  moins temporairement, de leur dépendance aux billets marqués de la tête  de Gandhi.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;«Le liquide est gratuit, donc il est difficile de pousser les gens à utiliser d’autres moyens de paiement,&lt;/i&gt; explique Ragavan Venkatesan, responsable des paiements numériques à la  banque IDFC, pionnière dans l’utilisation de l’Aadhaar Pay. &lt;i&gt;Nous avons donc renversé le système pour que le commerçant soit incité à utiliser les moyens numériques.»&lt;/i&gt; L’établissement financier a d’abord développé le &lt;i&gt;«microdistributeur de billets»&lt;/i&gt; : une tablette que le vendeur peut utiliser pour créer des comptes,  recevoir des petits dépôts ou fournir du liquide aux clients au nom de  la banque, contre une commission. Comme l’Aadhaar Pay, cette tablette se  connecte au lecteur biométrique - fourni par l’entreprise française  Safran - pour l’identification et l’authentification. Dans les deux cas,  et à la différence des paiements par carte, ni le marchand ni le client  ne paient pour l’utilisation de ce réseau. &lt;i&gt;«Le mode traditionnel de paiement par carte va progressivement disparaître»,&lt;/i&gt; prédit Ragavan Venkatesan.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Défi&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Pour l’instant, le système n’en est toutefois qu’à ses  débuts. Environ 70 banques - une minorité du réseau indien - sont  reliées à l’Aadhaar Pay, et lors de nos visites dans différents magasins  de New Delhi, une transaction a été bloquée pendant dix minutes à cause  d’un problème de serveur. La connectivité est d’ailleurs un défi dans  un pays dont la population est en majorité rurale : le système nécessite  au minimum le réseau 2G, dont sont dépourvus environ 8 % des villages,  selon le ministère des Télécommunications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mais c’est la protection du système qui est surtout en question : &lt;i&gt;«La  biométrie réduit fortement le niveau de sécurité, car c’est facile de  voler ces données et de les utiliser sans votre accord,&lt;/i&gt; explique Sunil Abraham, directeur du Centre pour l’Internet et la société de Bangalore. &lt;i&gt;Il  existe maintenant des appareils photo de haute résolution qui  permettent de capturer et de répliquer les empreintes ou l’iris»&lt;/i&gt;, affirme ce spécialiste en cybersécurité.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Le problème tient au caractère irrévocable de ces données  biométriques. A la différence d’une carte bancaire qu’on peut annuler et  remplacer, on ne peut changer d’empreinte ou d’iris. L’Autorité  indienne d’identification unique (UIDAI), qui gère l’Aadhaar, prévoit  bien que l’on puisse bloquer l’utilisation de ses propres données  biométriques sur demande, ce qui offre une solution de sécurisation  temporaire. &lt;i&gt;«Si un fraudeur essaie de les utiliser, on peut le repérer&lt;/i&gt; [grâce au réseau internet, ndlr] &lt;i&gt;et l’arrêter»,&lt;/i&gt; défend Ragavan Venkatesan, de la banque IDFC.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Mais cela risque de ne pas suffire en cas de recel de ces  informations : la police vient d’interpeller un groupe de trafiquants  qui étaient en possession des données bancaires de 10 millions  d’Indiens, récupérées à travers des employés et sous-traitants, données  qu’ils revendaient par paquets. Une femme âgée s’était déjà fait dérober  146 000 roupies (un peu plus de 2 000 euros) à cause de cette fraude.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Outil idéal&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Le directeur de l’UIDAI assure qu’aucune fuite ni vol de  données n’ont été rapportés à ce jour depuis leurs serveurs - ce qui ne  garantit pas que cette confidentialité sera respectée par tous les  autres acteurs qui y ont accès. En février, un chercheur en  cybersécurité a alerté la police sur le fait que 500 000 numéros Aadhaar  ainsi que les détails personnels de leurs propriétaires - exclusivement  des mineurs - avaient été publiés en ligne. La loi sur l’Aadhaar punit  de trois ans de prison le vol ou le recel de ces données. Ce texte  adopté l’année dernière - soit six ans après le début de la collecte -  empêche également leur utilisation à d’autres fins que  l’authentification pour l’attribution de subventions et de services. Et  l’UIDAI ne peut y accéder pleinement qu’en cas de risque pour la  sécurité nationale, et selon une procédure spéciale.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Reste qu’il n’existe pas d’autorité, comme la Cnil en France&lt;i&gt;,&lt;/i&gt; chargée de veiller de manière indépendante à ce que ces lignes rouges  ne soient pas franchies par un Etat à la recherche de nouveaux moyens de  renseignement. Car les experts s’accordent sur ce point : le  biométrique est un outil idéal pour surveiller une population.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;En 2010, le gouvernement britannique avait d’ailleurs mis  fin à son projet de carte d’identité biométrique, estimant que le taux  d’erreurs dans l’authentification était trop élevé et le risque  d’atteinte aux libertés trop important. Les Indiens, souvent subjugués  par les nouvelles technologies pour résoudre leurs problèmes sociaux, ne  semblent pas prêts de revenir en arrière. Surtout si cela peut en plus  servir à mieux ficher un pays menacé par un terrorisme régional et  local.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public'&gt;http://editors.cis-india.org/internet-governance/news/en-inde-le-biometrique-version-tres-grand-public&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-05-03T16:27:23Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward">
    <title>Emerging Technologies: Issues &amp; Way Forward</title>
    <link>http://editors.cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward</link>
    <description>
        &lt;b&gt;Aayush Rathi and Gurshabad Grover attended a two-day conference on 'Emerging Technologies: Issues &amp; Way Forward', organised by the Technology Policy team at the National Institute of Public Finance and Policy (NIPFP) and held on 23rd and 24th May in Bangalore.&lt;/b&gt;
        &lt;p&gt;The themes for discussion included:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Privacy, surveillance and data protection&lt;/li&gt;
&lt;li&gt;Regulation of emerging technologies&lt;/li&gt;
&lt;li&gt;Building sound regulators for technology policy, and&lt;/li&gt;
&lt;li&gt;Fintech regulation&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/nipfp-bangalore-agenda"&gt;Click here&lt;/a&gt; to read the agenda&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward'&gt;http://editors.cis-india.org/internet-governance/news/emerging-technologies-issues-way-forward&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-05-26T00:39:11Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns">
    <title>Electoral Databases – Privacy and Security Concerns</title>
    <link>http://editors.cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns</link>
    <description>
        &lt;b&gt;In this blogpost, Snehashish Ghosh analyzes privacy and security concerns which have surfaced with the digitization, centralization and standardization of the electoral database and argues that even though the law provides the scope for protection of electoral databases, the State has not taken any steps to ensure its safety.&lt;/b&gt;
&lt;p style="text-align: justify; "&gt;The recent move by the Election Commission of India (ECI) to tie-up with Google for providing electoral look-up services for citizens and electoral information services has faced heavy criticism on the grounds of data security and privacy.&lt;a href="#_edn1" name="_ednref1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[i]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; After due consideration, the ECI has decided to drop the plan.&lt;a href="#_edn2" name="_ednref2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The plan to partner with Google has led to much apprehension regarding Google gaining access to the database of 790 million voters including, personal information such as age, place of birth and residence. It could have also gained access to cell phone numbers and email addresses had the voter chosen to enroll via the online portal on the ECI website.  Although, the plan has been cancelled, it does not necessarily mean that the largest database of citizens of India is safe from any kind of security breach or abuse. In fact, the personal information of each voter in a constituency can be accessed by anyone through the ECI website and the publication of electoral rolls is mandated by the law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Publication of Electoral Rolls&lt;/b&gt;&lt;br /&gt;The electoral roll essentially contains the name of the voter, name of the relationship (son of/wife of, etc.), age, sex, address and the photo identity card number. The main objective of creation and maintenance of electoral rolls and the issue of Electoral Photo Identity Card (EPIC) was to ensure a free and fair election where the voter would have been  able to cast his own vote as per his own choice. In other words, the main purpose of the exercise was to curtail bogus voting. This is achieved by cross referencing the EPIC with the electoral roll.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The process of creation and maintenance of electoral rolls is governed by the Registration of Electors Rules, 1960. Rule 22 requires the registration officer to publish the roll with list of amendments at his office for inspection and public information. Furthermore, ECI may direct the registration officer to send two copies of the electoral roll to every political party for which a symbol has exclusively been reserved by the ECI. It can be safely concluded that the electoral roll of a constituency is a public document&lt;a href="#_edn3" name="_ednref3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; given that the roll is published and can be circulated on the direction of the ECI.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With the computational turn, in 1998 the ECI took the decision to digitize the electoral databases. Furthermore, printed electoral rolls and compact discs containing the rolls are available for sale to general public.&lt;a href="#_edn4" name="_ednref4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iv]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; In addition to that, the electoral rolls for the entire country are available on the ECI website.&lt;a href="#_edn5" name="_ednref5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[v]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; However, the current database is not uniform and standardized, and entries in some constituencies are available only in the local language. The ECI has taken steps to make the database uniform, standardized and centralized.&lt;a href="#_edn6" name="_ednref6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Security Concerns&lt;/b&gt;&lt;br /&gt;The Registration of Electoral Rules, 1960 is an archaic piece of delegated legislation which is still in force and casts a statutory duty on the ECI to publish the electoral rolls. The publication of electoral rolls is not a threat to security when it is distributed in hard copies and the availability of electoral rolls is limited. The security risks emerge only after the digitization of electoral database, which allows for uniformity, standardization and centralization of the database which in turn makes it vulnerable and subject to abuse. The law has failed to evolve with the change in technology.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In a recent article, Bill Davidow analyzes "the dark side of Moore’s Law" and argues that with the growth processing power there has been a growth in surveillance capabilities and on this note the article is titled, “&lt;i&gt;With Great Computing Power Comes Great Surveillance”&lt;/i&gt;&lt;a href="#_edn7" name="_ednref7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Drawing from Davidow’s argument, with the exponential growth in computing power, search has become convenient, faster and cheap. A uniform, standardized and centralized database bearing the personal information of 790 million voters can be searched and categorized in accordance with the search terms. The personal information of the voters can be used for good, but it can be equally abused if it falls into the wrong hands. Big data analysis or the computing power makes it easier to target voters, as bits and pieces of personal information give a bigger picture of an individual, a community, etc. This can be considered intrusive on individual’s privacy since the personal information of every voter is made available in the public domain&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;For example, the availability of a centralized, searchable database of voters along with their age would allow the appropriate authorities to identify wards or constituencies, which has a high population of voters above the age of 65. This would help the authority to set up polling booths at closer location with special amenities. However, the same database can be used to search for density of members of a particular community in a ward or constituency based on the name, age, sex of the voters. This information can be used to disrupt elections, target vulnerable communities during an election and rig elections.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Current IT Laws does not mandate the protection of the electoral database&lt;/b&gt;&lt;br /&gt;A centralized electoral database of the entire country can be considered as a critical information infrastructure (CII) given the impact it may have on the election which is the cornerstone of any democracy. Under Section 70 of the Information Technology Act, 2000 (IT Act) CII means “the computer resource, incapacitation or destruction of which, shall have debilitating impact on national security, economy.”&lt;a href="#_edn8" name="_ednref8"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[viii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; However, the appropriate Government has not notified the electoral database as a protected system&lt;a href="#_edn9" name="_ednref9"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ix]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;. Therefore, information security practices and procedures for a protected system are not applicable to the electoral database.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Information Technology Rules (IT Rules) are also not applicable to electoral databases, &lt;i&gt;per se&lt;/i&gt;. Since, ECI is not a body corporate, the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information), Rules, 2011 (&lt;i&gt;hereinafter &lt;/i&gt;Reasonable Security Practices Rules) do not apply to electoral databases. Ignoring that Reasonable Security Practices Rules only apply to a body corporate, the electoral database does fall within the ambit of definition of “personal information”&lt;a href="#_edn10" name="_ednref10"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[x]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; and should arguably be made subject to the Rules.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The intent of the ECI for hosting the entire country’s electoral database online &lt;i&gt;inter alia&lt;/i&gt; is to provide electronic service delivery to the citizens. It seeks to provide “electoral look up services for citizens ... for better electoral information services.”&lt;a href="#_edn11" name="_ednref11"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[xi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; However, the Information Technology (Electronic Service Delivery) Rules, 2011 are not applicable to the electoral database given that it is not notified by the appropriate Government as a service to be delivered electronically. Hence, the encryption and security standards for electronic service delivery are not applicable to electoral rolls.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The IT Act and the IT Rules provide a reasonable scope for the appropriate Government to include electoral databases within the ambit of protected system and electronic service delivery. However, the appropriate government has not taken any steps to notify electoral database as protected system or a mode of electronic service delivery under the existing laws.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Conclusion&lt;/b&gt;&lt;br /&gt;Publication of electoral rolls is a necessary part of an election process. It ensures free and fair election and promotes transparency and accountability. But unfettered access to electronic electoral databases may have an adverse effect and would endanger the very goal it seeks to achieve because the electronic database may pose threat to privacy of the voters and also lead to security breach.  It may be argued that the ECI is mandated by the law to publish the electoral database and hence, it is beyond the operation of the IT Act. But Section 81 of the IT Act has an overriding effect on any law inconsistent, therewith. The appropriate Government should take necessary steps under the IT Act and notify electoral databases as a protected system.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is recommended that the Electors Registration Rules, 1960 should be amended, taking into account the advancement in technology. Therefore, the Rules should aim at restricting the unfettered electronic access to the electoral database and also introduce purposive limitation on the use of the electoral database. It should also be noted that more adequate and robust data protection and privacy laws should be put in place, which would regulate the collection, use, storage and processing of databases which are critical to national security.&lt;/p&gt;
&lt;div&gt;
&lt;hr align="left" size="1" width="100%" /&gt;
&lt;div id="edn1"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref1" name="_edn1"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[i]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Pratap Vikram Singh, Post-uproar, EC’s Google tie-up plan may go for a toss, Governance Now, January 7, 2014 available at &lt;a class="external-link" href="http://www.governancenow.com/news/regular-story/post-uproar-ecs-google-tie-plan-may-go-toss"&gt;http://www.governancenow.com/news/regular-story/post-uproar-ecs-google-tie-plan-may-go-toss&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn2"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref2" name="_edn2"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Press Note No.ECI/PN/1/2014, Election Commission of India , January 9, 2014 available at &lt;a class="external-link" href="http://eci.nic.in/eci_main1/current/PN09012014.pdf"&gt;http://eci.nic.in/eci_main1/current/PN09012014.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn3"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref3" name="_edn3"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Section 74, Indian Evidence Act, 1872&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn4"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref4" name="_edn4"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[iv]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; &lt;a class="external-link" href="http://eci.nic.in/eci_main1/the_function.aspx"&gt;eci.nic.in/eci_main1/the_function.aspx&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn5"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref5" name="_edn5"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[v]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; &lt;a class="external-link" href="http://eci.nic.in/eci_main1/Linkto_erollpdf.aspx"&gt;http://eci.nic.in/eci_main1/Linkto_erollpdf.aspx&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn6"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref6" name="_edn6"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; “At present, in most States and UTs the Electoral Database is kept at the district level. In some cases it is kept even with the vendors. In most States/UTs it is maintained in MS Access, while in some cases it is on a primitive technology like FoxPro and in some other cases on advanced RDBMS like Oracle or Sql Server. The database is not kept in bilingual form in some of the States/UTs, despite instructions of the Commission. In most cases Unicode fonts are not used. The database structure not being uniform in the country, makes it almost impossible for the different databases to talk to each other” –  Election Commission of India, Revision of Electoral Rolls with reference to 01-01-2010 as the qualifying date – Integration and Standardization of the database- reg., No. 23/2009-ERS, January 6, 2010 available at e&lt;a class="external-link" href="http://eci.nic.in/eci_main/eroll&amp;amp;epic/ins06012010.pdf"&gt;ci.nic.in/eci_main/eroll&amp;amp;epic/ins06012010.pdf&lt;/a&gt;&lt;span dir="RTL"&gt;&lt;/span&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn7"&gt;
&lt;p class="MsoEndnoteText"&gt;&lt;a href="#_ednref7" name="_edn7"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[vii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt;&lt;a class="external-link" href="http://eci.nic.in/eci_main1/current/PN09012014.pdf"&gt;&lt;span&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt; &lt;/span&gt;&lt;/span&gt;&lt;/span&gt;http://www.theatlantic.com/technology/archive/2014/01/with-great-computing-power-comes-great-surveillance/282933/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn8"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref8" name="_edn8"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[viii]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Section 70, Information Technology Act, 2000&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn9"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref9" name="_edn9"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[ix]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Computer resource which directly or indirectly affects the facility of Critical Information Infrastructure&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn10"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref10" name="_edn10"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[x]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Rule 2(1)(i), Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="edn11"&gt;
&lt;p class="MsoEndnoteText" style="text-align: justify; "&gt;&lt;a href="#_ednref11" name="_edn11"&gt;&lt;span class="MsoEndnoteReference"&gt;&lt;span class="MsoEndnoteReference"&gt;[xi]&lt;/span&gt;&lt;/span&gt;&lt;/a&gt; Press Note No.ECI/PN/1/2014, Election Commission of India , January 9, 2014 available at &lt;a class="external-link" href="http://eci.nic.in/eci_main1/current/PN09012014.pdf"&gt;http://eci.nic.in/eci_main1/current/PN09012014.pdf&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns'&gt;http://editors.cis-india.org/internet-governance/blog/electoral-databases-2013-privacy-and-security-concerns&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>snehashish</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Digital Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Cybersecurity</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Safety</dc:subject>
    
    
        <dc:subject>Information Technology</dc:subject>
    
    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Security</dc:subject>
    
    
        <dc:subject>e-Governance</dc:subject>
    
    
        <dc:subject>Transparency, Politics</dc:subject>
    
    
        <dc:subject>E-Governance</dc:subject>
    

   <dc:date>2014-01-16T11:07:21Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/news/livemint-january-31-2014-anuja-moulishree-srivastava-election-panel-rejects-google-proposal-for-electoral-services-tie-up">
    <title>Election panel rejects Google’s proposal for electoral services tie-up</title>
    <link>http://editors.cis-india.org/news/livemint-january-31-2014-anuja-moulishree-srivastava-election-panel-rejects-google-proposal-for-electoral-services-tie-up</link>
    <description>
        &lt;b&gt;EC had earlier signed a non-disclosure agreement with Google but had not shared or handed over any data to the Internet giant.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Anuja and Moulishree Srivastava was &lt;a class="external-link" href="http://www.livemint.com/Politics/Ff3ecnx7UO9d891CDwuGoM/EC-aborts-tieup-with-Google-over-security-concerns.html"&gt;published in Livemint&lt;/a&gt; on January 9, 2014. Lawrence Liang is quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;The Election Commission (EC) on Thursday rejected a proposal by Internet search engine operator &lt;a href="http://www.livemint.com/Search/Link/Keyword/Google%20Inc."&gt;Google Inc.&lt;/a&gt; to provide electoral information services to EC ahead of the general election due later this year. &lt;a href="http://www.livemint.com/Search/Link/Keyword/Google"&gt;Google&lt;/a&gt;’s proposal, made earlier this week, was criticized by experts and political parties on the grounds of security.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Google, which deals with Internet-related services and products, had made a presentation at EC where it proposed to deliver voter facilitation services through a tie-up with the Commission.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Google made a presentation to the Commission for electoral hook up services for citizens to help in efforts of the Commission for better electoral information services. However, after due consideration, the Commission has decided not to pursue the proposal any further,” EC said in a statement.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Its decision came at a meeting of senior EC officials on Thursday, called to discuss the proposal. Security was one of the main issues before it.&lt;/p&gt;
&lt;table class="listing"&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;th&gt;&lt;iframe frameborder="0" height="315" src="http://www.youtube.com/embed/5yHMBsAnbc4" width="420"&gt;&lt;/iframe&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
&lt;p style="text-align: justify; "&gt;“Security and controlling the data were the main points which were considered. By ways of such a tie-up all the data would have been up for access. It was always a question of whether Indian laws would apply to it or not, so we decided against it,” a senior official from EC said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;PTI&lt;/i&gt; reported that the meeting was attended by the chief election commissioner &lt;a href="http://www.livemint.com/Search/Link/Keyword/V.S.%20Sampath"&gt;V.S. Sampath&lt;/a&gt; and election commissioners &lt;a href="http://www.livemint.com/Search/Link/Keyword/H.S.%20Brahma"&gt;H.S. Brahma&lt;/a&gt; and &lt;a href="http://www.livemint.com/Search/Link/Keyword/S.%20N.%20A.%20Zaidi"&gt;S. N. A. Zaidi&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;The Times of India &lt;/i&gt;in a report on Sunday said there were concerns over the EC move to tie up with Google for voter registration.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;EC had earlier signed a non-disclosure agreement with Google but it had not shared any data with it. The move was criticised by the ruling Congress party as well as the main opposition Bharatiya Janata Party. The legal cell of the Congress had written to EC raising concerns over national security and asking whether the tie-up would affect the electoral process. The BJP’s complaint was that stakeholders, including political parties, should have been consulted.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Experts say that in the event of such a tie-up, concerns about protection of privacy would have outweighed national security fears.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“The concern is not so much about national security as it is about privacy issues. This kind of database is too important and too powerful to be controlled by a private company. There have been too many instances of this kind of data being skewed and riots happening during the election process. Privately owned databases could lead to potential misuse of the data,” said &lt;a href="http://www.livemint.com/Search/Link/Keyword/Lawrence%20Liang"&gt;Lawrence Liang&lt;/a&gt;, co-founder of Alternative Law Forum and chairman of the board at the Bangalore-based Centre for Internet and Society.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“It is not a question of how and what service Google could have provided for elections, but how the state can bring itself to provide that kind of service,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the US, when &lt;a href="http://www.livemint.com/Search/Link/Keyword/George%20W.%20Bush"&gt;George W. Bush&lt;/a&gt; was re-elected president in 2004, the company that manufactured the voting machines was accused of rigging the polls, Liang added.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Google called the EC’s rejection “unfortunate”, pointing out that the company has already helped governments with such services in countries like the Philippines, Egypt, Mexico and Kenya.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“It is unfortunate that our discussions with the Election Commission of India to change the way users access their electoral information, that is publicly available, through an online voter look up tool, were not fruitful,” Google said in a statement.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“Google will continue to develop tools and resources to make civic information universally accessible and useful, help drive more informed citizen participation, and open up new avenues for engagement for politicians, citizens, and civic leaders,” it added.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/news/livemint-january-31-2014-anuja-moulishree-srivastava-election-panel-rejects-google-proposal-for-electoral-services-tie-up'&gt;http://editors.cis-india.org/news/livemint-january-31-2014-anuja-moulishree-srivastava-election-panel-rejects-google-proposal-for-electoral-services-tie-up&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2014-01-31T08:58:01Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015">
    <title>Eight Key Privacy Events in India in the Year 2015</title>
    <link>http://editors.cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015</link>
    <description>
        &lt;b&gt;As the year draws to a close, we are enumerating some of the key privacy related events in India that transpired in 2015. Much like the last few years, this year, too, was an eventful one in the context of privacy.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;While we did not witness, as one had hoped, any progress in the passage of a privacy law, the year saw significant developments with respect to the ongoing 	Aadhaar case. The statement by the Attorney General, India's foremost law officer, that there is a lack of clarity over whether the right to privacy is a fundamental right, and the fact the the matter is yet unresolved was a huge setback to the jurisprudence on privacy.	&lt;a href="#_ftn1" name="_ftnref1"&gt;[1]&lt;/a&gt; However, the court has recognised a purpose limitation as applicable into the Aadhaar scheme, limiting 	the sharing of any information collected during the enrollment of residents in UID. A draft Encryption Policy was released and almost immediately withdrawn 	in the face of severe public backlash, and an updated Human DNA Profiling Bill was made available for comments. Prime Minister Narendra Modi's much 	publicised project "Digital India" was in news throughout the year, and it also attracted its' fair share of criticism in light of the lack of privacy 	safeguards it offered. Internationally, a lawsuit brought by Maximilian Schrems, an Austrian privacy activist, dealt a body blow to the fifteen year old 	Safe Harbour Framework in place for data transfers between EU and USA. Below, we look at what were, according to us, the eight most important privacy 	events in India, in 2015.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;1. &lt;/b&gt; &lt;b&gt;August 11, 2015 order on Aadhaar not being compulsory&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In 2012, a writ petition was filed by Judge K S Puttaswamy challenging the government's policy in its attempt to enroll all residents of India in the UID 	project and linking the Aadhaar card with various government services. A number of other petitioners who filed cases against the Aadhaar scheme have also 	been linked with this petition and the court has been hearing them together. On September 11, 2015, the Supreme Court reiterated its position in earlier orders made on September 23, 2013 and March 24, 2014 stating that the Aadhaar card shall not be made compulsory for any government services.	&lt;a href="#_ftn2" name="_ftnref2"&gt;[2]&lt;/a&gt; Building on its earlier position, the court passed the following orders:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;a) The government must give wide publicity in the media that it was not mandatory for a resident to obtain an Aadhaar card,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;b) The production of an Aadhaar card would not be a condition for obtaining any benefits otherwise due to a citizen,&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;c) Aadhaar card would not be used for any purpose other than the PDS Scheme, for distribution of foodgrains and cooking fuel such as kerosene and for the 	LPG distribution scheme.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;d) The information about an individual obtained by the UIDAI while issuing an Aadhaar card shall not be used for any other purpose, save as above, except 	as may be directed by a Court for the purpose of criminal investigation.&lt;a href="#_ftn3" name="_ftnref3"&gt;[3]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Despite this being the fifth court order given by the Supreme Court&lt;a href="#_ftn4" name="_ftnref4"&gt;[4]&lt;/a&gt; stating that the Aadhaar card cannot 	be a mandatory requirement for access to government services or subsidies, repeated violations continue. One of the violations which has been widely 	reported is the continued requirement of an Aadhaar number to set up a Digital Locker account which also led to activist, Sudhir Yadav filing a petition in 	the Supreme Court.&lt;a href="#_ftn5" name="_ftnref5"&gt;[5]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;2. &lt;/b&gt; &lt;b&gt;No Right to Privacy - Attorney General to SC&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Attorney General, Mukul Rohatgi argued before the Supreme Court in the Aadhaar case that the Constitution of India did not provide for a fundamental 	Right to Privacy.&lt;a href="#_ftn6" name="_ftnref6"&gt;[6]&lt;/a&gt; He referred to the body of case in the Supreme Court dealing with this issue and made a 	reference to the 1954 case, MP Sharma v. Satish Chandra&lt;a href="#_ftn7" name="_ftnref7"&gt;[7]&lt;/a&gt; stating that there was "clear divergence of 	opinion" on the Right to Privacy and termed it as "a classic case of unclear position of law." He also referred to the discussion on this matter in the 	Constitutional Assembly Debates and pointed to the fact the framers of the Constitution did not intend for this to be a fundamental right. He said the 	matter needed to be referred to a nine judge Constitution bench.&lt;a href="#_ftn8" name="_ftnref8"&gt;[8]&lt;/a&gt; This raises serious questions over the 	jurisprudence developed by the Supreme Court on the right to privacy over the last five decades. The matter is currently pending resolution by a larger 	bench which needs to be constituted by the Chief Justice of India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;3. &lt;/b&gt; &lt;b&gt;Shreya Singhal judgment and Section 69A, IT Act&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the much celebrated judgment, Shreya Singhal v. Union of India, in March 2015, the Supreme Court struck down Section 66A of the Information Technology 	Act, 2000 as unconstitutional and laid down guidelines for online takedowns under the Internet intermediary rules. However, significantly, the court also 	upheld Section 69A and the blocking rules under this provision. It was held to be a narrowly-drawn provision with adequate safeguards. The rules prescribe 	a procedure for blocking which involves receipt of a blocking request, examination of the request by the Committee and a review committee which performs 	oversight functions. However, commentators have pointed to the opacity of the process in the rules under this provisions. While the rules mandate that a 	hearing is given to the originator of the content, this safeguard is widely disregarded. The judgment did not discuss Section 69 of the Information 	Technology Act, 2000 which deal with decrypting of electronic communication, however, the Department of Electronic and Information Technology brought up 	this issue subsequently, through a Draft Encryption Policy, discussed below.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;4. &lt;/b&gt; &lt;b&gt;Circulation and recall of Draft Encryption Policy&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;On October 19, 2015, the Department of Electronic and Information Technology (DeitY) released for public comment a draft National Encryption Policy. The draft received an immediate and severe backlash from commentators, and was withdrawn by September 22, 2015.	&lt;a href="#_ftn9" name="_ftnref9"&gt;[9]&lt;/a&gt; The government blamed a junior official for the poor drafting of the document and noted that it had been 	released without a review by the Telecom Minister, Ravi Shankar Prasad and other senior officials.&lt;a href="#_ftn10" name="_ftnref10"&gt;[10]&lt;/a&gt; The 	main areas of contention were a requirement that individuals store plain text versions of all encrypted communication for a period of 90 days, to be made 	available to law enforcement agencies on demand; the government's right to prescribe key-strength, algorithms and ciphers; and only government-notified 	encryption products and vendors registered with the government being allowed to be used for encryption.&lt;a href="#_ftn11" name="_ftnref11"&gt;[11]&lt;/a&gt; The purport of the above was to limit the ways in which citizens could encrypt electronic communication, and to allow adequate access to law enforcement 	agencies. The requirement to keep all encrypted information in plain text format for a period of 90 days garnered particular criticism as it would allow 	for creation of a 'honeypot' of unencrypted data, which could attract theft and attacks.&lt;a href="#_ftn12" name="_ftnref12"&gt;[12]&lt;/a&gt; The withdrawal of the draft policy is not the final chapter in this story, as the Telecom Minister has promised that the Department will come back with a revised policy.	
&lt;a href="#_ftn13" name="_ftnref13"&gt;[13]&lt;/a&gt; This attempt to put restrictions on use of encryption technologies is not only in line with a host of 	surveillance initiatives that have mushroomed in India in the last few years,&lt;a href="#_ftn14" name="_ftnref14"&gt;[14]&lt;/a&gt; but also finds resonance with a global trend which has seen various governments and law enforcement organisations argue against encryption.	&lt;a href="#_ftn15" name="_ftnref15"&gt;[15]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;5. &lt;/b&gt; &lt;b&gt;Privacy concerns raised about Digital India&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Digital India initiative includes over thirty Mission Mode Projects in various stages of implementation.	&lt;a href="#_ftn16" name="_ftnref16"&gt;[16]&lt;/a&gt; All of these projects entail collection of vast quantities of personally identifiable information of 	the citizens. However, most of these initiatives do not have clearly laid down privacy policies.&lt;a href="#_ftn17" name="_ftnref17"&gt;[17]&lt;/a&gt; There 	is also a lack of properly articulated access control mechanisms and doubts over important issues such as data ownership owing to most projects involving public private partnership which involves private organisation collecting, processing and retaining large amounts of data.	&lt;a href="#_ftn18" name="_ftnref18"&gt;[18]&lt;/a&gt; Ahead of Prime Minister Modi's visit to the US, over 100 hundred prominent US based academics released a statement raising concerns about "lack of safeguards about privacy of information, and thus its potential for abuse" in the Digital India project.	&lt;a href="#_ftn19" name="_ftnref19"&gt;[19]&lt;/a&gt; It has been pointed out that the initiatives could enable a "cradle-to-grave digital identity that is unique, lifelong, and authenticable, and it plans to widely use the already mired in controversy Aadhaar program as the identification system."	&lt;a href="#_ftn20" name="_ftnref20"&gt;[20]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;6. &lt;/b&gt; &lt;b&gt;Issues with Human DNA Profiling Bill, 2015&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Human DNA Profiling Bill, 2015 envisions the creation of national and regional DNA databases comprising DNA profiles of the categories of persons 	specified in the Bill.&lt;a href="#_ftn21" name="_ftnref21"&gt;[21]&lt;/a&gt; The categories include offenders, suspects, missing persons, unknown deceased 	persons, volunteers and such other categories specified by the DNA Profiling Board which has oversight over these banks. The Bill grants wide discretionary powers to the Board to introduce new DNA indices and make DNA profiles available for new purposes it may deem fit.	&lt;a href="#_ftn22" name="_ftnref22"&gt;[22]&lt;/a&gt; These, and the lack of proper safeguards surrounding issues like consent, retention and collection 	pose serious privacy risks if the Bill becomes a law. Significantly, there is no element of purpose limitation in the proposed law, which would allow the 	DNA samples to be re-used for unspecified purposes.&lt;a href="#_ftn23" name="_ftnref23"&gt;[23]&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;7. &lt;/b&gt; &lt;b&gt;Impact of the Schrems ruling on India&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In Schrems v. Data Protection Commissioner, the Court of Justice in European Union (CJEU) annulled the Commission Decision 2000/520 according to which US 	data protection rules were deemed sufficient to satisfy EU privacy rules enabling transfers of personal data from EU to US, otherwise known as the 'Safe 	Harbour' framework. The court ruled that broad formulations of derogations on grounds of national security, public interest and law enforcement in place in 	the US goes beyond the test of proportionality and necessity under the Data Protection rules.&lt;a href="#_ftn24" name="_ftnref24"&gt;[24]&lt;/a&gt; This 	judgment could also have implications for the data processing industry in India. For a few years now, a framework similar to the Safe Harbour has been 	under discussion for transfer of data between India and EU. The lack of a privacy legislation has been among the significant hurdles in arriving at a 	framework.&lt;a href="#_ftn25" name="_ftnref25"&gt;[25]&lt;/a&gt; In the absence of a Safe Harbour framework, the companies in India rely on alternate 	mechanisms such as Binding Corporate Rules (BCR) or Model Contractual Clauses. These contracts impose the obligation on the data exporters and importers to 	ensure that 'adequate level of data protection' is provided. The Schrems judgement makes it clear that 'adequate level of data protection' entails a regime 	that is 'essentially equivalent' to that envisioned under Directive 95/46.&lt;a href="#_ftn26" name="_ftnref26"&gt;[26]&lt;/a&gt; What this means is that any 	new framework of protection between EU and other countries like US or India will necessarily have to meet this test of essential equivalence. 
The PRISM 	programme in the US and a host of surveillance programmes that have been initiated by the government in India in the last few years could pose problems in 	satisfying this test of essential equivalence as they do not conform to the proportionality and necessity principles.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;8. &lt;/b&gt; &lt;b&gt;The definition of "unfair trade practices" in the Consumer Protection Bill, 2015&lt;/b&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The Consumer Protection Bill, 2015, tabled in the Parliament towards the end of the monsoon session&lt;a href="#_ftn27" name="_ftnref27"&gt;[27]&lt;/a&gt; has 	introduced an expansive definition of the term "unfair trade practices." The definition as per the Bill includes the disclosure "to any other person any 	personal information given in confidence by the consumer."&lt;a href="#_ftn28" name="_ftnref28"&gt;[28]&lt;/a&gt; This clause exclude from the scope of unfair 	trade practices, disclosures under provisions of any law in force or in public interest. This provision could have significant impact on the personal data 	protection law in India. Currently, the only law governing data protection law are the Reasonable security practices and procedures and sensitive personal 	data or information Rules, 2011&lt;a href="#_ftn29" name="_ftnref29"&gt;[29]&lt;/a&gt; prescribed under Section 43A of the Information Technology Act, 2000. Under these rules, sensitive personal data or information is protected in that their disclosure requires prior permission from the data subject.	&lt;a href="#_ftn30" name="_ftnref30"&gt;[30]&lt;/a&gt; For other kinds of personal information not categorized as sensitive personal data or information, the only recourse of data subjects in case to claim breach of the terms of privacy policy which constitutes a lawful contract.	&lt;a href="#_ftn31" name="_ftnref31"&gt;[31]&lt;/a&gt; The Consumer Protection Bill, 2015, if enacted as law, could significantly expand the scope of 	protection available to data subjects. First, unlike the Section 43A rules, the provisions of the Bill would be applicable to physical as well as 	electronic collection of personal information. 
Second, disclosure to a third party of personal information other than sensitive personal data or 	information could also have similar 'prior permission' criteria under the Bill, if it can be shown that the information was shared by the consumer in 	confidence.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;What we see above are events largely built around a few trends that we have been witnessing in the context of privacy in India, in particular and across 	the world, in general. Lack of privacy safeguards in initiatives like the Aadhaar project and Digital India is symptomatic of policies that are not 	comprehensive in their scope, and consequently fail to address key concerns. Dr Usha Ramanathan has called these policies "powerpoint based policies" which are implemented based on proposals which are superficial in their scope and do not give due regard to their impact on a host of issues.	&lt;a href="#_ftn32" name="_ftnref32"&gt;[32]&lt;/a&gt; Second, the privacy concerns posed by the draft Encryption Policy and the Human DNA Profiling Bill point to the motive of surveillance that is in line with other projects introduced with the intent to protect and preserve national security.	&lt;a href="#_ftn33" name="_ftnref33"&gt;[33]&lt;/a&gt; Third, the incidents that championed the cause of privacy like the Schrems judgment have largely been 	initiated by activists and civil society actors, and have typically entailed the involvement of the judiciary, often the single recourse of actors in the 	campaign for the protection of civil rights. It must be noted that jurisprudence on the right to privacy in India has not moved beyond the guidelines set 	forth by the Supreme Court in PUCL v. Union of India.&lt;a href="#_ftn34" name="_ftnref34"&gt;[34]&lt;/a&gt; However, new mass surveillance programmes and 	massive collection of personal data by both public and private parties through various schemes mandated a re-look at the standards laid down twenty years 	ago. The privacy issue pending resolution by a larger bench in the Aadhaar case affords an opportunity to revisit those principles in light of how 	surveillance has changed in the last two decades and strengthen privacy and data protection.&lt;/p&gt;
&lt;div style="text-align: justify; "&gt;
&lt;hr /&gt;
&lt;div id="ftn1"&gt;
&lt;p&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Right to Privacy not a fundamental right, cannot be invoked to scrap Aadhar: Centre tells Supreme Court, available at 			&lt;a href="http://articles.economictimes.indiatimes.com/2015-07-23/news/64773078_1_fundamental-right-attorney-general-mukul-rohatgi-privacy"&gt; http://articles.economictimes.indiatimes.com/2015-07-23/news/64773078_1_fundamental-right-attorney-general-mukul-rohatgi-privacy &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn2"&gt;
&lt;p&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; SC allows govt to link Aadhaar card with PDS and LPG subsidies, available at 			&lt;a href="http://timesofindia.indiatimes.com/india/SC-allows-govt-to-link-Aadhaar-card-with-PDS-and-LPG-subsidies/articleshow/48436223.cms"&gt; http://timesofindia.indiatimes.com/india/SC-allows-govt-to-link-Aadhaar-card-with-PDS-and-LPG-subsidies/articleshow/48436223.cms &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn3"&gt;
&lt;p&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://judis.nic.in/supremecourt/imgs1.aspx?filename=42841"&gt;http://judis.nic.in/supremecourt/imgs1.aspx?filename=42841&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn4"&gt;
&lt;p&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Five SC Orders Later, Aadhaar Requirement Continues to Haunt Many, available at 			&lt;a href="http://thewire.in/2015/09/19/five-sc-orders-later-aadhaar-requirement-continues-to-haunt-many-11065/"&gt; http://thewire.in/2015/09/19/five-sc-orders-later-aadhaar-requirement-continues-to-haunt-many-11065/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn5"&gt;
&lt;p&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;[5]&lt;/a&gt; Digital Locker scheme challenged in Supreme Court, available at 			&lt;a href="http://www.moneylife.in/article/digital-locker-scheme-challenged-in-supreme-court/42607.html"&gt; http://www.moneylife.in/article/digital-locker-scheme-challenged-in-supreme-court/42607.html &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn6"&gt;
&lt;p&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Privacy not a fundamental right, argues Mukul Rohatgi for Govt as Govt affidavit says otherwise, available at 			&lt;a href="http://www.legallyindia.com/Constitutional-law/privacy-not-a-fundamental-right-argues-mukul-rohatgi-for-govt-as-govt-affidavit-says-otherwise"&gt; http://www.legallyindia.com/Constitutional-law/privacy-not-a-fundamental-right-argues-mukul-rohatgi-for-govt-as-govt-affidavit-says-otherwise &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn7"&gt;
&lt;p&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; 1954 SCR 1077.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn8"&gt;
&lt;p&gt;&lt;a href="#_ftnref8" name="_ftn8"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Supra Note 1.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn9"&gt;
&lt;p&gt;&lt;a href="#_ftnref9" name="_ftn9"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Government to withdraw draft encryption policy, available at 			&lt;a href="http://www.thehindu.com/news/national/govt-to-withdraw-draft-encryption-policy/article7677348.ece"&gt; http://www.thehindu.com/news/national/govt-to-withdraw-draft-encryption-policy/article7677348.ece &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn10"&gt;
&lt;p&gt;&lt;a href="#_ftnref10" name="_ftn10"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Encryption policy poorly worded by officer: Telecom Minister Ravi Shankar Prasad, available at 			&lt;a href="http://economictimes.indiatimes.com/articleshow/49068406.cms?utm_source=contentofinterest&amp;amp;utm_medium=text&amp;amp;utm_campaign=cppst"&gt; http://economictimes.indiatimes.com/articleshow/49068406.cms?utm_source=contentofinterest&amp;amp;utm_medium=text&amp;amp;utm_campaign=cppst &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn11"&gt;
&lt;p&gt;&lt;a href="#_ftnref11" name="_ftn11"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Updated: India's draft encryption policy puts user privacy in danger, available at 			&lt;a href="http://www.medianama.com/2015/09/223-india-draft-encryption-policy/"&gt; http://www.medianama.com/2015/09/223-india-draft-encryption-policy/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn12"&gt;
&lt;p&gt;&lt;a href="#_ftnref12" name="_ftn12"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Bhairav Acharya, The short-lived adventure of India's encryption policy, available at 			&lt;a href="http://notacoda.net/2015/10/10/the-short-lived-adventure-of-indias-encryption-policy/"&gt; http://notacoda.net/2015/10/10/the-short-lived-adventure-of-indias-encryption-policy/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn13"&gt;
&lt;p&gt;&lt;a href="#_ftnref13" name="_ftn13"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Supra Note 9.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn14"&gt;
&lt;p&gt;&lt;a href="#_ftnref14" name="_ftn14"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Maria Xynou, Big democracy, big surveillance: India's surveillance state, available at 			&lt;a href="https://www.opendemocracy.net/opensecurity/maria-xynou/big-democracy-big-surveillance-indias-surveillance-state"&gt; https://www.opendemocracy.net/opensecurity/maria-xynou/big-democracy-big-surveillance-indias-surveillance-state &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn15"&gt;
&lt;p&gt;&lt;a href="#_ftnref15" name="_ftn15"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; China passes controversial anti-terrorism law to access encrypted user accounts, available at 			&lt;a href="http://www.theverge.com/2015/12/27/10670346/china-passes-law-to-access-encrypted-communications"&gt; http://www.theverge.com/2015/12/27/10670346/china-passes-law-to-access-encrypted-communications &lt;/a&gt; ; Police renew call against encryption technology that can help hide terrorists, available at 			&lt;a href="http://www.washingtontimes.com/news/2015/nov/16/paris-terror-attacks-renew-encryption-technology-s/?page=all"&gt; http://www.washingtontimes.com/news/2015/nov/16/paris-terror-attacks-renew-encryption-technology-s/?page=all &lt;/a&gt; .&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn16"&gt;
&lt;p&gt;&lt;a href="#_ftnref16" name="_ftn16"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.mmp.cips.org.in/digital-india/"&gt;http://www.mmp.cips.org.in/digital-india/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn17"&gt;
&lt;p&gt;&lt;a href="#_ftnref17" name="_ftn17"&gt;[17]&lt;/a&gt; &lt;a href="http://slides.com/cisindia/big-data-in-indian-governance-preliminary-findings#/"&gt; http://slides.com/cisindia/big-data-in-indian-governance-preliminary-findings#/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn18"&gt;
&lt;p&gt;&lt;a href="#_ftnref18" name="_ftn18"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Indira Jaising, Digital India Schemes Must Be Preceded by a Data Protection and Privacy Law, available at 			&lt;a href="http://thewire.in/2015/07/04/digital-india-schemes-must-be-preceded-by-a-data-protection-and-privacy-law-5471/"&gt; http://thewire.in/2015/07/04/digital-india-schemes-must-be-preceded-by-a-data-protection-and-privacy-law-5471/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn19"&gt;
&lt;p&gt;&lt;a href="#_ftnref19" name="_ftn19"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; US academics raise privacy concerns over 'Digital India' campaign, available at			&lt;a href="http://yourstory.com/2015/08/us-digital-india-campaign/"&gt;http://yourstory.com/2015/08/us-digital-india-campaign/&lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn20"&gt;
&lt;p&gt;&lt;a href="#_ftnref20" name="_ftn20"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Lisa Hayes, Digital India's Impact on Privacy: Aadhaar numbers, biometrics, and more, available at 			&lt;a href="https://cdt.org/blog/digital-indias-impact-on-privacy-aadhaar-numbers-biometrics-and-more/"&gt; https://cdt.org/blog/digital-indias-impact-on-privacy-aadhaar-numbers-biometrics-and-more/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn21"&gt;
&lt;p&gt;&lt;a href="#_ftnref21" name="_ftn21"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.prsindia.org/uploads/media/draft/Draft%20Human%20DNA%20Profiling%20Bill%202015.pdf"&gt; http://www.prsindia.org/uploads/media//draft/Draft%20Human%20DNA%20Profiling%20Bill%202015.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn22"&gt;
&lt;p&gt;&lt;a href="#_ftnref22" name="_ftn22"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Comments on India's Human DNA Profiling Bill (June 2015 version), available at 			&lt;a href="http://www.genewatch.org/uploads/f03c6d66a9b354535738483c1c3d49e4/IndiaDNABill_FGPI_15.pdf"&gt; http://www.genewatch.org/uploads/f03c6d66a9b354535738483c1c3d49e4/IndiaDNABill_FGPI_15.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn23"&gt;
&lt;p&gt;&lt;a href="#_ftnref23" name="_ftn23"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Elonnai Hickok, Vanya Rakesh and Vipul Kharbanda, CIS Comments and Recommendations to the Human DNA Profiling Bill, June 2015, available at 			&lt;a href="http://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015"&gt; http://cis-india.org/internet-governance/blog/cis-comments-and-recommendations-to-human-dna-profiling-bill-2015 &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn24"&gt;
&lt;p&gt;&lt;a href="#_ftnref24" name="_ftn24"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://curia.europa.eu/jcms/upload/docs/application/pdf/2015-10/cp150117en.pdf"&gt; http://curia.europa.eu/jcms/upload/docs/application/pdf/2015-10/cp150117en.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn25"&gt;
&lt;p&gt;&lt;a href="#_ftnref25" name="_ftn25"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Jyoti Pandey, Contestations of Data, ECJ Safe Harbor Ruling and Lessons for India, available at 			&lt;a href="http://cis-india.org/internet-governance/blog/contestations-of-data-ecj-safe-harbor-ruling-and-lessons-for-india"&gt; http://cis-india.org/internet-governance/blog/contestations-of-data-ecj-safe-harbor-ruling-and-lessons-for-india &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn26"&gt;
&lt;p&gt;&lt;a href="#_ftnref26" name="_ftn26"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Simon Cox, Case Watch: Making Sense of the Schrems Ruling on Data Transfer, available at 			&lt;a href="https://www.opensocietyfoundations.org/voices/case-watch-making-sense-schrems-ruling-data-transfer"&gt; https://www.opensocietyfoundations.org/voices/case-watch-making-sense-schrems-ruling-data-transfer &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn27"&gt;
&lt;p&gt;&lt;a href="#_ftnref27" name="_ftn27"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.prsindia.org/billtrack/the-consumer-protection-bill-2015-3965/"&gt; http://www.prsindia.org/billtrack/the-consumer-protection-bill-2015-3965/ &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn28"&gt;
&lt;p&gt;&lt;a href="#_ftnref28" name="_ftn28"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Section 2(41) (I) of the Consumer Protection Bill, 2015.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn29"&gt;
&lt;p&gt;&lt;a href="#_ftnref29" name="_ftn29"&gt;&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://www.ijlt.in/pdffiles/IT-(Reasonable%20Security%20Practices)-Rules-2011.pdf"&gt; http://www.ijlt.in/pdffiles/IT-%28Reasonable%20Security%20Practices%29-Rules-2011.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn30"&gt;
&lt;p&gt;&lt;a href="#_ftnref30" name="_ftn30"&gt;&lt;sup&gt;&lt;sup&gt;[30]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Rule 6 of Reasonable security practices and procedures and sensitive personal data or information Rules, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn31"&gt;
&lt;p&gt;&lt;a href="#_ftnref31" name="_ftn31"&gt;&lt;sup&gt;&lt;sup&gt;[31]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Rule 4 of Reasonable security practices and procedures and sensitive personal data or information Rules, 2011&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn32"&gt;
&lt;p&gt;&lt;a href="#_ftnref32" name="_ftn32"&gt;&lt;sup&gt;&lt;sup&gt;[32]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; &lt;a href="http://cis-india.org/internet-governance/events/communication-rights-in-the-age-of-digital-technology"&gt; http://cis-india.org/internet-governance/events/communication-rights-in-the-age-of-digital-technology &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn33"&gt;
&lt;p&gt;&lt;a href="#_ftnref33" name="_ftn33"&gt;&lt;sup&gt;&lt;sup&gt;[33]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Supra Note 11.&lt;/p&gt;
&lt;/div&gt;
&lt;div id="ftn34"&gt;
&lt;p&gt;&lt;a href="#_ftnref34" name="_ftn34"&gt;&lt;sup&gt;&lt;sup&gt;[34]&lt;/sup&gt;&lt;/sup&gt;&lt;/a&gt; &lt;sup&gt; &lt;/sup&gt; Chaitanya Ramachandra, PUCL V. Union of India Revisited: Why India's Sureveillance Law must be redesigned for the Digital Age, available at 			&lt;a href="http://nujslawreview.org/wp-content/uploads/2015/10/Chaitanya-Ramachandran.pdf"&gt; http://nujslawreview.org/wp-content/uploads/2015/10/Chaitanya-Ramachandran.pdf &lt;/a&gt;&lt;/p&gt;
&lt;/div&gt;
&lt;/div&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015'&gt;http://editors.cis-india.org/internet-governance/blog/eight-key-privacy-events-in-india-in-the-year-2015&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-01-03T05:43:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states">
    <title>EC disables easy access to electoral data across states </title>
    <link>http://editors.cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states</link>
    <description>
        &lt;b&gt;The recently-concluded Assembly elections may have set more than just one precedent with implications for the entire nation. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Akshatha M was published in &lt;a class="external-link" href="https://economictimes.indiatimes.com/news/politics-and-nation/ec-disables-easy-access-to-electoral-data-across-states/articleshow/64474558.cms"&gt;Economic Times&lt;/a&gt; on June 5, 2018. Sunil Abraham was quoted.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;While the poll result led to what many see as the beginning of a national front comprising regional parties, the steps Karnataka’s chief electoral officer took to protect the privacy of its electoral rolls will be emulated across the country. &lt;br /&gt;&lt;br /&gt;The Election Commission of India, in an internal circular issued in January, ordered the chief electoral officers of all states and union territories to publish electoral rolls only in image PDF and CAPTCHA formats. These formats ensure that no individual can access electoral data, except as readonly files. While the image PDF format disables the search option in the rolls, CAPTCHA does not allow visitors to either extract or download the rolls.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“It has been decided that electoral rolls should be published on (the) website in image PDF only. If presently-available PDF electoral rolls are not image PDF, then the same shall be done immediately,” the EC circular said. &lt;br /&gt;&lt;br /&gt;It all started in the latter part of 2017 when Karnataka’s chief electoral officer published the draft electoral rolls as image PDF with CAPTCHA formats. Earlier, rolls were published in text PDF (minus CAPTCHA) format. Electoral analysts and citizen groups took exception to the new formats on the grounds that that did not allow them to analyse errors.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;CEO Sanjiv Kumar’s contention was that analysts were seeking easy access to data. He defended his move on the grounds that the personal data of voters need to be protected. The tiff eventually reached the EC’s doorstep. It turns out that the chief election commissioner was convinced about Sanjiv Kumar’s intent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Speaking to ET, CEC Om Prakash Rawat said: “After Cambridge Analytica and Facebook episodes, the EC has decided to protect voters’ data from data harvesting and data manipulation as a precautionary measure. We are also working towards adding special data security features in the electoral rolls.” &lt;br /&gt;&lt;br /&gt;Electoral roll analysts continue to see the EC’s decision as a bid to cover up flaws in the rolls. “Their argument is self-defeating on two counts: One, an individual can still extract the data, though it is a little time-consuming. Two, voter data is sold in broad daylight for 7 paise per record and despite knowing this, the election authorities have not taken any action to prevent the same,” said Bengaluru-based electoral roll analyst PG Bhat.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Data security researchers say the EC decision to have the new formats is no long-term solution.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Sunil Abraham, executive director of the Centre for Internet and Society, said that the image PDF format would not be a longterm solution at a time when the optical character recognition software has become all-powerful. “The EC should first remove EPIC numbers from the public database as it allows people who are into data-mining exercise to combine data. The solution would be to have mandatory registration in order to access data, even for voters,” he said.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;He said the EC should have a system that enables it to track those who access electoral data. “Mass export of data should be permitted only for those who are monitoring electoral rolls at the polling-station level,” he said.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states'&gt;http://editors.cis-india.org/internet-governance/news/economic-times-june-6-2018-akshatha-m-ec-disables-easy-access-to-electoral-data-across-states&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-06-29T01:59:02Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/observer-research-foundation-shashidhar-kj-and-kashish-parpiani-july-22-2019-easing-the-us-india-divergence-on-data-localisation">
    <title>Easing the US-India divergence on data localisation</title>
    <link>http://editors.cis-india.org/internet-governance/news/observer-research-foundation-shashidhar-kj-and-kashish-parpiani-july-22-2019-easing-the-us-india-divergence-on-data-localisation</link>
    <description>
        &lt;b&gt;Addition of data localisation to the basket of persisting trade issues warrants greater compartmentalisation and consultative approaches to US-India ties.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article by Shashidhar KJ and Kashish Parpiani was &lt;a class="external-link" href="https://www.orfonline.org/expert-speak/easing-us-india-divergence-data-localisation-53256/"&gt;published by Observer Research Foundation&lt;/a&gt; on July 22, 2019.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;The Reserve Bank of India’s (RBI) finally &lt;a href="https://rbi.org.in/Scripts/FAQView.aspx?Id=130" rel="noopener" target="_blank"&gt;clarified &lt;/a&gt;its position eight months after it issued the controversial April 2018 circular mandating the storage of all payment data of Indians in the country and allowing the central bank “unfettered access”. The circular particularly aimed at US-based companies such as Mastercard, Visa, American Express, PayPal, Facebook and Google, as they scrambled to comply. The clarification was a welcome relief for companies seeking guidance on how to comply, what kind of data needs to be stored in India, and if the payment companies needed to move their processing infrastructure. Note, the RBI has yet to issue a formal directive with these clarifications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Meanwhile, media reports have indicated that Facebook-owned WhatsApp would &lt;a href="https://economictimes.indiatimes.com/tech/internet/local-data-storage-ready-whatsapp-to-open-payments-tap/articleshow/69966898.cms" rel="noopener" target="_blank"&gt;obey&lt;/a&gt; the RBI norm as it looks to kick off its payments business. This runs counter to what Facebook CEO Mark Zuckerberg had &lt;a href="https://www.nasdaq.com/aspx/call-transcript.aspx?StoryId=4256521&amp;amp;Title=facebook-s-fb-ceo-mark-zuckerberg-on-q1-2019-results-earnings-call-transcript" rel="noopener" target="_blank"&gt;told &lt;/a&gt;investors in April:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“&lt;em&gt;You should expect that we won’t store sensitive data in countries where it might be improperly accessed because of weak rule of law or governments that can forcibly get access to your data&lt;/em&gt;.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;India is still debating passing a Personal Data Protection legislation, and as such, India doesn’t have any legal safeguards protecting users’ data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This has revealed yet another faultline in the persisting trade issues between the US and India.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;India is still debating passing a Personal Data Protection legislation, and as such, India doesn’t have any legal safeguards protecting users’ data.&lt;/blockquote&gt;
&lt;h2 style="text-align: justify; "&gt;Indian data rights vs. American IPR protectionism&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;New Delhi has started to assert its right over its citizens’ data as India’s footprint on the Internet increases. Moreover, without clear guidance from Personal Data Protection legislation, there has been a glut of policy prescriptions from sector regulators. The Centre for Internet and Society &lt;a href="https://cis-india.org/internet-governance/resources/the-localisation-gambit.pdf" rel="noopener" target="_blank"&gt;published&lt;/a&gt; a paper in which it chronicles 10 policy measures for both ‘soft’ and ‘hard’ data localisation across health, telecommunications, e-commerce, insurance and others. These measures range from storing copies of specific data, local content production requirements, or imposing conditions on cross-border data transfers that act as a localisation mandate.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This oversupply of policy prescriptions is leading to blurring of jurisdictions. Often, the policy measures given have many a slip between the cup and the lip. For example, one of the reasons for insisting on localisation is security, but even if companies localise data, there is no framework to access this data by the local security apparatus.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;India’s policy thinking on the matter often begins with the idea: ‘data is the new oil.’ The thinking is that data generated by Indians should be viewed as a natural resource that must be protected by the state through localisation. This notion is &lt;a href="https://www.orfonline.org/expert-speak/indias-draft-e-commerce-policy-a-need-to-look-beyond-data-as-the-new-oil-49413/" rel="noopener" target="_blank"&gt;problematic&lt;/a&gt;. Data, unlike oil, which is found in limited quantities, has different properties. Newer ideas of regulation must be thought of and that’s where Indian policy makers have not been accommodative.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;Oversupply of policy prescriptions is leading to blurring of jurisdictions. Often, the policy measures given have many a slip between the cup and the lip.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;A gripe that US-based companies mention is that there is a distinctive domestic tilt and that company representatives have turned away from consultations as they do not serve the “national interests.” This was best exemplified in October 2018 when a closed-door discussion between the RBI and the US-India Strategic Partnership Forum (USISPF representing the interests of US companies) &lt;a href="https://economictimes.indiatimes.com/news/economy/policy/data-localisation-sparking-complaints-of-bias-us-companies-seek-12-months-time-from-rbi/articleshow/66210317.cms?from=mdr" rel="noopener" target="_blank"&gt;broke down&lt;/a&gt;and the latter accused the RBI of having a bias. During the discussions, the RBI placed a lot of emphasis on the inputs from iSPIRT (Indian Software Product Industry Roundtable), an Indian think tank which has been advocating for data protectionism.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The aforementioned sentiment has been carried over to international summits. At the recently concluded G20 summit, India &lt;a href="https://www.livemint.com/news/world/india-boycotts-osaka-track-at-g20-summit-1561897592466.html" rel="noopener" target="_blank"&gt;boycotted &lt;/a&gt;the Osaka Track on the digital economy as it felt that it would undermine multilateral consensus-based decisions on trade and deny policy space for digital industrialisation. The Osaka Track pushed hard for the creation of laws which would allow data flows between countries and the removal of data localisation.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;India’s foreign secretary, Vijay Gokhale, &lt;a href="https://www.thehindu.com/news/national/on-5g-and-data-india-stands-with-developing-world-not-us-japan-at-g20/article28207169.ece" rel="noopener" target="_blank"&gt;mentioned &lt;/a&gt;that data is a new form of wealth and wanted latitude on domestic rule-making on data. And in the age of digital commerce, this may signify a broader trend of a developed-developing nations’ impasse. The tussle has now moved beyond the security angle with the United States &lt;a href="https://cis-india.org/internet-governance/blog/an-analysis-of-the-cloud-act-and-implications-for-india" rel="noopener" target="_blank"&gt;enacting &lt;/a&gt;the Clarifying Lawful Overseas Use of Data (CLOUD) Act for security agencies to procure data stored in servers regardless of whether in the US or foreign soil. With monetisation now at the core of the dispute, the discussed divergences on data localisation tie into the US’ broader, long-standing issues pertaining to US-India bilateral trade.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Divergence on data localisation issue crosses path with trade tensions&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;The &lt;a href="https://ustr.gov/about-us/policy-offices/press-office/fact-sheets/2019/march/fact-sheet-2019-national-trade-estimate" rel="noopener" target="_blank"&gt;2019 National Trade Estimate&lt;/a&gt; (NTE) by the Office of the United States Trade Representative (USTR) focuses on reducing “barriers to digital trade.” Taking a tone of American stewardship on open liberal market economics, it notes:&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;“&lt;em&gt;When governments impose unnecessary barriers to cross-border data flows or discriminate against foreign digital services, local firms are often hurt the most, as they cannot take advantage of cross-border digital services that facilitate global competitiveness&lt;/em&gt;.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;At a time when the Trump administration has sought to re-calibrate America’s trade relationships via the adoption of punitive sanctions that run counter to the fundamentals of the liberal world order, the aforementioned American concern for the competitiveness of foreign nation’s local firms may seem like sardonic preaching.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;President Trump’s ‘America First’ worldview in many ways upended conventional tenets of US foreign policy. But on some fronts, it has presented opportunities for marginal establishment agendas. For instance, Trump’s heightened focus on ties with Israel and the US’ Sunni allies in the Middle East, complements the realisation of &lt;a href="https://www.google.com/search?q=neoconservatives+bolton+iran+trump&amp;amp;rlz=1C1GCEU_enIN821IN821&amp;amp;oq=neoconservatives+bolton+iran+trump&amp;amp;aqs=chrome..69i57j33.7943j0j7&amp;amp;sourceid=chrome&amp;amp;ie=UTF-8&amp;amp;safe=active" rel="noopener" target="_blank"&gt;neoconservatives’ penchant for regime change in Iran&lt;/a&gt;.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;At a time when the Trump administration has sought to re-calibrate America’s trade relationships via the adoption of punitive sanctions that run counter to the fundamentals of the liberal world order, the aforementioned American concern for the competitiveness of foreign nation’s local firms may seem like sardonic preaching.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;On Trump’s fixation with recalibrating US trade relationships on “&lt;a href="https://www.whitehouse.gov/briefings-statements/president-donald-j-trump-will-promote-worldwide-economic-growth-prosperity-g20-summit/" rel="noopener" target="_blank"&gt;fair and reciprocal&lt;/a&gt;” footing, the American trade establishment successfully addressed US’ belated concerns over absence of digital trade rules in case of the North American Free Trade Agreement (NAFTA) with Canada and Mexico. Similarly, the emerging divergences over data localisation with India are subsumed under the ongoing — albeit repeatedly stalled, US-India trade negotiations.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Hence, the NTE underscores India’s decision with regards to payment service suppliers to be part of trade barriers hampering digital commerce and US-India trade at-large.&lt;/p&gt;
&lt;h2 style="text-align: justify; "&gt;Fixing the strained Carter &lt;em&gt;mantra&lt;/em&gt; via compartmentalisation and consultation&lt;/h2&gt;
&lt;p style="text-align: justify; "&gt;India has &lt;a href="https://www.orfonline.org/expert-speak/us-recent-decisions-to-cloud-pompeos-visit-to-india-52012/" rel="noopener" target="_blank"&gt;approached&lt;/a&gt; trade talks from the standpoint of addressing the Trumpian aberration of the US pushing for reduction of its trade deficits with other countries. Whereas, USTR negotiators have approached negotiations with India with regards to, what they view as longstanding issues in bilateral trade, such as market access for dairy products and price caps on medical equipment.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In the past, those outstanding issues were downplayed in view of the promising long-term trajectory of US-India strategic ties. The same has come to be known as the understated dictum of the &lt;a href="https://www.cfr.org/content/publications/attachments/052416_Ayres_Testimony.pdf"&gt;Carter &lt;/a&gt;&lt;a href="https://www.cfr.org/content/publications/attachments/052416_Ayres_Testimony.pdf" rel="noopener" target="_blank"&gt;&lt;em&gt;mantra&lt;/em&gt;&lt;/a&gt; — named after former US Secretary of Defense Ashton Carter and architect of the &lt;a href="https://dod.defense.gov/Portals/1/Documents/pubs/US-IND-Fact-Sheet.pdf" rel="noopener" target="_blank"&gt;US-India Defense Technology and Trade Initiative&lt;/a&gt;. The approach encompassed the US to focus on harnessing strategic ties and not let differences on other fronts like trade to &lt;a href="https://www.orfonline.org/wp-content/uploads/2018/10/ORF_Issue_Brief_262_US_Legislature.pdf" rel="noopener" target="_blank"&gt;crowd out minimal-yet-positive developments&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In recent times, that dictum has come under strain as trade tensions have resurfaced. Cases in-point being, the Trump administration’s &lt;a href="https://indianexpress.com/article/explained/donald-trump-wilbur-ross-commerce-industry-india-us-trade-suresh-prabhu-5717901/" rel="noopener" target="_blank"&gt;recent revocation&lt;/a&gt; of India’s designation as a “beneficiary developing country” under its Generalised System of Preferences programme, and India’s &lt;a href="https://www.livemint.com/politics/policy/india-imposes-tariffs-on-28-us-goods-as-global-trade-war-heats-up-1560616982719.html" rel="noopener" target="_blank"&gt;imposition of retaliatory tariffs&lt;/a&gt; on 28 US products.&lt;/p&gt;
&lt;blockquote class="quoted" style="text-align: justify; "&gt;The US-India dynamic is graduating from the erstwhile top-heavy approach based on the personal relations developed between head of states, to an institutionalised format of consultative platforms on varied bureaucratic, legislative, military, and even public-private partnership levels.&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;Furthermore, ahead of Secretary of State Mike Pompeo’s visit to New Delhi last month, the Trump administration &lt;a href="https://thewire.in/diplomacy/us-india-h1b-visa-data-localisation" rel="noopener" target="_blank"&gt;reportedly&lt;/a&gt; mulled capping the issuance of H1B visas to about 15 percent for any country that “&lt;a href="https://thewire.in/diplomacy/us-india-h1b-visa-data-localisation" rel="noopener" target="_blank"&gt;does data localisation&lt;/a&gt;.” It bore ominous prospects for India’s &lt;a href="https://thewire.in/diplomacy/us-india-h1b-visa-data-localisation" rel="noopener" target="_blank"&gt;$150 billion IT sector&lt;/a&gt; as &lt;a href="https://thewire.in/diplomacy/us-india-h1b-visa-data-localisation" rel="noopener" target="_blank"&gt;70 percent of the 85,000 H1B visas&lt;/a&gt; issued every year go to Indians. With regards to the broader trajectory of US-India ties, the report came to be seen as another blow to the Carter &lt;em&gt;mantra&lt;/em&gt;’s prescription for compartmentalisation of issues from promising aspects of the bilateral relationship.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Both sides however, have attempted to temper tensions, and keep the Carter &lt;em&gt;mantra &lt;/em&gt;in place with the continued focus on evolving strategic ties — with continued impetus on US-India &lt;a href="https://timesofindia.indiatimes.com/india/india-lining-up-defence-deals-worth-10-billion-with-us-amid-trade-row/articleshow/69919916.cms" rel="noopener" target="_blank"&gt;defence trade&lt;/a&gt; and &lt;a href="https://www.hindustantimes.com/india-news/india-us-to-take-forward-talks-for-key-military-pact/story-bi2IfgMjKtKsfA2wjTqQzM.html" rel="noopener" target="_blank"&gt;force interoperability agreements&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;More importantly, there seems to be an overt attempt to reinstitute a sense of compartmentalisation. For instance, Secretary Pompeo, during his visit to New Delhi &lt;a href="https://www.news18.com/news/india/mike-pompeo-in-india-live-india-us-relationship-has-made-strides-but-we-can-do-more-says-us-secy-of-state-2203957.html" rel="noopener" target="_blank"&gt;eased fears&lt;/a&gt; by denouncing reports about the US considering H1B visa caps. Whereas, India, too, has sought to institute a sense of compartmentalisation with Commerce Minister Piyush Goyal announcing that the contentious data protection issue will be &lt;a href="https://www.livemint.com/politics/policy/data-storage-rules-out-of-e-commerce-policy-1561488393145.html" rel="noopener" target="_blank"&gt;kept out of the e-commerce policy draft&lt;/a&gt;, and will be dealt with by the IT ministry instead.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Lastly, the US-India dynamic is graduating from the erstwhile top-heavy approach based on the personal relations developed between head of states, to an institutionalised format of consultative platforms on varied bureaucratic, legislative, military, and even public-private partnership levels. Examples of which include, the &lt;a href="https://www.timesnownews.com/india/article/india-us-officials-to-meet-for-laying-groundwork-for-two-plus-two-dialogue-with-china-on-agenda/405609" rel="noopener" target="_blank"&gt;US-India 2+2&lt;/a&gt; consultative platform between foreign and defense portfolio chiefs, and the &lt;a href="https://www.livemint.com/industry/energy/india-us-discuss-crude-oil-price-volatility-1560179681174.html" rel="noopener" target="_blank"&gt;India-US Strategic Energy Partnership&lt;/a&gt; working groups between India’s Petroleum Minister and US Energy Secretary. The upcoming editions of these forums are set to be critical in addressing outstanding issues in the strategic realm, like India’s &lt;a href="https://www.orfonline.org/expert-speak/the-turkish-interjection-in-indo-us-relations-49800/" rel="noopener" target="_blank"&gt;purchase of the Russian S-400 systems inviting the prospect of American CAATSA sanctions&lt;/a&gt;, and India’s push for a &lt;a href="https://qz.com/india/1651932/mike-pompeos-india-visit-to-push-us-oil-and-gas-over-irans/" rel="noopener" target="_blank"&gt;gas-based economy in light of reduced oil purchases from Iran following recent tensions between Washington and Tehran&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Similarly, on easing the hardening American and Indian stances on data localisation, in addition to compartmentalisation, a consultative approach must be explored. Towards that end, the &lt;a href="http://pib.nic.in/newsite/PrintRelease.aspx?relid=188617" rel="noopener" target="_blank"&gt;India-US Commercial Dialogue and India-US CEO Forum&lt;/a&gt; could serve as appropriate starting points for a joint working group involving a diverse set of stakeholders from the public and private realm.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/observer-research-foundation-shashidhar-kj-and-kashish-parpiani-july-22-2019-easing-the-us-india-divergence-on-data-localisation'&gt;http://editors.cis-india.org/internet-governance/news/observer-research-foundation-shashidhar-kj-and-kashish-parpiani-july-22-2019-easing-the-us-india-divergence-on-data-localisation&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Shashidhar KJ and Kashish Parpiani</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-07-30T01:40:24Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/e-governance-identity-privacy.pdf">
    <title>E-Governance, Identity &amp; Privacy</title>
    <link>http://editors.cis-india.org/internet-governance/e-governance-identity-privacy.pdf</link>
    <description>
        &lt;b&gt;This chapter will look at different legislations, projects, and policies pertaining to e-governance and identity that India has put in place, and examine both the strengths and the weaknesses of these, through the lens of privacy.&lt;/b&gt;
        
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/e-governance-identity-privacy.pdf'&gt;http://editors.cis-india.org/internet-governance/e-governance-identity-privacy.pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>praskrishna</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2012-09-26T06:17:03Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women">
    <title>Due Diligence Project FGD by UN Women</title>
    <link>http://editors.cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women</link>
    <description>
        &lt;b&gt;On October 11, 2019, Radhika Radhakrishnan attended a focussed group discussion at the UN House, New Delhi, organized by UN Women for their multi-country research study on online violence (Due Diligence Project).&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The purpose of the  discussion was to provide a better understanding of the nature and the  scope of this form of VAWG and to provide recommendations to inform  policies, plans, programming and advocacy on the issue.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women'&gt;http://editors.cis-india.org/internet-governance/news/due-diligence-project-fgd-by-un-women&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Due Diligence</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-10-20T07:11:13Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/dsci-infosys-roundtable">
    <title>DSCI-Infosys Roundtable</title>
    <link>http://editors.cis-india.org/internet-governance/news/dsci-infosys-roundtable</link>
    <description>
        &lt;b&gt;Sunil Abraham participated in this meeting organized by Infosys in Bangalore on March 25, 2019 as a speaker.&lt;/b&gt;
        &lt;p&gt;AGENDA:&lt;/p&gt;
&lt;table&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;p align="center"&gt;10:00-10:15 AM&lt;/p&gt;&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;Opening Remarks: Infosys&lt;/p&gt;
&lt;p&gt;Context Setting: DSCI and Infosys&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;p align="center"&gt;10:15-11:00 AM&lt;/p&gt;&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;b&gt;Elements Shaping the Data Economy&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Digitization: personalization, experience, productivity &amp;amp; possibilities&lt;/li&gt;
&lt;li&gt;Global internet platforms: transforming B2C and B2B&lt;/li&gt;
&lt;li&gt;Phantomization of technology &amp;amp; business models&lt;/li&gt;
&lt;li&gt;Changing nature of deliveries: value-driven, subscription-based and platform-based&lt;/li&gt;
&lt;li&gt;Product economy: data-centric designs, start-ups and unicorns&lt;/li&gt;
&lt;li&gt;IoT and Industry 4.0: next-generation service &amp;amp; business lines&lt;/li&gt;
&lt;li&gt;Data flows and how they are shaping trade in goods and services&lt;/li&gt;
&lt;li&gt;Role of data in delivering public services and improving public order&lt;/li&gt;
&lt;li&gt;Artificial intelligence: at the specific product/service level, and its ramifications for the industrial and national economy&lt;/li&gt;
&lt;li&gt;Technology: role of data in developing next-generation tech platforms&lt;/li&gt;
&lt;/ul&gt;
&lt;p align="right"&gt;&lt;i&gt;Discussion Facilitation: DSCI and Infosys&lt;/i&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;p align="center"&gt;11:00-11:45 AM&lt;/p&gt;&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;b&gt;Tech’s Dilemmas&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Scale and reach of BigTech: industrial capitalism versus internet capitalism&lt;/li&gt;
&lt;li&gt;Competition&lt;/li&gt;
&lt;li&gt;Influence on personal, social, transactional, economic and political life&lt;/li&gt;
&lt;li&gt;Strained relations with the modern value system&lt;/li&gt;
&lt;li&gt;Ethical issues: human rights, social harmony, public space decency, healthy electoral processes, information warfare...&lt;/li&gt;
&lt;li&gt;Data privacy&lt;/li&gt;
&lt;li&gt;Tech’s response: locking down of data, editorial/censorship controls...&lt;/li&gt;
&lt;li&gt;Challenges of law enforcement, fraud management and supervision&lt;/li&gt;
&lt;li&gt;Relevance to national security objectives&lt;/li&gt;
&lt;li&gt;Principles of responsible innovation&lt;/li&gt;
&lt;li&gt;Ideas under discussion/experimentation&lt;/li&gt;
&lt;/ul&gt;
&lt;p align="right"&gt;&lt;i&gt;Discussion Facilitation: DSCI and Infosys&lt;/i&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;p align="center"&gt;11:45 AM-12:15 PM&lt;/p&gt;&lt;/td&gt;
&lt;td&gt;
&lt;p&gt;&lt;b&gt;Shaping the Data Economy&lt;/b&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Structures and approaches: state-controlled, private-sector-led, decentralized&lt;/li&gt;
&lt;li&gt;Directions: legal/policy, innovation, investments, architectures (like India Stack)&lt;/li&gt;
&lt;li&gt;Searching for the role of liberal economic principles&lt;/li&gt;
&lt;li&gt;Open architectures and an open data ecosystem&lt;/li&gt;
&lt;li&gt;Positions, obligations, burdens and liabilities for protecting rights, creating a level playing field, ensuring competition...&lt;/li&gt;
&lt;li&gt;Regulatory approaches: establishing supervisory controls&lt;/li&gt;
&lt;li&gt;National security: interventions, mandates and cooperation&lt;/li&gt;
&lt;/ul&gt;
&lt;p align="right"&gt;&lt;i&gt;Discussion Facilitation: DSCI and Infosys&lt;/i&gt;&lt;/p&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;p align="center"&gt;12:15-12:30 PM&lt;/p&gt;&lt;/td&gt;
&lt;td&gt;&lt;p&gt;&lt;b&gt;Discussion Summary&lt;/b&gt;&lt;/p&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;p align="center"&gt;12:30 PM onwards&lt;/p&gt;&lt;/td&gt;
&lt;td&gt;&lt;p align="center"&gt;&lt;b&gt;Lunch&lt;/b&gt;&lt;/p&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/dsci-infosys-roundtable'&gt;http://editors.cis-india.org/internet-governance/news/dsci-infosys-roundtable&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-04-05T02:06:00Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet">
    <title>DSCI's Bangalore chapter meet</title>
    <link>http://editors.cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet</link>
    <description>
        &lt;b&gt;On January 29, 2019, Karan Saini and Gurshabad Grover participated in the Bangalore chapter meet organized by Data Security Council of India in Bangalore.&lt;/b&gt;
        &lt;p&gt;&lt;img src="http://editors.cis-india.org/home-images/DSCI.png/@@images/5964984e-07ca-4be0-8a63-98b2490b5032.png" alt="DSCI" class="image-inline" title="DSCI" /&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet'&gt;http://editors.cis-india.org/internet-governance/news/dscis-bangalore-chapter-meet&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Admin</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-02-02T01:47:51Z</dc:date>
   <dc:type>News Item</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes">
    <title>DSCI Best Practices Meet 2013</title>
    <link>http://editors.cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes</link>
    <description>
        &lt;b&gt;The DSCI Best Practices Meet 2013 was organized on July 12, 2013 at Hyatt Regency, Anna Salai in Chennai. Kovey Coles attended the meet and shares a summary of the happenings in this blog post.&lt;/b&gt;
        &lt;hr /&gt;
&lt;p&gt;&lt;i&gt;This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC&lt;/i&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;Last year’s annual Best Practices Meet, sponsored by the Data Security Council of India (DSCI), was held in here in Bangalore, and featured CIS associates as panelists for an agenda focused mostly around mobility in technology. This year, the event was continued in nearby Chennai, where many of India’s top stakeholders in Cyber Security came together at the Hyatt hotel to discuss the modern cyber security landscape. Several of the key points of the day emphasized how the industry realm needed to be especially keen on Cyber Security today. Early speakers explained how many Cyber-Attacks occur as opportunistic attacks on financial institutions, and that these breaches often take months to be discovered, with the discovery usually being made by a third-party. For those reasons, it was repeatedly mentioned throughout the day that modern entities must anticipate attacks as inevitable, and prepare themselves to be able to respond and successfully bounce-back.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Several panelists of the event expanded upon the evolving challenges facing industries, and explained why service based industry continually grows more susceptible to Cyber-Attack. There were representatives from Microsoft, Flextronics, MyEasyDoc, and others, who explained how technological demands of modern consumers resulted inadvertently in weaker security. For example, with customers expecting real-time access to data rather than periodic data reports, i.e financial data reports, industries must now keep their data open, which weakens database security. Overall, the primary challenge faced by the industry was effectively summarized by Microsoft India CSO Ganapathi Subramaniam, stating that within web services, “Security and usability are inversely proportional.” Essentially, the more convenient a product, the less secure its infrastructure.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Despite discussion of the difficulties facing modern producers and consumers, there were undoubtedly highlights of optimism at the conference. A presentation by event sponsor Juniper Networks shed light on practices which combat Cyber-Attackers, including rerouting perceived Distributed Denial of Service (DDoS) attacks and finger-printing suspected hackers through a series of characteristics rather than just IP addresses (these characteristics include browser version, fonts, Add-Ons, time zone, and more). Notably, there was a call for cooperation on all fronts in combatting Cyber-crime, for public-private partnerships (PPP), and many citizens stood and spoke on the behalf of civil society’s incorporation in the process as well. One speaker, Retired Brig. Abhimanyu Ghosh admirably tore down sector divisions in the face of Cyber-Security threats, saying “We all want to secure ourselves. It is not a question of industry versus government, government versus industry. Government needs industry, and industry needs government.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, a few speakers used their opportunity at the conference to highlight issues related to rights and responsibilities of both citizens and government in internet. Nikhil Moro, a scholar at the Hindu Center for Politics and Public Policy, spoke at length about the urgent condition of laws which undermine freedom of speech and freedom of expression in India, especially within while online. His talk, which occurred near the end of the event, stirred the crowd to discussion, and helped remind the attendees of the comprehensiveness of issues which demand attention in the realm of a growing internet presence.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes'&gt;http://editors.cis-india.org/internet-governance/blog/dsci-bpm-2013-conference-notes&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>kovey</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2013-07-26T08:18:01Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
