The Centre for Internet and Society
http://editors.cis-india.org
Security: Privacy, Transparency and Technology
http://editors.cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology
<b>The Centre for Internet and Society (CIS) has been involved in privacy and data protection research for the last five years. It has participated as a member of the Justice A.P. Shah Committee, which has influenced the draft Privacy Bill being authored by the Department of Personnel and Training. It has organised 11 multistakeholder roundtables across India over the last two years to discuss a shadow Privacy Bill drafted by CIS with the participation of privacy commissioners and data protection authorities from Europe and Canada.</b>
<p>The article was co-authored by Sunil Abraham, Elonnai Hickok and Tarun Krishnakumar. It was published by Observer Research Foundation, <a href="http://editors.cis-india.org/internet-governance/blog/security-privacy-transparency-technology.pdf" class="internal-link">Digital Debates 2015: CyFy Journal Volume 2</a>.</p>
<hr />
<p style="text-align: justify;">Some stakeholders considered our centre’s work on privacy incomplete because it lacked a focus on cyber security; we have therefore initiated research in that area this year. In this article, we undertake a preliminary examination of the theoretical relationships between the national security imperative and privacy, transparency and technology.</p>
<h3 style="text-align: justify;">Security and Privacy</h3>
<p style="text-align: justify;">Daniel J. Solove has identified the tension between security and privacy as a false dichotomy: "Security and privacy often clash, but there need not be a zero-sum tradeoff." <a name="fr1" href="#fn1">[1]</a> Further unpacking this false dichotomy, Bruce Schneier says, "There is no security without privacy. And liberty requires both security and privacy." <a name="fr2" href="#fn2">[2]</a> Effectively, it could be said that privacy is a precondition for security, just as security is a precondition for privacy. A secure information system cannot be designed without guaranteeing the privacy of its authentication factors, and it is not possible to guarantee privacy of authentication factors without having confidence in the security of the system. Often policymakers talk about a balance between the privacy and security imperatives—in other words a zero-sum game. Balancing these imperatives is a foolhardy approach, as it simultaneously undermines both. Balancing privacy and security should instead be framed as an optimisation problem. Indeed, during a time when oversight mechanisms have failed even in so-called democratic states, the regulatory power of technology <a name="fr3" href="#fn3">[3]</a> should be seen as an increasingly important ingredient in the solution of that optimisation problem.</p>
<p style="text-align: justify;">Data retention is required in most jurisdictions for law enforcement, intelligence and military purposes. Here are three examples of how security and privacy can be optimised when it comes to Internet Service Provider (ISP) or telecom operator logs:</p>
<ol>
<li style="text-align: justify;"><strong>Data Retention</strong>: We propose that the office of the Privacy Commissioner generate a cryptographic key pair for each internet user and give one key to the ISP / telecom operator. This key would be used to encrypt logs, thereby preventing unauthorised access. Once there is executive or judicial authorisation, the Privacy Commissioner could hand over the second key to the authorised agency. There could even be an emergency procedure and the keys could be automatically collected by concerned agencies from the Privacy Commissioner. This will need to be accompanied by a policy that criminalises the possession of unencrypted logs by ISP and telecom operators.<br /><br /></li>
<li style="text-align: justify;"><strong>Privacy-Protective Surveillance</strong>: Ann Cavoukian and Khaled El Emam <a name="fr4" href="#fn4">[4]</a> have proposed combining intelligent agents, homomorphic encryption and probabilistic graphical models to provide “a positive-sum, ‘win–win’ alternative to current counter-terrorism surveillance systems.” They propose limiting collection of data to “significant” transactions or events that could be associated with terrorist-related activities, limiting analysis to wholly encrypted data, which then does not just result in “discovering more patterns and relationships without an understanding of their context” but rather “intelligent information—information selectively gathered and placed into an appropriate context to produce actual knowledge.” Since fully homomorphic encryption may be unfeasible in real-world systems, they have proposed use of partially homomorphic encryption. But experts such as Prof. John Mallery from MIT are also working on solutions based on fully homomorphic encryption.<br /><br /></li>
<li style="text-align: justify;"><strong>Fishing Expedition Design</strong>: Madan Oberoi, Pramod Jagtap, Anupam Joshi, Tim Finin and Lalana Kagal have proposed a standard <a name="fr5" href="#fn5">[5]</a> that could be adopted by authorised agencies, telecom operators and ISPs. Instead of giving authorised agencies complete access to logs, they propose a format for database queries that authorised agencies could send to the telecom operator or ISP. The telecom operator or ISP would then process the query and anonymise/obfuscate the result-set in an automated fashion based on applicable privacy policies and regulations. Authorised agencies would then home in on the subset of the result-set they need with personal identifiers intact, and only this smaller result-set would be shared with them.</li></ol>
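The escrow flow in the data retention proposal can be sketched with textbook RSA. This is a toy illustration only (tiny primes, no padding); a real deployment would use a vetted cryptographic library and hybrid encryption, and the key sizes, record encoding and authorisation workflow here are all assumptions for the sketch.

```python
# Toy sketch of the proposed log-escrow scheme using textbook RSA
# (tiny primes, no padding) -- for illustration only.

# The Privacy Commissioner generates the key pair.
p, q = 61, 53
n = p * q        # public modulus, shared with the ISP
e = 17           # public exponent, given to the ISP for encryption
d = 2753         # private exponent, retained by the Commissioner

def encrypt(m: int) -> int:
    """ISP side: encrypt a log record (encoded as an integer < n)."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Agency side, after the Commissioner releases d under
    executive or judicial authorisation."""
    return pow(c, d, n)

log_records = [42, 1001, 7]                  # stand-ins for log entries
stored = [encrypt(m) for m in log_records]   # ISP stores only ciphertext

# After authorisation, the agency decrypts the retained logs.
assert [decrypt(c) for c in stored] == log_records
```

The point of the design is the separation of duties: the ISP can write encrypted logs but never read them back, while the Commissioner holds the decryption key and releases it only on authorisation.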
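The "analysis over encrypted data" idea in the privacy-protective surveillance proposal rests on homomorphic encryption. As a minimal illustration of *partial* homomorphism, unpadded textbook RSA (toy parameters again, not Cavoukian and El Emam's actual scheme, which contemplates other partially homomorphic systems) is homomorphic with respect to multiplication: multiplying two ciphertexts yields a ciphertext of the product, so an analyst holding only encrypted values can compute on them without ever seeing the plaintexts.

```python
# Multiplicative homomorphism of textbook RSA (toy parameters).
p, q = 61, 53
n, e, d = p * q, 17, 2753   # classic textbook key pair

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

# An analyst holding only ciphertexts multiplies them...
c = (encrypt(7) * encrypt(9)) % n

# ...and the key holder decrypts the product without the analyst
# ever seeing 7 or 9 (valid while the product stays below n).
assert decrypt(c) == 7 * 9
```

A fully homomorphic scheme would support both addition and multiplication on ciphertexts, which is what makes arbitrary computation over encrypted data possible in principle.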
<p style="text-align: justify;">An optimisation approach to resolving the false dichotomy between privacy and security will not allow for a total surveillance regime as pursued by the US administration. Total surveillance brings with it the ‘honey pot’ problem: If all the metadata and payload data of citizens are harvested and stored, the data store becomes a single point of failure and an attractive target for attack. The next Snowden may not have honourable intentions and might decamp with this ‘honey pot’ itself, which would have disastrous consequences.</p>
<p style="text-align: justify;">If total surveillance will completely undermine the national security imperative, what then should be the optimal level of surveillance in a population? The answer depends upon the existing security situation. If this is represented on a graph with security on the y-axis and the proportion of the population under surveillance on the x-axis, the benefits of surveillance could be represented by an inverted hockey-stick curve. To begin with, there would already be some degree of security. As a small subset of the population is brought under surveillance, security would increase until an optimum level is reached, after which expanding the number of people under surveillance would not result in any security pay-off. Instead, unnecessary surveillance would diminish security, as it would introduce all sorts of new vulnerabilities. Depending on the existing security situation, the head of the hockey-stick curve might be bigger or smaller. To use a gastronomic analogy, optimal surveillance is like salt in cooking—necessary in small quantities but counter-productive even if slightly in excess.</p>
<p style="text-align: justify;">In India the designers of surveillance projects have fortunately rejected the total surveillance paradigm. For example, the objective of the National Intelligence Grid (NATGRID) is to streamline and automate targeted surveillance; it is introducing technological safeguards that will allow express combinations of result-sets from 22 databases to be made available to 12 authorised agencies. This is not to say that the design of the NATGRID cannot be improved.</p>
<h3>Security and Transparency</h3>
<p style="text-align: justify;">There are two views on security and transparency: One, security via obscurity as advocated by vendors of proprietary software, and two, security via transparency as advocated by free/open source software (FOSS) advocates and entrepreneurs. Over the last two decades, public and industry opinion has swung towards security via transparency. This is based on Linus's law that “given enough eyeballs, all bugs are shallow.” But does this mean that transparency is a sufficient condition for security? Unfortunately not, and therefore it is not necessarily true that FOSS and open standards will be more secure than proprietary software and proprietary standards.</p>
<blockquote style="text-align: justify;" class="pullquote">Optimal surveillance is like salt in cooking—necessary in small quantities but counter-productive even if slightly in excess.</blockquote>
<p style="text-align: justify;">The recent detection of the Heartbleed <a name="fr6" href="#fn6">[6]</a> security bug in OpenSSL, <a name="fr7" href="#fn7">[7]</a> which allowed attackers to read more memory than should have been accessible, and Snowden’s revelations about the compromise of some open cryptographic standards (which depend on elliptic curves) developed by the US National Institute of Standards and Technology are stark examples. <a name="fr8" href="#fn8">[8]</a></p>
<p style="text-align: justify;">At the same time, however, open standards and FOSS are crucial to maintaining the balance of power in information societies, as civil society and the general public are able to resist the powers of authoritarian governments and rogue corporations using cryptographic technology. These technologies allow for anonymous speech, pseudonymous speech, private communication, online anonymity and circumvention of surveillance and censorship. For the media, these technologies enable anonymity of sources and the protection of whistle-blowers—all phenomena that are critical to the functioning of a robust and open democratic society. But these very same technologies are also required by states and by the private sector for a variety of purposes—national security, e-commerce, e-banking, protection of all forms of intellectual property, and services that depend on confidentiality, such as legal or medical services.</p>
<p style="text-align: justify;">In other words, all governments, with the exception of the US government, have common cause with civil society, media and the general public when it comes to increasing the security of open standards and FOSS. Unfortunately, this can be quite an expensive task because the re-securing of open cryptographic standards depends on mathematicians. Of late, mathematical research outputs that can be militarised are no longer available in the public domain because the biggest employers of mathematicians worldwide today are the US military and intelligence agencies. If other governments invest a few billion dollars through mechanisms like Knowledge Ecology International’s proposed World Trade Organization agreement on the supply of knowledge as a public good, we would be able to internationalise participation in standard-setting organisations and provide market incentives for greater scrutiny of cryptographic standards and patching of FOSS vulnerabilities. This would go a long way in addressing the trust deficit that exists on the internet today.</p>
<h3 style="text-align: justify;">Security and Technology</h3>
<p style="text-align: justify;">A techno-utopian understanding of security assumes that more technology, more recent technology and more complex technology will necessarily lead to better security outcomes.</p>
<p style="text-align: justify;">This is partly because the security discourse is dominated by vendors with sales targets who do not present a balanced or accurate picture of the technologies they sell. The result is that state agencies and the general public hold an exaggerated understanding of the capabilities of surveillance technologies, one more aligned with Hollywood movies than with everyday reality.</p>
<h3 style="text-align: justify;">More Technology</h3>
<p style="text-align: justify;">Increasing the number of x-ray machines or full-body scanners at an airport by a factor of ten or a hundred will make the airport less secure unless human oversight is similarly increased. Even with increased human oversight, all that has been accomplished is an increase in the number of locations that can be compromised. The process of hardening a server usually involves stopping non-essential services and removing non-essential software. This reduces the amount of software that must be audited, continuously monitored for vulnerabilities and patched as soon as possible. Audits, ongoing monitoring and patching all cost time and money; therefore, for governments with limited budgets, any additional unnecessary technology should be seen as a drain on the security budget. As with the airport example, even when it comes to a single server on the internet, it is clear that, from a security perspective, more technology without a proper functionality and security justification is counter-productive. To reiterate, throwing ever more technology at a problem does not make things more secure; rather, it results in a proliferation of vulnerabilities.</p>
<h3 style="text-align: justify;">Latest Technology</h3>
<p style="text-align: justify;">Reports that a number of state security agencies are contemplating a return to typewriters for sensitive communications in the wake of Snowden’s revelations make it clear that some older technologies are harder to compromise than modern technology. <a name="fr9" href="#fn9">[9]</a> Likewise, between iris- and fingerprint-based biometric authentication, it would logically be easier for a criminal to harvest iris images in bulk using a high-resolution camera fitted with a zoom lens in a public location than to lift fingerprints at a comparable scale.</p>
<h3 style="text-align: justify;">Complex Technology</h3>
<p style="text-align: justify;">Fifteen years ago, Bruce Schneier said, "The worst enemy of security is complexity. This has been true since the beginning of computers, and it’s likely to be true for the foreseeable future." <a name="fr10" href="#fn10">[10]</a> This is because complexity increases fragility; every feature is also a potential source of vulnerabilities and failures. The simpler Indian electronic voting machines used until the 2014 elections were far more secure than the Diebold voting machines used in the 2004 US presidential election. Similarly, when it comes to authentication, a PIN is harder to defeat without the user’s conscious cooperation than iris- or fingerprint-based biometric authentication.</p>
<p style="text-align: justify;">In the following section of the paper we identify five threat scenarios <a name="fr11" href="#fn11">[11]</a> relevant to India and outline possible solutions based on our theoretical framing above.</p>
<h3 style="text-align: justify;">Threat Scenarios and Possible Solutions</h3>
<p style="text-align: justify;"><strong>Hacking the NIC Certifying Authority</strong><br />One of the critical functions served by the National Informatics Centre (NIC) is that of a Certifying Authority (CA). <a name="fr12" href="#fn12">[12]</a> In this capacity, the NIC issues digital certificates that authenticate web services and allow for the secure exchange of information online. <a name="fr13" href="#fn13">[13]</a> Operating systems and browsers maintain lists of trusted CA root certificates as a means of easily verifying authentic certificates. Certificates issued under India’s Controller of Certifying Authorities are included in the Microsoft Root list and recognised by the majority of programs running on Windows, including Internet Explorer and Chrome. <a name="fr14" href="#fn14">[14]</a> In 2014, the NIC CA’s infrastructure was compromised, and digital certificates were issued in NIC’s name without its knowledge. <a name="fr15" href="#fn15">[15]</a> Reports indicate that NIC did not "have an appropriate monitoring and tracking system in place to detect such intrusions immediately." <a name="fr16" href="#fn16">[16]</a> The implication is that websites could masquerade as other domains using the fake certificates, and personal data of users could be intercepted or accessed by third parties through the masquerading websites. The breach also rendered web servers and websites of government bodies vulnerable to attack, and end users could no longer be sure that data on these websites was accurate and had not been tampered with. <a name="fr17" href="#fn17">[17]</a> The NIC CA was forced to revoke all 250,000 SSL server certificates issued until that date <a name="fr18" href="#fn18">[18]</a> and has stopped issuing digital certificates for the time being. <a name="fr19" href="#fn19">[19]</a></p>
<p style="text-align: justify;">Public key pinning is a means through which websites can specify which certifying authorities have issued certificates for that site, and can prevent man-in-the-middle attacks that rely on fake digital certificates. <a name="fr20" href="#fn20">[20]</a> Certificate Transparency allows anyone to check whether a certificate has been properly issued, because certifying authorities must publicly log the digital certificates they issue. Though this approach does not prevent fake digital certificates from being issued, it allows for quick detection of misuse. <a name="fr21" href="#fn21">[21]</a></p>
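The pinning check can be sketched as follows. This is a simplification: real pinning (as standardised in RFC 7469) hashes only the certificate's SubjectPublicKeyInfo structure, whereas this toy hashes the whole certificate, and the byte strings standing in for DER-encoded certificates are hypothetical.

```python
import base64
import hashlib

def pin(cert_der: bytes) -> str:
    """Toy pin: base64 SHA-256 of the whole certificate. Real pinning
    (RFC 7469) hashes only the SubjectPublicKeyInfo structure."""
    return base64.b64encode(hashlib.sha256(cert_der).digest()).decode()

def pin_ok(cert_der: bytes, pinned: set) -> bool:
    """A certificate is accepted only if it matches a published pin."""
    return pin(cert_der) in pinned

# Hypothetical stand-ins for DER-encoded certificates.
legit_cert = b"DER bytes of the certificate the site expects"
forged_cert = b"DER bytes of a certificate issued in NIC's name"

pinned = {pin(legit_cert)}               # published in advance by the site
assert pin_ok(legit_cert, pinned)        # legitimate certificate accepted
assert not pin_ok(forged_cert, pinned)   # forged certificate rejected
```

The forged certificate fails even though it chains to a trusted root: pinning checks identity against the site's own declaration, not against the CA hierarchy.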
<p style="text-align: justify;"><strong>‘Logic Bomb’ against Airports</strong><br />Passenger operations at New Delhi’s Indira Gandhi International Airport depend on a centralised operating system known as the Common User Passenger Processing System (CUPPS). The system integrates numerous critical functions such as the arrival and departure times of flights, and manages the reservation system and check-in schedules. <a name="fr22" href="#fn22">[22]</a> In 2011, a logic bomb attack was launched remotely against the system, introducing malicious code into the CUPPS software. The attack disabled the CUPPS operating system, forcing a number of check-in counters to shut down completely while others reverted to manual check-in, resulting in over 50 delayed flights. Investigations revealed that the attack was launched by three disgruntled employees who had assisted in the installation of the CUPPS system at the New Delhi airport. <a name="fr23" href="#fn23">[23]</a> Although in this case the impact of the attack was limited to flight delays, experts speculate that the attack was meant to take down the entire system. The disruption and damage resulting from the shutdown of an entire airport would be extensive.</p>
<p style="text-align: justify;">Adoption of open hardware and FOSS is one strategy to avoid and mitigate the risk of such vulnerabilities. The use of devices that embrace open hardware and software specifications must be encouraged, as this helps the FOSS community remain vigilant in detecting and reporting design deviations and in investigating probable vulnerabilities.</p>
<p style="text-align: justify;"><strong>Attack on Critical Infrastructure</strong><br />The Nuclear Power Corporation of India encounters and prevents numerous cyber attacks every day. <a name="fr24" href="#fn24">[24]</a> The best known example of a successful attack on a nuclear facility is the Stuxnet worm, which disrupted operations at an Iranian uranium enrichment complex and set back the country’s nuclear programme. <a name="fr25" href="#fn25">[25]</a></p>
<p style="text-align: justify;">The worm had the ability to spread over the network and would activate when a specific configuration of systems was encountered <a name="fr26" href="#fn26">[26]</a> and connected to one or more Siemens programmable logic controllers. <a name="fr27" href="#fn27">[27]</a> The worm was suspected to have been initially introduced through an infected USB drive into one of the controller computers by an insider, thus crossing the air gap. <a name="fr28" href="#fn28">[28]</a> The worm used information that it gathered to take control of normal industrial processes (to discreetly speed up centrifuges, in the present case), leaving the operators of the plant unaware that they were being attacked. This incident demonstrates how an attack vector introduced into the general internet can be used to target specific system configurations. When the target of a successful attack is a sector as critical and secured as a nuclear complex, the implications for a country’s security and infrastructure are potentially grave.</p>
<p style="text-align: justify;">Security audits and other transparency measures to identify vulnerabilities are critical in sensitive sectors. Incentive schemes such as prizes, contracts and grants may be evolved for the private sector and academia to identify vulnerabilities in the infrastructure of critical resources to enable/promote security auditing of infrastructure.</p>
<p style="text-align: justify;"><strong>Micro Level: Chip Attacks</strong><br />Semiconductor devices are ubiquitous in electronic equipment. The US, Japan, Taiwan, Singapore, Korea and China are the primary countries hosting manufacturing hubs for these devices. India currently does not produce semiconductors and depends on imported chips. This dependence on foreign semiconductor technology can result in the import and use of compromised or counterfeit chips by critical sectors in India. For example, hardware Trojans capable of accessing personal information and content on a device may be inserted into a chip. Such compromises can render equipment in critical sectors vulnerable to attack and threaten national security. <a name="fr29" href="#fn29">[29]</a></p>
<p style="text-align: justify;">Indigenous production of critical technologies and the development of manpower and infrastructure to support these activities are needed. The Government of India has taken a number of steps towards this. For example, in 2013 it approved the building of two semiconductor wafer fabrication (fab) facilities <a name="fr30" href="#fn30">[30]</a> and, as of January 2014, India was seeking to establish its first semiconductor characterisation lab in Bangalore. <a name="fr31" href="#fn31">[31]</a></p>
<p style="text-align: justify;"><strong>Macro Level: Telecom and Network Switches</strong></p>
<p style="text-align: justify;">The possibility of foreign equipment containing vulnerabilities and backdoors built into its software and hardware gives rise to concerns that India’s telecom and network infrastructure is vulnerable to being hacked and accessed by foreign governments (or non-state actors) through spyware and malware that exploit such vulnerabilities. In 2013, some firms, including ZTE and Huawei, were barred by the Indian government from participating in a bid to supply technology for the development of its National Optical Fibre Network project due to security concerns. <a name="fr32" href="#fn32">[32]</a> Similar concerns have resulted in the Indian government holding back the conferment of ‘domestic manufacturer’ status on both these firms. <a name="fr33" href="#fn33">[33]</a></p>
<p style="text-align: justify;">Following reports that Chinese firms were responsible for transnational cyber attacks designed to steal confidential data from overseas targets, there have been moves to establish laboratories to test imported telecom equipment in India. <a name="fr34" href="#fn34">[34]</a> Despite these steps, in a February 2014 incident the state-owned telecommunication company Bharat Sanchar Nigam Ltd’s network was hacked, allegedly by Huawei. <a name="fr35" href="#fn35">[35]</a></p>
<blockquote style="text-align: justify;" class="pullquote">Security practitioners and policymakers need to avoid the zero-sum framing prevalent in popular discourse regarding security vis-à-vis privacy, transparency and technology.</blockquote>
<p style="text-align: justify;">A successful hack of the telecom infrastructure could result in massive disruption of internet and telecommunications services. Large-scale surveillance and espionage by foreign actors would also become possible, placing governmental secrets and individuals’ personal information, among much else, at risk.</p>
<p style="text-align: justify;">While India cannot afford a general ban on the import of foreign telecommunications equipment, a number of steps can be taken to address the risk of built-in security vulnerabilities. Common criteria for security audits could be evolved internationally to ensure compliance of products with accepted norms and practices. While India has already established Common Criteria evaluation centres, <a name="fr36" href="#fn36">[36]</a> the government monopoly over the testing function has resulted in only three products being tested so far. A code escrow regime could also be set up, under which manufacturers deposit source code with the Government of India for security audits and verification; the source code could then be compared with the shipped software to detect built-in vulnerabilities.</p>
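The verification step of such an escrow regime amounts to a digest comparison, sketched below with hypothetical byte strings standing in for binaries. In practice this presupposes reproducible builds, so that the escrowed source compiles to a bit-identical binary against which shipped firmware can be checked.

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest used to compare escrowed and shipped software."""
    return hashlib.sha256(data).hexdigest()

def verify(escrowed: bytes, shipped: bytes) -> bool:
    """True if the shipped software matches the auditors' build
    of the deposited source; False flags the equipment for audit."""
    return sha256(escrowed) == sha256(shipped)

# Hypothetical stand-ins for the two binaries being compared.
escrow_build = b"binary built by auditors from deposited source"
shipped_build = b"binary extracted from the vendor's shipped device"

assert verify(escrow_build, escrow_build)         # identical builds pass
assert not verify(escrow_build, shipped_build)    # deviation is flagged
```

A digest mismatch does not by itself prove a backdoor, only that the shipped software deviates from the audited source; the follow-up is a deeper inspection of the deviating equipment.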
<h3 style="text-align: justify;">Conclusion</h3>
<p style="text-align: justify;">Cyber security cannot be enhanced without a proper understanding of the relationship between security and other national imperatives such as privacy, transparency and technology. This paper has provided an initial sketch of those relationships, but sustained theoretical and empirical research is required in India so that security practitioners and policymakers avoid the zero-sum framing prevalent in popular discourse and take on the hard task of solving the optimisation problem by shifting policy, market and technological levers simultaneously. These solutions must then be applied in multiple contexts or scenarios to determine how they should be customised to provide maximum security bang for the buck.</p>
<hr />
<p style="text-align: justify;">[<a name="fn1" href="#fr1">1</a>]. Daniel J. Solove, Chapter 1 in Nothing to Hide: The False Tradeoff between Privacy and Security (Yale University Press: 2011), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1827982.</p>
<p style="text-align: justify;">[<a name="fn2" href="#fr2">2</a>]. Bruce Schneier, “What our Top Spy doesn’t get: Security and Privacy aren’t Opposites,” Wired, January 24, 2008, http://archive.wired.com/politics/security/commentary/securitymatters/2008/01/securitymatters_0124 and Bruce Schneier, “Security vs. Privacy,” Schneier on Security, January 29, 2008, https://www.schneier.com/blog/archives/2008/01/security_vs_pri.html.</p>
<p style="text-align: justify;">[<a name="fn3" href="#fr3">3</a>]. There are four sources of power in internet governance: Market power exerted by private sector organisations; regulatory power exerted by states; technical power exerted by anyone who has access to certain categories of technology, such as cryptography; and finally, the power of public pressure sporadically mobilised by civil society. A technically sound encryption standard, if employed by an ordinary citizen, cannot be compromised using the power of the market or the regulatory power of states or public pressure by civil society. In that sense, technology can be used to regulate state and market behaviour.</p>
<p style="text-align: justify;">[<a name="fn4" href="#fr4">4</a>]. Ann Cavoukian and Khaled El Emam, “Introducing Privacy-Protective Surveillance: Achieving Privacy and Effective Counter-Terrorism,” Information & Privacy Commissioner, Ontario, Canada, September 2013, http://www.privacybydesign.ca/content/uploads/2013/12/pps.pdf.</p>
<p style="text-align: justify;">[<a name="fn5" href="#fr5">5</a>]. Madan Oberoi, Pramod Jagtap, Anupam Joshi, Tim Finin and Lalana Kagal, “Information Integration and Analysis: A Semantic Approach to Privacy” (presented at the Third IEEE International Conference on Information Privacy, Security, Risk and Trust, Boston, USA, October 2011), ebiquity.umbc.edu/_file_directory_/papers/578.pdf.</p>
<p style="text-align: justify;">[<a name="fn6" href="#fr6">6</a>]. Bruce Byfield, “Does Heartbleed disprove ‘Open Source is Safer’?,” Datamation, April 14, 2014, http://www.datamation.com/open-source/does-heartbleed-disprove-open-source-is-safer-1.html.</p>
<p style="text-align: justify;">[<a name="fn7" href="#fr7">7</a>]. “Cybersecurity Program should be more transparent, protect privacy,” Centre for Democracy and Technology Insights, March 20, 2009, https://cdt.org/insight/cybersecurity-program-should-be-more-transparent-protect-privacy/#1.</p>
<p style="text-align: justify;">[<a name="fn8" href="#fr8">8</a>]. “Cracked Credibility,” The Economist, September 14, 2013, http://www.economist.com/news/international/21586296-be-safe-internet-needs-reliable-encryption-standards-software-and.</p>
<p style="text-align: justify;">[<a name="fn9" href="#fr9">9</a>]. Miriam Elder, “Russian guard service reverts to typewriters after NSA leaks,” The Guardian, July 11, 2013, www.theguardian.com/world/2013/jul/11/russia-reverts-paper-nsa-leaks and Philip Oltermann, “Germany ‘may revert to typewriters’ to counter hi-tech espionage,” The Guardian, July 15, 2014, www.theguardian.com/world/2014/jul/15/germany-typewriters-espionage-nsa-spying-surveillance.</p>
<p style="text-align: justify;">[<a name="fn10" href="#fr10">10</a>]. Bruce Schneier, “A Plea for Simplicity,” Schneier on Security, November 19, 1999, https://www.schneier.com/essays/archives/1999/11/a_plea_for_simplicit.html.</p>
<p style="text-align: justify;">[<a name="fn11" href="#fr11">11</a>]. With inputs from Pranesh Prakash of the Centre for Internet and Society and Sharathchandra Ramakrishnan of Srishti School of Art, Technology and Design.</p>
<p style="text-align: justify;">[<a name="fn12" href="#fr12">12</a>]. “Frequently Asked Questions,” Controller of Certifying Authorities, Department of Electronics and Information Technology, Government of India, http://cca.gov.in/cca/index.php?q=faq-page#n41.</p>
<p>[<a name="fn13" href="#fr13">13</a>]. National Informatics Centre Homepage, Government of India, http://www.nic.in/node/41.</p>
<p style="text-align: justify;">[<a name="fn14" href="#fr14">14</a>]. Adam Langley, “Maintaining Digital Certificate Security,” Google Security Blog, July 8, 2014, http://googleonlinesecurity.blogspot.in/2014/07/maintaining-digital-certificate-security.html.</p>
<p style="text-align: justify;">[<a name="fn15" href="#fr15">15</a>]. This is similar to the kind of attack carried out against DigiNotar, a Dutch certificate authority. See: http://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1246&context=jss.</p>
<p>[<a name="fn16" href="#fr16">16</a>]. R. Ramachandran, “Digital Disaster,” Frontline, August 22, 2014, http://www.frontline.in/the-nation/digital-disaster/article6275366.ece.</p>
<p>[<a name="fn17" href="#fr17">17</a>]. Ibid.</p>
<p>[<a name="fn18" href="#fr18">18</a>]. “NIC’s digital certification unit hacked,” Deccan Herald, July 16, 2014, http://www.deccanherald.com/content/420148/archives.php.</p>
<p>[<a name="fn19" href="#fr19">19</a>]. National Informatics Centre Certifying Authority Homepage, Government of India, http://nicca.nic.in//.</p>
<p>[<a name="fn20" href="#fr20">20</a>]. Mozilla Wiki, “Public Key Pinning,” https://wiki.mozilla.org/SecurityEngineering/Public_Key_Pinning.</p>
<p style="text-align: justify;">[<a name="fn21" href="#fr21">21</a>]. “Certificate Transparency - The quick detection of fraudulent digital certificates,” Ascertia, August 11, 2014, http://www.ascertia.com/blogs/pki/2014/08/11/certificate-transparency-the-quick-detection-of-fraudulent-digital-certificates.</p>
<p style="text-align: justify;">[<a name="fn22" href="#fr22">22</a>]. “Indira Gandhi International Airport (DEL/VIDP) Terminal 3, India,” Airport Technology.com, http://www.airport-technology.com/projects/indira-gandhi-international-airport-terminal-3/.</p>
<p style="text-align: justify;">[<a name="fn23" href="#fr23">23</a>]. “How techies used logic bomb to cripple Delhi Airport,” Rediff, November 21, 2011, http://www.rediff.com/news/report/how-techies-used-logic-bomb-to-cripple-delhi-airport/20111121.htm.</p>
<p style="text-align: justify;">[<a name="fn24" href="#fr24">24</a>]. Manu Kaushik and Pierre Mario Fitter, “Beware of the bugs,” Business Today, February 17, 2013, http://businesstoday.intoday.in/story/india-cyber-security-at-risk/1/191786.html.</p>
<p>[<a name="fn25" href="#fr25">25</a>]. “Stuxnet ‘hit’ Iran nuclear plants,” BBC, November 22, 2010, http://www.bbc.com/news/technology-11809827.</p>
<p>[<a name="fn26" href="#fr26">26</a>]. In this case, systems using Microsoft Windows and running Siemens Step7 software were targeted.</p>
<p>[<a name="fn27" href="#fr27">27</a>]. Jonathan Fildes, “Stuxnet worm ‘targeted high-value Iranian assets’,” BBC, September 23, 2010, http://www.bbc.com/news/technology-11388018.</p>
<p style="text-align: justify;">[<a name="fn28" href="#fr28">28</a>]. Farhad Manjoo, “Don’t Stick it in: The dangers of USB drives,” Slate, October 5, 2010, http://www.slate.com/articles/technology/technology/2010/10/dont_stick_it_in.html.</p>
<p>[<a name="fn29" href="#fr29">29</a>]. Ibid.</p>
<p style="text-align: justify;">[<a name="fn30" href="#fr30">30</a>]. “IBM invests in new $5bn chip fab in India, so is chip sale off?,” ElectronicsWeekly, February 14, 2014, http://www.electronicsweekly.com/news/business/ibm-invests-new-5bn-chip-fab-india-chip-sale-2014-02/.</p>
<p style="text-align: justify;">[<a name="fn31" href="#fr31">31</a>]. NT Balanarayan, “Cabinet Approves Creation of Two Semiconductor Fabrication Units,” Medianama, February 17, 2014, http://articles.economictimes.indiatimes.com/2014-02-04/news/47004737_1_indian-electronics-special-incentive-package-scheme-semiconductor-association.</p>
<p style="text-align: justify;">[<a name="fn32" href="#fr32">32</a>]. Jamie Yap, “India bars foreign vendors from national broadband initiative,” ZD Net, January 21, 2013, http://www.zdnet.com/in/india-bars-foreign-vendors-from-national-broadband-initiative-7000010055/.</p>
<p style="text-align: justify;">[<a name="fn33" href="#fr33">33</a>]. Kevin Kwang, “India holds back domestic-maker status for Huawei, ZTE,” ZD Net, February 6, 2013, http://www.zdnet.com/in/india-holds-back-domestic-maker-status-for-huawei-zte-7000010887/. Also see “Huawei, ZTE await domestic-maker tag,” The Hindu, February 5, 2013, http://www.thehindu.com/business/companies/huawei-zte-await-domesticmaker-tag/article4382888.ece.</p>
<p style="text-align: justify;">[<a name="fn34" href="#fr34">34</a>]. Ellyne Phneah, “Huawei, ZTE under probe by Indian government,” ZD Net, May 10, 2013, http://www.zdnet.com/in/huawei-zte-under-probe-by-indian-government-7000015185/.</p>
<p style="text-align: justify;">[<a name="fn35" href="#fr35">35</a>]. Devidutta Tripathy, “India investigates report of Huawei hacking state carrier network,” Reuters, February 6, 2014, http://www.reuters.com/article/2014/02/06/us-india-huawei-hacking-idUSBREA150QK20140206.</p>
<p>[<a name="fn36" href="#fr36">36</a>]. “Products Certified,” Common Criteria Portal of India, http://www.commoncriteria-india.gov.in/Pages/ProductsCertified.aspx.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology'>http://editors.cis-india.org/internet-governance/blog/security-privacy-transparency-and-technology</a>
</p>
A Review of the Policy Debate around Big Data and Internet of Things
http://editors.cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things
<b>This blog post seeks to review and understand how regulators and experts across jurisdictions are reacting to Big Data and Internet of Things (IoT) from a policy perspective.</b>
<h3>Defining and Connecting Big Data and Internet of Things</h3>
<p style="text-align: justify; ">The Internet of Things (IoT) refers to networked objects and systems that can connect to the internet and transmit and receive data. Characteristics of IoT include the gathering of information through sensors, the automation of functions, and the analysis of collected data.[1] Because of the <i>velocity</i> at which data is generated, the <i>volume</i> of data generated, and the <i>variety</i> of data generated by different sources,[2] IoT devices can be understood as generating Big Data and/or relying on Big Data analytics. In this way, IoT devices and Big Data are intrinsically interconnected.</p>
<h3>General Implications of Big Data and Internet of Things</h3>
<p style="text-align: justify; ">Big Data paradigms are being adopted across countries, governments, and business sectors because of the potential insights and changes they can bring. From improving an organization's business model to facilitating urban development, enabling targeted and individualized services, and predicting certain events or actions, the application of Big Data has been recognized as having the potential to bring about dramatic and large-scale change.</p>
<p style="text-align: justify; ">At the same time, experts have identified risks to the individual that can be associated with the generation, analysis, and use of Big Data. In May 2014, the White House completed a ninety-day study of how Big Data will change everyday life. The report highlights the potential of Big Data while identifying a number of associated concerns: the selling of personal data; the identification or re-identification of individuals; the profiling of individuals; the creation and exacerbation of information asymmetries; unfair, discriminatory, biased, or incorrect decisions based on Big Data analytics; and a lack of, or misinformed, user consent.[3] Errors in Big Data analytics that experts have identified include statistical fallacies, human bias, translation errors, and data errors.[4] Experts have also discussed fundamental changes that Big Data can bring about. For example, Danah Boyd and Kate Crawford, in the article <i>"Critical Questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon"</i>, propose that Big Data can change the definition of knowledge and shape the reality it measures.[5] Similarly, a BCS/Oxford Internet Institute conference report titled "<i>The Societal Impact of the Internet of Things</i>" points out that users of Big Data often assume that information and conclusions based on digital data are reliable, and in turn replace other forms of information with digital data.[6]</p>
<p style="text-align: justify; ">Concerns voiced by the Article 29 Working Party and others specifically about IoT devices include insufficient security features (such as encryption) built into devices, the reliance of the devices on wireless communications, data loss from malware infection or hacking, unauthorized access to and use of personal data, function creep resulting from multiple IoT devices being used together, and unlawful surveillance.[7]</p>
<h3>Regulation of Big Data and Internet of Things</h3>
<p style="text-align: justify; ">The regulation of Big Data and IoT is currently being debated in contexts such as the US and the EU. Academics, civil society, and regulators are exploring whether present regulatory and oversight frameworks are adequate to address the changes brought about by Big Data and, if not, what forms of or changes in regulation are needed. For example, Kate Crawford and Jason Schultz, in the article <i>"Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms"</i>, stress the importance of bringing in 'data due process rights', i.e., ensuring fairness in Big Data analytics and in how personal information is used.[8] Solon Barocas and Andrew Selbst, in the article <i>"Big Data's Disparate Impact"</i>, explore whether present anti-discrimination legislation and jurisprudence in the US are adequate to protect against discrimination arising from Big Data practices, specifically data mining.[9]</p>
<p><strong>The Impact of Big Data and IoT on Data Protection Principles</strong></p>
<p style="text-align: justify; ">In the context of data protection, various government bodies, including the Article 29 Data Protection Working Party set up under Directive 95/46/EC of the European Parliament, the Council of Europe, the European Commission, and the Federal Trade Commission, as well as experts and academics in the field, have identified at least ten data protection principles and concepts that Big Data impacts:</p>
<ol>
<li style="text-align: justify; "><strong>Collection Limitation:</strong> As a result of the generation of Big Data as enabled by networked devices, increased capabilities to analyze Big Data, and the prevalent use of networked systems - the principle of collection limitation is changing.[10]</li>
<li><strong>Consent: </strong>As a result of the use of data from a wide variety of sources and the re-use of data which is inherent in Big Data practices - notions of informed consent (initial and secondary) are changing.[11]</li>
<li><strong>Data Minimization:</strong> As a result of Big Data practices inherently utilizing all data possible - the principle of data minimization is changing/obsolete.[12]</li>
<li><strong>Notice:</strong> As a result of Big Data practices relying on vast amounts of data from numerous sources and the re-use of that data - the principle of notice is changing.[13]</li>
<li><strong>Purpose Limitation:</strong> As a result of Big Data practices re-using data for multiple purposes - the principle of purpose limitation is changing/obsolete.[14]</li>
<li><strong>Necessity: </strong>As a result of Big Data practices re-using data, the new use or re-analysis of data may not be pertinent to the purpose that was initially specified - thus the principle of necessity is changing.[15]</li>
<li><strong>Access and Correction:</strong> As a result of Big Data being generated (and sometimes published) at scale and in real time - the principle of user access and correction is changing.[16]</li>
<li><strong>Opt In and Opt Out Choices: </strong>Particularly in the context of smart cities and IoT which collect data on a real time basis, often without the knowledge of the individual, and for the provision of a service - it may not be easy or possible for individuals to opt in or out of the collection of their data.[17]</li>
<li><strong>Personal Information (PI):</strong> As a result of Big Data analytics using and analyzing a wide variety of data, new or unexpected forms of personal data may be generated - thus challenging and evolving beyond traditional or specified definitions of personal information.[18]</li>
<li><strong>Data Controller:</strong> In the context of IoT, given the multitude of actors that can collect, use and process data generated by networked devices, the traditional understanding of what and who is a data controller is changing.[19]</li>
</ol>
<h3 style="text-align: justify; ">Possible Technical and Policy Solutions</h3>
<p style="text-align: justify; ">In a report titled "<i>Internet of Things: Privacy & Security in a Connected World</i>", the US Federal Trade Commission noted that though IoT changes the application and understanding of certain privacy principles, it does not necessarily make them obsolete.[20] Indeed, many of the solutions suggested to address the challenges posed by IoT and Big Data are technical interventions at the device level rather than fundamental policy changes. For example, it has been proposed that IoT devices can be programmed to:</p>
<ul>
<li>Automatically delete data after a specified period of time [21] (addressing concerns of data retention)</li>
<li>Ensure that personal data is not fed into centralized databases on an automatic basis [22] (addressing concerns of transfer and sharing without consent, function creep, and data breach)</li>
<li style="text-align: justify; ">Offer consumers combined choices for consent, rather than requiring a one-time blanket consent when initiating a service or taking fresh consent for every change that takes place while a consumer is using a service. [23] (addressing concerns of informed and meaningful consent)</li>
<li style="text-align: justify; ">Categorize and tag data with accepted uses and programme automated processes to flag when data is misused. [24] (addressing concerns of misuse of data)</li>
<li style="text-align: justify; ">Apply 'sticky policies' - policies that are attached to data and define appropriate uses of the data as it 'changes hands' [25] (addressing concerns of user control of data)</li>
<li style="text-align: justify; ">Allow for features to only be turned on with consent from the user [26] (addressing concerns of informed consent and collection without the consent or knowledge of the user)</li>
<li>Automatically convert raw personal data to aggregated data [27] (addressing concerns of misuse of personal data and function creep)</li>
<li>Offer users the option to delete or turn off sensors [28] (addressing concerns of user choice, control, and consent)</li>
</ul>
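<p style="text-align: justify; ">Several of the device-level measures above, such as deleting data after a retention period and converting raw personal readings to aggregates before transmission, can be sketched in code. The following is a minimal illustrative sketch in Python; the class, method names, and retention period are our own illustration, not drawn from any of the cited reports:</p>

```python
import time
from statistics import mean

RETENTION_SECONDS = 24 * 60 * 60  # hypothetical retention period (one day)


class SensorStore:
    """Toy in-memory sensor store illustrating two device-level
    privacy measures: time-based deletion and aggregation."""

    def __init__(self):
        self._readings = []  # list of (timestamp, value) pairs

    def record(self, value, now=None):
        """Store a raw reading with its collection timestamp."""
        self._readings.append((now if now is not None else time.time(), value))

    def purge_expired(self, now=None):
        """Automatically delete readings older than the retention period."""
        now = now if now is not None else time.time()
        self._readings = [(t, v) for (t, v) in self._readings
                          if now - t < RETENTION_SECONDS]

    def aggregate(self):
        """Convert raw personal readings to an aggregate summary,
        so only aggregated data ever leaves the device."""
        values = [v for (_, v) in self._readings]
        return {"count": len(values), "mean": mean(values)} if values else None
```

<p style="text-align: justify; ">In such a design, purge_expired() would run on a device timer, and only the output of aggregate(), never the raw readings, would be transmitted to any central database.</p>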
<p style="text-align: justify; ">Such solutions place the designers and manufacturers of IoT devices in a critical role. Yet some, such as Kate Crawford and Jason Schultz, are not entirely optimistic about the possibility of effective technological solutions, noting in the context of automated decision making that it is difficult to build in privacy protections because it is unclear when an algorithm will predict personal information about an individual.[29]</p>
<p>Experts have also suggested that more emphasis should be placed on the principles and practices of:</p>
<ul>
<li>Transparency</li>
<li>Access and correction</li>
<li>Use/misuse</li>
<li>Breach notification</li>
<li>Remedy</li>
<li>Ability to withdraw consent</li>
</ul>
<p style="text-align: justify; ">Others have recommended that certain privacy principles need to be adapted to the Big Data/IoT context. For example, the Article 29 Working Party has clarified that in the context of IoT, consent mechanisms need to cover the types of data collected and the frequency of and conditions for data collection.[30] The Federal Trade Commission, meanwhile, has warned that adopting a pure "use"-based model has its limitations: it requires a clear (and potentially changing) definition of what use is and is not acceptable, and it does not address concerns around the collection of sensitive personal information.[31] In addition, the European Commission has stressed that the right of deletion, the right to be forgotten, and data portability also need to be foundations of IoT systems and devices.[32]</p>
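<p style="text-align: justify; ">The Working Party's point that IoT consent must cover the types of data collected, and the frequency of and conditions for collection, implies consent records richer than a single yes/no flag. A hypothetical sketch of such a record in Python (the field and method names are our own, not taken from the Opinion):</p>

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ConsentRecord:
    """Hypothetical consent record capturing the elements the Article 29
    Working Party says IoT consent mechanisms should include."""
    data_types: List[str]           # e.g. ["heart_rate", "location"]
    collection_frequency: str       # e.g. "continuous", "hourly"
    conditions: List[str] = field(default_factory=list)  # e.g. ["device paired"]
    granted: bool = False

    def covers(self, data_type: str) -> bool:
        """Consent only covers a data type that is explicitly listed
        and for which consent has been granted."""
        return self.granted and data_type in self.data_types
```

<p style="text-align: justify; ">A device built around such a record would call covers() before collecting each category of data, rather than relying on a one-time blanket consent.</p>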
<h3>Possible Regulatory Frameworks</h3>
<p style="text-align: justify; ">To the question of whether current regulatory frameworks are adequate or additional legislation is needed, the FTC has recommended that though specific IoT legislation may not be necessary, horizontal privacy legislation would be useful, as sectoral legislation does not always account for the use, sharing, and re-use of data across sectors. The FTC also highlighted the usefulness of privacy impact assessments and self-regulatory steps to ensure privacy.[33] The European Commission, on the other hand, has concluded that hard legal instruments are necessary to ensure enforcement of any standard or protocol.[34] As mentioned earlier, Kate Crawford and Jason Schultz have argued that privacy regulation needs to move away from principles on collection, specific use, disclosure, notice, etc., and focus on elements of due process around the use of Big Data - what they call "procedural data due process". Such due process should be based on values instead of defined procedures, and should include, at a minimum, notice, a hearing before an independent arbitrator, and the right to review. Crawford and Schultz more broadly note that there are conceptual differences between privacy law and Big Data that pose serious challenges: privacy law is based on causality, while Big Data is a tool of correlation. This difference raises questions about how effective regulation that identifies certain types of information and then seeks to control the use, collection, and disclosure of such information will be in the context of Big Data - something that is varied and dynamic. According to Crawford and Schultz, many regulatory frameworks will struggle with this difference, including the FTC's Fair Information Privacy Principles and EU regulation such as the EU's right to be forgotten.[35]</p>
<p style="text-align: justify; ">The European Data Protection Supervisor, on the other hand, looks at Big Data as spanning the policy areas of data protection, competition, and consumer protection, particularly in the context of 'free' services. The Supervisor argues that these three areas need to come together to develop ways in which the challenges of Big Data can be addressed. For example, remedy could take the form of data portability - ensuring users the ability to move their data to other service providers, empowering individuals and promoting competitive market structures - or of adopting a 'compare and forget' approach to the retention of customer data. The Supervisor also stresses the need to promote and treat privacy as a competitive advantage, thus placing importance on consumer choice, consent, and transparency.[36] The European Data Protection reform has been under discussion and is predicted to be enacted by the end of 2015. The reform will apply across European states and to all companies operating in Europe; it proposes heavier penalties for data breaches and seeks to provide users with more control of their data.[37] Additionally, Europe is considering bringing digital platforms under the Network and Information Security Directive, thus treating companies like Google and Facebook, as well as cloud providers and service providers, as a critical sector. Such a move would require companies to adopt stronger security practices and report breaches to authorities.[38]</p>
<h3>Conclusion</h3>
<p style="text-align: justify; ">A review of the different opinions and reactions from experts and policy makers demonstrates the ways in which Big Data and IoT are changing the traditional forms of protection that governments and societies have developed for personal data as it increases in value and importance. Some policy makers believe that Big Data needs strong legislative regulation, while others believe that softer forms of regulation, such as self- or co-regulation, are more appropriate. What is clear is that Big Data has created a regulatory dilemma: some policy makers are searching for ways to control its unpredictable nature through policy and technology - by merging policy areas, or by honing or broadening existing policy mechanisms - while others are ignoring the change that Big Data brings with it and are forging ahead with its use.</p>
<p style="text-align: justify; ">Answering the 'how do we regulate Big Data' question requires a <strong>re-conceptualization of data ownership and realities</strong>. Governments need to first recognize the criticality of their data and the data of their citizens and residents, as well as the contribution this data makes to a country's economy and security. With the technologies available now, and in the pipeline, data can be used or misused in ways that will have vast repercussions for individuals, society, and the nation. All data, but especially data directly or indirectly related to the citizens and residents of a country, needs to be looked upon as owned by the citizens and the nation. In this way, data should be seen as part of a nation's <strong>critical national infrastructure</strong>, and accorded the security, protections, and legal backing thereof, to <strong>prevent the misuse of the resource by the private or public sectors, or by local or foreign governments</strong>. This could allow for local data warehousing and bring the physical and access security of data warehouses on par with other critical national infrastructure. Recognizing data as a critical resource answers in part a concern that experts have raised - that the unpredictable nature of Big Data practices makes it impossible for data to be categorized as personal and thus afforded specified forms of protection. Instead, all data is now recognized as critical.</p>
<p style="text-align: justify; ">In addition to being able to generate personal data from anonymized or non-identifiable data, Big Data also challenges traditional divisions of public vs. private data. Indeed, Big Data analytics can take many public data points and derive a private conclusion. The use of Big Data analytics on public data also raises questions of consent. For example, though a license plate is public information, should a company be allowed to harvest license plate numbers, combine them with location, and sell this information to different interested actors? This is currently happening in the United States.[39] Lastly, Big Data raises questions of ownership. A solution to the uncertainty of public vs. private data, and the associated questions of consent and ownership, could be the creation of a <strong>National Data Archive</strong> for such data. The archive could function with representation from the government, public and private companies, and civil society on its board. In such a framework, for example, companies like Airtel would provide mobile services, but the CDRs and customer data collected by the company would belong to the National Data Archive and be available, within a certain scope, to Airtel and all other companies for use. This 'open data' approach could enable innovation through the use of data, but within the ambit of national security and the concerns of citizens - a framework that could instill trust in consumers and citizens. Only when backed with strong security requirements, enforcement mechanisms, and a proactive, responsive and responsible framework can governments begin to think about ways in which Big Data can be harnessed.</p>
<hr />
<p style="text-align: justify; ">[1] BCS - The Chartered Institute for IT. (2013). The Societal Impact of the Internet of Things. Retrieved May 17, 2015, from http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf</p>
<p style="text-align: justify; ">[2] Sicular, S. (2013, March 27). Gartner’s Big Data Definition Consists of Three Parts, Not to Be Confused with Three “V”s. Retrieved May 20, 2015, from http://www.forbes.com/sites/gartnergroup/2013/03/27/gartners-big-data-definition-consists-of-three-parts-not-to-be-confused-with-three-vs/</p>
<p style="text-align: justify; ">[3] Executive Office of the President. “Big Data: Seizing Opportunities, Preserving Values”. May 2014. Available at: <a href="https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf">https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[4] Bennett Moses, L., & Chan, J. (2014). Using Big Data for Legal and Law Enforcement Decisions: Testing the New Tools (SSRN Scholarly Paper No. ID 2513564). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2513564</p>
<p style="text-align: justify; ">[5] Danah Boyd, Kate Crawford. <a href="http://www.tandfonline.com/doi/abs/10.1080/1369118X.2012.678878">Critical Questions for Big Data</a>. <a href="http://www.tandfonline.com/toc/rics20/15/5">Information, Communication & Society</a>, Vol. 15, Iss. 5, 2012. Available at: <a href="http://www.tandfonline.com/doi/full/10.1080/1369118X.2012.678878">http://www.tandfonline.com/doi/full/10.1080/1369118X.2012.678878</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[6] The Chartered Institute for IT, Oxford Internet Institute, University of Oxford. “The Societal Impact of the Internet of Things” February 2013. Available at: <a href="http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf">http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[7] Article 29 Data Protection Working Party. (2014). <i>Opinion 8/2014 on the Recent Developments on the Internet of Things.</i> European Commission. Retrieved May 20, 2015, from http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</p>
<p style="text-align: justify; ">[8] Crawford, K., & Schultz, J. (2013). Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms (SSRN Scholarly Paper No. ID 2325784). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2325784</p>
<p style="text-align: justify; ">[9] Barocas, S., & Selbst, A. D. (2015). Big Data’s Disparate Impact (SSRN Scholarly Paper No. ID 2477899). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2477899</p>
<p style="text-align: justify; ">[10] Barocas, S., & Selbst, A. D. (2015). Big Data’s Disparate Impact (SSRN Scholarly Paper No. ID 2477899). Rochester, NY: Social Science Research Network. Retrieved from http://papers.ssrn.com/abstract=2477899</p>
<p style="text-align: justify; ">[11] Article 29 Data Protection Working Party. “Opinion 8/2014 on the Recent Developments on the Internet of Things”. September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[12] Tene, O., & Polonetsky, J. (2013). Big Data for All: Privacy and User Control in the Age of Analytics. Northwestern Journal of Technology and Intellectual Property, 11(5), 239.</p>
<p style="text-align: justify; ">[13] Omer Tene and Jules Polonetsky, <i>Big Data for All: Privacy and User Control in the Age of Analytics</i>, 11 Nw. J. Tech. & Intell. Prop. 239 (2013).</p>
<p style="text-align: justify; ">[14] Article 29 Data Protection Working Party. “Opinion 8/2014 on the Recent Developments on the Internet of Things”. September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[15] Information Commissioner's Office. (2014). Big Data and Data Protection. Information Commissioner's Office. Retrieved May 20, 2015, from https://ico.org.uk/media/for-organisations/documents/1541/big-data-and-data-protection.pdf</p>
<p style="text-align: justify; ">[16] Article 29 Data Protection Working Party. “Opinion 8/2014 on the Recent Developments on the Internet of Things”. September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[17] The Chartered Institute for IT and Oxford Internet Institute, University of Oxford. “The Societal Impact of the Internet of Things”. February 14<sup>th</sup> 2013. Available at: <a href="http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf">http://www.bcs.org/upload/pdf/societal-impact-report-feb13.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[18] Kate Crawford and Jason Schultz, “Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms”. Boston College Law Review, Volume 55, Issue 1, Article 4. January 1st 2014. Available at: <a href="http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&context=bclr">http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&context=bclr</a>. Accessed: July 2nd 2015.</p>
<p style="text-align: justify; ">[19] Article 29 Data Protection Working Party. “Opinion 8/2014 on the Recent Developments on the Internet of Things”. September 16th 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2nd 2015.</p>
<p style="text-align: justify; ">[20] Federal Trade Commission. (2015). <i>Internet of Things: Privacy & Security in a Connected World.</i> Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf</p>
<p style="text-align: justify; ">[21] Federal Trade Commission. (2015). <i>Internet of Things: Privacy & Security in a Connected World.</i> Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf</p>
<p style="text-align: justify; ">[22] Federal Trade Commission. (2015). <i>Internet of Things: Privacy & Security in a Connected World.</i> Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf</p>
<p style="text-align: justify; ">[23] Federal Trade Commission. (2015). <i>Internet of Things: Privacy & Security in a Connected World.</i> Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf</p>
<p style="text-align: justify; ">[24] Federal Trade Commission. (2015). <i>Internet of Things: Privacy & Security in a Connected World.</i> Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf</p>
<p style="text-align: justify; ">[25] Article 29 Data Protection Working Party “Opinion 8/2014 on the Recent Developments on the Internet of Things” September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[26] Article 29 Data Protection Working Party “Opinion 8/2014 on the Recent Developments on the Internet of Things” September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[27] Article 29 Data Protection Working Party “Opinion 8/2014 on the Recent Developments on the Internet of Things” September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[28] Article 29 Data Protection Working Party “Opinion 8/2014 on the Recent Developments on the Internet of Things” September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[29] Kate Crawford and Jason Shultz, “Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms”. Boston College Law Review, Volume 55, Issue 1, Article 4. January 1st 2014. Available at: <a href="http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&context=bclr">http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&context=bclr</a>. Accessed: July 2nd 2015.</p>
<p style="text-align: justify; ">[30] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[31] Federal Trade Commission. (2015). <i>Internet of Things: Privacy & Security in a Connected World.</i> Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf</p>
<p style="text-align: justify; ">[32] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[33] Federal Trade Commission. (2015). <i>Internet of Things: Privacy & Security in a Connected World.</i> Federal Trade Commission. Retrieved May 20, 2015, from https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf</p>
<p style="text-align: justify; ">[34] Article 29 Data Protection Working Party “Opinion 8/2014 on the on Recent Developments on the Internet of Things” September 16<sup>th</sup> 2014. Available at: <a href="http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf">http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp223_en.pdf</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[35] Kate Crawford and Jason Shultz, “Big Data and Due Process: Towards a Framework to Redress Predictive Privacy Harms”. Boston College Law Review, Volume 55, Issue 1, Article 4. January 1<sup>st</sup> 2014. Available at: <a href="http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&context=bclr">http://lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=3351&context=bclr</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p style="text-align: justify; ">[36] European Data Protection Supervisor. Preliminary Opinion of the European Data Protection Supervisor, Privacy and competitiveness in the age of big data: the interplay between data protection, competition law and consumer protection in the Digital Economy. March 2014. Available at: https://secure.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Consultation/Opinions/2014/14-03-26_competitition_law_big_data_EN.pdf</p>
<p style="text-align: justify; ">[37] SC Magazine. Harmonised EU data protection and fines by the end of the year. June 25<sup>th</sup> 2015. Available at: <a href="http://www.scmagazineuk.com/harmonised-eu-data-protection-and-fines-by-the-end-of-the-year/article/422740/">http://www.scmagazineuk.com/harmonised-eu-data-protection-and-fines-by-the-end-of-the-year/article/422740/</a>. Accessed: August 8<sup>th</sup> 2015.</p>
<p style="text-align: justify; ">[38] Tom Jowitt, “Digital Platforms to be Included in EU Cybersecurity Law”. TechWeek Europe. August 7<sup>th</sup> 2015. Available at: http://www.techweekeurope.co.uk/e-regulation/digital-platforms-eu-cybersecuity-law-174415</p>
<p style="text-align: justify; ">[39] Adam Tanner. Data Brokers are now Selling Your Car's Location for $10 Online. July 10<sup>th</sup> 2013. Available at: http://www.forbes.com/sites/adamtanner/2013/07/10/data-broker-offers-new-service-showing-where-they-have-spotted-your-car/</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things'>http://editors.cis-india.org/internet-governance/blog/review-of-policy-debate-around-big-data-and-internet-of-things</a>
</p>
Big Data and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules 2011
http://editors.cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011
<b>Experts and regulators across jurisdictions are examining the impact of Big Data practices on traditional data protection standards and principles. This will be a useful and pertinent exercise for India to undertake as the government and the private and public sectors begin to incorporate and rely on the use of Big Data in decision-making processes and organizational operations. This blog provides an initial evaluation of how Big Data could impact India's current data protection standards.</b>
<p>Experts and regulators across the globe are examining the impact of Big Data practices on traditional data protection standards and principles. This will be a useful and pertinent exercise for India to undertake as the government and the private and public sectors begin to incorporate and rely on the use of Big Data in decision making processes and organizational operations.</p>
<p>Below is an initial evaluation of how Big Data could impact India's current data protection standards.</p>
<p style="text-align: justify; ">India currently does not have comprehensive privacy legislation - but the Reasonable Security Practices and Procedures and Sensitive Personal Data or Information Rules 2011 formed under section 43A of the Information Technology Act 2000<a href="#_ftn1" name="_ftnref1">[1]</a> define a data protection framework for the processing of digital data by Body Corporate. Big Data practices will impact a number of the provisions found in the Rules:</p>
<p style="text-align: justify; "><b>Scope of Rules: </b>Currently the Rules apply to Body Corporate and digital data. As per the IT Act, Body Corporate is defined as <i>"Any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities."</i></p>
<p style="text-align: justify; ">The present scope of the Rules excludes from its purview a number of actors that do or could have access to Big Data or use Big Data practices. The Rules would not apply to government bodies or individuals collecting and using Big Data. Yet, with technologies such as IoT and the rise of Smart Cities across India – a range of government, public, and private organizations and actors could have access to Big Data.</p>
<p style="text-align: justify; "><b>Definition of personal and sensitive personal data: </b>Rule 2(i) defines personal information as <i>"information that relates to a natural person which either directly or indirectly, in combination with other information available or likely to be available with a body corporate, is capable of identifying such person."</i></p>
<p>Rule 3 defines sensitive personal information as:</p>
<ul>
<li>Password,</li>
<li>Financial information,</li>
<li>Physical/physiological/mental health condition,</li>
<li>Sexual orientation,</li>
<li>Medical records and history,</li>
<li>Biometric information</li>
</ul>
<p style="text-align: justify; ">The present definition of personal data hinges on the factor of identification (data that is capable of identifying a person). Yet this definition does not encompass information that is associated to an already identified individual - such as habits, location, or activity.</p>
<p style="text-align: justify; ">The definition of personal data also addresses only the identification of 'such person' and does not address data that is related to a particular person but that also reveals identifying information about another person - either directly - or when combined with other data points.</p>
<p style="text-align: justify; ">By listing specific categories of sensitive personal information, the Rules do not account for additional types of sensitive personal information that might be generated or correlated through the use of Big Data analytics.</p>
<p style="text-align: justify; ">Importantly, the definitions of sensitive personal information or personal information do not address how personal or sensitive personal information - when anonymized or aggregated – should be treated.</p>
<p style="text-align: justify; "><b>Consent</b>: Rule 5(1) requires that Body Corporate must, prior to collection, obtain consent in writing through letter or fax or email from the provider of sensitive personal data regarding the use of that data.</p>
<p style="text-align: justify; ">In a context where services are delivered with little or no human interaction, data is collected through sensors, data is collected on a real time and regular basis, and data is used and re-used for multiple and differing purposes - it is not practical, and often not possible, for consent to be obtained through writing, letter, fax, or email for each instance of data collection and for each use.</p>
<p style="text-align: justify; "><b>Notice of Collection: </b>Rule 5(3) requires Body Corporate to provide the individual with a notice during collection of information that details the fact that information is being collected, the purpose for which the information is being collected, the intended recipients of the information, the name and address of the agency that is collecting the information and the agency that will retain the information. Furthermore body corporate should not retain information for longer than is required to meet lawful purposes.</p>
<p style="text-align: justify; ">Though this provision acts as an important element of transparency, in the context of Big Data, communicating the purpose for which data is collected, the intended recipients of the information, the name and address of the agency that is collecting the information and the agency that will retain the information could prove to be difficult to communicate as they are likely to encompass numerous agencies and change depending upon the analysis being done.</p>
<p style="text-align: justify; "><b>Access and correction</b>: Rule 5(6) provides individuals with the ability to access sensitive personal information held by the body corporate and correct any inaccurate information.</p>
<p style="text-align: justify; ">This provision would be difficult to implement effectively in the context of Big Data as vast amounts of data are being generated and collected on an ongoing and real time basis and often without the knowledge of the individual.</p>
<p><b>Purpose Limitation:</b> Rule 5(5) requires that Body Corporate should use information only for the purpose for which it has been collected.</p>
<p>In the context of Big Data, this provision would conflict with the re-use of data that is inherent in such practices.</p>
<p style="text-align: justify; "><b>Security:</b> Rule 8 states that any Body Corporate or person on its behalf will be understood to have complied with reasonable security practices and procedures if they have implemented such practices and have in place codes that address managerial, technical, operational and physical security control measures. These codes could follow the IS/ISO/IEC 27001 standard or another government approved and audited standard.</p>
<p style="text-align: justify; ">This provision importantly requires that data controllers collecting and processing data have in place strong security practices. In the context of Big Data – the security of devices that might be generating or collecting data and algorithms processing and analysing data is critical. Once generated, it might be challenging to ensure the data is being transferred to or being analysed by organisations that comply with such security practices as listed.</p>
<p style="text-align: justify; "><b>Data Breach</b> : Rule 8 requires that if a data breach occurs, Body Corporate would have to be able to demonstrate that they have implemented their documented information security codes.</p>
<p style="text-align: justify; ">Though this provision holds a company accountable for the implementation of security practices, it does not address how a company should be held accountable for a large scale data breach as in the context of Big Data the scope and impact of a data breach is on a much larger scale.</p>
<p style="text-align: justify; "><b>Opt in and out and ability to withdraw consent</b> : Rule 5(7) requires Body Corporate or any person on its behalf, prior to the collection of information - including sensitive personal information - must give the individual the option of not providing information and must give the individual the option of withdrawing consent. Such withdrawal must be sent in writing to the body corporate.</p>
<p style="text-align: justify; ">The feasibility of such a provision in the context of Big Data is unclear, especially in light of the fact that Big Data practices draw upon large amounts of data, generated often in real time, and from a variety of sources.</p>
<p style="text-align: justify; "><b>Disclosure of Information</b>: Rule 6 maintains that disclosure of sensitive personal data can only take place with permission from the provider of such information or as agreed to through a lawful contract.</p>
<p style="text-align: justify; ">This provision addresses disclosure and does not take into account the “sharing” of information that is enabled through networked devices, as well as the increasing practice of companies to share anonymized or aggregated data.</p>
<p style="text-align: justify; "><b>Privacy Policy</b> : Rule 4 requires that body corporate have in place a privacy policy on their website that provides clear and accessible statements of its practices and policies, type of personal or sensitive personal information that is being collected, purpose of the collection, usage of the information, disclosure of the information, and the reasonable security practices and procedures that have been put in place to secure the information.</p>
<p style="text-align: justify; ">In the context of Big Data where data from a variety of sources is being collected, used, and re-used it is important for policies to 'follow data' and appear in a contextualized manner. The current requirement of having Body Corporate post a single overarching privacy policy on its website could prove to be inadequate.</p>
<p style="text-align: justify; "><b>Remedy</b> : Section 43A of the Act holds that if a body corporate is negligent in implementing and maintain reasonable security practices and procedures which results in wrongful loss or wrongful gain to any person, the body corporate can be held liable to pay compensation to the affected person.</p>
<p style="text-align: justify; ">This provision will provide limited remedy for an affected individual in the context of Big Data. Though important to help prevent data breaches resulting from negligent data practices, implementation of reasonable security practices and procedures cannot be the only hinging point for determining liability of a Body Corporate for violations and many of the harms possible through Big Data are not in the form of wrongful loss or wrongful gain to another person. Indeed many harms possible through Big Data are non-economic in nature – including physical invasion of privacy, and discriminatory practices that can arise from decisions based on Big Data analytics. Nor does the provision address the potential for future damage that can result from a 'Big Data data breach'.</p>
<p style="text-align: justify; ">The safeguards noted in the above section are not the only legal provisions that speak to privacy in India. There are over fifty sectoral legislation that have provisions addressing privacy - for example provisions addressing confidentiality of health and banking information. The government of India is also in the process of drafting a privacy legislation. In 2012 the Report of the Group of Experts on Privacy provided recommendations for a privacy framework in India. The Report envisioned a framework of co-regulation - with sector level self regulatory organization developing privacy codes (that are not lower than the defined national privacy principles) and that are enforced by a privacy commissioner.<a href="#_ftn2" name="_ftnref2">[2]</a> Perhaps this method would be optimal for the regulation of Big Data- allowing for the needed flexibility and specificity in standards and device development. Though the Report notes that individuals can seek remedy from the court and the Privacy Commissioner can issue fines for a violation, the development of privacy legislation in India has yet to clearly integrate the importance of due process and remedy. With the onset of Big Data - this will become more important than ever.</p>
<h3>Conclusion</h3>
<p style="text-align: justify; ">The use and generation of Big Data in India is growing. Plans such as free wifi zones in cities<a href="#_ftn3" name="_ftnref3">[3]</a>, city wide CCTV networks with facial recognition capabilities<a href="#_ftn4" name="_ftnref4">[4]</a>, and the implementation of an identity/authentication platform for public and private services<a href="#_ftn5" name="_ftnref5">[5]</a>, are indicators towards a move of data generation that is networked and centralized, and where the line between public and private is blurred through the vast amount of data that is collected.</p>
<p style="text-align: justify; ">In such developments and innovations what is privacy and what role does privacy play? Is it the archaic inhibitor - limiting the sharing and use of data for new and innovative purposes? Will it be defined purely by legislative norms or through device/platform design as well? Is it a notion that makes consumers think twice about using a product or service or is it a practice that enables consumer and citizen uptake and trust and allows for the growth and adoption of these services?</p>
<p style="text-align: justify; ">How privacy will be regulated and how it will be perceived is still evolving across jurisdictions, technologies, and cultures - but it is clear that privacy is not being and cannot be overlooked. Governments across the world are reforming and considering current and future privacy regulation targeted towards life in a quantified society. As the Indian government begins to roll out initiatives that create a "Digital India" indeed a "quantified India", taking privacy into consideration could facilitate the uptake, expansion, and success of these practices and services. As the Indian government pursues the opportunities possible through Big Data it will be useful to review existing privacy protections and deliberate on if, and in what form, future protections for privacy and other rights will be needed.</p>
<hr />
<p><a href="#_ftnref1" name="_ftn1">[1]</a>Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information Rules 2011). Available at: http://deity.gov.in/sites/upload_files/dit/files/GSR313E_10511(1).pdf</p>
<p><a href="#_ftnref2" name="_ftn2">[2]</a>Group of Experts on Privacy. (2012). <i>Report of the Group of Experts on Privacy.</i> New Delhi: Planning Commission, Government of India. Retrieved May 20, 2015, from http://planningcommission.nic.in/reports/genrep/rep_privacy.pdf</p>
<p><a href="#_ftnref3" name="_ftn3">[3]</a> NDTV. “Free Public Wi-Fi Facility in Delhi to Have Daily Data Limit. NDTV, May 25<sup>th</sup> 2015, Available at: <a href="http://gadgets.ndtv.com/internet/news/free-public-wi-fi-facility-in-delhi-to-have-daily-data-limit-695857">http://gadgets.ndtv.com/internet/news/free-public-wi-fi-facility-in-delhi-to-have-daily-data-limit-695857</a>. Accessed: July 2<sup>nd</sup> 2015.</p>
<p><a href="#_ftnref4" name="_ftn4">[4]</a>FindBiometrics Global Identity Management. “Surat Police Get NEC Facial Recognition CCTV System”. July 21<sup>st</sup> 2015. Available at: http://findbiometrics.com/surat-police-nec-facial-recognition-27214/</p>
<p style="text-align: justify; "><a href="#_ftnref5" name="_ftn5">[5]</a>UIDAI Official Website. Available at: https://uidai.gov.in/</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011'>http://editors.cis-india.org/internet-governance/blog/big-data-and-information-technology-rules-2011</a>
</p>
Studying the Emerging Database State in India: Notes for Critical Data Studies (Accepted Abstract)
http://editors.cis-india.org/raw/studying-the-emerging-database-state-in-india-accepted-abstract
<b>"Critical Data Studies (CDS) is a growing field of research that focuses on the unique theoretical, ethical, and epistemological challenges posed by 'Big Data.' Rather than treat Big Data as a scientifically empirical, and therefore largely neutral phenomena, CDS advocates the view that data should be seen as always-already constituted within wider data assemblages." The Big Data and Society journal has provisionally accepted a paper abstract of mine for its upcoming special issue on Critical Data Studies.</b>
<p> </p>
<h2>Introduction</h2>
<p>Through the last decade, the Government of India has given shape to a digital identification infrastructure, developed and operated by the Unique Identification Authority of India (UIDAI). The infrastructure combines the task of assigning unique identification numbers, called Aadhaar numbers, to individuals submitting their biometric and demographic details, and the task of authenticating their identity when provided with an Aadhaar number and associated data (biometric data, One Time Pin sent to the pre-declared mobile number, etc.). The aim of UIDAI is to provide universal authentication-as-a-service for all residents of India who approach any public or private agencies for any kind of service or transaction. Simultaneously, the Aadhaar numbers will function as unique identifiers for joining up databases of different government agencies, and hence allow the Indian government to undertake big data analytics at a governmental scale, and not only at a departmental one.</p>
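<p>The authentication-as-a-service pattern described above can be sketched in a few lines. The code below is purely illustrative: the field names and the in-memory registry are invented for this sketch, and the actual UIDAI Auth API is an XML-based service with encryption, digital signatures, and licensed intermediaries. What the sketch does capture is the core contract: a relying agency submits an Aadhaar number plus an authentication factor, and the authority returns only a yes/no answer rather than the stored data itself.</p>

```python
# Hypothetical stand-in for the authority's store: Aadhaar number mapped to
# the currently valid One Time Pin (in reality, biometric templates and
# demographic data are held, never returned to the relying agency).
REGISTRY = {
    "999912345678": "482913",
}

def authenticate(aadhaar_number: str, otp: str) -> bool:
    """Return a bare yes/no; no demographic or biometric data leaves the authority."""
    return REGISTRY.get(aadhaar_number) == otp

# A service provider verifies identity before delivering a service.
print(authenticate("999912345678", "482913"))  # identity confirmed
print(authenticate("999912345678", "000000"))  # authentication fails
```

<p>It is this thin yes/no interface, offered uniformly to public and private agencies, that lets the identification layer act as a platform beneath many otherwise unrelated services.</p>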
<p>In this paper, I am primarily motivated by the challenge of finding points and objects to enter into a critical study of such an in-progress data infrastructure. As I proceed with an understanding that data is produced within its specific social and material context, the question then is to read through the data to reflect on its possible social and material context. This is complicated when approaching a big data infrastructure that is meant to produce data for explicitly intra-governmental consumption and circulation. The problem then is not one of reading through available big data, but one of reading through the assemblage and imaginaries of big data to reflect on the kind of data it will give rise to, and thus on the politics of the data assemblage and the database state it enables.</p>
<p> </p>
<h2>Logic of the Database State</h2>
<p>The application of data to inform governmental acts has taken place at least since government has been understood as responsible for the welfare of the population and the territory. The measurement of the population and the territory – the number of people, their demographic features, amounts and locations of natural resources, and so on – has always been integral to the functioning of the modern nation-state. Database state is used in this paper to identify a particular mode of mobilisation of data within governmental acts, which is fundamentally shaped by the possibilities of big data extraction, appropriation, and analytics pioneered by a range of companies since the late 1990s. The reason for not using big data state but database state is that big data refers to a body of technologies emerging in response to a set of data management and analysis challenges situated in a certain moment of development of information technologies, whereas database refers to a symbolic form (Manovich 1999): a form in which not only the population is made visible to the government (as a collection of visual, textual, numeric, and other forms of records), but also how the acts of government are made visible to the population (as a collection of performance indicators, budget allocation and utilisation tables, and other data visualised through dashboards, analog and digital).</p>
<p>The data production and management logic of this database state is specifically inspired by the notion of platform introduced by the so-called Web 2.0 companies: providing a common service layer upon which various other applications may also run, but under specific arrangements (including distribution of generated user data) with the original common layer provider. Data assemblages of the database state are expected to enable the government to function as a platform, as an intensely data-driven layer that widely gathers data about individuals in the population and feeds it back selectively to various providers of public and private services. This transforms the data assemblage from one vertical of governmental activities to a horizontal critical infrastructure for the modularisation of governmental activities.</p>
<p> </p>
<h2>Studying the Emerging Database State in India</h2>
<p>The Government of India is presently debating the legal and technical validity of the digital identity infrastructure programme in the Supreme Court, while simultaneously carrying out the enrollment drive for the same, linking up assignment of unique identity numbers with a national drive for population registration, and rolling out citizen-facing services and applications that implement the Aadhaar number as a necessary key to access them. With the enrollment process going on and the integration with various governmental processes (termed seeding by Aadhaar policy literature) just beginning, I enter this study through two key sets of objects reflecting the imaginaries and the technical specifications of the emerging database state in India. The first entry point is through the various official documents of vision, intentions, plans, and reconsiderations, and the second entry point is through the Application Programming Interface (API) documentation published by UIDAI to specify how its identity authentication platform will collaborate with various public and private services.</p>
<p>The first section of the paper provides a brief survey of pre-UIDAI attempts by the Government of India to deploy unique identification numbers and Smart Cards for specific population groups, so as to understand the initial conceptualisation of this data assemblage of a digital identification platform. The second section foregrounds how this platform undertakes a transformation of the components and relations of the pre-existing data assemblage of the Government of India, as articulated in various official documents of promised utility and proposed collaborations. The third section studies the API documentation to track how such imaginaries are materially interpreted and operationalised through the design of protocols of data interactions with various public and private agencies offering services utilising the identity authentication platform.</p>
<p> </p>
<h2>Notes for Critical Data Studies</h2>
<p>Expanding the early agenda note on Critical Data Studies by Craig Dalton and Jim Thatcher (2014), Rob Kitchin and Tracey P. Lauriault have taken steps towards emphasising the responsibility of this nebulous research strategy to chart and unpack the data assemblages (2014). This is exactly what I propose to do in this paper. While Kitchin and Lauriault provide a detailed list of the components of the apparatus of a data assemblage (2014: 7), I find the concepts of infrastructural components and infrastructural relations very useful in thinking through the emerging infrastructure of authentication. Thus, my approach to these tasks of charting and unpacking is focused on the infrastructural relations that the digital identity infrastructure re-configures, instead of the infrastructural components it mobilises (Bowker et al 2010). This tactical choice of focusing on the infrastructural relations is also necessitated by the practical difficulty in having comprehensive access to the individual components of the data assemblage concerned. Addressing questions of causality and quality becomes difficult when studying the assemblage sans the produced data, and rigorously analysing concerns of security and uncertainty pre-requires an actually existing data assemblage, with a public interface for investigating its leakages, breakages, and internal functioning. In the absence of such points of entry into the data assemblage, which I fear may not be an exceptional case, I attempt an inverted reading. Turning the data infrastructure inside out, in this paper I describe how the digital identity platform is critically reshaping the basis of governmental acts in India, through a specific model of production, extraction and application of big data.</p>
<p> </p>
<h2>Bibliography</h2>
<p>Bowker, Geoffrey C., Karen Baker, Florence Millerand, & David Ribes. 2010. Toward Information Infrastructure Studies: Ways of Knowing in a Networked Environment. Jeremy Hunsinger, Lisbeth Klastrup, & Matthew Allen (Eds.) International Handbook of Internet Research. Springer Dordrecht Heidelberg London New York. Pp. 97-117.</p>
<p>Dalton, Craig, & Jim Thatcher. 2014. What does a Critical Data Studies Look Like, and Why do We Care? Seven Points for a Critical Approach to ‘Big Data.’ Society and Space. May 19. Accessed on July 08, 2015, from <a href="http://societyandspace.com/material/commentaries/craig-dalton-and-jim-thatcher-what-does-a-critical-data-studies-look-like-and-why-do-we-care-seven-points-for-a-critical-approach-to-big-data/" target="_blank">http://societyandspace.com/material/commentaries/craig-dalton-and-jim-thatcher-what-does-a-critical-data-studies-look-like-and-why-do-we-care-seven-points-for-a-critical-approach-to-big-data/</a>.</p>
<p>Kitchin, Rob, & Tracey P. Lauriault. 2014. Towards Critical Data Studies: Charting and Unpacking Data Assemblages and their Work. The Programmable City Working Paper 2. July 29. National University of Ireland Maynooth, Ireland. Accessed on July 08, 2015 from <a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2474112" target="_blank">http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2474112</a>.</p>
<p>Manovich, Lev. 1999. Database as Symbolic Form. Convergence. Volume 5, Number 2. Pp. 80-99.</p>
<p> </p>
<p><em>Note: Call for Papers for the special issue can found here: <a href="http://bigdatasoc.blogspot.in/2015/06/call-for-proposals-special-theme-on.html" target="_blank">http://bigdatasoc.blogspot.in/2015/06/call-for-proposals-special-theme-on.html</a>.</em></p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/studying-the-emerging-database-state-in-india-accepted-abstract'>http://editors.cis-india.org/raw/studying-the-emerging-database-state-in-india-accepted-abstract</a>
</p>
NASA International Open Data Challenge 2015
http://editors.cis-india.org/openness/events/nasa-international-open-data-challenge-2015
<b>As part of the initial NASA Open Government 2.0 plan, the NASA International Open Data Challenge brings together the FOSS community, citizen scientists, open data practitioners, open hardware enthusiasts and students for collaborative problem solving, with the goal of producing relevant open-source solutions that address global needs applicable to both life on Earth and life in space.</b>
<p style="text-align: justify; ">On April 11 and 12, 2015, the event will be organized by the Centre for Internet and Society in collaboration with mentors from Team Indus, one of India's leading space technology start-ups. The event will start off with the following keynote and workshops at 9 a.m. on Saturday, April 11, 2015:</p>
<div style="text-align: justify; "><b>Pre-Hackathon Workshop: 9 a.m., Saturday, April 11, 2015</b></div>
<div style="text-align: justify; ">IBM Blue Mix Team + OpenCube Labs</div>
<div style="text-align: justify; ">(Big Data Analytics using Cloud Services like Bluemix/Heroku, with node.js implementation and Android APIs)</div>
<div style="text-align: justify; "></div>
<div style="text-align: justify; ">
<div><b>Keynote: Amar Sharma, 12 p.m., April 11, 2015</b></div>
<div>Amar is credited as the youngest and the first Indian amateur astronomer to have an asteroid named after him <b>(380607 Sharma)</b>, in 2014 at the age of 29. He will talk about CCD astro-imaging and about his journey and travails as a self-made astronomer and comet hunter.</div>
<div></div>
<div>We will then break off into teams to commence the hackathon, which will end at 6 p.m. on Sunday, April 12, 2015, after which teams will upload and present their solutions for local and global nominations.</div>
<div></div>
<div>Registration is free, and you are required to confirm participation at the link below:</div>
<div><a href="https://2015.spaceappschallenge.org/location/bangalore/">https://2015.spaceappschallenge.org/location/bangalore/</a></div>
</div>
<div style="text-align: justify; "></div>
<div style="text-align: justify; ">Participants are requested to bring their own laptop/computing devices.</div>
<hr />
<p> </p>
<div style="text-align: justify; ">Please see last year's event, which focused on Open Science and Big Data, and the various open data solutions developed at CIS, to get an idea of what the event is about:</div>
<div style="text-align: justify; "><a href="https://2014.spaceappschallenge.org/location/bangalore/">https://2014.spaceappschallenge.org/location/bangalore/</a> This year, IBM BlueMix Labs will conduct a workshop on Big Data Analytics, followed by sessions on Heroku implementation and Android programming by friends of CIS from OpenCubeLabs. These will follow a very special keynote by the first Indian amateur astronomer to have an asteroid named after him, joining the likes of Ramanujan and Vikram Sarabhai.</div>
<p>
For more details visit <a href='http://editors.cis-india.org/openness/events/nasa-international-open-data-challenge-2015'>http://editors.cis-india.org/openness/events/nasa-international-open-data-challenge-2015</a>
</p>
Big Data and Positive Social Change in the Developing World: A White Paper for Practitioners and Researchers
http://editors.cis-india.org/internet-governance/blog/big-data-and-positive-social-change-in-developing-world
<b>I was a part of a working group writing a white paper on big data and social change, over the last six months. This white paper was produced by a group of activists, researchers and data experts who met at the Rockefeller Foundation’s Bellagio Centre to discuss the question of whether, and how, big data is becoming a resource for positive social change in low- and middle-income countries (LMICs).</b>
<hr />
<p style="text-align: justify; ">Bellagio Big Data Workshop Participants. (2014). “Big data and positive social change in the developing world: A white paper for practitioners and researchers.” Oxford: Oxford Internet Institute. Available online: <a class="external-link" href="http://ssrn.com/abstract=2491555">http://ssrn.com/abstract=2491555</a>.</p>
<h2>Summary</h2>
<p style="text-align: justify; ">Our working definition of big data includes, but is not limited to, sources such as social media, mobile phone use, digitally mediated transactions, the online news media, and administrative records. It can be categorised as data that is provided explicitly (e.g. social media feedback); data that is observed (e.g. mobile phone call records); and data that is inferred and derived by algorithms (e.g. social network structure or inflation rates). We defined four main areas where big data has potential for those interested in promoting positive social change: advocating and facilitating; describing and predicting; facilitating information exchange; and promoting accountability and transparency.</p>
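The three-way provenance distinction above (provided, observed, inferred) can be sketched as a simple data model. This is an illustrative sketch only; the class names and example values are my own, not part of the white paper.

```python
from dataclasses import dataclass
from enum import Enum

class Provenance(Enum):
    """How a big-data source comes into being, per the three categories above."""
    PROVIDED = "explicitly volunteered by the data subject"
    OBSERVED = "recorded as a by-product of activity"
    INFERRED = "derived from other data by algorithms"

@dataclass
class DataSource:
    name: str
    provenance: Provenance

# The white paper's own examples, tagged with their provenance category.
sources = [
    DataSource("social media feedback", Provenance.PROVIDED),
    DataSource("mobile phone call records", Provenance.OBSERVED),
    DataSource("social network structure", Provenance.INFERRED),
]

# Filtering by provenance, e.g. to apply stricter ethics rules to observed data.
observed = [s.name for s in sources if s.provenance is Provenance.OBSERVED]
```

Separating sources this way matters because, as the paper notes later, observed data raises ethical questions that volunteered data does not.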
<p style="text-align: justify; ">In terms of <em>advocating and facilitating</em>, we discussed ways in which volunteered data may help organisations to open up new public spaces for discussion and awareness-building; how both aggregating data and working across different databases can be tools for building awareness; and how the digital data commons can also configure new communities and actions (sometimes serendipitously) through data science and aggregation. Finally, we also looked at the problem of overexposure and how activists and organisations can protect themselves and hide their digital footprints. The challenges we identified in this area were how to interpret data correctly when supplementary information may be lacking; organisational capacity constraints around processing and storing data; and issues around data dissemination, i.e. the possible negative consequences of inadvertently identifying groups or individuals.</p>
<p style="text-align: justify; ">Next, we looked at the way big data can help describe and predict, functions which are particularly important in the academic, development and humanitarian areas of work where researchers can combine data into new dynamic, high-resolution datasets to detect new correlations and surface new questions. With data such as mobile phone data and Twitter analytics, understanding the data’s comprehensiveness, meaning and bias are the main challenges, accompanied by the problem of developing new and more comprehensive ethical systems to protect data subjects where data is observed rather than volunteered.</p>
<p style="text-align: justify; ">The next group of activities discussed was facilitating information exchange. We looked at mobile-based information services, where it is possible for a platform created around a particular aim (e.g. agricultural knowledge-building) to incorporate multiple feedback loops which feed into both research and action. The pitfalls include the technical challenge of developing a platform which is lean yet multifaceted in terms of its uses, and particularly making it reliably available to low-income users. This kind of platform, addressed by big data analytics, also offers new insights through data discovery and allows the provider to steer service provision according to users’ revealed needs and priorities.</p>
<p style="text-align: justify; ">Our last category for big data use was accountability and transparency, where organisations are using crowdsourcing methods to aggregate and analyse information in real time to establish new spaces for critical discussion, awareness and action. Flows of digital information can be managed to prioritise participation and feedback, provide a safe space to engage with policy decisions and expose abuse. The main challenges are how to keep sensitive information (and informants) safe while also exposing data and making authorities accountable; how to make the work sustainable without selling data, and how to establish feedback loops so that users remain involved in the work beyond an initial posting. In the crowdsourcing context, new challenges are also arising in terms of how to verify and moderate real-time flows of information, and how to make this process itself transparent.</p>
<p style="text-align: justify; ">Finally, we also discussed the relationship between big and open data. Open data can be seen as a system of governance and a knowledge commons, whereas big data does not by its nature involve the idea of the commons, so we leaned toward the term ‘opening data’, i.e. processes which could apply to commercially generated as much as public-sector datasets. It is also important to understand where to prioritise opening, and where this may exclude people who are not using the ‘right’ technologies: for example, analogue methods (e.g. nailing a local authority budget to a town hall door every month) may be more open than ‘open’ digital data that’s available online.</p>
<p style="text-align: justify; ">Our discussion surfaced many questions to do with representation and meaning: must datasets be interpreted by people with local knowledge? For researchers to get access to data that is fully representative, do we need a data commons? How are data proprietors engaging with the power dynamics and inequalities in the research field, and how can civil society engage with the private sector on its own terms if data access is skewed towards elites? We also looked at issues of privacy and risk: do we need a contextual risk perspective rather than a single set of standards? What is the role of local knowledge in protecting data subjects, and what kinds of institutions and practices are necessary? We concluded that there is a case to be made for building a data commons for private/public data, and for setting up new and more appropriate ethical guidelines to deal with big data, since aggregating, linking and merging data present new kinds of privacy risk. In particular, organisations advocating for opening datasets must admit the limitations of anonymisation, which is currently being ascribed more power to protect data subjects than it merits in the era of big data.</p>
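The limitation of anonymisation noted above can be made concrete with a toy linkage attack: records stripped of names are re-identified by joining their remaining quasi-identifiers against a public dataset. All records, names and field values below are invented for illustration.

```python
# An "anonymised" dataset: names removed, but quasi-identifiers
# (postal code, birth year, sex) left in place.
anonymised_health = [
    {"zip": "560001", "birth_year": 1984, "sex": "F", "diagnosis": "diabetes"},
    {"zip": "560034", "birth_year": 1990, "sex": "M", "diagnosis": "asthma"},
]

# A public dataset (e.g. a voter roll) sharing those quasi-identifiers.
public_voter_roll = [
    {"name": "A. Rao",   "zip": "560001", "birth_year": 1984, "sex": "F"},
    {"name": "B. Singh", "zip": "560034", "birth_year": 1990, "sex": "M"},
    {"name": "C. Iyer",  "zip": "560034", "birth_year": 1975, "sex": "M"},
]

def link(records, roll):
    """Join the two datasets on the shared quasi-identifiers."""
    key = lambda r: (r["zip"], r["birth_year"], r["sex"])
    names_by_key = {key(entry): entry["name"] for entry in roll}
    return [
        {"name": names_by_key[key(r)], "diagnosis": r["diagnosis"]}
        for r in records
        if key(r) in names_by_key
    ]

reidentified = link(anonymised_health, public_voter_roll)
# Each "anonymous" health record is now tied back to a named individual,
# without the health dataset ever having contained a name.
```

This is why merging and linking datasets create privacy risks that per-dataset anonymisation alone cannot address.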
<p style="text-align: justify; ">Our analysis makes a strong case that it is time for civil society groups in particular to become part of the conversation about the power of data. These groups are the connectors between individuals and governments, corporations and governance institutions, and have the potential to promote big data analysis that is locally driven and rooted. Civil society groups are also crucially important but currently underrepresented in debates about privacy and the rights of technology users, and civil society as a whole has a responsibility for building critical awareness of the ways big data is being used to sort, categorise and intervene in LMICs by corporations, governments and other actors. Big data is shaping up to be one of the key battlefields of our era, incorporating many of the issues civil society activists worldwide have been working on for decades. We hope that this paper can inform organisations and individuals as to where their particular interests may gain traction in the debate, and what their contribution may look like.</p>
<hr />
<p><b><a class="external-link" href="http://cis-india.org/internet-governance/blog/big-data-and-positive-social-change.pdf">Click to download the full white paper here</a></b>. (PDF, 1.95 Mb)</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/big-data-and-positive-social-change-in-developing-world'>http://editors.cis-india.org/internet-governance/blog/big-data-and-positive-social-change-in-developing-world</a>
</p>