
Privacy Round Table, Delhi

by Prasad Krishna last modified Aug 12, 2013 10:42 AM

Invite-Delhi.pdf — PDF document, 1157 kB (1185675 bytes)

More than a Hundred Global Groups Make a Principled Stand against Surveillance

by Elonnai Hickok last modified Jul 31, 2013 02:26 PM
For some time now there has been a need to update understandings of existing human rights law to reflect modern surveillance technologies and techniques.

Nothing could demonstrate the urgency of this situation more than the recent revelations confirming the mass surveillance of innocent individuals around the world.

To move toward that goal, today we’re pleased to announce the formal launch of the International Principles on the Application of Human Rights to Communications Surveillance. The principles articulate what international human rights law – which binds every country across the globe – requires of governments in the digital age. They speak to a growing global consensus that modern surveillance has gone too far and needs to be restrained. They also give benchmarks that people around the world can use to evaluate and push for changes in their own legal systems.

The product of over a year of consultation among civil society, privacy and technology experts, including the Centre for Internet and Society (read here, here, here and here), the principles have already been co-signed by over a hundred organisations from around the world. The process was led by Privacy International, Access, and the Electronic Frontier Foundation.

The release of the principles comes on the heels of a landmark report from the United Nations Special Rapporteur on the right to Freedom of Opinion and Expression, which details the widespread use of state surveillance of communications, stating that such surveillance severely undermines citizens’ ability to enjoy a private life, freely express themselves and enjoy their other fundamental human rights. And recently, the UN High Commissioner for Human Rights, Navi Pillay, emphasised the importance of applying human rights standards and democratic safeguards to surveillance and law enforcement activities.

"While concerns about national security and criminal activity may justify the exceptional and narrowly-tailored use of surveillance programmes, surveillance without adequate safeguards to protect the right to privacy actually risk impacting negatively on the enjoyment of human rights and fundamental freedoms," Pillay said.

The principles, summarised below, can be found in full at necessaryandproportionate.org. Over the next year and beyond, groups around the world will be using them to advocate for changes in how present laws are interpreted and how new laws are crafted.

We encourage privacy advocates, rights organisations, scholars from legal and academic communities, and other members of civil society to support the principles by adding their signature.

To sign, please send an email to [email protected], or visit https://www.necessaryandproportionate.org/about

Summary of the 13 principles

  • Legality: Any limitation on the right to privacy must be prescribed by law.
  • Legitimate Aim: Laws should only permit communications surveillance by specified State authorities to achieve a legitimate aim that corresponds to a predominantly important legal interest that is necessary in a democratic society.
  • Necessity: Laws permitting communications surveillance by the State must limit surveillance to that which is strictly and demonstrably necessary to achieve a legitimate aim.
  • Adequacy: Any instance of communications surveillance authorised by law must be appropriate to fulfill the specific legitimate aim identified.
  • Proportionality: Decisions about communications surveillance must be made by weighing the benefit sought to be achieved against the harm that would be caused to users’ rights and to other competing interests.
  • Competent judicial authority: Determinations related to communications surveillance must be made by a competent judicial authority that is impartial and independent.
  • Due process: States must respect and guarantee individuals' human rights by ensuring that lawful procedures that govern any interference with human rights are properly enumerated in law, consistently practiced, and available to the general public.
  • User notification: Individuals should be notified of a decision authorising communications surveillance with enough time and information to enable them to appeal the decision, and should have access to the materials presented in support of the application for authorisation.
  • Transparency: States should be transparent about the use and scope of communications surveillance techniques and powers.
  • Public oversight: States should establish independent oversight mechanisms to ensure transparency and accountability of communications surveillance.
  • Integrity of communications and systems: States should not compel service providers, or hardware or software vendors to build surveillance or monitoring capabilities into their systems, or to collect or retain information.
  • Safeguards for international cooperation: Mutual Legal Assistance Treaties (MLATs) entered into by States should ensure that, where the laws of more than one State could apply to communications surveillance, the available standard with the higher level of protection for users should apply.
  • Safeguards against illegitimate access: States should enact legislation criminalising illegal communications surveillance by public and private actors.

The Audacious ‘Right to Be Forgotten’

by Kovey Coles last modified Jul 31, 2013 10:08 AM
There has long been speculation over the permanency of our online presence. Posting excessively personal details, commenting in ways that later prove embarrassing, being caught in unflattering public photos: to our chagrin, all of these unfortunate situations often persist on the web and can continue to haunt us in future years.

Perhaps less dire: what if someone decides that she no longer wants the history of her internet activity stored in online systems?

So far, there has been confusion over what should be done, and what realistically can be done about this type of permanent presence on a platform as complex and international in scope as the internet. But now, the idea of a right to be forgotten may be able to define the rights and responsibilities in dealing with unwanted data.

The right to be forgotten is an interesting and highly contentious concept currently being debated in the new European Union Data Protection Regulations.[1]

The Data Protection Regulation Bill was proposed in 2012 by EU Commissioner Viviane Reding and stands to replace the EU’s previous Data Protection law, which was enacted in 1995. Referred to as the “right to be forgotten” (RTBF), article 17 of the proposal would essentially allow an EU citizen to demand that service providers “take all reasonable steps” to remove his or her personal data from the internet, as long as there is no “legitimate” reason for the provider to retain it.[1] Despite the evident emphasis on personal privacy, the proposal is surrounded by controversy and facing resistance from many parties, reflecting a range of concerns over the ramifications the RTBF could bring.

Not only are major IT companies staunchly opposed to the daunting task of being responsible for the erasure of data floating around the web, but governments like the United States and even Great Britain are objecting to the proposal as well.[2],[3]

From a commercial standpoint, IT companies and US lobbying forces view the concept of RTBF as a burden and a waste of resources for service providers to implement. Largely due to the RTBF clause, the new EU Data Protection proposal as a whole has witnessed intense, “unprecedented” lobbying by the largest US tech companies and US lobby groups.[4],[5] From a different angle, there are those like Great Britain, whose grievances with the RTBF concern its overzealous aim and insatiable demands.[2] There are doubts as to whether a company will even be able to track down and erase all forms of the data in question. The British Ministry of Justice stated, "The UK does not support the right to be forgotten as proposed by the European commission. The title raises unrealistic and unfair expectations of the proposals."[2] Many experts share these feasibility concerns. The Council of European Professional Informatics Societies (CEPIS) wrote a short report on the ramifications of cloud computing practices in 2011, in which it confirmed, “It is impossible to guarantee complete deletion of all copies of data. Therefore it is difficult to enforce mandatory deletion of data. Mandatory deletion of data should be included into any forthcoming regulation of Cloud Computing services, but still it should not be relied on too much: the age of a ‘Guaranteed complete deletion of data’, if it ever existed has passed."[6]

Feasibility aside, the most compelling issue in the debate over RTBF is the demanding challenge of balancing and prioritizing parallel rights. When it comes to forced data erasure, conflicts between the right to be forgotten and freedom of speech and expression easily arise. Which right takes precedence over the other?

Some opponents fear that the RTBF will hinder freedom of speech, and they have a valid point. What is the extent of personal data erasure? Abuse of the RTBF could result in some strange, Orwellian cyberspace where the mistakes or blemishes of society are constantly erased or amended, and only positivity fills the internet. There are reasonable fears that a chilling effect may come into play once providers face the hefty noncompliance fines of the Data Protection law and begin to automatically opt for customer privacy over considerations of freedom of expression. Moreover, what safeguards may be in place to prevent politicians or other public figures from removing bits of unwanted coverage?

Although these examples are extreme, considerations like these need to be made in the development of this law. With the amount of backlash from various entities, it is clear that a concept like the right to be forgotten could not exist as a simple, generalized law. It needs refinement.

Still, the concept of a RTBF is not without its supporters. Viktor Mayer-Schönberger, professor of Internet Governance at Oxford Internet Institute, considers RTBF implementation feasible and necessary, saying that even if it is difficult to remove all traces of an item, "it might be in Google's back-up, but if 99% of the population don't have access to it you have effectively been deleted."[7] Additionally, he claims that the undermining of freedom of speech and expression is "a ridiculous misstatement."[7] To him, the right to be forgotten is tied intricately to the important and natural process of forgetting things of the past.

Moreover, the Data Protection Regulation does mention certain exceptions for the RTBF, including protection for "journalistic purposes or the purpose of artistic or literary expression." [1] The problem, however, is the seeming contradiction between the RTBF and its own exceptions. In practice, it will be difficult to reconcile the powers granted by the RTBF with the limitations claimed in other sections of the Data Protection Regulation.

Currently, there are a few clean and straightforward implementations of RTBF. One would be the removal of mined user data which has been accumulated by service providers. Here, invoking the right would be possible once a person has deleted accounts or cancelled contracts with a service (thereby fulfilling the notion that the service no longer has "legitimate" reason to retain the data). Another may be the case of personal data given by minors who later want their data removed, an important example mentioned in Reding’s original proposal.[4] These narrow cases are some of the only instances where RTBF may be invoked without fear of interference with other social rights. Broader implementations of the RTBF concept, in its current unrefined form, may conflict too much with other freedoms, especially freedom of expression.

Overall, the Right to Be Forgotten is a noble concept, born out of concern for the citizen being overpowered by the internet. As an early EU publication states, "The [RTBF] rules are about empowering people, not about erasing past events or restricting the freedom of the press."[8] But at this point, too many crucial details are lacking from the draft design of the RTBF. There is concern that without proper deliberation, the concept could lead to unforeseen and undesirable outcomes. Privacy is a fundamental right that deserves to be protected, but policy makers cannot blindly follow the ideals of one right to the point where it interferes with other aspects of society.

Fortunately, recent amendment proposals have attempted some refinement of the bill. Jeffrey Rosen writes in the Stanford Law Review about a key concept that could help legitimize the right, namely an amendment proposing that only personally contributed data may be rescinded.[9] This would help avoid interference with others’ rights to expression and provide limitations on the extent of right to be forgotten claims. As Leslie Harris, president of the Center for Democracy and Technology, wrote in the Huffington Post, amendments are needed which specifically define personal data in the RTBF sense, thereby distinguishing which types of data are allowed to be removed.[10] In the upcoming months, the European Parliament will be considering such amendments to the proposal. This period will be crucial, as it will determine whether the right to be forgotten becomes a viable option for the EU’s 500 million citizens.

But even after terms are defined and safeguards are established, this underlying philosophical question remains:

Should a person be able to reclaim the right to privacy after willingly giving it up in the first place?

The RTBF is obviously a contentious topic, one which may need to be gauged individually by nation states; it will soon be revealed whether the EU becomes the first to adopt the right. If the RTBF fails to pass in the European Parliament, I would hope that it at least serves to remind people of the permanence of the data they add to the internet, further incentivizing careful consideration of what one yields to the web. Rights frequently evolve and expand to meet societal or technological advances. If we are to expand the concept of privacy, however, then we must do so with proper consideration, so that privacy does not gain disproportionate power over other rights, or vice versa.


[1]. http://bit.ly/WSZvHv

[2]. http://bit.ly/YxKaNJ

[3]. http://tcrn.ch/YdH82f

[4]. http://bit.ly/196E8qj

[5]. http://bit.ly/wJKWTZ

[6]. http://bit.ly/15aoknF

[7]. http://bit.ly/Z3JbRU

[8]. http://bit.ly/xfodhI

[9]. http://bit.ly/13uyda5

[10]. http://huff.to/16P2XIS

India's National Cyber Security Policy in Review

by Jonathan Diamond last modified Jul 31, 2013 10:40 AM
Earlier this month, the Department of Electronics and Information Technology released India’s first National Cyber Security Policy. Years in the making, the Policy sets high goals for cyber security in India and covers a wide range of topics, from institutional frameworks for emergency response to indigenous capacity building.

What the Policy achieves in breadth, however, it often lacks in depth. Vague, cursory language ultimately prevents the Policy from being anything more than an aspirational document. In order to translate the Policy’s goals into an effective strategy, a great deal more specificity and precision will be required.

The Scope of National Cyber Security

Where such precision is most required is in definitions. Having no legal force itself, the Policy arguably does not require the sort of legal precision one would expect of an act of Parliament, for example. Yet the Policy deals in terms plagued with ambiguity, cyber security not the least among them. In forgoing basic definitions, the Policy fails to define its own scope, and as a result it proves remarkably broad and arguably unfocused.

The Policy’s preamble comes close to defining cyber security in paragraph 5 when it refers to "cyber related incident[s] of national significance" involving "extensive damage to the information infrastructure or key assets…[threatening] lives, economy and national security." Here at least is a picture of cyber security on a national scale, a picture which would be quite familiar to Western policymakers: computer security practices "fundamental to both protecting government secrets and enabling national defence, in addition to protecting the critical infrastructures that permeate and drive the 21st century global economy."[*] The paragraph 5 definition of sorts becomes much broader, however, when individuals and businesses are introduced, and threats like identity theft are brought into the mix.

Here the Policy runs afoul of a common pitfall: conflating threats to the state or society writ large (e.g. cyber warfare, cyber espionage, cyber terrorism) with threats to businesses and individuals (e.g. fraud, identity theft). Although both sets of threats may be fairly described as cyber security threats, only the former is worthy of the term national cyber security. The latter would be better characterized as cyber crime. The distinction is an important one, lest cyber crime be “securitized,” or elevated to an issue of national security. National cyber security has already provided the justification for the much decried Central Monitoring System (CMS). Expanding the range of threats subsumed under this rubric may provide a pretext for further surveillance efforts on a national scale.

Apart from mission creep, this vague and overly broad conception of national cyber security risks overwhelming an as yet underdeveloped system with more responsibilities than it may be able to handle. Where cyber crime might be left up to the police, its inclusion alongside true national-level cyber security threats in the Policy suggests it may be handled by the new "nodal agency" mentioned in section IV. Thus clearer definitions would not only provide the Policy with a more focused scope, but they would also make for a more efficient distribution of already scarce resources.

What It Gets Right

Definitions aside, the Policy actually gets a lot of things right — at least as an aspirational document. It certainly covers plenty of ground, mentioning everything from information sharing to procedures for risk assessment / risk management to supply chain security to capacity building. It is a sketch of what could be a very comprehensive national cyber security strategy, but without more specifics, it is unlikely to reach its full potential. Overall, the Policy is much of what one might expect from a first draft, but certain elements stand out as worthy of special consideration.

First and foremost, the Policy should be commended for its commitment to “[safeguarding] privacy of citizen’s data” (sic). Privacy is an integral component of cyber security, and in fact other states’ cyber security strategies have entire segments devoted specifically to privacy. India’s Policy could stand to be more specific about the scope of these safeguards, however. Does the Policy aim primarily to safeguard data from criminals? Foreign agents? Could it go so far as to protect user data even from the government’s own agents? Indeed, this commitment to privacy would appear at odds with the recently unveiled CMS. Rather than merely paying lip service to the concept of online privacy, the government would be well advised to pass legislation protecting citizens’ privacy and to use such legislation as the foundation for a more robust cyber security strategy.

The Policy also does well to advocate “fiscal schemes and incentives to encourage entities to install, strengthen and upgrade information infrastructure with respect to cyber security.” Though some have argued that such regulation would impose inordinate costs on private businesses, anyone with a cursory understanding of computer networks and microeconomics could tell you that “externalities in cybersecurity are so great that even the freest free market would fail”—to quote expert Bruce Schneier. In less academic terms, a network is only as strong as its weakest link. While it is true that many larger enterprises take cyber security quite seriously, small and medium-sized businesses either lack immediate incentives to invest in security (e.g. no shareholders to answer to) or more often lack the basic resources to do so. Some form of government transfer for cyber security related investments could thus go a long way toward shoring up the country’s overall security.

The Policy also “[encourages] wider usage of Public Key Infrastructure (PKI) within Government for trusted communication and transactions.” It is surprising, however, that the Policy does not mandate the usage of PKI. In general, the document provides relatively few details on what specific security practices operators of Critical Information Infrastructure (CII) can or should implement.
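The “trusted communication” that PKI enables rests on public-key signatures: a sender signs with a private key, and anyone holding the matching public key can verify the message’s origin and integrity. The sketch below is a deliberately toy illustration of that primitive, with tiny, insecure RSA parameters invented purely for demonstration; a real PKI deployment would use a vetted cryptographic library and certificate authorities, neither of which appears here.

```python
# Toy illustration of the sign/verify primitive underlying PKI.
# The parameters are illustrative only and offer no real security.
import hashlib

p, q = 61, 53
n = p * q                          # modulus (3233), far too small for real use
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent via modular inverse (Python 3.8+)

def sign(message: bytes) -> int:
    """Sign: raise the (reduced) message digest to the private exponent."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Verify: applying the public exponent must recover the same digest."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

msg = b"trusted government communication"
sig = sign(msg)
assert verify(msg, sig)            # signature checks out against the public key
```

Because verification needs only the public pair (n, e), the key can be distributed widely; the role of PKI certificates is to bind such public keys to identities, which is the part this sketch omits.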

Where It Goes Wrong

One troubling aspect of the Policy is its ambiguous language with respect to acquisition policies and supply chain security in general. The Policy, for example, aims to “[mandate] security practices related to the design, acquisition, development, use and operation of information resources” (emphasis added). Indeed, section VI, subsection A, paragraph 8 makes reference to the “procurement of indigenously manufactured ICT products,” presumably to the exclusion of imported goods. Although supply chain security must inevitably factor into overall cyber security concerns, such restrictive acquisition policies could not only deprive critical systems of potentially higher-quality alternatives but—depending on the implementation of these policies—could also sharpen the vulnerabilities of these systems.

Not only do these preferential acquisition policies risk mandating lower-quality products, but they are also unlikely to keep up with the rapid pace of innovation in information technology. The United States provides a cautionary tale. The U.S. National Institute of Standards and Technology (NIST), tasked with producing cyber security standards for operators of critical infrastructure, made its first update to a 2005 set of standards earlier this year. Other regulatory agencies, such as the Federal Energy Regulatory Commission (FERC), move at a marginally faster pace yet are nevertheless delayed by bureaucratic processes. FERC has already moved to implement Version 5 of its Critical Infrastructure Protection (CIP) standards, nearly a year before the deadline for Version 4 compliance. The need for new standards thus outpaces the ability of industry to effectively implement them.

Fortunately, U.S. cyber security regulation has so far been technology-neutral. Operators of Critical Information Infrastructure are required only to ensure certain functionalities, not to procure their hardware and software from any particular supplier. This principle ensures competition and thus security, allowing CII operators to take advantage of the most cutting-edge technologies regardless of name, model, etc. Technology neutrality does of course raise risks, such as those emphasized by the Government of India regarding Huawei and ZTE in 2010. Risk assessment must, however, remain focused on the technology in question and avoid politicization. India’s cyber security policy can be technology neutral as long as it follows one additional principle: trust but verify.

Verification may be facilitated by the use of free and open-source software (FOSS). FOSS provides security through transparency as opposed to security through obscurity and thus enables more agile responses to security threats. Users can identify and patch bugs themselves, or otherwise take advantage of the broader user community for such fixes. Thus open-source software promotes security in much the same way that competitive markets do: by accepting a wide range of inputs.

Despite the virtues of FOSS, there are plenty of good reasons to run proprietary software, e.g. fitness for purpose, cost, and track record. Proprietary software makes verification somewhat more complicated, but not impossible. Source code escrow agreements have recently gained some traction as a verification measure for proprietary software, even with companies like Huawei and ZTE. In 2010, the infamous Chinese telecommunications giants persuaded the Indian government to lift its earlier ban on their products by concluding just such an agreement. Clearly, trust but verify is eminently practicable, and with it technology neutrality.

What’s Missing

Level of detail aside, what is most conspicuously absent from the new Policy is any framework for institutional cooperation beyond 1) the designation of CERT-In “as a Nodal Agency for coordination of all efforts for cyber security emergency response and crisis management” and 2) the designation of the “National Critical Information Infrastructure Protection Centre (NCIIPC) to function as the nodal agency for critical information infrastructure protection in the country.” The Policy mentions additionally “a National nodal agency to coordinate all matters related to cyber security in the country, with clearly defined roles & responsibilities.” Some clarity with regard to roles and responsibilities would certainly be in order. Even among these three agencies—assuming they are all distinct—it is unclear who is to be responsible for what.

More confusing still is the number of other pre-existing entities with cyber security responsibilities, in particular the National Technical Research Organization (NTRO), which in an earlier draft of the Policy was to have authority over the NCIIPC. The Ministry of Defense likewise has bolstered its cyber security and cyber warfare capabilities in recent years. Is it appropriate for these to play a role in securing civilian CII? Finally, the already infamous Central Monitoring System, justified predominantly on the very basis of cyber security, receives no mention at all. For a government that is only now releasing its first cyber security policy, India has developed a fairly robust set of institutions around this issue. It is disappointing that the Policy does not more fully address questions of roles and responsibilities among government entities.

Not only is there a lack of coordination among government cyber security entities, but there is no mention of how the public and private sectors are to cooperate on cyber security information—other than oblique references to “public-private partnerships.” Certainly there is a need for information sharing, which is currently facilitated in part by the sector-level CERTS. More interesting, however, is the question of liability for high-impact cyber attacks. To whom are private CII operators accountable in the event of disruptive cyber attacks on their systems? This legal ambiguity must necessarily be resolved in conjunction with the “fiscal schemes and incentives” also alluded to in the Policy in order to motivate strong cyber security practices among all CII operators and the public more broadly.

Next Steps

India’s inaugural National Cyber Security Policy is by and large a step in the right direction. It covers many of the most pressing issues in national cyber security and lays out a number of ambitious goals, ranging from capacity building to robust public-private partnerships. To realize these goals, the government will need a much more detailed roadmap.

Firstly, the extent of the government’s proposed privacy safeguards must be clarified and ideally backed by a separate piece of privacy legislation. As Benjamin Franklin once said, “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” When it comes to cyberspace, the Indian people must demand both liberty and safety.

Secondly, the government should avoid overly preferential acquisition policies and allow risk assessments to be technologically rather than politically driven. Procurement should moreover be technology-neutral. Open source software and source code escrow agreements can facilitate the verification measures that make technology neutrality work.

Finally, to translate this policy into a sound strategy will necessarily require that India’s various means be directed toward specific ends. The Policy hints at organizational mapping with references to CERT-In and the NCIIPC, but the roles and responsibilities of other government agencies as well as the private sector remain underdetermined. Greater clarity on these points would improve inter-agency and public-private cooperation—and thus, one hopes, security—significantly.


[*]. Melissa E. Hathaway and Alexander Klimburg, “Preliminary Considerations: On National Cyber Security” in National Cyber Security Framework Manual, ed. Alexander Klimburg, (Tallinn, Estonia: Nato Cooperative Cyber Defence Centre of Excellence, 2012), 13

Guidelines for the Protection of National Critical Information Infrastructure: How Much Regulation?

by Jonathan Diamond last modified Aug 01, 2013 04:48 AM
July has been a busy month for cyber security in India. Beginning with the release of the country’s first National Cyber Security Policy on July 2 and followed just this past week by a set of guidelines for the protection of national critical information infrastructure (CII) developed under the direction of the National Technical Research Organization (NTRO), India has made respectable progress in its thinking on national cyber security.

Yet the National Cyber Security Policy, taken together with what little is known of the as-yet restricted guidelines for CII protection, raises troubling questions, particularly regarding the regulation of cyber security practices in the private sector. Whereas the current Policy suggests the imposition of certain preferential acquisition policies, India would be best advised to maintain technology neutrality to ensure maximum security.

According to Section 70(1) of the Information Technology Act, Critical Information Infrastructure (CII) is defined as a “computer resource, the incapacitation or destruction of which, shall have debilitating impact on national security, economy, public health or safety.” In one of the 2008 amendments to the IT Act, the Central Government granted itself the authority to “prescribe the information security practices and procedures for such protected system[s].” These two paragraphs form the legal basis for the regulation of cyber security within the private sector.

Such basis notwithstanding, private cyber security remains almost completely unregulated. According to the Intermediary Guidelines [pdf], intermediaries are required to report cyber security incidents to India’s national-level computer emergency response team (CERT-In). Other than this relatively small stipulation, the only regulation in place for CII exists at the sector level. Last year the Reserve Bank of India mandated that each bank in India appoint a chief information officer (CIO) and a steering committee on information security. The finance sector is also the only sector of the four designated “critical” by the Department of Electronics and Information Technology (DEIT) Cyber Security Strategy to have established a sector-level CERT, which released a set of non-compulsory guidelines [pdf] for information security governance in late 201

The new guidelines for CII protection seek to reorganize the government’s approach to CII. According to a Times of India article on the new guidelines, the NTRO will outline a total of eight sectors of CII (including energy, aviation, telecom and the National Stock Exchange) and then “monitor if they are following the guidelines.” Such language, though vague and unsubstantiated, suggests the NTRO may ultimately be responsible for enforcing the “[mandated] security practices related to the design, acquisition, development, use and operation of information resources” described in the Cyber Security Policy. If so, operators of systems deemed critical by the NTRO or by other authorized government agencies may soon be subject to cyber security regulation—with teeth.

To be sure, some degree of cyber security regulation is necessary. After all, large swaths of the country’s CII are operated by private industry, and poor security practices on the part of one operator can easily undermine the security of the rest. To quote security expert Bruce Schneier, “the externalities in cybersecurity are so great that even the freest free market would fail.” In less academic terms, networks are only as secure as their weakest links. While it is true that many larger enterprises take cyber security quite seriously, small and medium-sized businesses either lack immediate incentives to invest in security (e.g. no shareholders to answer to) or more often lack the basic resources to do so. Some form of government transfer for cyber security related investments could thus go a long way toward shoring up the country’s overall security.

Yet regulation may well extend beyond the simple “fiscal schemes and incentives” outlined in section IV of the Policy and “provide for procurement of indigenously manufactured ICT products that have security implications.” Such, at least, was the aim of the Preferential Market Access (PMA) Policy recently put on hold by the Prime Minister’s Office (PMO). Under pressure from international industry groups, the government has promised to review the PMA Policy, with the PMO indicating it may strike out clauses “regarding preference to domestic manufacturer[s] on security related products that are to be used by private sector.” If the government’s aim is indeed to ensure maximum security (rather than to grow an infant industry), it would be well advised to extend this approach to the Cyber Security Policy and the new guidelines for CII protection.

Although there is a national security argument to be made in favor of such policies—namely that imported ICT products may contain “backdoors” or other nefarious flaws—there are equally valid arguments to be made against preferential acquisition policies, at least for the private sector. First and foremost, it is unlikely that India’s nascent cyber security institutions will be able to regulate procurement in such a rapidly evolving market. Indeed, U.S. authorities have been at pains to set cyber security standards, especially in the past several years. Secondly, by mandating the procurement of indigenously manufactured products, the government may force private industry to forgo higher quality products. Absent access to source code or the ability to effectively reverse engineer imported products, buyers should make decisions based on the products’ performance records, not geo-economic considerations like country of origin. Finally, limiting procurement to a specific subset of ICT products likewise restricts the set of security vulnerabilities available to hackers. Rather than improve security, however, a smaller, more distinct set of vulnerabilities may simply make networks easier targets for the sorts of “debilitating” attacks the Policy aims to avert.

As India broaches the difficult task of regulating cyber security in the private sector, it must emphasize flexibility above all. On one hand, the government should avoid preferential acquisition policies, which risk a) overwhelming limited regulatory resources, b) saddling CII operators with subpar products, and/or c) homogenizing the country’s attack surface. On the other hand, the government should encourage certain performance standards through precisely the sort of “fiscal schemes and incentives” alluded to in the Cyber Security Policy. Regulation should focus on what technology does and does not do, not on who made it or which rival government might have had a hand in its design. Ultimately, India should adopt a policy of technology neutrality, backed by the simple principle of trust but verify. Only then can it be truly secure.

CIS Cybersecurity Series (Part 9) - Saikat Datta

by Purba Sarkar last modified Aug 05, 2013 05:24 AM
CIS interviews Saikat Datta, Resident Editor of DNA, Delhi, as part of the Cybersecurity Series.

"Anonymous speech, in countries which have extremely severe systems of governments, which do not have freedom, etcetera, is welcome. But in a democracy like India, I do not see the need for anonymous speech because it is anyways guaranteed by the Constitution of India. So, no, I do not see the need for anonymity in an open and democratic state like India and I would be seriously worried if such a requirement comes up. Shouldn't I strive to be ideal? The ideal suggests that the constitution has guaranteed freedom of speech. Anonymity, for a time being may be acceptable to some people but I would like a situation where a person, without having to seek anonymity, can speak about anything and not be prosecuted by the state, or persecuted by society. And that is the ideal situation that I would like to strive for." - Saikat Datta, Resident Editor, DNA, Delhi.

Centre for Internet and Society presents its ninth installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

Saikat Datta is a journalist who began his career in December 1996 and has worked with several publications like The Indian Express, the Outlook magazine and the DNA newspaper. He is currently the Resident Editor of DNA, Delhi. Saikat has authored a book on India's Special Forces and presented papers at seminars organized by the Centre for Land Warfare Studies, the Centre for Air Power Studies and the National Security Guards. He has also been awarded the International Press Institute Award for investigative journalism, the National RTI award in the journalism category and the Jagan Phadnis Memorial Award for investigative journalism.

 

 
This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

'Ethical Hacker' Saket Modi Calls for Stronger Cyber Security Discussions

by Kovey Coles last modified Aug 05, 2013 01:11 PM
Twenty-two year old Saket Modi is the CEO and co-founder of Lucideus, a leading cyber security company in India which claims to have worked with 4 out of 5 top global e-commerce companies, 4 out of 10 top IT companies in the world, and 3 out of 5 top banks of the Asia Pacific.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is carrying out with Privacy International and IDRC.


At the Confederation of Indian Industry (CII) conference on July 13, titled “ACT – Achieving Cyber-Security Together,” Modi, the youngest speaker on the agenda, delivered an impromptu talk that lambasted the weaknesses of modern cyber security discussions, enlightened the audience on the capabilities and challenges of leading cyber security groups, and ultimately received a standing ovation from the crowd. As a later speaker commented, Modi’s controversial opinions and practitioner insight had "set the auditorium ablaze for the remainder of the evening". Since then, the Centre for Internet and Society (CIS) has had the pleasure of interviewing Saket Modi over Skype.

It is quite easy to find accounts of Saket Modi's introduction to hacking just by typing his name into a search engine. Faced with the prospect of failing, a teenage Saket discovered how to hack into his high school Chemistry teacher’s test and answer database. After successfully obtaining the answers, and revealing his wrongdoing to his teacher, the young man grew intrigued by the possibilities of hacking. "I thought, if I could do this in a couple hours, four hours, then what might I be able to do in four days, four weeks, four months?"

Nowadays, Modi describes himself and his Lucideus team as "ethical hackers", a term recently brought into the public eye by hacker groups. As opposed to "hacktivists", who use hacking methods (including attacks) to achieve or bring awareness to political causes, ethical hackers claim to use their computer skills exclusively to support defenses. At first, incorporating ethics into a for-profit organization’s game plan may seem confusing, as it raises key questions, such as how one determines which clients constitute ethical business. When asked, however, Modi clarifies that the ethics are not manifest in the entities Lucideus supports, but inherent in the choice to build defensive networks rather than use their skills for attack or debilitation. Nevertheless, questions remain as to whether supporting the cyber security of some entities can lead to the insecurity of others, for example, strengthening agencies which work in covert cyber espionage. On this point, Modi seems more ambivalent, saying "it depends on a case by case basis". But he still believes cyber security is a right that should be enjoyed by all, "entitled to [you] the moment you set foot on the internet".

As an experienced professional in the field who often gives input on major cyber policy decisions, Modi emphasizes the necessity of youth engagement in cyber security practice and policy. He calls his age bracket the “web generation,” those who have “grown with technology.” According to Modi, no one over 50 or 60 years of age can properly meet the current challenges of the cyber security realm. It is "a sad thing" that those older leaders hold the most power in policy making, and that they often have problems both understanding and accepting modern technological capabilities. Among the public, businesses, and government alike, there are misconceptions about the importance of cyber security and the extent of modern cyber threats, threats which Modi and his company claim to combat regularly. "About 90 per cent of the crimes that take place in cyber space are because of lack of knowledge, rather than the expertise of the hacker,” he explains. Modi mentions a few basic misconceptions, as simple as "if I have an anti-virus, my system is secured" or "if you have HTTPS certificate and SSL connection, your system is secured". “These are like wearing an elbow guard while playing cricket,” Modi says. “If the ball comes at the elbow then you are protected, but what about the rest of the body?”

This highlights another problem evident in India’s current cyber security scene: the lack of “quality institutes to produce good cyber security experts.” Modi takes offence, for example, that there is not “a single institute which is providing cyber security at the undergraduate level [in India].” He alludes to the recently unveiled National Cyber Security Policy, specifically its call for five lakh cyber security experts in the coming years. He calls this “a big figure,” and agrees that there needs to be a lot more awareness throughout the nation. “You really have to change a lot of things,” he says, “in order to get the right things in the right place here in India.”

When considering citizen privacy in relation to cyber security, and the relationship between the two (be it direct or inverse), Saket Modi says the important factor is the governing body, because the issue ultimately comes down to trust. Citizens must trust the “right people with the right qualifications” to store and protect their sensitive data, and to respect privacy. Modi is no novice to the importance of personal data protection; his company works with a plethora of extremely sensitive information relating to both its clients and its clients’ own customers, so it operates with due care lest it create a “wikileaks part two.”

On internationalization and cyber security, he views the connection between the two as natural and intrinsic. “Cyberspace has added a new dimension to humanity,” says Modi, explaining that the old constraints of physical space and borders no longer apply. International cooperation is especially pertinent, according to Modi, because the greatest challenge in catching today’s criminal hackers is their international anonymity, “the ability to jump from one country to the other in a matter of milliseconds.”

With the extent of the challenges facing cyber defense specialists, and with the somewhat disorderly current state of Indian cyber security, it is curious that Saket Modi has devoted himself to the "ethical" side of hacking. Why haven’t he and the rest of the Lucideus team resorted to offensive hacking, when Modi claims that the majority of the world’s cyber attacks are committed by people between the ages of 15 and 24? Apparently, the answer is simple. “We believe in the need for ethical hacking,” he defends. “We believe in the purpose of making the internet safer.”

Ethical Issues in Open Data

by Kovey Coles last modified Aug 07, 2013 09:19 AM
On August 1, 2013, I took part in a web meeting, organized and hosted by Tim Davies of the World Wide Web foundation. The meeting, titled “Ethical issues in Open Data,” had an agenda focused around privacy considerations in the context of the open data movement.

The main panelists, Carly Nyst and Sam Smith from Privacy International, as well as Steve Song from the International Development Research Centre, were joined by roughly a dozen other privacy and development researchers from around the globe in the hour-long session.

The primary issue of the meeting was the concern over modern capabilities of cross-analytics for de-anonymizing data sets and revealing personally identifiable information (PII) in open data. Open data can constitute publicly available information such as budgets, infrastructures, and population statistics, as long as the data meets the three open data characteristics: accessibility, machine readability, and availability for re-use. “Historically,” said Tim Davies, “public registers have been protected through obscurity.” However, both the capabilities of data analysts and the definition of personal data have continued to expand in recent years. This concern thus presents a conflict between researchers who advocate that governments release open data reports, and researchers who emphasize privacy in the developing world.
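The cross-analytics concern can be made concrete with a minimal sketch: an "anonymized" data set is linked back to named individuals by joining on quasi-identifiers it shares with a public register. All names and records below are invented purely for illustration.

```python
# Invented "anonymized" records: names removed, but quasi-identifiers remain.
anonymized_health = [
    {"zip": "06470", "birth_year": 1961, "gender": "F", "diagnosis": "asthma"},
    {"zip": "06470", "birth_year": 1984, "gender": "M", "diagnosis": "diabetes"},
]

# Invented public register containing names alongside the same quasi-identifiers.
public_register = [
    {"name": "A. Resident", "zip": "06470", "birth_year": 1961, "gender": "F"},
    {"name": "B. Resident", "zip": "06482", "birth_year": 1984, "gender": "M"},
]

def reidentify(anon_rows, public_rows, keys=("zip", "birth_year", "gender")):
    """Link records from the two sets that share the same quasi-identifier values."""
    matches = []
    for anon in anon_rows:
        for pub in public_rows:
            if all(anon[k] == pub[k] for k in keys):
                matches.append({"name": pub["name"], "diagnosis": anon["diagnosis"]})
    return matches

# A single unique join re-attaches a name to a "de-identified" diagnosis.
print(reidentify(anonymized_health, public_register))
```

A unique combination of a few mundane fields is often enough to single a person out, which is why releasing "non-personal" open data can still erode privacy once other data sets are available for cross-analysis.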

Steve Song, advisor to the IDRC Information & Networks program, spoke of the potential collateral damage that comes with publishing ever more types of information. Song captured the imperative of the meeting, saying, “privacy needs to be a core part of the open data conversation.” In his presentation, he gave a particularly interesting example of the tension between public and private information. Following the infamous 2012 school shooting in Newtown, Connecticut, information on Newtown’s gun-permit-holding citizens (made publicly available through America’s Freedom of Information Act) was aggregated into an interactive map that revealed the citizens’ addresses. This became problematic for the Newtown community, as the map not only singled out homes that exercised the right to bear arms but also indirectly revealed which homes were without firearm protection and thereby more vulnerable to theft and crime. The Newtown example clearly demonstrates the relationship (and conflict) between open data and privacy; it comes down to the conflict between the right to information and the right to privacy.

An apparent issue surrounding open data is its perceived binary nature. Many advocates view data as either open or not; any intermediate boundaries are merely forms of governments limiting data accessibility. A point raised by meeting attendee Raed Sharif therefore aptly presented a counter-argument. Sharif noted how, inversely, privacy conceptions may pose a threat to open data: governments could take advantage of privacy arguments to justify refusing to publish open reports.

Carly Nyst summarized the privacy concern in her remarks near the end of the meeting. She reasoned that the open data mission is viable only if limited to generic data, i.e., data about infrastructure or other information that is in no way personal. Doing so avoids intrusions on individual privacy. Until anonymization techniques advance enough to defeat modern re-identification methods, publicly publishing PII may prove too risky. It was generally agreed during the meeting that open data is not inherently bad; in fact, its analysis and availability can be beneficial, but the threat of its misuse makes it dangerous. For the future of open data, researchers and advocates should consider more nuanced approaches to the concept in order to respect other ethical considerations, such as privacy.

FinFisher in India and the Myth of Harmless Metadata

by Maria Xynou last modified Aug 13, 2013 11:30 AM
In this article, Maria Xynou argues that metadata is anything but harmless, especially since FinFisher — one of the world's most controversial types of spyware — uses metadata to target individuals.

In light of PRISM, the Central Monitoring System (CMS) and other such surveillance projects in India and around the world, the question of whether the collection of metadata is “harmless” has arisen.[1] In order to examine this question, FinFisher[2] — surveillance spyware — has been chosen as a case study to briefly examine to what extent the collection and surveillance of metadata can potentially violate the right to privacy and other human rights. FinFisher has been selected as a case study not only because its servers have been recently found in India[3] but also because its “remote monitoring solutions” appear to be very pervasive even on the mere grounds of metadata.

FinFisher in India

FinFisher is spyware which has the ability to take control of target computers and capture even encrypted data and communications. The software is designed to evade detection by anti-virus software and has versions which work on mobile phones of all major brands.[4] In many cases, the surveillance suite is installed after the target accepts installation of a fake update to commonly used software.[5] Citizen Lab researchers have found three samples of FinSpy that masquerades as Firefox.[6]

FinFisher is a line of remote intrusion and surveillance software developed by Munich-based Gamma International. FinFisher products are sold exclusively to law enforcement and intelligence agencies by the UK-based Gamma Group.[7] A few months ago, it was reported that command and control servers for FinSpy backdoors, part of Gamma International´s FinFisher “remote monitoring solutions”, were found in a total of 25 countries, including India.[8]

The following map, published by the Citizen Lab, shows the 25 countries in which FinFisher servers have been found.[9]

[Map: countries in which FinFisher command and control servers have been detected (source: Citizen Lab)]

The above map shows the results of scanning for characteristics of FinFisher command and control servers.

FinFisher spyware was not found in the countries coloured blue, while green marks countries that did not respond to the scan. Countries where FinFisher was detected are coloured on a scale from light orange to dark red, numbered 1 to 6, with 1 representing the least active servers and 6 the most active. On this scale, India is marked a 3 in terms of actively using FinFisher.[10]

Research published by the Citizen Lab reveals that FinSpy servers were recently found in India, which indicates that Indian law enforcement agencies may have bought this spyware from Gamma Group and might be using it to target individuals in India.[11] According to the Citizen Lab, FinSpy servers in India have been detected through the HostGator operator and the first digits of the IP address are: 119.18.xxx.xxx. Releasing complete IP addresses in the past has not proven useful, as the servers are quickly shut down and relocated, which is why only the first two octets of the IP address are revealed.[12]
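Fingerprint scanning of the kind the Citizen Lab describes works, in principle, by sending a crafted probe to a host and flagging it when the reply contains a distinctive fragment. The sketch below illustrates only the general technique: the probe payload and the banner fragment are invented placeholders, not real FinFisher signatures (which the Citizen Lab does not publish in full).

```python
import socket

# Hypothetical probe and reply fragment, for illustration only.
PROBE = b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n"
FINGERPRINT = b"X-Sample-Banner"  # invented distinctive reply fragment

def matches_fingerprint(response: bytes) -> bool:
    """Flag a server whose reply contains the distinctive fragment."""
    return FINGERPRINT in response

def probe_host(host: str, port: int = 80, timeout: float = 3.0) -> bool:
    """Send the probe to one host and test the first 4 KB of its reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(PROBE)
        return matches_fingerprint(sock.recv(4096))
```

Repeating such a probe across large address ranges is how researchers arrive at country-level maps of command and control servers; once a fingerprint leaks, operators tend to shut servers down and relocate them, which is also why only partial IP addresses are published.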

The Citizen Lab's research reveals that FinFisher “remote monitoring solutions” were found in India, which, according to Gamma Group's brochures, include the following:

  • FinSpy: hardware or software which monitors targets that regularly change location, use encrypted and anonymous communications channels and reside in foreign countries. FinSpy can remotely monitor computers and encrypted communications, regardless of where in the world the target is based. FinSpy is capable of bypassing 40 regularly tested antivirus systems, of monitoring the calls, chats, file transfers, videos and contact lists on Skype, of conducting live surveillance through a webcam and microphone, of silently extracting files from a hard disk, and of conducting live remote forensics on target systems. FinSpy is hidden from the public through anonymous proxies.[13]
  • FinSpy Mobile: hardware or software which remotely monitors mobile phones. FinSpy Mobile enables the interception of mobile communications in areas without a network, and offers access to encrypted communications, as well as to data stored on the devices that is not transmitted. Some key features of FinSpy Mobile include the recording of common communications like voice calls, SMS/MMS and emails, the live surveillance through silent calls, the download of files, the country tracing of targets and the full recording of all BlackBerry Messenger communications. FinSpy Mobile is hidden from the public through anonymous proxies.[14]
  • FinFly USB: hardware which is inserted into a computer and which can automatically install the configured software with little or no user-interaction and does not require IT-trained agents when being used in operations. The FinFly USB can be used against multiple systems before being returned to the headquarters and its functionality can be concealed by placing regular files like music, video and office documents on the device. As the hardware is a common, non-suspicious USB device, it can also be used to infect a target system even if it is switched off.[15]
  • FinFly LAN: software which can deploy a remote monitoring solution on a target system in a local area network (LAN). Some of the major challenges law enforcement faces are mobile targets, as well as targets who do not open any infected files that have been sent via email to their accounts. FinFly LAN is not only able to deploy a remote monitoring solution on a target´s system in local area networks, but it is also able to infect files that are downloaded by the target, by sending fake software updates for popular software or to infect the target by injecting the payload into visited websites. Some key features of the FinFly LAN include: discovering all computer systems connected to LANs, working in both wired and wireless networks, and remotely installing monitoring solutions through websites visited by the target. FinFly LAN has been used in public hotspots, such as coffee shops, and in the hotels of targets.[16]
  • FinFly Web: software which can deploy remote monitoring solutions on a target system through websites. FinFly Web is designed to provide remote and covert infection of a target system by using a wide range of web-based attacks. FinFly Web provides a point-and-click interface, enabling the agent to easily create a custom infection code according to selected modules. It provides fully-customizable web modules, it can be covertly installed into every website and it can install the remote monitoring system even if only the email address is known.[17]
  • FinFly ISP: hardware or software which deploys a remote monitoring solution on a target system through an ISP network. FinFly ISP can be installed inside the Internet Service Provider Network, it can handle all common protocols and it can select targets based on their IP address or Radius Logon Name. Furthermore, it can hide remote monitoring solutions in downloads by targets, it can inject remote monitoring solutions as software updates and it can remotely install monitoring solutions through websites visited by the target.[18]

Although FinFisher is supposed to be used for “lawful interception”, it has gained notoriety for targeting human rights activists.[19] According to Morgan Marquis-Boire, a security researcher and technical advisor at the Munk School and a security engineer at Google, FinSpy has been used in Ethiopia to target an opposition group called Ginbot.[20] Researchers have argued that FinFisher has been sold to Bahrain's government to target activists, and such allegations were based on an examination of malicious software which was emailed to Bahraini activists.[21] Privacy International has argued that FinFisher has been deployed in Turkmenistan, possibly to target activists and political dissidents.[22]

Many questions revolving around the use of FinFisher and its “remote monitoring solutions” remain unanswered, as there is currently inadequate proof of whether this spyware is being used by law enforcement agencies to target individuals in the countries where command and control servers have been found, such as India.[23] However, FinFisher's brochures, which were circulated at the ISS World trade shows and leaked by WikiLeaks, do reveal some confirmed facts: Gamma International claims that its FinFisher products are capable of taking control of target computers, capturing encrypted data and evading mainstream anti-virus software.[24] Such products are exhibited at the world's largest surveillance trade show and probably sold to law enforcement agencies around the world.[25] This alone unveils a concerning fact: spyware so sophisticated that it evades even encryption and anti-virus software is currently on the market, and law enforcement agencies can potentially use it to target activists and anyone who does not comply with social conventions.[26] A few months ago, two Indian women were arrested after questioning the shutdown of Mumbai for Shiv Sena patriarch Bal Thackeray's funeral.[27] Thus, it remains unclear what type of behaviour is targeted by law enforcement agencies and whether spyware such as FinFisher would be used in India to track individuals without a legally specified purpose.

Furthermore, India lacks privacy legislation which could safeguard individuals from potential abuse, while sections 66A and 69 of the Information Technology (Amendment) Act, 2008, empower Indian authorities with extensive surveillance capabilities.[28] While it remains unclear if Indian law enforcement agencies are using FinFisher spy products to unlawfully target individuals, it is a fact that FinFisher command and control servers have been found in India and that, if used, they could have severe consequences for individuals' right to privacy and other human rights.[29]

The Myth of Harmless Metadata

Over the last few months, it has been reported that the Central Monitoring System (CMS) is being implemented in India, through which all telecommunications and Internet communications in the country can be centrally intercepted by Indian authorities. This mass surveillance of communications in India is enabled by the absence of privacy legislation, and Indian authorities are currently capturing the metadata of communications.[30]

Last month, Edward Snowden leaked confidential U.S. documents on PRISM, the top-secret National Security Agency (NSA) surveillance programme that collects metadata from telecommunications and Internet communications. It has been reported that through PRISM, the NSA has tapped into the servers of nine leading Internet companies: Microsoft, Google, Yahoo, Skype, Facebook, YouTube, PalTalk, AOL and Apple.[31] While the extent to which the NSA is actually tapping into these servers remains unclear, it is certain that the NSA has collected metadata on a global level.[32] Yet the question of whether the collection of metadata is “harmful” remains ambiguous.

According to the National Information Standards Organization (NISO), the term “metadata” is defined as “structured information that describes, explains, locates or otherwise makes it easier to retrieve, use or manage an information resource”. NISO claims that metadata is “data about data” or “information about information”.[33] Furthermore, metadata is considered valuable due to its following functions:

  • Resource discovery
  • Organizing electronic resources
  • Interoperability
  • Digital Identification
  • Archiving and preservation

Metadata can be used to find resources by relevant criteria, to identify resources, to bring similar resources together, to distinguish dissimilar resources and to give location information. Electronic resources can be organized through the use of various software tools which can automatically extract and reformat information for Web applications. Interoperability is promoted through metadata, as describing a resource with metadata allows it to be understood by both humans and machines, which means that data can automatically be processed more effectively. Digital identification is enabled through metadata, as most metadata schemes include standard numbers for unique identification. Moreover, metadata enables the archival and preservation of large volumes of digital data.[34]

Surveillance projects, such as PRISM and India's CMS, collect large volumes of metadata, which include the numbers of both parties on a call, location data, call duration, unique identifiers, the International Mobile Subscriber Identity (IMSI) number, email addresses, IP addresses and browsed webpages.[35] However, the fact that such surveillance projects may not have access to content data might potentially create a false sense of security.[36] When Microsoft released its report on data requests by law enforcement agencies around the world in March 2013, it revealed that most of the disclosed data was metadata, while relatively very little content data was allegedly disclosed.[37]

Similarly, Google's transparency report reveals that the company disclosed large volumes of metadata to law enforcement agencies, while restricting its disclosure of content data.[38]

Such reports may provide a sense of security to the public, as they reassure readers that the content of personal emails, for example, has not been shared with the government, merely email addresses – which might be publicly available online anyway. However, is content data actually more “harmful” than metadata? Is metadata “harmless”? How much does metadata actually reveal?

The Guardian recently published an article which includes an example of how individuals can be tracked through their metadata. In particular, the example explains how an individual is tracked – despite using an anonymous email account – by logging in from various hotels' public Wi-Fi and by leaving trails of metadata that include times and locations. This example illustrates how an individual can be tracked through metadata alone, even when anonymous accounts are being used.[39]

Wired published an article which states that metadata can potentially be more harmful than content data because “unlike our words, metadata doesn't lie”. Content data shows what an individual says – which may be true or false – whereas metadata shows what an individual does. While the validity of the content of an email may be debatable, it is undeniable that an individual logged into specific websites – if that is what that individual's IP address shows. Metadata, such as an individual's browsing habits, may provide a more thorough and accurate profile than that individual's email content, which is why metadata can potentially be more harmful than content data.[40]

Furthermore, voice content is hard to process and written content in an email or chat communication may not always be valid. Metadata, on the other hand, provides concrete patterns of an individual's behaviour, interests and interactions. For example, metadata can potentially map out an individual's political affiliation, interests, economic background, institution, location, habits and the people that individual interacts with. Such data can potentially be more valuable than content data, because while the validity of email content is debatable, metadata usually provides undeniable facts. Not only is metadata more accurate than content data, but it is also ideally suited to automated analysis by a computer. As most metadata consists of numeric figures, it can easily be analysed by data mining software, whereas content data is far more complicated to process.[41]
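How readily metadata lends itself to automated profiling can be sketched in a few lines of Python. The browsing log below is entirely hypothetical, and the domains are placeholders; the point is that simple frequency counts over metadata, with no access to page content, already hint at interests and daily routine.

```python
from collections import Counter

# Hypothetical browsing metadata: (hour of day, domain visited).
# The content of the pages is never examined -- only the pattern.
log = [
    (9, "news-site.example"), (9, "party-x.example"), (13, "bank.example"),
    (21, "party-x.example"), (21, "party-x.example"), (22, "health-forum.example"),
    (9, "news-site.example"),
]

domains = Counter(domain for _, domain in log)
hours = Counter(hour for hour, _ in log)

# The most-visited domains hint at interests (politics, news, finance)...
print(domains.most_common(2))  # [('party-x.example', 3), ('news-site.example', 2)]
# ...and the active hours sketch out a daily routine.
print(sorted(hours))  # [9, 13, 21, 22]
```

Real analysis systems are vastly more sophisticated, but the underlying operation, counting and correlating structured records, is exactly this cheap and this scalable.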

FinFisher products, such as FinFly LAN, FinFly Web and FinFly ISP, provide solid proof that the collection of metadata can potentially be “harmful”. In particular, FinFly LAN can be deployed against a target system in a local area network (LAN) by infecting files that are downloaded by the target, by sending fake software updates for popular software or by injecting its payload into websites visited by the target. The fact that FinFly LAN can remotely install monitoring solutions through websites visited by the target indicates that metadata alone can be used to acquire other sensitive data.[42]

FinFly Web can deploy remote monitoring solutions on a target system through websites. Additionally, FinFly Web can be covertly installed into every website and it can install the remote monitoring system even if only the target's email address is known.[43] FinFly ISP can select targets based on their IP address or RADIUS logon name. Furthermore, FinFly ISP can remotely install monitoring solutions through websites visited by the target, as well as inject remote monitoring solutions as software updates.[44] In other words, FinFisher products, such as FinFly LAN, FinFly Web and FinFly ISP, can target individuals, take control of their computers and their data, and capture even encrypted data and communications with the help of metadata alone.

The example of FinFisher products illustrates that metadata can potentially be as “harmful” as content data, if acquired unlawfully and without individual consent.[45] Thus, surveillance schemes, such as PRISM and India's CMS, which capture metadata without individuals' consent can potentially pose a major threat to the right to privacy and other human rights.[46] Privacy can be defined as the claim of individuals, groups or institutions to determine when, how and to what extent information about them is communicated to others.[47] Furthermore, privacy is at the core of human rights because it protects individuals from abuse by those in power.[48] The unlawful collection of metadata exposes individuals to the potential violation of their human rights, as it is not transparent who has access to their data, whether it is being shared with third parties or for how long it is being retained.

It is not clear whether Indian law enforcement agencies are actually using FinFisher products, but the Citizen Lab did find FinFisher command and control servers in the country, which indicates a high probability that such spyware is being used.[49] This is highly concerning, not only because these spy products have such advanced capabilities that they can capture even encrypted data, but also because India currently lacks privacy legislation which could safeguard individuals.

Thus, it is recommended that Indian law enforcement agencies be transparent and accountable if they are using spyware which can potentially breach citizens' human rights, and that privacy legislation be enacted. Lastly, it is recommended that all surveillance technologies be strictly regulated with regard to the protection of human rights, and that Indian authorities adopt the principles on communications surveillance formulated by the Electronic Frontier Foundation and Privacy International.[50] The above could provide a decisive first step in ensuring that India is the democracy it claims to be.


[1]. Robert Anderson (2013), “Wondering What Harmless 'Metadata' Can Actually Reveal? Using Own Data, German Politician Shows Us”, The CSIA Foundation, http://bit.ly/1cIhu7G

[2]. Gamma Group, FinFisher IT Intrusion, http://bit.ly/fnkGF3

[3]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[4]. Michael Lewis, “FinFisher Surveillance Spyware Spreads to Smartphones”, The Star: Business, 30 August 2012, http://bit.ly/14sF2IQ

[5]. Marcel Rosenbach, “Troublesome Trojans: Firm Sought to Install Spyware Via Faked iTunes Updates”, Der Spiegel, 22 November 2011, http://bit.ly/14sETVV

[6]. Intercept Review, Mozilla to Gamma: stop disguising your FinSpy as Firefox, 02 May 2013, http://bit.ly/131aakT

[7]. Intercept Review, LI Companies Review (3) – Gamma, 05 April 2012, http://bit.ly/Hof9CL

[8]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[9]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[10]. Ibid.

[11]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[12]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[13]. Gamma Group, FinFisher IT Intrusion, FinSpy: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/zaknq5

[14]. Gamma Group, FinFisher IT Intrusion, FinSpy Mobile: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/19pPObx

[15]. Gamma Group, FinFisher IT Intrusion, FinFly USB: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/1cJSu4h

[16]. Gamma Group, FinFisher IT Intrusion, FinFly LAN: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/14J70Hi

[17]. Gamma Group, FinFisher IT Intrusion, FinFly Web: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/19fn9m0

[18]. Gamma Group, FinFisher IT Intrusion, FinFly ISP: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/13gMblF

[19]. Gerry Smith, “FinSpy Software Used To Surveil Activists Around The World, Report Says”, The Huffington Post, 13 March 2013, http://huff.to/YmmhXI

[20]. Jeremy Kirk, “FinFisher Spyware seen Targeting Victims in Vietnam, Ethiopia”, Computerworld: IDG News, 14 March 2013, http://bit.ly/14J8BwW

[21]. Reporters without Borders: For Freedom of Information (2012), The Enemies of the Internet: Special Edition: Surveillance, http://bit.ly/10FoTnq

[22]. Privacy International, FinFisher Report, http://bit.ly/QlxYL0

[23]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[24]. Gamma Group, FinFisher IT Intrusion, FinSpy: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/zaknq5

[25]. Adi Robertson, “Paranoia Thrives at the ISS World Cybersurveillance Trade Show”, The Verge, 28 December 2011, http://bit.ly/tZvFhw

[26]. Gerry Smith, “FinSpy Software Used To Surveil Activists Around The World, Report Says”, The Huffington Post, 13 March 2013, http://huff.to/YmmhXI

[27]. BBC News, “India arrests over Facebook post criticising Mumbai shutdown”, 19 November 2012, http://bbc.in/WoSXkA

[28]. Indian Ministry of Law, Justice and Company Affairs, The Information Technology (Amendment) Act, 2008, http://bit.ly/19pOO7t

[29]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[30]. Phil Muncaster, “India introduces Central Monitoring System”, The Register, 08 May 2013, http://bit.ly/ZOvxpP

[31]. Glenn Greenwald & Ewen MacAskill, “NSA PRISM program taps in to user data of Apple, Google and others”, The Guardian, 07 June 2013, http://bit.ly/1baaUGj

[32]. BBC News, “Google, Facebook and Microsoft seek data request transparency”, 12 June 2013, http://bbc.in/14UZCCm

[33]. National Information Standards Organization (2004), Understanding Metadata, NISO Press, http://bit.ly/LCSbZ

[34]. Ibid.

[35]. The Hindu, “In the dark about 'India's PRISM'”, 16 June 2013, http://bit.ly/1bJCXg3 ; Glenn Greenwald, “NSA collecting phone records of millions of Verizon customers daily”, The Guardian, 06 June 2013, http://bit.ly/16L89yo

[36]. Robert Anderson, “Wondering What Harmless 'Metadata' Can Actually Reveal? Using Own Data, German Politician Shows Us”, The CSIA Foundation, 01 July 2013, http://bit.ly/1cIhu7G

[37]. Microsoft: Corporate Citizenship, 2012 Law Enforcement Requests Report, http://bit.ly/Xs2y6D

[38]. Google, Transparency Report, http://bit.ly/14J7hKp

[39]. Guardian US Interactive Team, A Guardian Guide to your Metadata, The Guardian, 12 June 2013, http://bit.ly/ZJLkpy

[40]. Matt Blaze, “Phew, NSA is Just Collecting Metadata. (You Should Still Worry)”, Wired, 19 June 2013, http://bit.ly/1bVyTJF

[41]. Ibid.

[42]. Gamma Group, FinFisher IT Intrusion, FinFly LAN: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/14J70Hi

[43]. Gamma Group, FinFisher IT Intrusion, FinFly Web: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/19fn9m0

[44]. Gamma Group, FinFisher IT Intrusion, FinFly ISP: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/13gMblF

[45]. Robert Anderson, “Wondering What Harmless 'Metadata' Can Actually Reveal? Using Own Data, German Politician Shows Us”, The CSIA Foundation, 01 July 2013, http://bit.ly/1cIhu7G

[46]. Shalini Singh, “India's surveillance project may be as lethal as PRISM”, The Hindu, 21 June 2013, http://bit.ly/15oa05N

[47]. Cyberspace Law and Policy Centre, Privacy, http://bit.ly/14J5u7W

[48]. Bruce Schneier, “Privacy and Power”, Schneier on Security, 11 March 2008, http://bit.ly/i2I6Ez

[49]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[50]. Elonnai Hickok, “Draft International Principles on Communications Surveillance and Human Rights”, The Centre for Internet and Society, 16 January 2013, http://bit.ly/XCsk9b

Freedom from Monitoring: India Inc Should Push For Privacy Laws

by Sunil Abraham last modified Aug 21, 2013 07:04 AM
More surveillance than absolutely necessary actually undermines the security objective.

This article by Sunil Abraham was published in Forbes India Magazine on August 21, 2013.


I think I understand why the average Indian IT entrepreneur or enterprise does not have a position on blanket surveillance. This is because the average Indian IT enterprise’s business model depends on labour arbitrage, not intellectual property. And therefore they have no worries about proprietary code or unfiled patent applications being stolen by competitors via rogue government officials within projects such as NATGRID, UID and, now, the CMS.

A sub-section of industry, especially the technology industry, will always root for blanket surveillance measures. The surveillance industry has many different players, ranging from those selling biometric and CCTV hardware to those providing solutions for big data analytics and lawful interception systems. There are also more controversial players who provide spyware, especially those in the market for zero-day exploits. The cheerleaders for the surveillance industry are techno-determinists who believe you can solve any problem by throwing enough of the latest and most expensive technology at it.

What is surprising, though, is that other indigenous or foreign enterprises that depend on secrecy and confidentiality—in sectors such as banking, finance, health, law, ecommerce, media, consulting and communications—also don’t seem to have a public position on the growing surveillance ambitions of ‘democracies’ such as India and the United States of America. (Perhaps the only exceptions are a few multinational internet and software companies that have made some show of resistance and disagreement with the blanket surveillance paradigm.)

Is it because these businesses are patriotic? Do they believe that secrecy, confidentiality and, most importantly, privacy, must be sacrificed for national security? If that were true then it would not be a particularly wise thing to do, as privacy is the precondition for security. Ann Cavoukian, privacy commissioner of Ontario, calls it a false dichotomy. Bruce Schneier, security technologist and writer, calls it a false zero sum game; he goes on to say, “There is no security without privacy. And liberty requires both security and privacy.”

The reason why the secret recipe of Coca-Cola is still secret after more than 120 years is the same as the reason why a captured soldier cannot spill the beans on the overall war strategy. Corporations, like militaries, have layers and layers of privacy and secrecy. The ‘need to know’ principle resists all centralising tendencies, such as blanket surveillance. It’s important to note that targeted surveillance to identify a traitor or spy within the military, or someone engaged in espionage within a corporation, is essential. However, any more surveillance than absolutely necessary actually undermines the security objective. To summarise, privacy is a pre-condition to the security of the individual, the enterprise, the military and the nation state.

Most people complaining online about projects like the Central Monitoring System seem to think that India has no privacy laws. This is completely untrue: We have around 50 different laws, rules and regulations that aim to uphold privacy and confidentiality in various domains. Unfortunately, most of those policies are very dated and do not sufficiently take into account the challenges of contemporary information societies. These policy documents need to be updated and harmonised through the enactment of a new horizontal privacy law. A small minority will say that Section 43(A) of the Information Technology Act is India’s privacy law. That is not completely untrue, but is a gross exaggeration. Section 43(A) is really only a data security provision and, at that, it does not even comprehensively address data protection, which is only a sub-set of the overall privacy regulation required in a nation.

What would an ideal privacy law for India look like? For one, it would protect the rights of all persons, regardless of whether they are citizens or residents. Two, it would define privacy principles. Three, it would establish the office of an independent and autonomous privacy commissioner, who would be sufficiently empowered to investigate and take action against both government and private entities. Four, it would define civil and criminal offences, remedies and penalties. And five, it would have an overriding effect on previous legislation that does not comply with all the privacy principles.

The Justice AP Shah Committee report, released in October 2012, defined the Indian privacy principles as notice, choice and consent, collection limitation, purpose limitation, access and correction, disclosure of information, security, openness and accountability. The report also lists the exemptions and limitations, so that privacy protections do not have a chilling effect on the freedom of expression and transparency enabled by the Right to Information Act.

The Department of Personnel and Training has been working on a privacy bill for the last three years. Two versions of the bill had leaked before the Justice AP Shah Committee was formed. The next version of the bill, hopefully implementing the recommendations of the Justice AP Shah Committee report, is expected in the near future. In a multi-stakeholder-based parallel process, the Centre for Internet and Society (where I work), along with FICCI and DSCI, is holding seven round tables on a civil society draft of the privacy bill and the industry-led efforts on co-regulation.

The Indian ITES, KPO and BPO sector should be particularly pleased with this development. As should any other Indian enterprise that holds personal information of EU and US nationals. This is because the EU, after the enactment of the law, will consider data protection in India adequate as per the requirements of its Data Protection Directive. This would mean that these enterprises would not have to spend twice the time and resources ensuring compliance with two different regulatory regimes.

Is the lack of enthusiasm for privacy in the Indian private sector symptomatic of Indian societal values? Can we blame it on cultural relativism, best exemplified by what Simon Davies calls “the Indian Train Syndrome, in which total strangers will disclose their lives on a train to complete strangers”? But surely, when email addresses are exchanged at the end of that conversation, they are not accompanied by passwords. Privacy is perhaps differently configured in Indian societies but it is definitely not dead. Fortunately for us, calls to protect this important human right are growing every day.

The Personal Data (Protection) Bill, 2013

by Prachi Arya last modified Aug 30, 2013 02:53 PM
Below is the text of the Personal Data (Protection) Bill, 2013 as discussed at the 6th Privacy Roundtable, New Delhi held on 24 August 2013. Note: This version of the Bill caters only to the Personal Data regime. The surveillance and privacy of communications regime was not discussed at the 6th Privacy Roundtable.

PDF document icon Personal Data (Protection) Bill.pdf — PDF document, 193 kB (198250 bytes)

Report on the Sixth Privacy Roundtable Meeting, New Delhi

by Prachi Arya last modified Aug 30, 2013 03:04 PM
In 2013 the Centre for Internet and Society (CIS) drafted the Privacy Protection Bill as a citizens' version of a privacy legislation for India. Since April 2013, CIS has been holding Privacy Roundtables in collaboration with Federation of Indian Chambers of Commerce and Industry (FICCI) and DSCI, with the objective of gaining public feedback to the Privacy Protection Bill and other possible frameworks for privacy in India. The following is a report on the Sixth Privacy Roundtable held in New Delhi on August 24, 2013.


This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC.


Introduction

A series of seven multi-stakeholder roundtable meetings on "privacy" were conducted by CIS in collaboration with FICCI from April 2013 to August 2013 under the Internet Governance initiative. DSCI joined CIS and FICCI as a co-organizer on April 20, 2013.

CIS was a member of the Justice A.P. Shah Committee which drafted the "Report of the Group of Experts on Privacy". CIS also drafted a Privacy (Protection) Bill 2013 (hereinafter referred to as ‘the Bill’), with the objective of establishing a well protected privacy regime in India. CIS has also volunteered to champion the session/workshops on "privacy" in the final meeting on Internet Governance proposed for October 2013.

At the roundtables, the Report of the Group of Experts on Privacy and the text of the Privacy (Protection) Bill 2013 are discussed. The discussions and recommendations from the six round table meetings will be presented at the Internet Governance meeting in October 2013.

The dates of the seven Privacy Round Table meetings are listed below:

  1. New Delhi Roundtable: April 13, 2013
  2. Bangalore Roundtable: April 20, 2013
  3. Chennai Roundtable: May 18, 2013
  4. Mumbai Roundtable: June 15, 2013
  5. Kolkata Roundtable: July 13, 2013
  6. New Delhi Roundtable: August 24, 2013
  7. New Delhi Final Roundtable and National Meeting: October 19, 2013

This Report provides an overview of the proceedings of the Sixth Privacy Roundtable (hereinafter referred to as 'the Roundtable'), conducted at FICCI, Federation House in Delhi on August 24, 2013. The Personal Data (Protection) Bill, 2013 was discussed at the Roundtable.

The Sixth Privacy Roundtable began with reflections on the evolution of the Bill. In its penultimate form, the Bill stands substantially changed as compared to its previous versions. For the purpose of this Roundtable, which entailed participation largely from industry organizations and other entities that handle personal data, only the personal data regime was discussed. This debate was distinguished from the general and specific discussion relating to privacy, surveillance and interception of communications, as it was felt that greater expertise was required to deal adequately with such a vast and nuanced area. After further discussion with security experts, the provisions on surveillance and privacy of communications will be reincorporated, resulting in omnibus privacy legislation. To reflect this alteration in the ambit of the Bill in its current form, its title was changed to the Personal Data (Protection) Bill from the more expansive Privacy (Protection) Bill.

Chapter I – Preliminary

Section 2 of the first chapter enumerates various definitions, including ‘personal data’, which is defined as any data that can lead to identification, and ‘sensitive personal data’, a subset of personal data defined by way of a list. The main contentions arose in relation to the latter definition.

Religion and Caste

A significant modification is found in the definition of ‘sensitive personal data’, which has expanded to include two new categories, namely, (i) ethnicity, religion, race or caste, and (ii) financial and credit information. Although discussed previously, these two categories have hitherto been left out of the purview of the definition as they are fraught with issues of practicality. In the specific example of caste, the government has historically engaged in large-scale data collection for census purposes, for example as conducted by the Ministry of Rural Development and the Ministry of Social Justice and Empowerment, Government of India. Further, in the Indian scenario, various statutory benefits accrue from caste identities under the aegis of affirmative action policies. Hence, categorizing it as sensitive personal data may not be considered desirable. The problem is further exacerbated with respect to religion, as even a person’s name can be an indicator. In light of this, some issues under consideration were –

  • Whether religion and caste should be categorized as sensitive personal data or personal data?
  • Whether it is impracticable to include it in either category?
  • If included as sensitive personal data, how should it be implemented?

The majority seemed to lean towards including it under the category of sensitive personal data rather than personal data. It was argued that the categorization of some personal data as sensitive was done on the basis of higher potential for profiling or discrimination. In the same vein, caste and religious identities were sensitive information, requiring greater protection as provided under section 16 of the Bill. Regarding the difficulties posed by revealing names, it was proposed that since a name was not an indicator by default, this consideration could not be used as a rationale to eliminate religion from the definition. Instead, it was suggested that programmes sensitizing the populace to the implications of names as indicators of religion/caste should be encouraged. With regard to the issue of the census, where caste information is collected, it was opined that the same could be done anonymously as well. The maintenance of public databases including such information by various public bodies was considered problematic for privacy, as they are often easily accessible and hence have a high potential for abuse. Overall, the conclusion was that the potential for abuse of such data could be better curtailed if greater privacy requirements were mandated for both private and public organizations. The collection of this kind of data should be done on a necessity basis and kept anonymous wherever possible. However, it was acknowledged that there were greater impracticalities associated with treating religion and caste as sensitive personal data. Further, the use and disclosure of indicative names was considered to be a matter of choice. Often caste information was revealed for affirmative action schemes, for example, rank lists for admissions or appointments. In such cases, it was considered to be counter-productive to discourage the beneficiary from revealing such information.
Consequently, it was suggested that they could be regulated differently and qualified wherever required. The floor was then thrown open for discussing the other categories included under the definition of ‘sensitive personal data’.

Political Affiliation

Another contentious issue discussed at the Roundtable was the categorization of ‘political affiliation’ as ‘sensitive personal data’. A participant questioned the validity of including it in the definition, arguing that it is not an issue in India. Further, it was argued that one’s political affiliation was also subject to change and hence did not mandate higher protection as provided for sensitive personal data. Instead, if included at all, it should be categorized as ‘personal data’. This was countered by other participants who argued that revealing such information should be a matter of choice and if this choice is not protected adequately, it may lead to persecution. In light of this, changing one’s political affiliation particularly required greater protection as it may leave one more vulnerable. Everyone was in agreement that the aggregation of this class of data, particularly when conducted by public and private organizations, was highly problematic, as evidenced by its historic use for targeting dissident groups. Further, it was accepted unanimously that this protection should not extend to public figures as citizens had a right to know their political affiliation. However, although there was consensus on voting being treated as sensitive personal data, the same could not be reached for extending this protection to political affiliation.

Conviction Data

The roundtable also elicited a debate on conviction data being enumerated as sensitive personal data. The contention stemmed from the usefulness of maintaining this information as a matter of public record. Inter alia, the judicial practice of considering conviction history for repeat offenders, the need to consider this data before issuing passports and the possibility of establishing a sex offenders registry in India were cited as examples.

Financial and Credit Information

From the outset, the inclusion of financial and credit information as sensitive personal data was considered problematic, as it would clash directly with existing legislation. Specifically, the Reserve Bank of India regulates all issues revolving around this class of data. However, it was considered expedient to categorize it in this manner due to the grave mismanagement associated with it, despite existing protections. In this regard, the handling of credit information was raised as an issue. Even though it is regulated under the Credit Information Companies (Regulation) Act, 2005, its implementation was found to be wanting by some participants. In this context, the harm sought to be prevented by its inclusion in the Bill was the unregulated sharing of credit-worthiness data with foreign banks and organs of the state. Informed consent was offered as the primary qualifier. However, some participants proposed that extending a strong regime of protection to such information would not be economically viable for financial institutions. Thus, it was suggested that this category should be treated as personal data with the aim of regulating unauthorized disclosures.

Conclusion

The debate on the definition of sensitive personal data concluded with the following suggestions and remarks:

  • The categories included under sensitive personal data should be subject to contextual provisions instead of blanket protection.
  • Sensitive personal data mandates greater protection with regard to storage and disclosure than personal data.
  • While obtaining prior consent is important for both kinds of data, obtaining informed consent is paramount for sensitive personal data.
  • Both classes of data can be collected for legitimate purposes and in compliance with the protection provided by law.

Chapter II – Regulation of Personal Data

This chapter of the Bill establishes a negative statement of a positive right under Section 3 along with exemptions under Section 4, as opposed to the previous version of the Bill, discussed at the fifth Privacy Roundtable, which established a positive right. Thus, in its current form, the Bill provides a stronger regime for the regulation of personal data. The single exemption provided under this part is for personal or domestic use.

The main issues under consideration with regard to this part were –

  • The scope of the protection provided
  • Whether the exemptions should be expanded or diminished.

A participant raised a doubt regarding the subject of the right. In response, it was clarified that the Bill was subject to existing Constitutional provisions and relevant case law. According to the apex court, in Kharak Singh v. The State of U.P. (1964), the Right to Privacy arose from the Right to Life and Personal Liberty as enshrined under Article 21 of the Constitution of India. Since the Article 21 right is applicable to all persons, the Right to Privacy has to be interpreted in conjunction. Consequently, the Right to Privacy will apply to both citizens and non-citizens in India. It would also extend to information of foreigners stored by any entity registered in India and any other entity having an Indian legal personality irrespective of whether they are registered in India or not.

The next issue that arose at the Roundtable stemmed from the exemption provided under Section 4 of the Bill. A participant opined that excluding domestic use of such data was inadvisable, as such data was often used maliciously during domestic rows such as divorce. With regard to how ‘personal and domestic use’ was to be defined, it was proposed that the definition had to cater to existing cultural norms. In India, this entailed that existing community laws, which do not recognize the nuclear family as a legal entity, had to be followed. It was also acknowledged that Joint Hindu Families had to be dealt with specially and that their connection with large businesses in India would have to be carefully considered.

Another question regarding exemptions brought up at the Roundtable was whether they should be broadened to include the information of public servants and the handling of all information by intelligence agencies. Similarly, some participants proposed that exemptions or exceptions should be provided for journalists, private figures involved in cases of corruption, politicians, private detective agencies etc. It was also proposed that public disclosure of information should be handled differently than information handled in the course of business.

Conclusion

The overall conclusions of the discussion on this Chapter were –

  • All exemptions and exceptions included in this Chapter should be narrowly tailored and specifically defined.
  • Blanket exemptions should be avoided. The specificities can be left to the Judiciary to adjudicate on as and when contentions arise.

Chapter III – Protection of Personal Data

This chapter seeks to regulate the collection, storage, processing, transfer, security and disclosure of personal data.

Collection of Personal Data

Sections 5, 6 and 7 of the Bill regulate the collection of personal data. While Section 5 establishes a general bar on the collection of personal data, Sections 6 and 7 provide for deviations from the same, for collecting data with and without prior informed consent respectively.

Collection of Data with Prior Informed Consent

Section 6 establishes the obligation to obtain prior informed consent, sets out the regime for the same and, by way of two provisos, allows for withdrawal of consent, which may result in denial of certain services.

The main issues discerned from this provision involved (i) notice for obtaining consent, (ii) mediated data collection, and (iii) destruction of data.

Regarding notice, some participants observed that although it was a good practice it was not always feasible. A participant raised the issue of the frequency of obtaining consent. It was observed that features such as allowing users to stay logged in and the storage of cookies were considered benefits which would be disrupted if consent had to be obtained at every stage or each time the service was used. To solve this problem, it was unanimously accepted that consent only had to be obtained once for the entirety of the service offered, except when the contract or terms and conditions were altered by the service provider. It was also decided that the entity directly conducting the collection of data was obligated to obtain consent, even if the collection was conducted on behalf of a 3rd party.

Mediated data collection proved to be a highly contentious issue at the Roundtable. The issue was determining the scope and extent of liability in cases where a mediating party collects data for a data controller about another subject who may or may not be a user. In this regard, two scenarios were discussed – (i) uploading pictures of a 3rd party by a data subject on social media sites like Facebook and (ii) using mobile phone applications to send emails, which involves, inter alia, the sender, the phone manufacturer and the receiver. The ancillary issues recognized by participants in this regard were – (i) how would data acquired in this manner be treated if it could lead to the identification of the 3rd party?, and (ii) whether destruction of user data due to withdrawal of consent amounts to destruction of general data, i.e. of the 3rd party. The consensus was that there was no clarity on how such forms of data collection could be regulated, even though it seemed expedient to do so. The government’s inability to find a suitable solution was also brought to the table. In this regard it was suggested by some participants that the Principle of Collection Limitation, as defined in the A.P. Shah Committee Report, would provide a basic protection. Further, the extent to which such collection would be exempt as personal use was suggested as a threshold. A participant observed that it would be technically unfeasible for the service provider to regulate such collection, even if it involved illicit data such as pornographic or indecent photographs. Further, it was opined that such oversight by the service provider could be undesirable since it would result in the violation of the user’s privacy. Thus, any proposal for regulation had to balance the data subject’s rights with those of the 3rd party. In light of this, it was suggested that the mediating party should be made responsible for obtaining consent from the 3rd party.

Another aspect of this provision which garnered much debate was the proviso mandating destruction of data in case of withdrawal of consent. A participant stated the need for including broad exceptions, as destruction may not always be desirable. Regarding the definition of ‘destroy’, as provided for under Section 2, it was observed that it mandated the erasure/deletion of the data in its entirety. Instead, it was suggested that the same could be achieved by merely anonymising the information.

Collection of Data without Consent

Section 7 of the Bill outlines four scenarios which entail collection of personal data without prior consent, which are reproduced below -

“(a) necessary for the provision of an emergency medical service to the data subject;
(b) required for the establishment of the identity of the data subject and the collection is authorised by a law in this regard;
(c) necessary to prevent a reasonable threat to national security, defence or public order; or
(d) necessary to prevent, investigate or prosecute a cognisable offence”

Most participants at the Roundtable found that the list was too large in scope. The unqualified inclusion of prevention in the last two sub-clauses was found to be particularly problematic. It was suggested that Section 7 (c) was entirely redundant as its provisions could be read into Section 7 (d). Furthermore, the inclusion of ‘national security’ as a basis for collecting information without consent was rejected almost unanimously. It was suggested that if it was to be included then a qualification was desirable, allowing collection of information only when authorized by law. Some participants extended this line of reasoning to Section 7 (c) as state agencies were already authorized to collect information in this manner. It was opined that including it under the Bill would reassert their right to do so in broader terms. For similar reasons, Section 7 (b) was found objectionable as well. It was further suggested that if sub-clauses (b), (c) and (d) remained in the Bill, they should be subject to existing protections, for example those established by seminal cases such as Maneka Gandhi v. Union of India (1978) and PUCL v. Union of India (1997).

Storage and Processing of Personal Data

Section 8 of the Bill lays down a principle mandating the destruction of the information collected, following the cessation of the necessity or purpose for storage and provides exceptions to the same. It sets down a regime of informed consent, purpose specific storage and data anonymization.

The first amendment suggested for this provision was regarding the requirement of deleting the stored information ‘forthwith’. It was proposed by a participant that deleting personal data instantaneously had practical constraints and a reasonableness criterion should be added. It was also noticed that in the current form of the Bill, the exception of historical, archival and research purposes had been replaced by the more general phrase ‘for an Act of Parliament’. The previous definition was altered as the terms being used were hard to define. In response, a participant suggested a broader phrase which would include any legal requirement. Another participant argued that a broader phrase would need to be more specifically defined to avoid dilution.

Section 9 of the Bill sets out two limitations for processing data in terms of (i) the kind of personal data being processed and (ii) the purpose for the same. The third sub clause enumerates exceptions to the abovementioned principles in language similar to that found in Section 7.

With regard to the purpose limitation clause it was suggested by many participants that the same should be broadened to include multiple purposes as purpose swapping is widespread in existing practice and would be unfeasible and undesirable to curtail. Sub clause 3 of this Section was critiqued for the same reasons as Section 7.

Section 10 restricts cross-border transfer of data. It was clarified that different departments of the same company or the same holding company would be treated as different entities for the purpose of identifying the data processor. However, a concern was raised regarding the possibility of increased bureaucratic hurdles on global transfer of data in case this section is read too strictly. At the same time, it was felt that certain restrictions on the data controller and the location of transfer were needed to provide adequate protection of the data subject’s rights.

The regime for disclosure of personal data without prior consent is provided for by Section 14. The provision did not specify the rank of the police officer in charge of passing orders for such disclosure. It was observed that a suitable rank had to be identified to ensure adequate protection. Further, it was suggested that the provision be broadened to include other competent agencies as well. This could be included by way of a schedule or subsequent notifications.

Conclusion

  • Mediated collection of data should be qualified on the basis of purpose and intent of collection.
  • The issue of cost to company (C2C) was not given adequate consideration in the Bill.
  • Procedures need to be laid down at all stages of handling personal data.
  • Special exemptions need to be provided for journalistic sources.

Meeting Conclusion

The Sixth Privacy Roundtable was the second to last of the stakeholder consultations conducted for the Citizens’ Personal Data (Protection) Bill, 2013. Various changes made to the Bill from its last form were scrutinized closely and suitable suggestions were provided. Further changes were recommended for various aspects of it, including definitions, qualifications and procedures, liability and the chapter on offences and penalties. The Bill will be amended to reflect multi-stakeholder suggestions and cater to various interests.

6th Privacy Roundtable

by Prachi Arya last modified Aug 30, 2013 08:15 AM

CIS Cybersecurity Series (Part 10) - Lawrence Liang

by Purba Sarkar last modified Sep 10, 2013 08:31 AM
CIS interviews Lawrence Liang, researcher and lawyer, and co-founder of Alternative Law Forum, Bangalore, as part of the Cybersecurity Series.

"The right to privacy and the right to free speech have often been understood as distinct rights. But I think in the ecology of online communication, it becomes crucial for us to look at the two as being inseparable. And this is not entirely new in India. But, interestingly, a lot of the cases that have had to deal with this question in the Indian context, have pitted one against the other. Now, India doesn't have a law for the protection of whistle-blowers. So how do we now think of the idea of whistle-blowers being one of the subjects of speech and privacy coming together? How do we use the strong pillars that have been established, in terms of a very rich tradition that Indian law has, on the recognition of free speech issues but slowly start incorporating questions of privacy?" - Lawrence Liang, researcher and lawyer, Alternative Law Forum. 

Centre for Internet and Society presents its tenth installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

Lawrence Liang is one of the co-founders of the Alternative Law Forum where he works on issues of intellectual property, censorship, and the intersection of law and culture. He is also a fellow with the Centre for Internet and Society and serves on its board.  

 
This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

Out of the Bedroom

by Nishant Shah last modified Sep 06, 2013 08:32 AM
We have shared it with our friends. We have watched it with our lovers. We have discussed it with our children and talked about it with our partners. It is in our bedrooms, hidden in sock drawers. It is in our laptops, in a folder marked "Miscellaneous". It is in our cellphones and tablets, protected under passwords. It is the biggest reason why people have learned to clean their browsing history and cookies from their browsers.

The article by Nishant Shah was published in the Indian Express on August 25, 2013.


Whether we go into surreptitious shops to buy unmarked CDs or trawl through Torrent and user-generated content sites in the quest of a video, there is no denying the fact that it has become a part of our multimedia life. Even in countries like India, where consumption and distribution of pornography are punished by law, we know that pornography is rampant. With the rise of the digital technologies of easy copy and sharing, and the internet which facilitates amateur production and anonymous distribution, pornography has escaped the industrial market and become one of the most intimate and commonplace practices of the online world.

In fact, if Google trend results are to be believed, Indians are among the top 10 nationalities searching for pornography daily. Even a quick look at our internet history tells us that it has all been about porn. The morphed pictures of a naked Pooja Bhatt adorned the covers of Stardust in the late 1990s, warning us that the true potential of Photoshop had been realised. The extraordinary sensation of the Delhi Public School MMS case which captured two underage youngsters in a grainy sexcapade announced the arrival of user-generated porn in a big way. The demise of Savita Bhabhi — India's first pornographic graphic novel — is still recent enough for us to remember that the history of the internet in India is book-ended by porn and censorship.

Recent discussions on pornography have been catalysed by a public interest litigation requesting for a ban on internet pornography filed in April by Kamlesh Vaswani. Whether Vaswani's observations on what porn can make us do stem from his own personal epiphany or his self-appointed role as our moral compass is a discussion that merits its own special space. Similarly, a debate on the role, function, and use of pornography in a society is complex, rich and not for today.

Instead, I want to focus on the pre-Web imagination of porn that Vaswani and his endorsers are trying to impose upon the rest of us. There is a common misunderstanding that all porn is the same porn, no matter what the format, medium and aesthetics of representations. Or in other words, a homogenising presumption is that erotic fiction and fantasies, pictures of naked people in a magazine, adult films produced by entertainment houses, and user-generated videos on the internet are the same kind of porn. However, as historical legal debates and public discussions have shown us, what constitutes porn is specific to the technologies that produce it. There was a time when DH Lawrence's iconic novel now taught in undergraduate university courses — Lady Chatterley's Lover — was deemed pornographic and banned in India. In more recent times, the nation was in uproar at the Choli ke peeche song from Khalnayak which eventually won awards for its lyrics and choreography.

In all the controversy, there has so far been a "broadcast imagination" of how pornography gets produced, consumed and distributed. There is a very distinct separation of us versus them when it comes to pornography. They produce porn. They distribute porn. They push porn down our throats (that was probably a poor choice of words) by spamming us and buying Google adwords to infect our search results. We consume porn. And all we need to do is go and regulate, like we do with Bollywood, the central management and distribution mechanism so that the flow of pornography can be curbed. This is what I call a broadcast way of thinking, where the roles of the performers, producers, consumers and distributors of pornography are all distinct and can be regulated.

However, within the murky spaces of the World Wide Web, the scenario is quite different. Internet pornography is not the same as availability of pornography on the internet. True, the digital multimedia space of sharing and peer-2-peer distribution has made the internet the largest gateway to accessing pornographic objects which are produced through commercial production houses. However, the internet is not merely a way of getting access to existing older forms of porn. The internet also produces pornography that is new, strange, unprecedented and is an essential part of the everyday experience of being digitally connected and networked into sociality.

The recent controversies about the former congressman from New York, Anthony Weiner, sexting — sending inappropriate sexual messages through his cellphone — gives us some idea of what internet porn looks like. It is not just something captured on a phone-cam but interactive and collaboratively produced. Or as our own Porngate, where two cabinet ministers of the Karnataka legislative assembly were caught surfing some good old porn on their mobile devices while the legislature was in session, indicated, porn is not something confined to the privacy of our rooms. Naked flashmobs, young people experimenting with sexual identities in public, and sometimes bizarre videos of a bus-ride where the camera merely captures the banal and the everyday through a "pornographic gaze" are also a part of the digital porn landscape. The world of virtual reality and multiple online role-playing games offer simulated sexual experiences that allow for human, humanoid, and non-human avatars to engage in sexual activities in digital spaces. Peer-2-peer video chat platforms like Chatroulette, offer random encounters of the naked kind, where nothing is recorded but almost everything can be seen.

The list of pornography produced by the internet — as opposed to pornography made accessible through the internet — is huge. It doesn't just hide in subcultural practices but resides on popular video-sharing sites like YouTube or Tumblr blogs. It vibrates in our cellphones as we connect to people far away from us, and pulsates on the glowing screens of our tablets as we get glimpses of random strangers and their intimate bodies and moments. An attempt to ban and censor this porn is going to be futile because it does not necessarily take the shape of a full narrative text which can be examined by others to judge its moral content. Any petition that tries to censor such activities is going to fall flat on its face because it fails to recognise that sexual expression, engagement and experimentation is a part of being human — and the ubiquitous presence of digital technologies in our life is going to make the internet a fair playground for activities which might seem pornographic in nature. In fact, trying to restrict and censor them, will only make our task of identifying harmful pornography — porn that involves minors, or hate speech or extreme acts of violence — so much more difficult because it will be pushed into the underbelly of the internet which is much larger than the searched and indexed World Wide Web.

Trying to suggest that internet pornography is an appendage which can be surgically removed from everyday cyberspace is to not understand the integral part that pornography and sexual interactions play in the development and the unfolding of the internet. The more fruitful efforts would be to try and perhaps create a guideline that helps promote healthy sexual interaction and alerts us to undesirable sexual expressions which reinforce misogyny, violence, hate speech and non-consensual invasions of bodies and privacy. This blanket ban on trying to sweep all internet porn under a carpet is not going to work — it will just show up as a big bump, in places we had not foreseen.

An Interview with Suresh Ramasubramanian

by Elonnai Hickok last modified Sep 06, 2013 09:37 AM
Suresh Ramasubramanian is the ICS Quality Representative - IBM SmartCloud at IBM. We at the Centre for Internet and Society interviewed him on cybersecurity and issues in the cloud.
  1. You have done a lot of work around cybersecurity and issues in the Cloud. Could you please tell us of your experience in these areas and the challenges facing them?
    a. I have been involved in antispam activism since the late 1990s and have worked in ISP / messaging provider antispam teams since 2001. Since 2005, I expanded my focus to include general cyber security and privacy, having written white papers on spam and botnets for the OECD, ITU and UNDP/APDIP. More recently, I have become a M3AAWG special advisor for capacity building and outreach in India.

    In fact, capacity building and outreach have been the focus of my career for a long time now. I have been putting relevant stakeholders from ISPs, government and civil society in India in touch with their counterparts around the world, and, at a small level, enabling an international exchange of ideas and information around antispam and security.

    This was a challenge over a decade back when I was a newbie to antispam and it still is. People in India and other emerging economies, with some notable exceptions, are not part of the international communities that have grown in the area of cyber security and privacy.

    There is a prevalent lack of knowledge in this area, combined with gaps in local law and its enforcement. Online criminals tend to target emerging and fast-growing economies as a rich source of potential victims for various forms of online crime, and sometimes as a safe haven against prosecution.
  2. In a recent public statement Google said "Cloud users have no legitimate expectation of privacy." Do you agree with this statement?
    a. Let us put it this way. All email received by a cloud or other Internet service provider for its customers is automatically processed and data mined in one form or the other. At one level, this can be done for spam filtering and other security measures that are essential to maintain the security and stability of the service, and to protect users from being targeted by spam, malware and potential account compromises.

    The actual intent of automated data mining and processing should be transparently disclosed to customers of a service in a clearly defined privacy policy; the deployment of such processing, and the “end use” to which the mined data is put, are key to agreeing or disagreeing with such a statement.

    It goes without saying that such processing must stay within the letter, scope and spirit of a company’s privacy policy, and must actually be structured to be respectful of user privacy.

    Especially where mined data is used to provide user advertising or for any other commercial purpose (such as being aggregated and resold), strict adherence to a well written privacy policy and periodic review of this policy and its implementation to examine its compliance to laws in all countries that the company operates in are essential.

    There is way too much noise in the media for me to usefully add any more to this issue and so I will restrict myself to the purely general comments above.
  3. In what ways can the privacy of an individual be compromised on the cloud? What can be done to prevent such instances of compromise?
    a. All the recent headlines about companies mining their own users’ data, and yet more headlines about different countries deploying nationwide or even international lawful intercept and wiretap programs, aside, the single largest threat to individual privacy on the cloud is, and has been for years before the word “cloud” came into general use, the constant targeting of online users by online criminals with a variety of threats including scams, phish campaigns and data / account credential stealing malware.

    Poor device security is another threat – one that becomes even more of a serious problem when the long talked about “internet of things” seems set to become reality, with cars, baby monitors, even Bluetooth enabled toilets, and more dangerously, critical national infrastructure such as power plants and water utilities becoming accessible over the Internet but still running software that is basically insecure and architected with assumptions that date back to an era when there was no conception or need to connect these to the Internet.

    Someone in Bluetooth range with the appropriate Android application being able to automatically flush your toilet and even download a list of the dates and times when you last used it is personally embarrassing. Having your bank account broken into because your computer got infected with a virus is even more damaging. Someone able to access a dam’s control panel over the internet and remotely trigger the dam’s gates to open can cause far more catastrophic damage.

    The line between security and privacy, between normal business practice and unacceptable, even illegal behaviour, is sometimes quite thin and in a grey area that may be leveraged to the hilt for commercial and/or national security interests. However, scams, malware, exploits of insecure systems and similar threats are well on the wrong side of the “criminal” spectrum, and are a clear and present danger that cause far more than an embarrassing or personally damaging loss of privacy.
  4. How is the jurisdiction of the data on the cloud determined?
    a. This is a surprisingly thorny question. Normally, a company is based in a particular country and has an end user agreement / terms of service that makes its customers / users accept that country’s jurisdiction.

    However, a cloud based provider that does business around the world may, in practice, have to comply, to some extent at least, with that country’s local laws – at any rate, in respect of its users who are citizens of that country. And any cloud product sold to a local business or individual by a salesman from the vendor’s branch in the country would possibly fall under a contract executed in the country and therefore be subject to local law.

    The level of compliance for data retention and disclosure in response to legal processes will possibly vary from country to country – ranging from flat refusals to cooperate (especially where any law enforcement request for data are for something that is quite legal in the country the cloud provider is based in) to actual compliance.

    In practice this may also depend on what is at stake for the cloud vendor in complying or refusing to comply with local laws – regardless of what the terms of use policies or contract assert about jurisdiction: the number of users the cloud vendor has in the country, the extent of its local presence, and how vulnerable its resident employees and executives are to legal sanctions or punishment.

    In the past, it has been observed that a practical balance [which may be based on business economics as much as it is based on a privacy assessment] may be struck by certain cloud vendors with a global presence, based on the critical mass of users it stands to gain or lose by complying with local law, and the risks it faces if it complies, or conversely, does not comply with local laws – so the decision may be to fight lawsuits or prosecutions on charges of breaking local data privacy laws or not complying with local law enforcement requests for handover of user data in court, or worst case, pulling out of the country altogether.
  5. Currently, big cloud owners are US corps, yet US courts do not extend the same privacy rights to non US citizens. Is it possible for countries to use the cloud and still protect citizen data from being accessed by foreign governments? Do you think a "National Cloud" is a practical solution?
    a. The “cloud” in this context is just “the internet”, and keeping local data local and within local jurisdiction is possible in theory at any rate. Peering can be used to keep local traffic local instead of having it do a roundtrip through a foreign country and back [where it might or might not be subject to another country’s intercept activities, no comment on that].

    A national cloud demands local infrastructure including bandwidth, datacenters etc. that meet the international standards of most global cloud providers. It then requires cloud based sites that provide an equivalent level of service, functionality and quality to that provided by an international cloud vendor. And then after that, it has to have usable privacy policies, and the country needs to have a privacy law, a sizeable amount of practical regulation to bolster the law, and a well-defined path for reporting and redress of data breaches. There are a whole lot of other technical and process issues before having a national cloud becomes a reality, and even more before such a reality makes a palpable positive difference to user privacy.
  6. What audit mechanisms of security and standards exist for Cloud Service Providers and Cloud Data Providers?
    a. Plenty – some specific to the country and the industry sector / kind of data the cloud handles. The Cloud Security Alliance has been working for quite a while on CloudAudit, a framework developed as part of a cross industry effort to unify and automate Assertion, Assessment and Assurance of their infrastructure and service.

    Different standards bodies and government agencies have all come out with their own sets of standards and best practices in this area (this article has a reasonable list - http://www.esecurityplanet.com/network-security/cloud-security-standards-what-you-should-know.html). Some standards you absolutely have to comply with for legal reasons.

    Compliance reasons aside, what works is a judicious mix of standards, and considerable adaptation of your processes to make those standards work for you and play well together.

    The standards all exist – what varies considerably, and is a major cause of data privacy breaches, are incomplete or ham-handed implementations of existing standards, any attempt at “checkbox compliance” to simply implement a set of steps that lead to a required certification, and a lack of continuing initiative to keep the data privacy and security momentum going once these standards have been “achieved”, till it is time for the next audit at any rate.
  7. What do you see as the big challenges for privacy in the cloud in the coming years?
    a. Not very much more than the exact same challenges for privacy in the cloud over the past decade or more. The only difference is that every threat that existed before has been amplified, because the complexity of systems and the level of technology and computing power available to implement security – and to attempt to breach it – is exponentially higher than ever before, and set to increase as we go further down the line.
  8. Do you think encryption is the answer to snooping by private and public institutions?
    a. Encryption of data at rest and in transit is a key recommendation of any data privacy standard and cloud / enterprise security policy. Companies and users are strongly encouraged to deploy and use strong cryptography for personal protection. But to call it “the answer” is sort of like the tale of the blind men and the elephant.

    There are multiple ways to circumvent encryption – social engineering to trick people into revealing data (which can be mitigated to some extent, or detected if it is tried on a large cross section of your userbase – it is something that security teams do have to watch for), or just plain coercion, which is much tougher to defend against.

    As a very popular XKCD cartoon that has been shared around social media and has been cited in multiple security papers says -

    “A crypto nerd’s imagination”

    “His laptop’s encrypted. Let us build a million dollar cluster to crack it”
    “No good! It is 4096 bit RSA”
    “Blast, our evil plan is foiled”

    “What would actually happen”
    “His laptop’s encrypted. Drug him and hit him with this $5 wrench till he tells us the password”
    “Got it”
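    None of this is an argument against doing cryptography well. As a purely illustrative sketch (not from the interview – the passphrase, salt and iteration count are all made-up demo values), deriving a strong encryption key from a passphrase is one small building block of the strong cryptography recommended above:

```python
import hashlib

# Hypothetical illustration: derive a 256-bit key from a passphrase with
# PBKDF2 (Python standard library). The salt and iteration count below are
# demo values only; real deployments generate a random salt per secret.
salt = b"demo-salt-store-me-with-the-data"   # in practice: os.urandom(16)
key = hashlib.pbkdf2_hmac(
    "sha256",
    b"correct horse battery staple",  # the passphrase (another XKCD nod)
    salt,
    200_000,  # iteration count: slows down brute-force guessing
)
assert len(key) == 32  # 32 bytes = 256 bits, e.g. usable as an AES-256 key
```

    A key derived this way is still only as strong as the passphrase behind it – which is exactly the point of the wrench cartoon.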
  9. Spam is now consistently used to get people to divulge their personal data, to compromise a person's financial information, and to perpetrate illegal activity. Can spam be regulated? If so, how?
    a. Spam has been regulated in several countries around the world. The USA has had laws against spam since 2003, as has Australia. Several other countries have laws that specifically target spam, or use other statutes on their books to deal with crime (fraud, the sale of counterfeit goods, theft and so on) that happens to be carried out through the medium of spam.

    The problems here are the usual problems that plague international enforcement of any law at all. Spammers (and worse online criminals, including those that actively employ malware) tend to pick jurisdictions to operate in where there are no existing laws covering their activities, and generally take the precaution of not targeting residents of the country they live in. Others send spam but attempt – in several cases successfully – to exploit loopholes in their country's antispam laws.

    Still others fully exploit the anonymity that the Internet provides – privately registered domain names, anonymizing proxy servers (when they are not using botnets of compromised machines), strings of shell companies and complex international routing of revenue from their spam campaigns – to quickly move money offshore to a more permissive jurisdiction.

    Their other advantage is that law enforcement and regulatory bodies are generally short-staffed and heavily tasked, so even a spammer who operates in the open may continue his activities for a very long time before anyone manages to prosecute him.

    Some antispam laws allow recipients of spam to sue the spammer in small claims courts – which, like regulatory action, has previously led to judgements against spammers, who may be fined or even imprisoned where their spam has criminal aspects attracting local computer crime laws, rather than being mere violations of civil antispam laws.
  10. There has been a lot of talk about the use of malware like FinFisher and its ability to compromise national security and individual security. Do you think regulation is needed for this type of malware – and if so, what type? Export controls? Privacy regulation? Use controls?
    a. Malware used by nation states as a part of their surveillance activities is a problem. It is further a problem if such malware is used by nation states that are not even nominally democratic and that have long standing records of human rights violations.

    Regulating or embargoing their sale is not going to help in such cases. One problem is that export controls on such software are not particularly easy to enforce: countries on software export blacklists routinely find newer and more creative ways to get around them and purchase embargoed software and computing equipment of all kinds.

    Another problem is that such software is not produced only by legitimate vendors of lawful intercept gear. Criminals who write malware capable of, say, stealing personal data such as bank account credentials are perfectly capable of writing such software, and there is a thriving underground economy in the sale of malware – and of the “take” from malware, such as personal data, credit cards and bank accounts – where any rogue nation state can easily acquire products with equivalent functionality.

    This is going to apply even if legitimate vendors of such products are subject to strict regulations governing their sale and national laws exist regulating the use of such products. So while there is no reason not to regulate / provide judicial and regulatory oversight of their sale and intended use, it should not be seen as any kind of a solution to this problem.

    User education in privacy and access to secure computing resources is probably going to be the bedrock of any initiative that looks to protect user privacy – a final backstop to any technical / legal or other measure that is taken to protect them.

Privacy Law Must Fit the Bill

by Sunil Abraham last modified Sep 12, 2013 06:25 AM
The process of updating Indian privacy policy has gained momentum since the launch of the UID project and the leak of the Radia tapes. The Department of Personnel and Training has led the drafting of the privacy bill for the last three years. This bill will ideally articulate privacy principles, establish the office of a privacy commissioner and, most importantly, have an overriding effect over the 50-odd existing laws, rules and policies with privacy implications.


The article was published in the Deccan Chronicle on September 9, 2013.


Given the harmonizing impact of the proposed privacy bill, we must ensure that rigorous debate and discussion happens before the bill is finalized; otherwise there may be terrible consequences.

Here is a short list of what can possibly go wrong:

One, the privacy bill ignores the massive power asymmetry in Indian society, undermining the right to information – referred to in other jurisdictions as freedom of information or access to information. The power asymmetry is addressed via a public interest test: the right to privacy would be the same for everyone except when public interest is at stake. This allows protection of the right to privacy to be inversely proportionate to power and, conversely, the requirement of transparency to be directly proportionate to power. In other words, the poor would have greater privacy than middle-class citizens, who in turn would have greater privacy than political and economic elites; transparency requirements would be greatest for economic and political elites, lower for middle-class citizens and lowest for the poor. If this is not properly addressed in the language of the bill, privacy activists will have undone the significant accomplishments of the right to information or transparency movement in India over the last decade.

Two, the privacy bill could have a chilling effect on free speech. This can happen either by denying the speaker privacy, or by affording those who are spoken about too much privacy. For the speaker: Know Your Customer (KYC) and data retention requirements for the telecom and internet infrastructure necessary to participate in the networked public sphere can result in the death of anonymous and pseudonymous speech. Anonymous and pseudonymous speech must be protected, as it is necessary for good governance, free media, a robust civil society, and vibrant art and culture in a democracy. For those spoken about: privacy is clearly required in certain cases to protect the victims of certain categories of crimes. However, the right to privacy could be abused by those occupying public office and those in public life to censor speech that is in the public interest. If, for example, a sportsperson does not publicly drink the aerated drink that he or she endorses in advertisements, then the public has a right to know.

Three, the privacy bill has a limited scope. Jurisprudence in India derives the right to privacy from the right to life and liberty through several key judgments, including Naz Foundation v. Govt. of NCT of Delhi, decided by the Delhi High Court. The right to life and liberty under Article 21, unlike other constitutionally guaranteed fundamental rights, does not distinguish between citizens and non-citizens. As a consequence, the privacy bill must also protect residents, visitors and other persons who may never visit India but whose personal information may travel to India as part of the global outsourcing phenomenon. Also, the obligations and safeguards under the privacy bill must apply equally to the state and to private sector entities that could potentially infringe upon the individual's right to privacy. Different levels of protection may be afforded to citizens, residents, visitors and everybody else. Government and private sector data controllers may be subject to different regulations – for example, an intelligence agency may not require the 'consent' of the data subject to collect personal information and may only provide 'notice' after the investigation has cleared the suspect of all charges.

Four, the privacy bill is expected to fix poorly designed technology. There are two diametrically opposite descriptions of projects like NATGRID, CMS and UID. The government's position is that all these systems will allow only for targeted interception and surveillance; the majority of civil society believes that these systems will be used for blanket surveillance. If these systems are indeed built in a manner that supports blanket surveillance, then a legal band-aid in the form of a new law or provision prohibiting blanket surveillance will be a complete failure. The principle of 'privacy by design' is the only way to address this. For example, the shutters of digital cameras are silent, and this allows for a particular form of voyeurism known as 'upskirting'. Almost a decade ago, the Korean government enacted a law requiring camera and mobile phone manufacturers to ensure that an audio recording of a mechanical shutter is played every time the camera function is used; it is also illegal for the user to circumvent or disable this feature. In this example, the principle of notice is hardwired into the technology itself. To remix Spiderman's motto: with great power comes great temptation. We know that a rogue NTRO official installed a spy camera in an office toilet to make recordings of female colleagues, and most recently that NSA officers confessed to spying on their love interests. If the technology can be abused, it will be abused. Therefore legal safeguards are a poor substitute for technological safeguards; we need both simultaneously.

Five, the bill does not require compliance with internationally accepted privacy principles, including the ones discussed so far: 'consent', 'notice' and 'privacy by design'. Apart from human rights considerations, the most important imperative for modernizing India's privacy laws is trade. We have a vibrant ITES, BPO and KPO sector which handles the personal information of foreigners, mostly from the North American and European continents. The Justice A.P. Shah committee in October 2012 identified the privacy principles required for India: notice, choice and consent, collection limitation, purpose limitation, access and correction, disclosure of information, security, openness and accountability. A privacy bill that does not include all these principles will increase the regulatory compliance overhead for Indian enterprises with foreign clients and for multinationals operating in India. There is also the risk that privacy regulators in these jurisdictions will ban outsourcing to Indian firms because our privacy laws are not adequate by their standards.

To conclude, it is not sufficient for India to enact a privacy law; it is essential that we get it right, so that there are no unintended consequences for other equally important rights and dimensions of our democracy.

Transparency Reports — A Glance on What Google and Facebook Tell about Government Data Requests

by Prachi Arya last modified Sep 13, 2013 09:44 AM
Transparency Reports are a step towards greater accountability but how efficacious are they really?

Prachi Arya examines the transparency reports released by tech giants with a special focus on user data requests made to Google and Facebook by Indian law enforcement agencies.

The research was conducted as part of the 'SAFEGUARDS' project that CIS is doing with Privacy International and IDRC.


According to a recent comScore report, India now has the third largest population of internet users, with nearly 74 million citizens online, falling just behind China and the United States. The report also reveals that Google is the preferred search engine for Indians and that Facebook is the most popular social media website, followed by LinkedIn and Twitter. While users posting their photos on Facebook can limit viewership through privacy settings, there isn't much they can do against a government seeking information on their profiles. All that can be said for sure in the post-Snowden world is that large-scale surveillance is a reality, and governments want it trained on their citizens' online existence. In this Orwellian scenario, transparency reports provide a trickle of information on how much our government finds out about us.

The first transparency report was released by Google three years ago to provide an insight into ‘the scale and scope of government requests for censorship and data around the globe’. Since then the issuance of such reports has increasingly become a standard practice for tech giants. An Electronic Frontier Foundation report reveals that major companies that have followed Google’s lead include Dropbox, LinkedIn, Microsoft and Twitter, with Facebook and Yahoo! being the latest additions. Requests to Twitter and Microsoft from Indian law enforcement agencies were significantly fewer than requests to Facebook and Google. Twitter revealed that Indian law enforcement agencies made fewer than 10 requests, none of which resulted in the sharing of user information. Of the 418 requests made to Microsoft by India (excluding Skype), 88.5 per cent were complied with for non-content user data. The Yahoo! Transparency Report revealed that six countries surpassed India in terms of the number of user data requests. Indian agencies requested user data 1,490 times covering 2,704 accounts, for both content and non-content data, and over 50 per cent of these requests were complied with.

The following is a compilation of what the latest transparency reports issued by Facebook and Google reveal.

"The information we share on the Transparency Report is just a sliver of what happens on the internet"
Susan Infantino, Legal Director for Google

Since December 2009, Google has published biannual transparency reports:

  • It discloses traffic data of Google services globally and statistics on removal requests received from copyright owners or governments as well as user data requests received from government agencies and courts. It also lays down the legal process required to be followed by government agencies seeking data.
  • There was a 90 per cent increase in the number of content removal requests received by Google from India. The requests complied with included:
    • Restricting videos containing clips from the controversial movie “Innocence of Muslims” from view.
    • Many YouTube videos and comments as well as some Blogger blog posts being restricted from local view for disrupting public order in relation to instability in North East India.
  • For User Data requests, the Google report details the number of user data requests and users/accounts as well as percentage of requests which were partially or completely complied with. In India the user data requests more than doubled from 1,061 in the July-December 2009 period to 2,431 in the July-December 2012 period. The compliance rate decreased from 79 per cent in the July-December 2010 period to 66 per cent in the last report.
  • Jurisdictions outside the United States can seek disclosure using Mutual Legal Assistance Treaties or any ‘other diplomatic and cooperative arrangement’. Google also provides information on a voluntary basis if requested following a valid legal process if the requests are in consonance with international norms, U.S. and the requesting countries' laws and Google’s policies.

Facebook

    "We hope this report will be useful to our users in the ongoing debate about the proper standards for government requests for user information in official investigations."
    Colin Stretch, Facebook General Counsel

Facebook inaugurated its first ever transparency report last Tuesday with a promise to continue releasing these reports.

  • The ‘Global Government Requests Report’ provides information on the number of requests received by the social media giant for user/account information by country and the percentage of requests it complied with. It also includes operational guidelines for law enforcement authorities.
  • The report covers the first six months of 2013, specifically till June 30. In this period India made 3,245 requests from 4,144 users/accounts and half of these requests were complied with.
  • Jurisdictions outside the United States can seek disclosure by way of mutual legal assistance treaties requests or letter rogatory. Legal requests can be in the form of search warrants, court orders or subpoena. The requests are usually made in furtherance of criminal investigations but no details about the nature of such investigations are provided.
  • Broad or vague requests are not processed. The requests are expected to include details of the law enforcement authority issuing the request and the identity of the user whose details are sought.
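For a rough sense of scale, the headline figures quoted in this post can be turned into approximate disclosure counts – total requests from India multiplied by the reported compliance rate. A minimal sketch (the totals and rates are as cited above; the results are estimates, not figures from the reports themselves):

```python
# Figures as cited in this post: (user data requests from India, compliance rate).
reports = {
    "Google, Jul-Dec 2012": (2431, 0.66),    # 66 per cent complied with
    "Facebook, Jan-Jun 2013": (3245, 0.50),  # "half of these requests"
}

# Approximate number of requests that led to some disclosure.
estimates = {name: int(total * rate) for name, (total, rate) in reports.items()}
print(estimates)  # roughly 1,604 for Google and 1,622 for Facebook
```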

The Indian Regime

Sections 69 and 69B of the Information Technology (Amendment) Act, 2008 prescribe the procedure and set safeguards for the Indian Government to request user data from corporates. According to section 69, authorized officers can issue directions to intercept, monitor or decrypt information for the following reasons:

  1. Sovereignty or integrity of India,
  2. Defence of India,
  3. Security of the state,
  4. Friendly relations with foreign states,
  5. Maintenance of public order,
  6. Preventing incitement to the commission of any cognizable offence relating to the above, or
  7. For investigation of any offence.

Section 69B empowers authorized agencies to monitor and collect information for cyber security purposes, including ‘for identification, analysis and prevention of intrusion and spread of computer contaminants’. Additionally, there are rules under sections 69 and 69B that regulate interception under these provisions.

Information can also be requested through the Controller of Certifying Authorities under section 28 of the IT Act, which circumvents the stipulated procedure. If the request is not complied with, the intermediary may be penalized under section 44.

The Indian Government has been increasingly leaning towards greater control over online communications. In 2011, Yahoo! was slapped with a penalty of Rs. 11 lakh for not complying with a section 28 request seeking the email information of a person on grounds of national security, although the court subsequently stayed the Controller of Certifying Authorities' order. In the same year the government called for pre-screening of user content by internet companies and social media sites, to ensure deletion of ‘objectionable content’ before it was published. Similarly, the government has increasingly sought greater online censorship, using the Information Technology Act to arrest citizens for social media posts, comments and even emails criticizing the government.

What does this mean for Privacy?

The Google Transparency Report has thrown light on an increasing trend of governmental data requests on a yearly basis. The reports published by Google and Facebook reveal that the number of government requests from India is second only to the United States. Further, more than 50 per cent of the requests from India have led to disclosure by nearly all the companies surveyed in this post, with Twitter being the single exception.

Undeniably, transparency reports are important accountability mechanisms which reaffirm a company's dedication to protecting its users' privacy. However, basic statistics and vague information cannot lift the veil on the full scope of surveillance. Even though Google's report has steadily moved towards more nuanced disclosure, it would only be meaningful if, inter alia, it included a break-up of the purposes behind the requests. Similarly, although Google has also included a general description of the legal process, more specifics need to be disclosed. For example, the report could provide statistics on notifications, to indicate how often users under scrutiny are not notified. Such disclosures are important to enhance user understanding of when their data may be accessed and for what purposes, particularly without prior or retrospective intimation of the same. Until the report can provide comprehensive details about the kind of surveillance websites and internet services are subjected to, it will be of very limited use. Its greatest limitation, however, may lie beyond its scope.

The monitoring regime envisioned under the Information Technology Act is overly broad and may easily lead to abuse of power. Further, the Indian Government has become infamous for its drive to control websites and social media sites. Now, with the Indian Government's plan to establish the Central Monitoring System, the need for intermediaries to conduct the interception may be done away with, giving the government unfettered access to user data and potentially rendering corporate transparency about data requests obsolete.

Privacy Meeting Brussels - Bangalore Slides

by Prasad Krishna last modified Sep 12, 2013 07:55 AM

PDF document icon presentation_vub_lsts_v3.pdf — PDF document, 1269 kB (1300025 bytes)

Privacy and Surveillance Talk by Sunil Abraham

by Prasad Krishna last modified Sep 13, 2013 09:47 AM

PDF document icon lecture_ccmg_2013september18.pdf — PDF document, 212 kB (217342 bytes)

The National Privacy Roundtable Meetings

by Bhairav Acharya last modified Mar 21, 2014 10:03 AM
The Centre for Internet & Society ("CIS"), the Federation of Indian Chambers of Commerce and Industry ("FICCI"), the Data Security Council of India ("DSCI") and Privacy International are, in partnership, conducting a series of national privacy roundtable meetings across India from April to October 2013. The roundtable meetings are designed to discuss possible frameworks for privacy in India.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC.


Background: The Roundtable Meetings and Organisers

CIS is a Bangalore-based non-profit think-tank and research organisation with interests in, amongst other fields, the law, policy and practice of free speech and privacy in India. FICCI is a non-governmental, non-profit association of approximately 250,000 Indian bodies corporate. It is the oldest and largest organisation of businesses in India and represents a national corporate consensus on policy issues. DSCI is an initiative of the National Association of Software and Service Companies, a non-profit trade association of Indian information technology ("IT") and business process outsourcing ("BPO") concerns, which promotes data protection in India. Privacy International is a London-based non-profit organisation that defends and promotes the right to privacy across the world.

Privacy in the Common Law and in India

Because privacy is a multi-faceted concept, it has rarely been singly regulated. A taxonomy of privacy yields many types of individual and social activity to be differently regulated based on the degree of harm that may be caused by intrusions into these activities.[1]

The nature of the activity is significant; activities that are implicated by the state are attended by public law concerns and those conducted by private persons inter se demand market-based regulation. Hence, because the principles underlying warranted police surveillance differ from those prompting consensual collections of personal data for commercial purposes, legal governance of these different fields must proceed differently. For this and other reasons, the legal conception of privacy — as opposed to its cultural construction – has historically been diverse and disparate.

Traditionally, specific legislations have dealt separately with individual aspects of privacy in tort law, constitutional law, criminal procedure and commercial data protection, amongst other fields. The common law does not admit an enforceable right to privacy.[2] In the absence of a specific tort of privacy, various equitable remedies, administrative laws and lesser torts have been relied upon to protect the privacy of claimants.[3]

The question of whether privacy is a constitutional right has been the subject of limited judicial debate in India. The early cases of Kharak Singh (1964)[4] and Gobind (1975)[5] considered privacy in terms of physical surveillance by the police in and around the homes of suspects and, in the latter case, the Supreme Court of India found that some of the Fundamental Rights “could be described as contributing to the right to privacy”, which was nevertheless subject to a compelling public interest. This inference held the field until 1994 when, in the Rajagopal case (1994),[6] the Supreme Court, for the first time, directly located privacy within the ambit of the right to personal liberty guaranteed by Article 21 of the Constitution of India. However, Rajagopal dealt specifically with a book; it did not consider the privacy of communications. In 1997, the Supreme Court considered the question of wiretaps in the PUCL case (1996)[7] and, while finding that wiretaps invaded the privacy of communications, it continued to permit them subject to some procedural safeguards.[8] A more robust statement of the right to privacy was made more recently by the Delhi High Court in the Naz Foundation case (2009),[9] which de-criminalised consensual homosexual acts; however, this judgment is now in appeal.

Attempts to Create a Statutory Regime

The silence of the common law leaves the field of privacy in India open to occupation by statute. With the recent and rapid growth of the Indian IT and BPO industry, concerns regarding the protection of personal data to secure privacy have arisen. In May 2010, the European Union ("EU") commissioned an assessment of the adequacy of Indian data protection laws to evaluate the continued flow of personal data of European data subjects into India for processing. That assessment made adverse findings on the adequacy and preparedness of Indian data protection laws to safeguard personal data.[10]

Conducted amidst negotiations for a free trade agreement between India and the EU, the failed assessment potentially impeded the growth of India’s outsourcing industry that is heavily reliant on European and North American business.

Consequently, the Department of Electronics and Information Technology of the Ministry of Communications and Information Technology, Government of India, issued subordinate legislation under the rule-making power of the Information Technology Act, 2000 ("IT Act"), to give effect to section 43A of that statute. These rules – the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 ("Personal Data Rules")[11] — were subsequently reviewed by the Committee on Subordinate Legislation of the 15th Lok Sabha.[12] The Committee found that the Personal Data Rules contained clauses that were ambiguous, invasive of privacy and potentially illegal.[13]

In 2011, a draft privacy legislation called the ‘Right to Privacy Bill, 2011’, which was drafted within the Department of Personnel and Training ("DoPT") of the Ministry of Personnel, Public Grievances and Pensions, Government of India, was made available on the internet along with several file notings ("First DoPT Bill"). The First DoPT Bill contained provisions for the regulation of personal data, interception of communications, visual surveillance and direct marketing. The First DoPT Bill was referred to a Committee of Secretaries chaired by the Cabinet Secretary which, on 27 May 2011, recommended several changes including re-drafts of the chapters relating to interception of communications and surveillance.

Aware of the need for personal data protection laws to enable economic growth, the Planning Commission constituted a Group of Experts under the chairmanship of Justice Ajit P. Shah, a retired Chief Justice of the Delhi High Court who delivered the judgment in the Naz Foundation case, to study foreign privacy laws, analyse existing Indian legal provisions and make specific proposals for incorporation into future Indian law. The Justice Shah Group of Experts submitted its Report to the Planning Commission on 16 October 2012 wherein it proposed the adoption of nine National Privacy Principles.[14] These are the principles of notice, choice and consent, collection limitation, purpose limitation, access and correction, disclosure of information, security, openness, and accountability. The Report recommended the application of these principles in laws relating to interception of communications, video and audio recordings, use of personal identifiers, bodily and genetic material, and personal data.

Criminal Procedure and Special Laws Relating to Privacy

While the Kharak Singh and Gobind cases first brought the questions of permissibility and limits of police surveillance to the Supreme Court, the power to collect information and personal data of a person is firmly embedded in Indian criminal law and procedure. Surveillance is an essential condition of the nation-state; the inherent logic of its foundation requires the nation-state to perpetuate itself by interdicting threats to its peaceful existence. Surveillance is a method by which the nation-state’s agencies interdict those threats. The challenge for democratic countries such as India is to find the optimal balance between police powers of surveillance and the essential freedoms of its citizens, including the right to privacy.

The regime governing the interception of communications is contained in section 5(2) of the Indian Telegraph Act, 1885 ("Telegraph Act") read with rule 419A of the Indian Telegraph Rules, 1951 ("Telegraph Rules"). The Telegraph Rules were amended in 2007[15] to give effect to, amongst other things, the procedural safeguards laid down by the Supreme Court in the PUCL case. However, India’s federal scheme permits States also to legislate in this regard. Hence, in addition to the general law on interceptions contained in the Telegraph Act and Telegraph Rules, some States have also empowered their police forces with interception functions in certain cases.[16] Ironically, even though some of these State laws invoke heightened public order concerns to justify their invasions of privacy, they establish procedural safeguards based on the principle of probable cause that surpass those of the Telegraph Rules.

In addition, further subordinate legislation issued to fulfil the provisions of sections 69(2) and 69B(3) of the IT Act permit the interception and monitoring of electronic communications — including emails — to collect traffic data and to intercept, monitor, and decrypt electronic communications.[17]

The proposed Privacy (Protection) Bill, 2013 and Roundtable Meetings

In this background, the proposed Privacy (Protection) Bill, 2013 seeks to protect privacy by regulating (i) the manner in which personal data is collected, processed, stored, transferred and destroyed — both by private persons for commercial gain and by the state for the purpose of governance; (ii) the conditions upon which, and procedure for, interceptions of communications — both voice and data communications, including both data-in-motion and data-at-rest — may be conducted and the authorities permitted to exercise those powers; and, (iii) the manner in which forms of surveillance not amounting to interceptions of communications — including the collection of intelligence from humans, signals, geospatial sources, measurements and signatures, and financial sources — may be conducted.

Previous roundtable meetings to seek comments and opinion on the proposed Privacy (Protection) Bill, 2013 took place at:

The roundtable meetings were multi-stakeholder events with participation from industry representatives, lawyers, journalists, civil society organizations and Government representatives. On average, 75 per cent of the participants represented industry concerns, 15 per cent civil society and 10 per cent regulatory authorities. The format of the meetings allowed all participants to contribute equally.


[1]. See generally, Dan Solove, “A Taxonomy of Privacy” University of Pennsylvania Law Review (Vol. 154, No. 3, January 2006).

[2]. Wainwright v. Home Office [2003] UKHL 53.

[3]. See A v. B plc [2003] QB 195; Wainwright v. Home Office [2001] EWCA Civ 2081; R (Ellis) v. Chief Constable of Essex Police [2003] EWHC 1321 (Admin).

[4]. Kharak Singh v. State of Uttar Pradesh AIR 1963 SC 1295.

[5]. Gobind v. State of Madhya Pradesh AIR 1975 SC 1378.

[6]. R. Rajagopal v. State of Tamil Nadu AIR 1995 SC 264.

[7]. People’s Union for Civil Liberties v. Union of India (1997) 1 SCC 30.

[8]. A Division Bench of the Supreme Court of India comprising Kuldip Singh and Saghir Ahmad, JJ, found that the procedure set out in section 5(2) of the Indian Telegraph Act, 1885 and rule 419 of the Indian Telegraph Rules, 1951 did not meet the “just, fair and reasonable” test laid down in Maneka Gandhi v. Union of India AIR 1978 SC 597 for the deprivation of the right to personal liberty guaranteed under Article 21 of the Constitution of India, from which the Division Bench found a right to privacy emanated. Accordingly, Kuldip Singh, J, imposed nine additional procedural safeguards that are listed in paragraph 35 of the judgment.

[9]. Naz Foundation v. Government of NCT Delhi (2009) 160 DLT 277.

[10]. The 2010 data adequacy assessment of Indian data protection laws was conducted by Professor Graham Greenleaf. His account of the process and his summary of Indian law can be found at Graham Greenleaf, "Promises and Illusions of Data Protection in Indian Law" International Data Privacy Law (Vol. 1, No. 1, March 2011, pp. 47-69).

[11]. The Rules were brought into effect vide Notification GSR 313(E) on 11 April 2011. CIS submitted comments on the Rules that can be found here – http://cis-india.org/internet-governance/blog/comments-on-the-it-reasonable-security-practices-and-procedures-and-sensitive-personal-data-or-information-rules-2011.

[12]. The Committee on Subordinate Legislation, a parliamentary ‘watchdog’ committee, is mandated by rules 317-322 of the Rules of Procedure and Conduct of Business in the Lok Sabha (14th edn., New Delhi: Lok Sabha Secretariat, 2010) to examine the validity of subordinate legislation.

[13]. See the 31st Report of the Committee on Subordinate Legislation that was presented on 21 March 2013.

[14]. See paragraphs 7.14-7.17 on pages 69-72 of the Report of the Group of Experts on Privacy, 16 October 2012, Planning Commission, Government of India.

[15]. See, the Indian Telegraph (Amendment) Rules, 2007, which were brought into effect vide Notification GSR 193(E) of the Department of Telecommunications of the Ministry of Communications and Information Technology, Government of India, dated 1 March 2007.

[16]. See, inter alia, section 14 of the Maharashtra Control of Organised Crime Act, 1999; section 14 of the Andhra Pradesh Control of Organised Crime Act, 2001; and, section 14 of the Karnataka Control of Organised Crime Act, 2000.

[17]. See, the Information Technology (Procedure and Safeguards for Monitoring and Collecting Traffic Data and Information) Rules, 2009 vide GSR 782 (E) dated 27 October 2009; and, Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009 vide GSR 780 (E) dated 27 October 2009.

Blocking of Websites

by Prasad Krishna last modified Sep 24, 2013 09:11 AM

PDF document icon Blocking of websites A1.pdf — PDF document, 2037 kB (2086287 bytes)

Freedom of Speech (Poster)

by Prasad Krishna last modified Sep 24, 2013 09:16 AM

PDF document icon Freedom of speech.pdf — PDF document, 1448 kB (1482956 bytes)

Intermediary Liability Poster

by Prasad Krishna last modified Sep 24, 2013 09:30 AM

PDF document icon intermediary 36x12.pdf — PDF document, 1566 kB (1604607 bytes)

Internet Governance Forum Poster

by Prasad Krishna last modified Sep 24, 2013 09:35 AM

PDF document icon IGF a2.pdf — PDF document, 11476 kB (11752118 bytes)

DNA Poster 1

by Prasad Krishna last modified Sep 24, 2013 10:12 AM

PDF document icon DNA 1.pdf — PDF document, 205 kB (210890 bytes)

DNA Poster 2

by Prasad Krishna last modified Sep 24, 2013 10:14 AM

PDF document icon DNA2.pdf — PDF document, 200 kB (205486 bytes)

UID Poster 1

by Prasad Krishna last modified Sep 24, 2013 10:15 AM

PDF document icon UID 1.pdf — PDF document, 187 kB (191529 bytes)

UID Poster 2

by Prasad Krishna last modified Sep 24, 2013 10:17 AM

PDF document icon UID 2.pdf — PDF document, 235 kB (241347 bytes)

Privacy Events Poster

by Prasad Krishna last modified Sep 24, 2013 10:19 AM

PDF document icon Privacy Events.pdf — PDF document, 37 kB (38448 bytes)

CIS and International Coalition Calls upon Governments to Protect Privacy

by Elonnai Hickok last modified Sep 25, 2013 07:21 AM
The Centre for Internet and Society (CIS), along with an international coalition, has called upon governments across the globe to protect privacy.

On September 20 in Geneva, CIS joined a large international coalition in calling upon countries across the globe, including India, to assess whether their national surveillance laws and activities are in line with their international human rights obligations.

The Centre for Internet and Society has endorsed a set of international principles against unchecked surveillance. The 13 Principles set out for the first time an evaluative framework for assessing surveillance practices in the context of international human rights obligations.

A group of civil society organizations officially presented the 13 Principles this past Friday in Geneva at a side event attended by Navi Pillay, the United Nations High Commissioner for Human Rights and the United Nations Special Rapporteur on Freedom of Expression and Opinion, Frank LaRue, during the 24th session of the Human Rights Council. The side event was hosted by the Permanent Missions of Austria, Germany, Liechtenstein, Norway, Switzerland and Hungary.

Elonnai Hickok, Programme Manager at the Centre for Internet and Society has noted that "the 13 Principles are an important first step towards informing governments, corporates, and individuals across jurisdictions, including India, about needed safeguards for surveillance practices and related policies to ensure that they are necessary and proportionate."

Navi Pillay, the United Nations High Commissioner for Human Rights, speaking at the Human Rights Council stated in her opening statement on September 9:

"Laws and policies must be adopted to address the potential for dramatic intrusion on individuals’ privacy which have been made possible by modern communications technology."

Speaking at the side event, the High Commissioner said:

"technological advancements have been powerful tools for democracy by giving access to all to participate in society, but increasing use of data mining by intelligence agencies blurs lines between legitimate surveillance and arbitrary mass surveillance."

Frank La Rue, the United Nations Special Rapporteur on Freedom of Expression and Opinion, made the case for a direct relationship between state surveillance, privacy and freedom of expression in his latest report to the Human Rights Council:

"The right to privacy is often understood as an essential requirement for the realization of the right to freedom of expression. Undue interference with individuals’ privacy can both directly and indirectly limit the free development and exchange of ideas. … An infringement upon one right can be both the cause and consequence of an infringement upon the other."

Speaking at the event, the UN Special Rapporteur remarked that:

"previously surveillance was carried out on targeted basis but the Internet has changed the context by providing the possibility for carrying out mass surveillance. This is the danger."

Representatives of the Centre for Internet and Society, Privacy International, the Electronic Frontier Foundation, Access, Human Rights Watch, Reporters Without Borders, the Association for Progressive Communications, and the Center for Democracy and Technology are all taking part in the event.

Find out more about the Principles at https://NecessaryandProportionate.org

Contacts

NGOs currently in Geneva for the 24th Human Rights Council:

Access
Fabiola Carrion: [email protected]

Association for Progressive Communications
Shawna Finnegan: [email protected]

Center for Democracy and Technology
Matthew Shears: [email protected]

Electronic Frontier Foundation
Katitza Rodriguez:  [email protected] - @txitua

Human Rights Watch
Cynthia Wong: [email protected]

Privacy International
Carly Nyst: [email protected]

Reporters Without Borders
Lucie Morillon: [email protected]
Hélène Sackstein: [email protected]

Signatories

Argentina
Ramiro Alvarez: [email protected]
Asociación por los Derechos Civiles

Argentina
Beatriz Busaniche: [email protected]
Fundación Via Libre

Colombia
Carolina Botero: [email protected]
Fundación Karisma

Egypt
Ahmed Ezzat: [email protected]
Afteegypt

Honduras
Hedme Sierra-Castro: [email protected]
ACI-Participa

India
Elonnai Hickok: [email protected]
Center for Internet and Society

Korea
Prof. Park:  [email protected]
Open Net Korea

Macedonia
Bardhyl Jashari: [email protected]
Metamorphosis Foundation for Internet and Society

Mauritania, Senegal, Tanzania
Abadacar Diop: [email protected]
Jonction

Portugal
Andreia Martins: [email protected]
ASSOCIAÇÃO COOLPOLITICS

Peru
Miguel Morachimo: [email protected]
Hiperderecho

Russia
Andrei Soldatov: [email protected]
Agentura.ru

Serbia
Djordje Krivokapic: [email protected]
SHARE Foundation

Western Balkans
Valentina Pellizer: [email protected]
Oneworldsee

Brazil
Marcelo Saldanha: [email protected]
IBEBrasil

Bangalore + Social Good

by Prasad Krishna last modified Sep 25, 2013 07:41 AM

PDF document icon The Social Good Summit.pdf — PDF document, 181 kB (185484 bytes)

The National Cyber Security Policy: Not a Real Policy

by Bhairav Acharya last modified Sep 25, 2013 09:49 AM
Cyber security in India is still a nascent field without an organised law and policy framework. Several actors participate in and are affected by India's still inchoate cyber security regime. The National Cyber Security Policy (NCSP) presented the government and other stakeholders with an opportune moment to understand existing legal limitations before devising a future framework. Unfortunately, the NCSP's poor drafting and meaningless provisions do not advance the field.

This article was published in the Observer Research Foundation's Cyber Security Monitor Vol. I, Issue.1, August 2013.


For some time now, law and policy observers in India have noticed a definite decline in the quality of national policies emanating from the Central Government. Unlike legislation, which is notionally subject to debate in the Parliament of India, policies face no public evaluation before they are brought into force. Since, unlike legislation, policies are neither binding nor enforceable, there has been no principled ground for demanding public deliberation of significant national policies. While Parliament’s falling standard of competence has been almost unanimously condemned, there has been nearly no criticism of the corresponding failure of the Centre to invigilate the quality of the official policies of its ministries. Luckily for the drafters of the National Cyber Security Policy (NCSP), the rest of the country has also mostly failed to notice its poor content.

The NCSP was notified into effect on 2 July 2013 by the Department of Electronics and Information Technology – which calls itself DeitY – of the Ministry of Communications and Information Technology. As far as legislation and legal drafting go, DeitY has a dubious record. In March 2013, in a parliamentary appraisal of subordinate law framed by DeitY, a Lok Sabha committee found ambiguity, invasions of privacy and potentially illegal clauses. Apprehensions about statutory law administered by DeitY have also found their way to the Supreme Court of India, where a constitutional challenge to certain provisions of the Information Technology Act, 2000 (IT Act) continues. On more than one occasion, owing to poor drafting, DeitY has been forced to issue advisories and press releases to clarify the meaning of its laws. Ironically, the legal validity of these clarifications is also questionable.

A national policy must set out, in real and quantifiable terms, the objectives of the government in a particular field within a specified time frame. To do that, the policy must provide the social, economic, political and legal context prevalent at the time of its issue as well as a normative statement of the factual conditions it seeks to achieve by the time of its expiry. Between these two points in time, the policy must identify and explain all the particular social, economic, political and legal measures it intends to implement to secure its success. Albeit concerned solely with economic growth, the Five-Year Plans – the Second and Tenth Plans in particular – are, without prejudice to their success or failure, examples of well-drafted policies. In this background, the NCSP should be judged on the basis of how it addresses, in no particular order, national security, democratic freedoms, economic growth and knowledge development. Let us restrict ourselves to the first two issues.

There are broadly two intersections between national security and information technology: (i) the security of networked communications used by the armed forces and intelligence services, and (ii) the storage of civil information of national importance. While the NCSP makes no mention of it, the adoption of the doctrine of network-centric warfare by the three armed forces is underway. The doctrine is simple to understand – an intensive use of information technology to create networks of information aids situational awareness and enables collaboration to bestow an advantage in combat. However, the doctrine is vulnerable to asymmetric attack using both primitive and highly sophisticated means. Pre-empting such attacks should be a primary policy concern; not so, apparently, for the NCSP, which is completely silent on this issue. The NCSP is slightly more forthcoming on the protection of critical information infrastructure of a civil nature. Critical information infrastructure, such as the national power grid or the Aadhaar database, is narrowly defined in section 70 of the IT Act, where it is used to describe a protected system. Other provisions of the IT Act also deal with the protection of critical information infrastructure. The NCSP does not explain how these statutory provisions have worked, or failed, so as to necessitate their further mention in a policy document. For instance, section 70A of the IT Act, inserted in 2008, enables the creation of a national nodal agency to undertake research and development and other activities in respect of critical information infrastructure. Despite this, five years later, the NCSP makes a similar recommendation to operate a National Critical Information Infrastructure Protection Centre to undertake the same activities. In the absence of any meaningful explanation of intended policy measures, there is no reason to expect that the NCSP will succeed where an Act of Parliament has failed.

But, putting aside the shortcomings of its piecemeal provisions, the NCSP also fails to address high-level conceptual policy concerns. As information repositories and governance services delivered through information technology become increasingly integrated and centralised, the security of the information that is stored or distributed decreases. Whether by intent or error, if these consolidated repositories of information are compromised, the quantity of information susceptible to damage is greater, leading to higher insecurity. Simply put, if power transmission is centrally controlled instead of zonally, a single attack could black out the entire country instead of only a part of it. Or if the personal data of citizens is centrally stored, a single leak could compromise the privacy of millions of people instead of only hundreds. Therefore, a credible policy must, before it advocates greater centralisation of information, examine the merits of diffused information storage to protect national security. The NCSP utterly fails in this regard.

Concerns short of national security, such as the maintenance of law and order, are also in issue because crime is often planned and perpetrated using information technology. The prevention of crime before it is committed, and its prosecution afterwards, is a key policy concern. While the specific context may vary depending on the nature of the crime – the facts of terrorism are different from those of insurance fraud – the principles of constitutional and criminal law continue to apply. However, the NCSP neither examines the present framework of cybersecurity-related offences nor suggests any changes in existing law. It merely calls for a “dynamic legal framework and its periodic review to address the cyber security challenges” (sic). This is self-evident; there was no need for a new national policy to make this discovery. Ironically, the NCSP also fails to conduct the very periodic review that it envisages. This is worrying because the NCSP presented DeitY with an opportunity to review existing laws and learn from past mistakes. There are concerns that cybersecurity laws, especially relevant provisions of the IT Act and its rules, betray a lack of understanding of India’s constitutional scheme. This is exemplified by the insertion, in 2008, of section 66A into the IT Act, which criminalises the sending of annoying, offensive and inconvenient electronic messages without regard for the fact that free speech that is annoying is constitutionally protected.

In India, cybersecurity law and policy attempt to compensate for the state’s inability to regulate the internet by overreaching into and encroaching upon democratic freedoms. The Central Monitoring System (CMS) that is being assembled by the Centre is a case in point. Alarmed at its inability to be privy to private communications, the Centre proposes to build systems to intercept, in real time, all voice and data traffic in India. Whereas liberal democracies around the world require such interceptions to be judicially sanctioned, warranted and supported by probable cause, India does not even have statutory law to regulate such an enterprise. Given that, once completed, the CMS will represent the largest domestic interception effort in the world, the failure of the NCSP to examine the effect of such an exercise on daily cybersecurity is bewildering. This is made worse by the fact that the state does not possess the technological competence to build such a system by itself and is currently inviting tenders from private companies for equipment. The state’s incompetence is best portrayed by the activities of the Indian Computer Emergency Response Team (CERT-In), which was constituted under section 70B of the IT Act to respond to “cyber incidents”. CERT-In has repeatedly engaged in extra-judicial censorship and has ham-handedly responded to allegedly objectionable blogs or websites by blocking access to entire domains. Unfortunately, the NCSP, while reiterating the operations of CERT-In, attempts no evaluation of its activities, precluding any meaningful policy measures.

The NCSP’s poor drafting, meaningless provisions, deficiency of analysis and lack of stated measures render it hollow. Its notification into force adds little to the public or intellectual debate about cybersecurity and does nothing to further the trajectory of either national security or democratic freedoms in India. In fairness, this problem afflicts many other national policies. There is a need to return to the high intellectual and practical standards set by the national policies issued in the years following Independence.

India: Privacy in Peril

by Bhairav Acharya last modified Sep 25, 2013 09:56 AM
The danger of mass surveillance in India is real. The absence of a regulating law is damning for Indians who want to protect their privacy against the juggernaut of state and private surveillance.

Police personnel browsing case details using the Crime and Criminal Tracking Network and Systems technology in Hyderabad. Photo: K. RAMESH BABU


The article was originally published in Frontline on July 12, 2013.


In the concluding scene of his latest movie, Superman disdainfully flings a surveillance drone down to earth in front of a horrified general. “You can’t control me,” he tells his military minder. “You can’t find out where I hang up my cape.” This exchange goes to the crux of surveillance: control. Surveillance is the means by which nation-states exercise control over people. If the logical basis of the nation-state is the establishment and maintenance of homogeneity, it is necessary to detect and interdict dissent before it threatens the boundedness and continuity of the national imagination. This imagination often cannot encompass diversity, so it constructs categories of others that include dissenters and outsiders. Admittedly, this happens less in India because the foundation of the Indian nation-state imagined a diverse society expressing a plurality of ideas in a variety of languages secured by a syncretic and democratic government that protected individual freedoms. Unfortunately, this vision is still to be realised, and the foundational idea of India continues to be challenged by poor governance, poverty, insurgencies and rebellion. Consequently, surveillance is, for the modern nation-state, a condicio sine qua non—an essential element without which it will eventually cease to exist. The challenge for democratic nation-states is to find the optimal balance between surveillance and the duty to protect the freedoms of their citizens.

History of wiretaps

Some countries, such as the United States, have assembled a vast apparatus of surveillance to monitor the activities of their citizens and foreigners. Let us review the recent controversy revealed by the whistle-blower Edward Snowden. In 1967, the U.S. Supreme Court ruled in Katz vs United States that wiretaps had to be warranted, judicially sanctioned and supported by probable cause. This resulted in the passage of the Wiretap Act of 1968 that regulated domestic surveillance. Following revelations that Washington was engaging in unrestricted foreign surveillance in the context of the Vietnam war and anti-war protests, the U.S. Congress enacted the Foreign Intelligence Surveillance Act (FISA) in 1978. FISA gave the U.S. government the power to conduct, without judicial sanction, surveillance for foreign intelligence information; and, with judicial sanction from a secret FISA court, surveillance of anybody if the ultimate target was a foreign power. Paradoxically, even a U.S. citizen could be a foreign power in certain circumstances. Domestically, FISA enabled secret warrants for specific items of information such as library book borrowers and car rentals.

Following the 9/11 World Trade Centre attacks, Congress enacted the Patriot Act of 2001, Section 215 of which dramatically expanded the scope of FISA to allow secret warrants to conduct surveillance in respect of “any tangible thing” that was relevant to a national security investigation. In exercise of this power, a secret FISA court issued secret warrants ordering a number of U.S. companies to share, in real time, voice and data traffic with the National Security Agency (NSA). We may never know the full scope of the NSA’s surveillance, but we know this: (a) Verizon Communications, a telecommunications major, was ordered to provide metadata for all telephone calls within and without the U.S.; (b) the NSA runs a clandestine programme called PRISM that accesses Internet traffic, such as e-mails, web searches, forum comments and blogs, in real time; and (c) the NSA manages a comprehensive data analysis system called Boundless Informant that intercepts and analyses voice and data traffic around the world and subjects them to automated pattern recognition. The documents leaked by Snowden allege that Google, Facebook, Apple, Dropbox, Microsoft and Yahoo! participate in PRISM, but these companies have denied their involvement.

India fifth-most monitored

How does this affect India? The Snowden documents reveal that India is the NSA’s fifth-most monitored country after Iran, Pakistan, Jordan and Egypt. Interestingly, China is monitored less than India. Several billion pieces of data from India, such as e-mails and telephone metadata, were intercepted and monitored by the NSA. For Indians, it is not inconceivable that our e-mails, should they be sent using Gmail, Yahoo! Mail or Hotmail, or our documents, should we be subscribing to Dropbox, or our Facebook posts, are being accessed and read by the NSA. Incredibly, most Indian governmental communication, including that of Ministers and senior civil servants, uses private U.S. e-mail services. We no longer enjoy privacy online. The question of suspicious activity, irrespective of the rubric under which suspicion is measured, is moot. Any use of U.S. service providers is potentially compromised since U.S. law permits intrusive dragnet surveillance against foreigners. This clearly reveals a dichotomy in U.S. constitutional law: the Fourth Amendment’s guarantees of privacy, repeatedly upheld by U.S. courts, protect U.S. citizens to a far greater extent than they do foreigners. It is natural for a nation-state to privilege the rights of its citizens over others. As Indians, therefore, we must clearly look out for ourselves.

Privacy and personal liberty

Unfortunately, India does not have a persuasive jurisprudence of privacy protection. In the Kharak Singh (1964) and Gobind (1975) cases, the Supreme Court of India considered the question of privacy from physical surveillance by the police in and around the homes of suspects. In the latter case, the court found that some of the Fundamental Rights “could be described as contributing to the right to privacy”, which was subject to a compelling public interest. This insipid inference held the field until 1994 when, in the Rajagopal (“Auto Shankar”, 1994) case, the Supreme Court, for the first time, directly located privacy within the ambit of the right to personal liberty recognised by Article 21 of the Constitution. However, Rajagopal dealt specifically with the publication of an autobiography; it did not consider the privacy of communications. In 1997, the Supreme Court considered the question of wiretaps in the People’s Union for Civil Liberties (PUCL) case. While finding that wiretaps invaded the privacy of communications, it continued to permit them subject to some procedural safeguards, which continue to be routinely ignored. A more robust statement of the right to privacy was made by the Delhi High Court in the Naz Foundation case (2009), which decriminalised consensual homosexual acts; however, an appeal against the judgment is pending in the Supreme Court.

Legislative silence

Judicial vagueness has been compounded by legislative silence. India does not have a law to operationalise a right to privacy. Consequently, a multitude of laws permit daily infractions of privacy. These infractions have survived because they are diverse, dissipated and quite disorganised. However, the technocratic impulse to centralise and consolidate surveillance and data collection has, in recent years, alarmed many citizens. Through enterprises such as the Central Monitoring System (CMS), the Crime and Criminal Tracking Network and Systems (CCTNS), the National Intelligence Grid (NATGRID), the Telephone Call Interception System (TCIS) and the Unique Identification Number (UID), the state hopes to replicate the U.S. successes in surveillance by monitoring and profiling all its citizens. However, unlike the U.S., India proposes to achieve this without an enabling law. Let us consider the CMS. No documents have been made available that indicate the scope and size of the CMS.

From a variety of police tenders for private equipment, it appears that the Central government hopes to put in place a system that will intercept, in real time, all voice and data traffic originating or terminating in India or being carried by Indian service providers. This data will be subject to pattern recognition and other automated tests to detect emotional markers, such as hate, compassion or intent. The sheer scale of this enterprise is intimidating; all communications in India’s many languages will be subject to interception and testing designed to detect different forms of dissent. This mammoth exercise in monitoring is taking place—it is understood that some components of the CMS are already operational—without statutory sanction. No credible authorities exist to supervise this exercise, no avenues for redress have been identified and no consequences have been laid down for abuse.

Statutory Surveillance

In a recent interview, Milind Deora, Minister of State for Communications and Information Technology, dismissed public scepticism of the CMS, saying that direct state access to private communications was better for privacy since it reduced dependence on the interception abilities of private service providers. This circular argument is both disingenuous and incorrect. No doubt, trusting private persons with the power to intercept and store the private data of citizens is flawed. The leaking of the Niira Radia tapes, which contain the private communications of Niira Radia taped on the orders of the Income Tax Department, testifies to this flaw. However, bypassing private players to enable direct state access to private communications will preclude leaks and, thereby, remove from public knowledge the fact of surveillance. This messy situation may be obviated by a regime of statutory regulation of warranted surveillance by an independent and impartial authority. This system is favoured by liberal democracies around the world but conspicuously resisted by the Indian government.

The question of privacy legislation was recently considered by a committee chaired by Justice Ajit Prakash Shah, a former judge of the Delhi High Court who sat on the Bench that delivered the Naz Foundation judgment. The Shah Committee was constituted by the Planning Commission for a different reason: the need to protect personal data that are outsourced to India for processing. The lack of credible privacy law, it is foreseen, will result in European and other foreign personal data being sent to other attractive processing destinations, such as Vietnam, Israel or the Philippines, resulting in the decline of India’s outsourcing industry. However, the Shah Committee also noted the absence of law sufficient to protect against surveillance abuses. Most importantly, the Shah Committee formulated nine national privacy principles to inform any future privacy legislation. In 2011, a draft privacy Bill prepared by the Department of Personnel and Training (DoPT) of the Ministry of Personnel, Public Grievances and Pensions, the same department entrusted with implementing the Right to Information Act, 2005, and marked ‘Secret’, was leaked on the Internet. The DoPT Bill received substantive criticism from the Attorney General and some government Secretaries for its clumsy drafting. A new version of the DoPT Bill is reported to have been drafted and sent to the Ministry of Law for consideration. This revised Bill, which presumably contains chapters to regulate surveillance, including the interception of communications, has not been made public.

The need for privacy legislation cannot be overstated. The Snowden affair reveals the extent of possible state surveillance of private communications. For Indians who must now explore ways to protect their privacy against the juggernaut of state and private surveillance, the absence of regulatory law is damning. Permitting, through public inaction, unwarranted and non-targeted dragnet surveillance by the Indian state without reasonable cause would be an act of surrender with far-reaching implications.

Information, they say, is power. Allowing governments to exercise this power over us without thought for the rule of law constitutes the ultimate submission possible in a democratic nation-state. And, since superheroes are escapist fantasies, without the prospect of good laws we will all be subordinate to a new national imagination of control and monitoring, surveillance and profiling. If allowed to come to pass, this will be a betrayal of the foundational idea of India as a free and democratic republic tolerant of dissent.


Bhairav Acharya is a constitutional lawyer practising in the Supreme Court of India. He advises the Centre for Internet & Society, Bangalore, on privacy law and other constitutional issues.

The Central Monitoring System: Some Questions to be Raised in Parliament

by Bhairav Acharya last modified Sep 25, 2013 10:30 AM
The following are some model questions that may be raised in Parliament regarding the lack of transparency in the Central Monitoring System (CMS).

Preliminary

  • The Central Monitoring System (CMS) is a Central Government project to intercept communications, both voice and data, that are transmitted via telephones and the internet to, from and within India. Owing to the vast nature of this enterprise, the CMS cannot be succinctly described, and the many issues surrounding the project are diverse. This Issue Brief outlines preliminary constitutional, legal and technical concerns presented by the CMS.
  • At the outset, it must be clearly understood that no public documentation exists to explain the scope, functions and technical architecture of the CMS. This lack of transparency is the single largest obstacle to understanding the Central Government’s motives in conceptualising and operationalising the CMS. This lack of public documentation is also the chief reason for the brevity of this Issue Brief. Without making public the policy, law and technical abilities of the CMS, there cannot be an informed national debate on the primary concerns posed by the CMS, i.e., the extent of envisaged state surveillance of Indian citizens and the safeguards, if any, to protect the individual right to privacy.

Surveillance and Privacy

  • Surveillance is necessary to secure political organisation. Modern nation-states, which are theoretically organised on the basis of shared national and societal characteristics, require surveillance to detect threats to these characteristics. In democratic societies, beyond the immediate requirements of national integrity and security, surveillance must be targeted at securing the safety and rights of individual citizens. This Issue Brief does not dispute the fact that democratic countries, such as India, should conduct surveillance to secure legitimate ends. Concerns, however, arise when surveillance is conducted in a manner unrestricted and unregulated by law; these concerns are compounded when a lack of law is accompanied by a lack of transparency.
  • Technological advancement leads to more intrusive surveillance. The evolution of surveillance in the United States resulted, in 1967, in the first judicial recognition of the right to privacy. In Katz v. United States, the US Supreme Court ruled that the privacy of communications had to be balanced with the need to conduct surveillance; and, therefore, wiretaps had to be warranted, judicially sanctioned and supported by probable cause. Katz expanded the scope of the Fourth Amendment of the US Constitution, which protects against unreasonable searches and seizures. Most subsequent US legal developments relating to the privacy of communications from surveillance originate in the Katz judgment. Other common law countries, such as the United Kingdom and Canada, have experienced a similar judicial evolution to recognise that the right to privacy must be balanced with governance.


Right to Privacy in India

  • Unfortunately, India does not have a persuasive jurisprudence of privacy protection. In the Kharak Singh (1964) and Gobind (1975) cases, the Supreme Court of India considered the question of privacy from physical surveillance by the police in and around the homes of suspects. In the latter case, the Supreme Court found that some of the Fundamental Rights “could be described as contributing to the right to privacy”, which was nevertheless subject to a compelling public interest. This insipid inference held the field until 1994 when, in the Rajagopal (“Auto Shankar”, 1994) case, the Supreme Court, for the first time, directly located privacy within the ambit of the right to personal liberty recognised by Article 21 of the Constitution. However, Rajagopal dealt specifically with the publication of an autobiography; it did not consider the privacy of communications. In 1997, the Supreme Court considered the question of wiretaps in the PUCL case. While finding that wiretaps invaded the privacy of communications, it continued to permit them subject to procedural safeguards that continue to be routinely ignored. A more robust statement of the right to privacy was made by the Delhi High Court in the Naz Foundation case (2009), which de-criminalised consensual homosexual acts; however, that judgment has been appealed to the Supreme Court.

Issues Pertaining to the CMS

  • While judicial protection from physical surveillance was cursorily dealt with in the Kharak Singh and Gobind cases, the Supreme Court of India directly considered the issue of wiretaps in the PUCL case. Wiretaps in India primarily occur on the strength of powers granted to certain authorities under section 5(2) of the Indian Telegraph Act, 1885. The Court found that the Telegraph Act, and Rules made thereunder, did not prescribe adequate procedural safeguards to create a “just and fair” mechanism to conduct wiretaps. Therefore, it laid down the following procedure to conduct wiretaps:

(a) the order should be issued by the relevant Home Secretary (this power is delegable to a Joint Secretary),
(b) the interception must be carried out exactly in terms of the order and not in excess of it,
(c) a determination must be made of whether the information could reasonably be secured by other means,
(d) the interception shall cease after sixty (60) days.

  • Therefore, prima facie, any voice interception conducted through the CMS will be in violation of this Supreme Court judgment. The CMS will enforce blanket surveillance upon the entire country without regard for reasonable cause or necessity. This movement away from targeted surveillance towards blanket surveillance without cause, conducted without statutory sanction and without transparency, is worrying.
  • Accordingly, the following questions may be raised, in Parliament, to learn more about the CMS project:
  1. Which statutes, Government Orders, notifications, etc., deal with the establishment and maintenance of the CMS?
  2. Which is the nodal agency in charge of implementing the CMS?
  3. What are the powers and functions of the nodal agency?
  4. What guarantees exist to protect ordinary Indian citizens from intrusive surveillance without cause?
  5. What are the technical parameters of the CMS?
  6. What are the consequences for misuse or abuse of powers by any person working in the CMS project?
  7. What recourse is available to Indian citizens against whom there is unnecessary surveillance or against whom there has been a misuse or abuse of power?

CYFY 2013 Event Brochure

by Prasad Krishna last modified Sep 26, 2013 06:49 AM

cyfy flyer prog.pdf — PDF document, 854 kB (875361 bytes)

Privacy Timeline

by Prasad Krishna last modified Sep 26, 2013 10:08 AM

Timeline.pdf — PDF document, 42 kB (43167 bytes)

Privacy Roundtable Delhi (October)

by Prasad Krishna last modified Sep 27, 2013 12:52 PM

Invite-Delhi.pdf — PDF document, 2395 kB (2453051 bytes)

Privacy Protection Bill (September 2013)

by Prasad Krishna last modified Sep 27, 2013 02:03 PM

Privacy (Protection) Bill - 20 Sep 2013.pdf — PDF document, 199 kB (204657 bytes)

Privacy (Protection) Bill, 2013: Updated Third Draft

by Bhairav Acharya last modified Oct 01, 2013 12:25 PM
The Centre for Internet and Society has been researching privacy in India since 2010 with the objective of raising public awareness about privacy, conducting in-depth research, and driving privacy legislation in India. As part of this work, we drafted the Privacy (Protection) Bill, 2013.

This research is being undertaken as part of the 'SAFEGUARDS' project that CIS is carrying out with Privacy International and IDRC. The following is the latest version, with changes based on the Round Table held on August 24:


[Preamble]

CHAPTER I

Preliminary

1. Short title, extent and commencement. – (1) This Act may be called the Privacy (Protection) Act, 2013.

(2) It extends to the whole of India.

(3) It shall come into force on such date as the Central Government may, by notification in the Official Gazette, appoint.

2. Definitions. – In this Act and in any rules made thereunder, unless the context otherwise requires, –

(a) “anonymise” means, in relation to personal data, the removal of all data that may, whether directly or indirectly in conjunction with any other data, be used to identify the data subject;

(b) “appropriate government” means, in relation to the Central Government or a Union Territory Administration, the Central Government; in relation to a State Government, that State Government; and, in relation to a public authority which is established, constituted, owned, controlled or substantially financed by funds provided directly or indirectly –

(i) by the Central Government or a Union Territory Administration, the Central Government;

(ii) by a State Government, that State Government;

(c) “authorised officer” means an officer, not below the rank of a Gazetted Officer, of an All India Service or a Central Civil Service, as the case may be, who is empowered by the Central Government, by notification in the Official Gazette, to intercept a communication of another person or carry out surveillance of another person under this Act;

(d) “biometric data” means any data relating to the physical, physiological or behavioural characteristics of a person which allow their unique identification including, but not restricted to, facial images, fingerprints, handprints, footprints, iris recognition, handwriting, typing dynamics, gait analysis and speech recognition;

(e) “Chairperson” and “Member” mean the Chairperson and Member appointed under sub-section (1) of section 17;

(f) “collect”, with its grammatical variations and cognate expressions, means, in relation to personal data, any action or activity that results in a data controller obtaining, or coming into the possession or control of, any personal data of a data subject;

(g) “communication” means a word or words, spoken, written or indicated, in any form, manner or language, encrypted or unencrypted, meaningful or otherwise, and includes visual representations of words, ideas, symbols and images, whether transmitted or not transmitted and, if transmitted, irrespective of the medium of transmission;

(h) “competent organisation” means an organisation or public authority listed in the Schedule;

(i) “data controller” means a person who, either alone or jointly or in concert with other persons, determines the purposes for which and the manner in which any personal data is processed;

(j) “data processor” means any person who processes any personal data on behalf of a data controller;

(k) “Data Protection Authority” means the Data Protection Authority constituted under sub-section (1) of section 17;

(l) “data subject” means a person who is the subject of personal data;

(m) “deoxyribonucleic acid data” means all data, of whatever type, concerning the characteristics of a person that are inherited or acquired during early prenatal development;

(n) “destroy”, with its grammatical variations and cognate expressions, means, in relation to personal data, to cease the existence of, by deletion, erasure or otherwise, any personal data;

(o) “disclose”, with its grammatical variations and cognate expressions, means, in relation to personal data, any action or activity that results in a person who is not the data subject coming into the possession or control of that personal data;

(p) “intelligence organisation” means an intelligence organisation under the Intelligence Organisations (Restriction of Rights) Act, 1985 (58 of 1985);

(q) “interception” or “intercept” means any activity intended to capture, read, listen to or understand the communication of a person;

(r) “personal data” means any data which relates to a natural person if that person can, whether directly or indirectly in conjunction with any other data, be identified from it and includes sensitive personal data;

(s) “prescribed” means prescribed by rules made under this Act;

(t) “process”, with its grammatical variations and cognate expressions, means, in relation to personal data, any action or operation which is performed upon personal data, whether or not by automated means including, but not restricted to, organisation, structuring, adaptation, modification, retrieval, consultation, use, alignment or destruction;

(u) “receive”, with its grammatical variations and cognate expressions, means, in relation to personal data, to come into the possession or control of any personal data;

(v) “sensitive personal data” means personal data as to the data subject’s –

(i) biometric data;

(ii) deoxyribonucleic acid data;

(iii) sexual preferences and practices;

(iv) medical history and health;

(v) political affiliation;

(vi) commission, or alleged commission, of any offence;

(vii) ethnicity, religion, race or caste; and

(viii) financial and credit information;

(w) “store”, with its grammatical variations and cognate expressions, means, in relation to personal data, to retain, in any form or manner and for any purpose or reason, any personal data;

(x) “surveillance” means any activity intended to watch, monitor, record or collect, or to enhance the ability to watch, record or collect, any images, signals, data, movement, behaviour or actions, of a person, a group of persons, a place or an object, for the purpose of obtaining information of a person;

and all other expressions used herein shall have the meanings ascribed to them under the General Clauses Act, 1897 (10 of 1897) or the Code of Criminal Procedure, 1973 (2 of 1974), as the case may be.

CHAPTER II

Regulation of Personal Data

3. Regulation of personal data. – Notwithstanding anything contained in any other law for the time being in force, no person shall collect, store, process, disclose or otherwise handle any personal data of another person except in accordance with the provisions of this Act and any rules made thereunder.

4. Exemption. – Nothing in this Act shall apply to the collection, storage, processing or disclosure of personal data for personal or domestic use.

CHAPTER III

Protection of Personal Data

5. Regulation of collection of personal data. – (1) No personal data of a data subject shall be collected except in conformity with section 6 and section 7.

(2) No personal data of a data subject may be collected under this Act unless it is necessary for the achievement of a purpose of the person seeking its collection.

(3) Subject to section 6 and section 7, no personal data may be collected under this Act prior to the data subject being given notice, in such form and manner as may be prescribed, of the collection.

6. Collection of personal data with prior informed consent. – (1) Subject to sub-section (2), a person seeking to collect personal data under this section shall, prior to its collection, obtain the consent of the data subject.

(2) Prior to a collection of personal data under this section, the person seeking its collection shall inform the data subject of the following details in respect of his personal data, namely: –

(a) when it will be collected;

(b) its content and nature;

(c) the purpose of its collection;

(d) the manner in which it may be accessed, checked and modified;

(e) the security practices, privacy policies and other policies, if any, to which it will be subject;

(f) the conditions and manner of its disclosure; and

(g) the procedure for recourse in case of any grievance in relation to it.

(3) Consent to the collection of personal data under this section may be obtained from the data subject in any manner or medium but shall not be obtained as a result of a threat, duress or coercion:

Provided that the data subject may, at any time after his consent to the collection of personal data has been obtained, withdraw the consent for any reason whatsoever and all personal data collected following the original grant of consent shall be destroyed forthwith:

Provided that the person who collected the personal data in respect of which consent is subsequently withdrawn may, if the personal data is necessary for the delivery of any good or the provision of any service, not deliver that good or deny that service to the data subject who withdrew his grant of consent.

7. Collection of personal data without prior consent. – Personal data may be collected without the prior consent of the data subject if it is –

(a) necessary for the provision of an emergency medical service to the data subject;

(b) required for the establishment of the identity of the data subject and the collection is authorised by a law in this regard;

(c) necessary to prevent a reasonable threat to national security, defence or public order; or

(d) necessary to prevent, investigate or prosecute a cognisable offence.

8. Regulation of storage of personal data. – (1) No person shall store any personal data for a period longer than is necessary to achieve the purpose for which it was collected or received, or, if that purpose is achieved or ceases to exist for any reason, for any period following such achievement or cessation.

(2) Save as provided in sub-section (3), any personal data collected or received in relation to the achievement of a purpose shall, if that purpose is achieved or ceases to exist for any reason, be destroyed forthwith.

(3) Notwithstanding anything contained in this section, any personal data may be stored for a period longer than is necessary to achieve the purpose for which it was collected or received, or, if that purpose has been achieved or ceases to exist for any reason, for any period following such achievement or cessation, if –

(a) the data subject grants his consent to such storage prior to the purpose for which it was collected or received being achieved or ceasing to exist;

(b) it is adduced for an evidentiary purpose in a legal proceeding; or

(c) it is required to be stored under the provisions of an Act of Parliament:

Provided that only that amount of personal data that is necessary to achieve the purpose of storage under this sub-section shall be stored and any personal data that is not required to be stored for such purpose shall be destroyed forthwith:

Provided further that any personal data stored under this sub-section shall, to the extent possible, be anonymised.

9. Regulation of processing of personal data. – (1) No person shall process any personal data that is not necessary for the achievement of the purpose for which it was collected or received.

(2) Save as provided in sub-section (3), no personal data shall be processed for any purpose other than the purpose for which it was collected or received.

(3) Notwithstanding anything contained in this section, any personal data may be processed for a purpose other than the purpose for which it was collected or received if –

(a) the data subject grants his consent to the processing and only that amount of personal data that is necessary to achieve the other purpose is processed;

(b) it is necessary to perform a contractual duty to the data subject;

(c) it is necessary to prevent a reasonable threat to national security, defence or public order; or

(d) it is necessary to prevent, investigate or prosecute a cognisable offence.

10. Transfer of personal data for processing. – (1) Subject to the provisions of this section, personal data that has been collected in conformity with this Act may be transferred by a data controller to a data processor, whether located in India or otherwise, if the transfer is pursuant to an agreement that explicitly binds the data processor to same or stronger measures in respect of the storage, processing, destruction, disclosure and other handling of the personal data as are contained in this Act.

(2) No data processor shall process any personal data transferred under this section except to achieve the purpose for which it was collected.

(3) A data controller that transfers personal data under this section shall remain liable to the data subject for the actions of the data processor.

11. Security of personal data and duty of confidentiality. – (1) No person shall collect, receive, store, process or otherwise handle any personal data without implementing measures, including, but not restricted to, technological, physical and administrative measures, adequate to secure its confidentiality, secrecy, integrity and safety, including from theft, loss, damage or destruction.

(2) Data controllers and data processors shall be subject to a duty of confidentiality and secrecy in respect of personal data in their possession or control.

(3) Without prejudice to the provisions of this section, a data controller or data processor shall, if the confidentiality, secrecy, integrity or safety of personal data in its possession or control is violated by theft, loss, damage or destruction, or as a result of any disclosure contrary to the provisions of this Act, or for any other reason whatsoever, notify the data subject, in such form and manner as may be prescribed, forthwith.

12. Regulation of disclosure of personal data. – Subject to section 10, section 13 and section 14, no person shall disclose, or otherwise cause any other person to receive, the content or nature of any personal data that has been collected in conformity with this Act.

13. Disclosure of personal data with prior informed consent. – (1) Subject to sub-section (2), a data controller or data processor seeking to disclose personal data under this section shall, prior to its disclosure, obtain the consent of the data subject.

(2) Prior to a disclosure of personal data under this section, the data controller or data processor, as the case may be, seeking to disclose the personal data, shall inform the data subject of the following details in respect of his personal data, namely: –

(a) when it will be disclosed;

(b) the purpose of its disclosure;

(c) the security practices, privacy policies and other policies, if any, that will protect it; and

(d) the procedure for recourse in case of any grievance in relation to it.

14. Disclosure of personal data without prior consent. – (1) Subject to sub-section (2), personal data may be disclosed without the prior consent of the data subject if it is necessary –

(a) to prevent a reasonable threat to national security, defence or public order; or

(b) to prevent, investigate or prosecute a cognisable offence.

(2) No data controller or data processor shall disclose any personal data unless it has received an order in writing from a police officer not below the rank of [___] in such form and manner as may be prescribed:

Provided that an order for the disclosure of personal data made under this sub-section shall not require the disclosure of any personal data that is not necessary to achieve the purpose for which the disclosure is sought:

Provided further that the data subject shall be notified, in such form and manner as may be prescribed, of the disclosure of his personal data, including details of its content and nature, and the identity of the police officer who ordered its disclosure, forthwith.

15. Quality and accuracy of personal data. – (1) Each data controller and data processor shall, to the extent possible, ensure that the personal data in its possession or control, is accurate and, where necessary, is kept up to date.

(2) No data controller or data processor shall deny a data subject whose personal data is in its possession or control the opportunity to review his personal data and, where necessary, rectify anything that is inaccurate or not up to date.

(3) A data subject may, if he finds personal data in the possession or control of a data controller or data processor that is not necessary to achieve the purpose for which it was collected, received or stored, demand its destruction, and the data controller shall destroy, or cause the destruction of, the personal data forthwith.

16. Special provisions for sensitive personal data. – Notwithstanding anything contained in this Act and the provisions of any other law for the time being in force –

(a) no person shall store sensitive personal data for a period longer than is necessary to achieve the purpose for which it was collected or received, or, if that purpose has been achieved or ceases to exist for any reason, for any period following such achievement or cessation;

(b) no person shall process sensitive personal data for a purpose other than the purpose for which it was collected or received;

(c) no person shall disclose to another person, or otherwise cause any other person to come into the possession or control of, the content or nature of any sensitive personal data, including any other details in respect thereof.

CHAPTER IV

The Data Protection Authority

17. Constitution of the Data Protection Authority. – (1) The Central Government shall, by notification, constitute, with effect from such date as may be specified therein, a body to be called the Data Protection Authority consisting of a Chairperson and not more than four other Members, to exercise the jurisdiction and powers and discharge the functions and duties conferred or imposed upon it by or under this Act.

(2) The Chairperson shall be a person who has been a Judge of the Supreme Court:

Provided that the appointment of the Chairperson shall be made only after consultation with the Chief Justice of India.

(3) Each Member shall be a person of ability, integrity and standing who has special knowledge of, and professional experience of not less than ten years in, privacy law and policy.

18. Term of office, conditions of service, etc. of Chairperson and Members. – (1) Before appointing any person as the Chairperson or Member, the Central Government shall satisfy itself that the person does not, and will not, have any such financial or other interest as is likely to affect prejudicially his functions as such Chairperson or Member.

(2) The Chairperson and every Member shall hold office for such period, not exceeding five years, as may be specified in the order of his appointment, but shall be eligible for reappointment:

Provided that no person shall hold office as the Chairperson or Member after he has attained the age of sixty-seven years.

(3) Notwithstanding anything contained in sub-section (2), the Chairperson or any Member may –

(a) by writing under his hand resign his office at any time;

(b) be removed from office in accordance with the provisions of section 19 of this Act.

(4) A vacancy caused by the resignation or removal of the Chairperson or Member under sub-section (3) shall be filled by fresh appointment.

(5) In the event of the occurrence of a vacancy in the office of the Chairperson, such one of the Members as the Central Government may, by notification, authorise in this behalf, shall act as the Chairperson till the date on which a new Chairperson, appointed in accordance with the provisions of this Act, to fill such vacancy, enters upon his office.

(6) When the Chairperson is unable to discharge his functions owing to absence, illness or any other cause, such one of the Members as the Chairperson may authorise in writing in this behalf shall discharge the functions of the Chairperson, till the date on which the Chairperson resumes his duties.

(7) The salaries and allowances payable to and the other terms and conditions of service of the Chairperson and Members shall be such as may be prescribed:

Provided that neither the salary and allowances nor the other terms and conditions of service of the Chairperson and any member shall be varied to his disadvantage after his appointment.

19. Removal of Chairperson and Members from office in certain circumstances. – The Central Government may remove from office the Chairperson or any Member, who –

(a) is adjudged an insolvent; or

(b) engages during his term of office in any paid employment outside the duties of his office; or

(c) is unfit to continue in office by reason of infirmity of mind or body; or

(d) is of unsound mind and stands so declared by a competent court; or

(e) is convicted for an offence which in the opinion of the President involves moral turpitude; or

(f) has acquired such financial or other interest as is likely to affect prejudicially his functions as a Chairperson or Member; or

(g) has so abused his position as to render his continuance in office prejudicial to the public interest.

20. Functions of the Data Protection Authority. – (1) The Chairperson may inquire, suo motu or on a petition presented to him by any person or by someone acting on his behalf, in respect of any matter connected with the collection, storage, processing, disclosure or other handling of any personal data and give such directions or pass such orders as are necessary, for reasons to be recorded in writing.

(2) Without prejudice to the generality of the foregoing provision, the Data Protection Authority shall perform all or any of the following functions, namely –

(a) review the safeguards provided by or under this Act and other law for the time being in force for the protection of personal data and recommend measures for their effective implementation;

(b) review any measures taken by any entity for the protection of personal data and take such further action as it deems fit;

(c) review any action, policy or procedure of any entity to ensure compliance with this Act and any rules made thereunder;

(d) formulate, in consultation with experts, norms for the effective protection of personal data;

(e) promote awareness and knowledge of personal data protection through any means necessary;

(f) undertake and promote research in the field of protection of personal data;

(g) encourage the efforts of non-governmental organisations and institutions working in the field of personal data protection;

(h) publish periodic reports concerning the incidence of collection, processing, storage, disclosure and other handling of personal data;

(i) such other functions as it may consider necessary for the protection of personal data.

(3) Subject to the provisions of any rules prescribed in this behalf by the Central Government, the Data Protection Authority shall have the power to review any decision, judgement, decree or order made by it.

(4) In the exercise of its functions under this Act, the Data Protection Authority shall give such directions or pass such orders as are necessary for reasons to be recorded in writing.

(5) The Data Protection Authority may, in its own name, sue or be sued.

21. Secretary, officers and other employees of the Data Protection Authority. – (1) The Central Government shall appoint a Secretary to the Data Protection Authority to exercise and perform, under the control of the Chairperson such powers and duties as may be prescribed or as may be specified by the Chairperson.

(2) The Central Government may provide the Data Protection Authority with such other officers and employees as may be necessary for the efficient performance of the functions of the Data Protection Authority.

(3) The salaries and allowances payable to and the conditions of service of the Secretary and other officers and employees of the Data Protection Authority shall be such as may be prescribed.

22. Salaries, etc. be defrayed out of the Consolidated Fund of India. – The salaries and allowances payable to the Chairperson and Members and the administrative expenses, including salaries, allowances and pension, payable to or in respect of the officers and other employees of the Data Protection Authority shall be defrayed out of the Consolidated Fund of India.

23. Vacancies, etc. not to invalidate proceedings of the Data Protection Authority. – No act or proceeding of the Data Protection Authority shall be questioned on the ground merely of the existence of any vacancy or defect in the constitution of the Data Protection Authority or any defect in the appointment of a person acting as the Chairperson or Member.

24. Chairperson, Members and employees of the Data Protection Authority to be public servants. – The Chairperson and Members and other employees of the Data Protection Authority shall be deemed to be public servants within the meaning of section 21 of the Indian Penal Code, 1860 (45 of 1860).

25. Location of the office of the Data Protection Authority. – The offices of the Data Protection Authority shall be in [___] or any other location as directed by the Chairperson in consultation with the Central Government.

26. Procedure to be followed by the Data Protection Authority. – (1) Subject to the provisions of this Act, the Data Protection Authority shall have powers to regulate –

(a) the procedure and conduct of its business;

(b) the delegation to one or more Members of such powers or functions as the Chairperson may specify.

(2) In particular and without prejudice to the generality of the foregoing provisions, the powers of the Data Protection Authority shall include the power to determine the extent to which persons interested or claiming to be interested in the subject-matter of any proceeding before it may be allowed to be present or to be heard, either by themselves or by their representatives or to cross-examine witnesses or otherwise take part in the proceedings:

Provided that any such procedure as may be prescribed or followed shall be guided by the principles of natural justice.

27. Power relating to inquiries. – (1) The Data Protection Authority shall, for the purposes of any inquiry or for any other purpose under this Act, have the same powers as vested in a civil court under the Code of Civil Procedure, 1908 (5 of 1908), while trying suits in respect of the following matters, namely –

(a) the summoning and enforcing the attendance of any person from any part of India and examining him on oath;

(b) the discovery and production of any document or other material object producible as evidence;

(c) the reception of evidence on affidavit;

(d) the requisitioning of any public record from any court or office;

(e) the issuing of any commission for the examination of witnesses; and,

(f) any other matter which may be prescribed.

(2) The Data Protection Authority shall have power to require any person, subject to any privilege which may be claimed by that person under any law for the time being in force, to furnish information on such points or matters as, in the opinion of the Data Protection Authority, may be useful for, or relevant to, the subject matter of an inquiry and any person so required shall be deemed to be legally bound to furnish such information within the meaning of section 176 and section 177 of the Indian Penal Code, 1860 (45 of 1860).

(3) The Data Protection Authority or any other officer, not below the rank of a Gazetted Officer, specially authorised in this behalf by the Data Protection Authority may enter any building or place where the Data Protection Authority has reason to believe that any document relating to the subject matter of the inquiry may be found, and may seize any such document or take extracts or copies therefrom subject to the provisions of section 100 of the Code of Criminal Procedure, 1973 (2 of 1974), in so far as it may be applicable.

(4) The Data Protection Authority shall be deemed to be a civil court and when any offence as is described in section 175, section 178, section 179, section 180 or section 228 of the Indian Penal Code, 1860 (45 of 1860) is committed in the view or presence of the Data Protection Authority, the Data Protection Authority may, after recording the facts constituting the offence and the statement of the accused as provided for in the Code of Criminal Procedure, 1973 (2 of 1974), forward the case to a Magistrate having jurisdiction to try the same and the Magistrate to whom any such case is forwarded shall proceed to hear the complaint against the accused as if the case had been forwarded to him under section 346 of the Code of Criminal Procedure, 1973 (2 of 1974).

28. Decisions of the Data Protection Authority. – (1) The decisions of the Data Protection Authority shall be binding.

(2) In its decisions, the Data Protection Authority has the power to –

(a) require an entity to take such steps as may be necessary to secure compliance with the provisions of this Act;

(b) require an entity to compensate any person for any loss or detriment suffered;

(c) impose any of the penalties provided under this Act.

29. Proceedings before the Data Protection Authority to be judicial proceedings. – The Data Protection Authority shall be deemed to be a civil court for the purposes of section 195 and Chapter XXVI of the Code of Criminal Procedure, 1973 (2 of 1974), and every proceeding before the Data Protection Authority shall be deemed to be a judicial proceeding within the meaning of section 193 and section 228 and for the purposes of section 196 of the Indian Penal Code, 1860 (45 of 1860).

CHAPTER IV

Regulation by Data Controllers and Data Processors

30. Co-regulation by Data Controllers and the Data Protection Authority. – (1) The Data Protection Authority may, in consultation with data controllers, formulate codes of conduct for the collection, storage, processing, disclosure or other handling of any personal data.

(2) No code of conduct formulated under sub-section (1) shall be binding on a data controller unless –

(a) it has received the written approval of the Data Protection Authority; and

(b) it has received the approval, by signature of a director or authorised signatory, of the data controller.

31. Co-regulation without prejudice to other remedies. – Any code of conduct formulated under this chapter shall be without prejudice to the jurisdiction, powers and functions of the Data Protection Authority.

32. Self-regulation by data controllers. – (1) The Data Protection Authority may encourage data controllers and data processors to formulate professional codes of conduct to establish rules for the collection, storage, processing, disclosure or other handling of any personal data.

(2) No code of conduct formulated under sub-section (1) shall be effective unless it is registered, in such form and manner as may be prescribed, by the Data Protection Authority.

(3) The Data Protection Authority shall, for reasons to be recorded in writing, not register any code of conduct formulated under sub-section (1) that is not adequate to protect personal data.

CHAPTER V

Surveillance and Interception of Communications

33. Surveillance and interception of communication to be warranted. – Notwithstanding anything contained in any other law for the time being in force, no –

(i) surveillance shall be carried out, and no person shall order any surveillance of another person;

(ii) communication shall be intercepted, and no person shall order the interception of any communication of another person; save in execution of a warrant issued under section 36, or an order made under section 38, of this Act.

34. Application for issuance of warrant. – (1) Any authorised officer seeking to carry out any surveillance or intercept any communication of another person shall prefer an application for issuance of a warrant to the Magistrate.

(2) The application for issuance of the warrant shall be in the form and manner prescribed in the Schedule and shall state the purpose for which the warrant is sought.

(3) The application for issuance of the warrant shall be accompanied by –

(i) a report by the authorised officer of the suspicious conduct of the person in respect of whom the warrant is sought, and all supporting material thereof;

(ii) an affidavit of the authorised officer, or a declaration under his hand and seal, that the contents of the report and application are true to the best of his knowledge, information and belief, and that the warrant shall be executed only for the purpose stated in the application and shall not be misused or abused in any manner including to interfere in the privacy of any person;

(iii) details of all warrants previously issued in respect of the person in respect of whom the warrant is sought, if any.

35. Considerations prior to the issuance of warrant. – (1) No warrant shall issue unless the requirements of section 34 and this section have been met.

(2) The Magistrate shall consider the application made under section 34 and shall satisfy himself that the information contained therein sets out –

(i) a reasonable threat to national security, defence or public order; or

(ii) a cognisable offence, the prevention, investigation or prosecution of which is necessary in the public interest.

(3) The Magistrate shall satisfy himself that all other lawful means to acquire the information that is sought by the execution of the warrant have been exhausted.

(4) The Magistrate shall verify the identity of the authorised officer and shall satisfy himself that the application for issuance of the warrant is authentic.

36. Issue of warrant. – (1) Subject to section 34 and section 35, the Magistrate may issue a warrant for surveillance or interception of communication, or both of them.

(2) The Magistrate may issue the warrant in Chambers.

37. Magistrate may reject application for issuance of warrant. – If the Magistrate is not satisfied that the requirements of section 34 and section 35 have been met, he may, for reasons to be recorded in writing, –

(i) refuse to issue the warrant and dispose of the application;

(ii) return the application to the authorised officer without disposing of it;

(iii) pass any order that he thinks fit.

38. Order by Home Secretary in emergent circumstances. – (1) Notwithstanding anything contained in section 35, if the Home Secretary of the appropriate government is satisfied that a grave threat to national security, defence or public order exists, he may, for reasons to be recorded in writing, order any surveillance or interception of communication.

(2) An authorised officer seeking an order for surveillance or interception of communication under this section shall prefer an application to the Home Secretary in the form and manner prescribed in the Schedule and accompanied by the documents required under sub-section (3) of section 34.

(3) No order for surveillance or interception of communication made by the Home Secretary under this section shall be valid upon the expiry of a period of seven days from the date of the order.

(4) Before the expiry of a period of seven days from the date of an order for surveillance or interception of communication made under this section, the authorised officer who applied for the order shall place the application before the Magistrate for confirmation.

39. Duration of warrant or order. – (1) The warrant or order for surveillance or interception of communication shall specify the period of its validity and, upon its expiry, all surveillance and interception of communication, as the case may be, carried out in relation to that warrant or order shall cease forthwith:

Provided that no warrant or order shall be valid upon the expiry of a period of sixty days from the date of its issue.

(2) A warrant issued under section 36, or an order issued under section 38, for surveillance or interception of communication, or both of them, may be renewed by a Magistrate if he is satisfied that the requirements of sub-section (2) of section 35 continue to exist.

40. Duty to inform the person concerned. – (1) Subject to sub-section (2), before the expiry of a period of sixty days from the conclusion of any surveillance or interception of communication carried out under this Act, the authorised officer who carried out the surveillance or interception of communication shall, in writing in such form and manner as may be prescribed, notify, with reference to the warrant of the Magistrate, and, if applicable, the order of the Home Secretary, each person in respect of whom the warrant or order was issued, of the fact of such surveillance or interception and duration thereof.

(2) The Magistrate may, on an application made by an authorised officer in such form and manner as may be prescribed, if he is satisfied that the notification under sub-section (1) would –

(a) present a reasonable threat to national security, defence or public order, or

(b) adversely affect the prevention, investigation or prosecution of a cognisable offence,

for reasons to be recorded in writing addressed to the authorised officer, order that the person in respect of whom the warrant or order of surveillance or interception of communication was issued, not be notified of the fact of such interception or the duration thereof.

41. Security and duty of confidentiality and secrecy. – (1) No person shall carry out any surveillance or intercept any communication of another person without implementing measures, including, but not restricted to, technological, physical and administrative measures, to secure the confidentiality and secrecy of all information obtained as a result of the surveillance or interception of communication, as the case may be, including from theft, loss or unauthorised disclosure.

(2) Any person who carries out any surveillance or interception of any communication, or who obtains any information, including personal data, as a result of surveillance or interception of communication, shall be subject to a duty of confidentiality and secrecy in respect of it.

(3) Every competent organisation shall, before the expiry of a period of one hundred days from the enactment of this Act, designate as many officers as it deems fit as Privacy Officers who shall be administratively responsible for all interceptions of communications carried out by that competent organisation.

42. Disclosure of information. – (1) Save as provided in this section, no person shall disclose to any other person, or otherwise cause any other person to come into the knowledge or possession of, the content or nature of any information, including personal data, obtained as a result of any surveillance or interception carried out under this Act.

(2) Notwithstanding anything contained in this section, if the disclosure of any information, including personal data, obtained as a result of any surveillance or interception of any communication is necessary to –

(a) prevent a reasonable threat to national security, defence or public order, or

(b) prevent, investigate or prosecute a cognisable offence,

an authorised officer may disclose the information, including personal data, to any authorised officer of any other competent organisation.

CHAPTER VI

Offences and penalties

43. Punishment for offences related to personal data. – (1) Whoever, except in conformity with the provisions of this Act, collects, receives, stores, processes or otherwise handles any personal data shall be punishable with imprisonment for a term which may extend to [___] years and may also be liable to fine which may extend to [___] rupees.

(2) Whoever attempts to commit any offence under sub-section (1) shall be punishable with the punishment provided for such offence under that sub-section.

(3) Whoever, except in conformity with the provisions of this Act, collects, receives, stores, processes or otherwise handles any sensitive personal data shall be punishable with imprisonment for a term which may extend to [increased for sensitive personal data] years and may also be liable to fine which may extend to [___] rupees.

(4) Whoever attempts to commit any offence under sub-section (3) shall be punishable with the punishment provided for such offence under that sub-section.

44. Abetment and repeat offenders. – (1) Whoever abets any offence punishable under this Act shall, if the act abetted is committed in consequence of the abetment, be punishable with the punishment provided for that offence.

(2) Whoever, having been convicted of an offence under any provision of this Act is again convicted of an offence under the same provision, shall be punishable, for the second and for each subsequent offence, with double the penalty provided for that offence.

45. Offences by companies. – (1) Where an offence under this Act has been committed by a company, every person who, at the time the offence was committed, was in charge of, and was responsible to, the company for the conduct of the business of the company, as well as the company, shall be deemed to be guilty of the offence and shall be liable to be proceeded against and punished accordingly:

Provided that nothing contained in this sub-section shall render any such person liable to any punishment, if he proves that the offence was committed without his knowledge or that he had exercised all due diligence to prevent the commission of such offence.

(2) Notwithstanding anything contained in sub-section (1), where any offence under this Act has been committed by a company and it is proved that the offence has been committed with the consent or connivance of, or is attributable to any neglect on the part of any director, manager, secretary or other officer of the company, such director, manager, secretary or other officer shall be deemed to be guilty of that offence, and shall be liable to be proceeded against and punished accordingly.

46. Cognisance. – Notwithstanding anything contained in the Code of Criminal Procedure, 1973 (2 of 1974), the offences under section 43, section 44 and section 45 shall be cognisable and non-bailable.

47. General penalty. – Whoever, in any case in which a penalty is not expressly provided by this Act, fails to comply with any notice or order issued under any provisions thereof, or otherwise contravenes any of the provisions of this Act, shall be punishable with fine which may extend to [___] rupees, and, in the case of a continuing failure or contravention, with an additional fine which may extend to [___] rupees for every day after the first during which he has persisted in such failure or contravention.

48. Punishment to be without prejudice to any other action. – The award of punishment for an offence under this Act shall be without prejudice to any other action which has been or which may be taken under this Act with respect to such contravention.

CHAPTER VII

Miscellaneous

49. Power to make rules. – (1) The Central Government may, by notification in the Official Gazette, make rules to carry out the provisions of this Act.

(2) In particular, and without prejudice to the generality of the foregoing power, such rules may provide for –

[__]

(3) Every rule made under this section shall be laid, as soon as may be after it is made, before each House of Parliament while it is in session for a period of thirty days which may be comprised in one session or in two successive sessions and if before the expiry of the session in which it is so laid or the session immediately following, both Houses agree in making any modification in the rule, or both Houses agree that the rule should not be made, the rule shall thereafter have effect only in such modified form or be of no effect, as the case may be, so however, that any such modification or annulment shall be without prejudice to the validity of anything previously done under that rule.

50. Bar of jurisdiction. – (1) On and from the appointed day, no court or authority shall have, or be entitled to exercise, any jurisdiction, powers or authority (except the Supreme Court and a High Court exercising powers under Article 32, Article 226 and Article 227 of the Constitution) in relation to matters specified in this Act.

(2) No order passed under this Act shall be appealable except as provided therein and no civil court shall have jurisdiction in respect of any matter which the Data Protection Authority is empowered by, or under, this Act to determine and no injunction shall be granted by any court or other authority in respect of any action taken or to be taken in pursuance of any power conferred by or under this Act.

51. Protection of action taken in good faith. – No suit or other legal proceeding shall lie against the Central Government, State Government, Data Protection Authority, Chairperson, Member or any person acting under the direction either of the Central Government, State Government, Data Protection Authority, Chairperson or Member in respect of anything which is in good faith done or intended to be done in pursuance of this Act or of any rules or any order made thereunder.

52. Power to remove difficulties. – (1) If any difficulty arises in giving effect to the provisions of this Act, the Central Government may, by order, published in the Official Gazette, make such provisions, not inconsistent with the provisions of this Act, as appears to it to be necessary or expedient for removing the difficulty:

Provided that no such order shall be made under this section after the expiry of a period of three years from the commencement of this Act.

(2) Every order made under this section shall be laid, as soon as may be after it is made, before each House of Parliament.

53. Act to have overriding effect. – The provisions of this Act shall have effect notwithstanding anything inconsistent therewith contained in any other law for the time being in force.

US Privacy FTC

by Prasad Krishna last modified Sep 30, 2013 06:41 AM

PDF document icon US-FTC Privacy Overview (India 2013).pdf — PDF document, 1628 kB (1668070 bytes)

A Privacy Meeting with the Federal Trade Commission in New Delhi

by Elonnai Hickok last modified Oct 03, 2013 10:25 AM
On September 20, the Centre for Internet and Society held a roundtable meeting with Betsy Broder, Counsel for International Consumer Protection, and Sarah Schroeder, Attorney, Bureau of Consumer Protection, Federal Trade Commission (FTC), United States. The meeting took place at the Imperial, Janpath, New Delhi and discussed both the U.S. approach to privacy and potential privacy frameworks and challenges in India.

As a note, thoughts shared during the meeting represented personal perspectives, and did not constitute the official position of the Federal Trade Commission.

When explaining the U.S. regulatory framework for privacy, the FTC attorneys highlighted that the United States does not have comprehensive privacy legislation like Europe’s, but instead has sectoral laws that address different aspects of privacy. For example, the Fair Credit Reporting Act maintains the confidentiality of consumer credit report information, the Gramm-Leach-Bliley Act imposes privacy and security requirements on financial institutions, HIPAA applies to patient health information, and the Children’s Online Privacy Protection Act restricts the collection and posting of personal information from minors. It was discussed that the sectoral model followed by the United States allows a nuanced balance to be struck between privacy protection and the market. It was noted, however, that some have critiqued the U.S. regulatory framework for lacking clear principles that apply to the commercial world and lay out strong privacy protections for the individual. In light of this, the White House is developing a Privacy Bill of Rights.

The Federal Trade Commission is an independent agency of the United States Government with responsibility for enforcing both consumer protection and competition laws. It is composed of five commissioners and a staff of roughly 1,000, which includes attorneys and economists. The FTC is primarily a law enforcement agency, but it also undertakes policy development through workshops and reports. Consumer education is another key function of the agency.

On the consumer protection side, Congress has directed the FTC to enforce the Federal Trade Commission Act, as well as some more specific statutes, such as those that protect consumers from unwanted telemarketing calls and protect children online. Its main objectives are to protect consumer interests and to prevent fraud and unfair and deceptive business practices. The FTC carries out its privacy work through its consumer protection mission.

In understanding the FTC’s role in relation to privacy, it is important to note that the FTC’s jurisdiction applies only to certain industries as defined by Congress. Thus, for example, the FTC does not have jurisdiction over banks or telecommunications.

The most critical part of the FTC’s activities is its law enforcement function. The FTC can investigate an organization if the staff believes that the entity may be involved in conduct that contravenes the FTC Act’s prohibition on unfair or deceptive practices, or another specific privacy law. The FTC has brought a number of privacy-related cases against major companies including Facebook, Google, ChoicePoint, and Twitter. Many of these cases address new challenges brought about by rapidly changing technologies.

The vast majority of the FTC’s actions have been settled with consent judgments. When the statute that the FTC enforces allows for the imposition of a civil penalty, the FTC sets the penalty at a level that ensures that it is fair and provides a deterrent, but will not impose a hardship on the company. As a civil enforcement agency, the FTC cannot seek criminal sanctions. While enforcement is the cornerstone of the FTC’s approach to privacy, the agency also supports self-regulation, where appropriate. In this system the FTC does not pre-approve an organization’s practices or define principles that all companies should abide by, as it is felt that every organization is unique and has different needs and abilities, and assigning specific technical standards may stifle innovation.

In the meeting it was also discussed how US privacy laws may apply to overseas companies where they provide services for US consumers or work on behalf of US companies. For example, under the Gramm-Leach-Bliley Act the FTC has created the Safeguards Rule, which speaks to how financial data held by financial institutions must be handled and protected. This Rule applies to companies overseas if the company is performing work for US companies or US consumers. In other words, a US company cannot avoid compliance by outsourcing its work to an offshore organization.

Discussions during the meeting also focused on consent and the key role that context, accessibility, and timing play in ensuring individuals have the ability to provide informed consent. Some of the attendees suggested that consent practices could be greatly improved in India. For example, currently in India there are companies that only provide consumers access to the company privacy policy after an individual has consented and signed up to the service. When asked about the challenges to privacy that exist in India, many shared that, culturally, there is a different understanding of privacy in India than in many western countries.

Other thoughts included that the Indian government is currently imagining privacy regulation as either fluid and purely self-regulatory or enforced through strict legal provisions. Instead, the government needs to expand the possibilities for a regulatory framework for privacy in India in a way that allows for both strong legal enforcement and flexible standards. The right to be forgotten was also discussed, and it was mentioned that California has proposed a law that would allow individuals to request deletion of information.

CPR South 1

by Prasad Krishna last modified Sep 30, 2013 10:58 AM

PDF document icon CPR South 1.pdf — PDF document, 221 kB (226687 bytes)

CPR South 2

by Prasad Krishna last modified Sep 30, 2013 11:17 AM

PDF document icon CPR South 2.pdf — PDF document, 163 kB (167757 bytes)

An Analysis of the Cases Filed under Section 46 of the Information Technology Act, 2000 for Adjudication in the State of Maharashtra

by Bhairav Acharya last modified Oct 01, 2013 03:29 PM
This is a brief review of some of the cases related to privacy filed under section 46 of the Information Technology Act, 2000 ("the Act") seeking adjudication for alleged contraventions of the Act in the State of Maharashtra.

Background

Section 46 of the Act grants the Central Government the power to appoint an adjudicating officer to hold an enquiry to adjudge, upon complaints being filed before that adjudicating officer, contraventions of the Act. The adjudicating officer may be of the Central Government or of the State Government [see section 46(1) of the Act], must have field experience with information technology and law [see section 46(3) of the Act] and exercises jurisdiction over claims for damages up to ₹5,00,00,000 [see section 46(1A) of the Act]. For the purpose of adjudication, the officer is vested with certain powers of a civil court [see section 46(5) of the Act] and must follow basic principles of natural justice while conducting adjudications [see section 46(2) of the Act]. Hence, the adjudicating officer appointed under section 46 is a quasi-judicial authority.

In addition, the quasi-judicial adjudicating officer may impose penalties, thereby vesting him with some of the powers of a criminal court [see section 46(2) of the Act], and award compensation, the quantum of which is to be determined after taking into account factors including unfair advantage, loss and repeat offences [see section 47 of the Act]. The adjudicating officer may impose penalties for any of the offences described in section 43, section 44 and section 45 of the Act; and, further, may award compensation for losses suffered as a result of contraventions of section 43 and section 43A. The text of these sections is reproduced in the Schedule below. Further law as to the appointment of the adjudicating officer and the procedure attendant on all adjudications was made by the Information Technology (Qualification and Experience of Adjudicating Officers and the Manner of Holding Enquiry) Rules, 2003.[1]

It is clear that the adjudicating officer is vested with significant judicial powers, including the power to enforce certain criminal penalties, and is an important quasi-judicial authority.

Excursus

At the outset, it is important to understand the distinction between compensation and damages. Compensation is a sum of money awarded by a civil court, before or along with the primary decree, to indemnify a person for injury or loss. It is usually awarded to a person who has suffered a monetary loss as a result of the acts or omissions of another party. Its quantification is usually guided by principles of equity. [See Shantilal Mangaldas AIR 1969 SC 634 and Ranbir Kumar Arora AIR 1983 P&H 431]. On the other hand, damages are punitive and, in addition to restoring an indemnitee to wholeness, may be imposed to deter an offender, punish exemplary offences, and recover consequential losses, amongst other objectives. Punitive damages, while not judicially popular in India, are usually imposed by a criminal court in common law jurisdictions. They are distinct from civil and equitable actions. [See the seminal case of The Owners of the Steamship Mediana [1900] AC 113 (HL)].

Unfortunately, section 46 of the Act uses the terms “damage”, “injury” and “compensation” interchangeably without regard for the long and rich jurisprudence that finds them to be different concepts.

The Cases related to Privacy

In the State of Maharashtra, there have been a total of 47 cases filed under section 46 of the Act. Of these, 33 cases have been disposed of by the Adjudicating Officer and 14 are currently pending disposal. [2] At least three of these cases before the Adjudicating Officer deal with issues related to privacy of communications and personal data. They are:

Case Title | Forum | Date
Vinod Kaushik v. Madhvika Joshi | Shri Rajesh Aggarwal, Adjudicating Officer and ex-officio Secretary, IT, Government of Maharashtra | 10.10.2011
Amit D. Patwardhan v. Rud India Chains | Shri Rajesh Aggarwal, Adjudicating Officer and ex-officio Secretary, IT, Government of Maharashtra | 15.04.2013
Nirmalkumar Bagherwal v. Minal Bagherwal | Shri Rajesh Aggarwal, Adjudicating Officer and ex-officio Secretary, IT, Government of Maharashtra | 26.08.2013

In all three cases the Adjudicating Officer was called upon to determine and penalise unauthorised access to the personal data of the complainants. In the Vinod Kaushik case, the complainants’ emails and chat sessions were accessed, copied and made available to the police for legal proceedings without the permission of the complainants. In the Amit Patwardhan and Nirmalkumar Bagherwal cases, the complainants’ financial information, in the form of bank account statements, was obtained from their respective banks without their consent and used against them in legal proceedings.

The Vinod Kaushik complaint was filed in 2010 for privacy violations committed between 2008 and 2009. The complaint was made against the complainant’s daughter-in-law, the respondent, who was estranged from her husband, the complainant’s son. The respondent had, independent of the proceedings before the Adjudicating Officer, instituted criminal proceedings alleging cruelty and dowry-related harassment against her estranged husband and the complainant. To support some of the claims made in the criminal proceedings, the respondent accessed the email accounts of her estranged husband and the complainant and printed copies of certain communications, both emails and chat transcripts. The complaint to the Adjudicating Officer related to these emails and chat transcripts, which were obtained without the consent and knowledge of the complainant and his son.

On 09.08.2010, the then Adjudicating Officer dismissed the complaint after finding that, owing to the marriage between the respondent and the complainant’s son, there was a relationship of mutual trust that resulted in the complainant and his son consensually sharing their email account passwords with the respondent. This ruling was appealed to the Cyber Appellate Tribunal ("CyAT") which, in a decision of 29.06.2011, found irregularities in the complainant’s son’s privity to the proceedings and remanded the complaint to the Adjudicating Officer for re-adjudication. The re-adjudication, conducted by Shri Rajesh Aggarwal as Adjudicating Officer, resulted in a final order of 10.10.2011 ("the final order") that is the subject of this analysis.

The final order found that the respondent had violated the privacy of the complainant and his son by her unauthorised access of their email accounts and sharing of their private communications. However, the Adjudicating Officer found that the intent behind the unauthorised access, namely to obtain evidence to support a criminal proceeding, was mitigatory, and hence ordered the respondent to pay only a small token amount in compensation, payable not to the complainants but to the State Treasury. The Delhi High Court, which was moved in appeal because the CyAT was non-functional, upheld the final order in its decision of 27.01.2012.

The Amit Patwardhan complaint was filed against the complainant’s ex-employer, the respondent, for illegally obtaining copies of the complainant’s bank account statement. The complainant had left the employ of the respondent to work for a competing company, but not before colluding with that company to divert the respondent’s customers to it. For redress, the respondent filed suit for a decree of compensation and led the complainant’s bank statements in evidence to prove unlawful gratification. Since the bank statements were obtained electronically by the respondent without the complainant’s consent, the jurisdiction of the Adjudicating Officer was invoked. In his order of 15.04.2013, Shri Rajesh Aggarwal, the Adjudicating Officer, found that the respondent had violated the complainant’s privacy by unlawfully obtaining his bank account statements, which constitute sensitive personal data. The Adjudicating Officer astutely applied the equitable doctrine of clean hands to deny compensation to the complainant; however, because the complainant’s bank was not a party to the complaint, the Adjudicating Officer was unable to rule on the bank’s failure to protect the sensitive personal data of its depositors.

The Nirmalkumar Bagherwal complaint bears a few similarities to the preceding two cases. Like the Vinod Kaushik matter, the issue concerned the manner in which a wife, estranged but still legally married, accessed electronic records of personal data of the complainants; and, like the Amit Patwardhan matter, the object of the privacy violation was the bank account statements of the complainants, which constitute sensitive personal data. The respondent was the estranged wife of one of the complainants who, along with his complainant father, managed the third complainant company. To support her claim for maintenance from the complainant and his family in an independent legal proceeding, the respondent obtained certain bank account statements of the complainants without their consent and, possibly, with the collusion of the respondent bank. After reviewing relevant law from the European Union and the United States, mindful of relevant sectoral regulations applicable in India including the relevant Master Circular of the Reserve Bank of India, and further noting preceding consumer case law on the subject, the Adjudicating Officer issued an order on 26.08.2013. The order found that the complainant’s right to privacy was violated by both respondents but, while determining the quantum of compensation, distinguished between the respondents in respect of the degree of liability; the respondent wife was ordered to pay a token compensation amount while the respondent bank was ordered to pay higher compensation to each of the three complainants individually.

The high quality of each of the three orders bears specific mention. Despite the superb quality of the judgments of the Indian higher judiciary in the decades after independence, the overall quality of judgment-writing appears to have declined. [3] In the last decade, several Indian judges have called for higher standards of judgment-writing from their fellow judges. [4] Against this background, it is notable that Shri Rajesh Aggarwal, despite not being a member of the judiciary, has delivered well-reasoned, articulate and clear orders that are cognisant of legal issues and also easily understandable to a non-legal reader.

In each of these cases, the Adjudicating Officer has successfully navigated around the fact that none of the primary parties were interacting and transacting at arm’s length. In the Vinod Kaushik and Nirmalkumar Bagherwal matters, the primary parties were estranged but still legally married partners and in the Amit Patwardhan matter the parties were in an employer-employee relationship. The first Adjudicating Officer in the Vinod Kaushik matter failed, in his order of 09.08.2010, to appreciate that the individual communications of individual persons were privileged by an expectation of privacy, regardless of their relationship. Hence, despite acknowledging that the marital partners in that matter were in conflict with each other, and despite being told by one party that the other party’s access to those private communications was made without consent, the Adjudicating Officer allowed his non-judicial opinion of marriage to influence his order. This mistake was corrected when the matter was remanded for re-adjudication. In the re-adjudication, the new Adjudicating Officer correctly noted that the respondent wife could have chosen to approach the police or a court to follow the proper investigative procedure for accessing emails and other private communications of another person and that her unauthorised use of the complainant’s passwords amounted to a violation of their privacy.

Popular conceptions of different types of relationships may affect the (quasi) judicial imagination of privacy. In comparison to the Vinod Kaushik matter, the Nirmalkumar Bagherwal and Amit Patwardhan matters both dealt with unauthorised access to bank account statements, by a wife and by an ex-employer respectively. In any event, the same Adjudicating Officer presided over all three matters and correctly found that the facts in all three matters disclosed contraventions of the privacy of the complainants. Whether the first Adjudicating Officer in the Vinod Kaushik matter would have applied the same standard of family unity to unauthorised access of bank account statements by an estranged wife who was seeking maintenance remains untested. However, the reliance placed on the decision of the Delhi State Consumer Protection Commission in the matter of Rupa Mahajan Pahwa, [5] where the Commission found that unauthorised access to a bank pass book by an estranged husband violated the privacy of the wife, would suggest that judges clothe financial information with a higher standard of privacy than that given to emails.

Emails are a form of electronic communication. The PUCL case (Supreme Court of India, 1996),[6] while it did not explicitly deal with the standard of protection accorded to emails, held that personal communications were protected by an individual right to privacy emanating from the protection of personal liberty guaranteed under Article 21 of the Constitution of India. Following the Maneka Gandhi case (Supreme Court of India, 1978),[7] it is settled that persons may be deprived of their personal liberty only by a just, fair and reasonable procedure established by law. As a result, interceptions of private communications that are protected by Article 21 may only be conducted in pursuance of such a procedure. This procedure exists in the form of the Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009, which came into effect on 27 October 2009 ("the Interception Rules"). The Interception Rules set out a regime for accessing private emails in certain conditions. The powers and procedure of Section 91 of the Code of Criminal Procedure ("CrPC") may also apply to obtain data at rest, such as emails stored in an inbox or sent-mail folder.

Finally, the orders of the Adjudicating Officer reveal a well-reasoned and progressive understanding of the law and principles relating to the quantification of compensation. By choosing to impose larger amounts of compensation on the bank that violated the privacy of the complainant in the Nirmalkumar Bagherwal matter, the Adjudicating Officer has indicated that institutions that hold sensitive personal data, such as financial information, are subject to a higher duty of care in relation to it. Most importantly, the act of imposing monetary compensation for privacy violations is a step forward because, for the first time in India, it recognises that privacy violations are civil wrongs or injuries that demand compensation.


[1]. These Rules were issued vide GSR 220(E), dated 17 March 2003 and published in the Gazette of India, Extraordinary, Part II, Section 3(i). These Rules can be accessed here – http://it.maharashtra.gov.in/PDF/Qual_ExpAdjudicatingOfficer_Manner_of_Holding_Enquiry_Rules.PDF (visited on 30 September 2013).

[2]. These cases and statistics may be viewed here – http://it.maharashtra.gov.in/1089/IT-Act-Judgements (visited on 30 September 2013).

[3]. See generally, Upendra Baxi, “‘The Fair Name of Justice’: The Memorable Voyage of Chief Justice Chandrachud” in A Chandrachud Reader (Justice V. S. Deshpande ed., Delhi: Documentation Centre etc., 1985); and Rajeev Dhavan, “Judging the Judges” in Judges and the Judicial Power: Essays in Honour of Justice V. R. Krishna Iyer (Rajeev Dhavan and Salman Khurshid eds., London: Sweet & Maxwell, 1985).

[4]. See generally, Justice B. G. Harindranath, Art of Writing Judgments (Bangalore: Karnataka Judicial Academy, 2004); Justice T. S. Sivagnanam, The Salient Features of the Art of Writing Orders and Judgments (Chennai: Tamil Nadu State Judicial Academy, 2010); and, Justice Sunil Ambwani, “Writing Judgments: Comparative Models”, Presentation at the National Judicial Academy, Bhopal (2006), available here – http://districtcourtallahabad.up.nic.in/articles/writing%20judgment.pdf (visited on 29 Sep 2013).

[5]. Appeal No. FA-2008/659 of the Delhi State Consumer Protection Commission, decided on 16 October 2008.

[6]. (1997) 1 SCC 301.

[7]. (1978) 1 SCC 248.

CIS Cybersecurity Series (Part 11) - Anja Kovacs

by Purba Sarkar last modified Oct 15, 2013 03:25 PM
CIS interviews Anja Kovacs, researcher and activist, and director of the Internet Democracy Project, as part of the Cybersecurity Series.

"Having the cyber security debate become more and more important was a real challenge for civil society. I think in part because many of us who were focused on human rights aren't necessarily techies. And so, when you have a conversation with a government bureaucrat, and ask questions about the kind of decisions they decided to take, very often they will come up with a technical answer in response. And then, if you don't have that expertise, it is difficult to react. In the meantime though, I think it has become clear that this is one of the biggest issues in the internet field at the moment. It is also one of the big issues that is driving the desires of governments to have a bigger role to play in internet governance. So it is an area that is unavoidable for activists. What has happened slowly is that we have come to realize that the first thing, as in most other areas, is not the technical details, but principles, and those principles are fairly similar to how they are in many other fields." - Anja Kovacs, Internet Democracy Project

Centre for Internet and Society presents its eleventh installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

In this installment, CIS speaks to Anja Kovacs, director of the Internet Democracy Project. Her work focuses on a wide range of questions regarding freedom of expression, cybersecurity and the architecture of Internet governance as they relate to the Internet and democracy. Anja is currently also a member of the Investment Committee of the Digital Defenders Partnership and of the interim Steering Group of Best Bits, a global network of civil society members.

(Bio from internetdemocracy.in) 

Internet Democracy Project homepage: http://internetdemocracy.in/

 

This work was carried out as part of the Cyber Stewards Network with the aid of a grant from the International Development Research Centre, Ottawa, Canada.


The India Privacy Monitor Map

by Maria Xynou last modified Oct 09, 2013 04:26 PM
The Centre for Internet and Society has started the first Privacy Watch in India! Check out our map which includes data on the UID, NPR and CCTNS schemes, as well as on the installation of CCTV cameras and the use of drones throughout the country.
The India Privacy Monitor Map

by gruntzooki on flickr

In a country of twenty-eight diverse states and seven union territories, it has remained unclear to what extent surveillance, biometric and other privacy-intrusive schemes are being implemented. We are addressing this by mapping out data for every state in India on the UID, CCTNS and NPR schemes, as well as on the installation of CCTV cameras and the use of Unmanned Aerial Vehicles (UAVs), otherwise known as drones.

In particular, the map in its current format includes data on the following:

UID: The Unique Identification Number (UID), also known as AADHAAR, is a 12-digit unique identification number which the Unique Identification Authority of India (UIDAI) is currently issuing for all residents in India (on a voluntary basis). Each UID is stored in a centralised database and linked to the basic demographic and biometric information of each individual. The UIDAI and AADHAAR currently lack legal backing.

NPR: Under the National Population Register (NPR), the demographic data of all residents in India is collected on a mandatory basis. The Unique Identification Authority of India (UIDAI) supplements the NPR with the collection of biometric data and the issue of the AADHAAR number.

CCTV: Closed-circuit television cameras which can produce images or recordings for surveillance purposes.

UAV: Unmanned Aerial Vehicles (UAVs), otherwise known as drones, are aircraft without a human pilot on board. The flight of a UAV is controlled either autonomously by computers in the vehicle or remotely by a pilot on the ground or in another vehicle. UAVs are used for surveillance purposes.

CCTNS: The Crime and Criminal Tracking Networks and Systems (CCTNS) is a nationwide networking infrastructure for enhancing efficiency and effectiveness of policing and sharing data among 14,000 police stations across India.

Our India Privacy Monitor Map can be viewed through the following link: http://cis-india.org/cisprivacymonitor

This map is part of on-going research and will hopefully expand to include other schemes and projects which are potentially privacy-intrusive. We encourage all feedback and additional data!

Interview with Big Brother Watch on Privacy and Surveillance

by Maria Xynou last modified Oct 15, 2013 02:24 PM
Maria Xynou interviewed Emma Carr, the Deputy Director of Big Brother Watch, on privacy and surveillance. View this interview and gain insight into why we should all "have something to hide"!

For all those of you who haven't heard of Big Brother Watch, it's a London-based campaign group which was founded in 2009 to protect individual privacy and defend civil liberties.

Big Brother Watch was set up to challenge policies that threaten our privacy, our freedoms and our civil liberties, and to expose the true scale of the surveillance state. The campaign group has produced unique research exposing the erosion of civil liberties in the UK, looking at the dramatic expansion of surveillance powers, the growth of the database state and the misuse of personal information. Big Brother Watch campaigns to give individuals more control over their personal data, and hold to account those who fail to respect our privacy, whether private companies, government departments or local authorities.

Emma Carr joined Big Brother Watch as Deputy Director in February 2012 and has since been regularly quoted in the UK press. The Centre for Internet and Society interviewed Emma Carr on the following questions:

  1. How do you define privacy?

  2. Can privacy and freedom of expression co-exist? Why/Why not?

  3. What is the balance between Internet freedom and surveillance?

  4. According to your research, most people worldwide care about their online privacy – yet they give up most of it through the use of social networking sites and other online services. Why, in your opinion, does this occur and what are the potential implications?

  5. Should people have the right to give up their right to privacy? Why/Why not?

  6. What implications on human rights can mass surveillance potentially have?

  7. “I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally.” Please comment.

  8. Do we have Internet freedom?

 

VIDEO

Interview with Bruce Schneier - Internationally Renowned Security Technologist

by Maria Xynou last modified Oct 17, 2013 08:54 AM
Maria Xynou recently interviewed Bruce Schneier on privacy and surveillance. View this interview and gain insight into why we should all "have something to hide"!

Bruce Schneier is an internationally renowned security technologist, called a "security guru" by The Economist.

He is the author of 12 books -- including Liars and Outliers: Enabling the Trust Society Needs to Survive -- as well as hundreds of articles, essays, and academic papers. His influential newsletter "Crypto-Gram" and his blog "Schneier on Security" are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press.

Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation's Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Security Futurologist for BT -- formerly British Telecom.

The Centre for Internet and Society (CIS) interviewed Bruce Schneier on the following questions:

  1. Do you think India needs privacy legislation? Why/ Why not?

  2. The majority of India's population lives below the poverty line and barely has any Internet access. Is surveillance an elitist issue or should it concern the entire population of the country? Why/ Why not?

  3. “I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally.” Please comment.

  4. Can free speech and privacy co-exist? What is the balance between privacy and freedom of expression?

  5. Should people have the right to give up their right to privacy? Why/ Why not?

  6. Should surveillance technologies be treated as traditional arms/weapons? Why/ Why not?

  7. How can individuals protect their data (and themselves) from spyware, such as FinFisher?

  8. How would you advise young people working in the surveillance industry?

VIDEO

Interview with the Tactical Technology Collective on Privacy and Surveillance

by Maria Xynou last modified Oct 18, 2013 09:56 AM
The Centre for Internet and Society recently interviewed Anne Roth from the Tactical Technology Collective in Berlin. View this interview and gain insight into why we should all "have something to hide"!

For all those of you who haven't heard of the Tactical Technology Collective, it's a Berlin and Bangalore-based non-profit organisation which aims to advance the skills, tools and techniques of rights advocates, empowering them to use information and communications to help marginalised communities understand and effect progressive social, environmental and political change.

Tactical Tech's Privacy & Expression programme builds the digital security awareness and capacity of human rights defenders, independent journalists, anti-corruption advocates and activists. The programme's activities range from awareness-raising comic films aimed at audiences new to digital security issues, to direct training and materials for high-risk defenders working in some of the world's most repressive environments.

Anne Roth works with Tactical Tech on the Privacy & Expression programme as a researcher and editor. Anne holds a degree in political science from the Free University of Berlin. She cofounded one of the first interactive media activist websites, Indymedia, in Germany in 2001 and has been involved with media activism and various forms of activist online media ever since. She has worked as a web editor and translator in the past. Since 2007 she has written a blog that covers privacy, surveillance, media, net politics and feminist issues.

The Centre for Internet and Society interviewed Anne Roth on the following questions:

  1. How do you define privacy?

  2. Can privacy and freedom of expression co-exist? Why/ Why not?

  3. What is the balance between Internet freedom and surveillance?

  4. According to research, most people worldwide care about their online privacy – yet they give up most of it through the use of social networking sites and other online services. Why, in your opinion, does this occur and what are the potential implications?

  5. Should people have the right to give up their right to privacy? Why/ Why not?

  6. What implications on human rights can mass surveillance potentially have?

  7. “I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally”. Please comment.

  8. Do we have Internet freedom?

VIDEO

Tweets from Bali IGF 2013

by Pranesh Prakash last modified Oct 28, 2013 09:09 AM
CIS is logging all tweets containing the words "igf2013", "igf13", "igf", "bestbits", and "genderit" during the Internet Governance Forum taking place in Bali this week, and making them available in downloadable files.

To enable research by those who don't want to mess around with Twitter's APIs, we are making CSV files available to the general public. These files can be opened in any spreadsheet software (including web-based ones), or even in a text editor.

These files will be updated with the latest version at the end of each evening in Bali.

If you have any ideas as to other keywords we should capture, or about visualizations that we should engage in, do get in touch with pranesh AT cis-india DOT org.
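For researchers who prefer scripting over spreadsheets, the dumps can also be processed programmatically. The exact column layout of the published files is not specified here, so the snippet below is only a sketch: it assumes a header row with a `text` column (and, for illustration, `created_at` and `user` columns), which may differ from the real files.

```python
import csv


def filter_tweets(path, keyword):
    """Yield CSV rows whose tweet text mentions `keyword`, case-insensitively.

    Assumes the file has a header row with a `text` column holding the
    tweet body; the actual column names in the published dumps may differ.
    """
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if keyword.lower() in row.get("text", "").lower():
                yield row


# Hypothetical usage against a downloaded dump:
# rows = list(filter_tweets("igf2013_tweets.csv", "bestbits"))
# print(len(rows))
```

Because the files are plain CSV, the same approach works from any language with a CSV reader, or from within spreadsheet software via its filter functions.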

Open Letter to Members of the European Parliament of the Civil Liberties, Justice and Home Affairs Committee

by Elonnai Hickok last modified Oct 23, 2013 05:00 AM
An open letter was sent to the Members of the European Parliament of the Civil Liberties, Justice and Home Affairs Committee on the proposed EU Regulation. The letter was a part of an initiative that Privacy International and a number of other NGOs are undertaking.

Dear Members of the European Parliament of the Civil Liberties, Justice and Home Affairs Committee,

On behalf of The Centre for Internet and Society, Bangalore, India,  we are writing to express our support of the European Commission’s proposed General Data Protection Regulation (COM (2012) 11).

The legal framework established under the 1995 Data Protection Directive (95/46/EC) in Europe has positively influenced many existing privacy regimes worldwide, serving as a model legal framework in jurisdictions that are in the process of developing privacy regimes, including India. The positive impact of the Data Protection Directive shows the potential of the Regulation to become a global model for the protection of personal data. The Regulation seeks to address new scenarios that have arisen in the context of rapidly changing technologies and practices, increasing its potential for positively influencing privacy rights for individuals globally.

India is currently in the process of considering the enactment of privacy legislation, in part with the aim of ensuring adequate safeguards to enable and enhance information flows into India from countries around the world, including Europe. At the same time, India is seeking Data Secure Status from the EU on the basis of its current regime.

It is clear that the EU framework for data protection has a major influence on the current and emerging privacy regime in India. India is only one country of many that are in the beginning stages of developing a comprehensive privacy regime. Thus, we ask that you keep in mind how the Regulation will impact the rights of individuals in countries outside of Europe, particularly in countries that are in the process of developing privacy regimes.

We ask that you take into consideration the four following points that we believe need to be addressed in the Regulation to help ensure adequate protection of the rights of individuals in the European Union and around the world.

  1. Strengthen the principle of purpose limitation: The Regulation should incorporate a strong purpose limitation principle that strictly limits present and future uses of personal data to the purposes for which it was originally collected. Currently, Article 6(4) allows for the further processing of data when the processing is “not compatible with the one for which the personal data have been collected”. Though the provision establishes legal requirements, one of which must be met before information can be used for a further purpose, this has proven insufficient under the existing Directive. The current provision in the Regulation dilutes the principle of purpose limitation and weakens an individual’s ability to make informed decisions about their personal data.
  2. Define principles for interpretation of broad terms: The Regulation should create principles for interpreting broad terms such as “legitimate interest” and “public interest”. These vague terms are used throughout the Regulation and create the potential for loopholes or abuse. Because these terms can be interpreted in many different ways, it is important to create a set of principles to guide their interpretation by data protection authorities and courts, to avoid inconsistent application and enforcement of the Regulation.
  3. Clarify the scope of the Regulation: The Regulation should clearly describe the jurisdictional scope and reach of its provisions. Currently, Article 3(1) states that the Regulation will apply to the processing of data “in the context of the activities of an establishment of a controller or a processor in the Union”. The flow of information in the online environment, coupled with trends such as cloud computing, outsourcing and cross-border business, makes it difficult to define what constitutes the “context of the activities of an establishment”, and could lead to situations where personal data is not protected because its collection, use or storage does not fall within that context.
  4. Address access by foreign intelligence bodies: In light of growing demands by law enforcement for access to, use of, and transfer of personal information for investigative purposes across jurisdictions, the Regulation should define the circumstances in which personal data protected by its provisions can be accessed and used by foreign intelligence bodies, and the procedure for doing so. The Regulation should address challenges such as access by foreign intelligence bodies to data stored on the cloud and to data that has passed through, or is stored on, foreign networks and servers.

Interview with Dr. Alexander Dix - Berlin Data Protection and Freedom of Information Commissioner

by Maria Xynou last modified Nov 06, 2013 09:29 AM
Maria Xynou recently interviewed Berlin's Data Protection and Freedom of Information Commissioner, Dr. Alexander Dix. View this interview and gain insight into recommendations for better data protection in India!

Dr. Alexander Dix has been Berlin's Data Protection and Freedom of Information Commissioner since June 2005. He has more than 26 years of practical experience in German data protection authorities and previously served as Commissioner for the state of Brandenburg for seven years.

Dr. Dix is a specialist in telecommunications and media and has dealt with a number of issues regarding the cross-border protection of citizen’s privacy. He chairs the International Working Group on Data Protection in Telecommunications (“Berlin Group”) and is a member of the Article 29 Working Party of European Data Protection Supervisory Authorities. In this Working Party he represents the Data Protection Authorities of the 16 German States (Länder).

A native of Bad Homburg, Hessen, Dr. Alexander Dix graduated from Hamburg University with a degree in law in 1975. He received a Master of Laws degree from the London School of Economics and Political Science in 1976 and a Doctorate in law from Hamburg University in 1984. He has published extensively on issues of data protection and freedom of information. Inter alia he is a co-editor of the German Yearbook on Freedom of Information and Information Law.

The Centre for Internet and Society interviewed Dr. Alexander Dix on the following questions:

  1. What activities and functions does the Berlin data commissioner's office undertake?

  2. What powers does the Berlin data commissioner's office have? In your opinion, are these sufficient? Which powers have been most useful? If there is a lack, what would you feel is needed?

  3. How is the office of the Berlin Data Protection Commissioner funded?

  4. What is the organisational structure at the Office of the Berlin Data Protection Commissioner and the responsibilities of the key executives?

  5. If India creates a Privacy Commissioner, what structure/framework would you suggest for the office?

  6. What challenges has your office faced?

  7. What is the most common type of privacy violation that your office is faced with?

  8. Does your office differ from other EU data protection commissioner offices?

  9. How do you think data should be regulated in India?

  10. Do you support the idea of co-regulation or self-regulation?

  11. How can India protect its citizens' data when it is stored in foreign servers?

VIDEO

An Interview with Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party

by Elonnai Hickok last modified Oct 25, 2013 04:50 AM
The Centre for Internet and Society interviewed Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party.

What activities and functions does your office undertake?

The activities and functions of the Dutch data protection authority can roughly be divided into four categories: supervisory activities, advising on draft legislation, raising awareness, and international tasks.

The Dutch DPA supervises the legislation applicable in the Netherlands with regard to the use of personal data. The most important law is the Dutch Data Protection Act, but the Dutch DPA also supervises for example the Acts governing data processing by police and justice as well as parts of the Telecoms Act.

The supervisory activities mainly consist of investigating, ex officio, violations of the law, with a focus on violations that are serious, structural and impact a large number of people. Where necessary, the Dutch DPA can use its sanctioning powers, including imposing a conditional fine, to enforce the law. The Dutch DPA can also decide to examine sector-wide codes of conduct that are submitted to it and provide its views in the form of a formal opinion.

In addition to investigations, the Dutch DPA advises the government, and sometimes the parliament, on draft legislation related to the processing of personal data. Following the Data Protection Act, the government is obliged to submit both primary and secondary legislation related to data processing to the DPA for advice.

As regards awareness-raising, in addition to publishing the results of its investigations, its views on codes of conduct and its advice on legislation, the Dutch DPA also issues guidelines, on its own initiative, explaining legal norms. Via its websites, the Dutch DPA provides information to both data subjects and controllers on how data can and cannot be processed. Specifically for data subjects, self-empowerment tools – including standard letters for exercising their rights – are made available. Furthermore, data subjects can contact the Dutch DPA daily via a telephone hotline.

Last but not least, the Dutch DPA participates in several International and European fora, including the Article 29 Working Party of which I am the Chair, the European and the International Conference of data protection and privacy commissioners, of whose Executive Committee I am also the Chair.

What powers does your office have? In your opinion, are these sufficient? Which powers have been most useful? If there is a lack, what do you feel is needed?

The Dutch DPA has a broad range of investigative powers, including the power to order the controller to hand over all relevant information and to enter the premises of the controller unannounced. All organisations subject to the supervision of the Dutch DPA are obligated to cooperate.

The Dutch DPA also has a considerable range of sanctioning powers: it can, for example, order the suspension or termination of certain processing operations and can impose a conditional fine. Currently a bill is before Parliament to provide the Dutch DPA with fining powers as well.

Once the bill providing the Dutch DPA with fining powers is passed, I feel the powers will be sufficient, giving us all the necessary enforcement tools to ensure compliance with the law.

How is your office funded?

The Dutch DPA is funded by the government, which, together with the parliament, determines the budget for the next year. The budget is drafted on the basis of a proposal from the Dutch DPA.

What is the organizational structure of your office and the responsibilities of the key executives?

The Dutch DPA consists of a college of commissioners and a supporting Secretariat, which comprises six departments and is headed by the Director. The Dutch DPA has two supervision departments – one for the private and one for the public sector – as well as a legal department, a communications department, an international department and a department providing operational support.

If India creates a framework of co-regulation, how would you suggest the overseeing body be structured?

Considering the many differences between India and the Netherlands – and Europe – this is a very hard question to answer. But whatever construction is chosen in India, it is of utmost importance to guarantee the independence of the supervisory authority or authorities, which should be provided with sufficient and scalable powers to sanction violations.

What legal challenges has your office faced?

The biggest legal challenge we face at the moment is the new European legal framework currently being discussed. It is as yet uncertain whether and when this will enter into force, but it is clear that it will bring new challenges for our office.

What are the main differences between your offices?

Generally, I think that the differences between my office and the UK and Canadian offices mostly stem from our different legal and cultural backgrounds, especially the difference between the common law and codified law systems.

In addition, the norms and powers differ per supervisory authority. The Dutch DPA for example can enter a building without prior notice, while the ICO, if I understand correctly, can only enter with the consent of the supervised organisation.

However, I prefer to look at the similarities and the possibilities to overcome our differences, because I think we all feel that providing a high level of data protection and ensuring user control are our main priorities.

Naturally, I am very curious to hear from Christopher and Chantal as well.

What are the most recent privacy developments for each of your respective offices?

The technological developments of the past decades, and the increasing use of smartphones and tablets, have made privacy developments necessary and have obliged us, as data protection authorities, to consider the rules and norms in this new environment.

What would you broadly recommend for a privacy legislation for India?

In my view, privacy legislation in India should in any case contain the basic principles of the protection of personal data, applicable to both the public and the private sector, naturally with some exceptions for law enforcement purposes.

Furthermore, the Indian law should protect the imported data of citizens from other parts of the world as well, including the EU.

And as mentioned in my answer to question 5, it is of utmost importance that the Indian legislation guarantees the establishment of a completely independent supervisory authority (or authorities), provided with sufficient sanctioning powers, to supervise compliance with the legislation – including by the government, the police and the justice system.

What India can Learn from the Snowden Revelations

by Elonnai Hickok last modified Oct 25, 2013 07:29 AM
Big Brother is watching, across cyberspace and international borders. Meanwhile, the Indian government has few safeguards in theory and fewer in practice. There’s no telling how prevalent or extensive Indian surveillance really is.

The title of the article was changed in the version published by Yahoo on October 23, 2013.


Since the ‘Snowden revelations’, which uncovered the United States government’s massive global surveillance through the PRISM program, there have been reactions aplenty to their impact.

The Snowden revelations highlighted the issue of human rights in the context of the existing cross-border and jurisdictional nightmare: the data of foreign citizens surveilled and harvested by agencies such as the National Security Agency through programs such as PRISM is not subject to the protections that US law affords its own citizens. Thus, the US government has the right to access and use the data, but bears no responsibility for how the data is used, nor any obligation to respect the rights of the people from whom it was harvested.

The Snowden revelations demonstrated that the biggest global surveillance efforts are now being conducted by democratically elected governments – institutions of the people, by the people, for the people – that are increasingly becoming suspicious of all people.

Adding irony to this worrying trend, Snowden sought asylum from some of the world’s most repressive regimes: this dynamic speaks to the state of society today. The Snowden revelations also demonstrate how government surveillance is shifting from targeted surveillance – warranted for a specific reason and directed at a specified individual – to blanket surveillance, in which security agencies monitor and filter massive amounts of information.

This is happening with few checks and balances for cross-border and domestic surveillance in place, and even fewer forms of redress for the individual. This is true for many governments, including India.

India’s reaction

After the first news of the Snowden revelations, the Indian Supreme Court agreed to hear a Public Interest Litigation requesting that foreign companies that shared the information with US security agencies be held accountable for the disclosure. In response to the PIL, the Supreme Court stated it did not have jurisdiction over the US government.

The response of the Supreme Court of India demonstrates the potency of jurisdiction in today’s global information economy in the context of governmental surveillance. Despite being upset at the actions of America’s National Security Agency (NSA), there is little direct legal action that any government or individual can take against the US government or companies incorporated there.

In the PIL, the demand that companies be held responsible is interesting and representative of a global debate, as it implies that in the context of governmental surveillance, companies have a responsibility to actively evaluate and reject or accept governmental surveillance requests. Although I do not disagree with this as a principle, in reality, this evaluation is a difficult step for companies to take.

For example, in India, under Section 69 of the Information Technology Act, 2000, service providers are penalized with up to seven years in prison for non-compliance with a governmental request for surveillance. The incentives for companies to actually reject governmental requests are minimal, but one factor that could possibly push companies to become more pronounced in their resistance to installing backdoors for the government and complying with governmental surveillance requests is market pressure from consumers.

To a certain extent, this has already started to happen. Companies such as Facebook, Yahoo and Google have created ‘transparency reports’ that provide – at different granularities – information about governmental requests and the company’s compliance or rejection of the same.

In India, P. Rajeev, Member of Parliament from Kerala, has started a petition asking that the companies disclose information on Indian data given to US security agencies. Although transparency by complying companies does not translate directly into regulation of surveillance, it allows the customer to make informed choices and decide whether a company’s level of compliance with governmental requests will impact his/her use of that service.

The PIL also called for the establishment of Indian servers to protect the privacy of Indian data. This solution has been voiced by many, including government officials. The creation of domestic servers would ensure that the US government does not have direct and unfettered access to Indian data, since foreign governments would have to access Indian information through a formal Mutual Legal Assistance Treaty (MLAT) process; it would not, however, necessarily enhance the privacy of Indian data.

As a note, India has MLATs with 34 countries. If domestic servers were established, the information would be subject to Indian laws and regulations.

Snooping

The Snowden Revelations are not the first instance to spark a discussion on domestic servers by the Government of India.

For example, in the back-and-forth between the Indian government and the Canadian company RIM, now BlackBerry, the company eventually set up servers in Mumbai and provided a lawful interception solution that satisfied the Indian government. The Indian government made similar demands of Skype and Google. In these instances, the domestic servers were meant to facilitate greater surveillance by Indian law enforcement agencies.

Currently in India there are a number of ways in which the government can legally track data online and offline. For example, the interception of telephonic communications is regulated by the Indian Telegraph Act, 1885, and relies on an order from the Secretary to the Ministry of Home Affairs. Interception, decryption, and monitoring of digital communications are governed by Section 69 of the Information Technology Act, 2000 and again rely on the order of the executive.

The collection and monitoring of traffic data is governed by Section 69B of the Information Technology Act and relies on the order of the Secretary to the government of India in the Department of Information Technology. Access to stored data, on the other hand, is regulated by Section 91 of the Code of Criminal Procedure and permits access on the authorization of an officer in charge of a police station.

The gaps in the Indian surveillance regime are many and begin with a lack of enforcement and harmonization of existing safeguards and protocols. Presently, India is in the process of enacting privacy legislation.

In 2012, a committee chaired by Justice AP Shah (of which the Centre for Internet and Society was a member) wrote The Report of the Group of Experts on Privacy, which laid out nine national privacy principles meant to be applied to different legislation and sectors – including Indian provisions on surveillance.

The creation of domestic servers is just one example of how the Indian government has been seeking greater access to information flowing within its borders. New requirements for Indian service providers and the creation of projects that go beyond the legal limits of governmental surveillance in India enable greater access to details about an individual on a real-time and blanket basis.

For example, telecoms in India are now required to include user location data as part of the ‘call detail record’ and be able to provide the same to law enforcement agencies on request under provisions in the Unified Access Service and Internet Service Provider Licenses.

At the same time, the Government of India is in the process of putting in place a Central Monitoring System that would provide Indian security agencies the ability to directly intercept communications, bypassing the service provider.

Even if the Central Monitoring System were to adhere to the legal safeguards and procedures defined under the Indian Telegraph Act and the Information Technology Act, it could only do so partially: both laws establish a clear chain of custody that the government and service providers must follow, one in which the service provider is an integral component of the interception process – a component the Central Monitoring System bypasses.

If the Indian government implements the Central Monitoring System, it could remove governmental surveillance completely from the public eye. Bypassing the service provider allows the government to fully determine how much the public knows about surveillance. It also removes the market and any pressure that consumers could exert from insight provided by companies on the surveillance requests that they are facing.

Though the Indian government could (and should) be transparent about the amount and type of surveillance it is undertaking, currently there is no legal requirement for the government of India to disclose this information, and security agencies are exempt from the Right to Information Act. Thus, unless India has a Snowden somewhere in the apparatus, the Indian public cannot hope to get an idea of how prevalent or extensive Indian surveillance really is.

Policy vacuum

For any government, the surveillance of its citizens, to some degree, might be necessary. But the Snowden revelations demonstrate that there is a vacuum when it comes to surveillance policy and practices. This vacuum has permitted draconian measures of surveillance to take place and created an environment of mistrust between citizens and governments across the globe.

When governments undertake surveillance, it is critical that the purpose, necessity and legality of monitoring, and the use of the material collected are built into the regime to ensure it does not violate the human rights of the people surveilled, foreign or domestic.

In 2013, the International Principles on the Application of Human Rights to Communications Surveillance were drafted, in part, to address this vacuum. The principles seek to explain how international human rights law applies to surveillance of communications in the current digital and technological environment. They define safeguards to ensure that human rights are protected and upheld when governments undertake surveillance of communications.

When the Indian surveillance regime is measured against these principles, it fails to meet a number of them and only partially meets several others. In the context of surveillance projects like the Central Monitoring System, and in order to avoid an Indian version of the PRISM program, India should take into consideration the safeguards defined in the principles and strengthen its surveillance regime – not only to protect human rights in the context of surveillance, but also to establish trust in its surveillance regime and practices with other countries.


Elonnai Hickok is the Program Manager for Internet Governance at the Centre for Internet and Society, and leads its research on privacy.

(IMDEC) 2013

by Prasad Krishna last modified Oct 25, 2013 06:09 AM

PDF document icon Proposed-Program-IMDEC.pdf — PDF document, 120 kB (122998 bytes)

Mapping Digital Media Background Note

by Prasad Krishna last modified Oct 25, 2013 09:14 AM

PDF document icon Background note_MDM.pdf — PDF document, 447 kB (458684 bytes)

MDM Invite Poster

by Prasad Krishna last modified Oct 25, 2013 09:22 AM

PDF document icon MDM Invite_Poster.pdf — PDF document, 1104 kB (1130749 bytes)

MDM Press Invite

by Prasad Krishna last modified Oct 25, 2013 09:24 AM

PDF document icon MDM_Press Invite.pdf — PDF document, 775 kB (794198 bytes)

MDM Digital Media Press Release

by Prasad Krishna last modified Oct 25, 2013 09:30 AM

PDF document icon Press release_MDM Public Consultation (1).pdf — PDF document, 216 kB (221365 bytes)

Spy Files 3: WikiLeaks Sheds More Light On The Global Surveillance Industry

by Maria Xynou last modified Nov 14, 2013 04:21 PM
In this article, Maria Xynou looks at WikiLeaks' latest Spy Files and examines the legality of India's surveillance technologies, as well as their potential connection with India's Central Monitoring System (CMS) and implications on human rights.
Image by RamyRaoof on flickr

Last month, WikiLeaks released “Spy Files 3”, a mass exposure of the global surveillance trade and industry. WikiLeaks first released the Spy Files in December 2011; those files comprised brochures, presentations, marketing videos and technical specifications on the global trade of surveillance technologies. Spy Files 3 supplements this with 294 additional documents from 92 global intelligence contractors.

So what do the latest Spy Files reveal about India?

When we think about India, the first issues that probably come to mind are poverty and corruption, while surveillance appears to be a more “Western” and elitist issue. However, while many other developing countries are excluded from WikiLeaks’ list of surveillance technology companies, India is once again on the list with some of the most controversial spyware.

ISS World Surveillance Trade Shows

The latest Spy Files include a brochure for ISS World 2013 – the so-called “wiretapper’s ball” – the world’s largest surveillance trade show. This year’s ISS World Asia will take place in Malaysia during the first week of December, and law enforcement agencies from around the world will have another opportunity to view and purchase the latest surveillance tech. The leaked ISS World 2013 brochure includes a list of last year’s global attendees. According to the brochure, 53% of the attendees were law enforcement agencies and individuals from the defense, public safety and interior security sectors, 41% were ISS vendors and technology integrators, and only 6% were telecom operators and private enterprises. The brochure boasts that 4,635 individuals from 110 countries attended the ISS World trade shows last year and that attendance is increasing.

The following table lists the Indian attendees at last year’s ISS World:

Law Enforcement, Defense and Interior Security Attendees (17):

- Andhra Pradesh India Police
- CBI Academy
- Government of India, Telecom Department
- India Cabinet Secretariat
- India Centre for Development of Telematics (C-DOT)
- India Chandigarh Police
- India Defence Agency
- India General Police
- India Intelligence Department
- India National Institute of Criminology
- India office LOKAYUKTA NCT DELHI
- India Police Department, A.P.
- India Tamil Nadu Police Department
- Indian Police Service, Vigilance
- Indian Telecommunications Authority
- NTRO India
- SAIC Indian Tamil Nadu Police

Telecom Operators and Private Enterprises Attendees (4):

- BT
- Cogence Investment Bank
- India Reliance Communications
- Span Telecom Pvt. Ltd.

ISS Vendors and Technology Integrators Attendees (15):

- AGC Networks
- Aqsacom India
- ClearTrail Technologies
- Foundation Technologies
- Kommlabs
- Paladion Networks
- Polaris Wireless
- Polixel Security Systems
- Pyramid Cyber Security
- Schleicher Group
- Span Technologies
- TATA India
- Tata Consultancy Services
- Telecommunications India
- Vehere Interactive

According to the above table – which is based on data from the WikiLeaks ISS World 2013 brochure – the majority of Indian attendees at last year’s ISS World were from the law enforcement, defense and interior security sectors. Fifteen Indian companies exhibited and sold their surveillance technologies to law enforcement agencies from around the world, and it is notable that one of India’s most popular ISPs, Reliance Communications, attended the trade show too.

In addition to the ISS World 2013 brochure, Spy Files 3 includes a detailed brochure from a major Indian surveillance technology company: ClearTrail Technologies.

ClearTrail Technologies

ClearTrail Technologies is an Indian company based in Indore. The document titled Internet Monitoring Suite from ClearTrail Technologies boasts about the company’s mass monitoring, deep packet inspection, COMINT, SIGINT, tactical Internet monitoring, network recording and lawful interception technologies. ClearTrail’s Internet Monitoring Suite includes the following products:

1. ComTrail: Mass Monitoring of IP and Voice Networks

ComTrail is an integrated product suite for centralized interception and monitoring of voice and data networks. It is equipped with an advanced analysis engine for pro-active analysis of thousands of connections and is integrated with various tools, such as Link Analysis, Voice Recognition and Target Location.

ComTrail is deployed within a service provider network and its monitoring function correlates voice and data intercepts across diverse networks to provide a comprehensive intelligence picture. ComTrail supports the capture, recording and replay of a wide variety of voice and IP communications, including – but not limited to – Gmail, Yahoo, Hotmail, BlackBerry, ICQ and GSM voice calls.

Additionally, ComTrail intercepts data from any type of network – whether wireless, packet data, wireline or VoIP – and can decode hundreds of protocols and P2P applications, including HTTP, instant messengers, webmail, VoIP calls and MMS.

In short, ComTrail’s key features include the following:

- Equipped to handle millions of communications per day intercepted over high speed STM & Ethernet Links

- Doubles up as Targeted Monitoring System

- On demand data retention, capacity exceeding several years

- Instant Analysis across thousands of Terabytes

- Correlates Identities across multiple networks

- Speaker Recognition and Target Location
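The brochure does not explain how ComTrail “correlates identities across multiple networks”, but cross-network correlation is commonly modelled as finding connected components over intercept records that share a selector (an email ID, phone number, chat handle, and so on). A minimal union-find sketch of that idea – with entirely hypothetical record data, not ClearTrail’s actual format – might look like this:

```python
from collections import defaultdict

def correlate_identities(records):
    """Cluster records whose selector sets overlap, directly or
    transitively, using union-find over the selectors."""
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent.setdefault(a, a)
        parent.setdefault(b, b)
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # Link every selector in a record to the record's first selector.
    for rec in records:
        first = rec["selectors"][0]
        for sel in rec["selectors"]:
            union(first, sel)

    clusters = defaultdict(list)
    for rec in records:
        clusters[find(rec["selectors"][0])].append(rec["id"])
    return sorted(sorted(ids) for ids in clusters.values())

# Hypothetical intercepts from two different networks: r1 (email) and
# r2 (GSM) share a phone-number selector, so they fall in one cluster.
records = [
    {"id": "r1", "selectors": ["alice@mail.example", "+91-0000000000"]},
    {"id": "r2", "selectors": ["+91-0000000000", "alice_chat"]},
    {"id": "r3", "selectors": ["bob@mail.example"]},
]
print(correlate_identities(records))  # [['r1', 'r2'], ['r3']]
```

Real systems layer fuzzy matching and confidence scores on top of this exact-match core; the sketch only shows the transitive-linking step.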

2. xTrail: Targeted IP Monitoring

xTrail is a solution for the interception, decoding and analysis of high-speed data traffic over IP networks and independently monitors ISP/GPRS and 3G networks. xTrail has been designed so that it can be deployed within minutes and enables law enforcement agencies to intercept and monitor targeted communications without degrading the service quality of the IP network. The product is capable of intercepting all types of networks – including wireline, wireless, cable, VoIP and VSAT networks – and acts as a black box for the “record and replay” of targeted Internet communications.

Interestingly enough, xTrail can filter based on a pure keyword, a URL/domain combined with a keyword, an IP address, a mobile number, or even just a user identity such as an email ID, chat ID or VoIP ID. Furthermore, xTrail can be integrated with link analysis tools and can export data in a digital format which can allegedly be presented in court as evidence.

In short, xTrail’s key features include the following:

- Pure passive probe

- Designed for rapid field operations at ISP/GPRS/Wi-Max/VSAT Network Gateways

- Stand-alone solution for interception, decoding and analysis of multi Gigabit IP traffic

- Portable trolley based for simplified logistics, can easily be deployed and removed from any network location

- Huge data retention, rich analysis interface and tamper proof court evidence

- Easily integrates with any existing centralized monitoring system for extended coverage
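Stripped of the hardware, the filter criteria described above (pure keyword, URL/domain plus keyword, IP address, user identity) amount to predicate matching over decoded session records. A purely illustrative sketch follows – the field names are assumptions for the example, not xTrail’s actual schema:

```python
def match_session(session, keyword=None, domain=None, ip=None, user_id=None):
    """Return True if the decoded session satisfies every criterion
    that was supplied; criteria left as None are ignored."""
    if keyword is not None and keyword.lower() not in session.get("payload", "").lower():
        return False
    if domain is not None and domain not in session.get("url", ""):
        return False
    if ip is not None and ip not in (session.get("src_ip"), session.get("dst_ip")):
        return False
    if user_id is not None and user_id not in session.get("identities", ()):
        return False
    return True

# Hypothetical decoded sessions.
sessions = [
    {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.9",
     "url": "mail.example.com/inbox", "payload": "Meeting at noon",
     "identities": ["user1@example.com"]},
    {"src_ip": "10.0.0.7", "dst_ip": "198.51.100.2",
     "url": "news.example.org", "payload": "headlines",
     "identities": []},
]

# Combine criteria: keyword AND IP address, as the brochure describes.
hits = [s for s in sessions if match_session(s, keyword="meeting", ip="10.0.0.5")]
print(len(hits))  # 1
```

The design point is simply that each supplied criterion narrows the match, so a single predicate covers all the combinations the brochure lists.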

3. QuickTrail: Tactical Wi-Fi Monitoring

Some of the biggest IP monitoring challenges that law enforcement agencies face include cases when targets operate from public Internet networks and/or use encryption.

QuickTrail is a device designed to gather intelligence from public Internet networks when a target is operating from a cyber cafe, a hotel, a university campus or a free Wi-Fi zone. In particular, QuickTrail is equipped with multiple monitoring tools and techniques that can help intercept almost any wired, Wi-Fi or hybrid Internet network so that a target’s communications can be monitored. QuickTrail can be deployed within a fraction of a second to intercept, reconstruct, replay and analyze the email, chat, VoIP and other Internet activities of a target. The device supports real-time monitoring and wiretapping of Ethernet LANs.

According to ClearTrail’s brochure, QuickTrail is an “all-in-one” device which can intercept secured communications, obtain passwords with its c-Jack attack, alert on the activities of a target, support active and passive interception of Wi-Fi and wired LANs, and capture, reconstruct and replay traffic. It is noteworthy that QuickTrail can identify a target machine on the basis of an IP address, MAC ID, machine name, activity status and several other parameters. In addition, QuickTrail supports protocol decoding, including HTTP, SMTP, POP3 and HTTPS. The device also enables the remote and central management of field operations at geographically dispersed locations.

In short, QuickTrail’s key features include the following:

- Conveniently housed in a laptop computer

- Intercepts Wi-Fi and wired LANs in five different ways

- Breaks WEP, WPA/WPA2 to rip-off secured Wi-Fi networks

- Deploys spyware into a target’s machine

- Monitors Gmail, Yahoo and all other HTTPS-based communications

- Reconstructs webmails, chats, VoIP calls, news groups and social networks
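Protocol decoding of the kind QuickTrail advertises (HTTP, SMTP, POP3, HTTPS) has to start by classifying each intercepted flow. The simplest first pass, sketched below, is a well-known-port lookup; real decoders go further and inspect payloads, since applications can run on non-standard ports:

```python
# First-pass protocol classification by well-known destination port.
# The table covers only the protocols named in the brochure plus two
# common neighbours; it is illustrative, not exhaustive.
WELL_KNOWN_PORTS = {
    25: "SMTP",
    80: "HTTP",
    110: "POP3",
    143: "IMAP",
    443: "HTTPS",
    587: "SMTP",  # mail submission
}

def classify_flow(dst_port):
    """Guess the application protocol of a flow from its destination port."""
    return WELL_KNOWN_PORTS.get(dst_port, "UNKNOWN")

print(classify_flow(443), classify_flow(110), classify_flow(8080))
# HTTPS POP3 UNKNOWN
```

Payload-based identification (for example, matching an HTTP request line) is the natural second pass whenever the port lookup returns UNKNOWN.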

4. mTrail: Off-The-Air Interception

mTrail offers active and passive “off-the-air” interception of GSM 900/1800/1900 MHz phone calls and data to meet law enforcement surveillance and investigation requirements. The mTrail passive interception system works in stealth mode, so there is no dependence on the network operator and the target is unaware that its communications are being intercepted.

The mTrail system has the capability to scale from interception of 2 channels (carrier frequencies) to 32 channels. mTrail can be deployed either in a mobile or fixed mode: in the mobile mode the system is able to fit into a briefcase, while in the fixed mode the system fits in a rack-mount industrial grade chassis.

Target location identification is supported using signal strength and target identifiers such as the IMSI, TMSI, IMEI or MSISDN, which makes it possible to listen to so-called “lawfully intercepted” calls in near real time, as well as to store all calls. Additionally, mTrail supports the interception of targeted calls from pre-defined suspect lists and the monitoring of SMS and protocol information.

In short, mTrail’s key features include the following:

- Designed for passive interception of GSM communications

- Intercepts Voice and SMS “off-the-air”

- Detects the location of the target

- Can be deployed as a fixed unit or mounted in a surveillance van

- No support required from GSM operator

5. Astra: Remote Monitoring and Infection framework

Astra is a remote monitoring and infection framework which incorporates both conventional and proprietary infection methods to ensure bot delivery to the targeted devices. It also offers a varied choice in handling the behavior of bots and ensuring non-traceable payload delivery to the controller.

The conventional methods of infection include physical access to a targeted device using exposed interfaces, such as CD-ROM, DVD and USB ports, as well as the use of social engineering techniques. However, Astra also supports bot deployment without requiring any physical access to the target device.

In particular, Astra can push a bot to any targeted machine sharing the same LAN (wired, Wi-Fi or hybrid). The SEED is a generic bot which can identify a target’s location, log keystrokes, capture screenshots, capture microphone audio, listen to Skype calls, capture webcam video and search the target’s browsing history. Additionally, the SEED bot can be remotely activated, deactivated or terminated as and when required. Astra allegedly provides an untraceable reporting mechanism that operates without using any proxies, which rules out the possibility of being traced by the target.

Astra’s key features include the following:

- Proactive intelligence gathering

- End-to-end remote infection and monitoring framework

- Follow the target, beat encryption, listen to in-room conversations, capture keystrokes and screen shots

- Designed for centralized management of thousands of targets

- A wide range of deployment mechanisms to optimize success ratio

- Non-traceable, non-detectable delivery mechanism

- Intrusive yet stealthy

- Easy interface for handling most complex tasks

- Successfully tested against the current top 10 anti-virus products on the market

- No third party dependencies

- Free from any back-door intervention

ClearTrail Technologies argues that it meets lawful interception regulatory requirements across the globe. In particular, the company claims that its products are compliant with ETSI and CALEA regulations and can also cater to region-specific requirements.

The latest Spy Files also include data on foreign surveillance technology companies operating in India, such as Telesoft Technologies, AGT International and Verint Systems. In particular, Verint Systems has its headquarters in New York and offices all around the world, including Bangalore in India. Founded in 1994 and run by Dan Bodner, Verint Systems produces a wide range of surveillance technologies, including the following:

- Impact 360 Speech Analytics

- Impact 360 Text Analytics

- Nextiva Video Management Software (VMS)

- Nextiva Physical Security Information Management (PSIM)

- Nextiva Network Video Recorders (NVRs)

- Nextiva Video Business Intelligence (VBI)

- Nextiva Surveillance Analytics

- Nextiva IP cameras

- CYBERVISION Network Security

- ENGAGE suite

- FOCAL-INFO (FOCAL-COLLECT & FOCAL-ANALYTICS)

- RELIANT

- STAR-GATE

- VANTAGE

While Verint Systems claims to be in compliance with ETSI, CALEA and other worldwide lawful interception standards and regulations, it remains unclear whether such products successfully help law enforcement agencies tackle crime and terrorism without violating individuals’ right to privacy and other human rights. After all, Verint Systems has participated in ISS World trade shows, which exhibit some of the most controversial spyware in the world, used to target individuals and for mass surveillance.

And what do the latest Spy Files mean for India?

Why is it even important to look at the latest Spy Files? Well, for starters, they reveal data about which Indian law enforcement agencies are interested in surveillance and which companies are interested in selling and/or buying the latest spy gear. And why is any of this important? I can think of three main reasons:

1. The Central Monitoring System (CMS)

2. Is any of this surveillance even legal in India?

3. Can such surveillance result in the violation of human rights?

Spy Files 3...and the Central Monitoring System (CMS)

Following the Mumbai 2008 terrorist attacks, the Telecom Enforcement, Resource and Monitoring (TREM) cells and the Centre for Development of Telematics (C-DOT) started preparing the Central Monitoring System (CMS). As of April 2013, the project is being manned by the Intelligence Bureau, while agencies which are planned to have access to it include the Research & Analysis Wing (RAW) and the Central Bureau of Investigation (CBI). ISP and telecom operators are required, under the Unified Access Services (UAS) License Agreement, to install the equipment which enables law enforcement agencies to operate the Central Monitoring System.

The Central Monitoring System aims at centrally monitoring all telecommunications and Internet communications in India and its estimated cost is Rs. 4 billion. In addition to equipping government agencies with Direct Electronic Provisioning, filters and alerts on the target numbers, the CMS will also enable Call Data Records (CDR) analysis and data mining to identify personal information of the target numbers. The CMS supplements regional Internet Monitoring Systems, such as that of Assam, by providing a nationwide monitoring of telecommunications and Internet communications, supposedly to assist law enforcement agencies in tackling crime and terrorism.

However, data monitored and collected through the CMS will be stored in a centralised database, which could potentially increase the probability of centralized cyber attacks and thus increase, rather than reduce, threats to national security. Furthermore, basic statistics indicates that the larger the volume of data, the greater the probability of error in matching profiles, which could potentially result in innocent people being charged with crimes they did not commit. And most importantly: the CMS currently lacks adequate legal oversight, which means that it remains unclear how monitored data will be used. The UAS License Agreement regarding the CMS mandates mass surveillance by requiring ISPs and telecom operators to enable the monitoring and interception of communications. However, targeted and mass surveillance through the CMS not only raises serious questions about its legality, but also creates the potential for violations of the right to privacy and other human rights.
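The statistical point can be illustrated with a back-of-the-envelope base-rate calculation. The figures below (population size, number of genuine targets, accuracy of the matching system) are hypothetical assumptions for illustration, not figures about the CMS:

```python
# Hypothetical illustration of the base-rate problem in mass profile matching.
# Assumed numbers: 900 million subscribers, 1,000 genuine targets, and a
# matching system that is 99.9% accurate (0.1% false positives).
population = 900_000_000
true_targets = 1_000
false_positive_rate = 0.001   # fraction of innocent people wrongly flagged
true_positive_rate = 0.999    # fraction of real targets correctly flagged

innocent = population - true_targets
false_alarms = innocent * false_positive_rate
real_hits = true_targets * true_positive_rate

# Probability that a flagged person is actually a target:
precision = real_hits / (real_hits + false_alarms)
print(f"Innocent people wrongly flagged: {false_alarms:,.0f}")
print(f"Chance a flagged person is a real target: {precision:.4%}")
```

Even with an unrealistically accurate system, the innocent vastly outnumber the targets, so the overwhelming majority of matches are false alarms, which is exactly how innocent people end up treated as suspects.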

Interestingly enough, the Indian law enforcement agencies which attended last year’s ISS World trade shows are linked to the Central Monitoring System. In particular, last year’s law enforcement, defense and interior security attendees include the Centre for Development of Telematics (C-DOT) and the Department of Telecommunications, both of which prepared the Central Monitoring System. The list of attendees also includes India’s Intelligence Bureau, which is manning the CMS, as well as the agencies which will have access to the CMS: the Central Bureau of Investigation (CBI), the Research and Analysis Wing (RAW), the National Technical Research Organization (NTRO) and various other state police departments and intelligence agencies.

Furthermore, Spy Files 3 include a list of last year’s ISS World security company attendees, which features several Indian companies. Again, interestingly enough, many of these companies may potentially be supplying law enforcement with the technology to operate the Central Monitoring System. ClearTrail Technologies, in particular, provides solutions for targeted and mass monitoring of IP and voice networks, as well as remote monitoring and infection frameworks, all of which would be well suited to aid the Central Monitoring System.

In fact, ClearTrail states in its brochure that its ComTrail product is equipped to handle millions of communications per day, while its xTrail product can easily be integrated with any existing centralised monitoring system for extended coverage. And if that’s not enough, ClearTrail’s Astra is designed for the centralized management of thousands of targets. While there may not be any concrete proof that ClearTrail is indeed aiding the Central Monitoring System, the facts speak for themselves: ClearTrail is an Indian company which sells targeted and mass monitoring products to law enforcement agencies, and the Central Monitoring System is currently being implemented. What are the odds that ClearTrail is not equipping the CMS? And what are the odds that such technology is not being used for other mass electronic surveillance programmes, such as Lawful Intercept and Monitoring (LIM)?

Spy Files 3...and the legality of India’s surveillance technologies

ClearTrail Technologies’ brochure, the only document on Indian surveillance technology leaked in the latest Spy Files, states that the company complies with ETSI and CALEA regulations. While it is clear that the company complies with U.S. and European interception regulations in order to attract customers in the international market, such regulations do not apply within India, which is also part of ClearTrail’s market. Notably, ClearTrail does not mention any compliance with Indian regulations in its brochure. So let’s have a look at them.

India has five laws which regulate surveillance:

1. The Indian Telegraph Act, 1885

2. The Indian Post Office Act, 1898

3. The Indian Wireless Telegraphy Act, 1933

4. The Code of Criminal Procedure (CrPc), 1973: Section 91

5. The Information Technology (Amendment) Act, 2008

The Indian Post Office Act does not cover electronic communications, and the Indian Wireless Telegraphy Act lacks procedures which would determine whether surveillance should be targeted or not. Neither the Indian Telegraph Act nor the Information Technology (Amendment) Act covers mass surveillance; both are limited to targeted surveillance. Moreover, targeted interception under these laws requires case-by-case authorization by either the Home Secretary or the Secretary of the Department of Information Technology. In other words, unauthorized, limitless mass surveillance is not technically permitted by law in India.

The Indian Telegraph Act mandates that the interception of communications can only be carried out on account of a public emergency or for public safety. However, in 2008, the Information Technology Act copied most of the interception provisions of the Indian Telegraph Act, but removed the preconditions of public emergency or public safety, and instead expanded the power of the government to order interception for the “investigation of any offense”.

The interception of Internet communications is mainly covered by the 2009 Rules under the Information Technology Act, 2008, of which Sections 69 and 69B are particularly noteworthy. Under these provisions, an Intelligence Bureau officer who leaks national secrets may be imprisoned for up to three years, while Section 69 not only allows for the interception of any information transmitted through a computer resource, but also requires that users disclose their encryption keys upon request or face a jail sentence of up to seven years.

While these laws allow for the interception of communications and can be viewed as widely controversial, they do not technically permit the mass surveillance of communications. In other words, ClearTrail’s products, such as ComTrail, which enable the mass interception of IP networks, lack legal backing. However, the Unified Access Services (UAS) License Agreement regarding the Central Monitoring System mandates mass surveillance and requires ISP and Telecom operators to comply.

Through the licenses of the Department of Telecommunications, Internet service providers, cellular providers and telecom operators are required to provide the Government of India with direct access to all communications data and content, even without a warrant, which is not permitted under the laws on interception. These licenses also require cellular providers to use ‘bulk encryption’ of less than 40 bits, which means that potentially any person can use off-the-air interception to monitor phone calls. However, such licenses do not regulate the capture of signal strength or of target numbers like the IMSI, TMSI, IMEI or MSISDN, which can be captured through ClearTrail’s mTrail product.
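To see why the sub-40-bit ceiling makes off-the-air interception feasible, a rough brute-force estimate helps. The search speed of one billion keys per second is an assumed figure for illustration, not a benchmark of any particular attack or cipher:

```python
# Rough arithmetic on why a key of at most 40 bits is considered breakable.
# Assumption: the attacker can test 1 billion keys per second, a modest
# figure for commodity hardware; real attack speeds vary widely.
keyspace_40 = 2 ** 40            # about 1.1 trillion possible keys
keys_per_second = 1_000_000_000

worst_case_seconds = keyspace_40 / keys_per_second
print(f"40-bit keyspace: {keyspace_40:,} keys")
print(f"Worst-case exhaustive search: ~{worst_case_seconds / 60:.1f} minutes")

# For contrast, a 128-bit keyspace is infeasible to search exhaustively:
keyspace_128 = 2 ** 128
years_128 = keyspace_128 / keys_per_second / (3600 * 24 * 365)
print(f"128-bit worst case: ~{years_128:.1e} years")
```

Under these assumptions, the entire 40-bit keyspace can be exhausted in well under an hour, which is why a licence-imposed 40-bit ceiling effectively leaves intercepted calls open to anyone with capture equipment.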

More importantly, following allegations that the National Technical Research Organization (NTRO) had been using off-the-air interception equipment to snoop on politicians in 2011, the Home Ministry issued a directive to ban the possession or use of all off-the-air phone interception gear. As a result, the Indian Government asked the Customs Department to provide an inventory of all such equipment imported over a ten-year period, and it was uncovered that as many as 73,000 pieces of equipment had been imported. Since then, the Home Ministry has informed the heads of law enforcement agencies that there is a complete ban on the use of such equipment and that all those who possess it and fail to inform the Government will face prosecution and imprisonment. In short, ClearTrail's mTrail product, which undertakes off-the-air phone monitoring, is illegal, and Indian law enforcement agencies are prohibited from using it.

ClearTrail’s Astra product is capable of remote infection and monitoring, and can push a bot to any targeted machine sharing the same LAN. While India’s ISP and telecommunications licenses provide some general regulation, they appear inadequate for regulating specific surveillance technologies capable of targeting machines and monitoring them remotely. Moreover, wireless communications are legally unregulated, which raises the question of whether the interception of public Internet networks is allowed; in other words, it is not clear whether ClearTrail’s QuickTrail is technically legal. The UAS License Agreement mandates mass surveillance, while the law neither prohibits nor mandates it. This remains a grey area.

The issue of data retention also arises from ClearTrail's leaked brochure. In particular, ClearTrail states that ComTrail, which undertakes mass monitoring of IP and voice networks, can retain data for several years, while xTrail, for targeted IP monitoring, can retain huge volumes of data which can potentially be used as evidence in court. However, India currently lacks privacy legislation which would regulate data retention, which means that data collected through ClearTrail's products could potentially be stored indefinitely.

Section 7 of the Information Technology (Amendment) Act, 2008, deals with the retention of electronic records. However, it does not specify a data retention period, who will have authorized access to data during its retention, who can authorize such access, or whether retained data can be shared with third parties and, if so, under what conditions. In short, Section 7 is vague and fails to regulate data retention adequately.

Data retention requirements for service providers are included in the ISP and UASL licenses and, while they clarify the type of data they retain, they do not specify adequate conditions for data retention. Due to the lack of data protection legislation in India, it remains unclear how long data collected by companies, such as ClearTrail, would be stored for, as well as who would have authorized access to such data during its retention period, whether such data would be shared with third parties and disclosed and if so, under what conditions.

India currently lacks specific regulations for the use of various types of surveillance technologies, which makes it unclear whether ClearTrail's spy products are technically legal. It is clear that ClearTrail’s mass interception products, such as ComTrail, lack statutory authorization, since Indian laws only allow for targeted interception, yet they are mandated through the UAS License Agreement regarding the Central Monitoring System.

In short, the legality of ClearTrail’s surveillance technologies remains ambiguous. While India’s ISP and telecom licenses and the UAS License Agreement mandate mass surveillance, the laws, particularly the 2009 Information Technology Rules, mandate targeted surveillance and remain silent on the issue of mass surveillance. Technically, this makes mass surveillance neither legal nor illegal, but leaves it in a grey area. Furthermore, while India’s Telegraph Act, Information Technology Act and 2009 Rules allow for the interception, monitoring and decryption of communications and surveillance in general, they do not explicitly regulate the various types of surveillance technologies, but rather attempt to “legalize” them through the blanket term of surveillance.

One thing is clear: India’s license agreements ensure that all ISPs and telecom operators are part of the surveillance regime. The lack of regulation of India’s surveillance technologies appears to create a grey zone for the expansion of mass surveillance in the country. According to Saikat Datta, an investigative journalist, a senior telecom official stated:

“Do you really think a private telecom company can stand up to the government or any intelligence agency and cite law if they want to tap someone’s phone?”



Spy Files 3...and human rights in India

The facts speak for themselves. The latest Spy Files confirm that the same agencies involved in the development of the Central Monitoring System (CMS) are also interested in the latest surveillance technology sold in the global market. Spy Files 3 also provide data on one of India’s largest surveillance technology companies, ClearTrail, which sells a wide range of surveillance technologies to law enforcement agencies around the world. And Spy Files 3 show us exactly what these technologies can do.

In particular, ClearTrail’s ComTrail provides mass monitoring of IP and voice networks, which means that law enforcement agencies using it are capable of intercepting millions of communications every day through Gmail, Yahoo, Hotmail and others, of correlating our identities across networks and of pinpointing our location. xTrail enables law enforcement agencies to monitor us based on our “harmless” metadata, such as our IP address, our mobile number and our email ID. Think our data is secure when using the Internet through a cyber cafe? Well, QuickTrail proves us wrong, as it is able to assist law enforcement agencies in monitoring and intercepting our communications even when we are using public Internet networks.

And indeed, carrying a mobile phone is like carrying a GPS device, especially since mTrail provides law enforcement with off-the-air interception of mobile communications. Not only can mTrail target our location, listen to our calls and store our data, but it can also undertake passive off-the-air interception and monitor our voice, SMS and protocol information. Interestingly enough, mTrail also intercepts targeted calls from a predefined suspect list. The questions which then arise are: who is a suspect? How do we even know if we are suspects? In the age of the War on Terror, potentially anyone could be a suspect and thus potentially anyone’s mobile communications could be intercepted. After all, mass surveillance dictates that we are all suspicious until proven innocent.

And if anyone can potentially be a suspect, then potentially anyone can be remotely infected and monitored by Astra. Requiring physical access to a targeted device is a conventional surveillance method of the past. Today, Astra can remotely push a bot to our laptops and listen to our Skype calls, capture our webcams, search our browsing history, identify our location and much more. And why is any of this concerning? Because, contrary to mainstream belief, we should all have something to hide!

Privacy protects us from abuse by those in power and safeguards our individuality and autonomy as human beings. If we are opposed to the idea of the police searching our home without a search warrant, we should be opposed to the idea of indiscriminate mass surveillance. After all, mass surveillance, especially the type undertaken by ClearTrail's products, can potentially result in the access, sharing, disclosure and retention of data far more valuable than that acquired by the police searching our home. Our credit card details, our photos, our acquaintances, our personal thoughts and opinions, and other sensitive personal information can usually be found on our laptops, and can constitute far more incriminating information than anything found in our homes.

And most importantly: even if we think that we have nothing to hide, it’s really not up to us to decide: it’s up to data analysts. While we may think that our data is “harmless”, a data analyst linking our data to various other people and search activities we have undertaken might indicate otherwise. Five years ago, a UK student studying Islamic terrorism for his Masters dissertation was detained for six days. The student may not have been a terrorist, but his data said this: “Young, male, Muslim... who is downloading Al-Qaeda’s training material” - and that was enough for him to get detained. Clearly, the data analysts mining his online activity did not care about the fact that the only reason why he was downloading Al-Qaeda material was for his Masters dissertation. The fact that he was a male Muslim downloading terrorist material was incriminating enough.

This incident reveals several concerning points. The first is that he was clearly already under surveillance prior to downloading Al-Qaeda’s material. However, given that he had no criminal record and was “just a Masters student in the UK”, there does not appear to have been any probable cause for his surveillance in the first place; clearly he was on some suspect list on the premise that he is male and Muslim, which is a discriminatory approach. The second point is that after this incident, it is likely that some male Muslims will be more cautious about their online activity, for fear of being placed on some suspect list and eventually being prosecuted because their data shows that “they’re a terrorist”. Thus, mass surveillance today also appears to have implications for freedom of expression. The third point is that this incident reveals the extent of mass surveillance, since even a document downloaded by a Masters student is being monitored.

This case proves that innocent people can potentially be under surveillance and prosecuted, as a result of mass, indiscriminate surveillance. Anyone can potentially be a suspect today, and maybe for the wrong reasons. It does not matter if we think our data is “harmless”, but what matters is who is looking at our data, when and why. Every bit of data potentially hides several other bits of information which we are not aware of, but which will be revealed within a data analysis. We should always have something to hide, as that is the only way to protect us from abuse by those in power.

In the contemporary surveillance state, we are all suspects, and mass surveillance technologies, such as the ones sold by ClearTrail, can potentially pose major threats to our right to privacy, freedom of expression and other human rights. Probably the main reason for this is that surveillance technologies in India legally fall in a grey area. Thus, it is recommended that India regulate the various types of surveillance technologies in compliance with the International Principles on the Application of Human Rights to Communications Surveillance.

Spy Files 3 show us why our human rights are at peril and why we should fight for our right to be free from suspicion.

 

This article was cross-posted in Medianama on 6th November 2013.

Re: The Human DNA Profiling Bill, 2012

by Bhairav Acharya last modified Oct 29, 2013 10:00 AM
This short note speaks to legal issues arising from the proposed Human DNA Profiling Bill, 2012 ("DBT Bill") that was drafted under the aegis of the Department of Biotechnology of the Ministry of Science and Technology, Government of India, and which seeks to collect human DNA samples, profile them and store them. These comments are made clause-by-clause against the DBT Bill.

Note: Clause-by-clause comments on the Working Draft version of April 29, 2012 from the Centre for Internet and Society


  1. This short note speaks to legal issues arising from the proposed Human DNA Profiling Bill, 2012 ("DBT Bill") that was circulated within the Experts Committee constituted under the aegis of the Department of Biotechnology of the Ministry of Science and Technology, Government of India.
  2. This note must be read against the relevant provisions of the DBT Bill and, where indicated, together with the proposed Forensic DNA Profiling (Regulation) Bill, 2013 that was drafted by the Centre for Internet & Society, Bangalore ("CIS Bill"). These comments must also be read alongside the two-page submission titled “A Brief Note on the Forensic DNA Profiling (Regulation) Bill, 2013” ("CIS Note"). Whereas the aforesaid CIS Note raised issues that informed the drafting of the CIS Bill, this present note seeks to provide legal comments on the DBT Bill.
    Preamble
  3. The DBT Bill, in its current working form, lacks a preamble. No doubt, a preamble will be added later once the text of the DBT Bill is finalised. Instead, the DBT Bill contains an introduction. It must be borne in mind that the purpose of the legislation should be spelt out in the preamble since preambular clauses have interpretative value. [See, A. Thangal Kunju Musaliar AIR 1956 SC 246; Burrakur Coal Co. Ltd. AIR 1961 SC 954; and Arnit Das (2000) 5 SCC 488]. Hence, a preamble that states the intent of Parliament to create permissible conditions for DNA source material collection, profiling, retention and forensic use in criminal trials is necessary.
    Objects Clause
  4. An ‘objects clause,’ detailing the intention of the legislature and containing principles to inform the application of a statute, in the main body of the statute is an enforceable mechanism to give directions to a statute and can be a formidable primary aid in statutory interpretation. [See, for example, section 83 of the Patents Act, 1970 that directly informed the Order of the Controller of Patents, Mumbai, in the matter of NATCO Pharma and Bayer Corporation in Compulsory Licence Application No. 1 of 2011.] Therefore, the DBT Bill should incorporate an objects clause that makes clear that (i) the principles of notice, confidentiality, collection limitation, personal autonomy, purpose limitation and data minimisation must be adhered to at all times; (ii) DNA profiles merely estimate the identity of persons, they do not conclusively establish unique identity; (iii) all individuals have a right to privacy that must be continuously weighed against efforts to collect and retain DNA; (iv) centralised databases are inherently dangerous because of the volume of information that is at risk; (v) forensic DNA profiling is intended to have probative value; therefore, if there is any doubt regarding a DNA profile, it should not be received in evidence by a court; (vi) once adduced, the evidence created by a DNA profile is only corroborative and must be treated on par with other biometric evidence such as fingerprint measurements.
    Definitions
  5. The definition of “analytical procedure” in clause 2(1)(a) of the DBT Bill is practically redundant and should be removed. It is used only twice – in clauses 24 and 66(2)(p) which give the DNA Profiling Board the power to frame procedural regulations. In the absence of specifying the content of any analytical procedure, the definition serves no purpose.
  6. The definition of “audit” in clause 2(1)(b) is relevant for measuring the training programmes and laboratory conditions specified in clauses 12(f) and 27. However, the term “audit” is subsequently used in an entirely different manner in Chapter IX which relates to financial information and transparency. This is a conflicting definition. The term “audit” has a well-established use for financial information that does not require a definition. Hence, this definition should be removed.
  7. The definition of “calibration” in clause 2(1)(d) is redundant and should be removed since the term is not meaningfully used in the DBT Bill.
  8. The definition of “DNA Data Bank” in clause 2(1)(h) is unnecessary. The DBT Bill seeks to establish a National DNA Data Bank, State DNA Data Banks and Regional DNA Data Banks vide clause 32. These national, state and regional databases must be defined individually with reference to their establishment clauses. Defining a “DNA Data Bank”, exclusive of the national, state and regional databases, creates the assumption that any private individual can start and maintain a database. This is a drafting error.
  9. The definition of “DNA Data Bank Manager” in clause 2(1)(i) is misleading since, in the text of the DBT Bill, it is only used in relation to the proposed National DNA Data Bank and never in relation to the State and Regional Data Banks. If it is the intention of DBT Bill that only the national database should have a manager, the definition should be renamed to ‘National DNA Data Bank Manager’ and the clause should specifically identify the National DNA Data Bank. This is a drafting error.
  10. The definition of “DNA laboratory” in clause 2(1)(j) should refer to the specific clauses that empower the Central Government and State Governments to license and recognise DNA laboratories. This is a drafting error.
  11. The definition of “DNA profile” in clause 2(1)(l) is too vague. Merely the results of an analysis of a DNA sample may not be sufficient to create an actual DNA profile. Further, the results of the analysis may yield DNA information that, because of incompleteness or lack of information, is inconclusive. These incomplete bits of information should not be recognised as DNA profiles. This definition should be amended to clearly specify the contents of a complete and valid DNA profile that contains, at least, numerical representations of 17 or more loci of short tandem repeats that are sufficient to estimate biometric individuality of a person.
  12. The definition of “forensic material” in clause 2(1)(o) needs to be amended to remove the references to intimate and non-intimate body samples. If the references are retained, then evidence collected from a crime scene, where an intimate or non-intimate collection procedure was obviously not followed, will not fall within the scope of “forensic material”.
  13. The terms “intimate body sample” and “non-intimate body sample” that are defined in clauses 2(1)(q) and 2(1)(v) respectively are not used anywhere outside the definitions clause except for an inconsequential reference to non-intimate body samples only in the rule-making provision of clause 66(2)(zg). “Intimate body sample” is not used anywhere outside the definitions clause. Both these definitions are redundant and should be removed.
  14. The terms “intimate forensic procedure” and “non-intimate forensic procedure”, that are defined in clauses 2(1)(r) and 2(1)(w) respectively, are not used anywhere except for an inconsequential reference of non-intimate forensic procedure in the rule-making provision of clause 66(2)(zg). “Intimate forensic procedure” is not used anywhere outside the definitions clause. Both these definitions are redundant and should be removed.
  15. The term “known samples” that is defined in clause 2(1)(s) is not used anywhere outside the definitions clause and should be removed for redundancy.
  16. The definition of “offender” in clause 2(1)(y) is vague because it does not specify the offences for which an “offender” must be convicted. It is also linked to an unclear definition of the term “undertrial”, which does not specify the nature of pending criminal proceedings and could therefore be used to describe simple offences, such as failure to pay an electricity bill, which also attract criminal penalties.
  17. The term “proficiency testing” that is defined in clause 2(1)(zb) is not used anywhere in the text of the DBT Bill and should be removed.
  18. The definitions of “quality assurance”, “quality manual” and “quality system” serve no enforceable purpose since they are used only in relation to the DNA Profiling Board’s rule-making powers under clauses 18 and 66. Their inclusion in the definitions clause is redundant. Accordingly, these definitions should be removed.
  19. The term “suspect” defined in clause 2(1)(zi) is vague and imprecise. The standard by which suspicion is to be measured, and by whom suspicion may be entertained – whether police or others, has not been specified. The term “suspect” is not defined in either the Code of Criminal Procedure, 1973 ("CrPC") or the Indian Penal Code, 1860 ("IPC").
    The DNA Profiling Board
  20. Clause 3 of the DBT Bill, which provides for the establishment of the DNA Profiling Board, contains a sub-clause (2) which vests the Board with corporate identity. This vesting of legal personality in the DNA Profiling Board, when other boards and authorities, even ministries, independent departments and the armed forces, do not enjoy this status, is ill-advised and made without sufficient thought. Bodies corporate may be corporations sole, such as the President of India, or corporations aggregate, such as companies. The intent of corporate identity is to create a fictional legal personality where none previously existed, in order for that personality to exist apart from its members, enjoy perpetual succession and sue in its own legal name. Article 300 of the Constitution of India vests the Central Government with legal personality in the legal name of the Union of India and the State Governments with legal personality in the legal names of their respective states. Apart from this constitutional dispensation, some regulatory authorities, such as the Telecom Regulatory Authority of India ("TRAI") and the Securities and Exchange Board of India ("SEBI"), have been individually vested with legal personality as bodies corporate to enable their autonomous governance and independent functioning, securing their ability to freely, fairly and impartially regulate the market free from governmental or private collusion. Similarly, some overarching national commissions, such as the Election Commission of India and the National Human Rights Commission ("NHRC"), have been vested with the power to sue and be sued in their own names. In comparison, the DNA Profiling Board is neither an independent market regulator nor an overarching national commission with judicial powers. There is no legal reason for it to be vested with a legal personality on par with the Central Government or a company. Therefore, clause 3(2) should be removed.
  21. The size and composition of the Board that is staffed under clause 4 is extremely large. Creating unwieldy and top-heavy bureaucratic authorities and investing them with regulatory powers, including the powers of licensing, is avoidable. The DBT Bill proposes to create a Board of 16 members, most of them from a scientific background, including a few policemen and one legal administrator. In its present form, the Board is larger than many High Courts but does not have a single legal member able to conduct licensing. Drawing from the experiences of other administrative and regulatory bodies in India, the size of the Board should be drastically reduced to no more than five members, at least half of whom should be lawyers or ex-judges. The change in the legal composition of the Board is necessary because the DBT Bill contemplates that it will perform the legal function of licensing, which must obey basic tenets of administrative law. The current membership may be viable only if the Board is divested of its administrative and regulatory powers and left with only scientific advisory functions. Moreover, stacking the Board with scientists and policemen appears to ignore the perils that DNA collection and retention pose to the privacy of ordinary citizens and their criminal law rights. The Board should have adequate representation from the human rights community – both institutional (e.g., the NHRC and the State Human Rights Commissions) and non-institutional (well-regarded and experienced human rights activists). The Board should also have privacy advocates.
  22. Clauses 5(2) and 5(3) establish an unequal hierarchy within the Board by privileging some members with longer terms than others. There is no good reason why the Vice-Chancellor of a National Law University, the Director General of Police of a State, the Director of a Central Forensic Science Laboratory and the Director of a State Forensic Science Laboratory should serve membership terms on the Board that are longer than those of molecular biologists, population geneticists and other scientists. Such artificial hierarchies should be removed at the outset. The Board should have one pre-eminent chairperson and other equal members with equal terms.
  23. The Chairperson of the Board is mentioned for the first time in clause 5(1) but is nowhere duly and properly appointed. Clause 4 should be modified to provide for the appointment of the Chairperson and the other Members.
  24. Clause 7 deals with the issue of conflict of interest only in narrow cases. The clause requires members to react on a case-by-case basis to the business of the Board by recusing themselves from deliberations and voting where necessary. Instead, it may be more appropriate to require members to make full and public disclosures of their real and potential conflicts of interest, and to grant the Chairperson the power to prevent such members from voting on interested matters. Failure to follow these anti-collusion and anti-corruption safeguards should attract criminal penalties.
  25. Clause 10 anticipates the appointment of a Chief Executive Officer of the Board who shall be a serving Joint Secretary to the Central Government. Clause 10(3) further requires this officer to be a scientist. This may not be possible because the administrative hierarchy of the Central Government may not contain a genetic scientist.
  26. The functions of the Board specified in clause 12 are overbroad. Advising ministries, facilitating governments, recommending the size of funds and so on – these are administrative and governance functions best left to the executive. Once the Board is modified to have sufficient legal and human rights representation, then the functions of the Board can non-controversially include licensing, developing standards and norms, safeguarding privacy and other rights, ensuring public transparency, promoting information and debate and a few other limited functions necessary for a regulatory authority.
    DNA Laboratories
  27. The provisions of Chapters V and VI may be simplified and merged.
    DNA Data Banks
  28. The creation of multiple indices in clause 32(4) cannot be justified and must be removed. The collection of biological source material is an invasion of privacy that must be conducted only in strict conditions when the potential harm to individuals is outweighed by the public good. This balance may only be struck when dealing with the collection and profiling of samples from certain categories of offenders. The implications of collecting and profiling DNA samples from corpses, suspects, missing persons and others are vast and have either not been properly understood or deliberately ignored. At this moment, the forcible collection of biological source material should be restricted to the categories of offenders mentioned in the Identification of Prisoners Act, 1920 ("Prisoners Act") with a suitable addition for persons arrested in connection with certain specified terrorism-related offences. Therefore, databases should contain only an offenders’ index and a crime scene index.
  29. Clause 32(6), which requires the names of individuals to be connected to their profiles, and hence accessible to persons connected with the database, should be removed. DNA profiles, once developed, should be anonymised and retained separate from the names of their owners.
  30. Clause 36, which allows international disclosures of DNA profiles of Indians, should be removed immediately. Whereas an Indian may have legal remedies against the National DNA Data Bank, he/she certainly will not be able to enforce any rights against a foreign government or entity. This provision will be misused to transfer DNA profiles abroad for activities not permitted in India. As in data protection regimes around the world, DNA profiles should remain within jurisdictions with high privacy and other legal standards.
    Use
  31. The only legitimate purpose for which DNA profiles may be used is for establishing the identity of individuals in criminal trials and confirming their presence or absence from a certain location. Accordingly, clauses 39 and 40 should be re-drafted to specify this sole forensic purpose and also specify the manner in which DNA profiles may be received in evidence. For more information on this point, see the relevant provisions of the CIS Note and the CIS Bill.
  32. The disclosure of DNA profiles should only take place to a law enforcement agency conducting a valid investigation into certain offences and to courts currently trying the individuals to whom the DNA profiles pertain. All other disclosures of DNA profiles should be made illegal. Non-consensual disclosure of DNA profiles for the study of population genetics, in particular, should be made specifically illegal. The DBT Bill does not prescribe stringent criminal penalties and other mechanisms to affix individual liability on individual scientists and research institutions for improper use of DNA profiles; it is therefore open to the criticism that it seeks to sacrifice individual rights of persons, including the fundamental right to privacy, without parallel remedies and penalties. Clause 40 should be removed in its entirety.
  33. Clause 43 should be removed in its entirety. This note does not contemplate the retention of DNA profiles of suspects and victims, except as derived from a crime scene.
  34. Clause 45 sets out a post-conviction right related to criminal procedure and evidence. This would fundamentally alter the nature of India’s criminal justice system, which currently does not contain specific provisions for post-conviction testing rights. However, courts may re-try cases in certain narrow circumstances when fresh evidence is brought forth that has a nexus to the evidence upon which the person was convicted and if it can be proved that the fresh evidence was not earlier adduced due to bias. Any other fresh evidence that may be uncovered cannot prompt a new trial. Clause 45 is implicated by Article 20(2) of the Constitution of India and by section 300 of the CrPC. The principle of autrefois acquit that informs section 300 of the CrPC specifically deals with exceptions to the rule against double jeopardy that permit re-trials. [See, for instance, Sangeeta Mahendrabhai Patel (2012) 7 SCC 721].

Concerns Regarding DNA Law

by Bhairav Acharya last modified Oct 29, 2013 10:09 AM
A long-running government process to draft a law permitting the collection, processing, profiling, use and storage of human DNA is nearing its conclusion. There are several concerns with this government effort. Below, we present broad-level issues to keep in mind while dealing with DNA law.

Background

The Department of Biotechnology released, on 29 April 2012, a working draft of a proposed Human DNA Profiling Bill, 2012 ("DBT Bill") for public comments. The draft reveals an effort to (i) permit the collection of human blood, tissue and other samples for the purpose of creating DNA profiles, (ii) license private laboratories that create and store the profiles, (iii) store the DNA samples and profiles in various large databanks in a number of indices, and (iv) permit the use of the completed DNA profiles in scientific research and law enforcement. The regulation of human DNA profiling is of significant importance to the efficacy of law enforcement and the criminal justice system and correspondingly has a deep impact on the freedom of ordinary citizens from profiling and monitoring. Below, we highlight five important concerns to bear in mind before drafting and implementing DNA legislation.

Primary Issues

Purpose of DNA Profiling

DNA profiling serves two broad purposes – (i) forensic – to establish the unique identity of a person in the criminal justice system; and, (ii) research – to understand human genetics and its contribution to anthropology, biology and other sciences. These two purposes have very different approaches to DNA profiling and the issues and concerns attendant on them vary accordingly. Forensic DNA profiling is undertaken to afford either party in a criminal trial a better possibility of adducing corroborative evidence to prosecute, or to defend, an alleged offence. DNA, like fingerprints, is a biometric estimation of the individuality of a person. By itself, in the same manner that fingerprint evidence is only proof of the presence of a person at a particular place and not proof of the commission of a crime, DNA is merely corroborative evidence and cannot, on its own strength, result in a conviction or acquittal of an offence. Therefore, DNA and fingerprints, and the process by which they are collected and used as evidence, should be broadly similar.

Procedural Integrity

Forensic DNA profiling results from biological source material that is usually collected from crime scenes or forcibly from offenders and convicts. Biological source material found at a crime scene is very rarely non-contaminated, and the procedure by which it is collected and its integrity ensured is of primary legislative importance. To avoid the danger of contaminated crime scene evidence being introduced in the criminal justice system to pervert the course of justice, it is crucial to ensure that DNA is collected only from intact human cells and not from compromised genetic material. Therefore, if the biological source material found at a crime scene does not contain at least one intact human cell, the whole of the biological source material should be destroyed to prevent the possibility of compromised genetic material being collected to yield inconclusive results. Adherence to this basic principle will obviate the possibility of partial matches of DNA profiles and the controversy and confusion that ensue.

Conditions of Collection

In India, the taking of fingerprints is chiefly governed by the Identification of Prisoners Act, 1920 ("Prisoners Act") and section 73 of the Indian Evidence Act, 1872 ("Evidence Act"). The Prisoners Act permits the forcible taking of fingerprints from convicts and suspects in certain conditions. The Evidence Act, in addition, permits courts to require the taking of fingerprints for the forensic purpose of establishing unique identity in a criminal trial. No provisions exist for the consensual taking of fingerprints, presumably because of the danger of self-incrimination and general privacy concerns. Since, as discussed earlier, fingerprints and DNA are biometric measurements that should be treated equally to the extent possible, the conditions for the collection of DNA should be similar to those for the taking of fingerprints. Accordingly, there should be no legal provisions that enable other kinds of collection, including from volunteers and innocent people.

Retention of DNA

As a general rule applicable in India, the retention of biometric measurements must be supported by a clear purpose that is legitimate, judicially sanctioned and transparent. The Prisoners Act, which permits the forcible taking of fingerprints from convicts, also mandates the destruction of these fingerprints when the person is acquitted or discharged. The indefinite retention of biometric measurements of people is dangerous, susceptible to abuse and invasive of civil rights. Therefore, once DNA is lawfully collected from crime scenes and offenders, the resulting profiles must be retained in strictly controlled databases with highly restricted access, for the forensic purpose of law enforcement only. DNA should not be held in databases that allow non-forensic use. Further, the indices within these databases should be watertight and exclusive of each other.

DNA Laboratories

The process by which DNA profiles are created from biological source material is of critical importance. Because of the evidentiary value of DNA profiles, the laboratories in which these profiles are created must be properly licensed, professionally managed and manned by competent and impartial personnel. Therefore, the process by which DNA laboratories are licensed and permitted to operate is significant.

Interview with Caspar Bowden - Privacy Advocate and former Chief Privacy Adviser at Microsoft

by Maria Xynou last modified Nov 06, 2013 08:16 AM
Maria Xynou recently interviewed Caspar Bowden, an internationally renowned privacy advocate and former Chief Privacy Adviser at Microsoft. Read this exciting interview and gain insight into India's UID and CMS schemes, the export of surveillance technologies, how we can protect our data in light of mass surveillance, and much, much more!
Caspar Bowden is an independent advocate for better Internet privacy technology and regulation. He is a specialist in data protection policy, privacy enhancing technology research, identity management and authentication. Until recently he was Chief Privacy Adviser for Microsoft, with particular focus on Europe and regions with horizontal privacy law.
From 1998-2002, he was the director of the Foundation for Information Policy Research (www.fipr.org) and was also an expert adviser to the UK Parliament for the passage of three bills concerning privacy, and was co-organizer of the influential Scrambling for Safety public conferences on UK encryption and surveillance policy. His previous career over two decades ranged from investment banking (proprietary trading risk-management for option arbitrage), to software engineering (graphics engines and cryptography), including work for Goldman Sachs, Microsoft Consulting Services, Acorn, Research Machines, and IBM.
The Centre for Internet and Society interviewed Caspar Bowden on the following questions:


1. Do you think India needs privacy legislation? Why / Why not?


Well, I think it's essential for any modern democracy based on a constitution to now recognise a universal human right to privacy. This isn't something that would necessarily have occurred to the drafters of constitutions before the era of mass electronic communications, but this is now how everyone manages their lives and maintains social relationships at a distance, and therefore there needs to be an entrenched right to privacy – including communications privacy – as part of the core of any modern state.

2. The majority of India's population lives below the poverty line and barely has any Internet access. Is surveillance an elitist issue or should it concern the entire population in the country? Why / Why not?


Although the majority of people in India are still living in conditions of poverty and don't have access to the Internet or, in some cases, to any electronic communications, that's changing very rapidly. India has some of the highest growth rates in take-up of both mobile phones and mobile Internet, and so this is spreading very rapidly through all strata of society. It's becoming an essential tool for transacting with business and government, so it's going to be increasingly important to have a privacy law which guarantees rights equally, no matter what anyone's social station or situation. There's also, I think, a sense in which having a right to privacy based on individual rights is much preferable to some sort of communitarian approach to privacy, which has a certain philosophical following; but that model of privacy – that somehow, because of a community benefit, there should also be a sort of community sacrifice in individual rights to privacy – has a number of serious philosophical flaws which we can talk about.

3. "I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally." Please comment.


Well, it's hard to know where to begin. Almost everybody in fact has “something to hide”, if you consider all of the social relationships and the way in which you are living your life. It's just not true that there's anybody who literally has nothing to hide, and in fact I think that it's rather a dangerous idea, in political culture, to think about imposing that on leaders and politicians. There's an increasing growth of the idea – now probably coming from America – that political leaders (and even their staff, to get hired in the current White House) should open up their lives, even to the extent of requiring officials to give up their passwords to their social network accounts (presumably so that they can be vetted for sources of potential political embarrassment in their private life). This is a very bad idea because if we only elect leaders, and if we only employ bureaucrats, who do not accord any subjective value to privacy, then it means we will almost literally be electing (philosophical) zombies. And we can't expect our political leaders to respect our privacy rights if we don't recognise that they have a right to privacy in their own lives also. The main problem with the “nothing to hide, so nothing to fear” mantra is that this is used as a rhetorical tool by authoritarian forces in government and society, who simply wish to take a more paternalistic and protective attitude. This reflects a disillusionment within the “deep state” about how democratic states should function.

Essentially, those who govern us are given a license through elections to exercise power with consent, but this entails no abrogation of a citizen's duty to question authority. Instead, that should be seen as a civic duty - providing the objections are reasonable. People actually know that there are certain things in their lives that they don't wish other people to know, but by indoctrinating the “nothing to hide” ideology, it inculcates a general tendency towards more conformism in society, by inhibiting critical voices.

4. Should people have the right to give up their right to privacy? Why / Why not?


In European data protection law there is an obscure provision which is particularly relevant to medical privacy, but almost never used in the area of so-called sensitive personal data, like political views or philosophical views. It is possible currently for European governments to legislate to override the ability of the individual to consent. So this might arise, for example, if a foreign company sets up a service to get people to consent to have their DNA analysed and taken into foreign databases, or generally where people might consent to a big foreign company analysing and capturing their medical records. I think there is a legitimate view that, as a matter of national policy, a government could decide that these activities were threatening to data sovereignty, or were simply bad public policy. For example, if a country has a deeply-rooted social contract that guarantees the ability to access medical care through a national health service, private sector actors could try to undermine that social-solidarity basis for universal provision of health care. So for those sorts of reasons I do think it's defensible for governments to have the ability in those sectors to say: “Yes, there are areas where people should not be able to consent to give up their privacy!”

But then going back to the previous answer, more generally, commercial privacy policies are now so complicated – well, they've always been complicated, but now are mind-blowingly devious as well – that people have no real possibility of knowing what they're consenting to. For example, the secondary uses of data flows in social networks are almost incomprehensible, even for technologists at the forefront of research. The French data protection authorities are trying to penalize Google for replacing several very complicated privacy policies with one so-called unified policy, which says almost nothing at all. There's no possible way for people to give informed consent to this over-simplified policy, because it doesn't tell even an expert anything useful. So again in these circumstances, it's right for a regulator to intercede to prevent unfair exploitation of the deceptive kind of “tick-box” consent. Lastly, it is not possible for EU citizens to waive or trade away their basic right to access (or delete) their own data in future, because this seems a reckless act and it cannot be foreseen when this right might become essential in some future circumstances. So in these three senses, I believe it is proper for legislation to be able to prevent the abuse of the concept of consent.

5. Do you agree with India's UID scheme? Why / Why not?


There is a valid debate about whether it's useful for a country to have a national identity system of some kind – and there are about three different ways that can be engineered technically. The first way is to centralise all data storage in a massive repository, accessed through remote terminal devices. The second way is a more decentralised approach with a number of different identity databases or systems which can interoperate (or “federate” with each other), with technical and procedural rules to enforce privacy and security safeguards. In general it's probably a better idea to decentralise identity information, because then if there is a big disaster (or cyber-attack) or data loss, you haven't lost everything. The third way is what's called “user-centric identity management”, where the devices (smartphones or computers) citizens use to interact with the system keep the identity information in a totally decentralised way.

Now the obvious objection to that is: “Well, if the data is decentralised and it's an official system, how can we trust that the information in people's possession is authentic?”. Well, you can solve that with cryptography. You can put digital signatures on the data, to show that the data hasn't been altered since it was originally verified. And that's a totally solved problem. However, unfortunately, not very many policy makers understand that and so are easily persuaded that centralization is the most efficient and secure design – but that hasn't been true technically for twenty years. Over that time, cryptographers have refined the techniques (the algorithms can now run comfortably on smartphones) so that user-centric identity management is totally achievable, but policy makers have not generally understood that. There is no technical reason a totally user-centric vision of identity architecture should not be realized. Yet the UID appears to be one of the most centralised large systems ever conceived.
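Bowden's point about signing decentralised identity data can be made concrete with a minimal sketch. The snippet below uses deliberately toy RSA parameters and an invented record format – these numbers are trivially breakable and the field names are hypothetical; a real identity system would use a vetted library (e.g. 2048-bit RSA or Ed25519 via OpenSSL or GnuPG):

```python
import hashlib

# Toy RSA signature parameters -- illustrative only, trivially breakable.
p, q = 61, 53
n = p * q              # public modulus (3233)
e = 17                 # public verification exponent
d = 2753               # private signing exponent: e*d = 1 mod (p-1)*(q-1)

def digest(record: bytes) -> int:
    """Hash the record and reduce it into the toy key space."""
    return int.from_bytes(hashlib.sha256(record).digest(), "big") % n

def sign(record: bytes) -> int:
    """The issuing authority signs the digest with the private exponent."""
    return pow(digest(record), d, n)

def verify(record: bytes, signature: int) -> bool:
    """Anyone holding only the public key (n, e) can check the record offline."""
    return pow(signature, e, n) == digest(record)

# A hypothetical identity record held on the citizen's own device.
record = b"name=A. Citizen;dob=1980-01-01"
signature = sign(record)
print(verify(record, signature))   # True: the record is authentic
# Any alteration to the record invalidates the signature
# (with overwhelming probability, even at this toy scale).
```

Because verification needs only the public key, authenticity can be checked without contacting any central database – which is exactly what makes the decentralised designs viable.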

There are still questions I don't understand about its technical architecture. For example, just creating an identity number by itself doesn't guarantee security, and it's a classic mistake to treat an identifier as an authenticator. In other words, to use an identifier, or knowledge of an identifier – which could become public information, like the American social security number – as if it were a key to open up a system and give people access to their own private information is very dangerous. So it's not clear to me whether the UID system is designed to avoid this. It seems that by just quoting back a number, in some circumstances this will be the key to open up the system, to reveal private information, and that is an innately insecure approach. There may be details of the system I don't understand, but I think it's open to criticism on those systemic grounds.
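The identifier-versus-authenticator distinction Bowden describes can be sketched in a few lines. Everything here is hypothetical – the record, the ID number and the enrolment secret are invented; the point is only the contrast between the two lookups:

```python
import hashlib
import hmac

# Hypothetical records service -- all identifiers and data are invented.
RECORDS = {"1234-5678-9012": {"name": "A. Citizen"}}

# A per-user secret issued at enrolment; only its hash is stored server-side.
user_secret = b"secret-issued-at-enrolment"
SECRETS = {"1234-5678-9012": hashlib.sha256(user_secret).digest()}

def lookup_insecure(uid: str):
    """The classic mistake: the (potentially public) number alone unlocks
    the record -- the identifier is being treated as an authenticator."""
    return RECORDS.get(uid)

def lookup_secure(uid: str, secret: bytes):
    """The identifier only names the record; a separate secret proves the
    caller's right to it, compared in constant time to resist timing attacks."""
    expected = SECRETS.get(uid)
    if expected is None or not hmac.compare_digest(expected, hashlib.sha256(secret).digest()):
        return None
    return RECORDS.get(uid)

# Anyone who learns the number can read the record...
print(lookup_insecure("1234-5678-9012"))             # {'name': 'A. Citizen'}
# ...whereas the secure variant also demands the enrolment secret.
print(lookup_secure("1234-5678-9012", user_secret))  # {'name': 'A. Citizen'}
print(lookup_secure("1234-5678-9012", b"guess"))     # None
```

A system built on the first pattern leaks private data to anyone who learns a number that was never designed to stay secret.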

And then more fundamentally, you have to ask what's the purpose of that system in society. You can define a system with a limited number of purposes – which is the better thing to do – and then quite closely specify the legal conditions under which that identity information can be used. It's much more problematic, I think, to try and just say that “we'll be the universal identity system”, and then you just try and find applications for it later. A number of countries tried this approach, for example Belgium around 2000, and they expected that, having created a platform for identity, many applications would follow and tie into the system. This really didn't happen, for a number of social and technical reasons which critics of the design had predicted. I suppose I would have to say that the UID system is almost the antithesis of the way I think identity systems should be designed, which should be based on quite strong technical privacy protection mechanisms – using cryptography – and where, as far as possible, you actually leave the custody of the data with the individual.

Another objection to this user-centric approach is “back-up”: what happens when you lose the primary information and/or your device? Well, you can anticipate that. You can arrange for this information to be backed-up and recovered, but in such a way that the back-up is encrypted, and the recovered copy can easily be checked for authenticity using cryptography.

6. Should Indian citizens be concerned about the Central Monitoring System (CMS)? Why / Why not?


Well, the Central Monitoring System does seem to be an example of very large scale “strategic surveillance”, as it is normally called. Many western countries have had these for a long time, but normally only for international communications. Normally surveillance of domestic communications is done under a particular warrant, which can only be applied one investigation at a time. And it's not clear to me that that is the case with the Central Monitoring System. It seems that this may also be applicable to mass surveillance of communications inside India. Now we're seeing a big controversy in the U.S - particularly at the moment - about the extent to which their international strategic surveillance systems are also able to be used internally. What has happened in the U.S. seems rather deceptive; although the “shell” of the framework of individual protection of rights was left in place, there are actually now so many exemptions when you look in the detail, that an awful lot of Americans' domestic communications are being subjected to this strategic mass surveillance. That is unacceptable in a democracy.

There are reasons why, arguably, it's necessary to have some sort of strategic surveillance in international communications, but what Edward Snowden revealed to us is that in the past few years many countries – the UK, the U.S, and probably also Germany, France and Sweden – have constructed mass surveillance systems which knowingly intrude on domestic communications also. We are living through a transformation in surveillance power, in which the State is becoming more able to monitor and control the population secretively than ever before in history. And it's very worrying that all of these systems appear to have been constructed without the knowledge of Parliaments and without precise legislation. Very few people in government even seem to have understood the true mind-boggling breadth of this new generation of strategic surveillance. And no elections were fought on a manifesto asking “Do people want this or not?”. It's being justified under a counter-terrorism mantra, without very much democratic scrutiny at all. The long term effects of these systems on democracies are really uncharted territory.

We know that we're not in an Orwellian state, but the model is becoming more Kafkaesque. If one knows that this level of intensive and automated surveillance exists, then it has a chilling effect on society. Even if not very much is publicly known about these systems, there is still a background effect that makes people more conformist and less politically active, less prepared to challenge authority. And that's going to be bad for democracy in the medium term – not just the long term.

7. Should surveillance technologies be treated as traditional arms / weapons? If so, should export controls be applied to surveillance technologies? Why / Why not?


Surveillance technologies probably do need to be treated as weapons, but not necessarily as traditional weapons. One probably is going to have to devise new forms of export control, because tangible bombs and guns are physical goods – well, they're not “goods”, they're “bads” - that you can trace by tagging and labelling them, but many of the “new generation” of surveillance weapons are software. It's very difficult to control the proliferation of bits – just as it is with copyrighted material. And I remember when I was working on some of these issues thirteen years ago in the UK – during the so-called crypto wars – that the export of cryptographic software from many countries was prohibited. And there were big test cases about whether the source code of these programs was protected under the US First Amendment, which would prohibit such controls on software code. It was intensely ironic that in order to control the proliferation of cryptography in software, governments seemed to be contemplating the introduction of strategic surveillance systems to detect (among other things) when cryptographic software was being exported. In other words, the kind of surveillance systems which motivated the “cypherpunks” to proselytise cryptography, were being introduced (partly) with the perverse justification of preventing such proliferation of such cryptography!

In the case of the new, very sophisticated software monitoring devices (“Trojans”) which are being implanted into people's computers – yes, this has to be subject to the same sort of human rights controls that we would have applied to the exports of weapon systems to oppressive regimes. But it's quite difficult to know how to do that. You have to tie responsibility to the companies that are producing them, but a simple system of end-user licensing might not work. So we might actually need governments to be much more proactive than they have been with traditional arms export regimes, and to actively follow up controls after export – checking whether these systems are being used only by the intended countries. As for the law enforcement agencies of democratic countries which are buying these technologies: the big question is whether law enforcement agencies are actually applying effective legal and operational supervision over the use of those systems. So, it's a bit of a mess! And the attempts that have been made so far to legislate this area I don't think are sufficient.

8. How can individuals protect their data (and themselves) from spyware, such as FinFisher?


In democratic countries with a good system of the rule of law and supervision of law enforcement authorities, there have been cases – notably in Germany – where it has turned out that police using techniques like FinFisher have actually disregarded legal requirements from court rulings laying down the proper procedures. So I don't think it's good enough to assume that if one were doing ordinary lawful political campaigning, one would not be targeted by these weapons. So it's wise for activists and advocates to think about protecting themselves – and of course, other professions who look after confidential information as well – because these techniques may also get into the hands of industrial spies, private detectives and, generally, people who are not subject to even the theoretical constraints on law enforcement agencies.

After Edward Snowden's revelations, we understand that all our computer infrastructure is much more vulnerable – particularly to foreign and domestic intelligence agencies – than we ever imagined. So, for example, I don't use Microsoft software anymore – I think that there are techniques now being sold to and available to governments for penetrating Microsoft platforms, and probably other major commercial platforms as well. So I've made the choice, personally, to use free software – GNU/Linux, in particular. It still requires more skill for most people to use, but it is much, much easier than even a few years ago. So I think it's probably wise for most people to invest a little time in getting rid of proprietary software if they care at all about societal freedom and privacy. I understand that using the latest, greatest smartphone is cool, as are the entertainment and convenience of the Cloud and tablets – but people should not imagine that they can keep those platforms secure.

It might sound a bit primitive, but I think people should go back to the idea that if they really want confidential communications with their friends, or if they are involved in political work, they have to think about setting aside one machine – which they keep offline and use essentially just for editing and encrypting/decrypting material. Once they've encrypted their work on this “air gap” machine, as it's called, they can put their encrypted emails on a USB stick and transfer them to a second machine which they use to connect online (I notice Bruce Schneier is just now recommending the same approach). Once the “air gap” machine has been set up and configured, it should never be connected to the network again. If you follow those sorts of protocols, that's probably the best that is achievable today.
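The encrypt-offline, transfer-by-USB workflow described above can be sketched in a few shell commands. This is only an illustrative sketch: the filenames and passphrase are placeholders, it uses OpenSSL's symmetric encryption for simplicity, and in practice one would more likely use GPG with public keys so that no shared passphrase has to be exchanged.

```shell
# Hypothetical sketch of the "air gap" workflow; filenames and the
# passphrase are placeholders. In practice, use GPG with public keys.

# --- On the OFFLINE ("air gap") machine: write and encrypt the draft ---
echo "confidential draft" > message.txt
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in message.txt -out message.enc -pass pass:correct-horse-battery

# message.enc is now what travels on the USB stick to the online machine;
# the plaintext itself never touches a networked computer.

# --- On the recipient's offline machine: decrypt with the shared passphrase ---
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in message.enc -out message.out -pass pass:correct-horse-battery
```

The key property is that only the ciphertext (`message.enc`) ever crosses the air gap; even if the online machine is compromised by a Trojan such as FinFisher, the attacker sees only encrypted material.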

9. How would you advise young people working in the surveillance industry?


Young people should try and read a little into the ethics of surveillance, and understand their own ethical limits on what they want to do working in that industry. In some sense, I think it's a bit like contemplating a career in the arms industry. There are defensible uses of military weapons, but the companies that build these weapons are, at the end of the day, just corporations maximizing value for shareholders. And so you need to take a really hard look at the company you're working for, or the area you want to work in, and satisfy your own standard of ethics that what you're doing is not violating other people's human rights. In the fantastically explosive growth of the surveillance industry that we've seen over the past few years – and it's accelerating – the technologies being developed for electronic mass surveillance in particular are fundamentally and ethically problematic. And I think that for a talented engineer, there are probably better things he or she can do with a career.

    Mapping Digital Media: Broadcasting, Journalism and Activism in India: A Public Consultation

    by Samantha Cassar last modified Nov 07, 2013 03:38 AM
    Lawyers, researchers, journalists and activists gathered on Sunday, October 27, 2013 at the Bangalore International Centre in response to India’s country report on Mapping Digital Media, which examines citizens’ access to quality news and information across different industries, and the impact of digitisation on media freedoms. Respondents examined themes related to regulation, journalism and activism, and engaging discussions took place among attendees.

    Respondents from various perspectives spoke at the public consultation on different sections of the Mapping Digital Media: India report.


    On behalf of the event organizers, we invite you to view the report, available online for free access here: “Mapping Digital Media: India”.


    The event organizers – Alternative Law Forum, The Centre for Internet & Society, and Maraa – held a public consultation at the Bangalore International Centre with the ultimate goal of informing and engaging the public on key themes of the Mapping Digital Media: India report, as a new knowledge base for better understanding India’s transitioning digital landscape. Many ideas about moving forward with the report’s findings also emerged, as prospective next steps following the report’s release.

    Respondents consisted of reputed media lawyers, researchers, journalists, activists and other media professionals. Each spoke in one of three panel discussions pertaining to different sections of the report: Policies, Laws and Regulators; Digital Activism; and Digital Journalism. Each speaker shed new light on key challenges confronting the emergent digital media landscape, with special focus given to broadcasting (radio and television), cable operations and newspapers (print & online) as each of these sectors undergoes digitisation.

    Opening

    Vibodh Parthasarathi, who had anchored the country report, started off the consultation by underscoring the report's objective of mapping the different sectors and seemingly disparate aspects of India's complex media landscape. Following this brief introduction, the stage was set by Alternative Law Forum Co-founder and Partner, Lawrence Liang, who shared the event's ultimate aim: speaking collectively to the report so that we may gain a better understanding of an area that is otherwise opaque to most. Lawrence also brought to the forefront the report’s debunking of the idea of a digital divide in India, and its account of a rich media landscape.

    Policies, Laws and Regulators

    The consultation’s first panel discussion was started by Lawrence, who responded to the report from a legal perspective. Lawrence examined the role of the state in India’s rich media landscape, specifically in terms of the four values at its centre: freedom of speech and expression, access to infrastructure, the question of development, and the question of market regulation – all of which are tied together within the country report. Lawrence argued that we must arrive at quantitative measures of assessing the diversity and quantity of freedom of speech, but only after understanding the ecology in which freedom of speech operates, and he attempted to do so by examining drafted policies, policing measures, and market regulatory measures taken within the Indian context.

    An engaging discussion took place following this panel’s speakers. Among the points made by attendees were questions of how to scale up the citizen’s stake in media within a legal paradigm, as well as points on the challenges to equity in media content.

    Digital Media and Society (Digital Activism)

    The discussion began with panelist Arjun Venkatraman, Co-founder of the Mojolab Foundation and of the digital activism platform Swara. Arjun engaged with the digital media debate by speaking on behalf of members of civil society who act from within the digital divide, and exposed the gaps in new modes of activism that arise from a lack of understanding of how to engage with these new media. He also informed attendees of how to build cheap IVR-based voice portals, linking voice users to the web for under USD 200, as a means of leveraging users’ voices via unlicensed spectrum.

    Also contributing to the discussion on digital activism was Meera K, Co-founder of the Bangalore news publication Citizen Matters. Examining examples of the new spaces that digital media has provided for the exchange of pluralistic views and alternative voices, Meera critiqued the different types of activism that have emerged, including social activism, political activism, and middle-class activism. She questioned whether new media can be seen as a sufficient space for free speech, with reference to various challenges such as the polarization of debates, and also compared the positive outcomes of new media campaigns – such as tangible, capitalized solutions – with their corresponding pitfalls.

    A debate amongst attendees followed in response to the question of assessing the value of media in terms of impact or size of public outreach, along with how content is generated and controlled.

    Digital Media and Journalism

    Independent journalist and media analyst Geeta Seshu got the conversation on digital media and journalism started by comparing the pitfalls of journalism in traditional media with the possibilities offered by digital journalism. Geeta argued that journalists have become devalued and are losing their footing within traditional media. She discussed the new forms of journalism, how news can be generated in an interactive and non-hierarchical manner, and examined the intersections of mainstream media and journalism. She questioned whether digital journalism can exist on its own, without the influence or incorporation of the principles of traditional media, and grappled with the possibilities for providing a new model for doing so.

    The day’s last speaker was Subhash Rai, Associate Editor of The New Indian Express. Subhash offered a mainstream perspective and argued that we must look at traditional and mainstream forms of media as a starting point for emerging forms of journalism before we can begin to understand these journalism models better. Likewise, traditional and mainstream means of news dissemination can learn from digital media; however, we should not be quick to look away from the core of the entire picture, as traditional forms of media are still very strong in comparison.

    A discussion followed, surrounding questions posed by speakers and attendees, such as what digital journalism should look like, and how the transition to new forms of media should be imagined. How information has changed with respect to its creation and consumption was debated as well.

    Moving Forward

    Before the conclusion of the public consultation, attendees and speakers discussed future advancements for the country report. Many recommendations and ideas were generated, including suggestions for future public consultations, advocacy windows offered by the report, and ways to produce another iteration of the report. Prospective initiatives included online working groups to dive deeper into specific themes of the report, a Hackathon where attendees would pool ideas together, and follow-up public consultations.


    The event's agenda went as follows:

    10.00 a.m. Introductory Remarks by Vibodh Parthasarathi, CCMG, Jamia
    10.15 a.m. - 11.30 a.m. Policies, Laws and Regulators
    Session Moderator – Ram Bhat
    Speakers – Lawrence Liang (ALF) and Mathew John (JGLS)
    11.30 a.m. - 11.45 a.m. Tea Break

    11.45 a.m. - 1.15 p.m. Digital Media and Society (Digital Activism)
    Session Moderator – Lawrence Liang
    Speakers – Arjun Venkatraman (Mojolab) and Meera K (Citizen Matters)

    1.15 p.m. - 2.00 p.m. Lunch Break
    2.00 p.m. - 3.15 p.m. Digital Media and Journalism
    Session Moderator – Vibodh Parthasarathi
    Speakers – Geeta Seshu (Free Speech Hub) and Subhash Rai (newindianexpress.com)
    3.15 p.m. - 4.00 p.m. The Way Ahead (Moving Forward)
    Moderated by Lawrence Liang

    Event Participants

    1. Rashmi Vallabhrajasyuva
    2. Meera K, Oorvani Foundation
    3. Samantha Cassar, CIS
    4. Sharath Chandra Ram, CIS
    5. Suresh Kumar, Artist
    6. Aruna Sekhar, Amnesty India
    7. Sriram Sharma, Part time Blogger
    8. Ammu Joseph, Independent Researcher
    9. Mathew John, Jindal Global Law School
    10. Swati Mehta, The Rules
    11. James North, The Rules
    12. Bhairav Acharya, Lawyer
    13. Deepa Kurup, The Hindu
    14. Abhilash N, Independent
    15. Deepu, Pedestrian Pictures
    16. Rashmi M, PhD Student at NIAS
    17. Jayanth S, LOCON Solutions Pvt Ltd.
    18. Nehaa Chaudhari, CIS
    19. Dinesh TB, Servelots
    20. Snehashish Ghosh, CIS
    21. Lawrence Liang, ALF
    22. Vibodh Parthasarathi, CCMG, Jamia
    23. Ram Bhat, Maraa
    24. Ashish Sen, AMARC
    25. Subhash Rai, New Indian Express
    26. Geeta Seshu, Free Speech Hub, The Hoot
    27. Arjun Venkatraman, Mojo Lab Foundation
    28. Raajen, Centre for Education and Documentation
    29. Ekta, Maraa
    30. Smarika Kumar, ALF

    Press Coverage

    1. Need to increase diversity in online journalism (The New Indian Express, October 28, 2013).
    2. Experts moot holistic approach to media laws (The Hindu, October 28, 2013).

    CIS Cybersecurity Series (Part 12) - Namita Malhotra

    by Purba Sarkar last modified Nov 18, 2013 10:03 AM
    CIS interviews Namita Malhotra, researcher and lawyer at Alternative Law Forum, Bangalore, as part of the Cybersecurity Series.

    "In a strange mix of how both capitalism and state control work, what is happening is that more and more of these places that one could access, for various reasons, whether it is for one's own pleasure or for political conversations, are getting further and further away from us. And I think that that mix of both corporate interests and state control is particularly playing a role in this regard." - Namita Malhotra, researcher and lawyer, Alternative Law Forum

    Centre for Internet and Society presents its twelfth installment of the CIS Cybersecurity Series. 

    The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

    Namita Malhotra is a researcher and lawyer at Alternative Law Forum (ALF). She has a keen interest in working on law, technology and media through legal research, cultural studies, new media practices and film making.

    ALF homepage: www.altlawforum.org


    This work was carried out as part of the Cyber Stewards Network with the aid of a grant from the International Development Research Centre, Ottawa, Canada.


    First Look: CIS Cybersecurity documentary film

    by Purba Sarkar last modified Dec 17, 2013 08:16 AM
    CIS presents the trailer of its documentary film DesiSec: Cybersecurity & Civil Society in India

    The Centre for Internet and Society is pleased to release the trailer of its first documentary film, on cybersecurity and civil society in India. 

    The documentary is part of the CIS Cybersecurity Series, a work in progress which may be found here.

    DesiSec: Cybersecurity and Civil Society in India

    The trailer of DesiSec: Cybersecurity and Civil Society in India was shown at the Internet Governance Forum in Bali on October 24. It was a featured presentation at the Citizen Lab workshop, Internet Governance For The Next Billion Users.

    The transcript of the workshop is available here: http://www.intgovforum.org/cms/component/content/article/121-preparatory-process/1476-ws-344-internet-governance-for-the-next-billion-users 

    This work was carried out as part of the Cyber Stewards Network with the aid of a grant from the International Development Research Centre, Ottawa, Canada.

    Seventh Privacy Round-table

    by Elonnai Hickok last modified Nov 20, 2013 09:58 AM
    On October 19, 2013, the Centre for Internet and Society (CIS) in collaboration with the Federation for Indian Chambers of Commerce and Industry, the Data Security Council of India, and Privacy International held a “Privacy Round-table” in New Delhi at the FICCI Federation House.

    The Round-table was the last in a series of seven, beginning in April 2013, which were held across India.

    Previous Privacy Round-tables were held in:

    • New Delhi: (April 13, 2013) with 45 participants;
    • Bangalore: (April 20, 2013) with 45 participants;
    • Chennai: (May 18, 2013) with 25 participants;
    • Mumbai, (June 15, 2013) with 20 participants;
    • Kolkata: (July 13, 2013) with 25 participants; and
    • New Delhi: (August 24, 2013) with 40 participants.

    Chantal Bernier, Assistant Privacy Commissioner Canada, Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party, and Christopher Graham, Information Commissioner UK were the featured speakers for this event.

    The Privacy Round-tables were organised to spark public dialogue and gather feedback on a privacy framework for India. To achieve this, the Privacy Protection Bill, 2013, drafted by the Centre for Internet and Society, Strengthening Privacy through Co-regulation by the Data Security Council of India, and the Report of the Group of Experts on Privacy by the Justice A.P. Shah committee were used as background documents for the Round-tables. As a note, after each Round-table, CIS revised the text of the Privacy Protection Bill, 2013 based on feedback gathered from the general public.

    The Seventh Privacy Round-table meeting began with an overview of the past round-tables, a description of the evolution of privacy legislation in India to date, and an overview of the Indian interception regime. In 2011, the Department of Personnel and Training drafted a Privacy Bill that incorporated provisions regulating data protection, surveillance, interception of communications, and unsolicited messages. Since 2010, India has been seeking data secure status from the European Union, and in 2012 a report was issued noting that the Reasonable Security Practices and Procedures and Sensitive Personal Data or Information Rules, found under section 43A of the Information Technology Act, were not sufficient to meet EU data secure adequacy. In 2012, the Report of the Group of Experts on Privacy, recommending a privacy framework for India, was published and accepted by the government; the Department of Personnel and Training is presently responsible for drafting privacy legislation for India.


    Presentation: Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party


    Jacob Kohnstamm made a presentation on the privacy framework in the European Union. In his presentation, Kohnstamm shared how history, such as the Second World War, shaped the present understanding and legal framework for privacy in the European Union, where privacy is seen as a fundamental human right. Kohnstamm also explained how, over the years, technological developments have turned data into gold; subsequently, companies who process this data and create services that allow for the generation of more data are becoming monopolies. This has created an unbalanced situation for the individual consumer, whose data is routinely collected by companies, and once collected – the individual loses control over it. Because of this asymmetric relationship, data protection regulations are critical to ensure that individual rights are safeguarded.

    Kohnstamm recognized the tension between stringent data protection regulations on the one hand, and government security and the provision of services by businesses on the other. However, he argued that the use of technology without regulation – for commercial or security reasons – can lead to harm. Thus, it is key that any regulation incorporate proportionality as a cornerstone of the use of these technologies, to ensure trust between the individual and the State, and between the individual and the corporation. This will also ensure that individuals are given the right of equality and the right to live free of discrimination. Kohnstamm went on to explain that any regulation needs to ensure that individuals are provided the necessary tools to control their data, that a robust supervisory authority is established with enough powers to enforce its provisions, and that checks and balances are put in place to safeguard against abuse.

    In response to a question asked about how the EU addresses the tension of data protection and national security, Kohnstamm clarified that in the EU, national security is left as a matter for member states to address but the main principles found in the EU Data Protection Directive also apply to the handling of information for national security purposes. He emphasized the importance of the creation of checks and balances. As security agencies are given additional and broader powers, they must also be subjected to stronger safeguards.

    Kohnstamm also discussed the history of the free trade agreement with India, and India’s request for data secure status. It was noted that the free trade agreement between India and the EU is currently stalled, as India has asked for data secure status. For the EU to grant this status, it must be satisfied that when European data is transferred to and processed in India, it is subject to the same level of protection as it would be if it were processed in the EU. Without privacy legislation in place, India’s present regime does not reflect the same level of protection as the EU regime. To find a way out of this ‘deadlock’, the EU and India have agreed to set up an expert group – with experts from both the EU and India – to find a way in which India’s regime can be modified to meet EU data secure adequacy. To date, no experts from the Indian side have been nominated and communicated to the EU.

    Key Points:

    1. Europe’s history has influenced the understanding and formulation of the right to privacy as a fundamental right.
    2. Any privacy regulation must have strong checks and balances in place and ensure that individuals are given the tools to control their data.
    3. India’s current regime does not meet EU data secure adequacy. Currently, the EU is waiting for India to nominate experts to work with the EU to find a way out of the ‘deadlock’.

    Discussion: National Security, Surveillance and Privacy


    When the discussion was opened to the floor, participants noted that in India there is a tension between data protection and national security, as national security is always a blanket exception to the right to privacy. This tension has been discussed and debated by both democratic institutions in India and commercial entities. It was pointed out that though data protection is a new debate, national security is a debate that has existed in India for many years. It was also pointed out that there are currently not sufficient checks and balances on the powers given to Indian security agencies. One missing safeguard for which the Indian regime has been heavily criticized is the power of the Secretary of the Home Ministry to authorize interception requests: vesting the authorization power in the executive leaves little space between interested parties seeking approval of interception orders, and could result in abuse or conflict of interest. With regard to the Indian interception regime, it was explained that there are currently five ways in which messages can be intercepted in India. Previously, the Law Commission of India had asked that amendments be made to both the Indian Post Office Act and the Indian Telegraph Act.

    Moving the discussion to the Privacy Protection Bill, 2013 by CIS: in Chapter V, “Surveillance and Interception of Communications”, clause 34, the authorization of interception and surveillance orders is left to a magistrate. Previously, the authorization of interception orders rested with the Privacy Commissioner, but this model was heavily critiqued in previous round-tables, and the authorizing authority was subsequently changed to a magistrate. Participants pointed out that the Bill should specify the level of the magistrate that will be responsible for the authorization of surveillance orders, and also raised the concern that the lower judiciary in India is not functioning adequately, as the courts are overwhelmed, thus creating the possibility of abuse. Participants also suggested that data protection and surveillance should perhaps be de-linked from each other and placed in separate bills. This echoes public feedback from previous round-tables.

    While discussing the safeguards needed in an interception and surveillance regime for India, transparency about surveillance – by both the government and service providers – was called out as a key safeguard for ensuring the protection of privacy, as it would enable individuals to make educated decisions about the services they choose to use and the extent of governmental surveillance. The need to bring in a provision incorporating the idea of a "nexus of surveillance" was also highlighted. It was pointed out that in Canada, entities wanting to deploy surveillance in the name of public safety must take steps to prove nexus. For example, the organization must empirically prove that there is a need for a security requirement, demonstrate that only data that is absolutely necessary will be collected, show how the technology will be effective, prove that there is no less invasive way to collect the information, demonstrate security measures in place to guard against loss and misuse, and have in place both internal and external oversight mechanisms. It was also shared that in Canada, security agencies are regulated by the Office of the Privacy Commissioner of Canada, as privacy and security are not seen as separate matters. In the Canadian regime, because security agencies have more powers, they are also subjected to greater oversight.

    Key Points:

    1. The Indian surveillance regime currently does not have strong enough safeguards.
    2. The concept of ‘nexus’ should be incorporated into the Privacy Protection Bill, 2013.
    3. A magistrate, through judicial oversight for interception and surveillance requests, might not be the most effective authority for this role in India.

    Presentation: Chantal Bernier, Deputy Privacy Commissioner, Canada


    In her presentation, Bernier noted that in the Canadian model there are multiple legislative initiatives that are separate but connected, and all provide a legislative basis for the right to privacy. Furthermore, it was pointed out that there are two privacy legislations in Canada: one regulating the private sector and the other regulating the public sector. It has been structured this way because it is understood that the relationship between individuals and business is based on consent, while the relationship between individuals and the state is based on human rights, and because aspects of privacy, such as consent, differ between the public and private sectors. Bernier also pointed out that privacy is a global issue, and because of this it is critical that countries have privacy regimes that can speak to each other. This does not mean that the regimes must be identical, but they must at least be inter-operable.

    Bernier described three main characteristics of the Canadian privacy regime including:

    1. It is comprehensive and applies to both the public and the private sectors.
    2. The right to privacy in Canada is constitutionally based and is a fundamental right as it is attached to personal integrity. This means that privacy is above contractual fairness. That said, the right to privacy must be balanced collectively with other imperatives.
    3. The Canadian privacy regime is principle-based, not rule-based. This flexible model allows for quick adaptation to changing technologies and societal norms. Furthermore, Bernier explained how Canada places responsibility and accountability on companies to respect, protect, and secure privacy in the way each company believes it can best meet these obligations. Bernier also noted that all companies are responsible and accountable for any data that they outsource for processing.

    Furthermore, any company that substantially deals with Canadians must ensure that the forum in which complaints are heard is Canada. Under the Canadian privacy regime, accountability for data protection rests with the original data holder, who must ensure – through contractual clauses – that any information processed by a third party meets the Canadian level of protection. This means any company that deals with a Canadian company will be required to meet Canadian standards for data protection.

    Speaking to the governance structure of the Office of the Privacy Commissioner in Canada, Bernier explained that the OPC is a completely independent office and reports directly to Parliament. The OPC hears complaints from both individuals and organizations. The OPC does not have any enforcement powers, such as fining a company, but it does have the ability to "name" companies that are not in compliance with Canadian regulations, if it is in the public interest to do so. The OPC can perform audits at its discretion with respect to the public sector, and can perform audits on the private sector if it has reasonable grounds to investigate.

    Bernier concluded her presentation with lessons that have been learned from the Canadian experience including:

    1. The importance of having strong regulators.
    2. Privacy regulators must work and cooperate together.
    3. Privacy has become a condition of trade.
    4. In today’s age, issues around surveillance cannot be underestimated.
    5. Companies that have strong privacy practices now have a competitive advantage in today’s global market.
    6. Privacy frameworks must be clear and flexible.
    7. Oversight must be powerful to ensure proper protection of citizens in a world of asymmetry between individuals, corporations, and governments.

    Key Points:

    1. The Right to Privacy is a fundamental right in Canada.
    2. The Canadian privacy regime regulates the public sector and the private sector, but through two separate legislations.
    3. The OPC does not have the power to levy fines, but does have the power to conduct audits and investigations and ‘name’ companies who are not in compliance with Canadian regulations if it is in the public interest.

    Discussion: The Data Protection Authority


    Participants also discussed the composition of the Data Protection Authority as described in chapter IV of the Privacy Protection Bill. It was pointed out that in the Bill, the Data Protection Authority might need to be made more independent. It was suggested that to avoid having the office of the Data Protection Authority filled with bureaucrats, the Bill should specify that the office must be staffed by individuals with IT experience, lawyers, judges, etc. On the other hand, it was cautioned that though this might be useful to some extent, it might not be helpful to be overly prescriptive, as there is no set profile of what composition of employees makes for a strong and effective Data Protection Authority. Instead, the Bill should ensure that the office of the Data Protection Authority is independent, accountable, and chosen by an independent selection board.

    When discussing possible models for the framework of the Data Protection Authority, it was pointed out that there are many models that could be adopted. Currently in India the commission model is not flexible, and many commissions that are set up are not effective due to funding constraints and internal bureaucracy. Taking that into account, the Data Protection Authority in the Privacy Protection Bill, 2013 could be established as a small regulator with an appellate body to hear complaints.

    Key Points:

    1. The Data Protection Authority established in the Privacy Protection Bill must be adequately independent.
    2. The composition of the Data Protection Authority should be diverse, and it should have the competence to address the dynamic nature of privacy.
    3. The Data Protection Authority could be established as a small regulator with an appellate body attached.

    Presentation: Christopher Graham, Information Commissioner, United Kingdom


    Christopher Graham, the UK Information Commissioner, spoke about the privacy regime in the United Kingdom and his role as the UK Information Commissioner. As the UK Information Commissioner, his office is responsible for both the UK Data Protection Act and the Freedom of Information Act. In this way, the right to know is not placed in opposition to the right to privacy, but is instead treated as integral to it.

    Graham said that his office also provides advice to data controllers on how to comply with the privacy principles found in the Data Protection Act, and that his office has the power to levy fines of up to half a million pounds on non-compliant data controllers. Despite having this power, it is rarely used, as a smaller fine is usually sufficient to achieve the desired effect. Yet, at the end of the day, whatever penalty is levied must be proportionate and risk-based, i.e., selective, to be effective. In this way the regulatory regime should not be heavy handed, but instead should be subtle and effective. In fact, one of the strongest regulators is the reality of the marketplace, where the price of not having strong standards is lost innovation and economic growth. To this extent, Graham also pointed out that self-regulation and co-regulation are both workable models if there are strong enforcement mechanisms. Graham emphasized that any data protection regime must go beyond, and cannot be limited to, security alone.

    Graham also explained that he has found that there is currently a lack of confidence in Indian partners. This is problematic as Indian industry tries to grow with European partners. For example, he has been told that customers are moving banks because their previous bank’s back offices were located in India. Citing other examples of data breaches involving Indian data controllers, such as a call centre merging the accounts of two customers and another call centre selling customer information, he explained that the lack of confidence in the Indian regime has real economic implications. Graham further explained that one difficulty the office of the UK ICO is faced with is that India does not have an equivalent of the ICO. Thus, when a breach does happen, it is unclear who can be approached in India about the breach.

    Touching upon the issue of data adequacy with the EU, Graham noted that if data adequacy is a goal of India, the privacy principles as defined in the Directive and reflected in the UK Data Protection Act must be addressed in addition to security. In his presentation, Graham emphasized the importance of India amending its current regime if it wants ‘data secure’ status, and spoke about the economic benefits for both Europe and India if India does in fact obtain that status. In response to a question about why it is so important that India amend its laws if the UK already has the ability to enforce the provisions of the UK Data Protection Act, Graham clarified that the most important consideration is the rule of law: according to UK law, and more broadly the EU Directive, companies cannot transfer information to jurisdictions that do not have recognized adequate levels of protection. Thus, if companies still wish to transfer information to India, this must be done through binding corporate rules.

    Another question put forth was about how the right to privacy differs from other human rights, and why countries require other countries to uphold the right to privacy to the same level when, for example, this is not practiced for other human rights such as children’s rights. In response Graham explained that data belongs to the individual, and when it is transferred to another country it still belongs to the individual. Although the UK would like all countries to uphold the rights of children to the standard that it does, the UK is not exporting UK citizens’ children to India. Thus, as the Information Commissioner he has a responsibility to protect his citizens’ data, even when it leaves the UK jurisdiction. Graham explained further that in the history of Europe, the misuse of data to do harm has been a common trend, which is why privacy is seen as a fundamental right, and why it is paramount that European data is subject to the same level of protection no matter what jurisdiction it is in. To understand why Europe requires countries to be ‘data secure’ before transferring data to them, India needs to recognise that privacy is a fundamental right that goes beyond security, and that when a company processes data it does not own the data: the individual owns the data and has rights attached to it.

    Key Points:

    1. The UK Information Commissioner’s Office regulates both the right to information and privacy, and thus the two rights are seen as integral to each other.
    2. Penalties must be proportionate and scalable to the offense.
    3. Co-regulation and self-regulation can both be viable models for privacy, but enforcement is key to them being effective.

    Discussion: Collection of Data with Consent and Collection of Data without Consent


    Participants also discussed the collection of data with consent and the collection of data without consent, as found in Chapter III of the Bill. When asked about the circumstances in which informed consent should not be required, it was pointed out that in the Canadian model, the option to collect information without consent applies only to the public sector, and only if it is necessary for the delivery of a government service. In the private sector all collection of information requires informed and meaningful consent. Yet, collection of data without consent in the commercial context is an area that Canada is wrestling with, as there are instances, such as online advertising, where it is unreasonable to expect consent all the time. It was also pointed out that in the European Directive, consent is only one of the seven grounds under which data can be collected. As part of the conversation on consent, it was pointed out that the Bill currently does not explicitly take into account consent for the transfer of information, and it does not address changing terms of service and whether companies must re-obtain consent, or whether providing notice to the individual is sufficient. The question of consent and the additional data that is generated through use of a service was also raised. For example, if an individual signs up for a mobile connection and initially provides information that the service provider stores in accordance with the privacy principles, does the service provider have an obligation to treat all data generated by the user while using the service the same way? The exception of disclosure without consent was also raised, and it was pointed out that companies are required to disclose information to law enforcement when required. For example, telecom service providers must now store location data of all subscribers for up to six months and share the same when requested by law enforcement.

    Key Points:

    1. There are instances where expecting companies to obtain informed consent for every collection of information is not reasonable. Alternative models, based for example on transparency, must be explored to address these situations.
    2. The Privacy Protection Bill should explicitly address transfer of information to other countries.
    3. The Privacy Protection Bill should address consent in the context of changing terms of service.

    Discussion: Penalties and Offences


    The penalties and offenses prescribed in chapter VI of the Privacy Protection Bill were discussed by participants. While discussing the chapter, many different opinions were voiced. For example, some participants held the opinion that offences and penalties should not exist in the Privacy Protection Bill, because in reality they are more likely than not to be ineffective. For example, when litigating civil penalties, it takes a long time for the money to be realized. Others argued that in India, where enforcement of any law is often weak, strong, clear, and well defined criminal penalties are needed. Another comment raised the point that a distinction should be made between breaches of the law by data controllers and breaches by rogue individuals, as the type of violation differs. For example, a breach by a data controller is often a matter of identifying the breach and putting in place strictures to ensure that it does not happen again by holding the company accountable through oversight. Whereas a breach by a rogue agent entails identifying the breach and the rogue agent, and creating a strong enough penalty to ensure that they will not repeat the violation. Adding to this discussion, it was pointed out that in the end, scalability is key in ensuring that penalties are proportional and effective. It was also noted that in the UK, any fine that is levied is appealable. This builds in a system of checks and balances, and ensures that companies and individuals are not subject to unfair or burdensome penalties.

    The possibility of incentivizing compliance, through rewards and distinctions, was discussed by participants. Some felt that incentivizing compliance would be more effective, as it would give companies distinct advantages for incorporating privacy protections, while others felt that incentives can be included but penalties cannot be excluded, otherwise the provisions of the Privacy Protection Bill, 2013 will not be enforceable. It was also pointed out that in the context of India, ideally there should be a mechanism to address the ‘leakages’ that happen in the system, i.e., corruption. Though this is difficult to achieve, regulations could take steps like specifically prohibiting the voluntary disclosure of information by companies to law enforcement. Taking a sectoral approach to penalties was also suggested, as companies in different sectors face specific challenges and types of breaches. Another approach that could be implemented is the statement of a time limit for data controllers and commissioners to respond to complaints. This has worked for the implementation of the Right to Information Act in India, and it would be interesting to see how it plays out for the right to privacy. Throughout the discussion a number of different possible ways to structure offenses and penalties were suggested, but across all of them it was clear that it is important to be creative about the types of penalties and not rely only on financial penalties, as for many companies a fine has less of an impact than, perhaps, having to publicly disclose what happened around a data breach.

    Key Points:

    1. Penalties and offenses by companies vs. rogue agents should be separately addressed in the Bill.
    2. In addition to levying penalties, the Bill could include incentives to encourage compliance.
    3. Penalties for companies should go beyond fines and include mechanisms such as requiring the company to disclose to the public information about the breach.

    Discussion: Cultural Aspects of Privacy


    The cultural realities of India, and their subsequent impact on the perception of privacy in India, were discussed. It was pointed out that India has a history of colonization, multiple religions and languages, ethnic tensions, a communally based society, and a large population. All of these factors impact understandings, perceptions, practices, and the effectiveness of different frameworks around privacy in India. For example, the point was raised that given India’s cultural and political diversity, a principle based model might be too difficult to enforce, as every judge, authority, and regulator will have a different perspective and agenda. Other participants pointed out that there is a lack of awareness around privacy in India, and this will impact the effectiveness of the regulation. It was also highlighted that anecdotal claims that privacy in India is culturally different (such as the observation that on a train in India everyone will ask you personal questions, and therefore Indians have no concept of privacy) cannot influence how a privacy law is framed for India.

    Key Points:

    1. India’s diverse culture will impact perceptions of privacy and the implementation of any privacy regulation.
    2. Given India’s diversity, a principle based model might not be adequate.
    3. Though culture is important to understand and incorporate into the framing of any privacy regulation in India, anecdotal stories and broad assumptions about India’s culture and societal norms around privacy cannot influence how a privacy law is framed for India.

    Conclusion

    The seventh privacy round-table concluded with a conversation on NSA spying and the Snowden revelations. It was asked whether domestic servers could be an answer to protecting Indian data. Participants agreed that domestic servers are just a band-aid for the problem. With regards to the Privacy Protection Bill, it was clarified that CIS is now in the process of collecting public comments on the Bill and will be submitting a revised version to the Department of Personnel and Training. Speaking to the privacy debate at large, it was emphasized that every stakeholder has an important voice and can impact the framing of a privacy law in India.

    Why 'Facebook' is More Dangerous than the Government Spying on You

    by Maria Xynou last modified Nov 23, 2013 08:38 AM
    In this article, Maria Xynou looks at state and corporate surveillance in India and analyzes why our "choice" to hand over our personal data can potentially be more harmful than traditional, top-down, state surveillance. Read this article and perhaps reconsider your "choice" to use social networking sites, such as Facebook.

    by AJC1 on flickr

    Do you have a profile on Facebook? Almost every time I ask this question, the answer is ‘yes’. In fact, I think the number of people who have replied ‘no’ to this question can literally be counted on my right hand. But this is not an article about Facebook per se. It’s more about the ‘Facebooks’ of the world, and about people’s increasing “choice” to hand over their most personal data. A more accurate question is probably:

    “Would you like the Government to go through your personal diary? If not, then why do you have a profile on Facebook?”

    The Indian Surveillance State

    Following Snowden’s revelations, there’s finally been more talk about surveillance. But what is surveillance?

    David Lyon - who directs the Surveillance Studies Centre - defines surveillance as “any collection and processing of personal data, whether identifiable or not, for the purposes of influencing or managing those whose data have been garnered”. Surveillance can also be defined as the monitoring of the behaviour, activities or other changing information of individuals or groups of people. However, this definition implies that individuals and/or groups of people are being monitored in a top-down manner, without this being their “choice”. But is that actually the case? To answer this question, let’s have a look at how the Indian government and corporations operating in India spy on us.

    State Surveillance

    The first things that probably come to mind when thinking about India from a foreigner’s perspective are poverty and corruption. Surveillance appears to be a “Western, elitist issue”, which mainly concerns those who have already solved their main survival problems. In other words, the most mainstream argument I hear in India is that surveillance is not a real issue, especially since the majority of the population in the country lives below the poverty line and does not even have any Internet access. Interestingly enough though, the other day when I was walking around a slum in Koramangala, I noticed that most people have Airtel satellite dishes...even though they barely have any clean water!

    The point though is that surveillance in India is a fact, and the state plays a rather large role in it. In particular, Indian law enforcement agencies follow three steps in ensuring that targeted and mass surveillance is carried out in the country:

    1. They create surveillance schemes, such as the Central Monitoring System (CMS), which carry out targeted and/or mass surveillance

    2. They create laws, guidelines and license agreements, such as the Information Technology (Amendment) Act 2008, which mandate targeted and mass surveillance and which require ISP and telecom operators to comply

    3. They buy surveillance technologies from companies, such as CCTV cameras and spyware, and use them to carry out targeted and/or mass surveillance

    While Indian law enforcement agencies don’t necessarily follow these steps in this precise order, they usually try to create surveillance schemes, legalise them and then buy the gear to carry them out.

    In particular, surveillance in India is regulated under five laws: the Indian Telegraph Act 1885, the Indian Post Office Act 1898, the Indian Wireless Telegraphy Act 1933, section 91 of the 1973 Code of Criminal Procedure (CrPC) and the Information Technology (Amendment) Act 2008. These laws mandate targeted surveillance, but remain silent on the issue of mass surveillance, which means that technically it is neither allowed nor prohibited, but remains a legal grey area.

    While surveillance laws in India may not mandate mass surveillance, some of their sections are particularly concerning. Section 69 of the Information Technology (Amendment) Act 2008 allows for the interception of all information transmitted through a computer resource, while requiring that all users disclose their private encryption keys or face a jail sentence of up to seven years. This appears to be quite bizarre, as individuals can only keep their data private and protect themselves from surveillance through encryption.

    Section 44 of the Information Technology (Amendment) Act 2008 imposes stiff penalties on anyone who fails to provide requested information to authorities - which kind of reminds us of Orwell’s totalitarian regime in “1984”. Furthermore, section 66A of the same law states that individuals will be punished for sending “offensive messages through communication services”. However, the vagueness of this section raises huge concerns, as it remains unclear what defines an “offensive message” and whether this will have grave implications on the freedom of expression. The arrest of two Indian women last November over a Facebook post reminds us of this.

    Laws in India may not mandate mass surveillance, but guidelines and license agreements issued by the Department of Telecommunications do. In particular, the UAS License Agreement regarding the Central Monitoring System (CMS) not only mandates mass surveillance, but also attempts to legalise a mass surveillance scheme which aims to intercept all telecommunications and Internet communications in India. Furthermore, the Department of Telecommunications has issued numerous guidelines and license agreements for ISPs and telecom operators, which require them to not only be “surveillance-friendly”, but to also enable law enforcement agencies to tap into their servers on the grounds of national security. And then, of course, there’s the new National Cyber Security Policy, which mandates surveillance to tackle cyber-crime, cyber-terrorism, cyber-war and cyber-vandalism.

    As both a result and prerequisite of these laws, the Indian government has created various surveillance schemes and teams to aid them. In particular, India’s Computer Emergency Response Team (CERT) is currently monitoring “any suspicious move on the Internet” in order to checkmate any potential cyber attacks from hackers. While this may be useful for the purpose of preventing and detecting cyber-criminals, it remains unclear how “any suspicious move” is defined and whether that inevitably enables mass surveillance, without individuals’ knowledge or consent.

    The Crime and Criminal Tracking Network & Systems (CCTNS) project is creating a nationwide networking infrastructure for enhancing the efficiency and effectiveness of policing and for sharing data among 14,000 police stations across the country. It has been estimated that Rs. 2000 crore has been allocated for the CCTNS project, and while it may potentially increase the effectiveness of tackling crime and terrorism, it raises questions around the legality of data sharing and its potential implications for the right to privacy and other human rights, especially if such data sharing results in data being disclosed or shared with unauthorised third parties.

    Similarly, the National Intelligence Grid (NATGRID) is an integrated intelligence grid that will link the databases of several departments and ministries of the Government of India so as to collect comprehensive patterns of intelligence that can be readily accessed by intelligence agencies. This was first proposed in the aftermath of the Mumbai 2008 terrorist attacks and while it may potentially aid intelligence agencies in countering crime and terrorism, enforced privacy legislation should be a prerequisite, which would safeguard our data from potential abuse.

    However, the most controversial surveillance scheme being implemented in India is probably the Central Monitoring System (CMS). While several states, such as Assam, already have Internet Monitoring Systems in place, the Central Monitoring System appears to raise even graver concerns. In particular, the CMS is a system through which all telecommunications and Internet communications in India will be monitored by Indian authorities. In other words, the CMS will be capable of intercepting our calls and of analyzing our data on social networking sites, while all such data would be retained in a centralised database. Given that India currently lacks privacy legislation, such a system would mostly be unregulated and would pose major threats to our right to privacy and other human rights. Given that data would be centrally stored, the system would create a type of “honeypot” for centralised cyber attacks. Given that the centralised database would have massive volumes of data for literally a billion people, the probability of error in pattern and profile matching would be high - which could potentially result in innocent people being convicted for crimes they did not commit. Nonetheless, mass surveillance through the CMS is currently a reality in India.

    And the even bigger question: how can law enforcement agencies mine the data of 1.2 billion people? How do they even carry out surveillance in practice? Well, that’s where surveillance technology companies come in. In fact, the surveillance industry in India is massively expanding, especially in light of its new surveillance schemes which require advanced and sophisticated technology. According to CIS’ India Privacy Monitor Map, which is part of ongoing research, Indian law enforcement agencies use CCTV cameras in pretty much every single state in India. The map also shows that Unmanned Aerial Vehicles (UAVs), otherwise known as drones, are being used in most states in India, and the DRDO’s Netra, a lightweight drone not much bigger than a bird, is particularly noteworthy.

    But Indian law enforcement agencies also buy surveillance software and hardware which is aimed at intercepting telecommunications and Internet communications. In particular, ClearTrail Technologies is an Indian company - based in Indore - which equips law enforcement agencies in India and around the world with surveillance software which can probably be compared with the “notorious” FinFisher. So in short, there appears to be a tight collaboration between Indian law enforcement agencies and the surveillance industry, which can be clearly depicted in the ISS surveillance trade shows, otherwise known as “the wiretappers’ ball”.

    Corporate Surveillance

    When I ask people about corporate surveillance, the answer I usually get is: “Corporations only care about their profit - they don’t do surveillance per se”. And while that may be true, David Lyon’s definition of surveillance - as “any collection and processing of personal data, whether identifiable or not, for the purposes of influencing or managing those whose data have been garnered” - may indicate otherwise.

    Corporations, like Google, Amazon and Facebook, may not have an agenda for spying per se, but they do collect massive volumes of personal data and, in cases such as PRISM, allow law enforcement to tap into their servers. Once law enforcement agencies get hold of data collected by companies, such as Facebook, they then use data mining software - equipped by various surveillance technology companies - to process and mine the data. And how do companies, like Google and Facebook, make money off our personal data? By selling it to big buyers, such as law enforcement agencies.

    So while Facebook and all the ‘Facebooks’ of the world may not profit from surveillance per se, they do profit from collecting our personal data and selling it to third parties, which include law enforcement agencies. And David Lyon argues that surveillance involves the collection of personal data - which corporations, like Facebook, do - for the purpose of influencing and managing individuals. While this last point can probably be widely debated, it is clear that corporations share their collected data with third parties, which ultimately leads to the influencing or managing of individuals - directly or indirectly. In other words, the collection of personal data, in combination with its disclosure to third parties, is surveillance. So when we think about companies, like Google or Facebook, we should not just think of businesses interested in their profit - but also of spying agencies. After all, if the product is free, you are the product.

    Now if we look at online corporations more closely, we can probably identify three categories:

    1. Websites through which we buy products and hand over our personal details - e.g. Amazon

    2. Websites through which we use services and hand over our personal details - e.g. flight ticket bookings

    3. Websites through which we communicate and hand over our personal details - e.g. Facebook

    And why could the above be considered “spying” at all? Because such corporations collect massive volumes of personal data and subsequently:

    - Disclose such data to law enforcement agencies

    - Allow law enforcement agencies to tap into their servers

    - Sell such data to “third parties”

    What’s notable about so-called corporate surveillance is that, in all cases, there is a mutual, key element: we consent to the handing over of our personal information. We are not forced to hand over our personal data when buying a book online, booking a flight ticket or using Facebook. Instead, we “choose” to hand over our personal data in exchange for a product or service. Now what significantly differentiates state surveillance from corporate surveillance is the factor of “choice”. While we may choose to hand over our most personal details to large online corporations, such as Google and Facebook, we do not have a choice when the government monitors our communications, collects and stores our personal data.

    State Surveillance vs. Corporate Surveillance

    Both Indian law enforcement agencies and corporations collect massive volumes of personal data. In fact, it is probably noteworthy to mention that Facebook, in particular, collects 20 times more data per day than the NSA in total. In addition, Facebook has claimed that it has received more demands from the US government for information about its users than from all other countries combined. In this sense, the corporate collection of personal data can potentially be more harmful than government surveillance, especially when law enforcement agencies are tapping into the servers of companies like Facebook. After all, the Indian government and all other governments would have very little data to analyse if it weren’t for such corporations.

    Surveillance is not just about “spying” or about “watching people” - it’s about much much more. Observing people’s behaviour only really becomes harmful when the data observed is collected, retained, analysed, shared and disclosed to unauthorised third parties. In other words, surveillance is meaningful to examine because it involves the analysis of data, which in turn involves pattern matching and profiling, which can potentially have actual, real-world implications - good or bad. But such analysis cannot be possible without having access to large volumes of data - most of which belong to large corporations, like Facebook. The question, though, is: How do corporations collect such large volumes of personal data, which they subsequently share with law enforcement agencies? Simple: Because we “choose” to hand over our data!

    Three years ago, when I was doing research on young people’s perspectives on Facebook, all of the interviewees replied that they feel that they are in control of their personal data, because they “choose” what they share online. While this may appear to be a valid point, the “choice” factor can be widely debated. There are many reasons why people “choose” to hand over their personal data: to buy a product, to use a service, to communicate with peers, or because they feel socially pressured into using social networking sites. Nonetheless, it all really comes down to one main reason: convenience. Today, in most cases, the reason why we hand over our personal data online in exchange for products or services is because it is simply more convenient to do so. And while that is understandable, at the same time we are exposing our data (and ultimately our lives) in the name of convenience.

    The irony in all of this is that, while many people reacted to Snowden’s revelations on NSA dragnet surveillance, most of these people probably have profiles on Facebook. Secret, warrantless government surveillance is undeniably intrusive, but at the end of the day, our profiles on Facebook - and on all the ‘Facebooks’ of the world - are what enabled it to begin with. In other words, if we didn’t choose to give up our personal data - especially without really knowing how it would be handled - large databases would not exist and the NSA - and all the ‘NSAs’ of the world - would have had a harder time gathering and analysing data.

    In short, the main difference between state and corporate surveillance is that the first is imposed in a top-down manner by authorities, while the second is a result of our “choice” to give up our data. While many may argue that it’s worse to have control imposed on you, I strongly disagree. When control and surveillance are imposed on us in a top-down manner, it’s likely that we will perceive this - sooner or later - as a direct threat to our human rights, which means that it’s likely that we will resist it at some point. People usually react to what they perceive as a direct threat, whereas they rarely react to what does not directly affect them. For example, one may perceive murder or suicide as a direct threat due to the immediacy of its effect, whereas smoking may not be seen as an equally direct threat, because its consequences are indirect and usually only visible in the long term. It’s somewhat like that with surveillance.

    University students have protested on the streets against the installation of CCTV cameras, but how many of them have profiles on social networking sites, such as Facebook? People may react to the installation of CCTV cameras because they appear to be a direct threat to their right to privacy. However, the irony is that the real danger does not necessarily lie in a few CCTV cameras, but rather in each person’s profile on a major commercial social networking site. At best, a CCTV camera will capture some images of us and, through that, track our location and possibly our acquaintances. What type of data is captured through a simple, “harmless” Facebook profile? The following is probably only a tiny fraction of what is actually captured:

    - Personal photos

    - Biometrics (possibly through photos)

    - Family members

    - Friends and acquaintances

    - Habits, hobbies and interests

    - Location (through IP address)

    - Places visited

    - Economic standing (based on pictures, comments, etc.)

    - Educational background

    - Ideas and opinions (which may be political, religious, etc.)

    - Activities

    - Affiliations

    The above list could go on and on, depending on how much - or what type of - data is disclosed by the individual. The interesting element to this is that we can never really know how much data we are disclosing, even if we think we control it. While an individual may argue that he/she chooses to disclose x amount of data while retaining the rest, that individual may actually be disclosing ten times that amount. This is because every bit of data usually hides many other bits of data that we may not be aware of. It all really comes down to who is looking at our data, when and why.

    For example, (fictional) Priya may choose to share on her Facebook profile (through photos, comments, or any other type of data) that she is female, Indian, a Harvard graduate and that her favourite book is Anarchism and Other Essays by Emma Goldman. At first glance, nothing appears to be “wrong” with what Priya is revealing; in fact, she appears to care about her privacy by not revealing “the most intimate details” of her life. Moreover, one could argue that there is absolutely nothing “incriminating” about her data and that, on the contrary, it just reflects that she is a “shiny star” from Harvard. However, I am not sure that a data analyst would be restricted to this data, or that data analysis would show the same “sparkly” image.

    In theory, the fact that Priya is an Indian who attended Harvard reveals another bit of information that Priya did not choose to share: her economic standing. Given that the majority of Indians live below the poverty line, there is a high probability that Priya belongs to India’s middle class - if not its elite. Priya may not have intentionally shared this information, but it was indirectly revealed through the bits of data that she did reveal: female, Indian and Harvard graduate. And while there may not be anything “incriminating” about her good economic standing, in India this usually implies some strong political affiliation as well. That brings us to her other bit of information: her favourite author is a feminist anarchist. While that may seem innocuous, it may be crucial depending on the specific political actors in the country she’s in and on the general political situation. If a data analyst were to map the data that Priya chose to share, along with all the friends and acquaintances she inevitably has through Facebook, that analyst could probably tell a story about her. And the concerning part is that that story may or may not be true. But that doesn’t really matter.
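
    The kind of inference sketched above can be illustrated with a short, hypothetical example. All of the rules, attributes and values below are invented for illustration; no real profiling system is being described:

```python
# Toy sketch of attribute inference: a few disclosed facts imply
# further, undisclosed facts via background-knowledge rules.
# All attributes and rules here are hypothetical.

disclosed = {
    "gender": "female",
    "nationality": "Indian",
    "education": "Harvard graduate",
    "favourite_author": "Emma Goldman",
}

# Each rule pairs a condition over the disclosed data with an
# attribute the analyst would infer when the condition holds.
rules = [
    (lambda d: d.get("nationality") == "Indian"
               and "Harvard" in d.get("education", ""),
     ("economic_standing", "likely middle class or elite")),
    (lambda d: d.get("favourite_author") == "Emma Goldman",
     ("political_leaning", "possible anarchist/feminist sympathies")),
]

inferred = {key: value for cond, (key, value) in rules if cond(disclosed)}
print(inferred)  # two attributes Priya never chose to disclose
```

    Even this crude sketch derives two attributes from four, which is the point: the person disclosing the data has no visibility into which rules are applied to it.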

    Today, governments don’t judge us and take decisions based on our version of our data, but based on what our data says about us. And perhaps, under certain political, social and economic circumstances, our “harmless” data could be more incriminating than we think. An individual may express strong political views within a democratic regime, but if that political system were to become authoritarian in the future, that individual would be suspicious in the eyes of the government - to say the least. This is where data retention plays a significant role.

    Most companies retain data indefinitely or for long periods of time, which means that future, potentially less democratic governments may have access to it. And the worst part is that we can never really know what data is being held about us, because in data analysis, every bit of data may potentially entail various other bits of data that we are not even aware of. So, when we “choose” to hand over our data, we don’t necessarily know what or how much we are choosing to disclose. This is why I agree with Bruce Schneier’s argument that people have an illusory sense of control over their personal data.

    Social network analysis software is specifically designed to mine huge volumes of data that is collected through social networking sites, such as Facebook. Such software is specifically designed to profile individuals, to create “trees of communication” around them and to match patterns. In other words, this software tells a story about each and every one of us, based on our activities, interests, acquaintances, and all other data. And as mentioned before, such a story may or may not be true.
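
    The “trees of communication” idea can be sketched in a few lines of code. This is a toy model with invented names and messages, not a description of any real analysis product:

```python
# Minimal sketch of a "tree of communication": from a log of who
# contacted whom, build each person's contact graph and rank their
# closest contacts by interaction volume. All data is invented.
from collections import defaultdict

messages = [
    ("priya", "anil"), ("priya", "anil"), ("anil", "priya"),
    ("priya", "meera"), ("anil", "ravi"), ("meera", "ravi"),
]

contacts = defaultdict(lambda: defaultdict(int))
for sender, recipient in messages:
    contacts[sender][recipient] += 1
    contacts[recipient][sender] += 1  # treat each exchange as a tie

def closest_contacts(person):
    """A person's contacts, strongest tie first."""
    return sorted(contacts[person].items(), key=lambda kv: -kv[1])

print(closest_contacts("priya"))  # anil (3 exchanges), then meera (1)
```

    From nothing but metadata - who talked to whom, how often - the sketch already ranks a person’s closest relationships; real software layers pattern matching and profiling on top of exactly this kind of graph.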

    In data mining, behavioural statistics are used to analyse our data and to predict how we are likely to behave. When applied to national databases, this may potentially amount to predicting how masses or groups within the public are likely to behave, and subsequently to controlling them. If a data analyst can predict an individual’s future behaviour - with some probability - based on that individual’s data, the same could potentially occur at a mass, public level. As such, the danger of surveillance - especially corporate surveillance, through which we voluntarily disclose massive amounts of data about ourselves - is that it ultimately comes down to public control.
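
    At its simplest, behavioural prediction of this kind is a frequency count. The toy sketch below, with invented check-in data, only shows the principle; real systems use far more elaborate statistical models:

```python
# Toy behavioural prediction: from a person's past check-ins, guess
# the most likely next location by simple frequency counting.
from collections import Counter

checkins = ["cafe", "office", "gym", "office", "cafe", "office"]
prediction, count = Counter(checkins).most_common(1)[0]
print(prediction, count)  # the most frequent past behaviour wins
```

    Scaled up from one person’s check-ins to a national database, the same logic yields probabilistic claims about how whole groups are likely to behave.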

    According to security expert Bruce Schneier, data today is a byproduct of the Information Society. Unlike an Orwellian totalitarian state where surveillance is imposed in a top-down manner, surveillance today appears to exist largely because we indirectly choose and enable it (by handing over our data to online companies), rather than it being imposed on us in a solely top-down manner. However, contemporary surveillance may potentially be far worse than that described in Orwell’s “1984”, because surveillance is publicly perceived to be an indirect threat - if considered a threat at all. People are more likely to resist a direct threat than an indirect one, which means that the possibility of mass violations of human rights as a result of surveillance is real.

    Hannah Arendt argued that a main prerequisite and component of totalitarian power is support by the masses. Today, surveillance appears to be socially integrated within societies, which indicates that contemporary power fueled by surveillance has mass support. While the argument that surveillance is being socially integrated is itself open to debate and would require in-depth research of its own, a few simple facts might suffice to illustrate it at this stage. Firstly, CCTV cameras are installed in most countries, yet there has been very little resistance - on the contrary, there appears to be a kind of universal acceptance on the grounds of security. Secondly, various spy products exist on the market - such as spy Coca-Cola cans - which can be purchased by anyone online. Thirdly, countries all over the world carry out controversial surveillance schemes - such as the Central Monitoring System in India - yet public resistance to such projects is limited. And while one may argue that the above cases don’t necessarily prove that surveillance is being socially integrated, it is worth looking at a fourth fact: most people who have Internet access choose to share their personal data through social networking sites.

    Reality shows, such as Big Brother, which broadcast the surveillance of people’s lives and present it as a form of entertainment - when actually, I think it should be worrisome - appear to enable the social integration of surveillance. The very fact that we all probably - or, hopefully - know that Facebook can share our personal data with unauthorised third parties and - now, after the Snowden revelations - that governments can tap into Facebook’s servers, should be enough to convince us to delete our profiles. Yet, why do we still all have Facebook profiles? Perhaps because surveillance is socially integrated and perhaps because it is just convenient to be on Facebook. But that doesn’t change the fact that surveillance can potentially be a threat to our human rights. It just means that we perceive surveillance as an indirect threat and that we are unlikely to react to it.

    In the long term, what does this mean? It seems likely that we will be more accepting of authoritarian power, that we will get used to censoring our own thoughts and actions (for fear of being caught by the CCTV camera on the street or by the spyware which may or may not be implanted in our laptops) and that, ultimately, we will be less politically active and more reluctant to challenge authority.

    What’s particularly interesting about surveillance today is that it is fueled and enabled by our freedom of speech and general Internet freedom. If we didn’t have Internet freedom - or as much of it as we do - we would disclose less personal data, and surveillance would probably be more restricted. The more Internet freedom we have, the more personal data we disclose on Facebook - and on all the ‘Facebooks’ of the world - and the more data is potentially available to mine, analyse, share and generally incorporate into the surveillance regime. In this sense, Internet freedom appears to be a kind of prerequisite for surveillance, as contradictory and ironic as that may seem. No wonder the Chinese government has gone the extra mile in creating Chinese versions of Facebook and Twitter - it’s probably no coincidence.

    We may blame governments for establishing surveillance schemes, ISPs and TSPs for complying with government licence agreements that often mandate backdoors for spying on us, and security companies for creating the surveillance gear in the first place. But at the end of the day, we are all equally a part of this mess. If we didn’t choose to hand over our personal data to begin with, none of the above would be possible.

    The real danger in the Digital Age is not necessarily surveillance per se, but our choice to voluntarily disclose our personal data.

    Programme Booklet

    by Prasad Krishna last modified Nov 20, 2013 04:30 AM

    Programme Booklet_small res-1.pdf — PDF document, 1473 kB (1508988 bytes)

    CV Booklet

    by Prasad Krishna last modified Nov 20, 2013 04:34 AM

    CV Booklet_small res-1.pdf — PDF document, 1796 kB (1840125 bytes)

    Consilience 2013 Report

    by Prasad Krishna last modified Nov 20, 2013 06:14 AM

    Consilience 2013- Recommendatory Report and Conference Proceedings.pdf — PDF document, 915 kB (937019 bytes)

    CIS Supports the UN Resolution on “The Right to Privacy in the Digital Age”

    by Elonnai Hickok last modified Nov 30, 2013 07:25 AM
    The United Nations adopted the resolution on the right to privacy recently. It recognised privacy as a human right, integral to the right to free expression, and also declared that mass surveillance could have negative impacts on human rights.

    On November 26, 2013, the United Nations adopted a non-binding resolution on The Right to Privacy in the Digital Age. The resolution was drafted by Brazil and Germany and expressed concern over the negative impact of surveillance and interception on the exercise of human rights. The resolution was controversial as countries such as the US, the UK, and Canada opposed language that spoke to the right to privacy extending equally to citizens and non-citizens of a country. The resolution welcomed the report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression that examined the implications of surveillance of communications on the human rights of privacy and freedom of expression.

    The resolution made a number of important statements that India, as a member of the United Nations and as a country in the process of implementing a number of surveillance projects, like the Central Monitoring System, should take cognizance of. In short, these include:

    1. Privacy is a human right: Privacy is a human right according to which no one should be subjected to arbitrary or unlawful interference with his or her privacy, family, home, or correspondence.
    2. Privacy is integral to the right to free expression: privacy is an essential component of realising the right to freedom of expression.
    3. Unlawful and arbitrary surveillance violates the right to privacy and freedom of expression: Unlawful and/or arbitrary surveillance, interception, and collection of personal data are intrusive acts that violate the right to privacy and freedom of expression.
    4. Exceptions to privacy and freedom of expression should be in compliance with human rights law: Public security is a potential exception justifying collection and protection of information, but States must ensure that this is done fully in compliance with international human rights law.
    5. Mass surveillance may have negative implications for human rights: Domestic and extraterritorial surveillance, interception, and the collection of personal data on a mass scale may have a negative impact on individual human rights.
    6. Equal protection for online and offline privacy: The right to privacy must be equally protected online and offline.

    The resolution further called upon states to:

    1. Respect and protect the right to privacy, particularly in the context of digital communications.
    2. Ensure that relevant legislation is in compliance with international human rights law.
    3. Review national procedures and practices around surveillance to ensure full and effective implementation of obligations under international human rights law.
    4. Establish and maintain effective domestic oversight mechanisms around domestic surveillance capable of ensuring transparency and accountability.

    The resolution finally calls upon the UN High Commissioner for Human Rights to present a report with views and recommendations on the protection and promotion of the right to privacy in the context of surveillance to the Human Rights Council at its twenty-seventh session and to the General Assembly at its sixty-ninth session and decides to examine “Human rights questions, including alternative approaches for improving the effective enjoyment of human rights and fundamental freedoms”.

    The UN Resolution on the Right to Privacy in the Digital Age is a welcome step towards international recognition of privacy as a human right in the context of communications and extraterritorial surveillance. The Centre for Internet and Society encourages the Government of India, as called upon in the Resolution, to review national procedures and practices around surveillance to ensure full and effective implementation of its obligations under international human rights law.

    Prior to the UN Resolution on “The Right to Privacy in the Digital Age”, a group of international NGOs developed the Necessary and Proportionate principles, which seek to form the backbone of a response to mass surveillance and provide a framework for governments to assess whether domestic surveillance regimes are in compliance with international human rights law. CIS contributed to the process of developing these principles. The principles include legality, legitimate aim, necessity, adequacy, proportionality, competent judicial authority, due process, user notification, transparency, public oversight, integrity of communications and systems, safeguards for international cooperation, and safeguards against illegitimate access. A petition to sign onto the principles and demand an end to mass surveillance is currently underway.

    Both the Government and the public of India should take the UN Resolution and the Necessary and Proportionate principles into consideration, reflect on how India’s surveillance regime and practices can be brought in line with international human rights law, and understand where the balance lies for necessary and proportionate surveillance in the Indian context.

     

    Open Secrets

    by Nishant Shah last modified Nov 30, 2013 08:21 AM
    We need to think of privacy in different ways — not only as something that happens between people, but between you and corporations.

    Dr. Nishant Shah's article was originally published in the Indian Express on October 27.


    If you are a part of any social networking site, then you know that privacy is something to be concerned about. We put out an incredible amount of personal data on our social networks. Pictures with family and friends, intimate details about our ongoing drama with the people around us, medical histories, and our spur-of-the-moment thoughts of what inspires, peeves or aggravates us. In all this, the more savvy use filters and group settings which give them some semblance of control about who has access to this information and what can be done with it.

    But it is now a given that in the world of the worldwide web, privacy is more or less a thing of the past. Data transmits. Information flows. What you share with one person immediately gets shared with thousands. Even though you might make your stuff accessible to a handful of people, the social networks work through a "friend-of-a-friend effect", where others in your networks use, like, share and spread your information around so that there is an almost unimaginable audience to the private drama of our lives. Which is why there is a need for a growing conversation about what being private in the world of big data means.

    Privacy is about having control over your data and some say over who can use it and for what purpose. Interface designs and filters that allow limited access help this process. The legal structures are catching up, with regulations that control what individuals, entities, governments and corporations can do with the data we provide. However, most people think of privacy as a private matter. Just look at last month's conversations around Facebook's new privacy policies, which no longer allow you to hide. If you are on Facebook, people can find you using all kinds of parameters — metadata — other than just your name. They might find you through hobbies, pages you like, schools you have studied in, etc. This can be scary because it means that, based on particular activities, people can profile and follow you - especially people in precarious communities: young adults, queer people who might not be ready to be out of the closet, women who already face increased misogyny and hostility online. For them, this is officially a stalkers' paradise.

    While those concerns need to be addressed, something seems to be missing from the debate. Almost all of these privacy alarms are about what people can do to people - that we need to protect ourselves from people when we are in public, digital or otherwise. We are reminded that the world is filled with predators, crackers and scamsters, who can prey on our personal data and create physical, emotional, social and financial havoc. But this is the world we already know. We live in a universe filled with perils, and we have learned to cope with the fact that we navigate through dangerous spaces, times and people all the time. The digital is no different from the physical when it comes to the perils we live amid, though the digital might facilitate some kinds of behaviour and make data-stalking easier.

    What is different in the individualised, just-for-you crafted world of the social web is that there are things which are not human interacting with you in unprecedented ways. Make a list of the top five people you interact with on Facebook. And you will be wrong. Because the thing that you interact with the most on Facebook is Facebook. Look at the amount of chatter it creates — How are you feeling today?; Your friend has updated their status; Somebody liked your comment… the list goes on. In fact, much as we would like to imagine a world that revolves around us, we know that very few people have the energy and resources to keep track of everything we do. However, no matter how boring your status message or how pedestrian your activity, deep down in a server somewhere, an algorithm is keeping track of everything that you do. Facebook is always listening, and watching, and creating a profile of you. People might forget, skip, miss or move on, but Facebook will listen, and remember, long after you have forgotten.

    If this is indeed the case, we need to think of privacy in different ways — not only as something that happens between people, but between people and other entities like corporations. The next time there is a change in the policy that makes us more accessible to others, we should pay attention. But what we need to be more concerned about are the private corporations, data miners and information gatherers, who make themselves invisible and collect our personal data as we get into the habit of talking to platforms, gadgets and technologies.

    I Just Pinged to Say Hello

    by Nishant Shah last modified Nov 30, 2013 08:36 AM
    A host of social networks find us more connected than ever before, but leave us groping for words in the digital space.

    Dr. Nishant Shah's article was published in the Indian Express on November 24, 2013.


    I am making a list of all the platforms I use to connect with the large networks I belong to. Here goes: I use Yahoo! Messenger to talk to my friends in East Asia. Most of my work meetings happen on Skype and Google Hangout. A lot of friendly chatter fills up my Facebook Messenger. Twitter is always available for a little back-chat and bitching. On the phone, I use Viber to make VoIP calls, and WhatsApp is the space for unending conversations spread across days. And these are just the spaces for real-time conversation. Across all these platforms, something strange is happening. As I stay connected all the time, I am facing a phenomenon where we have run out of things to say, but not of the desire to talk.

    I had these three conversations today on three different instant-messaging platforms:

    Person 1 (on WhatsApp): Hi.

    Me: Hey, good to hear from you. How are you doing?

    Person 1: Good.

    Me (after considerable silence): So what's up?

    Person 1: Nothing.

    End of conversation.

    Person 2 (On an incoming video call on Skype): Hey, you there?

    Me: Yeah. What time is it for you right now?

    Person 2: It is 10 at night.

    Me: Oh! That is late. How come you are calling me so late?

    Person 2: Oh, I saw you online.

    Me: Ok….. *eyes raised in question mark*

    Person 2: So, that's it. I am going to sleep soon.

    Me: Ok…. Er…goodnight.

    Person 2: Goodnight.

    We hang up.

    Person 3 (pinging me on Facebook): Hey, you are in the US right now?

    Me: Yes. I am attending a conference here.

    Person 3: Cool!

    Me: Umm… yeah, it is.

    Person 3: emoticon of a Facebook 'like'. Have fun. Bye.

    Initially, I was irritated at the futility of these pings, bewildering in their lack of content. I dismissed them as one of those things, but I have come to realise that there is a pattern here. Our lives are so thoroughly open and documented, and such minute details of what we do, where we are and who we are with are now available for the rest of the world to consume, that most conversations seeking information have become redundant. If you know me on my social media networks, you already know most of the basic things you would want to know about me. And it goes without saying that, no matter how close and connected we are, we are not necessarily in a state where we want to talk all the time. The more distributed our lives are, the more diminished is the need for personal communication.

    And yet, the habit or the urge to ping, buzz, DM or chat has not caught up with this interaction deficit. So, we still seem to reach out, using a variety of platforms just to say hello, even when there is nothing to say. I call this the 'Always On' syndrome. We live in a world where being online all the time has become a ubiquitous reality. Even when we are asleep, or busy in a meeting, or just mentally disconnected from the online spaces, our avatars are still awake. They interact with others. And when they feel too lonely, they reach out and send that empty ping — just to confirm that they are not alone. That on the other side of the glowing screen is somebody else who is going to connect back, and to reassure you that we are all together in this state of being alone.

    This empty ping has now become a signifier, loaded with meaning. The need for human connection has been distributed, but that does not compensate for our need for one-on-one contact. In the early days of the cell phone, when incoming calls were still charged, the missed call, without any content, was a code between friends and lovers. It carried messages about where to meet, when to meet, or sometimes just that you were missing somebody. The empty ping is the latest avatar of the missed call — in a world where we are always online but not always connected, where we are constantly together but also spatially and emotionally alone, the ping remains that human touch in the digital space, reassuring us that on the other side of the seductive interface and the buzzing gadget is somebody we can say hello to.

     

    Chances and Risks of Social Participation

    by Prasad Krishna last modified Nov 30, 2013 09:13 AM

    programme_participation_woOSHWuNaMiuRBGHMPwNSHIIG2.pdf — PDF document, 75 kB (77702 bytes)

    Misuse of Surveillance Powers in India (Case 1)

    by Pranesh Prakash last modified Dec 06, 2013 09:37 AM
    In this series of blog posts, Pranesh Prakash looks at a brief history of the misuse of surveillance powers in India. He notes that the government's surveillance powers have been frequently misused, very often without any kind of judicial or political redressal. This, he argues, should lead us as concerned citizens to demand a scaling down of the government's surveillance powers and the passage of laws that put in place more robust oversight mechanisms.

    Case 1: Unlawful Phone-tapping in Himachal Pradesh

    In December 2012, the government changed in Himachal Pradesh. The Bharatiya Janata Party (BJP) went out of power, and the Indian National Congress (INC) came into power. One of the first things that Chief Minister Virbhadra Singh did, within hours of taking his oath as Chief Minister on December 25, 2012, was to get a Special Investigation Team (SIT) to investigate phone tapping during the BJP government’s tenure.

    On December 25th and 26th, 12 hard disk drives were seized from the offices of the Crime Investigation Department (CID) and the Vigilance Department (which is supposed to be an oversight mechanism over the rest of the police). These hard disks showed that 13,711 phone numbers were targeted and hundreds of thousands of phone conversations were recorded. These included conversations of prominent leaders “mainly of” the INC but also from the BJP, including three former cabinet ministers and close relatives of multiple chief ministers, a journalist, and many senior police officials, including the Director General of Police.

    Violations of the Law

    While the law required the state’s Home Secretary to grant permission for each person being tapped, the Home Secretary had granted permission in only 342 cases. This leaves over a thousand cases where phones were tapped illegally, in direct violation of the law. The oversight mechanism provided in the law, namely the Review Committee under Rule 419A of the Indian Telegraph Rules, was utterly powerless to check this. Indeed, the internal check for the police, namely the Vigilance Department, also seems to have failed spectacularly.

    Every private telecom company cooperated in this unlawful surveillance, even though the people who were conducting it did so without proper legal authority. Clearly we need to revise our interception rules to ensure that these telecom companies do not cooperate unless they are served with an order digitally signed by the Home Secretary.

    While all interception recordings are required to be destroyed within 6 months as per Rule 419A of the Indian Telegraph Rules, that rule was also evidently ignored and conversations going back to 2009 were being stored.

    Concluding Concerns

    What should concern us is not merely that such a large number of politicians and police officers were tapped, but that no criminal charges were brought on the basis of these phone taps, indicating that much of the tapping was being done for political purposes.

    What should concern us is that the requirement under Section 5 of the Indian Telegraph Act, which governs phone taps, of the existence of a “public emergency” or endangerment of “public safety”, a prerequisite for phone taps under the law as emphasised by the Supreme Court in 1996 in the PUCL judgment, was blatantly ignored.

    What should concern us is that it took a change in government to actually uncover this sordid tale.


    1. 1385 according to a Hindustan Times report [1]: http://indiatoday.intoday.in/story/himachal-pradesh-police-registers-first-fir-in-phone-tapping-scandal/1/285698.html

    2. A Zee News report states 34 while it’s 171 according to a Mail Today report

    Sub Tracks

    by Prasad Krishna last modified Dec 11, 2013 10:08 AM

    Sub Tracks for Discussion-1.pdf — PDF document, 327 kB (334958 bytes)

    Consilience Speakers Profile

    by Prasad Krishna last modified Dec 11, 2013 10:11 AM

    Consilience 2013-14 Speakers Profiles-1.pdf — PDF document, 451 kB (462811 bytes)

    Brochures from Expos on Smart Cards, e-Security, RFID & Biometrics in India

    by Maria Xynou last modified Dec 26, 2013 05:24 AM
    Electronics Today organised a series of expos on smart cards, e-security, RFID and biometric technology in Delhi on 16-18 October 2013. The Centre for Internet and Society is sharing the brochures it collected from these public expos for research purposes.

    In Pragati Maidan, New Delhi, many companies from India and abroad gathered to exhibit their products at the following expos which were organised by Electronics Today (India's first electronic exhibition organiser) on 16-18 October 2013:

    • SmartCards Expo 2013
    • e-Security Expo 2013
    • RFID Expo 2013
    • Biometrics Expo 2013

    The Centre for Internet and Society (CIS) attended these exhibitions for research purposes and is sharing the publicly available brochures it gathered through the attached zip file. The use of these brochures constitutes Fair Use.

    by Maria Xynou last modified Dec 26, 2013 05:23 AM

    ZIP archive icon Brochures.zip — ZIP archive, 78400 kB (80282581 bytes)

    Big Brother is watching you

    by Chinmayi Arun last modified Jan 06, 2014 09:31 AM
    India has no requirements of transparency whether in the form of disclosing the quantum of interception or in the form of notification to people whose communication was intercepted.

    The article by Chinmayi Arun was published in the Hindu on January 3, 2014.


    The Gujarat telephone tapping controversy is just one of many kinds of abuse that surveillance systems enable. If a relatively primitive surveillance system can be misused so flagrantly despite safeguards that the government claims are adequate, imagine what is to come with the Central Monitoring System (CMS) and Netra in place.

    News reports indicate Netra — a “NEtwork TRaffic Analysis system” — will intercept and examine communication over the Internet for keywords like “attack,” “bomb,” “blast” or “kill.” While phone tapping and the CMS monitor specific targets, Netra is vast and indiscriminate. It appears to be the Indian government’s first attempt at mass surveillance rather than surveillance of predetermined targets. It will scan tweets, status updates, emails, chat transcripts and even voice traffic over the Internet (including from platforms like Skype and Google Talk) in addition to scanning blogs and more public parts of the Internet. Whistle-blower Edward Snowden said of mass-surveillance dragnets that “they were never about terrorism: they’re about economic spying, social control, and diplomatic manipulation. They’re about power.”
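The indiscriminate keyword matching attributed to Netra can be sketched in a few lines. The keyword list comes from the news reports quoted above; the messages and the matching logic are purely hypothetical illustrations, not actual Netra code, chosen to show why such dragnets inevitably flag innocuous speech:

```python
# Hypothetical sketch of indiscriminate keyword-based flagging, in the
# style news reports attribute to Netra. Not actual Netra code.
KEYWORDS = {"attack", "bomb", "blast", "kill"}

def flag_message(text):
    """Return the sorted list of trigger keywords found in a message."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return sorted(words & KEYWORDS)

messages = [
    "The heart attack risk rises with age",                   # medical chat
    "That movie was a blast, total bomb at the box office",   # film banter
    "Meet at the usual place at 6",
]

for msg in messages:
    hits = flag_message(msg)
    if hits:
        print(f"FLAGGED {hits}: {msg}")
```

A filter like this has no notion of context: the first two messages are flagged even though neither has anything to do with violence, which is the over-breadth the article is pointing at.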

    So far, our jurisprudence has dealt with only targeted surveillance; and even that in a woefully inadequate manner. This article discusses the slow evolution of the right to privacy in India, highlighting the context and manner in which it is protected. It then discusses international jurisprudence to demonstrate how the right to privacy might be protected more effectively.

    Privacy and the Constitution

    A proposal to include the right to privacy in the Constitution was rejected by the Constituent Assembly with very little debate. Separately, a proposal to give citizens an explicit fundamental right against unreasonable governmental search and seizure was also put before the Constituent Assembly. This proposal was supported by Dr. B.R. Ambedkar. If accepted, it would have included within our Constitution the principles from which the United States derives its protection against state surveillance. However, the proposed amendment was rejected by the Constituent Assembly.

    Fortunately, the Supreme Court has gradually been reading the right to privacy into the fundamental rights explicitly listed in the Constitution. After its initial reluctance to affirm the right to privacy in the 1954 case of M.P. Sharma vs. Satish Chandra, the court came around to the view that other rights and liberties guaranteed in the Constitution would be seriously affected if the right to privacy was not protected. In Kharak Singh vs. The State of U.P., the court recognised “the right of the people to be secure in their persons, houses, papers, and effects” and declared that their right against unreasonable searches and seizures was not to be violated. The right to privacy here was conceived around the home, and unauthorised intrusions into homes were seen as interference with the right to personal liberty.

    If the Kharak Singh judgment was progressive in its recognition of the right to privacy, it was conservative about the circumstances in which the right applies. The majority of judges held that shadowing a person could not be seen to interfere with that person’s liberty. Dissenting with the majority, Justice Subba Rao maintained that broad surveillance powers put innocent citizens at risk, and that the right to privacy is an integral part of personal liberty. He recognised that when a person is shadowed, her movements will be constricted, and will certainly not be free movements. His dissenting judgment showed remarkable foresight and his reasoning is consistent with what is now a universally acknowledged principle that there is a “chilling effect” on expression and action when people think that they are being watched.

    The right to privacy as defined by the Supreme Court now extends beyond government intrusion into private homes. After Govind vs. State of M.P., and Dist. Registrar and Collector of Hyderabad vs. Canara Bank, this right is seen to protect persons and not places. Any inroads into this right for surveillance of communication must be for permissible reasons and according to just, fair and reasonable procedure. State action in violation of this procedure is open to a constitutional challenge.

    Our meagre procedural safeguards against phone tapping were introduced in PUCL vs. Union of India (1997) after the Supreme Court was confronted with extensive, undocumented phone tapping by the government. The apex court found itself compelled to lay down what it saw as bare minimum safeguards, consisting mostly of proper record-keeping and internal executive oversight by senior officers such as the home secretary, the cabinet secretary, the law secretary and the telecommunications secretary. These safeguards are of little use since they are opaque and rely solely on members of the executive to review surveillance requests.

    Right and safeguards

    There is a difference between targeted surveillance in which reasons have to be given for surveillance of particular people, and the mass-surveillance which Netra sets up. The question of mass surveillance and its attendant safeguards has been considered by the European Court of Human Rights in Liberty and Others vs. the United Kingdom. Drawing upon its own past jurisprudence, the European Court insisted on reasonable procedural safeguards. It stated quite clearly that there are significant risks of arbitrariness when executive power is exercised in secret and that the law should be sufficiently clear to give citizens an adequate indication of the circumstances in which interception might take place. Additionally, the extent of discretion conferred and the manner of its exercise must be clear enough to protect individuals from arbitrary interference. The principles laid down by the European Court in relation to phone-tapping also require that the nature of the offences which may give rise to an interception order, the procedure to be followed for examining, using and storing the data obtained, the precautions to be taken when communicating the data to other parties, and the circumstances in which recordings may or must be erased or the tapes destroyed be made clear.

    Opaque and ineffective

    Our safeguards apply only to targeted surveillance, and require written requests to be provided and reviewed before telephone tapping or Internet interception is carried out. The CMS makes the process of tapping more prone to misuse by the state by making it even more opaque: if the state can intercept communication directly, without making requests to a private telecommunication service provider, then there is one less layer of scrutiny through which abuse of power can come to light. There is no one to ask whether the requisite paperwork is in place or to notice a dramatic increase in interception requests.

    India has no requirements of transparency whether in the form of disclosing the quantum of interception taking place each year, or in the form of subsequent notification to people whose communication was intercepted. It does not even have external oversight in the form of an independent regulatory body or the judiciary to ensure that no abuse of surveillance systems takes place. Given these structural flaws, the Amit Shah controversy is just the beginning of what is to come. Unfettered mass surveillance does not bode well for democracy.

    (Chinmayi Arun is research director, Centre for Communication Governance, National Law University, Delhi, and fellow, Centre for Internet and Society, Bangalore.)

    Letter requesting public consultation on position of GoI at WGEC

    by Snehashish Ghosh last modified Jan 08, 2014 06:36 PM
    Snehashish Ghosh, on behalf of the Centre for Internet and Society, sent a letter to the Ministry of Communication and Information Technology requesting a public consultation on India's position at the Working Group on Enhanced Cooperation (WGEC).

    January 3, 2014

    Shri Kapil Sibal,
    Honourable Minister for Communication and Information Technology
    Ministry of Communication and Information Technology,
    Government of India

    Subject: Public consultation at the domestic level on the position of Government of India at WGEC

    Dear Sir,

    We at the Centre for Internet and Society, Bangalore (“CIS”) commend the Government of India’s participation in the Working Group on Enhanced Cooperation (WGEC), working under the aegis of the United Nations Commission on Science and Technology for Development (CSTD). The Working Group was set up in pursuance of General Assembly Resolution A/Res/67/195, to identify a shared understanding of enhanced cooperation on public policy issues pertaining to the internet. After its first meeting, the WGEC circulated a questionnaire to collect the views and positions of stakeholders on various aspects of enhanced cooperation. The Government of India responded to the questionnaire and also presented its position at the second meeting of the WGEC, held in Geneva from November 6-8, 2013. We would like the Government to take cognizance of representations from concerned stakeholders before finalizing its position.

    In this regard, we would like to note the Government of India’s commitment to a multi-stakeholder approach in the formulation of public policy pertaining to the internet. At the Internet Governance Forum 2012, held in Baku, the Honourable Minister for Communications and Information Technology noted that the “issues of public policy related to the internet have to be dealt with, by adopting a multi-stakeholder, democratic and transparent approach”. Furthermore, the Government of India’s stand at the World Conference on International Telecommunications, 2012 in Dubai supported and recognized the multi-stakeholder nature of the internet.

    However, the Government appears to have digressed from this stand on internet governance, in that it fell short of holding a multi-stakeholder public consultation on India’s position on enhanced cooperation at the WGEC. We earnestly urge you to hold a domestic public consultation before the next WGEC meeting.

    Thank you.
    Sincerely,

    Snehashish Ghosh,
    Policy Associate,
    Centre for Internet and Society, Bangalore

    Copied to: Dr. Ajay Kumar, Joint Secretary, DeitY, MOCIT and Shri J. Satyanarayana, Secretary, DeitY, MOCIT


    Download a copy of the letter here

    Letter on WGEC

    by Prasad Krishna last modified Jan 07, 2014 09:12 AM

    PDF document icon Letter on WGEC.pdf — PDF document, 315 kB (323404 bytes)

    Internet Monitor

    by Prasad Krishna last modified Jan 09, 2014 07:33 AM
    Malavika's piece on India's Identity Crisis is published in this report.

    PDF document icon SSRN-id2366840.pdf — PDF document, 7223 kB (7396414 bytes)

    India's Identity Crisis

    by Malavika Jayaram last modified Jan 09, 2014 07:56 AM
    Malavika Jayaram's article was published in 2013 Internet Monitor Annual Report: Reflections on the Digital World, published by Harvard's Berkman Center for Internet and Society.

    India’s Unique Identity (UID) project is already the world’s largest biometrics identity program, and it is still growing. Almost 530 million people have been registered in the project database, which collects all ten fingerprints, iris scans of both eyes, a photograph, and demographic information for each registrant. Supporters of the project tout the UID as a societal game changer. The extensive biometric information collected, they argue, will establish the uniqueness of each individual, eliminate fraud, and provide the identity infrastructure needed to develop solutions for a range of problems. Despite these potential benefits, however, critical concerns remain about the UID’s legal and physical architecture as well as about unforeseen risks associated with the linking and analysis of personal data.

    The most basic concerns regarding the UID project stem from the fact that biometric technologies have never been tested on such a large population. As a result, well-founded concerns exist around scalability, false acceptance and rejection rates, and the project’s core premise that biometrics can uniquely and unambiguously identify people in a foolproof manner. Some of these concerns are based on technical issues—collecting fingerprints and iris scans “in the field,” for instance, can be complicated when a registrant’s fingerprints are eroded by manual labor or her irises are affected by malnutrition and cataracts. Other concerns relate to the project’s federated implementation architecture, which, by outsourcing collection to a massive group of private and public registrars and operators, increases the chance for data breaches, error, and fraud.
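The false acceptance and false rejection rates mentioned above are simple ratios, but at the UID's scale even small rates translate into enormous absolute numbers of errors. A minimal sketch, using invented match scores and a hypothetical decision threshold (the actual UIDAI matching pipeline is not public):

```python
# Illustrative only: how false acceptance/rejection rates are computed
# from biometric match scores at a decision threshold. Scores invented.
def far_frr(impostor_scores, genuine_scores, threshold):
    """FAR: fraction of impostor comparisons scoring at/above threshold.
       FRR: fraction of genuine comparisons scoring below threshold."""
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

impostors = [0.10, 0.35, 0.62, 0.20, 0.55]   # different people compared
genuine   = [0.90, 0.72, 0.58, 0.95, 0.40]   # same person re-scanned

far, frr = far_frr(impostors, genuine, threshold=0.60)
print(f"FAR={far:.0%}, FRR={frr:.0%}")
```

Raising the threshold lowers the FAR but raises the FRR, and vice versa; there is no setting that eliminates both. Eroded fingerprints and cataract-affected irises push genuine scores down, which is exactly the field-collection problem the paragraph describes.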

    Perhaps even more vexing are concerns regarding how the UID, which promises financial inclusion (by reducing the identification barriers to opening bank accounts, for example), might in fact lead to new types of exclusion for already marginalized groups. Members of the LGBT community, for instance, question whether the inclusion of the transgender category within the UID scheme is a laudable attempt at inclusion, or a new means of listing and targeting members of their community for exclusion. More fundamentally, as more and more services and benefits are linked to the UID, the project threatens to exclude all those who cannot or will not participate in the scheme due to logistical failures or philosophical objections.

    It is worth noting that the UID is not the only large data project in India. A slew of “Big Brother” projects exist: the Centralised Monitoring System (CMS), the Telephone Call Interception System (TCIS), the National Population Register (NPR), the Crime and Criminal Tracking Network and Systems (CCTNS), and the National Intelligence Grid (NATGRID), which is working to aggregate up to 21 different databases relating to tax, rail and air travel, credit card transactions, immigration, and other domains. The UID is intended to serve as a common identifier across these databases, creating a massive surveillance state. It also facilitates an ecosystem where access to goods and services, from government subsidies to drivers’ licenses to mobile phones to cooking gas, increasingly requires biometric authentication.
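The mechanics of using one identifier as a join key across otherwise separate databases are trivially simple, which is precisely the concern. A minimal sketch with entirely invented records and field names (the real NATGRID schemas are not public):

```python
# Hypothetical sketch: a shared identifier (here "uid") lets otherwise
# siloed databases be merged into one profile. All records are invented.
tax_db     = {"1234-5678": {"pan": "ABCDE1234F", "tax_paid": 52000}}
travel_db  = {"1234-5678": {"trips": ["DEL-BOM", "BOM-BLR"]}}
telecom_db = {"1234-5678": {"phone": "98xxxxxx01"}}

def merge_profile(uid, *databases):
    """Aggregate every record keyed by the same identifier."""
    profile = {"uid": uid}
    for db in databases:
        profile.update(db.get(uid, {}))   # silently skips absent records
    return profile

print(merge_profile("1234-5678", tax_db, travel_db, telecom_db))
```

Without a common key, linking these databases would require error-prone matching on names and addresses; a universal identifier removes that friction, which is what makes the aggregation described above feasible.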

    The UID project was originally vaunted as voluntary, but the inexorable slippery slope toward compulsory participation has triggered a series of lawsuits challenging the legality of forced enrollment and the constitutionality of the entire project. Most recently, in September 2013, India’s federal Supreme Court affirmed by way of an interim decision that the UID was not mandatory, that not possessing a UID should not disadvantage anybody, and that citizenship should be ascertained as a criterion for registering in order to ensure that UIDs are not issued to illegal immigrants. This last stipulation is particularly thorny given that the Unique Identification Authority of India (UIDAI, the body in charge of the UID project) has consistently distanced the UID from questions of citizenship under the justification that it is a matter beyond its remit (i.e., the UID is open to residents, and is not linked to citizenship). The government moved quickly to urge a modification of the order, but the Supreme Court declined to do so and will instead release its final decision after it reviews a batch of petitions from activists and others. The UIDAI approached the court, arguing that not making the UID mandatory has serious consequences for welfare schemes, but the court recently ordered the federal government, the Reserve Bank of India, and the Election Commission to delink the LPG cooking gas scheme from the UID. This is a considerable setback for the project, given that this was one of the most hyped linkages for the UID. It remains to be seen whether the court will similarly halt other attempts to make the UID mandatory.

    In the meantime, the UID project is effectively being implemented in a legal vacuum without support from the Supreme Court or Parliament. The Cabinet is seeking to rectify this and has cleared a bill that would finally provide legal backing for the UID program—its previous attempt was rejected by the Standing Committee on Finance in 2010. This bill is scheduled to come up for debate during the winter session of Parliament. The bill’s progress, along with the final decision of the Supreme Court, will have far reaching consequences for the UID project’s implementation and longevity, as well as for the relationship between India’s citizens and the state.

    If fully implemented, the UID system will fundamentally alter the way in which citizens interact with the government by creating a centrally controlled, technology-based standard that mediates access to social services and benefits, financial systems, telecommunications, and governance. It will undoubtedly also have implications for how citizens relate to private sector entities, on which the UID rests and which have their own vested interests in the data. The success or failure of the UID represents a critical moment for India. Whatever course the country takes, its decision to travel further toward or turn away from becoming a “database nation” will have implications for democracy, free speech, and economic justice within its own borders and also in the many neighboring countries that look to it as a technological standard bearer.

    The Indian government seems to envision “big data” as a panacea for fraud, corruption, and abuse, but it has given little attention to understanding and addressing the fraud, corruption, and abuse that massive databases can themselves engender. The government’s actions have yet to demonstrate an appreciation for the fact that the matrix of identity and surveillance schemes it has implemented can create a privacy-invading technology layer that is not only a barrier to online activity but also to social participation writ large.

    The lack of identification documents for a large portion of the Indian population does need to be addressed. Whether the UID project is the best means to do this—whether it has the right architecture and design, whether it can succeed without an overhaul of several other failures of governmental institutions, and whether fixing the identity piece alone causes more harm than good—should be the subject of intense debate and scrutiny. Only through rigorous threat modeling and analysis of the risks arising out of this burgeoning “data industrial complex” can steps be taken to stem the potential repercussions of the project not just for identity management, fraud, corruption, distributive justice, and welfare generally, but also for autonomy, openness, and democracy.


    Click to download the article published in the annual report of Harvard's Berkman Center for Internet and Society (PDF, 7223 KB)

    Surveillance and the Indian Constitution - Part 1: Foundations

    by Pranesh Prakash last modified Jan 23, 2014 03:12 PM
    In this insightful seven-part series, Gautam Bhatia looks at surveillance and the right to privacy in India from a constitutional perspective, tracing its genealogy through Supreme Court case law and comparing it with the law in the USA.

    Note: This was originally posted on the Indian Constitutional Law and Philosophy blog.


    On previous occasions, we have discussed the ongoing litigation in ACLU v. Clapper in the United States, a challenge to the constitutionality of the National Security Agency’s (NSA) bulk surveillance program. Recall that a short while after the initial Edward Snowden disclosures, The Hindu revealed the extent of domestic surveillance in India, under the aegis of the Central Monitoring System (CMS). The CMS (and what it does) is excellently summarized here. To put things starkly and briefly:

    “With the C.M.S., the government will get centralized access to all communications metadata and content traversing through all telecom networks in India. This means that the government can listen to all your calls, track a mobile phone and its user’s location, read all your text messages, personal e-mails and chat conversations. It can also see all your Google searches, Web site visits, usernames and passwords if your communications aren’t encrypted.”

    The CMS is not sanctioned by parliamentary legislation. It also raises serious privacy concerns. In order to understand the constitutional implications, therefore, we need to investigate Indian privacy jurisprudence. In a series of posts, we plan to discuss that.

    Privacy is not mentioned in the Constitution. It plays no part in the Constituent Assembly Debates. The place of the right – if it exists – must therefore be located within the structure of the Constitution, as fleshed out by judicial decisions. The first case to address the issue was M. P. Sharma v. Satish Chandra, in 1954. In that case, the Court upheld search and seizure in the following terms:

    "A power of search and seizure is in any system of jurisprudence an overriding power of the State for the protection of social security and that power is necessarily regulated by law. When the Constitution makers have thought fit not to subject such regulation to Constitutional limitations by recognition of a fundamental right to privacy, analogous to the American Fourth Amendment, we have no justification to import it, into a totally different fundamental right, by some process of strained construction."

    The right in question was 19(1)(f) – the right to property. Notice here that the Court did not reject a right to privacy altogether – it only rejected it in the context of searches and seizures for documents, the specific prohibition of the American Fourth Amendment (that has no analogue in India). This specific position, however, would not last too long, and was undermined by the very next case to consider this question, Kharak Singh.

    In Kharak Singh v. State of UP, the UP Police Regulations conferred surveillance power upon certain “history sheeters” – that is, those charged (though not necessarily convicted) of a crime. These surveillance powers included secret picketing of the suspect’s house, domiciliary visits at night, enquiries into his habits and associations, and reporting and verifying his movements. These were challenged on Article 19(1)(d) (freedom of movement) and Article 21 (personal liberty) grounds. It is the second ground that particularly concerns us.

    As a preliminary matter, we may observe that the Regulations in question were administrative – that is, they did not constitute a “law”, passed by the legislature. This automatically ruled out a 19(2) – 19(6) defence, and a 21 “procedure established by law” defence – which were only applicable when the State made a law. The reason for this is obvious: fundamental rights are extremely important. If one is to limit them, then that judgment must be made by a competent legislature, acting through the proper, deliberative channels of lawmaking – and not by mere administrative or executive action. Consequently – and this is quite apart from the question of administrative/executive competence – if the Police Regulations were found to violate Article 19 or Article 21, that made them ipso facto void, without the exceptions kicking in. (Paragraph 5)

    It is also important to note one other thing: as a defence, it was expressly argued by the State that the police action was reasonable and in the interests of maintaining public order precisely because it was “directed only against those who were on proper grounds suspected to be of proved anti-social habits and tendencies and on whom it was necessary to impose some restraints for the protection of society.” The Court agreed, observing that this would have “an overwhelming and even decisive weight in establishing that the classification was rational and that the restrictions were reasonable and designed to preserve public order by suitable preventive action” if there had been a law in the first place, which there wasn’t. Thus, this issue itself was hypothetical, but what is crucial to note is that the State argued – and the Court endorsed – the basic idea that what makes surveillance reasonable under Article 19 is the very fact that it is targeted – targeted at individuals who are specifically suspected of being a threat to society because of a history of criminality.

    Let us now move to the merits. The Court upheld secret picketing on the ground that it could not affect the petitioner’s freedom of movement since it was, well, secret – and what you don’t know, apparently, cannot hurt you. What the Court found fault with was the intrusion into the petitioner’s dwelling, and the knocking at his door late at night to wake him up. This finding required the Court to interpret the meaning of the term “personal liberty” in Article 21. By contrasting it with the very specific rights listed in Article 19, the Court held that:

    “Is then the word “personal liberty” to be construed as excluding from its purview an invasion on the part of the police of the sanctity of a man’s home and an intrusion into his personal security and his right to sleep which is the normal comfort and a dire necessity for human existence even as an animal? It might not be inappropriate to refer here to the words of the preamble to the Constitution that it is designed to “assure the dignity of the individual” and therefore of those cherished human values as the means of ensuring his full development and evolution. We are referring to these objectives of the framers merely to draw attention to the concepts underlying the constitution which would point to such vital words as “personal liberty” having to be construed in a reasonable manner and to be attributed that sense which would promote and achieve those objectives and by no means to stretch the meaning of the phrase to square with any preconceived notions or doctrinaire constitutional theories.” (Paragraph 16)

    A few important observations need to be made about this paragraph. The first is that it immediately follows the Court’s examination of the American Fifth and Fourteenth Amendments, with their guarantees of “life, liberty and property…” and is, in turn, followed by the Court’s examination of the American Fourth Amendment, which guarantees the protection of a person’s houses, papers, effects etc. from unreasonable searches and seizures. The Court’s engagement with the Fourth Amendment is ambiguous. It admits that “our Constitution contains no like guarantee…”, but holds that nonetheless “these extracts [from the 1949 case, Wolf v. Colorado] would show that an unauthorised intrusion into a person’s home and the disturbance caused to him thereby, is as it were the violation of a common law right of a man – an ultimate essential of ordered liberty”, thus tying its own holding in some way to American Fourth Amendment jurisprudence. But here’s the crucial thing: at this point, American Fourth Amendment jurisprudence was propertarian – that is, the Fourth Amendment was understood to codify, with added protection, the common law of trespass, whereby a man’s property was held sacrosanct, and not open to be trespassed against. Four years later, in 1967, in Katz, the Supreme Court would shift its own jurisprudence to holding that the Fourth Amendment protected zones where persons had a “reasonable expectation of privacy”, as opposed to simply protecting listed items of property (homes, papers, effects etc.). Kharak Singh was handed down before Katz. Yet the quoted paragraph expressly shows that the Court anticipated Katz, and in expressly grounding the Article 21 personal liberty right within the meaning of dignity, utterly rejected the propertarian-trespass foundations that it might have had. To use a phrase invoked by later Courts – in this proto-privacy case, the Court already set the tone by holding the right to attach to persons, not places.

    While effectively finding a right to privacy in the Constitution, the Court expressly declined to frame it that way. In examining police action which involved tracking a person’s location, association and movements, the Court upheld it, holding that “the right of privacy is not a guaranteed right under our Constitution and therefore the attempt to ascertain the movements of an individual which is merely a manner in which privacy is invaded is not an infringement of a fundamental right guaranteed by Part III.”

    The “therefore” is crucial. Without saying so expressly, the Court virtually holds that tracking location, association and movements does violate privacy, and finds it constitutional only because there is no guaranteed right to privacy within the Constitution. Yet.

    In his partly concurring and partly dissenting opinion, Subba Rao J. went one step further, holding that the idea of privacy was, in fact, contained within the meaning of Article 21: “it is true our Constitution does not expressly declare a right to privacy as a fundamental right, but the said right is an essential ingredient of personal liberty.” Privacy he defined as the right to “be free from restrictions or encroachments on his person, whether those restrictions or encroachments are directly imposed or indirectly brought about by calculated measures.” On this ground, he held all the surveillance measures unconstitutional.

    Justice Subba Rao’s opinion also explored a proto-version of the chilling effect. Placing specific attention upon the word “freely” contained within Article 19(1)(d)’s guarantee of free movement, Justice Subba Rao went specifically against the majority, observing:

    “The freedom of movement in clause (d) therefore must be a movement in a free country, i.e., in a country where he can do whatever he likes, speak to whomsoever he wants, meet people of his own choice without any apprehension, subject of course to the law of social control. The petitioner under the shadow of surveillance is certainly deprived of this freedom. He can move physically, but he cannot do so freely, for all his activities are watched and noted. The shroud of surveillance cast upon him perforce engender inhibitions in him and he cannot act freely as he would like to do. We would, therefore, hold that the entire Regulation 236 offends also Art. 19(1)(d) of the Constitution.”

    This early case, therefore, has all the aspects that plague the CMS today. What to do with administrative action that does not have the sanction of law? What role does targeting play in reasonableness – assuming there is a law? What is the philosophical basis for the implicit right to privacy within the meaning of Article 21’s guarantee of personal liberty? And is the chilling effect a valid constitutional concern?

    We shall continue with the development of the jurisprudence in the next post.


    You can follow Gautam Bhatia on Twitter

    Electoral Databases – Privacy and Security Concerns

    by Snehashish Ghosh last modified Jan 16, 2014 11:07 AM
    In this blogpost, Snehashish Ghosh analyzes privacy and security concerns which have surfaced with the digitization, centralization and standardization of the electoral database and argues that even though the law provides the scope for protection of electoral databases, the State has not taken any steps to ensure its safety.

    The recent move by the Election Commission of India (ECI) to tie up with Google to provide electoral look-up and information services for citizens has faced heavy criticism on the grounds of data security and privacy.[i] After due consideration, the ECI has decided to drop the plan.[ii]

    The plan to partner with Google led to much apprehension regarding Google gaining access to the database of 790 million voters, including personal information such as age, place of birth and residence. It could also have gained access to cell phone numbers and email addresses had a voter chosen to enroll via the online portal on the ECI website. Although the plan has been cancelled, this does not necessarily mean that the largest database of citizens of India is safe from security breach or abuse. In fact, the personal information of each voter in a constituency can be accessed by anyone through the ECI website, and the publication of electoral rolls is mandated by the law.

    Publication of Electoral Rolls
    The electoral roll essentially contains the name of the voter, the name of a relation (son of/wife of, etc.), age, sex, address and the photo identity card number. The main objective of the creation and maintenance of electoral rolls and the issue of the Electoral Photo Identity Card (EPIC) was to ensure a free and fair election in which each voter is able to cast his own vote as per his own choice. In other words, the main purpose of the exercise was to curtail bogus voting. This is achieved by cross-referencing the EPIC with the electoral roll.

    The process of creation and maintenance of electoral rolls is governed by the Registration of Electors Rules, 1960. Rule 22 requires the registration officer to publish the roll with the list of amendments at his office for inspection and public information. Furthermore, the ECI may direct the registration officer to send two copies of the electoral roll to every political party for which a symbol has exclusively been reserved by the ECI. It can be safely concluded that the electoral roll of a constituency is a public document[iii], given that the roll is published and can be circulated on the direction of the ECI.

    With the computational turn, in 1998 the ECI took the decision to digitize the electoral databases. Furthermore, printed electoral rolls and compact discs containing the rolls are available for sale to the general public.[iv] In addition, the electoral rolls for the entire country are available on the ECI website.[v] However, the current database is not uniform and standardized, and entries in some constituencies are available only in the local language. The ECI has taken steps to make the database uniform, standardized and centralized.[vi]

    Security Concerns
    The Registration of Electors Rules, 1960 is an archaic piece of delegated legislation which is still in force and casts a statutory duty on the ECI to publish the electoral rolls. The publication of electoral rolls is not a threat to security when it is distributed in hard copies and the availability of electoral rolls is limited. The security risks emerge only with the digitization of the electoral database: digitization allows for uniformity, standardization and centralization of the database, which in turn makes it vulnerable and subject to abuse. The law has failed to evolve with the change in technology.

    In a recent article titled "With Great Computing Power Comes Great Surveillance", Bill Davidow analyzes "the dark side of Moore’s Law" and argues that with the growth in processing power there has been a corresponding growth in surveillance capabilities.[vii] Drawing from Davidow’s argument, with the exponential growth in computing power, search has become convenient, fast and cheap. A uniform, standardized and centralized database bearing the personal information of 790 million voters can be searched and categorized according to any search term. The personal information of voters can be used for good, but it can equally be abused if it falls into the wrong hands. Big data analysis and raw computing power make it easier to target voters, as bits and pieces of personal information together give a bigger picture of an individual, a community, etc. This can be considered intrusive of individual privacy, since the personal information of every voter is made available in the public domain.

    For example, the availability of a centralized, searchable database of voters along with their age would allow the appropriate authorities to identify wards or constituencies which have a high proportion of voters above the age of 65. This would help the authority set up polling booths at closer locations with special amenities. However, the same database can be used to estimate the density of members of a particular community in a ward or constituency based on the name, age and sex of the voters. This information can be used to disrupt elections, target vulnerable communities during an election, and rig elections.
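    To illustrate how cheap such queries become once the roll is a standardized digital database, here is a minimal Python sketch. The records and field names are hypothetical (this is not the actual ECI schema, which is not public in machine-readable form); the point is that the very same one-line filter that helps plan booths for senior voters could just as easily profile any group in the roll:

    ```python
    # Hypothetical, simplified voter records for illustration only.
    # The real roll carries name, relation, age, sex, address and EPIC number.
    voters = [
        {"name": "A", "age": 72, "sex": "F", "ward": "W1"},
        {"name": "B", "age": 34, "sex": "M", "ward": "W1"},
        {"name": "C", "age": 68, "sex": "M", "ward": "W2"},
    ]

    def count_by_ward(records, predicate):
        """Count records matching a predicate, grouped by ward."""
        counts = {}
        for r in records:
            if predicate(r):
                counts[r["ward"]] = counts.get(r["ward"], 0) + 1
        return counts

    # Benign use: find wards with many voters above 65 to plan booth locations.
    senior_density = count_by_ward(voters, lambda r: r["age"] > 65)
    ```

    Swapping the predicate for one keyed on name or sex turns the same function into a community-profiling tool, which is precisely the dual-use risk described above.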

    Current IT laws do not mandate the protection of the electoral database
    A centralized electoral database of the entire country can be considered critical information infrastructure (CII), given the impact it may have on elections, which are the cornerstone of any democracy. Under Section 70 of the Information Technology Act, 2000 (IT Act), CII means “the computer resource, incapacitation or destruction of which, shall have debilitating impact on national security, economy.”[viii] However, the appropriate Government has not notified the electoral database as a protected system[ix]. Therefore, the information security practices and procedures prescribed for a protected system are not applicable to the electoral database.

    The Information Technology Rules (IT Rules) are also not applicable to electoral databases per se. Since the ECI is not a body corporate, the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (hereinafter Reasonable Security Practices Rules) do not apply to electoral databases. Setting aside the fact that the Reasonable Security Practices Rules only apply to a body corporate, the electoral database does fall within the ambit of the definition of “personal information”[x] and should arguably be made subject to the Rules.

    The intent of the ECI for hosting the entire country’s electoral database online inter alia is to provide electronic service delivery to the citizens. It seeks to provide “electoral look up services for citizens ... for better electoral information services.”[xi] However, the Information Technology (Electronic Service Delivery) Rules, 2011 are not applicable to the electoral database given that it is not notified by the appropriate Government as a service to be delivered electronically. Hence, the encryption and security standards for electronic service delivery are not applicable to electoral rolls.

    The IT Act and the IT Rules provide a reasonable scope for the appropriate Government to include electoral databases within the ambit of protected system and electronic service delivery. However, the appropriate government has not taken any steps to notify electoral database as protected system or a mode of electronic service delivery under the existing laws.

    Conclusion
    Publication of electoral rolls is a necessary part of the election process. It ensures free and fair elections and promotes transparency and accountability. But unfettered access to electronic electoral databases may have an adverse effect and endanger the very goal it seeks to achieve, because the electronic database may pose a threat to the privacy of voters and also lead to security breaches. It may be argued that the ECI is mandated by law to publish the electoral database and that the database is hence beyond the operation of the IT Act. But Section 81 of the IT Act has an overriding effect on any law inconsistent therewith. The appropriate Government should take the necessary steps under the IT Act and notify electoral databases as a protected system.

    It is recommended that the Registration of Electors Rules, 1960 be amended to take into account the advancement in technology. The amended Rules should aim at restricting unfettered electronic access to the electoral database and also introduce purpose limitations on its use. It should also be noted that more adequate and robust data protection and privacy laws should be put in place to regulate the collection, use, storage and processing of databases which are critical to national security.


    [i] Pratap Vikram Singh, Post-uproar, EC’s Google tie-up plan may go for a toss, Governance Now, January 7, 2014 available at http://www.governancenow.com/news/regular-story/post-uproar-ecs-google-tie-plan-may-go-toss

    [ii] Press Note No.ECI/PN/1/2014, Election Commission of India , January 9, 2014 available at http://eci.nic.in/eci_main1/current/PN09012014.pdf

    [iii] Section 74, Indian Evidence Act, 1872

    [vi] “At present, in most States and UTs the Electoral Database is kept at the district level. In some cases it is kept even with the vendors. In most States/UTs it is maintained in MS Access, while in some cases it is on a primitive technology like FoxPro and in some other cases on advanced RDBMS like Oracle or Sql Server. The database is not kept in bilingual form in some of the States/UTs, despite instructions of the Commission. In most cases Unicode fonts are not used. The database structure not being uniform in the country, makes it almost impossible for the different databases to talk to each other” –  Election Commission of India, Revision of Electoral Rolls with reference to 01-01-2010 as the qualifying date – Integration and Standardization of the database- reg., No. 23/2009-ERS, January 6, 2010 available at eci.nic.in/eci_main/eroll&epic/ins06012010.pdf

    [viii] Section 70, Information Technology Act, 2000

    [ix] Computer resource which directly or indirectly affects the facility of Critical Information Infrastructure

    [x] Rule 2(1)(i), Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011

    [xi] Press Note No.ECI/PN/1/2014, Election Commission of India , January 9, 2014 available at http://eci.nic.in/eci_main1/current/PN09012014.pdf

    GNI Assessment Finds ICT Companies Protect User Privacy and Freedom of Expression

    by Elonnai Hickok last modified Jan 20, 2014 06:17 AM
    Elonnai Hickok analyses a public report recently published by GNI on the independent assessment process for Google, Microsoft, and Yahoo. The report finds Google, Microsoft, and Yahoo to be in compliance with the GNI principles on privacy and freedom of expression.

    Introduction

    In January 2014, the Global Network Initiative (GNI) published the Public Report on the Independent Assessment Process for Google, Microsoft, and Yahoo. GNI is an industry consortium that was started in 2008 with the objective of protecting users’ rights to privacy and freedom of expression globally. The main objectives of GNI are to provide a framework for companies that is based on international standards, ensure accountability of ICT companies through independent assessments, create opportunities for policy engagement, and create opportunities for stakeholders from multiple jurisdictions to engage in dialogue with each other. The Centre for Internet and Society, Bangalore, is a member of GNI. Companies based in India have yet to join as members of the GNI network.

    Overview of the Public Report

    The Public Report provides an overview of assessments completed on the practices and policies of Google, Yahoo, and Microsoft from 2011 to 2013 to measure company compliance with the GNI principles on freedom of expression and privacy. The principles lay out broad guidelines that member companies should seek to incorporate into their internal and external practices, and speak to freedom of expression, privacy, responsible company decision making, multi-stakeholder collaboration, and organizational governance, accountability, and transparency. The GNI principles are accompanied by Implementation Guidelines that give companies a framework for responding to government requests. The assessment carried out by GNI reviewed cases in each company pertaining to governmental blocking and filtering, takedown requests, criminalization of speech, intermediary liability, selective enforcement, content surveillance, and requests for user information.

    Importantly, the assessment undertaken by GNI finds Yahoo, Microsoft, and Google to be in compliance with the GNI principles on freedom of expression and privacy. The Report highlights practices adopted by the companies that work to protect freedom of expression and privacy, such as conducting human rights impact assessments, issuing transparency reports, and notifying affected users when content is removed. For example, Google conducts Human Rights Impact Assessments to assess potential threats to freedom of expression and privacy. Google also has in place internal processes to review governmental requests impacting freedom of expression and privacy, and the legal team at Google prepares a “global removal report” to provide a bird’s eye view of trends emerging from content removal requests. If Google has the email address of a user whose posted content is removed, Google will often notify the user and direct the user to the Chilling Effects website. Google has also published a transparency report since 2010. Like Google, Microsoft conducts Human Rights Impact Assessments before deciding whether to incorporate certain features into its platforms when operating in high risk markets. Microsoft also issued two global law enforcement requests reports in 2013. Yahoo has established a Business and Human Rights Program to ensure responsible actions are taken by the company with regard to freedom of expression and privacy, and now issues transparency reports about government requests. Yahoo’s Public Policy team also engages in dialogue with governments on an international level about existing and proposed legislation impacting and implicating privacy and freedom of expression.

    The Report highlights challenges to compliance with the GNI principles that companies face, namely the legal restraints and mandates imposed on them. On the issue of transparency, the assessment found that companies do not disclose information when there are legal prohibitions on such disclosure, when users’ privacy would be implicated, when companies choose to assert attorney-client privilege, and when trade secrets are involved. Despite this, the assessment found that companies do deny and push back on governmental requests impacting freedom of expression and privacy, for reasons such as that the request needed clarification or modification, or needed to follow established procedure.

    A number of findings came out of the assessments undertaken for the Report including:

    1. As demonstrated by the inability to access information about secret national security requests, and the inability of companies to disclose information on this topic, there is a dire need for governments to reform surveillance policy and law impacting freedom of expression and privacy.
    2. The implementation of the GNI Principles is challenging when a company is undergoing an acquisition. In this scenario, contractual provisions limiting third party disclosure are critical in ensuring protection of privacy and free expression rights.
    3. Companies need to proactively, and on an ongoing basis, internally review governmental restrictions on content to determine whether compliance is consistent with the commitments the company has made to the GNI Principles.

    The assessment resulted in GNI defining a number of actionable (non-binding) recommendations for companies such as:

    • Improve the integration of human rights considerations into the due diligence process with respect to acquiring and selling companies.
    • Consider the impact of hardware on freedom of expression and privacy.
    • Improve external and internal reporting.
    • Review employee access to user data to ensure that employee access rights are restricted by both policy and technical measures on a ‘need to know’ basis across global operations.
    • Review executive management training.
    • Improve stakeholder engagement.
    • Improve communication with users.
    • Increase sharing of best practices.
    These recommendations reflect the GNI principles’ focus on freedom of expression and privacy, grounded in internationally recognized human rights laws and standards.

    NSA leaks, global push for governmental surveillance reform, and the Public Report

    With special attention given to the various companies’ responses to the NSA leaks, the Report notes that the assessed companies have issued public statements, filed legal challenges with the US government, and filed suit in the FISA Court seeking the right to disclose to the public data relating to the number of FISA requests received. All three companies have also supported legislation and policy that would allow for such transparency. Furthermore, in December 2013, the companies, along with other internet companies, developed and issued the five Principles on Global Government Surveillance Reform. Similar to other efforts to end mass and disproportionate surveillance, such as the Necessary and Proportionate principles, the Principles on Global Government Surveillance Reform address: limiting governments’ authority to collect users’ information; oversight and accountability; transparency about government demands; respecting the free flow of information; and avoiding conflicts among governments. Other companies that signed these principles include AOL, Facebook, LinkedIn, and Twitter.

    Along these lines, on January 14th, GNI released the statement “Surveillance Reforms to Protect Rights and Restore Trust”, urging the U.S. Government to review and enact surveillance legislation that incorporates a ‘rights based’ approach to issues involving national security. In the statement, GNI specifically recommends that the Government end the mass collection of communications metadata, protect and uphold the rights of non-Americans, continue to increase transparency of surveillance practices, and support the use of strong encryption standards.

    Conclusion and way forward

    Looking ahead, GNI is planning on developing and implementing a mechanism to effectively address consumer engagement and complaints from individuals who feel that GNI member companies have not acted consistently with the commitments they made as GNI members. GNI is also looking to expand work around public policy and surveillance.

    The Public Report on the Independent Assessment Process for Google, Microsoft, and Yahoo is an important step towards ensuring that ICT sector companies are accountable to the public in their practices impacting freedom of expression and privacy. The assessment comes at a time when ICT companies often find themselves stuck between a rock and a hard place – with Governments issuing surveillance and censorship demands accompanied by mandates for non-disclosure, and the public demanding transparency, resistance to such governmental demands, and a strong commitment to users’ freedom of expression and privacy. Hopefully, the GNI assessment will evolve into a middle ground for ICT companies – where they can be accountable to the public and their customers while remaining compliant with governmental mandates in all jurisdictions in which they operate. It will be interesting to see whether Indian companies join GNI as members in the future and begin to adopt the GNI principles and undergo GNI assessments.

    Interview with Mathew Thomas from the Say No to UID campaign - UID Court Cases

    by Maria Xynou last modified Jan 27, 2014 12:47 PM
    The Centre for Internet and Society (CIS) recently interviewed Mathew Thomas from the Say No to UID campaign about his ongoing efforts to challenge the UID scheme legally in the Bangalore High Court and Supreme Court of India. Read this interview and gain an interesting insight into recent legal developments with regard to the UID!

    Hi Mathew! We've heard that you've been in court a lot over the last few years with regards to the UID scheme. Could you please tell us about the UID case you have filed?

    In early 2012, I filed a civil suit at the Bangalore Court to declare the UID scheme illegal and to stop further biometric enrollments. I alleged that foreign agencies are involved in the process of biometric enrollment, and that cases of corruption have occurred with regards to the companies contracted by the UID Authority of India (UIDAI). Many dubious companies have been empanelled for biometric enrollments by the UIDAI and many cases of corruption have been noted, especially with regards to the preparation of biometric databases for below poverty line (BPL) ration cards in Karnataka.

    In 2010, according to a government audit report, COMAT Technologies Private Limited had a contract with the Karnataka Government and was required to undertake a door-to-door survey and to set up biometric devices. COMAT Technologies Private Limited was paid ₹ 542.3 million for this purpose, but it did not comply with the terms of the contract or fulfill its obligations under it: despite being paid, the company did not hand over any biometric devices to the Karnataka Government. Instead, when questioned, it walked away from the contract in 2010, having been paid for a service it did not deliver.

    In the same year, 2010, COMAT Technologies was empanelled as an Enrolling Agency of the UIDAI. COMAT Technologies also carries out enrollments in Mysore, and a TV channel sting operation revealed that fake IDs were being issued in the Mysore enrollment center. After much persuasion, the e-Government department of Karnataka informed me that it had filed an FIR. And this is just one case of a corrupt company empanelled as an enrollment agency with the UIDAI. Many similar cases with other companies have occurred in other cities in India, such as Mumbai, where the empanelled agencies have committed fraud and police complaints have been filed. But unfortunately, there is no publicly available information on the state of the investigations.

    As such, I filed a case at the Bangalore Court and stated that the whole UID system is insecure, that it will not achieve the objective of preventing leakages of welfare subsidies and that, therefore, it is a waste of public funds, which also affects individuals' right to privacy and right to life. In my complaint in the civil court I made allegations of corruption and dangers to national security, backed by documentary evidence. According to Order 8 of the Civil Procedure Code (CPC), defendants are required to specifically deny each of the allegations against them, and if they do not, the court is required to accept the allegations as accurate. According to law, vague, bald denials are not acceptable in courts. Interestingly enough, the defendants in this court case did not deny any of the allegations, but instead stated that the allegations were “trivial” and requested the judge to dismiss the case without a trial. The judge requested the defendants to file a written application asking for the suit to be dismissed under Order 7, Rule 11, of the Civil Procedure Code. Nonetheless, in May 2012, the judge observed that this is a serious case which should not be dismissed and that he would like to have a daily hearing of the case, especially since the case was grounded on the allegation that thousands of crores of rupees of public money are spent every day.

    However, one month later in June 2012, the judge dismissed the case by stating that I did not have a “cause of action” and that the case is not of civil nature under Section 9 of the Code of Civil Procedure. I argued that tax payers have a right to know where their money is going and that we all have a right to privacy, and that therefore I did have a cause of action. I quoted the Supreme Court case setting out the law relating to the meaning of “civil nature”. The Apex Court said, “Anything which is not of criminal nature is of civil nature”. I also quoted several court precedents which explained the conditions under which complaints could be dismissed under Order VII Rule 11. Unfortunately though, the judge dismissed all of this and suggested that I should take the case to the High Court or to the Supreme Court, since the Bangalore Court did not have the authority to address the violation of fundamental human rights. In my opinion, the fallacy in this judgement was that, on the one hand, the judge stated in his order that there was “no cause of action”, but on the other hand, he said that I should take the case to the High Court or to the Supreme Court! And on top of that, the judge stated that my case was frivolous and levied on me a Rs. 25,000 fine, because apparently I was “wasting the court's time”!

    In addition to all of this, the judge made a very intriguing statement in his order: he claimed that the biometric enrollment with the UIDAI is voluntary and that therefore I need not enrol. I argued that although the UID is voluntary in theory, it is actually mandatory on many levels, especially since access to many governmental services require enrollment with the UIDAI. Nonetheless, the judge insisted that the UID is purely voluntary and that if I am not happy with the UID, then I should just “stay at home”.

    And how did the case continue thereafter?

    In October 2012 I appealed against this to the High Court by stating that there was a misapplication of Order 7, Rule 11, of the Civil Procedure Code and requested the High Court to send the suit back for trial at the Bangalore Court.

    Now, when you appeal in India, the Court has to issue notices to the opposite party, which are usually sent by registered post. However, nothing was happening, so I filed a number of applications to hear the case. The registrar’s office filed a number of trivial “objections” with which I needed to comply and this took three months, until January 2013. For example, one “objection” was that the lower court order stated the date of the order as "03-07-12", whereas I had mentioned the date as 3 July 2012. Then they would argue that the acknowledgement of the receipt of the notice from the respondents was not received. The High Court is located next to the head post office (GPO) in Bangalore and normally it would be sent there, then directly to the GPO in Delhi and from there to the Planning Commission or to the UIDAI. Yet, the procedure was delayed because apparently the notices weren't sent. In one hearing, the court clerk said that the address of the defendant was wrong and that the address of the Planning Commission should also be included. All in all, it seemed to me like there was some deliberate attempt to delay the procedure and the dismissal of the case by the Bangalore Court seemed very questionable. As a result, in January 2013, I asked the High Court to permit me to personally hand over my appeal to the Government Council. And finally, on 17th December 2013, my appeal was heard by the Bangalore High Court!

    Over the last three months, the defendants have not filed any counter affidavit. Instead, the Government Council came to the High Court and stated that I have not filed a “paper book” (which includes depositions and evidence, among other things). However, the judge stated that this is not a case which requires a “paper book”, since my appeal was about the misapplication of Order 7, Rule 11, of the Civil Procedure Code. Then the Government Council asked for more time to review the appeal, and the matter has been postponed.

    Have there been any other recent court cases against the UID?

    Yes. While all of this was going on, retired judge, Justice Puttaswamy, filed a petition in the Supreme Court, stating that the UID scheme is illegal, since it violates article 73 of the Constitution. Aruna Roy, who is an activist at the National Council for People’s Right to Information, has also filed a petition where she has questioned the UID because it violates privacy rights and the rights of the poor.

    Furthermore, petitions have been filed in the Madras High Court and in the Mumbai High Court. In 2012, it was argued in the Madras High Court that the only legal provision for taking fingerprints exists under the Prisoners Act, whereas the UIDAI is taking the fingerprints of people who are not prisoners and therefore it is illegal. In 2013, Vikram Crishna, Kamayani Bahl and a few others argued in the Mumbai High Court that the right to privacy is being violated through the UID scheme. It is noteworthy that in most of these cases, the defendants have not filed any counter-arguments. The only exceptions were in the Aruna Roy and Puttaswamy cases, where the defendants claimed that the UID is secure and supported it in general. In the end, the Supreme Court directed that the cases in Mumbai and Madras should be clubbed together and addressed by it. As such, the cases filed in the Madras and Mumbai High Courts have been sent to the Supreme Court of India.

    Major General Vombathakere also filed a petition in the Supreme Court, arguing that the UID scheme violates individuals' right to privacy. When the counsel for the General commenced his arguments, the judge pointed to the possibility of the Government soon passing the NIA Bill, which, according to the Government, will contain provisions for privacy. As such, the judge implied that if the Government passes such a law, the argument that the Government is implementing the scheme in a legal vacuum may not be valid.

    So what is the status of your pending court cases?

    Well, I impleaded myself in Aruna Roy's petition and brought my arguments with regard to corruption in the case of companies contracted with the UIDAI and the danger to national security through the involvement of persons linked to US intelligence agencies. The last hearing in the Supreme Court was on 10th December 2013, but it was postponed to 28 January 2014. So in short, in the Supreme Court I am currently pursuing an investigation into corruption and links with foreign intelligence agencies on the part of companies contracted with the UIDAI, while in the Bangalore High Court, I have appealed for a civil trial with regard to the misapplication of Order 7, Rule 11, of the Civil Procedure Code.

    Surveillance and the Indian Constitution - Part 2: Gobind and the Compelling State Interest Test

    by Pranesh Prakash last modified Jan 27, 2014 06:03 PM
    Gautam Bhatia analyses the first case in which the Supreme Court recognized a constitutional right to privacy, Gobind v. State of Madhya Pradesh, and argues that the holding in that case adopted the three-pronged American test of strict scrutiny, compelling State interest, and narrow tailoring in its approach to privacy violations.

    After its judgment in Kharak Singh, the Court was not concerned with the privacy question for a while. The next case that dealt – peripherally – with the issue came eleven years later. In R.M. Malkani v State of Maharashtra, the Court held that attaching a recording device to a person’s telephone did not violate S. 25 of the Telegraph Act, because

    "where a person talking on the telephone allows another person to record it or to hear it, it can-not be said that the other person who is allowed to do so is damaging, removing, tampering, touching machinery battery line or post for intercepting or acquainting himself with the contents of any message. There was no element of coercion or compulsion in attaching the tape recorder to the telephone."

    Although this case was primarily about the admissibility of evidence, the Court also took time out to consider – and reject – a privacy-based Article 21 argument, holding that:

    "Article 21 was invoked by submitting that the privacy of the appellant’s conversation was invaded. Article 21 contemplates procedure established by law with regard to deprivation of life or personal liberty. The telephonic conversation of an innocent citizen will be protected by Courts against wrongful or high handed interference by tapping the conversation. The protection is not for the guilty citizen against the efforts of the police to vindicate the law and prevent corruption of public servants. It must not be understood that the Courts will tolerate safeguards for the protection of the citizen to be imperiled by permitting the police to proceed by unlawful or irregular methods."

    Apart from the fact that it joined Kharak Singh in refusing to expressly find a privacy right within the contours of Article 21, there is something else that unites Kharak Singh and R.M. Malkani: the hypothetical in Kharak Singh became a reality in Malkani – what saved the telephone tapping was precisely that it was directed at "… a guilty person", with the Court specifically holding that the laws were not for targeting innocent people. Once again, then, the targeted and specific nature of interception became a crucial – and in this case, a decisive – factor. One year later, in another search and seizure case, Pooran Mal v Inspector, the Court cited M.P. Sharma and stuck to its guns, refusing to incorporate the Fourth Amendment into Indian Constitutional law.

    It is Gobind v State of MP, decided in 1975, that marks the watershed moment for Indian privacy law in the Constitution. Like Kharak Singh, Gobind also involved domiciliary visits to the house of a history-sheeter. Unlike Kharak Singh, however, in Gobind the Court found that the Regulations did have statutory backing – S. 46(2)(c) of the Police Act, which allowed the State Government to make notifications giving effect to the provisions of the Act, one of which was the prevention of the commission of offences. The surveillance provisions in the impugned regulations, according to the Court, were indeed for the purpose of preventing offences, since they were specifically aimed at repeat offenders. To that extent, then, the Court found that there existed a valid “law” for the purposes of Articles 19 and 21.

    By this time, of course, American constitutional law had moved forward significantly from eleven years earlier, when Kharak Singh had been decided. The Court was able to invoke Griswold v Connecticut and Roe v Wade, both of which had found "privacy" as an "interstitial" or "penumbral" right in the American Constitution – that is, not reducible to any one provision, but implicit in a number of separate provisions taken together. The Court ran together a number of American authorities, referred to Locke and Kant, to dignity, to liberty and to autonomy, and ended by holding, somewhat confusingly:

    “the right to privacy must encompass and protect the personal intimacies of the home, the family, marriage, motherhood, procreation and child rearing. This catalogue approach to the question is obviously not as instructive as it does not give analytical picture of that distinctive characteristics of the right of privacy. Perhaps, the only suggestion that can be offered as unifying principle underlying the concept has been the assertion that a claimed right must be a fundamental right implicit in the concept of ordered liberty… there are two possible theories for protecting privacy of home. The first is that activities in the home harm others only to the extent that they cause offence resulting from the mere thought that individuals might be engaging in such activities and that such ‘harm’ is not Constitutionally protective by the state. The second is that individuals need a place of sanctuary where they can be free from societal control. The importance of such a sanctuary is that individuals can drop the mask, desist for a while from projecting on the world the image they want to be accepted as themselves, an image that may reflect the values of their peers rather than the realities of their natures… the right to privacy in any event will necessarily have to go through a process of case-by-case development."

    But if no clear principle emerges out of the Court’s elucidation of the right, it was fairly unambiguous in stressing the importance of the right itself. Interestingly, it grounded the right within the context of the freedom struggle. "Our founding fathers," it observed, "were thoroughly opposed to a Police Raj even as our history of the struggle for freedom has borne eloquent testimony to it." (Para 30) The parallels to the American Fourth Amendment are striking here: in his historical analysis, Akhil Amar tells us that the Fourth Amendment was meant precisely to avoid the various abuses of unreasonable searches and seizures that were common in England at the time.

    The parallels with the United States become even more pronounced, however, when the Court examined the grounds for limiting the right to privacy. "Assuming that the fundamental rights explicitly guaranteed to a citizen have penumbral zones and that the right to privacy is itself a fundamental right, that fundamental right must be subject to restriction on the basis of compelling public interest." "Compelling public interest" is an interesting phrase, for two reasons. First, “public interest” is a ground for fundamental rights restrictions under Article 19 (see, e.g., Article 19(6)), but the text of the Article 19 restrictions does not use – and the Court, in interpreting them, has not held – that the public interest must be “compelling”. This suggests a stricter standard of review for an Article 21 privacy right violation than for Article 19 violations. This is buttressed by the fact that in the same paragraph, the Court ended by observing: “even if it be assumed that Article 19(5) [restrictions upon the freedom of movement] does not apply in terms, as the right to privacy of movement cannot be absolute, a law imposing reasonable restriction upon it for compelling interest of State must be upheld as valid.” The Court echoes the language of 19(5), and adds the word “compelling”. This surely cannot be an oversight.

    More importantly – the compelling State interest test is an American test, used often in equal protection cases and cases of discrimination, where “suspect classes” (such as race) are at issue. Because of the importance of the right at issue, the compelling state interest test goes hand-in-hand with another test: narrow tailoring. Narrow tailoring places a burden upon the State to demonstrate that its restriction is tailored so as to infringe the right in the narrowest manner possible while still achieving its goals. The statement of the rule may be found in the American Supreme Court case of Grutter v Bollinger:

    "Even in the limited circumstance when drawing racial distinctions is permissible to further a compelling state interest, government is still constrained under equal protection clause in how it may pursue that end: the means chosen to accomplish the government’s asserted purpose must be specifically and narrowly framed to accomplish that purpose."

    To take an extremely trivial example that will illustrate the point: the State wants to ban hate speech against Dalits. It passes legislation that bans “all speech that disrespects Dalits.” This is not narrowly tailored, because while all hate speech against Dalits necessarily disrespects them, not all speech that disrespects Dalits is hate speech. It was possible for the government to pass legislation banning only hate speech against Dalits, one that would have infringed upon free speech more narrowly than the “disrespect law”, and still achieved its goals. The law is not narrowly tailored.

    Crucially, then, the Court in Gobind seemed to implicitly accept narrow tailoring as the flip side of the compelling state interest coin. It upheld the constitutionality of the Police Regulations themselves by reading them narrowly. Here is what the Court said:

    “Regulation 855, in our view, empowers surveillance only of persons against whom reasonable materials exist to induce the opinion that they show a determination, to lead a life of crime – crime in this context being confined to such as involve public peace or security only and if they are dangerous security risks. Mere convictions in criminal cases where nothing gravely imperiling safety of society cannot be regarded as warranting surveillance under this Regulation. Similarly, domiciliary visits and picketing by the police should be reduced to the clearest cases of danger to community security and not routine follow-up at the end of a conviction or release from prison or at the whim of a police officer.”

    But Regulation 855 did not refer to the gravity of the crime at all. Thus, the Court was able to uphold its constitutionality only by narrowing its scope in a manner that the State’s objective of securing public safety was met in a way that minimally infringed the right to privacy.

    Therefore, whether the Gobind bench was aware of it or not, its holding incorporates into Indian constitutional law and the right to privacy not just the compelling State interest test, but narrow tailoring as well. The implications for the CMS are obvious. Under narrow tailoring, the State must demonstrate that bulk surveillance of all individuals – whether guilty or innocent, suspected of crimes or not (reasonably or otherwise), possessing a past criminal record or not, speaking to each other of breaking up the government or breaking up a relationship – is necessary to achieve the goal of maintaining public security, and that nothing narrower will suffice. Can the State demonstrate this? I do not think it can, but at the very least, it should be made to do so in open Court.

    Making the Powerful Accountable

    by Chinmayi Arun last modified Jan 30, 2014 06:43 AM
    If powerful figures are not subjected to transparent court proceedings, the opacity in the face of a critical issue is likely to undermine public faith in the judiciary.

    CANDID CAMERA: Media coverage is often critical to whether someone relatively powerless is able to assert her rights against a very powerful person. Photo: Monica Tiwari


    Chinmayi Arun's Op-ed was published in the Hindu on January 29, 2014.


    It is odd indeed that the Delhi High Court seems to believe that sensational media coverage can sway the Supreme Court into prejudice against one of its own retired judges. Justice Manmohan Singh of the Delhi High Court has said in Swatanter Kumar v. Indian Express and others that the pervasive sensational media coverage of the sexual harassment allegations against the retired Supreme Court judge 'may also result in creating an atmosphere in the form of public opinion wherein a person may not be able to put forward his defence properly and his likelihood of getting fair trial would be seriously impaired.'  This Delhi High court judgment has drawn upon the controversial 2011 Supreme Court judgment in Sahara India Real Estate Corp. Ltd v. SEBI (referred to as the Gag Order case here) to prohibit the media from publishing headlines connecting retired Justice Swatanter Kumar with the intern's allegations, and from publishing his photograph in connection with the allegations.

    Although the Gag Order judgment was criticised at the time it was delivered, Swatanter Kumar v. Indian Express illustrates its detractors' argument more vividly than anyone could have imagined.

    Sukumar Muralidharan wrote of the Gag Order case that the postponement (of media coverage) order remedy it created could become an "instrument in the hands of wealthy and influential litigants, to subvert the course of open justice".

    Here we find that although a former Supreme Court judge is pitted against a very young former intern within a system over which he once presided, Justice Manmohan Singh seems to think that it is the judge who is in danger of being victimised.

    The Swatanter Kumar judgment was enabled by both the Gag Order case as well as the 1966 Supreme Court judgment in Naresh Sridhar Mirajkar v. State of Maharashtra, which in combination created a process for veiling court proceedings. Naresh Mirajkar stated that courts' inherent powers extend to barring media reports and comments on ongoing trials in the interests of justice, and that such powers do not violate the right to freedom of speech; and the Gag Order case created an instrument - the 'postponement order' - for litigants, such that they can have media reports of a pending case restricted. The manner in which this is used in the Swatanter Kumar judgment raises very worrying questions about how the judiciary views the boundaries of the right to freedom of expression, particularly in the context of reporting court proceedings.

    Broad power to restrict reporting

    The Gag Order case was problematic: it used arguments for legitimate restraints on media reporting in exceptional circumstances, to permit restrictions on media reporting of court proceedings under circumstances 'where there is a real and substantial risk of prejudice to fairness of the trial or to proper administration of justice'.  The Supreme Court refused to narrow this or clarify what publications would fall within this category. It merely stated that this would depend on the content and context of the offending publication, and that no 'straightjacket formula' could be created to enumerate these categories. This leaves the higher judiciary with a broad discretionary power to decide what amounts to legitimate restraints on media reporting, using an ambiguous standard. Exercise of this power to veil proceedings involving powerful public figures whose actions have public implications imperils openness and transparency when they are most critical.

    Court proceedings are usually open to the public. This openness serves as a check on the judiciary, and ensures public faith in the judiciary. In countries as large as ours, media coverage of important cases ensures actual openness of court proceedings - we are able to follow the arguments made by petitioners who ask that homosexuality be decriminalised, the trial of suspected terrorists and alleged murderers, and the manner in which our legal system handles sexual harassment complaints filed by young women.

    When court proceedings are closed to the public (known as 'in-camera' trials) or when media dissemination of information about them is restricted, the openness and transparency of court proceedings is compromised. Such compromise of transparency does take place in many countries, to protect the rights of the parties involved, or prevent miscarriage of justice. For example, child-participants are protected by holding trials in-camera; names of parties to court proceedings are withheld to protect their privacy sometimes; and in countries where juries determine guilt, news coverage that may prejudice the jury is also restricted.

    The damage done

    Although the Supreme Court stated in principle that the openness of court proceedings should only be restricted where strictly necessary, this appears to lend itself to very varied interpretation. For example, it is very difficult for some of us to understand why it was strictly necessary to restrict media coverage of the sexual harassment proceedings in the Swatanter Kumar case. Justice Manmohan Singh, on the other hand, seems to believe that adverse public opinion will affect the retired judge's chance of getting a fair trial. His judgment also seems to indicate his concern that the sensational headlines will impact public confidence in the Supreme Court.

    The Delhi High Court's apprehension about the effects of the newspaper coverage on the judge's reputation did not need to translate into a prior restraint on media coverage. It may have been better addressed later, by evaluating a defamation claim pertaining to the published material. The larger concerns about the reputation of the judiciary are better addressed by openness: if powerful public figures, especially those with as much influence as a former Supreme Court judge, are not subjected to transparent court proceedings, the opacity in the face of such a critical issue is likely to undermine public faith in the judiciary as an institution. Such opacity undermines the purpose of open courts, and is much worse for the reputation of the judiciary than publicised complaints about individual judges.

    Since the Delhi High Court ruling, there has been little media coverage of the sexual harassment case. Suppression of media coverage leaves the young woman comparatively isolated. Wide coverage of the harassment complaint involving Justice Ganguly helped the intern in that case find support. The circulation of information enabled other former interns, as well as a larger network of lawyers and activists, to reach out to her. This is apart from the general pressure to be fair that arises when a case is being followed closely by the public. Media coverage is often critical to whether someone relatively powerless is able to assert her rights against a very powerful person. This is why media freedom is sacred to democracies.

    If the Supreme Court was confident that the high courts in India would use their broad discretionary power under the Gag Order case sparingly and only in the interests of justice, the Swatanter Kumar case should offer it grounds to reconsider.  Openness and freedom of expression are not meant to be diluted to protect the powerful - they exist precisely to ensure that even the powerful are held accountable by state systems that they might otherwise be able to sway.

    (Chinmayi Arun is research director, Centre for Communication Governance, National Law University, Delhi, and fellow, Centre for Internet and Society, Bangalore.)

    India's Central Monitoring System (CMS): Something to Worry About?

    by Maria Xynou last modified Feb 22, 2014 01:50 PM
    In this article, Maria Xynou presents new information about India's controversial Central Monitoring System (CMS) based on official documents which were shared with the Centre for Internet and Society (CIS). Read this article and gain an insight on how the CMS actually works!
    India's Central Monitoring System (CMS): Something to Worry About?

    by SnaPsi on flickr

    The idea of a Panopticon – of monitoring all communications in India and centrally storing such data – is not new. It was first envisioned in 2009, following the 2008 Mumbai terrorist attacks. As such, the Central Monitoring System (CMS) started off as a project run by the Centre for Communication Security Research and Monitoring (CCSRM), along with the Telecom Testing and Security Certification (TTSC) project.

    The Central Monitoring System (CMS), which was largely covered by the media in 2013, was actually approved by the Cabinet Committee on Security (CCS) on 16th June 2011 and the pilot project was completed by 30th September 2011. Ever since, the CMS has been operated by India's Telecom Enforcement Resource and Monitoring (TERM) cells, and has been implemented by the Centre for Development of Telematics (C-DOT), which is an Indian Government owned telecommunications technology development centre. The CMS has been implemented in three phases, each one taking about 13-14 months. As of June 2013, government funding of the CMS has reached at least Rs. 450 crore (around $72 million).

    In order to require Telecom Service Providers (TSPs) to intercept all telecommunications in India as part of the CMS, clause 41.10 of the Unified Access Services (UAS) License Agreement was amended in June 2013. In particular, the amended clause includes the following:

    “But, in case of Centralized Monitoring System (CMS), Licensee shall provide the connectivity upto the nearest point of presence of MPLS (Multi Protocol Label Switching) network of the CMS at its own cost in the form of dark fibre with redundancy. If dark fibre connectivity is not readily available, the connectivity may be extended in the form of 10 Mbps bandwidth upgradeable upto 45 Mbps or higher as conveyed by the Government, till such time the dark fibre connectivity is established. However, LICENSEE shall endeavor to establish connectivity by dark optical fibre at the earliest. From the point of presence of MPLS network of CMS onwards traffic will be handled by the Government at its own cost.”

    Furthermore, draft Rule 419B under Section 5(2) of the Indian Telegraph Act, 1885, allows for the disclosure of “message related information” / Call Data Records (CDR) to Indian authorities. Call Data Records, otherwise known as Call Detail Records, contain metadata (data about data) that describe a telecommunication transaction, but not the content of that transaction. In other words, Call Data Records include data such as the phone numbers of the calling and called parties, the duration of the call, the time and date of the call, and other such information, while excluding the content of what was said during such calls. According to draft Rule 419B, directions for the disclosure of Call Data Records can only be issued on a national level through orders by the Secretary to the Government of India in the Ministry of Home Affairs, while on the state level, orders can only be issued by the Secretary to the State Government in charge of the Home Department.
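To make the metadata/content distinction concrete, here is a minimal sketch of what a Call Data Record might look like. The field names are hypothetical, chosen to mirror the categories listed above; this is not an official CDR schema.

```python
from dataclasses import dataclass, asdict

# Hypothetical CDR sketch: fields mirror the metadata categories
# described in the text, not any actual telecom operator's schema.
@dataclass
class CallDataRecord:
    calling_number: str
    called_number: str
    start_time: str        # e.g. an ISO 8601 timestamp
    duration_seconds: int
    cell_id: str           # coarse location of the subscriber

# Note what is absent: no audio, no transcript. A CDR describes the
# transaction itself, never the content of the conversation.
cdr = CallDataRecord("+91XXXXXXXXXX", "+91YYYYYYYYYY",
                     "2013-06-01T10:15:00+05:30", 210, "404-22-1234")
print(asdict(cdr))
```

Even without content, such records reveal who spoke to whom, when, for how long, and roughly where – which is why their disclosure regime matters.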

    Other than this draft Rule and the amendment to clause 41.10 of the UAS License Agreement, no law exists which mandates or regulates the Central Monitoring System (CMS). This mass surveillance system is merely regulated under Section 5(2) of the Indian Telegraph Act, 1885, which empowers the Indian Government to intercept communications on the occurrence of any “public emergency” or in the interest of “public safety”, when it is deemed “necessary or expedient” to do so in the following instances:

    • the interests of the sovereignty and integrity of India

    • the security of the State

    • friendly relations with foreign states

    • public order

    • for preventing incitement to the commission of an offense

    However, Section 5(2) of the Indian Telegraph Act, 1885, appears to be rather broad and vague, and fails to explicitly regulate the details of how the Central Monitoring System (CMS) should function. As such, the CMS appears to be inadequately regulated, which raises many questions with regards to its potential misuse and subsequent violation of Indians' right to privacy and other human rights.

    So how does the Central Monitoring System (CMS) actually work?

    We have known for quite a while now that the Central Monitoring System (CMS) gives India's security agencies and income tax officials centralized access to the country's telecommunications network. The question, though, is how.

    Well, prior to the CMS, all service providers in India were required to have Lawful Interception Systems installed at their premises in order to carry out targeted surveillance of individuals by monitoring communications running through their networks. Now, in the CMS era, all TSPs in India are required to integrate Interception Store & Forward (ISF) servers with their pre-existing Lawful Interception Systems. Once ISF servers are installed in the premises of TSPs in India and integrated with Lawful Interception Systems, they are then connected to the Regional Monitoring Centres (RMC) of the CMS. Each Regional Monitoring Centre (RMC) in India is connected to the Central Monitoring System (CMS). In short, the CMS involves the collection and storage of data intercepted by TSPs in central and regional databases.

    In other words, all data intercepted by TSPs is automatically transmitted to Regional Monitoring Centres, and subsequently automatically transmitted to the Central Monitoring System. This means that not only can the CMS authority have centralized access to all data intercepted by TSPs all over India, but that the authority can also bypass service providers in gaining such access. This is due to the fact that, unlike in the case of so-called “lawful interception” where the nodal officers of TSPs are notified about interception requests, the CMS allows for data to be automatically transmitted to its datacentre, without the involvement of TSPs.
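The difference between the two regimes can be sketched as a toy model. Everything below is an illustrative simplification of the flow described above (TSP → ISF server → Regional Monitoring Centre → CMS); the class and function names are my own, not real system components.

```python
# Toy model of the two interception paths described in the text.
# Assumption: grossly simplified; real systems differ in every detail.

class TSP:
    """A telecom service provider with a nodal officer."""
    def __init__(self):
        self.nodal_officer_notified = []  # requests the TSP's staff saw

    def lawful_interception(self, target):
        # Pre-CMS path: the nodal officer is notified of each request.
        self.nodal_officer_notified.append(target)
        return f"intercepted:{target}"

    def cms_interception(self, target):
        # CMS path: data leaves via the ISF server with no notification.
        return f"intercepted:{target}"

def forward_to_cms(data):
    # ISF server output is copied regionally, then centrally.
    rmc = {"regional_copy": data}                  # Regional Monitoring Centre
    cms = {"central_copy": rmc["regional_copy"]}   # central CMS database
    return cms

tsp = TSP()
record = tsp.cms_interception("+91XXXXXXXXXX")
central = forward_to_cms(record)
# The key point: under the CMS path, the TSP's nodal officer never
# learned that an interception took place.
assert tsp.nodal_officer_notified == []
```

The model makes the structural point explicit: once forwarding is automatic, the service provider drops out of the loop entirely, and with it any chance of a provider-side check on a given request.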

    The above is illustrated in the following chart:

    [Chart: data intercepted at the Lawful Interception Systems of TSPs flows through ISF servers and Regional Monitoring Centres to the central CMS database]

    The interface testing of TSPs and their Lawful Interception Systems has already been completed and, as of June 2013, 70 ISF servers have been purchased for six License Service Areas and are being integrated with the Lawful Interception Systems of TSPs. The Centre for Development of Telematics has already fully installed and integrated two ISF servers in the premises of two of India's largest service providers: MTNL and Tata Communications Limited. In Delhi, ISF servers which connect with the CMS have been installed for all TSPs and testing has been completed. In Haryana, three ISF servers have already been installed in the premises of TSPs and the rest are currently being installed. In Chennai, five ISF servers have been installed so far, while in Karnataka, ISF servers are currently being integrated with the Lawful Interception Systems of the TSPs in the region.

    The Centre for Development of Telematics plans to integrate ISF servers which connect with the CMS in the premises of service providers in the following regions:

    • Delhi

    • Maharashtra

    • Kolkata

    • Uttar Pradesh (West)

    • Andhra Pradesh

    • Uttar Pradesh (East)

    • Kerala

    • Gujarat

    • Madhya Pradesh

    • Punjab

    • Haryana

    With regards to the UAS License Agreement that TSPs are required to comply with, amended clause 41.10 specifies certain details about how the CMS functions. In particular, the amended clause mandates that TSPs in India will provide connectivity up to the nearest point of presence of the MPLS (Multi Protocol Label Switching) network of the CMS at their own cost and in the form of dark optical fibre. From the MPLS network of the CMS onwards, traffic will be handled by the Government at its own cost. It is noteworthy that a Memorandum of Understanding (MoU) for MPLS connectivity has been signed with one of India's largest ISPs/TSPs: BSNL. In fact, Rs. 4.8 crore have been given to BSNL for interconnecting 81 CMS locations of the following License Service Areas:

    • Delhi

    • Mumbai

    • Haryana

    • Rajasthan

    • Kolkata

    • Karnataka

    • Chennai

    • Punjab

    Clause 41.10 of the UAS License Agreement also mandates that the hardware and software required for monitoring calls will be engineered, provided, installed and maintained by the TSPs at their own cost. This implies that TSP customers in India will likely have to pay for more expensive services, supposedly to “increase their safety”. Moreover, this clause mandates that TSPs are required to monitor at least 30 simultaneous calls for each of the nine designated law enforcement agencies. In addition to monitored calls, clause 41.10 of the UAS License Agreement also requires service providers to make the following records available to Indian law enforcement agencies:

    • Called/calling party mobile/PSTN numbers

    • Time/date and duration of interception

    • Location of target subscribers (Cell ID & GPS)

    • Data records for failed call attempts

    • CDR (Call Data Records) of Roaming Subscriber

    • Forwarded telephone numbers by target subscriber

    Interception requests from law enforcement agencies are provisioned by the CMS authority, which has access to the data intercepted by all TSPs in India and stored in a central database. As of June 2013, 80% of the CMS Physical Data Centre had been built.

    In short, the CMS replaces the existing manual system of interception and monitoring with an automated system, which is operated by TERM cells and implemented by the Centre for Development of Telematics. Training has been imparted to the following law enforcement agencies:

    • Intelligence Bureau (IB)

    • Central Bureau of Investigation (CBI)

    • Directorate of Revenue Intelligence (DRI)

    • Research & Analysis Wing (RAW)

    • National Investigation Agency (NIA)

    • Delhi Police

    And should we even be worried about the Central Monitoring System?

    Well, according to the brief material for the Honourable MOC and IT Press Briefing on 16th July 2013, we should not be worried about the Central Monitoring System. Over the last year, media reports have expressed fear that the Central Monitoring System will infringe upon citizens' right to privacy and other human rights. However, Indian authorities have argued that the Central Monitoring System will better protect the privacy of individuals and maintain their security due to the following reasons:

    1. The CMS will just automate the existing process of interception and monitoring, and all the existing safeguards will continue to exist

    2. The interception and monitoring of communications will continue to be in accordance with Section 5(2) of the Indian Telegraph Act, 1885, read with Rule 419A

    3. The CMS will enhance the privacy of citizens, because it will no longer be necessary to take authorisation from the nodal officer of the Telecom Service Providers (TSPs) – who comes to know whose and which phone is being intercepted

    4. The CMS authority will provision the interception requests from law enforcement agencies and hence, a complete check and balance will be ensured, since the provisioning entity and the requesting entity will be different and the CMS authority will not have access to content data

    5. A non-erasable command log of all provisioning activities will be maintained by the system, which can be examined anytime for misuse and which provides an additional safeguard
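The fifth safeguard – a "non-erasable" command log – is worth unpacking. One standard way to make a log tamper-evident is to hash-chain its entries, so that any silent edit or deletion breaks the chain on audit. The sketch below is my own illustration of that general technique; it is an assumption about how such a log could work, not a description of the actual CMS implementation.

```python
import hashlib

# Hypothetical tamper-evident log: each entry's digest chains the hash
# of the previous entry, so edits or deletions are detectable on audit.
# Illustrative only -- NOT the actual CMS command log design.

class CommandLog:
    def __init__(self):
        self._entries = []  # list of (entry_text, chained_digest) pairs

    def append(self, entry: str) -> None:
        prev = self._entries[-1][1] if self._entries else ""
        digest = hashlib.sha256((prev + entry).encode()).hexdigest()
        self._entries.append((entry, digest))

    def verify(self) -> bool:
        """Recompute the chain; any mismatch reveals tampering."""
        prev = ""
        for entry, digest in self._entries:
            if hashlib.sha256((prev + entry).encode()).hexdigest() != digest:
                return False
            prev = digest
        return True

log = CommandLog()
log.append("provision: agency=CBI target=+91XXXXXXXXXX")
log.append("deprovision: agency=CBI target=+91XXXXXXXXXX")
assert log.verify()

# Tampering with an earlier entry breaks the chain on the next audit.
log._entries[0] = ("provision: agency=CBI target=+91ZZZZZZZZZZ",
                   log._entries[0][1])
assert not log.verify()
```

Of course, a tamper-evident log is only as good as the independence of whoever audits it – which is exactly the question the next section raises.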

    While some of these arguments may potentially allow for better protections, I personally fundamentally disagree with the notion that a centralised monitoring system is something not to worry about. But let's start off by having a look at the above arguments.

    The first argument appears to imply that the pre-existing process of interception and monitoring was privacy-friendly or at least “a good thing” and that existing safeguards are adequate. As such, it is emphasised that the process of interception and monitoring will “just” be automated, while posing no real threat. I fundamentally disagree with this argument for several reasons. First of all, the pre-existing regime of interception and monitoring appears rather problematic because India lacks privacy legislation which could safeguard citizens from potential abuse. Secondly, the very interception which is enabled through various sections of the Information Technology (Amendment) Act, 2008, and the Indian Telegraph Act, 1885, potentially infringes upon individuals' right to privacy and other human rights.

    May I remind you of Section 69 of the Information Technology (Amendment) Act, 2008, which allows for the interception of all information transmitted through a computer resource and which requires users to assist authorities with the decryption of their data, if they are asked to do so, or face a jail sentence of up to seven years. The debate on the constitutionality of the various sections of the law which allow for the interception of communications in India is still unsettled, which means that the pre-existing interception and monitoring of communications remains an ambiguous matter. And so, while the interception of communications in general is rather concerning due to draconian sections of the law and due to the absence of privacy legislation, automating the process of interception does not appear reassuring at all. On the contrary, it seems like something along the lines of: “We have already been spying on you. Now we will just be doing it quicker and more efficiently.”

    The second argument appears inadequate too. Section 5(2) of the Indian Telegraph Act, 1885, states that the interception of communications can be carried out on the occurrence of a “public emergency” or in the interest of “public safety” when it is deemed “necessary or expedient” to do so under certain conditions which were previously mentioned. However, this section of the law does not mandate the establishment of the Central Monitoring System, nor does it regulate how and under what conditions this surveillance system will function. On the contrary, Section 5(2) of the Indian Telegraph Act, 1885, clearly mandates targeted surveillance, while the Central Monitoring System could potentially undertake mass surveillance. Since the process of interception is automated and, under clause 41.16 of the Unified License (Access Services) Agreement, service providers are required to provision at least 3,000 calls for monitoring to nine law enforcement agencies, it is likely that the CMS undertakes mass surveillance. Thus, it is unclear if the very nature of the CMS falls under Section 5(2) of the Indian Telegraph Act, 1885, which mandates targeted surveillance, nor is it clear that such surveillance is being carried out on the occurrence of a specific “public emergency” or in the interest of “public safety”. As such, the vagueness revolving around the question of whether the CMS undertakes targeted or mass surveillance means that its legality remains an equivocal matter.

    As for the third argument, it is not clear how bypassing the nodal officers of TSPs will enhance citizens' right to privacy. While it may potentially be a good thing that nodal officers will not always be aware of whose information is being intercepted, that does not guarantee that those who do have access to such data will not abuse it. After all, the CMS appears to be largely unregulated, and India lacks privacy legislation and other adequate legal safeguards. Moreover, by bypassing the nodal officers of TSPs, the opportunity for unauthorised requests to be rejected will cease to exist. It also implies an increased centralisation of intercepted data, which can potentially create a centralised point for cyber attacks. Thus, the argument that the CMS authority will monopolise the control over intercepted data does not appear reassuring at all. After all, who will watch the watchmen?

    While the fourth argument makes a point about separating the provisioning and requesting entities with regard to interception requests, it does not necessarily ensure complete checks and balances, nor does it eliminate the potential for abuse. The CMS lacks adequate legal backing, as well as a framework which would ensure that unauthorised requests are not provisioned. Thus, the recommended chain of custody for issuing interception requests does not necessarily guarantee privacy protections, especially since a legal mechanism for ensuring checks and balances is not in place.

    Furthermore, this argument states that the CMS authority will not have access to content data, but does not specify whether it will have access to metadata. What's concerning is that metadata can potentially be more useful for tracking individuals than content data, since it is ideally suited to automated analysis by a computer and, unlike content data, which shows what an individual says (which may or may not be true), metadata shows what an individual does. As such, metadata can potentially be more “harmful” than content data, since it can provide concrete patterns of an individual's interests, behaviour and interactions. Thus, the fact that the CMS authority might have access to metadata appears to undercut the argument that separating the provisioning and requesting entities will protect individuals' privacy.
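    To make concrete why metadata lends itself so well to automated analysis, here is a minimal illustrative sketch. The records and names are entirely hypothetical (this is not any actual CMS interface or data format); the point is simply that a handful of call records, with no content at all, already yields a behavioural profile:

```python
from collections import Counter
from datetime import datetime

# Hypothetical call-metadata records: (caller, callee, timestamp).
# No call content is needed to infer patterns of behaviour.
records = [
    ("alice", "doctor", "2013-06-03 09:15"),
    ("alice", "doctor", "2013-06-10 09:20"),
    ("alice", "journalist", "2013-06-10 22:41"),
    ("alice", "doctor", "2013-06-17 09:05"),
]

def profile(records, person):
    """Count whom a person contacts and at what hour of the day --
    a crude behavioural profile built purely from metadata."""
    contacts = Counter(callee for caller, callee, _ in records
                       if caller == person)
    hours = Counter(datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
                    for caller, _, ts in records if caller == person)
    return contacts, hours

contacts, hours = profile(records, "alice")
# The recurring contact (a doctor, weekly, at the same hour) and the
# late-night call to a journalist emerge without reading a single word.
print(contacts.most_common(1))
print(hours.most_common(1))
```

Even this toy example surfaces a weekly morning pattern and an outlier late-night contact; real analysis over millions of records simply scales the same counting up.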

    The final argument appears to provide some promise, since the maintenance of a command log of all provisioning activities could potentially ensure some transparency. However, it remains unclear who will maintain such a log, who will have access to it, who will be responsible for ensuring that unlawful requests have not been provisioned and what penalties will be enforced in cases of breaches. Without an independent body to oversee the process and without laws which predefine strict penalties for instances of misuse, maintaining a command log does not necessarily safeguard anything at all. In short, the above arguments in favour of the CMS, and in support of the notion that it enhances individuals' right to privacy, appear to be inadequate, to say the least.
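    For comparison, a genuinely tamper-evident command log is not technically difficult in principle. The sketch below shows the generic hash-chain pattern (this is my illustration, not anything specified for the CMS): each provisioning record is hashed together with the previous entry's hash, so that any later alteration of the log is detectable on verification. The agency names and fields are made up for the example.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_entry(log, entry):
    """Append a provisioning record whose hash chains to the previous
    entry, making any later alteration of earlier entries detectable."""
    prev = log[-1]["hash"] if log else GENESIS
    payload = json.dumps(entry, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    log.append({"entry": entry, "hash": digest})
    return log

def verify(log):
    """Recompute the chain; any edited entry breaks every hash after it."""
    prev = GENESIS
    for item in log:
        payload = json.dumps(item["entry"], sort_keys=True)
        if hashlib.sha256((prev + payload).encode()).hexdigest() != item["hash"]:
            return False
        prev = item["hash"]
    return True

log = []
append_entry(log, {"agency": "AgencyA", "target": "request-001"})
append_entry(log, {"agency": "AgencyB", "target": "request-002"})
assert verify(log)

log[0]["entry"]["agency"] = "AgencyC"  # tampering breaks the chain
assert not verify(log)
```

Of course, such a mechanism only has teeth if an independent body holds the verification keys and the law attaches penalties to a broken chain, which is precisely what is missing here.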

    In contemporary democracies, most people would agree that freedom is a fundamental human right. The right to privacy should be equally fundamental, since it protects individuals from abuse by those in power and is integral in ensuring individual liberty. India may literally be the largest democracy in the world, but it lacks privacy legislation which establishes the right to privacy, which guarantees data protection and which safeguards individuals from the potentially unlawful interception of their communications. And as if that is not enough, India is also carrying out a surveillance scheme which is largely unregulated. As such, it is highly recommended that India establishes a privacy law now.

    If we do the math, here is what we have: a country with extremely high levels of corruption, no privacy law and an unregulated surveillance scheme which lacks public and parliamentary debate prior to its implementation. All of this makes it almost impossible to believe that we are talking about a democracy, let alone the world's largest (by population) democracy! Therefore, if Indian authorities are interested in preserving the democratic regime they claim to be a part of, I think it would be highly necessary to halt the Central Monitoring System and to engage the public and the parliament in a debate about it.

    After all, along with our right to privacy, freedom of expression and other human rights...our right to freedom from suspicion appears to be at stake.

    How can we not be worried about the Central Monitoring System?


    The Centre for Internet and Society (CIS) is in possession of the documents which include the information on the Central Monitoring System (CMS) as analysed in this article, as well as of the draft Rule 419B under the Indian Telegraph Act, 1885.



    Video Games: A Case Study of a Cross-cultural Video Collaboration

    by Larissa Hjorth and Nishant Shah — last modified Jan 31, 2014 12:02 PM
    A new book focusing on Palestinian artists’ video, edited by Bashir Makhoul and published by Palestinian Art Court- al Hoash, 2013, includes a chapter co-authored by Larissa Hjorth and Nishant Shah.



    The rise of mobile media is heralding new forms of networked visualities. These visualities see place, politics and images entangled in new ways: what can be called ‘emplaced visuality’. In the images disseminated globally of citizen uprisings such as the Arab Spring, it was mobile phones that provided the frame and context for new forms of networked visual politics. In the growth of networked photo apps such as Instagram and Hipstamatic, how, when and why we represent a relationship between place and co-presence is changing. No longer the poorer cousin of professional cameras, camera phones have led the rise of do-it-yourself (DIY) aesthetics flooding mainstream and subcultural media cultures. In networked visuality contexts such as YouTube and Flickr, the aesthetic of what Burgess has called ‘vernacular creativity’[1] has become all-pervasive—so much so that even mainstream media borrows the DIY style.

    Now, with locative media added into the equation, these visualities are not only networked but also emplaced—that is, entangled within the temporal and spatial movements of everyday life.[2]

    Emplaced visualities represent a new relationship between place (as a series of what Doreen Massey calls ‘stories so far’)[3], co-presence, subjectivity and visuality. This phenomenon is impacting upon video art. In this chapter we reflect upon how mobile media visualities are impacting upon a sense of place and displacement. With the added dimension of Big Data and location-based services (like Google Maps and Facebook Places) now becoming part of the everyday informational circuits, how a sense of place and privacy is experienced and represented is changing. This phenomenon is apparent in the Palestinian cross-cultural video project called Al Jaar Qabla al Daar (The Neighbour before the House), as we will discuss in detail later in this chapter.

    With its history of displacement and diaspora, Palestine's role in contemporary art is increasingly becoming pivotal. This is especially the case with video art as a key medium for reflecting upon representations of place and movement. When we think of Palestinian video art the first artist we think of is Mona Hatoum. Hatoum was a pioneer in so many ways. In particular, she gave voice to Arab women. Her work unsettled the poetics of the everyday by evoking a sense of displacement and entanglement. While born in Beirut of Palestinian parents and then moving to London, she never identified as Lebanese. Despite never living in Palestine, Hatoum was like a number of Palestinian refugees in Lebanon post 1948 who were never able to gain Lebanese identity cards. Unsurprisingly, Hatoum's experiences of exile permeate her work. In particular, exile, politics and the body have played a key role. This is epitomised in her iconic Measures of Distance (1988), in which Hatoum superimposes images of her mother having a shower with letters from her mother written in Arabic.

    However, Hatoum is not the only artist representing the oeuvre of Palestinian video art. Over the last two decades—with the rise of mobile media affording easy accessibility to new media tools and networked contexts like YouTube—a new breed of video artists has arisen. An example is Navigations: Palestinian Video Art, 1988 to 2011 (curated as part of the Palestine Film Festival), which explored artists working in Palestine and the diaspora over nearly a quarter of a century. Unsurprisingly, motifs of diaspora and displacement feature throughout the fifteen works by Hatoum, Taysir Batniji, Manar Zoabi, Larissa Sansour and Khaled Jarrar, to name a few. In Navigations: Palestinian Video Art key themes include ‘mobility and fluidity: the virtual and the real, the past and the future, the spectacular and the quotidian, the near and the far’.

    Another example of an event promoting Palestinian video art is the /si:n/ Festival of Video Art & Performance, which consists of performances, video installations, lectures, talks and workshops in venues all over the West Bank and includes artists from all over the world. The name /si:n/ is meant to link the words ‘scene’ and ‘seen’, and the festival has been seen as creating an innovative context for video artists to share and collaborate in public venues. With themes such as ‘poetical revolution comes before political revolution’, the /si:n/ Festival provides a context that reflects upon exile and place in one of the most contested and political spaces, the West Bank. Beginning in 2009, the /si:n/ Festival became the first festival of video art in Palestine.

    Given this rich tapestry of video art emerging in Palestine, in this chapter we explore the relationship between emergent mobile visualities, diaspora and place through a specific project called The Neighbour before the House. A cross-cultural video collaboration between Indian artists Shaina Anand, Ashok Sukumaran and Nida Ghouse, with Palestinian and Israeli artists Mahmoud Jiddah, Shereen Brakat and Mahasen Nasser-Eldin, The Neighbour before the House is a video art project that explores quotidian practices of life in a ‘post-surveillance society’. The Neighbour before the House is set in the context of the much contested territories and the relentless re-occupation and re-appropriation of East Jerusalem. Working with cheap surveillance technologies which have become such a ubiquitous part of the landscape of East Jerusalem, the artists use a PTZ (pan-tilt-zoom) security camera to inquire into the affective dimensions of ‘mobile’ life in the time of turbulent politics. The images that they capture look at jest, memory, desire and doubt, as fragile conditions of trust and life shape the everyday experiences of the region. The camera is given to the residents of a neighbourhood torn asunder by political strife and conflicts, asking them to search for the nugget of truth or morsel of thickness in the otherwise familiar flatness of walls and closed doors, which have been completely depleted of all depth because of the increased distance in the social relations.

    The images eschew the tropes of traditional documentary making by adopting the grainy, lo-res, digital non-frame, DIY aesthetics constantly in search of an image that might become the site of meaning making, but increasingly only capturing the mundane, the inane, the opaque and the evanescent. The image leads the commentary. The live camera operator's interest and experience shape the image, rendering the familiar or the insignificant as hugely affective and evocative. The project further initiates a dialogue between the neighbours—both from across the contested zones, but also from across picket fences and walls of surveillance—by introducing the images to them, by inviting them to capture the images, and by instilling in them the narratives of hope, despair, nostalgia, memory, loss, love, and longing.

    The Neighbour before the House reflects upon the relationship between art, technologies of visual reproduction and political strife. Moving away from the documentary style that has been popular in capturing the ‘real’, The Neighbour before the House refigures temporality and spatiality through new affective and metaphorical tropes, playing with the tension between the presence of surveillance technologies and the familiarity of these images that breeds new conditions of life and living, trust and belonging, safety and threat, for people in Palestine. In the process, it introduces key questions about the role of the artist, the function of art, the form of video art practice, and the new negotiations that the digital video apparatus introduces to the art worlds, beyond the now mainstream ideas of morphing, digitisation, remixing etc.

    Moreover, The Neighbour before the House reflects upon a shift away from the dominant network society paradigm and towards more contingent and ambivalent micronarratives of camera phone practices. It toys with the DIY ‘banality’ aesthetics of camera phones in order to consider the ways in which place is overlaid with different types of information—electronic, geographic, psychological and metaphoric. On the one hand, The Neighbour before the House evokes network society metaphors. On the other hand, it suggests a move away from this paradigm and towards a politics of both ‘emplaced’ and displaced visuality. In order to discuss this transformation of the relationship between image, place and information from network society metaphors towards ‘emplaced’ visualities, we firstly describe The Neighbour before the House before reflecting upon a few key themes the project explores: that is, the movement from the networked society to emplaced visualities and the rise of the politics of the phoneur.

    The Neighbour before the House (2012): A case study

    As aforementioned, The Neighbour before the House is a collaborative video project between Indian, Israeli and Palestinian artists that appropriates, critically responds to and insightfully rearranges notions of art, politics and digital video technologies in its exploration of everyday practices of life in critical times in a networked post-surveillance society. The project equips eight Palestinian families from East Jerusalem to be in control of PTZ (pan-tilt-zoom) surveillance cameras mounted at strategic locations in the city and to observe the live feed on their TV sets, recording their reactions and live commentaries on what they see. Here Big Brother, and its contemporary Big Data, is inverted: everyday citizens are given the omnipresent eye. The project plays on the idea of the neighbour as a friendly eye, and on the moment when this watching shifts from benevolent to malevolent.

    As the artists write, ‘this footage shot with a security camera, takes us beyond the instrumental aspects of surveillance imaging, introducing us to the architecture of a deliberate and accelerated occupation of a city.’ Here the city is rendered into a cartography of informational circuits. Exploiting the conditions of networked spectacle, the project attempts to remap the real and the everyday through ‘inquisitiveness, jest, memory, fear, desire and doubt’. The artists use the surveillance cameras—symbols of suspicion and fear—to catalyse stories from Palestinians in different neighbourhoods about what can be seen: ‘messianic archeological digs; Israeli settlement activities; takeovers of Palestinian properties; the Old City, the Wall and the West Bank,’ among other mundane and marvellous details of living life in those precarious conditions.

    Through the inversion of the politics of surveillance from the Big Brother to the ubiquitous neighbour, The Neighbour before the House provides a rich, evocative and non-representational history of living in East Jerusalem. The networked media spectacles which have come to stand in for the complex geo-political struggles of the region are displaced. As the low-res cameras reduce the deep geography into an alien flatness on the TV screens, as the camera captures glimpses of what could have been, records traces of blurred movements which require discussions and debates about their possible meaning, and engages the families to communicate their hopes, fears, desires and doubts, the art project also signals to us the new forms, functions and role of video art. Rather than the media event or spectacle, The Neighbour before the House provides the micronarrative gestures of the everyday: the ways in which place is a tapestry of subjectivities and experiences, not just a media spectacle.

    As artist Shaina Anand mentions in an interview with Shah, this is a new kind of storytelling, where,

    … a lot of the practice actually removes the filmmaker, the director, the auteur, and also therefore the cameraman, and also the lens... and offers these possibilities and privileges— of this look and gaze and all—to the subjects themselves[4]

    And as the lens makes itself invisible, it also gives new importance to the apparatus of surveillance, seeing and its incorporation in our lives. As Florian Schneider mentions in the introduction to the project, the house upon which the camera is mounted itself becomes a tripod made of stones. Instead of thinking of the video apparatus as out there, the private conditions of the home, the histories of the family, their relationships with neighbours and communities that they have lost, and strangers that they have inherited, all become the defining circumstances of this new crisis.

    Borrowing from a Quranic saying, Al Jaar Qabla Al Daar, which is close to the idea of ‘loving thy neighbour’, the project explores how the presence of new digital video technologies establishes difference, distance, alienation, proximity, curiosity and surveillance, which is not merely a function of governmental structures but also a condition of gamification and everyday engagement for the families in East Jerusalem. For the artists, this also takes up another connotation, of ‘checking out your neighbour before you buy the house’, suggesting establishing bounded similarities to seek comfort. The edited footage shows how and when the users took control of the keyboard and joystick, panning, tilting and zooming the camera, watching the live feeds on their television sets as they speak live over the footage. These commentaries are as personal as they are affective. Sometimes the commentary leads the person to probe the image deeper, trying to find a meaning that can no longer be supported by the hyper-pixelated image on their screen, which instead becomes a site through which memories and interpretations are generated. What begins as a playful probe soon takes on sinister shades, as some generate narratives of loss and death. Others take the opportunity to spy on the new settlers who have sometimes taken over their older houses, wondering what changes they are making to what was their own. There is a sense of rawness and urgency, as they look back with fear and anger, but also with resignation, at the houses that they were evicted from, and the semblance of life that they can spot from their remote presence.

    The final five cuts that the artists produce give us a deep and evocative insight into geography, temporality, and the ways in which we can re-appropriate the network spectacle to look at things that are often forgotten, dropped out of, or rendered invisible in the neat and clean lines of network models and diagrams. The ‘footage’ quality of the probes, the long dwellings on insignificant images, and the panopticon nature of video as witness, video as spy, and video as affective engagement with territories and times that are lost, all give a new idea of what the future of video art could be like. Instead of offering a tired old Foucauldian critique of surveillance, The Neighbour before the House posits the question of ‘Who watches the watchman?’ in ways that are both startling and assuring.

    Visualising the Politics of the Network

    One of the key themes of The Neighbour before the House is the changing role of the network society—especially in an age of Big Data and location-based services (LBS)—whereby privacy and surveillance come to the forefront. The network society has often been cited as one of the defining frameworks of our heavily mediated times. From theorists such as Barry Wellman and Manuel Castells, the network metaphor has burgeoned in parallel with the all-pervasive rise of Information and Communication Technologies (ICTs) globally. According to Lee Rainie and Wellman in Networked, the ‘new social operating system of networked individualism liberates us from the restrictions of tightly knit groups.’[5]

    Rainie and Wellman argue that there has been a ‘triple revolution’: the rise of social networking, the capacity of the internet to empower individuals, and the always-on connectivity of mobile devices.[6]

    The ability of networks to explain a range of human personal and social relationships has afforded them great explanatory power, where everything (and hence, by association, everybody) can be understood and explained by the indexicalities and visual cartographies that networks produce. The network is simultaneously, and without any sense of irony, committed to both examining the sketchiness and producing the clarity of any phenomenon or relationality. The network presumes an externality which can be rich, chaotic and complex, and proposes tools and models through which that diverse and discrete reality can be rendered intelligible by producing visualisations.

    These visualisations are artefacts—in as much as all mapping exercises produce artefacts—and operate under the presumption of a benignity devoid of political interventions or intentions. The visualisations are non-representational, in the sense that they do not seek to reproduce reality but actually to understand it, thereby shaping the lenses and tools to unravel the real nature of the Real. In this function, the network visualisations are akin to art, attaining symbolic value and attempting to decode a depth that the network itself defies and disowns, simulating conditions of knowing and exploring, emerging as surrogate structures that stand in for the real. Thus the rich set of actions, emotions, impulses, traces, inspirations, catalysts, memories, etc. get reified as transactions which can be sorted in indices, arranged in databases, and presented as an abstract, symbolic and hyper-visual reality which can now be consumed, accessed and archived within the network, thus obfuscating the reality that it was premised upon.

    This phenomenon is what Shah calls the spectacle imperative of the network. Especially with the proliferation of ubiquitous image and video recording digital devices, this ability to create subjective, multiple, fractured spectacles that feed into the network’s own understanding of itself (rather than an engagement with a reality outside) has become the dominant aesthetic that travels from Reality TV programming to user generated content production on video distribution channels on the internet. This networked spectacle, without a single auteur or a concentrated intention—so the videos from the Arab Spring on YouTube, for example, range from small babies in prams to women forming barricades against a marching army, and from people giving out free food and water to acts of vandalism and petty thefts—has become the new aesthetic of video interaction, consumption and circulation. It invites an engagement, divesting our energies and attentions from the physical and the political, to the aesthetic and the discursive. Which is to say that when we consume these spectacles (or indeed, produce them, not necessarily only through the images but also through texts), we produce a parallel universe that demands that we understand the world ‘out there’ through these cultural artefacts which require an immense amount of decoding and meaning making. The network, in its turn, offers us better and more exhaustive tools of mining and sifting through this information, sorting and arranging it, curating and managing it, so that we build more efficient networks without essentially contributing to the on-the-ground action.

    This peculiar self-sustaining selfish nature of the network, to become the only reality, under the guise of attempting to explain reality, is perhaps the most evident in times and geographies of crises. Where (and when) the conditions of politics, circumstances of everyday survival, and the algebra of quotidian life becomes too precarious, too wearisome, too unimaginable to cope with, the network spectacle appears as both the tool for governance as well as the site of protest. Hence, the same technologies are often used by people on different sides of the crises, to form negotiations and get a sense of control, on a reality that is quickly eluding their lived experiences. Surveillance cameras storing an incredible amount of visual data, forming banal narratives of the everyday, appear in critical times and geographies as symbols of control and containment, by authorities that seek to establish their sovereignty over unpredictable zones of public life and dwelling. The gaze of the authority is often criss-crossed by the cell-phone, the webcam, the tiny recording devices of everyday life that people on the streets and in their houses use, to record the nothingness of the crisis, the assurance of normalcy and the need to look over the shoulder and beyond the house, to know that whether or not god is in the heavens, all is well with the world.

    The Place of the Visual: Towards a theory of emplaced visuality

    However, with the rise of mobile media and its micronarrative capacity, the politics of the network, and its relationship to a sense of place, changes. Far from eroding a sense of place in the growing unboundedness of home, mobile technologies reinforce the significance of locality.[7] Mobile media also signal a move away from earlier depictions of the network society. Through the growth in camera phone practices overlaid with location-based services, we see new forms of visuality that reflect changing relations between place and information. As everyday life becomes increasingly mobile—physically and technologically—place has become progressively more contested. As Rowan Wilken and Gerard Goggin note in Mobile Technologies and Place, place is one of the most contested, ambiguous and complex terms today.[8] Viewing it as unbounded and relational, Wilken and Goggin observe, ‘place can be understood as all-pervasive in the way that it informs and shapes everyday lived experience—including how it is filtered and experienced via the use of mobile technologies’.[9] As social geographer Doreen Massey notes, maps provide little insight into the complex elusiveness of place as a collection of ‘stories-so-far’:

    One way of seeing ‘places’ is as on the surface of maps… But to escape from an imagination of space as surface is to abandon also that view of place. If space is rather a simultaneity of stories-so-far, then places are collections of those stories, articulations within the wider power-geometries of space. Their character will be a product of these intersections within that wider setting, and of what is made of them… And, too, of the non-meetings-up, the disconnections and the relations not established, the exclusions. All this contributes to the specificity of place.[10]

    For anthropologist Sarah Pink, place is increasingly being mapped by practices of emplacement.[11] With location based media like Google Maps and geotagging becoming progressively part of everyday media practice, how place is imagined and experienced across geographic, psychological, online and offline spaces is changing. This impacts upon the role of ethnography and its relationship to geography and place. As Anne Beaulieu notes, ethnography has moved from co-location to co-presence.[12] In this shift, we see the role of ethnography to address the complex negotiations between online and offline spaces growing.

    In The Neighbour before the House, we are made to consider the changing role of visuality in how place is experienced and practised. By deploying a surveillant and multivalent gaze, The Neighbour before the House asks us to reconsider privacy and surveillance in an age of locative media. The rise of the network society has witnessed numerous tensions and ambivalences, especially around the relationship between agency, information and place. This is epitomised by second-generation camera phone practices, whereby the added layer of LBS makes recording where and when images were taken automatic by default. Whereas the first generation of camera phone practices showed gendered differences,[13] through LBS these differences take on new dimensions—particularly in terms of their potential ‘stalker’ elements.[14] While notions of privacy differ subject to socio-cultural context, LBS do provide more details about users and can thus expose them to stalking (Cincotta, Ashford, & Michael 2011).
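    To illustrate how automatic the ‘where and when’ layer is, the sketch below converts the GPS coordinates that a geotagging camera phone embeds in a photo's EXIF header into a decimal map location. It is plain arithmetic on hypothetical EXIF GPS fields; no specific imaging library or camera model is assumed, and the sample coordinates are merely illustrative (they fall roughly in Delhi):

```python
# EXIF stores latitude/longitude as (degrees, minutes, seconds) rationals
# plus a hemisphere reference tag ('N'/'S' for latitude, 'E'/'W' for
# longitude). Converting them to a decimal coordinate is trivial, which
# is why any image-sharing service can map a photo instantly.
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style degree/minute/second values to a signed
    decimal coordinate; southern and western hemispheres are negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

# Hypothetical EXIF GPS fields read from a geotagged photo:
lat = dms_to_decimal(28, 36, 50.0, "N")   # ~ 28.6139
lon = dms_to_decimal(77, 12, 32.0, "E")   # ~ 77.2089
print(lat, lon)
```

The point is that the photographer did nothing: the coordinate pair travels with the image by default unless geotagging is explicitly disabled.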

    The shift towards second generation camera phone images sees a movement away from networked towards emplaced visualities (Pink & Hjorth 2012; Hjorth 2013; Hjorth & Arnold 2013). On the one hand, this overlaying of the geographic with the social highlights that place has always mattered to mobile media (Ito 2002; Hjorth 2005). Far from eroding place, mobile media amplify the complexities of place as something lived and imagined, geographic and yet psychological. LBS enable mobile media users to create and convey more complex details about a locality. On the other hand, LBS create new motivations for narrating a sense of place and the role of amateur and vernacular photography.

    Shifts in contemporary amateur photography highlight the changes in how place, co-presence and information are navigated, performed and represented. These issues are particularly prevalent in contested locations like Palestine. Last century it was the Kodak camera that epitomized amateur photography and played an important role in normalizing notions of the family as well as ritualizing events such as holidays.[15]

    As Lisa Gye notes, personal photography is central to the process of identity formation and memorialization.[16] The shift towards camera phones not only changes how we capture, store, and disseminate images but also has ‘important repercussions for how we understand who we are and how we remember the past’.[17]

    Moreover, with the rise in possibilities for sharing via social media like microblogs and Twitter, camera phone photography not only magnifies UCC, but also provides filters and lenses to enhance the “professional” and “artistic” dimensions of the photographic experience.[18]

    For Daniel Palmer, smartphone photography is distinctive in various ways, with one key feature being the relationship between touch and the image in what he calls an “embodied visual intimacy” (2012: 88). With the rise of high-quality camera phones, along with the growth in distribution services via social and locative media, new forms of visuality are emerging (Pink & Hjorth 2012). As the added dimensions of movement and touch become important features of the camera phone, the emphasis is shifting from networked to “emplaced” visuality. Images are emplaced in relation to what human geographer Tim Ingold has called a “meshwork”, an entanglement of lines (2008). Images themselves are part of such lines, as they are inextricable from the camera and the person who took them. In this sense camera phone images are not simply about what they represent (although they are also about that) but are additionally about what is behind, above, below, and to either side.

    By using different smartphone photo apps, respondents tried to inscribe a sense of place with emotion. This practice is what anthropologist Sarah Pink identifies as the “multisensorality of images.” That is, they are located in “the production and consumption of images as happening in movement, and consider them as components of configurations of place” (Pink 2011: 4). Drawing on Tim Ingold’s conceptualization of place as “entanglement” (Ingold 2008), Pink notes, “Thus, the ‘event’ where photographs are produced and consumed becomes not a meeting point in a network of connections but an intensity of entangled lines in movement… a meshwork of moving things” (Pink 2011: 8).

    While the surveillant eye of Big Brother now takes the form of Big Data, the emplaced nature of camera phone images can contribute to a changing relationship between performativity, memory and place that is user-orientated. Rather than operating to memorialize place, camera phone practices, especially through LBS networks, are creating playful performances around the movement of co-presence, place and placing (Richardson & Wilken 2012). As noted elsewhere, Pink and Hjorth argue that camera phone practices highlight a move away from the network society towards emplaced visualities and socialities (2012). Emplaced visuality means understanding camera phone practices, and the socialities that create and emerge through them, in ways corresponding with non-representational (Thrift, 2008) or ‘more-than-representational’ approaches in geography, which, according to Hayden Lorimer, encompass:

    … how life takes shape and gains expression in shared experiences, everyday routines, fleeting encounters, embodied movements, precognitive triggers, affective intensities, enduring urges, unexceptional interactions and sensuous dispositions (Lorimer, 2005: 84).

    Thus we see camera phone photography as a part of the flow of everyday life, an increasingly habitual way of being that is sensed and felt (emotionally and physically). Yet, because camera phone photography involves the production and sharing of images, it also compels us to engage with the relationship between the representational and the non-representational. Emplaced visualities see images as embedded within the movements of everyday life. Tim Cresswell has suggested that we consider ‘three aspects of mobility: the fact of physical movement—getting from one place to another; the representations of movement that give it shared meaning; and, finally, the experienced and embodied practice of movement’ (Cresswell, 2010: 19). These three aspects of mobility are deeply interwoven and entangled. In camera phone photography, experience and representation are enacted in the ‘flow’ of everyday life, at the interface where digital and material realities come together. These emplaced visualities are often abstracted through the mechanics of Big Data mega-surveillance. But as The Neighbour before the House demonstrates, the perpetual movement of emplaced visualities stands in sharp contrast with the unmoving, omnipresent Big Data eye.

    This contrast between the moving and the unmoving, between micro and macro information overlaid onto place, is also reflected in the shift from the flâneur to the phoneur. The notion of mobility—as a technology, cultural practice, geography and metaphor—has shaped the ways in which twenty-first-century cartographies of the urban play out. Through the trope of mobility, and immobility, rather than overcoming all difference and distance, the significance of the local is reinforced. While nineteenth-century narrations of the urban were symbolised by the visual economics of the flâneur, the twenty-first-century wanderer of the informational city has been rendered what Robert Luke calls the phoneur.[19] The conceptual distance, and yet continuum, between the flâneur and the phoneur is marked by a paradigmatic shift: the urban was once a geospatial image of, and for, the bourgeoisie, whereas for the phoneur the city is transformed into an informational circuit in which the person is a mere node with little agency. Beyond dystopian narrations about the role of technology in maintaining a sense of intimacy, community and place, we can find various ways in which the tenacity of the local retains control. In particular, through the tension between mobile media and Big Data, we can see how the local and the urban can be re-imagined in new ways.

    The flâneur (or the wanderer of the modern city), best encapsulated by German philosopher Walter Benjamin’s discussion of Baudelaire’s poetry, has been defined as an important symbol of Paris and modernity as it moved into nineteenth-century urbanity. Thanks to Baron Haussmann’s restructuring of a third of the small streets into boulevards, the Paris of the nineteenth century took on a new sense of place and space.

    Luke’s phoneur, on the other hand, is the ‘user’ as part of the informational network flows constituting contemporary urbanity. If the flâneur epitomised modernism and the rise of the nineteenth-century urban, then for Luke the phoneur is the twenty-first-century extension of this tradition as the icon of modernity. As Luke observes, in a networked city one is connected as part of a circuit of information in which identity and privacy are at the mercy of the system. The picture Luke paints of the urban city today is one in which individuals have minimal power amid the rise of corporate surveillance.

    The Neighbour before the House problematises Luke’s dystopian view of the phoneur. The picture it paints is much more ambivalent. Nonetheless, it makes the audience reflect upon the changing nature of surveillance in an age of Big Data.[20]

    These tensions between the dystopian phoneur and a more embodied and emplaced version of it run as an undercurrent through The Neighbour before the House.

    Conclusion

    In this chapter we have explored the cross-cultural video collaboration, The Neighbour before the House, to consider the changing relationship between a sense of place, information and the politics of visuality. As we have suggested, with the rise of location-based camera phone practices and Big Data we are seeing new forms of visuality that are best described as emplaced rather than networked. The notion of emplaced reflects some of the tensions around contemporary representations of mobility and movement, particularly prevalent in the often displaced and diasporic experiences of Palestine.

    Filmed in Palestine, The Neighbour before the House explores the notion of place as entangled and embedded at the same time as displaced through the rise of ICTs. By laying out some of the paradoxes and ambivalences surrounding contemporary media practices and their relationship between information and place, it opens a space for reflection and contemplation about surveillance and privacy.


    [1]. Jean Burgess, Vernacular creativity and new media (Doctoral dissertation), 2007. Retrieved from http://eprints.qut.edu.au/16378/

    [2]. Sarah Pink and Larissa Hjorth, ‘Emplaced Cartographies: Reconceptualising Camera Phone Practices in an Age of Locative Media’, Media International Australia, 145 (2012): 145-156.

    [3]. Doreen Massey

    [4]. Shaina Anand interviewed by Nishant Shah, December 2012.

    [5]. Lee Rainie and Barry Wellman, Networked, Cambridge, Mass., MIT Press, 2012.

    [6]. Ibid.

    [7]. Mizuko Ito, ‘Mobiles and the Appropriation of Place’, Receiver 8, 2002 (consulted 5 December 2012) http://academic.evergreen.edu/curricular/evs/readings/itoShort.pdf; Larissa Hjorth, ‘Locating Mobility: Practices of Co-Presence and the Persistence of the Postal Metaphor in SMS/MMS Mobile Phone Customization in Melbourne’, Fibreculture Journal, 6, 2005 (consulted 10 December 2006) http://journal.fibreculture.org/issue6/issue6_hjorth.html.

    [8]. Rowan Wilken and Gerard Goggin, ‘Mobilizing Place: Conceptual Currents and Controversies’, in R. Wilken and G. Goggin (Eds) Mobile Technology and Place, New York, Routledge, 2012, pp. 3-25 (5).

    [9]. Ibid 6.

    [10]. Doreen Massey, For Space, London, Sage, 2005 (130).

    [11]. Sarah Pink, Doing Sensory Ethnography, London, Sage, 2009.

    [12]. Anne Beaulieu, ‘From Co-location to Co-presence: Shifts in the Use of Ethnography for the Study of Knowledge’. Social Studies of Science, 40 (3) 2010: June. 453-470.

    [13]. Dong-Hoo Lee, ‘Women’s creation of camera phone culture’. Fibreculture Journal 6, 2005, URL (consulted 3 February 2006) http://www.fibreculture.org/journal/issue6/issue6_donghoo_print.html; Larissa Hjorth, ‘Snapshots of almost contact’. Continuum, 21 (2) 2007: 227-238.

    [14]. Alison Gazzard, ‘Location, Location, Location: Collecting Space and Place in Mobile Media’. Convergence: The International Journal of Research into New Media Technologies, 17 (4) 2011: 405-417.

    [15]. Lisa Gye, ‘Picture this: the impact of mobile camera phones on personal photographic practices,’ Continuum: Journal of Media & Cultural Studies 21(2) 2007: 279–288.

    [16]. Ibid 279.

    [17]. Ibid 279.

    [18]. Søren Mørk Petersen, Common Banality: The Affective Character of Photo Sharing, Everyday Life and Produsage Cultures, PhD Thesis, ITU Copenhagen.

    [19]. Robert Luke, ‘The Phoneur: Mobile Commerce and the Digital Pedagogies of the Wireless Web’, in P. Trifonas (ed.) Communities of Difference: Culture, Language, Technology, pp. 185-204, Palgrave, London, 2006.

    [20]. Sites such as www.pleaserobme.com, that seek to raise awareness about over-sharing of personal data, highlight not only the localised nature of privacy but also that privacy is something we do rather than something we possess.
