Blog


Interview with Mr. Billy Hawkes - Irish Data Protection Commissioner

by Maria Xynou last modified Jul 12, 2013 11:06 AM
Maria Xynou recently interviewed Mr. Billy Hawkes, the Irish Data Protection Commissioner, at CIS's fourth Privacy Round Table meeting. Watch this interview to gain insight into his recommendations for data protection in India!


This research was undertaken as part of the 'SAFEGUARDS' project that CIS is carrying out with Privacy International and IDRC.


The Irish Data Protection Commissioner was asked the following questions:

1. What powers does the Irish Data Protection Commissioner's office have? In your opinion, are these sufficient? Which powers have been most useful? If there is a lack, what do you feel is needed?

2. Does your office differ from other EU data protection commissioner offices?

3. What challenges has your office faced? What is the most common type of privacy violation that your office has faced?

4. Why should privacy legislation be enacted in India?

5. Does India need a Privacy Commissioner? Why? If India creates a Privacy Commissioner, what structure / framework would you suggest for the office?

6. How do you think data should be regulated in India? Do you support the idea of co-regulation or self-regulation?

7. How can India protect its citizens' data when it is stored in foreign servers?

 


Interview with the Citizen Lab on Internet Filtering in India

by Maria Xynou last modified Jun 26, 2013 09:47 AM
Maria Xynou recently interviewed Masashi Crete-Nishihata and Jakub Dalek from the Citizen Lab on internet filtering in India. Watch this interview to gain insight into Netsweeper and FinFisher!

A few days ago, Masashi Crete-Nishihata (research manager) and Jakub Dalek (systems administrator) from the Citizen Lab visited the Centre for Internet and Society (CIS) to share their research with us.

The Citizen Lab is an interdisciplinary laboratory based at the Munk School of Global Affairs at the University of Toronto, Canada. The OpenNet Initiative is one of the Citizen Lab's ongoing projects which aims to document patterns of Internet surveillance and censorship around the world. OpenNet.Asia is another ongoing project which focuses on censorship and surveillance in Asia.

The following video features an interview with Masashi Crete-Nishihata and Jakub Dalek, addressing the following questions:

1. Why is it important to investigate Internet filtering around the world?

2. How high are the levels of Internet filtering in India, in comparison to the rest of the world?

3. "Censorship and surveillance of the Internet aim at tackling crime and terrorism and in increasing overall security." Please comment.

4. What is Netsweeper and how is it being used in India? What consequences does this have?

5. What is FinFisher and how could it be used in India?



Report on the 4th Privacy Round Table meeting

by Maria Xynou last modified Jul 12, 2013 11:04 AM
This report provides an overview of the discussions and recommendations of the fourth Privacy Round Table in Mumbai, held on 15 June 2013.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is carrying out with Privacy International and IDRC.


In furtherance of multi-stakeholder Internet governance initiatives and dialogue in 2013, the Centre for Internet and Society (CIS), in collaboration with the Federation of Indian Chambers of Commerce and Industry (FICCI) and the Data Security Council of India (DSCI), is holding a series of six multi-stakeholder round table meetings on "privacy" from April 2013 to August 2013. The CIS is undertaking this initiative as part of its work with Privacy International UK on the SAFEGUARDS project.

In 2012, the CIS and DSCI were members of the Justice AP Shah Committee, which produced the "Report of the Group of Experts on Privacy". The CIS has recently drafted a Privacy (Protection) Bill 2013, with the objective of contributing to privacy legislation in India. The CIS has also volunteered to champion the sessions/workshops on "privacy" at the Internet Governance meeting proposed for October 2013.

At the roundtables, the Report of the Group of Experts on Privacy, DSCI's paper on "Strengthening Privacy Protection through Co-regulation" and the text of the Privacy (Protection) Bill 2013 will be discussed. The discussions and recommendations from the six round table meetings will be presented at the Internet Governance meeting in October 2013.

The dates of the six Privacy Round Table meetings are listed below:

  1. New Delhi Roundtable: 13 April 2013

  2. Bangalore Roundtable: 20 April 2013

  3. Chennai Roundtable: 18 May 2013

  4. Mumbai Roundtable: 15 June 2013

  5. Kolkata Roundtable: 13 July 2013

  6. New Delhi Final Roundtable and National Meeting: 17 August 2013

Following the first three Privacy Round Tables in Delhi, Bangalore and Chennai, this report provides an overview of the discussions and recommendations of the fourth Privacy Round Table meeting in Mumbai, held on 15 June 2013.

Discussion of the Draft Privacy (Protection) Bill 2013

Discussion of definitions: Chapter 1

The fourth Privacy Round Table meeting began with a discussion of the definitions in Chapter 1 of the draft Privacy (Protection) Bill 2013. In particular, it was stated that in India the courts hold that the right to privacy derives indirectly from the right to liberty, which is guaranteed under Article 21 of the Constitution. However, this provision is inadequate to safeguard citizens from potential abuse, as it does not protect their data adequately. Thus, all the participants in the meeting agreed with the initial premise that India needs privacy legislation which explicitly regulates data protection, the interception of communications and surveillance within India. To this end, the participants began a thorough discussion of the definitions used in the draft Privacy (Protection) Bill 2013.

It was specified at the beginning of the meeting that the definition of personal data in the Bill applies to natural persons and not to juristic persons. A participant argued that the Information Technology Act refers to personal data and that the draft Privacy (Protection) Bill 2013 should be harmonised with existing rules. This was countered by a participant who argued that the European Union considers the Information Technology Act inadequate for protecting personal data in India and that, since India has not been granted data protection adequacy, the Bill and the IT Act should not be harmonised.

Other participants argued that all other relevant Acts should be referred to in the discussion so that the Bill does not overlap with existing provisions in other rules, such as the IT Act. This was supported by the notion that the Bill should not clash with existing legislation, but it was dismissed with the argument that this Bill, if enacted into law, would override all other competing legislation. In India, special laws override general laws, and this would be a special law for the specific purpose of data protection.

The definition of sensitive personal data includes biometric data, political affiliation and past criminal history, but does not include ethnicity, caste, religion, financial information and other such information. It was argued that one of the reasons why such categories are excluded from the definition of sensitive personal data is that the government requests such data on a daily basis and is not willing to bear any additional expense to protect it. It was stated that the Indian government has argued that such data collection is necessary for the caste census, and that financial information, such as credit data, should not be included in the definition of sensitive personal data because specific credit legislation in India already deals with how credit data should be used, shared and stored.

Such arguments met with strong pushback from participants, who argued that definitions are crucial because they are the "building blocks" of the entire Bill, and that ethnicity, caste, religion and financial information should not be excluded from the Bill, as they constitute information which is sensitive within the Indian context. In particular, some participants argued that the Bill would be seriously questioned by countries with strong privacy legislation, as certain categories of information, such as ethnicity and caste, are clearly considered sensitive personal information within India. The argument that protecting such personal data would be too much of a bureaucratic and financial burden for the Indian government was countered by participants who argued that, in that case, the government should not be collecting that information to begin with, if it cannot provide adequate safeguards.

The debate on whether ethnicity, religion, caste and financial information should be included in the definition of sensitive personal data continued, with a participant arguing that no cases of discrimination based on such data have been reported and that it is therefore not essential for such information to be included in the definition. This argument was strongly countered by participants who argued that the mere fact that the government is interested in this type of information implies that it is sensitive, and that the reasons behind the government's interest in this information should be investigated. Furthermore, some participants argued that a new provision for data on ethnicity, religion, caste and financial information should be included, and noted that there is a difference between voluntarily handing over such information and being forced to hand it over.

The inclusion of passwords and encryption keys in the definition of sensitive personal data was highly emphasized by several participants, especially since their disclosure can potentially lead to unauthorised access to volumes of personal data. It was argued that private keys in encryption are extremely sensitive personal data and should definitely be included within the Bill.

In light of the NSA leaks on PRISM, several participants raised the issue of Indian authorities protecting data stored on foreign servers. In particular, some participants argued that the Bill should include provisions for data stored on foreign servers in order to prevent breaches by international third parties. However, a participant argued that although Indian companies are subject to the law, foreign data processors cannot be subject to Indian law, which is why they should instead provide guarantees through contracts.

Several participants strongly argued that the IT industry should not be subject to some of the privacy principles included in the Report of the Group of Experts on Privacy, such as the principle of notice. In particular, they argued that customers choose to use specific services and that by doing so, they trust companies with their data; thus the IT industry should not have to comply with the principle of notice and should not have to inform individuals of how they handle their data.

On the issue of voluntary disclosure of personal data, a participant argued that, apart from the NPR and the UID, Android and Google are conducting the largest data collection within India, and that citizens should have the standing to go to court and seek access to that data. The issue of data collection was discussed further in the subsequent sessions.

Right to Privacy: Chapter 2

The discussion of the right to privacy, as entailed in chapter 2 of the draft Privacy (Protection) Bill 2013, started with a participant stating that governments own the data citizens hand over to them and that this issue, along with freedom from surveillance and illegal interception, should be included in the Bill.

Following the distinction between exemptions and exceptions to the right to privacy, a participant argued that although it is clear that the right to privacy applies to all natural persons in India, it is unclear whether it also applies to organizations. This was clarified by a participant who argued that chapter 2 clearly protects natural persons, while preventing organisations from interfering with this right. Other participants argued that the language used in the Bill should be more gender neutral and that the term "residential property" within the exemptions to the right to privacy should be broadened to also include other physical spaces, such as shops. On this note, a participant argued that the word "family" within the exemptions should be defined more precisely, especially since in many cases husbands have controlled their wives once they have had access to their personal accounts.

The definition of "natural person" was discussed, and a participant raised the question of whether data protection applies to persons who have undergone surgery and changed their sexual orientation; it was recommended that such provisions be included within the Bill. These questions were answered by a participant who argued that the generic European definitions of "natural person" and "family" could be adopted, and that CCTV cameras used in public places, such as shops, should be subject to the law, because they are used to monitor third parties.

Other participants suggested that commercial violations should not be excluded from the Bill, as the broadcasting of footage of people, for example, can potentially lead to a violation of the right to privacy. In particular, it was argued that commercial establishments should not be included in the exemptions to the right to privacy, in contrast to other arguments that were in favour of this. Furthermore, participants argued that the interaction between transparency and freedom of information should be carefully examined and that the exemptions to the right to privacy should be drafted accordingly.

Protection of Personal Data: Chapter 3

Some of the most important discussions in the fourth Privacy Round Table meeting revolved around the protection of personal data.

Collection of personal data

The discussion on the collection of personal data began with a statement that individual consent prior to data collection is essential and that, in every case, the data subject should be informed of the collection, processing, sharing and retention of his or her data.

It was pointed out that, unlike most privacy laws around the world, this Bill is affirmative because it states that data can only be collected once the data subject has provided prior consent. It was argued that if this Bill was enacted into law, it would probably be one of the strictest laws in the world in terms of data collection, because data can only be collected with individual consent and a legitimate purpose. Data collection in the EU is not as strict, as there are some exemptions to individual consent; for example, if someone in the EU has a heart attack, other individuals can disclose his or her information. It was emphasized that as this Bill limits data collection to individual consent, it does not serve other cases when data collection may be necessary but individual consent is not possible. A participant pointed out that, although the Justice AP Shah Report of the Group of Experts on Privacy states that “consent may not be acquired in some cases”, such cases are not specified within the Bill.

Another issue raised was that the Bill does not specify how individual consent would be obtained as a prerequisite to data collection. In particular, it remains unclear whether such consent would be acquired through documentation, a witness or some other means. It was therefore emphasized that the method for acquiring individual consent should be clearly specified within the Bill, especially since it is practically difficult to obtain consent from the large portion of the Indian population that lives below the poverty line.

A participant argued that data collection by private detectives and reality TV shows, as well as the collection of data on physical movement and location, should also be addressed in the Bill. Furthermore, other participants argued that specific exemptions for medical cases and for state collection of data directly related to the provision of welfare should be included in the Bill. Participants recommended that individuals should have the right to opt out of data collection carried out for the purpose of providing welfare programmes and other state-run programmes.

The need to define the term "legitimate purpose" was pointed out, in order to ensure that data is not breached when it is being collected. A participant recommended the introduction of a provision in the Bill for anonymising data in medical case studies, and it was pointed out that it is very important to define what type of data can be collected. In particular, it was argued that a large range of personal data is being collected in the name of "public health" and "public security" and that, in many cases, patients may provide misinformed consent, because they may think that the disclosure of their personal data is necessary when it actually is not. It was recommended that this issue be addressed and that the necessary provisions be included in the Bill.

In cases where data is collected for statistical purposes, individuals may not be informed that their data is being collected and may not provide consent. It was recommended that this issue, too, be addressed in the Bill. However, it was also pointed out that in many cases individuals may choose to use a service without being able to consent to the collection of their data, Android being one example. Thus it was argued that companies should be transparent about how they handle users' data and should require individuals' consent prior to data collection.

It was emphasized that governments have a duty of transparency towards their citizens, and that the fact that citizens are, in many cases, obliged to hand over their data without giving prior consent to how it will be used should be taken into consideration. In particular, it was argued that many citizens need to use specific services or welfare programmes and are therefore obliged to hand over their personal information. It was recommended that the Bill incorporate provisions obliging all services to acquire individual consent prior to data collection. However, an issue raised here was that companies often provide long and complicated contracts and policy documents which discourage individuals from reading them, and thus from providing informed consent; it was recommended that this issue be addressed as well.

Storage and destruction of personal data

The discussion on the storage and destruction of personal data began with a statement that different sectors should have different data retention frameworks. The proposal that a uniform data retention framework should not apply to all sectors was challenged by a participant who stated that the same data retention period should apply to all ISPs and telecom operators. Furthermore, it was added that regulators should specify the data retention period based on specific conditions and circumstances. This argument was countered by participants who argued that each sector should define its own data retention framework, depending on the many variables and factors which affect the collection and use of data.

In European laws, no specific data retention periods are established. In particular, European laws generally state that data should only be retained for a period related to the purpose of its collection. Hence it was pointed out that data retention frameworks should vary from sector to sector, as data, for example, may need to be retained longer for medical cases than for other cases. This argument, however, was countered by participants who argued that leaving the prescription of a data retention period to various sectors may not be effective in India.

Questions were raised about how data retention periods are defined, as well as about which parties should be authorised to define the various purposes for data retention. One participant recommended that a common central authority be established to help define the purpose of data retention and the retention period for each sector, as well as to ensure that data is destroyed once the retention period is over. Another participant recommended that a three-year data retention period apply to all sectors by default, with such periods subject to change in specific cases.

Security of personal data and duty of confidentiality

Participants recommended that the definition of “data integrity” should be included in Chapter 1 of the draft Privacy (Protection) Bill 2013. Other participants raised the need to define the term “adequacy” in the Bill, as well as to state some parameters for it. It was also suggested that the term “adequacy” could be replaced by the term “reasonable”.

One of the participants raised the issue of storing data in a particular format, then having to transfer that data to another format which could result in the modification of that data. It was pointed out that the form and manner of securing personal data should be specifically defined within the Bill. However, it was argued that the main problem in India is the implementation of the law, and that it would be very difficult to practically implement the draft Privacy (Protection) Bill in India.

Disclosure of personal data

The discussion on the disclosure of personal data began with a participant arguing that the level of detail disclosed within data should be specified within the Bill. Another participant argued that the privacy policies of most Internet services are very generic and that the Bill should prevent such services from publicly disclosing individuals' data. On this note, a participant recommended that a contract and a subcontract on the disclosure of personal data be put in place, in order to ensure that individuals are aware of what they are consenting to.

It was recommended that the Bill explicitly state that data should not be disclosed for any purpose other than the one for which the individual has provided consent. Data should only be used for its original purpose, and if the purpose for accessing the data changes along the way, consent from the individual should be acquired prior to the sharing and disclosure of that data. A participant argued that banks are involved in consulting and other advisory services which may also lead to the disclosure of data; all such cases, in which information is shared and disclosed to (unauthorised) third parties, should be addressed in the Bill.

Several participants argued that companies should be responsible for the data they collect and should not share or disclose it to unauthorised third parties without individuals' knowledge or consent. On this note, other participants argued that companies should be legally allowed to share data within a group of companies, as long as that data is not publicly disclosed. An issue raised by one of the participants is that online services, such as Gmail, usually acquire consent from customers through a single "click" on a lengthy document which is not only rarely read by customers, but which also only vaguely covers all the cases for which individuals would be providing consent. This creates the potential for abuse, as many specific cases which would require separate, explicit consent are not covered by this consent mechanism.

This argument was countered by a participant who stated that the focus should be on code operations for which individuals sign and provide consent, rather than on the law, because that would have negative implications on business. It was highlighted that individuals choose to use specific services and that by doing so they trust companies with their data. Furthermore, it was argued that the various security assurances and privacy policies provided by companies should suffice and that the legal regulation of data disclosure should be avoided.

Consent-based sharing of data should be taken into consideration, according to certain participants. The factor of “opt in” should also be included when a customer is asked to give informed consent. Participants also recommended that individuals should have the power to “opt out”, which is currently not regulated but deemed to be extremely important. Generally it was argued that the power to “opt in” is a prerequisite to “opt out”, but both are necessary and should be regulated in the Bill.

A participant emphasized the need to regulate phishing in the Bill and to ensure that provisions are in place which could protect individuals' data from phishing attacks. On the issue of consent when disclosing personal data, participants argued that consent should be required even for a second flow of data and for all subsequent flows of data. In other words, it was recommended that individual consent be acquired every time data is shared or disclosed. Moreover, it was argued that if companies decide to share data, to store it somewhere else or to disclose it to third parties years after its initial collection, the individual should have the right to be informed.

However, such arguments were countered by participants who argued that systems such as banks are very complex and do not always have a clear idea of where data flows. Thus, it was argued that in many cases companies are not in a position to control the flow of data, due to its lack of traceability, and hence cannot inform individuals every time their data is shared or disclosed.

Participants argued that the phrase “threat to national security” in section 10 of the Bill should be explicitly defined, because national security is a very broad term and its loose interpretation could potentially lead to data breaches. Furthermore, participants argued that it is highly essential to specify which authorities would determine if something is a threat to national security.

The discussion on the disclosure of personal data concluded with a participant arguing that section 10 of the Bill on the non-disclosure of information clashes with the Right to Information Act (RTI Act), which mandates the opposite. It was recommended that the Bill addresses the inevitable clash between the non-disclosure of information and the right to information and that necessary provisions are incorporated in the Bill.

Presentation by Mr. Billy Hawkes – Irish Data Protection Commissioner

The Irish Data Protection Commissioner, Mr. Billy Hawkes, attended the fourth Privacy Round Table meeting in Mumbai and discussed the draft Privacy (Protection) Bill 2013.

In particular, Mr. Hawkes stated that data protection law in Ireland was originally introduced for commercial purposes and that, since 2009, privacy has been a fundamental right in the European Union, whose law spells out the basic principles for data protection. Mr. Hawkes argued that India has successful outsourcing businesses, but that there is a concern that data is not properly protected. India has not been granted data protection adequacy by the European Union, mainly because the country lacks privacy legislation.

There is a civil society desire for better respect for human rights, and there is an industry desire to be considered adequate by the European Union and to attract more international customers. However, privacy and data protection are not covered adequately in the Information Technology Act, which is why Mr. Hawkes argued that the draft Privacy (Protection) Bill 2013 should be enacted in compliance with the principles of the Justice AP Shah Report of the Group of Experts on Privacy. Enacting privacy legislation would, according to Mr. Hawkes, be a prerequisite for India to be found adequate in data protection in the future.

The Irish Data Protection Commissioner referred to the current negotiations taking place in the European Union for the strengthening of the 1995 Directive on Data Protection, which is currently being revisited and which will be implemented across the European Union. Mr. Hawkes emphasized that it is important to have strong enforcement powers and to ask companies to protect data. In particular, he argued that data protection is good customer service and that companies should acknowledge this, especially since data protection reflects respect towards customers.

Mr. Hawkes highlighted that other common law countries, such as Canada and New Zealand, have achieved data protection adequacy and that India can potentially achieve it too. More and more countries around the world are seeking European adequacy. Privacy law in India would not only safeguard human rights; it is also good business and would attract more international customers, which is why European adequacy is important. In every outsourcing arrangement there needs to be a contract which states that the requirements of the data controller have been met. Mr. Hawkes emphasized that not being data adequate is a competitive disadvantage in the market, because most countries will not want their data outsourced to countries which are inadequate in data protection.

As a comment on previous arguments made in the meeting, it was pointed out that in Ireland, if companies and banks are not able to track the flow of data, they are considered to be behaving irresponsibly. Furthermore, Mr. Hawkes stated that data adequacy is a major reputational issue and that inadequacy in data protection is bad business. It is necessary to know where the responsibility for data lies, which party initially outsourced the data and how it is currently being used. Data protection is a fundamental right in the European Union, and when data flows outside the European Union, the same level of protection should apply. Thus non-EU countries should comply with data protection regulations, not only because data protection is a fundamental human right, but also because it is bad business not to do so.

The Irish Data Protection Commissioner also referred to the “Right to be Forgotten”, which is the right to be told how long data will be retained for and when it will be destroyed. This provides individuals some control over their data and the right to demand this control.

On the funding of data protection authorities, Mr. Hawkes stated that funding varies and that in most cases, including Ireland, the state funds the data protection authority. Data protection authorities across the European Union are substantially funded by their states and are allocated a budget every year which is supposed to cover all their costs. The Spanish data protection authority, however, is an exception, because a large share of its activities is funded by fines. The data protection authority in the UK (the ICO) is funded through registration fees paid by companies and other organizations.

When asked how many employees work in the Irish data protection commissioner's office, Mr. Hawkes replied that only thirty individuals are employed. Employees in the commissioner's office are responsible, for example, for overseeing the protection of the data of Facebook users. Facebook-Ireland is responsible for handling users' data outside of North America, and the commissioner's office conducted a detailed analysis to ensure that data is protected and that the company meets certain standards. Facebook's responsibility as a data controller is limited, as individuals using the service are normally covered by the so-called "household exemption", which puts them outside the scope of data protection law. The data protection commissioner conducts checks, writes reports and informs companies that if they comply with privacy and data protection requirements, they will be supported.

Data protection law in Ireland covers all organizations, without exception. Mr. Hawkes stated that the EU data protection commissioners meeting in the "Article 29" Working Party spend a significant amount of their time dealing with companies like Google and Facebook and with the question of whether they protect their customers' data.

The Irish Data Protection Commissioner recommended that India establish a data protection commission based on the principles included in the Justice AP Shah Report of the Group of Experts on Privacy. In particular, an Indian data protection commission would have to deal with a mix of audit inspections, complaints, greater involvement with sectors, transparency, accountability and liability under the law. Mr. Hawkes emphasized that codes of practice should be implemented and that the focus should not be on bureaucracy, but on accountability. It was recommended that India adopt an accountability approach, under which penalties apply when data is breached.

On the recent leaks about the NSA's surveillance programme, PRISM, Mr. Hawkes commented that he was not surprised. U.S. companies are required to give access to U.S. law enforcement agencies, and such access is potentially much looser in the European Union than in the U.S., because in the U.S. a court order is normally required to access data, whereas in the European Union that is not always the case. Mr. Hawkes stated that there needs to be constant questioning of the proportionality, necessity and utility of surveillance schemes and projects, in order to ensure that the right to privacy and other human rights are not violated.

Mr. Hawkes stated that the same privacy law should apply to all organizations and that India should work towards data adequacy over the next few years. The Irish Data Protection Commissioner is responsible for Facebook-Ireland, and European law is about protecting the rights of individuals whose data is handled by any organisation that comes under European jurisdiction, whether it is a bank or a company. Mr. Billy Hawkes emphasized that the focus in India should be on adequacy in data protection and on protecting citizens' rights.

Meeting conclusion

The fourth Privacy Round Table meeting featured a discussion of the draft Privacy (Protection) Bill 2013, and Mr. Billy Hawkes, the Irish Data Protection Commissioner, gave a presentation on data protection adequacy and on his thoughts on data protection in India. The discussion on the draft Privacy (Protection) Bill 2013 led to a debate on and analysis of the definitions used in the Bill, of chapter 2 on the right to privacy, and of data collection, data retention, data sharing and data disclosure. The participants provided a wide range of recommendations for the improvement of the draft Privacy (Protection) Bill, all of which will be incorporated in the final draft. The Irish Data Protection Commissioner, Mr. Billy Hawkes, stated that the European Union has not granted data adequacy to India because it lacks privacy legislation, and that data inadequacy is not only a competitive disadvantage in the market but also shows a lack of respect towards customers. Mr. Hawkes strongly recommended that privacy legislation in compliance with the Justice AP Shah report be enacted, to ensure that India can potentially achieve data adequacy in the future and that citizens' right to privacy and other human rights are guaranteed.

Open Letter to Prevent the Installation of RFID tags in Vehicles

by Maria Xynou last modified Jul 12, 2013 10:59 AM
The Centre for Internet and Society (CIS) has sent this open letter to the Society of Indian Automobile Manufacturers (SIAM) to urge them not to install RFID tags in vehicles in India.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is carrying out with Privacy International and IDRC.


This letter is with regard to the installation of Radio Frequency Identification (RFID) tags in vehicles in India.

On behalf of the Centre for Internet and Society, we urge you to prevent the installation of RFID tags in vehicles in India, as the legality, necessity and utility of RFID tags have not been adequately proven. Such technologies raise major ethical concerns, since India lacks privacy legislation which could safeguard individuals' data.

The proposed rule 138A of the Central Motor Vehicle Rules, 1989, mandates that RFID tags be installed in all light motor vehicles in India. However, section 110 of the Motor Vehicles Act (MV Act), 1988, does not bestow on the Central Government a specific empowerment to make rules with respect to RFID tags. Thus, the legality of the proposed rule 138A is in question, and we urge you not to proceed with an illegal installation of RFID tags in vehicles until the Supreme Court has clarified this issue.

The installation of RFID tags in vehicles is not only currently illegal, but also raises major privacy concerns. RFID tags yield locational information, and thus reveal an individual's whereabouts. This could lead to a serious invasion of the right to privacy, which is at the core of personal liberty and constitutionally protected in India. Moreover, the installation of RFID tags in vehicles is not in compliance with the privacy principles of the Report of the Group of Experts on Privacy, as, among other things, the architecture of RFID tags does not allow for consent to be taken from individuals for the collection, use, disclosure and storage of information generated by the technology.[1]

The Centre for Internet and Society recently drafted the Privacy (Protection) Bill 2013 – a citizen's version of a possible privacy legislation for India.[2] The Bill defines and establishes the right to privacy and regulates the interception of communications and surveillance, and would include the regulation of technologies like RFID tags. As this Bill has not been enacted into law and India lacks a privacy legislation which could safeguard individuals' data, we strongly urge you to not require the mandatory installation of RFID tags in vehicles, as this could potentially violate individuals' right to privacy and other human rights.

As the proposed rule 138A, which mandates the installation of RFID tags in vehicles, is currently illegal, and as India lacks privacy legislation which would regulate the collection, use, sharing, disclosure and retention of data, we strongly urge you to ensure that RFID tags are not installed in vehicles in India and to play a decisive role in protecting individuals' right to privacy and other human rights.

Thank you for your time and for considering our request.

Sincerely,

Centre for Internet and Society (CIS)

 

 

[1]. Report of the Group of Experts on Privacy: http://planningcommission.nic.in/reports/genrep/rep_privacy.pdf

[2]. Draft Privacy (Protection) Bill 2013: http://cis-india.org/internet-governance/blog/privacy-protection-bill-2013.pdf

The State is Snooping: Can You Escape?

by Snehashish Ghosh last modified Apr 29, 2019 03:09 PM
Blanket surveillance of the kind envisaged by India's Centralized Monitoring System achieves little, but blatantly violates the citizen's right to privacy; Snehashish Ghosh explores why it may be dangerous and looks at potential safeguards against such intrusion.

The Snowden Leaks have made it amply clear that the covert surveillance conducted by governments is no longer covert. Information by its very nature is prone to leaks. The discretion lies completely in the hands of the personnel handling your data or information. Whether it is through knowledge obtained by an intelligence analyst about the US Government conducting indiscriminate surveillance, or hackers infiltrating a secure system and leaking personal information, stored information has a tendency to come out in the open sooner or later.

This raises the question of whether, with the advancement of technology, we should entrust our personal information and data to computers. Should we have more stringent laws and procedural safeguards to protect our personal information? Of course, the broader question that remains is whether we have a 'Right to be Forgotten'.

Similar to PRISM in the US, India is implementing a Centralized Monitoring System (CMS) which would have the capability to conduct multiple privacy-intrusive activities, ranging from call data record analysis to location-based monitoring. Given the circumstances and the recent revelations by a whistleblower in the US, it is imperative to take a closer look at the surveillance technologies being deployed by India and to question what implications they might have in the future.

Technological shift and procedural safeguards
The need for procedural safeguards was brought to light in a Supreme Court case that arose after news reports surfaced about the tapping of politicians' phones by the CBI. The Court, while deciding the issue of phone tapping in People's Union of Civil Liberties v. Union of India (1996), observed that the Indian Telegraph Act, 1885 is an ancient piece of legislation which does not address the issue of telephone tapping. The Court therefore issued guidelines, which the Government implemented by inserting Rule 419A into the Indian Telegraph Rules, 1951. These procedural safeguards ensure that due process is followed by any law enforcement agency while conducting surveillance.

Section 5(2) of the Indian Telegraph Act, 1885 grants the Government the power to conduct surveillance on the occurrence of a public emergency or in the interest of public safety. Only if those conditions of public emergency or public safety exist, and only if the concerned authority is satisfied that it is expedient to issue an order for interception in the interest of "the sovereignty and integrity of India, the security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of an offence", is surveillance legitimate. The same was reaffirmed by the Supreme Court in the 1996 judgment on wire tapping.

Now, as the Government of India is planning to launch a new technology, the Centralized Monitoring System (CMS) which would snoop, track and monitor communication data flowing through telecom and data networks, the question arises: can we have procedural safeguards which would protect our right to privacy against technologies such as the CMS?

The key component of a procedural safeguard is human discretion; either a court authorization or an order from a high-ranking government official is necessary to conduct targeted surveillance, and the reasons for conducting surveillance have to be recorded in writing. This is the procedure ordinarily followed by law enforcement agencies before conducting any form of surveillance. However, with the computational turn, governments have resorted to practices which do away with human discretion: dragnet systems allow for blanket, suspicionless surveillance. Before getting to the problems in evolving a due process for systems like the CMS, it is imperative to examine the capabilities of the system.

Centralized Monitoring System and death of due process
The setting up of a CMS was conceptualized in India after the 2008 Mumbai attacks. It was further consolidated and found a place in the Report of the Telecom Working Group on the Telecom Sector for the Twelfth Five Year Plan (2012-2017). The Report was published in August 2011 and goes into the details of the CMS.


The Report indicates that the technology will cater to “the requirements of security management for law enforcement agencies for interception, monitoring, data analysis/mining, anti‐social‐networking using the country’s telecom infrastructure for unlawful activities.”

The CMS will also be capable of running algorithms for the interception of connection-oriented networks, algorithms for the interception of voice over internet protocol (VoIP) and video over IP, and GPS-based monitoring systems. These algorithms would be able to intercept any communication without any intervention from the telecom or internet service provider. The system would also have the capability to intercept and analyze data on any communication network, as well as to conduct location-based monitoring by tracking GPS locations. Given such capabilities, it is clear that a computer system will be sifting through internet and communication data and will conduct surveillance as instructed through algorithms. This would include identifying patterns, profiling, and also storing data for posterity. Moreover, the CMS will have direct access to the telecommunication infrastructure and would be monitoring all forms of communication.

With the introduction of CMS, state surveillance will shift to blanket surveillance from the current practice of targeted surveillance which can be carried out under specific circumstances that are well defined in the law and in judgments. Moreover, when it comes to current means of surveillance, there are well-defined procedures under the law which have the ability to prevent misuse of the surveillance systems. This is not to say that the current procedural safeguards under the laws are not prone to abuse, but if implemented properly, there is less chance of them being misused. Furthermore, with strong privacy and data protection laws, unlawful and illegal surveillance can be minimized.

Under the current legal framework with respect to surveillance, if the CMS is implemented it will be in violation of the fundamental rights to privacy and freedom of speech guaranteed under our Constitution. It will also be in contravention of the procedural safeguards laid down in the Supreme Court judgment and in Rule 419A of the Indian Telegraph Rules. Strong privacy laws and data protection laws may be put in place, which are completely absent now. But at the end of the day, a machine will be spying on every citizen of India, and on anyone using any communication service, without any specific targets or suspects.

In the People’s Union of Civil Liberties v. Union of India (1996), the Supreme Court laid down that “the substantive law as laid down in Section 5(2) of the [Indian Telegraph Act, 1885] must have procedural backing so that the exercise of power is fair and reasonable.” But with technologies such as CMS, it will be very difficult to have any form of procedural backing because the system would do away with human discretion which happens to be a key ingredient of any legal procedure.

The argument which can be made in favour of the CMS, if any, is that a machine will be going through personal data, that the data will not be available to any personnel or law enforcement agency without authorization, and that the system will therefore adhere to due process. However, such a system will be keeping track of all personal information. The right to privacy is the right to be left alone, and any incursion on this fundamental right can only be allowed in special cases, such as a public emergency or a threat to public safety. So electronic blanket surveillance without human intervention also amounts to a violation of the substantive law, which allows surveillance to be conducted only under certain conditions, and not through a system such as the CMS that is designed to keep a constant watch on everyone, irrespective of whether there is a need to do so.

Additionally, there exists a strong, pre-established notion that whatever comes out of a computer is bound to be true and authentic and that there cannot be any mistakes. We have witnessed this in the past, when an IT professional from Bangalore was arrested and detained by the Maharashtra Police for posting derogatory content about Shivaji on Orkut. Later, it was found that the records acquired from the Internet Service Provider were incorrect and that the individual had been arrested and detained illegally.

Telephone bills and credit card bills generated by a computer system are often held to be authentic and error-free. With the UID, our identity has been reduced to a number and to biometrics stored in a database corresponding to that number. It is this trust in anything which comes out of a computer or a machine that can lead to massive abuse of the system in the absence of any checks and balances. Machines taking control over human lives, and our almost unflinching trust in technology, will not only cause gross violations of privacy but will also be the death of due process and basic human rights as we know them.

In this regard, due emphasis should be given to the landmark Supreme Court judgment in the case of Maneka Gandhi v. Union of India (1978) which deals with issues related to due process and privacy. It states that "procedure which deals with the modalities of regulating, restricting or even rejecting a fundamental right falling within Article 21 has to be fair, not foolish, carefully designed to effectuate, not to subvert, the substantive right itself. Thus, understood, ‘procedure’ must rule out anything arbitrary, freakish or bizarre. A valuable constitutional right can be canalised only by canalised processes".

When machines and robots are deployed to conduct blanket surveillance and impinge on the most fundamental right to life and liberty and also violate the basic tenets of due process, then much cannot be done by way of procedures. What then do we resort to, is the primary question. Can there be a compromise between the right to privacy and security?

A no-win situation
In reality, dragnet or blanket surveillance is not very useful for gathering valuable intelligence to prevent threats to national security, public safety and public emergency. For example, even if the CMS mines data and analyses content related to anti-social activities with 99 per cent accuracy, the remaining 1 per cent of errors applies to the entire monitored population, and so the set of false positives is very large: roughly 1 out of every 100 innocent people screened could be wrongly flagged as an anti-social element. Because genuine suspects are only a tiny fraction of the population, the innocent people caught in the terrorist net can vastly outnumber the real ones, and if the false positive rate is higher than 1 per cent, that number grows higher still; the sketch below works through this arithmetic with illustrative numbers.
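
The base-rate problem can be sketched as follows; the population size, the number of genuine suspects and the error rates used here are purely illustrative assumptions, not figures drawn from the CMS or any official source.

```python
# Illustrative base-rate arithmetic for blanket surveillance.
# All figures below are hypothetical assumptions, not actual CMS numbers.

population = 900_000_000      # people whose communications are monitored
genuine_suspects = 9_000      # assume 1 in 100,000 is a genuine suspect
detection_rate = 0.99         # the system flags 99% of genuine suspects
false_positive_rate = 0.01    # the system wrongly flags 1% of innocent people

innocent = population - genuine_suspects
flagged_suspects = genuine_suspects * detection_rate
flagged_innocent = innocent * false_positive_rate

share_innocent = flagged_innocent / (flagged_innocent + flagged_suspects)

print(f"Genuine suspects flagged: {flagged_suspects:,.0f}")   # ~8,910
print(f"Innocent people flagged:  {flagged_innocent:,.0f}")   # ~9,000,000
print(f"Share of flagged people who are innocent: {share_innocent:.1%}")  # ~99.9%
```

Under these assumptions, more than 99.9 per cent of the people the system flags would be innocent, which is the sense in which even a "99 per cent accurate" dragnet drowns analysts in false positives.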

Even though blanket or dragnet surveillance can keep a tab on everyone, it is nearly impossible for an algorithm to separate the terrorists from the rest. Moreover, the data set collected by the machine is too big for any human analyst to actually analyze in order to identify the terrorist in the midst of a deluge of information. Therefore, the argument that a system like the CMS will ensure security in exchange for minor intrusions of privacy is a flawed one. Implementation of the CMS will not really ensure security, but it will be a blatant violation of individuals' right to privacy anyway.

What is perhaps more shocking is that not only will the CMS be futile in preventing security breaches or neutralizing security threats, it will on the contrary expose individual Indian citizens to breaches of personal security. If personal data and information are stored for future reference through a centralized mechanism, which is also the case with the UID, they will be highly susceptible to attacks and security threats. It will be a Pandora's box with the potential to create havoc the moment someone gains access to the information with the intention of misusing it. Leaking of personal information and data on a large scale can be detrimental to society and give rise to instances of public emergency.

The ‘Right to be Forgotten’

Currently, the European Union is engulfed in a debate on "Right to be Forgotten" laws. The Right to be Forgotten finds its origins in the French law le droit à l'oubli, or the right of oblivion, under which a convict who has served his sentence can object to the publication of the facts of his conviction and imprisonment or penalty. This law has found new meaning in the context of social media and the internet, where it would give us the right to have all our personal information deleted permanently. This is an important issue which India should debate and discuss, as we live in an era where privacy comes at a cost.

On the one hand, technology has made it easier to track, trace, monitor and snoop; on the other, it has also brought innovation in the field of encryption and anonymity tools. Encryption tools such as OpenPGP can secure information from third-party access, and the Tor Browser allows a user to surf the web anonymously. The use of such technologies should be encouraged, as there is no law which prohibits their use. If systems are being built to spy on us, it is better that we use technologies which protect our personal information from such surveillance systems.
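
As a rough illustration of how OpenPGP-style encryption protects message content, the minimal sketch below uses the python-gnupg wrapper; it assumes a local GnuPG installation and that the recipient's public key has already been imported into the keyring (the address shown is a placeholder, not a real key).

```python
# Minimal OpenPGP encryption sketch using the python-gnupg wrapper.
# Assumes GnuPG is installed and the recipient's public key is already
# imported; "recipient@example.com" is a placeholder, not a real key.
import gnupg

gpg = gnupg.GPG()  # uses the default GnuPG home directory

message = "Meet at the usual place at 6 pm."
encrypted = gpg.encrypt(message, "recipient@example.com")

if encrypted.ok:
    # The ASCII-armoured ciphertext can be sent over any channel;
    # only the holder of the matching private key can read it.
    print(str(encrypted))
else:
    print("Encryption failed:", encrypted.status)
```

Decryption on the recipient's side would use the corresponding private key; the point of the sketch is simply that the content, unlike bare communication metadata, never travels in the clear.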

SEBI and Communication Surveillance: New Rules, New Responsibilities?

by Kovey Coles last modified Jul 12, 2013 10:51 AM
In this blog post, Kovey Coles writes about the activities of the Securities and Exchange Board of India (SEBI), discusses the importance of call data records (CDRs), and throws light on the significant transition in governmental leniency towards access to private records.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is carrying out with Privacy International and IDRC.


Introduction

The Securities and Exchange Board of India (SEBI) is the country's securities and market regulator, an agency which investigates and seeks to combat market offenses such as insider trading. SEBI has received much media attention this month regarding its recent expansion of authority; the agency is reportedly on track to be granted powers to access telecom companies' CDRs. These CDRs are kept by telecommunication companies for billing purposes and contain information on who made a call, who received it, and how long it lasted, but they do not disclose the content of the call. Although SEBI has emphatically sought several new investigative powers since 2009 (including access to CDRs, surveillance of email, and monitoring of social media), India's Ministry of Finance only recently endorsed SEBI's plea for direct access to service providers' CDRs. This capability is not mentioned in SEBI's founding legislation. Very recently, however, the Ministry of Finance has decided to support an expansion of current legislation with regard to CDR access for SEBI, the Reserve Bank of India (RBI), and potentially other agencies, when it comes to the prevention of money laundering and other economic offenses.
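
To make the distinction between call metadata and call content concrete, here is a hypothetical, highly simplified sketch of the kind of fields a CDR might contain; the field names, and in particular the cell-tower field, are illustrative assumptions, not an actual operator or SEBI schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallDataRecord:
    """A hypothetical, simplified CDR: metadata only, no call content."""
    caller: str            # originating phone number
    callee: str            # receiving phone number
    start_time: datetime   # when the call was set up
    duration_seconds: int  # how long the call lasted
    cell_tower_id: str     # coarse caller location (illustrative field)

record = CallDataRecord(
    caller="+91XXXXXXXXXX",          # placeholder numbers
    callee="+91YYYYYYYYYY",
    start_time=datetime(2013, 6, 13, 10, 42),
    duration_seconds=187,
    cell_tower_id="MUM-0472",
)
# Note what is absent: the audio or words of the conversation itself.
```

Even without the conversation, a stream of such records reveals who talks to whom, when, for how long and roughly where, which is why the safeguards attached to CDR access matter.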

SEBI’s Authority (Until Now)

Established in 1992 under the Securities and Exchange Board of India Act, SEBI was created with the power of "registering and regulating the working of… [individuals] and intermediaries who may be associated with securities markets in any manner."[1] Its powers have included "calling for information from, undertaking inspection, conducting inquiries and audits of the intermediaries and self-regulatory organisations in the securities market."[2] Although the agency has held the responsibility to investigate records on market activity, it has never explicitly enjoyed a right to CDRs or other communications data. Now, with the intention of "meeting new challenges thrown forward by the technological and market advances,"[3] SEBI and the Ministry of Finance want to extend the agency's record-keeping scope and investigative powers to include CDR access, a form of communications surveillance.

But the ultimate question is whether agencies like SEBI need this type of easy access to records of communication.

What is the Importance of CDR Access?

Reports on SEBI's recent expansion are quick to stress that the agency is not looking for phone-tapping rights, which would allow interception of the content of telephone calls, but instead only seeks call records. CDRs, in effect, are "metadata", a sort of information about information. In this case, it is data about communications, but not the communications themselves. Currently, there are a total of nine agencies which are able to make actual phone-tapping requests in India. But when it comes to access to CDRs, the government seems much more generous in expanding the powers of existing agencies. SEBI, as well as the RBI and others, are all looking to be upgraded in their authority over CDRs. Experts argue, however, that "metadata and other forms of non-content data may reveal even more about an individual than the content itself, and thus deserves equivalent protection."[4] Therefore, a second crucial question is whether this sensitive CDR data will enjoy the same level of protection and safeguards which exist for communication interception.

One reason for the recent move on CDR access is that SEBI and RBI have found the process of obtaining CDRs too arduous and ill-defined.[5] Currently, under section 92 of the CrPC, Magistrates and Commissioners of Police can request a CDR only with a corresponding official first information report (FIR), while there exists no explicit guideline for SEBI’s role in the process of CDR acquisition.[6] Although the government may seek to relax this procedure, SEBI’s founding legislation prohibits investigation without “reasonable grounds," as stipulated in section 11C of the SEBI Act.[7] It has always stood that only on these reasonable grounds could SEBI begin inspection of an intermediary’s "books, registers, and other documents."[7] With the government creating a way for SEBI and similar agencies to circumvent the traditional procedures for access to CDRs, these new standards should incorporate safeguards to ensure the protection of individual privacy. Banking companies, financial institutions, and intermediaries are already obliged to maintain extensive records of transactions, clients, and other financial data under section 12 of the Prevention of Money-Laundering Act of 2002.[8] But books and records containing financial data differ greatly from communications data, which can include much more personal information and may therefore compromise individuals’ freedom of speech and expression, as well as the right to privacy.

Significance and Responsibility in this Decision

Judging from SEBI’s prior capabilities of inspection and inquiry, this change may initially seem only a minor expansion of the agency's power, but it actually represents a significant transition in governmental leniency toward access to private records. As mentioned, the Ministry of Finance's recent goal of extending rights to CDRs is resulting in amended powers for more agencies than SEBI alone. Moreover, this power expansion comes on the heels of the controversy surrounding America’s National Security Agency (NSA) amassing millions of CDRs and other datasets both domestically and internationally. There is obvious room for concern over Indian citizens’ call records being made more easily accessible, with fewer checks and balances in place. The benefit of the new policy is easier access to evidence which could incriminate those involved in financial crimes. But is that benefit actually worth giving SEBI the right to request citizens’ call records? In cases involving economic offenses, CDR access often amounts only to circumstantial evidence. With its ongoing battle against insider trading and other financial malpractice, crimes which are inherently difficult to prove, SEBI may aspire to grow progressively more omnipresent. But as the agency’s breadth expands, citizens’ right to privacy is simultaneously curtailed. Ultimately, the value of preventing economic offenses must be balanced against the value of the people’s right to privacy.


[1]. 1992 Securities and Exchange Board of India Act, section 11, part 2(b).

[2]. 1992 Securities and Exchange Board of India Act, section 11, part 2(i).

[3]. “Sebi Finalising new Anti-money laundering guidelines,” The Times of India, June 16, 2013

http://timesofindia.indiatimes.com/business/india-business/Sebi-finalizing-new-anti-money-laundering-guidelines/articleshow/20615014.cms

[4]. International Principles on the Application of Human Rights to Communications Surveillance - http://www.necessaryandproportionate.net/#_edn1

[5]. “Sebi to Soon Get Powers to Access Call Records,” Business Today, June 13, 2013

http://businesstoday.intoday.in/story/sebi-call-record-access/1/195815.html

[6]. 1973 Criminal Procedure Code, Section 92 http://trivandrum.gov.in/~trivandrum/pdf/act/CODE_OF_CRIMINAL_PROCEDURE.pdf

“Govt gives Sebi, RBI Access to Call Data Records,” The Times of India, June 14, 2013

http://articles.timesofindia.indiatimes.com/2013-06-14/india/39975284_1_home-ministry-access-call-data-records-home-secretary

[7]. 1992 Securities and Exchange Board of India Act, section 11C, part 8

[8]. 2002 Prevention of Money-Laundering Act, section 12

Privacy Round Table Kolkata

by Prasad Krishna last modified Jul 10, 2013 06:08 AM

Invite-Kolkata.pdf — PDF document, 1090 kB (1116261 bytes)

Way to watch

by Chinmayi Arun last modified Jul 01, 2013 10:17 AM
The domestic surveillance regime in India lacks adequate safeguards.

Chinmayi Arun's column was published in the Indian Express on June 26, 2013.


A petition has just been filed in the Indian Supreme Court, seeking safeguards for our right to privacy against US surveillance, in view of the PRISM controversy. However, we should also look closer home, at the Indian government's Central Monitoring System (CMS) and other related programmes. The CMS facilitates direct government interception of phone calls and data, doing away with the need to justify interception requests to a third party private operator. The Indian government, like the US government, has offered the national security argument to defend its increasing intrusion into citizens' privacy. While this argument serves the limited purpose of explaining why surveillance cannot be eliminated altogether, it does not explain the absence of any reasonably effective safeguards.

Instead of protecting our privacy rights from the domestic and international intrusions made possible by technological development, our government is working on leveraging technology to violate privacy with greater efficiency. The CMS infrastructure facilitates large-scale state surveillance of private communication, with very little accountability. The dangers of this have been illustrated throughout history. Although we do have a constitutional right to privacy in India, the procedural safeguards created by our lawmakers thus far offer us very little effective protection of this right.

We owe the few safeguards that we have to the intervention of the Supreme Court of India, in PUCL vs Union of India and Another. In the context of phone tapping under the Telegraph Act, the court made it clear that the right to privacy is protected under the right to life and personal liberty under Article 21 of the Constitution of India, and that telephone tapping would also intrude on the right to freedom of speech and expression under Article 19. The court therefore ruled that there must be appropriate procedural safeguards to ensure that the interception of messages and conversation is fair, just and reasonable. Since lawmakers had failed to create appropriate safeguards, the Supreme Court suggested detailed safeguards in the interim. We must bear in mind that these were suggested in the absence of any existing safeguards, and that they were framed in 1996, after which both communication technology and good governance principles have evolved considerably.

The safeguards suggested by the Supreme Court focus on internal executive oversight and proper record-keeping as the means to achieving some accountability. For example, interception orders are to be issued by the home secretary, and to later be reviewed by a committee consisting of the cabinet secretary, the law secretary and the secretary of telecommunications (at the Central or state level, as the case may be). Records are to be kept of details such as the communications intercepted and all the persons to whom the material has been disclosed. Both the Telegraph Act and the more recent Information Technology Act have largely adopted this framework to safeguard privacy. It is, however, far from adequate in contemporary times. It disempowers citizens by relying heavily on the executive to safeguard individuals' constitutional rights. Additionally, it burdens senior civil servants with the responsibility of evaluating thousands of interception requests without considering whether they will be left with sufficient time to properly consider each interception order.

The extreme inadequacy of this framework becomes apparent when it is measured against the safeguards recommended in the recent report on the surveillance of communication by Frank La Rue, the United Nations special rapporteur on the promotion and protection of the right to freedom of opinion and expression. These safeguards include the following: individuals should have the legal right to be notified that they have been subjected to surveillance or that their data has been accessed by the state; states should be transparent about the use and scope of communication surveillance powers, and should release figures about aggregate surveillance requests, including a break-up by service provider, investigation and purpose; and the collection of communications data by the state must be monitored by an independent authority.
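To illustrate the kind of aggregate reporting the special rapporteur recommends, the Python sketch below counts requests by service provider and by stated purpose without identifying any individual target. The request log and its labels are invented purely for the example.

```python
from collections import Counter

# Hypothetical log of interception requests: (service provider, stated purpose)
requests = [
    ("Provider A", "national security"),
    ("Provider A", "economic offence"),
    ("Provider B", "economic offence"),
]

by_provider = Counter(provider for provider, _ in requests)
by_purpose = Counter(purpose for _, purpose in requests)

# Aggregate figures of the sort a transparency report could publish.
print(dict(by_provider))  # {'Provider A': 2, 'Provider B': 1}
print(dict(by_purpose))   # {'national security': 1, 'economic offence': 2}
```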

The safeguards recommended by the special rapporteur would not undermine any legitimate surveillance by the state in the interests of national security. They would, however, offer far better means to ensure that the right to privacy is not unreasonably violated. The emphasis placed by the special rapporteur on transparency, accountability and independent oversight is important, because our state has failed to recognise that in a democracy, citizens must be empowered as far as possible to demand and enforce their rights. Their rights cannot rest completely in the hands of civil servants, however senior. There is no excuse for refusing to put these safeguards in place, and making our domestic surveillance regime transparent and accountable, in compliance with our constitutional and international obligations.

World Wide Rule

by Nishant Shah last modified Jul 01, 2013 10:26 AM
Nishant Shah's review of Schmidt and Cohen's book was published in the Indian Express on June 14, 2013.

Click to read the original published in the Indian Express here


Book: The New Digital Age
Author: Eric Schmidt & Jared Cohen
Publisher: Hachette
Price: Rs 650
Pages: 315


When I first heard that Eric Schmidt, the chairman of Google, and Jared Cohen, the director of the techno-political think-tank Google Ideas, were co-authoring a book about our future and how it is going to be re-shaped with the emergence of digital technologies, I must confess I was sceptical. When people who do things that you like start writing about those things, it is not always a pretty picture. Or an easy read. However, like all sceptics, I am only a romantic waiting to be validated. So, when I picked up The New Digital Age, I was hoping to be entertained, informed and shaken out of my socks as the gurus of the interwebz spin science fiction futures for our times. Sadly, I have been taught my lesson and have slid back into hardened scepticism.

Here is the short version of the book: Technology is good. Technology is going to be exciting. There are loads of people who haven't had it yet. There are not enough people who have figured out how things work. Everybody needs to go online because no matter what, technologies are here to stay and they are going to be the biggest corpus of power. They write, "There is a canyon dividing people who understand technology and people charged with addressing the world's toughest geopolitical issues, and no one has built a bridge…As global connectivity continues its unprecedented advance, many old institutions and hierarchies will have to adapt or risk becoming obsolete, irrelevant to modern society." So the handful who hold the reins of the digital (states, corporates, artificial intelligence clusters) are either going to rule the world, or, well, write books about it.

The long version is slightly more nuanced, even though it fails to give us what we have grown to expect of all things Google — the bleeding edge of back and beyond. For a lay person, observations that Schmidt and Cohen make about the future of the digital age might be mildly interesting in the way title credits to your favourite movie can be. Once they have convinced us, many, many times, that the internet is fast and fluid and that it makes things fast and fluid and hence the future we imagine is going to be fast and fluid, the authors tell us that the internet is spawning a new "caste system" of haves, have-nots, and wants-but-does-not-haves.

Citing the internet as "the largest experiment involving anarchy in history", they look at the new negotiations of power around the digital. Virulent viruses from the "Middle East" make their appearance. Predictably, wars of censorship and free information in China get due attention. Telcos get a big hand for building the infrastructure which can sell Google phones to people in Somalia. The book offers a straightforward (read military) reading of drones and less biased views on cyberterrorism than expected, which at least escapes the jingoism that the USA has been passing off in the service of a surveillance state. And more than anything else, the book shows politicos and governments around the world that the future is messy, anarchy is at hand, but as long as they put their trust in Big Internet Brothers, the world will be a manageable place.

So while you can clearly see where my review of the book is heading, I must give it its due credit.

There are three things about this book that make it interesting. The first is how Schmidt and Cohen seem to be in a seesaw dialogue with themselves. They realise that five billion people are going to get connected online. They gush a little about what this net-universality is going to mean. And then immediately, they also realise that we have to prepare ourselves for a "Brave New World," which is going to be infinitely more messy and scary. They recognise that the days of anonymity on the Web are gone, with real life identities becoming our primary digital avatars. However, they also hint at a potential future of pseudonymity that propels free speech in countries with authoritarian regimes. This oscillation between the good, the bad, the plain and the incredible, keeps their writing grounded without erring too much either on the side of techno-euphoria or dystopic visions of the future.

Second, and perhaps justly so, the book doles out a lot of useful information not just for the techno-neophytes but also the amateur savant. There are stories about "Currygate" in Singapore, or of what Vodafone did in Egypt after the Arab Spring, or of the "Human Flesh Search Engine" in China, which offer a comprehensive, if not critical, view of the way things are. Schmidt and Cohen have been everywhere on the ether and they have cyberjockeyed for decades to tell us stories that might be familiar but are still worth the effort of writing.

Third, it is a readable book. It doesn't require you to Telnet your way into obscure meaning sets in the history of computing. It is written for people who are still mystified not only about the past of the Net but also its future, and treads a surprisingly balanced ground in both directions. It is a book you can give to your grandmother, and she might be inspired to get herself a Facebook (or maybe a Google +) account.

But all said and done, I expected more. It is almost as if Schmidt and Cohen are sitting on a minefield of ideas which they want to hint at but don't yet want to share because they might be able to turn it into a new app for the Nexus instead. It is a book that could have been. It wasn't. It is ironic how silent the book is about the role that big corporations play in shaping our techno-futures, and the fact that it is printed on dead-tree books with closed licensing so I couldn't get a free copy online. For people claiming to build new and political futures, the fact that this wisdom could not come out in more accessible forms and formats, speaks a lot about how seriously we can take their views of the future.

A Technological Solution to the Challenges of Online Defamation

by Eduardo Bertoni last modified Jul 02, 2013 02:47 PM
When people are insulted or humiliated on the Internet and decide to take legal action, their cases often follow a similar trajectory.

This blog post written by Eduardo Bertoni was published in GlobalVoices on May 28, 2013. CIS has cross-posted this under the Creative Commons Licence.


Consider this scenario:

A public figure, let’s call her Senator X, enters her name into a search engine. The results surprise her — some of them make her angry because they come from Internet sites that she finds offensive. She believes that her reputation has been damaged by certain content within the search results and, consequently, that someone should pay for the personal damages inflicted.

Her lawyer recommends appealing to the search engine – the lawyer believes that the search engine should be held liable for the personal injury caused by the offensive content, even though the search engine did not create the content. The Senator is somewhat doubtful about this approach, as the search engine will also likely serve as a useful tool for her own self-promotion. After all, not all sites that appear in the search results are bothersome or offensive. Her lawyer explains that the author of the offensive content should also be held liable, although he or she will likely be difficult to find. At that point, one option is to request that the search engine block any offensive sites related to the individual’s name from its searches. Yet the lawyer knows that this cannot be done without an official petition, which will require a judge’s intervention.

“We must go against everyone – authors, search engines – everyone!” the Senator will likely say. “Come on!” says the lawyer, “let's move forward.” However, it does not occur to either the Senator or the lawyer that there may be an alternative approach to that of classic courtroom litigation. The proposal I make here suggests a change to the standard approach – a change that requires technology to play an active role in the solution.

Who is liable?

The “going against everyone” approach poses a critical question: Who is legally liable for content that is available online? Authors of offensive content are typically seen as primarily liable. But should intermediaries such as search engines also be held liable for content created by others?

This last question raises a very specific, procedural question: Which intermediaries will be the subjects of scrutiny and viewed as liable in these types of situations? To answer this question, we must distinguish between intermediaries that provide Internet access (e.g. Internet service providers) and intermediaries that host content or offer content search functions. But what exactly is an ‘intermediary’? And how do we evaluate where an intermediary’s responsibility lies? It is also important to distinguish those intermediaries which simply connect individuals to the Internet from those that offer different services.

What kind of liability might an intermediary carry?


This brings us to the second step in the legal analysis of these situations: How do we determine which model we use in defining the responsibility of an intermediary? Various models have been debated in the past. Leading concepts include:

  • strict liability, under which the intermediary must legally respond to all offensive content
  • subjective liability, under which the intermediary’s response depends on what it has done and what it was or is aware of
  • conditional liability – a variation on subjective liability – under which, if an intermediary was notified or advised that it was promoting or directing users to illegal content and did nothing in response, it is legally required to respond to the offensive content.

These three options for determining liability and responses to offensive online content have been included in certain legislation and have been used in judicial decisions by judges around the world. But not one of these three alternatives provides a perfect standard. As a result, experts continue to search for a definition of liability that will satisfy those who have a legitimate interest in preventing damages that result from offensive content online.

How are victims compensated?

Now let’s return to the example presented earlier. Consider the concept of Senator X’s “satisfaction.” In these types of situations, “satisfaction” is typically economic — the victim will sue for a certain amount of money in “damages”, and she can target anyone involved, including the intermediary.

Interestingly, in the offline world, alternatives have been found for victims of defamation: For example, the “right to reply” aims to aid anyone who feels that his or her reputation or honor has been damaged and allows individuals to explain their point of view.

We must also ask if the right to reply is or is not contradictory to freedom of expression. It is critical to recognize that freedom of expression is a human right recognized by international treaties; technology should be able to achieve a similar solution to issues of online defamation without putting freedom of expression at risk.

Solving the problem with technology

In an increasingly online world, we have unsuccessfully attempted to apply traditional judicial solutions to the problems faced by victims like Senator X. There have been many attempts to apply traditional standards because lawyers are accustomed to using them in other situations. But why not change the approach and use technology to help “satisfy” the problem?

The idea of including technology as part of the solution, when it is also part of the problem, is not new. If we combine the possibilities that technology offers us today with the older idea of the right to reply, we could change the broader focus of the discussion.

My proposal is simple: some intermediaries (like search engines) should create a tool that allows anyone who feels that he or she is the victim of defamation and offensive online content to denounce and criticize the material on the sites where it appears. I believe that for victims, the ability to say something and to have their voices heard on the sites where others will come across the information in question will be much more satisfactory than a trial against the intermediaries, where the outcome is unknown.
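As a rough sketch of what the data behind such a reply tool might look like, consider the hypothetical Python model below. The fields and the choice to key replies to the disputed URL are assumptions about one possible design, not a description of any existing service.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Reply:
    # Hypothetical structure for the proposed right-of-reply tool.
    disputed_url: str      # page carrying the allegedly defamatory content
    author: str            # the person exercising the right to reply
    statement: str         # their side of the story, shown alongside results
    submitted_at: datetime

def replies_for(replies, url: str):
    """Replies an intermediary could display next to results pointing at url."""
    return [r for r in replies if r.disputed_url == url]
```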

This proposal would also help to limit regulations that impose liability on intermediaries such as search engines. This is important because many of the regulations that have been proposed are technologically impractical. Even when they can be implemented, they often result in censorship; requirements that force intermediaries to filter content regularly infringe on rights such as freedom of expression or access to information.

This proposal may not be easy to implement from a technical standpoint. But I hope it will encourage discussion about the issue, given that a tool like the one I have proposed, although with different characteristics, was once part of Google’s search engine (the tool, “Google Sidewiki”, is now discontinued). It should be possible to improve upon this tool, adapt it, or do something completely new with the technology it was based on in order to help victims of defamation clarify their opinions and speak their minds about these issues, instead of relying on courts to impose censorship requirements on search engines. This tool could provide much greater satisfaction for victims and could help prevent the violation of the rights of others online as well.

Critics may argue that people will not read the disclaimers or statements written by “defamed” individuals and that the impact and spread of the offensive content will continue unfettered. But this is a cultural problem that will not be fixed by placing liability on intermediaries. As I explained before, the consequences of doing so can be unpredictable.

If we continue to rely on traditional regulatory means to solve these problems, we’ll continue to struggle with the undesirable results they can produce, chiefly increased controls on information and expression online. We should instead look to a technological solution as a viable alternative that cannot and should not be ignored.


Eduardo Bertoni is the Director of the Center for Studies on Freedom of Expression and Access to Information at Palermo University School of Law in Buenos Aires. He served as the Special Rapporteur for Freedom of Expression to the Organization of American States from 2002-2005.

Indian surveillance laws & practices far worse than US

by Pranesh Prakash last modified Jul 12, 2013 11:09 AM
Explosive would be just the word to describe the revelations by National Security Agency (NSA) whistleblower Edward Snowden.

Pranesh Prakash's column was published in the Economic Times on June 13, 2013. This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC.


Now, with the American Civil Liberties Union suing the Obama administration over the NSA surveillance programme, more fireworks could be in store. Snowden's exposé provides proof of what many working in the field of privacy have long known. The leaks show the NSA (through the FBI) has obtained a secret court order requiring telecom provider Verizon to hand over "metadata", i.e., non-content data like phone numbers and call durations, relating to millions of US customers (known as dragnet or mass surveillance); that the NSA has a tool called Prism through which it queries at least nine American companies (including Google and Facebook); and that it also has a tool called Boundless Informant (a screenshot of which revealed that, in February 2013, the NSA collected 12.61 billion pieces of metadata from India).

Nothing Quite Private

The outrage in the US has to do with the fact that much of the data the NSA has been granted access to by the court relates to communications between US citizens, something the NSA is not authorised to gain access to. What should be of concern to Indians is that the US government refuses to acknowledge non-Americans as people who also have a fundamental right to privacy, if not under US law, then at least under international laws like the Universal Declaration of Human Rights and the ICCPR.

US companies such as Facebook and Google have had a deleterious effect on privacy. In 2004, there was a public outcry when Gmail announced it was using an algorithm to read through your emails to serve you advertisements. Facebook and Google collect massive amounts of data about you and websites you visit, and by doing so, they make themselves targets for governments wishing to snoop on you, legally or not.

Worse, Indian-Style

That said, Google and Twitter have at least challenged a few of the secretive National Security Letters requiring them to hand over data to the FBI, and have won. Yahoo India has challenged the authority of the Controller of Certifying Authorities, a technical functionary under the IT Act, to ask for user data, and the case is still going on.

To the best of my knowledge, no Indian web company has ever challenged the government in court over a privacy-related matter. Actually, Indian law is far worse than American law on these matters. In the US, the NSA needed a court order to get the Verizon data. In India, the licences under which telecom companies operate require them to provide this. No need for messy court processes.

The law we currently have — sections 69 and 69B of the Information Technology Act — is far worse than the surveillance law the British imposed on us. Even that lax law has not been followed by our intelligence agencies.

Keeping it Safe

Recent reports reveal India's secretive National Technical Research Organisation (NTRO) — created under an executive order and not accountable to Parliament — often goes beyond its mandate and, in 2006-07, tried to crack into Google and Skype servers, but failed. It succeeded in cracking Rediffmail and Sify servers, and more recently was accused by the Department of Electronics and IT in a report on unauthorised access to government officials' mails.

While the government argues systems like the Telephone Call Interception System (TCIS), the Central Monitoring System (CMS) and the National Intelligence Grid (Natgrid) will introduce restrictions on misuse of surveillance data, it is a flawed claim. Mass surveillance only increases the size of the haystack, which doesn't help in finding the needle. Targeted surveillance, when necessary and proportional, is required. And no such systems should be introduced without public debate and a legal regime in place for public and parliamentary accountability.

The government should also encourage the usage of end-to-end encryption, ensuring Indian citizens' data remains safe even if stored on foreign servers. Merely requiring those servers to be located in India will not help, since that information is still accessible to American agencies if it is not encrypted. Moreover, the currently lax Indian laws will then apply, degrading users' privacy even more.

Indians need to be aware that they have virtually no privacy when communicating online unless they take proactive measures. Free and open-source software and technologies can help: OpenPGP can make emails secure, Off-the-Record (OTR) can secure instant messages, TextSecure can secure SMSes, and Tor can anonymise internet traffic.
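As a small illustration of the principle behind end-to-end encryption mentioned above, the Python sketch below uses the third-party cryptography package's Fernet recipe (symmetric encryption). It is only a sketch of the idea that an intermediary holding the ciphertext learns nothing without the key; OpenPGP tools such as GnuPG use public-key cryptography and are better suited to securing email in practice.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # shared only between sender and recipient
f = Fernet(key)

ciphertext = f.encrypt(b"meet at 6 pm")  # what a server or ISP would store/see
print(ciphertext)                        # opaque bytes without the key
print(f.decrypt(ciphertext))             # b'meet at 6 pm' for the key holder
```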

Privacy (Protection) Bill, 2013

by Prasad Krishna last modified Jul 03, 2013 09:39 AM

The Privacy (Protection) Bill, 2013 - 1 June 2013 (for Bombay).pdf — PDF document, 196 kB (200944 bytes)

Privacy Protection Bill, 2013 (With Amendments based on Public Feedback)

by Elonnai Hickok last modified Jul 12, 2013 10:50 AM
In 2013 CIS drafted the Privacy Protection Bill as a citizens' version of a privacy legislation for India. Since April 2013, CIS has been holding Privacy Roundtables in collaboration with FICCI and DSCI, with the objective of gaining public feedback to the Privacy Protection Bill and other possible frameworks for privacy in India.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC


As a part of this process, CIS has been amending the Privacy Protection Bill based on public feedback. Below is the text of the Bill as amended according to feedback gained from the New Delhi, Bangalore, and Chennai Roundtables.

Click to download the Privacy Protection Bill, 2013 with latest amendments (PDF, 196 Kb).

The Difficult Balance of Transparent Surveillance

by Kovey Coles last modified Jul 15, 2013 04:23 AM
Is it too much to ask for transparency in data surveillance? Companies like Microsoft, Facebook, and the other Silicon Valley giants would, on occasion, say no. When customers join these services, each company provides its own privacy statement assuring customers of the safety and transparency with which their personal data will be handled.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC


Google even publishes annual “Transparency Reports” which detail the data movement behind the scenes. Governments, too, are somewhat open about surveillance methods; the existence and role of institutions like America’s NSA and India’s CMS, for example, are public knowledge. These façades of assurance, however, never satisfy the public enough to keep them from feeling cheated and deceived when information about surveillance practices leaks. And in the face of controversy around surveillance, both service providers and governments scramble to explain the discrepancies between their promises and their practices.

So it seems that transparency might not be too much to ask, but is perhaps a more complicated request than imagined. For some citizens, nothing would be more satisfying than complete transparency on all data collection. For those who see surveillance as crucial for national security, however, complete transparency would undermine the very efficacy of surveillance practices. And data companies often find themselves caught between these two ends, simultaneously seeking profits by catering to the public while also trying to abide by political and legal frameworks. Therefore, in the process of modern data surveillance, each attempt at resolving the transparency issue becomes a delicate balance between three actors: the government, the big data companies, and the people. As rightly stated on the Digital Due Process website, rules for surveillance must carefully consider “the individual’s constitutional right to privacy, the government’s need for tools to conduct investigations, and the interest of service providers in clarity and customer trust.”[1]

So we must unpack the idea of transparency.

First, a distinction should be made between proactive transparency and reactive transparency: the announcement of surveillance practices in advance versus access to surveillance records after the fact. The former is riskier and therefore more difficult to entertain, while the latter may lack any real substance beyond satisfying inquiries. Also consider the discrepancy in motivation for transparency between the actors. For the citizen, is transparency really an end goal, or is it only a stepping stone in the argument for eradicating surveillance practices in the name of the right to privacy? Here we come to the true value of total transparency: will it ever please citizens to learn of a government’s most recent undermining of the private sphere?

Reactive transparency has been achieved in India only in recent years, during a number of well-publicized legal cases. In one of the earliest cases of reactive transparency, Reliance Communications filed an affidavit in the Supreme Court disclosing the exact number of surveillance directives given by the government. It revealed that 151,000 Reliance accounts were monitored for a project between 2006 and 2010, with 3,588 phones tapped in the Delhi region alone in 2005.[2]

There has also been controversy over the extent of reactive transparency, because it is difficult to discern the point at which transparency itself begins to encroach on privacy, for both the government's and the people's sake. Once gathered, the data's release could further jeopardize both citizens and the government. It is important to carefully consider how far reactive transparency can productively go: What will become of the information? Will one publicly reveal how many people were spied on? Who was spied on? What was found through the spying? Citizens must take all of this into consideration when requesting transparency.

Meanwhile, service providers embrace transparency when it can benefit their corporation, or as a recent Facebook statement explained, “we’ve been in discussions with U.S. national security authorities urging them to allow more transparency, so that our users around the world can understand how infrequently we are asked to provide user data on national security grounds.” [a] Many of the service providers mentioned in the recently leaked PRISM report have made well-publicized requests to the U.S. government for more transparency.[3]

Not only have they allegedly written requests to the government asking to be allowed to disclose information, but the companies (including Facebook [a], Apple [b], Microsoft [c], and Google [d]) have all released explanatory statements in the wake of the June 2013 PRISM scandal. Although service providers claim that the request to release data about their cooperation is in the ‘interest of transparency,’ it instead seems that the motivation for this transparency is to ease consumers’ concerns and help the companies save face. The companies (and the government) will admit their participation in surveillance once it has become impossible to deny their association with the programs. This shrewd aspect of transparency can be seen most clearly in statements like those from Microsoft, which included in its statement of June 14, “We have not received any national security orders of the type that Verizon was reported to have received.” [c] Spontaneous allusions like this are meant to contrast guilt-conscious service providers favorably with telecom service providers such as AT&T and Verizon, who allegedly yielded the most communications data and who, as of now, have yet to release defensive public statements.

Currently, we find ourselves in a situation where entities admit to their collusion in snooping only once information has leaked, indignation has ignited, and scandal has erupted. A half-hearted proactive transparency leads to an outrage demanding reactive semi-transparency. These weak forms of transparency neither satisfy the public, nor allow governments and service providers to maintain dignity.

But now is also a crucial moment for possible reevaluation and reform of this system, especially in India. Not only is India enacting its own national security surveillance system, the CMS,[4] but the recent NSA and PRISM revelations are still sending shockwaves throughout the world of cyber security and surveillance. Last week, a Public Interest Litigation (PIL) was filed in the Indian Supreme Court, arguing that nine foreign service providers (Facebook, Hotmail, Yahoo!, Google, Apple, Skype, Paltalk, AOL, YouTube) violated the trust and privacy of their Indian customers through their collusion with the US government’s surveillance programs.[5]

Among other things, the PIL emphatically sought prosecution of the mentioned corporations, demanded that the service providers establish servers in India, and also sought stricter rules to prevent Indian officials from using these foreign services for work involving national security. Ultimately, the PIL was rejected by the Supreme Court; although it invoked Rule 6 of the Information Technology Rules, 2011, which lays down guidelines for protecting sensitive personal information of Indian citizens, the SC saw the PIL as addressing problems outside its jurisdiction, and was quoted as saying “we cannot entertain the petition as an Indian agency is not involved.”[5][6]

The SC considered the PIL only partially, however, as certain significant parts of the petition were indeed within the reach of Indian domestic agencies, for example the demand to prohibit government officials from using private email services such as Gmail, Hotmail, and Yahoo. And although the SC is not the correct place to push for new safeguard legislation, the ideas of the PIL are not invalid, as Indian leaders have long searched for ways of enforcing basic Indian privacy protections in the context of international service providers. Nor is this a problem unique to India. International service providers have entered into formal agreements addressing the same problem of incorporating international customers’ rights, agreements which India could emulate if it wanted to demand greater privacy or transparency.

For example, there is the Safe Harbor Framework, a mechanism in place to protect and mediate European Union citizens’ privacy rights when their data sits on the servers of foreign (i.e., American) Internet companies. These regulations were established in 2000, and serve the purpose of adjusting foreign companies’ standards to incorporate E.U. privacy laws. In accordance with the agreement, E.U. data may only be sent to outside providers who maintain the seven Safe Harbor principles, several of which focus on transparency of data usage.[7] India could enact a similar system, and it would likely alleviate some of the concerns raised in the most recent PIL. These frameworks, however, have not proven completely reliable safeguards either, especially when the service providers’ own government uses national security as a means to override the agreement. Although the U.S. government has yet to fully confirm or deny many of the NSA and PRISM allegations with regard to Europe, there is currently strong reason to believe that the surveillance practices may have violated the Safe Harbor agreements by delivering sensitive E.U. citizen data to the U.S. government.[8] It is uncertain how these revelations will impact the agreements made between the big Silicon Valley companies and their E.U. customers.

The recent PIL also strongly suggested establishing domestic data servers to keep Indian citizens’ information within the country and under the direct supervision of Indian entities. It pushed for self-reliance as the best way to ensure both citizen and national security, assuming that domestic servers would not only offer better information protection, but also create much-needed jobs and raise national tax revenue.[5] If the allegations about PRISM and the E.U. prove true, the E.U. may decide to support the establishment of European servers as well.

Several of the ideas outlined in the PIL have merit, but may not be as productive as the petitioners assume. It is true that establishing servers and domestic regulators in India may temporarily protect against unwanted foreign (i.e., American) surveillance. But at the same time, this also increases the likelihood of India’s own central government taking a stronger surveillance stance, more stringently monitoring its own servers and databases. It has not yet been described how the CMS will operate its surveillance methods, but moving data to domestic servers may simply result in shifting power from the NSA to the CMS. Rather than more privacy or transparency, the situation could easily become a matter of whom citizens prefer spying on them.

Even if one government establishes rules which enforce transparency, these may clash with the laws of the service providers’ domestic government, for instance requirements of confidentiality in surveillance. Considering all of this, rejection of foreign service providers and promotion of domestic self-reliance may ultimately prove the most effective alternative for nations which are growing rapidly in both internet presence and internet consciousness. But that does not make this option the easiest. Facing the revelations and disillusionment of domestic (CMS) and international (PRISM) surveillance methods, countries like India are reaching an impending critical juncture. Now is the most important time to establish new norms, while public sentiment is at its highest and transition is most possible: not only creating new laws which can safeguard privacy, but also seriously considering alternatives to foreign service providers like those outlined in June’s PIL. Privacy International’s guiding principles of communications surveillance also offer useful advice, urging the establishment of oversight institutions which can access surveillance records and periodically publish aggregate data on surveillance methods.[9] Although the balance between security on the national level and security on the personal level will continue to be problematic for nations in the coming years, and even though service providers’ positions on surveillance usually seem contrived, Microsoft Vice President John Frank made a statement which deserves appreciation, rightly saying, “Transparency alone may not be enough to restore public confidence, but it’s a great place to start.”[c]


[1]. http://digitaldueprocess.org/

[2]. http://bit.ly/151Ue1H

[3]. http://bit.ly/12XDb1Z

[4]. http://ti.me/11Xh08V

[5]. Copy of 2013 PIL to Supreme Court, Prof. S.N. Singh [attached]

[6]. http://bit.ly/1aXWdbU

[7]. http://1.usa.gov/qafcXe

[8]. http://bit.ly/114hcCX

[9]. http://bit.ly/156wspI


[a]. Facebook Statement: http://bit.ly/ZQDcn6

[b]. Apple Statement: http://bit.ly/1akaBuN

[c]. Microsoft Statement: http://bit.ly/1bFIt31

[d]. Google Statement: http://bit.ly/16QlaqB

CIS Cybersecurity Series (Part 4) - Marietje Schaake

by Purba Sarkar last modified Jul 12, 2013 10:24 AM
CIS interviews Marietje Schaake, member of the European parliament, as part of the Cybersecurity Series
"It is important that we don't confine solutions in military head quarters or in government meeting rooms but that consumers, internet users, NGOs, as well as businesses, together take responsibility to build a resilient society where we also don't forget what it is we are defending, and that is our freedoms... and we have learned hopefully from the war on terror, that there is a great risk to compromise freedom for alleged security and that is a mistake we should not make again." - Marietje Schaake, member of European parliament.

Centre for Internet and Society presents its fourth installment of the CIS Cybersecurity Series.
 
The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.
 
In this installment, CIS interviews Marietje Schaake, member of the European Parliament for the Dutch party Democrats 66 (D66) with the Alliance of Liberals and Democrats for Europe (ALDE) political group. She serves on the Committee on Foreign Affairs, where she focuses on neighbourhood policy, Turkey in particular; human rights, with a specific focus on freedom of expression, Internet freedom and press freedom; and Iran. In the Committee on Culture, Media, Education, Youth and Sports, Marietje works on Europe’s Digital Agenda and the role of culture and new media in the EU´s external actions. In the Committee on International Trade, she focuses on intellectual property rights, the free flow of information and the relation between trade and foreign affairs.
 
Marietje's website is: http://www.marietjeschaake.eu/
 

 

This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.


Response from Ministry of Home Affairs

by Prasad Krishna last modified Jul 15, 2013 04:34 AM
Rakesh Mittal's reply received by the Centre for Internet and Society.

Rakesh Mittal's reply.pdf — PDF document, 264 kB (271205 bytes)

Redirected to DEITY for Response to RTI

by Prasad Krishna last modified Jul 15, 2013 05:04 AM
The Ministry of Home Affairs redirected the RTI filed by CIS, regarding information on the officials and agencies authorized to intercept telephone messages in India, to the Department of Electronics and Information Technology (DEITY) for response.

Redirected.pdf — PDF document, 325 kB (332929 bytes)

Moving Towards a Surveillance State

by Srinivas Atreya last modified Jul 15, 2013 05:57 AM
Cyberspace is a modern construct of communication, and today a large part of human activity takes place in it. It has become the universal platform where business is executed, discourse is conducted and personal information is exchanged. However, the underbelly of the internet also hosts activities and persons motivated by nefarious intent.

Note: The original tender document of the Assam Police dated 28.02.2013, along with several other tender documents for the procurement of Internet and Voice Monitoring Systems, is attached as a zip folder.


As highlighted in the International Principles on the Application of Human Rights to Communications Surveillance, logistical barriers to surveillance have decreased in recent decades and the application of legal principles in new technological contexts has become unclear. It is often feared that the explosion of digital communications content and information about communications, or "communications metadata," coupled with the decreasing costs of storing and mining large sets of data and the provision of personal content through third-party service providers, makes State surveillance possible at an unprecedented scale. Communications surveillance in the modern environment encompasses the monitoring, interception, collection, preservation and retention of, interference with, or access to information that includes, reflects, arises from or is about a person's communications in the past, present or future.[*] These fears are now turning into a reality with the introduction of mass surveillance systems which penetrate the lives of every person who uses any form of communications. There is ample evidence, in the form of tenders for Internet Monitoring Systems (IMS) and Telecom Interception Systems (TCIS) put out by the Central government and various state governments, that the Indian state is steadily turning into an extensive surveillance state.

While surveillance and intelligence gathering is essential for the maintenance of national security, the creation and working of a mass surveillance system as it is envisioned today may not necessarily be in absolute conformity with the existing law. A mass surveillance system like the Central Monitoring System (CMS) not only threatens to completely eradicate any vestige of the right to privacy but in the absence of a concrete set of procedural guidelines creates a tremendous risk of abuse.

Although information regarding the Central Monitoring System is quite limited in the public domain at the moment, it can be gathered that a centralized system for monitoring all communication was first proposed by the Government of India in 2009, as indicated by a press release of the Ministry of Communications & Information Technology. Implementation of the system started subsequently, as indicated by another government press release, and the Centre for Development of Telematics (C-DOT) was entrusted with the responsibility of implementing the system. As per the C-DOT annual report 2011-12, research, development, trials and progressive scaling-up of a Central Monitoring System were conducted by the organization over the past four years, and the requisite hardware and CMS solutions which support voice and data interception have been installed and commissioned at various Telecom Service Providers (TSPs) in Delhi and Haryana as part of the pilot project. Media reports indicate that the project will be fully functional by 2014. While an extensive surveillance system is being stealthily introduced by the state, several concerns with regard to its extent of use, functioning and real-world impact have been raised, owing to ambiguities and wide gaps in procedure and law. Moreover, the lack of concrete privacy legislation, coupled with the absence of public discourse, indicates the state's lack of interest in the rights of the ordinary citizen. It is under these circumstances that awareness must first be raised about the risks that mass surveillance poses to civil liberties; in the absence of established procedures protecting citizens' rights, it can result in the abuse of power by the state or its agencies and lead to the demise of civil freedoms even in democratic states.

The architecture and working of a proposed Internet Monitoring System must be examined in an attempt to better understand the functioning, capabilities and possible impact of a Central Monitoring System on our society and lives. This can perhaps allow more open discourse; a committed effort can then be made to preserve the rights of citizens, especially the right to privacy, while allowing for the creation of strong procedural guidelines that permit legitimate intelligence gathering and surveillance.

Internet Monitoring System: Setup and Working
Very broadly, the Internet Monitoring System enables an agency of the state to intercept and monitor all content which passes through the Internet Service Provider’s (ISP) server, including all electronic correspondence (emails, chats or IMs, transcribed call logs), web forms, video and audio files, and other forms of internet content. The electronic data is stored and also subjected to various types of analysis. While Internet Monitoring Systems are installed locally and their function is limited to a specific geographic region, the Central Monitoring System will consolidate the data acquired from the different voice and data interception systems located across the country and create a centralized architecture for the interception, monitoring and analysis of communications. Although the exact specifications and functions of the Central Monitoring System remain unclear and ambiguous, some parallels regarding its functioning can be drawn from the specifications revealed in the Assam Police tender document for the procurement of an Internet Monitoring System.

Setup
The deployment architecture of an Internet Monitoring System (IMS) contains probe servers which are installed at the Internet Service Provider’s (ISP) premises, with the probes installed at various tapping points within the ISP network. A collection server is also installed and hosted at the site of the ISP. The collection server is used to collect, analyze, filter or simply aggregate the data from the ISP servers, and the data is transferred to a master aggregation server located at a central data center. The central data center may also contain further servers specifically for analysis and storage. This type of architecture is referred to as a ‘high availability clustered setup’, which is supposed to provide resilience in case of a failure or outage.

The Assam Police Internet Monitoring System tender document specifically indicates that the deployment in the state of Assam shall require 8 taps or probes to be installed at different ISPs, of which 6 taps/probes shall be of 10 Gbps and 2 taps of 1 Gbps. The document, however, mentions that these specifications are preliminary and subject to change.

Types of data
The proposed internet monitoring system for the state of Assam provides network traffic interception, and a variety of internet protocols, including Hypertext Transfer Protocol (HTTP), File Transfer Protocol (FTP), Simple Mail Transfer Protocol (SMTP), Internet Message Access Protocol (IMAP), Session Initiation Protocol (SIP) and Voice over Internet Protocol (VoIP), can be intercepted and monitored. The system can also support monitoring of Internet Relay Chat and various other messaging applications (such as Google Talk, Yahoo Chat, MSN Messenger, ICQ, etc.). The system can be equipped to capture and display multiple file types, such as text (.doc, .pdf), zipped (.zip) and executable (.exe) files. Further, information regarding login details, login pattern, login location, DNS address and routing address can be acquired, along with the IP address and other details of the user.
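As a greatly simplified illustration of how intercepted flows might be labelled by protocol, the Python sketch below maps well-known destination ports to protocol names. This is a toy heuristic invented for the example; systems of the kind described in the tender rely on deep packet inspection rather than port numbers alone.

```python
# Toy port-based labelling; real monitoring systems inspect packet contents.
PORT_PROTOCOLS = {
    80: "HTTP",
    21: "FTP",
    25: "SMTP",
    143: "IMAP",
    5060: "SIP",
}

def label_flow(dst_port: int) -> str:
    """Return a coarse protocol label for a captured flow."""
    return PORT_PROTOCOLS.get(dst_port, "OTHER")

# Hypothetical captured flows: (source address, destination port)
flows = [("10.0.0.5", 80), ("10.0.0.9", 25), ("10.0.0.9", 4444)]
print([(src, label_flow(port)) for src, port in flows])
# [('10.0.0.5', 'HTTP'), ('10.0.0.9', 'SMTP'), ('10.0.0.9', 'OTHER')]
```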

Web crawling capabilities can be installed on the system, which can provide data from various data sources like social networking sites, web-based communities, wikis, blogs and other forms of web content. Social media websites (such as Twitter, Facebook, Orkut, MySpace, etc.), web pages and data on hosted applications can also be intercepted, monitored and analyzed. The system also allows capture of additional pages as they are updated, and can log periodic updates and other changes. This gives the monitoring agencies the capability to gather internet traffic based on several parameters like protocols, keywords, filters and watch lists. Keyword matching is achieved by including phonetically similar words in various languages, including local languages.
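The tender does not say how phonetic matching is implemented, but the Python sketch below shows the general idea using a simplified Soundex-style code: words that sound alike collapse to the same code, so a watch-listed term can still be flagged when it is spelled differently. It is an illustrative assumption only, and a real system would have to handle many languages and scripts.

```python
def soundex(word: str) -> str:
    """Return a simplified four-character Soundex-style code for a word."""
    codes = {
        **dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
        **dict.fromkeys("dt", "3"), "l": "4",
        **dict.fromkeys("mn", "5"), "r": "6",
    }
    word = word.lower()
    first = word[0].upper()
    result, prev = [], codes.get(word[0], "")
    for ch in word[1:]:
        code = codes.get(ch, "")
        if code and code != prev:
            result.append(code)
        prev = code
    return (first + "".join(result) + "000")[:4]

WATCHLIST = {"rupert"}                      # hypothetical watch-listed term
watch_codes = {soundex(w) for w in WATCHLIST}

def flag(text: str):
    """Return tokens whose phonetic code matches a watch-listed term."""
    return [tok for tok in text.split() if soundex(tok) in watch_codes]

print(flag("message from Robert about the meeting"))  # ['Robert']
```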

More specific functions of the IMS can include complete email extraction, which will disclose the address book, inbox, sent mail folder, drafts folder, personal folders, deleted folders, custom folders, etc., and can also provide identification of dead-drop mails. The system can also be equipped to allow country-wise tracking of instant messages, chats and mails.

Regarding retention and storage of data, the tender document specifies that the system shall be technically capable of retaining the metadata of Internet traffic for at least one year, while the defined traffic/payload/content is to be retained in the storage server for at least a week. However, the data may be retained for a longer period if required. The metadata and qualified data are, after analysis, integrated into a designated main intelligence repository for storage.
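A minimal sketch of the retention rule the tender describes (metadata for at least a year, content for roughly a week) might look like the Python below. The record shape and function names are assumptions made only for the example.

```python
from datetime import datetime, timedelta

METADATA_RETENTION = timedelta(days=365)  # metadata: at least one year
CONTENT_RETENTION = timedelta(days=7)     # payload/content: about a week

def expired(stored_at: datetime, is_content: bool, now: datetime) -> bool:
    """True if a stored record has passed its minimum retention window."""
    limit = CONTENT_RETENTION if is_content else METADATA_RETENTION
    return now - stored_at > limit

now = datetime(2013, 7, 15)
print(expired(datetime(2013, 7, 1), is_content=True, now=now))   # True
print(expired(datetime(2013, 7, 1), is_content=False, now=now))  # False
```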

Types of Analysis
The Internet Monitoring System, apart from intercepting all the data generated through the Internet Service Providers, is essentially equipped for various types of data analysis. The solutions installed in the internet monitoring system provide the capability for real-time as well as historical analysis of network traffic, network perimeter devices and internal sniffers. The kinds of analysis, based on ‘slicing and dicing of data’, range from text mining, sentiment analysis, link analysis, geo-spatial analysis, statistical analysis, social network analysis, transaction analysis, locational analysis and fusion-based analysis to CDR analysis, timeline analysis and histogram-based analysis from various sources.

The solutions installed in the IMS can enable monitoring of specific words or phrases (in various languages) in blogs, websites, forums, media reports, social media websites, chat rooms and messaging applications, collaboration applications and deep web applications. Phone numbers, addresses, names, locations, age, gender and other such information from content, including comments, can also be monitored. Specifically with regard to social media, the user’s profile and information related to it can be extracted, and a detailed ontology of all the social media profiles of the user can be created.

Based on this information, the analysis is supposed to provide the capability to identify suspicious behaviour from existing and newly emerging patterns, which are continuously applied to combine incoming and existing information on people, profiles, transactions, social networks, types of websites visited, time spent on websites, types of content downloaded or viewed, and any other type of gatherable information. The solutions on the system are also supposed to create single, multiple or parallel scenario build-ups that may occur in blogs, social media forums, chat rooms, specific web hosting server locations or URLs, or packet routes defined from time to time. Such scenario build-ups can be based on parameters like sentiments, language or expressions purporting hatred or anti-national sentiment, and even emotions such as expressions of joy, compassion and anger, as may be defined by the agency depending on operational and intelligence requirements. Based on these parameters, automated alerts can be generated relating to structured or unstructured data (including metadata of contents), events, pattern discovery, phonetically similar words or phrases, or actions by users.

Based on the data analysis, reports or dossiers can be generated and visual analysis allowing a wide variety of views can be created. Further, real-time visualization showing results from real-time data can be generated, which allows alerts, alert categories or discoveries to be ranked (high, medium and low priority; high, moderate and low value asset; verified and unverified information; primary, secondary and circumstantial evidence, etc.) based on criteria developed by the agency. The IMS solutions can also offer web intelligence and open-source intelligence, with automated, simultaneous search capabilities that provide a powerful tool for exploration of the intercepted data.
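As an illustration of how such rule-based ranking might work in principle, the following sketch scores an alert from a few boolean signals and maps the score to a priority label. The signal names, weights and thresholds are invented for this sketch; the tender leaves such criteria to the agency.

```python
# An illustration of rule-based alert ranking of the kind described above
# (high / medium / low priority based on agency-defined criteria). The
# signal names, weights and thresholds are invented for this sketch.

WEIGHTS = {"watchlist_hit": 5, "keyword_match": 2, "unusual_location": 1}

def rank_alert(signals):
    """Map boolean signals produced by the analysis to a priority label."""
    score = sum(weight for name, weight in WEIGHTS.items() if signals.get(name))
    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

print(rank_alert({"keyword_match": True, "unusual_location": True}))  # medium
```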

Another important requirement mentioned in the tender document is the system's capability to integrate with other interception and monitoring systems for 2G, 3G/UMTS and other evolving mobile carrier technologies, including fixed-line and Blackberry services and encrypted IP services such as Skype.

Conclusion
It is clear that a system like the IMS, with its extensive interception and analysis capabilities, gives an agency or authority complete access to all information that a person accesses or transmits on the internet, including information which is private and confidential, such as email and instant messages. Although the state has the power to issue directions for interception or monitoring of information under the Information Technology Act, 2000, and certain rules are prescribed under section 69B, these are wholly inadequate compared to the scope and extent of the Internet Monitoring System and its scale of operations. The interception and monitoring systems that are either proposed or already in place effectively bypass the existing procedures prescribed under the Information Technology Act.

The issues, concerns and risks are only compounded when it comes to the Central Monitoring System. The solutions installed in present-day interception and monitoring systems give the state unprecedented powers to intercept, monitor and analyze all the data of any person who accesses the internet. Tools such as deep packet inspection and extensive data mining solutions, deployed through a centralized system in the absence of concrete safeguards, can be misused to censor any content, including legitimate discourse. The perception that access to more data, or to all data, improves intelligence can also be misleading, and it must be asked whether the fundamental rights of citizens can be traded away under the pretext of national security. Furthermore, it is essential for the state to weigh the costs of such a project, both economic and moral, and to balance it with sufficient internal measures as well as adequate laws, so that democratic values are preserved and not endangered by any act of reckless force.

Reiterating what has been said earlier, while it is important for the state to improve its intelligence-gathering tools and mechanisms, this must not be done at the cost of citizens' fundamental rights. It is the duty of a democratic state to ensure and maintain a fine balance between national interest and fundamental rights through the timely creation of equitable laws.



Tenders, EOI and Press Release

by Prasad Krishna last modified Jul 15, 2013 05:56 AM

Surveillance Systems - Govt Tenders, EOI and Press Release.zip — ZIP archive, 5976 kB (6119473 bytes)

How Surveillance Works in India

by Pranesh Prakash last modified Jul 15, 2013 10:20 AM
When the Indian government announced it would start a Centralized Monitoring System in 2009 to monitor telecommunications in the country, the public seemed unconcerned. When the government announced that the system, also known as C.M.S., commenced in April, the news didn’t receive much attention.
How Surveillance Works in India

Demonstrators showing support for National Security Agency whistleblower Edward Snowden at India Gate in New Delhi on Sunday.


This article by Pranesh Prakash was published in the New York Times on July 10, 2013.


After a colleague at the Centre for Internet and Society wrote about the program and it was lambasted by Human Rights Watch, more reporters started covering it as a privacy issue. But it was ultimately the revelations by Edward J. Snowden about American surveillance that prompted Indians to ask questions about their own government's surveillance programs.

In India, we have a strange mix of great amounts of transparency and very little accountability when it comes to surveillance and intelligence agencies. Many senior officials are happy to anonymously brief reporters about the state of surveillance, but there is very little that is officially made public, and still less is debated in the national press and in Parliament.

This lack of accountability is seen both in the way the Big-Brother acronyms (C.M.S., Natgrid, T.C.I.S., C.C.T.N.S., etc.) have been rolled out and in the murky status of the intelligence agencies. No intelligence agency in India has been created under an act of Parliament with clearly established roles and limitations on powers, and hence there is no public accountability whatsoever.

The absence of accountability has meant that the government has since 2006 been working on the C.M.S., which will integrate with the Telephone Call Interception System that is also being rolled out. The cost: around 8 billion rupees ($132 million) — more than four times the initial estimate of 1.7 billion — and even more important, our privacy and personal liberty. Under their licensing terms, all Internet service providers and telecom providers are required to provide the government direct access to all communications passing through them. However, this currently happens in a decentralized fashion, and the government in most cases has to ask the telecoms for metadata, like call detail records, visited Web sites, IP address assignments, or to carry out the interception and provide the recordings to the government. Apart from this, the government uses equipment to gain access to vast quantities of raw data traversing the Internet across multiple cities, including the data going through the undersea cables that land in Mumbai.

With the C.M.S., the government will get centralized access to all communications metadata and content traversing through all telecom networks in India. This means that the government can listen to all your calls, track a mobile phone and its user’s location, read all your text messages, personal e-mails and chat conversations. It can also see all your Google searches, Web site visits, usernames and passwords if your communications aren’t encrypted.

Internet Surfing

A man surfing a Facebook page at an internet cafe in Guwahati, Assam, on Dec. 6, 2011. Image credit: Anupam Nath/Associated Press

You might ask: Why is this a problem when the government already had the same access, albeit in a decentralized fashion? To answer that question, one has to first examine the law.

There are no laws that allow for mass surveillance in India. The two laws covering interception are the Indian Telegraph Act of 1885 and the Information Technology Act of 2000, as amended in 2008, and they restrict lawful interception to time-limited and targeted interception. The targeted interception both these laws allow ordinarily requires case-by-case authorization by either the home secretary or the secretary of the department of information technology.

Interestingly, the colonial government framed better privacy safeguards into communications interception than did the post-independence democratic Indian state. The Telegraph Act mandates that interception of communications can only be done on account of a public emergency or for public safety. If either of those two preconditions is satisfied, then the government may cite any of the following five reasons: “the sovereignty and integrity of India, the security of the state, friendly relations with foreign states, or public order, or for preventing incitement to the commission of an offense.” In 2008, the Information Technology Act copied much of the interception provision of the Telegraph Act but removed the preconditions of public emergency or public safety, and expanded the power of the government to order interception for “investigation of any offense.” The IT Act thus very substantially lowers the bar for wiretapping.

Apart from these two provisions, which apply to interception, there are many laws that cover recorded metadata, all of which have far lower standards. Under the Code of Criminal Procedure, no court order is required unless the entity is seen to be a “postal or telegraph authority” — and generally e-mail providers and social networking sites are not seen as such.

Unauthorized access to communications data is not punishable per se, which is why a private detective who gained access to the cellphone records of Arun Jaitley, a Bharatiya Janata Party leader, has been charged under the weak provision on fraud, rather than invasion of privacy. While there is a provision in the Telegraph Act to punish unlawful interception, it carries a far lesser penalty (up to three years of imprisonment) than for a citizen’s failure to assist an agency that wishes to intercept or monitor or decrypt (up to seven years of imprisonment).

To put the ridiculousness of the penalties in Sections 69 and 69B of the IT Act in perspective, an Intelligence Bureau officer who spills national secrets may be imprisoned for up to three years. And under the Indian Penal Code, the punishment for failing to provide a document one is legally bound to provide to a public servant can be up to one month's imprisonment. Further, a citizen who refuses to assist an authority in decryption, as one is required to under Section 69, may simply be exercising her constitutional right against self-incrimination. For these reasons and more, these provisions of the IT Act are arguably unconstitutional.

As bad as the IT Act is, legally the government has done far worse. In the licenses that the Department of Telecommunications grants Internet service providers, cellular providers and telecoms, there are provisions that require them to provide direct access to all communications data and content even without a warrant, which is not permitted by the existing laws on interception. The licenses also force cellular providers to have ‘bulk encryption’ of less than 40 bits. (Since G.S.M. network encryption systems like A5/1, A5/2 and A5/3 have a fixed encryption bit length of 64 bits, providers in India have been known to use A5/0, that is, no encryption, meaning that any person — not just the government — can use off-the-air interception techniques to listen to your calls.)

Cybercafes (but not public phone operators) are required to maintain detailed records of clients’ identity proofs, photographs and the Web sites they have visited, for a minimum period of one year. Under the rules designed as India’s data protection law (oh, the irony!), sensitive personal data has to be shared with government agencies, if required for “purpose of verification of identity, or for prevention, detection, investigation including cyber incidents, prosecution, and punishment of offenses.”

Along similar lines, in the rules meant to say when an Internet intermediary may be held liable for a user’s actions, there is a provision requiring the Internet company to “provide information or any such assistance to government agencies legally authorized for investigative, protective, cybersecurity activity.” (Incoherent, vague and grammatically incorrect sentences are a consistent feature of laws drafted by the Ministry of Communications and IT; one of the telecom licenses states: “The licensee should make arrangement for monitoring simultaneous calls by government security agencies,” when clearly they meant “for simultaneous monitoring of calls.”)

In a landmark 1996 judgment, the Indian Supreme Court held that telephone tapping is a serious invasion of an individual’s privacy and that citizens’ right to privacy has to be protected from abuse by the authorities. Given this, governments undoubtedly must have explicit permission from their legislatures to engage in any broadening of electronic surveillance powers. Yet, without introducing any new laws, the government has surreptitiously granted itself powers — powers that Parliament hasn’t authorized it to exercise — by sneaking such powers into provisions in contracts and in subordinate legislation.

Can India Trust Its Government on Privacy?

by Pranesh Prakash last modified Jul 15, 2013 10:35 AM
In response to criticisms of the Centralized Monitoring System, India’s new surveillance program, the government could contend that merely having the capability to engage in mass surveillance won’t mean that it will. Officials will argue that they will still abide by the law and will ensure that each instance of interception will be authorized.
Can India Trust Its Government on Privacy?

A man checking his cell phone in New Delhi on June 18. Picture by Anindito Mukherjee/Reuters.


Pranesh Prakash's article was published in the New York Times on July 11, 2013.


In fact, they will argue that the program, known as C.M.S., will better safeguard citizens’ privacy: it will cut out the telecommunications companies, which can be sources of privacy leaks; it will ensure that each interception request is tracked and the recorded content duly destroyed within six months as is required under the law; and it will enable quicker interception, which will save more lives. But there are a host of reasons why the citizens of India should be skeptical of those official claims.

Cutting out telecoms will not help protect citizens from electronic snooping since these companies still have the requisite infrastructure to conduct surveillance. As long as the infrastructure exists, telecom employees will misuse it. In a 2010 report, the journalist M.A. Arun noted that “alarmingly, this correspondent also came across several instances of service providers’ employees accessing personal communication of subscribers without authorization.” Some years back, K.K. Paul, a top Delhi Police officer and now the Governor of Meghalaya, drafted a memo in which he noted mobile operators’ complaints that private individuals were misusing police contacts to tap phone calls of “opponents in trade or estranged spouses.”

India does not need centralized interception facilities in order to have centralized tracking of interception requests. To prevent unauthorized access to intercepted communications content, the files should be encrypted at all times using a public key infrastructure. Mechanisms also exist to securely track the chain of custody, and to ensure the timely destruction of intercepted material after six months, as required by the law. Such technological means need to be made mandatory to prevent unauthorized access, rather than centralizing all interception capabilities.
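As a rough sketch of the kind of public-key protection suggested here, the following Python example encrypts an intercepted recording under a reviewing authority's RSA public key using a hybrid scheme (the third-party cryptography package is assumed). It illustrates the general technique only, not how any Indian agency actually stores intercepts.

```python
# A sketch of protecting an intercepted recording with public-key
# cryptography, as suggested above, using the third-party "cryptography"
# package. A hybrid scheme is shown: the data is encrypted with a fresh
# symmetric key, and that key is wrapped with the reviewing authority's RSA
# public key. Illustrative only; not a description of any agency's practice.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# In practice this key pair would belong to the authorized reviewing body.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def encrypt_recording(plaintext):
    """Encrypt data with a fresh symmetric key and wrap that key with RSA-OAEP."""
    sym_key = Fernet.generate_key()
    ciphertext = Fernet(sym_key).encrypt(plaintext)
    wrapped_key = public_key.encrypt(
        sym_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return wrapped_key, ciphertext

wrapped, ct = encrypt_recording(b"intercepted call recording bytes")
print(len(wrapped), len(ct))  # only the key holder can recover the plaintext
```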

At the moment, interception orders are given by the federal home secretary of India and by state home secretaries without adequate consideration. Every month, 7,000 to 9,000 phone taps are authorized or re-authorized at the federal level. Even if it took just three minutes to evaluate each case, it would take 15 hours each day (without any weekends or holidays) to go through 9,000 requests in a month. The numbers in Indian states could be worse, but one can't be certain, as statistics on surveillance across India are not available. This indicates bureaucratic callousness and indifference toward following the procedure laid down in the Telegraph Act.

In a 1975 case, the Supreme Court held that an “economic emergency” may not amount to a “public emergency.” Yet we find that of the nine central government agencies empowered to conduct interception in India, according to press reports — Central Board of Direct Taxes, Intelligence Bureau, Central Bureau of Investigation, Narcotics Control Bureau, Directorate of Revenue Intelligence, Enforcement Directorate, Research & Analysis Wing, National Investigation Agency and the Defense Intelligence Agency — three are exclusively dedicated to economic offenses.

Suspicion of tax evasion cannot legally justify a wiretap, which is why the government said it had believed that Nira Radia, a corporate lobbyist, was a spy when it defended putting a wiretap on her phone in 2008 and 2009. A 2011 report by the cabinet secretary pointed out that economic offenses might not be counted as “public emergencies,” and that the Central Board of Direct Taxes should not be empowered to intercept communications. Yet the tax department continues to be on the list of agencies empowered to conduct interceptions.

India has arrived at a scary juncture, where the multiple departments of the Indian government don’t even trust each other. India’s Department of Information Technology recently complained to the National Security Advisor that the National Technical Research Organization had hacked into National Informatics Center infrastructure and extracted sensitive data connected to various ministries. The National Technical Research Organization denied it had hacked into the servers but said hundreds of e-mail accounts of top government officials were compromised in 2012, including those of “the home secretary, the naval attaché to Tehran, several Indian missions abroad, top investigators of the Central Bureau of Investigation and the armed forces,” The Mint newspaper reported. Such incidents aggravate the fear that the Indian government might not be willing and able to protect the enormous amounts of information it is about to collect through the C.M.S.

Simply put, government entities have engaged in unofficial and illegal surveillance, and the C.M.S. is not likely to change this. In a 2010 article in Outlook, the journalist Saikat Datta described how various central and state intelligence organizations across India are illegally using off-the-air interception devices. “These systems are frequently deployed in Muslim-dominated areas of cities like Delhi, Lucknow and Hyderabad,” Mr. Datta wrote. “The systems, mounted inside cars, are sent on ‘fishing expeditions,’ randomly tuning into conversations of citizens in a bid to track down terrorists.”

The National Technical Research Organization, which is not even on the list of entities authorized to conduct interception, is one of the largest surveillance organizations in India. The Mint reported last year that the organization’s surveillance devices, “contrary to norms, were deployed more often in the national capital than in border areas” and that under new standard operating procedures issued in early 2012, the organization can only intercept signals at the international borders. The organization runs multiple facilities in Mumbai, Bangalore, Delhi, Hyderabad, Lucknow and Kolkata, in which monumental amounts of Internet traffic are captured. In Mumbai, all the traffic passing through the undersea cables there is captured, Mr. Datta found.

In the western state of Gujarat, a recent investigation by Amitabh Pathak, the director general of police, revealed that in a period of less than six months, more than 90,000 requests were made for call detail records, including for the phones of senior police and civil service officers. Such a high number could not possibly have been generated from criminal investigations alone. Nor do these requests seem to have led to any criminal charges against any of the people whose records were obtained. The information seems to have been collected for purposes other than national security.

India is struggling to keep track of the location of its proliferating interception devices. More than 73,000 devices to intercept mobile phone calls have been imported into India since 2005. In 2011, the federal government asked various state governments, private corporations, the army and intelligence agencies to surrender these to the government, noting that usage of any such equipment for surveillance was illegal. We don’t know how many devices were actually turned in.

These kinds of violations of privacy can have very dangerous consequences. According to the former Intelligence Bureau head in the western state of Gujarat, R.B. Sreekumar, the call records of a mobile number used by Haren Pandya, the former Gujarat home minister, were used to confirm that it was he who had provided secret testimony to the Citizens’ Tribunal, which was conducting an independent investigation of the 2002 sectarian riots in the state. Mr. Pandya was murdered in 2003.

The limited efforts to make India's intelligence agencies more accountable have gone nowhere. In 2012, the Planning Commission of India formed a group of experts under Justice A.P. Shah, a retired Chief Justice of the Delhi High Court, to look into existing projects of the government and to suggest principles to guide a privacy law in light of international experience. (The Centre for Internet and Society, where I work, was part of the group.) However, the government has yet to introduce a bill to protect citizens' privacy, even though governmental and private sector violations of Indian citizens' privacy are growing at an alarming rate.

In February, after frequent calls by privacy activists and lawyers for greater accountability and parliamentary oversight of intelligence agencies, the Centre for Public Interest Litigation filed a case in the Supreme Court. This, one hopes, will lead to reform.

Citizens must also demand that a strong Privacy Act be enacted. In 1991, the leak of a Central Bureau of Investigation report titled “Tapping of Politicians’ Phones” prompted the rights group People's Union for Civil Liberties to file a writ petition, which eventually led to a Supreme Court of India ruling that recognized the right to privacy of communications for all citizens as part of the fundamental rights to freedom of speech and to life and personal liberty. However, through the 2008 amendments to the Information Technology Act, the IT Rules framed in 2011 and the telecom licenses, the government has greatly weakened the right to privacy as recognized by the Supreme Court. The damage must be undone through a strong privacy law that safeguards the privacy of Indian citizens against both the state and corporations. The law should not only provide legal procedures, but also ensure that the government does not employ technologies that erode those procedures.

A strong privacy law should provide strong grounds on which to hold the National Security Advisor's mass surveillance of Indians (over 12.1 billion pieces of intelligence in one month) unlawful. The law should ensure that Parliament, and Indian citizens, are regularly provided information on the scale of surveillance across India, and on the convictions resulting from that surveillance. Individuals whose communications metadata or content is monitored or intercepted should be told about it after the passage of a reasonable amount of time. After all, the data should only be gathered in order to charge a person with committing a crime. If such charges are not brought, the person should be told of the incursion into his or her privacy.

The privacy law should ensure that all surveillance follows the following principles: legitimacy (is the surveillance for a legitimate, democratic purpose?), necessity (is it necessary to further that purpose? does a less invasive means exist?), proportionality and harm minimization (is this the minimum level of intrusion into privacy?), specificity (is the surveillance order limited to a specific case?), transparency (is the intrusion into privacy recorded and eventually revealed to the data subject?), purpose limitation (is the data collected used only for the stated purpose?), and independent oversight (is the surveillance reported to a legislative committee or a privacy commissioner, and are statistics kept on surveillance conducted and criminal prosecution filings?). Constitutional courts such as the Supreme Court of India or the High Courts in the Indian states should make such determinations. Citizens should have a right to civil and criminal remedies for violations of surveillance laws.

Indian citizens should also take greater care of their own privacy and safeguard the security of their communications. The solution is to minimize usage of mobile phones and to use anonymizing technologies and end-to-end encryption while communicating on the Internet. Free and open-source software implementing the OpenPGP standard can make e-mails secure. Technologies like off-the-record messaging (used in apps such as ChatSecure and Pidgin) for chat conversations, TextSecure for text messages, HTTPS Everywhere and virtual private networks can prevent Internet service providers from snooping and help make Internet communications anonymous.
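As one concrete illustration of the OpenPGP recommendation, the sketch below encrypts a message with GnuPG through the third-party python-gnupg wrapper (imported as gnupg). A working GnuPG installation and an already-imported recipient key are assumed, and the address shown is hypothetical.

```python
# One illustration of the OpenPGP suggestion: encrypting a message with GnuPG
# via the third-party python-gnupg wrapper (imported as "gnupg"). A working
# GnuPG installation and an already-imported recipient key are assumed, and
# the address below is hypothetical.
import gnupg

gpg = gnupg.GPG()  # uses the default GnuPG home directory and keyring

# The recipient's public key must already be in the keyring, e.g. via
# gpg.import_keys(open("friend_pubkey.asc").read())
recipient = "friend@example.org"  # hypothetical address

encrypted = gpg.encrypt("Meet at 6 pm. Bring the documents.", recipient)
if encrypted.ok:
    print(str(encrypted))  # ASCII-armored ciphertext, safe to paste into e-mail
else:
    print("Encryption failed:", encrypted.status)
```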

The Indian government, and especially our intelligence agencies, routinely violate Indian citizens' privacy without legal authority. It is time India stopped sleepwalking into a surveillance state.

CIS Cybersecurity Series (Part 7) - Jochem de Groot

by Purba Sarkar last modified Jul 30, 2013 09:26 AM
CIS interviews Jochem de Groot, former policy advisor to the Netherlands government, as part of the Cybersecurity Series

"The basic principle that I think we must continue to embrace is that rights online are the same as rights offline... The amount of information that is available online is so enormous that it would be easy for governments to abuse that information for all kinds of purposes... And we are at a stage right now where we are really experimenting with how much information the govt or law enforcement can take to ensure the rule of law." - Jochem de Groot

Centre for Internet and Society presents its seventh installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

In this installment, CIS interviews Jochem de Groot. Jochem has worked on the Netherlands government’s agenda to promote Internet freedom globally since 2009. He initiated and coordinated the founding conference of the Freedom Online Coalition in The Hague in December 2011, and advised the Kenyan government on the second Freedom Online event in Nairobi in 2012. Jochem represents the Dutch government in the EU, UN, OSCE and other multilateral fora, and oversees a project portfolio for promoting internet freedom globally.  

 
This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

DSCI Best Practices Meet 2013

by Kovey Coles last modified Jul 26, 2013 08:18 AM
The DSCI Best Practices Meet 2013 was organized on July 12, 2013 at Hyatt Regency, Anna Salai in Chennai. Kovey Coles attended the meet and shares a summary of the happenings in this blog post.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC


Last year's annual Best Practices Meet, sponsored by the Data Security Council of India (DSCI), was held here in Bangalore and featured CIS associates as panelists for an agenda focused mostly on mobility in technology. This year, the event continued in nearby Chennai, where many of India's top stakeholders in cyber security came together at the Hyatt hotel to discuss the modern cyber security landscape. Several of the day's key points emphasized why industry needs to be especially attentive to cyber security today. Early speakers explained that many cyber-attacks are opportunistic attacks on financial institutions, and that these breaches often take months to be discovered, with the discovery usually being made by a third party. For those reasons, it was repeatedly mentioned throughout the day that modern entities must anticipate attacks as inevitable and prepare themselves to respond and successfully bounce back.

Several panelists expanded upon the evolving challenges facing industries and explained why service-based industries continually grow more susceptible to cyber-attack. Representatives from Microsoft, Flextronics, MyEasyDoc and others explained how the technological demands of modern consumers have inadvertently resulted in weaker security. For example, with customers expecting real-time access to data rather than periodic data reports (e.g., financial data reports), industries must now keep their data open, which weakens database security. Overall, the primary challenge faced by the industry was effectively summarized by Microsoft India CSO Ganapathi Subramaniam, who stated that within web services, “Security and usability are inversely proportional.” Essentially, the more convenient a product, the less secure its infrastructure.

Despite discussion of the difficulties facing modern producers and consumers, there were undoubtedly highlights of optimism at the conference. A presentation by event sponsor Juniper Networks shed light on practices which combat cyber-attackers, including rerouting perceived Distributed Denial of Service (DDoS) attacks and fingerprinting suspected hackers through a series of characteristics rather than just IP addresses (these characteristics include browser version, fonts, add-ons, time zone and more). Notably, there was a call for cooperation on all fronts in combating cyber-crime and for public-private partnerships (PPP), and many attendees stood and spoke in favour of civil society's incorporation in the process as well. One speaker, retired Brig. Abhimanyu Ghosh, admirably tore down sector divisions in the face of cyber-security threats, saying, “We all want to secure ourselves. It is not a question of industry versus government, government versus industry. Government needs industry, and industry needs government.”

Finally, a few speakers used their opportunity at the conference to highlight issues related to the rights and responsibilities of both citizens and government on the internet. Nikhil Moro, a scholar at the Hindu Centre for Politics and Public Policy, spoke at length about the urgent problem of laws which undermine freedom of speech and freedom of expression in India, especially online. His talk, which came near the end of the event, stirred the crowd to discussion and helped remind the attendees of the breadth of issues which demand attention in the realm of a growing internet presence.

Interview with Mr. Reijo Aarnio - Finnish Data Protection Ombudsman

by Maria Xynou last modified Jul 19, 2013 01:02 PM
Maria Xynou recently interviewed Mr. Reijo Aarnio, the Finnish Data Protection Ombudsman, at the CIS' 5th Privacy Round Table. View this interview and gain an insight on recommendations for better data protection in India!

Mr. Reijo Aarnio - the Finnish Data Protection Ombudsman - was interviewed on the following questions:

1. What activities and functions does the Finnish data commissioner's office undertake?

2. What powers does the Finnish Data commissioner's office have? In your opinion, are these sufficient? Which powers have been most useful? If there is a lack, what would you feel is needed?

3. How is the office of the Finnish data protection commissioner funded?

4. What is the organizational structure at the Office of the Finnish Data Protection Commissioner and the responsibilities of the key executives?

5. If India creates a Privacy Commissioner, what structure/framework would you suggest for the office?

6. What challenges has your office faced?

7. What is the most common type of privacy violation that your office is faced with?

8. Does your office differ from other EU data protection commissioner offices?

9. How do you think data should be regulated in India?

10. Do you support the idea of co-regulation or self-regulation?

11. How can India protect its citizens' data when it is stored in foreign servers?

CII Conference on "ACT: Achieve Cyber Security Together"

by Kovey Coles last modified Jul 26, 2013 08:17 AM
The Confederation of Indian Industries (CII) organized a conference on facing cyber threats and challenges at Hotel Hilton in Chennai on July 13, 2013. Kovey Coles attended this conference and shares a summary of the event in this blog post.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC


The conference, hosted by CII at the Hotel Hilton, was well attended and featured a range of industry experts, researchers and developers, and members of the Indian armed forces.

Participants focused on the importance of Indian entities reaching new, adequate levels of cyber security. It was stated early in the event that India is one of the world's most targeted countries for cyber-attacks, and that its number of domestic internet users is rapidly increasing in what many view as a new era of international information warfare. Despite this, the speakers considered India to be too far behind other countries in its understanding of cyber security. In the opening remarks, CII Chairman Santhanam implored, "We need hard core techies in this field… we are not producing them." Another speaker, Savitha Kesav Jagadeesan, a practicing lawyer in Chennai, asked whether India would wait for a "9/11 of cyberspace" before establishing the same level of precautionary measures online as now exists in transportation security.

With the presence of both the government's executive forces and private industry, the mood in the conference room was that of a collective Indian defense: a secure nation achieved only when both government and industry are secure. Similar to the previous day's DSCI cyber security conference, many speakers discussed security issues pertinent to the financial and banking industries, and other cyber crimes with pecuniary goals. For people seeking to avoid the array of scams and frauds online, some talks shared some of the most basic advice, like safe password practices. "Passwords are like toothbrushes," said A.S. Murthy of the CDAC, "use them often, never share them with anyone, change them often." Other talks went into the intricacies of various hacking schemes, including tab-nabbing and Distributed Denial of Service (DDoS) attacks, describing their tactics and how to mitigate them.

In the end, the conference had certainly informed the attendees of the goals, and the challenges, that India will face in the coming months and years. All of the speakers showed how quickly the world of cyber security is evolving, and demonstrated the imperative for government and industry entities to evolve their own practices and defenses in stride. The ambitions of several presentations matched the well-publicized "5 lakh cyber professionals in 5 years" plan, placing a strong emphasis on the current and future training of young students in cyber security. Ultimately, I think, the conference helped convince attendees that cyber security is neither a futile concept nor a completely infallible one. As Cisco Vice President Col. K.P.M. Das said towards the end of the evening, the most ideal form of cyber security is truly "all about trust, the ability to recover, and transparency/visibility."

Parsing the Cyber Security Policy

by Chinmayi Arun last modified Jul 22, 2013 06:37 AM
An effective cyber-security policy must keep up with the rapid evolution of technology, and must never become obsolete. The standard-setting and review bodies will therefore need to be very nimble, says Chinmayi Arun.
Parsing the Cyber Security Policy

Image: siliconindia.com


Chinmayi Arun's article was published in the Hoot on July 13, 2013 and later cross-posted in the Free Speech Initiative the same day.


We often forget how vulnerable the World Wide Web leaves us. If walls of code prevent us from entering each other's systems and networks, there are those who can easily pick their way past them or disable essential digital platforms. We are reminded of this by the doings of Anonymous, which carried out a series of attacks, including one on the website run by the Computer Emergency Response Team India (CERT-In), the government agency in charge of cyber-security. Even more serious are cyber-attacks (arguably cyber warfare) carried out by other states using digital weapons such as Stuxnet, the digital worm. More proximate and personal, perhaps, are phishing attacks, which are on the rise.

We therefore run a great risk if we leave air-traffic control, defense resources or databases containing citizens' personal data vulnerable. There is no doubt that efforts towards better cyber-security are needed. A cyber-security policy is meant to address this need and to help manage threats to individuals, businesses and government agencies. We need to examine carefully the government's efforts to handle cyber-security, how effective they are, and whether its actions have too many negative spillovers.

The National Cyber-Security Policy, unveiled last week, is merely a statement of intention in broad terms. Much of  its real impact will be ascertainable only after the language to be used in the law is available. Nevertheless, the scope of the policy remains ambiguous so far, leading to much speculation about the different ways in which it might be intrusive.


One Size Fits All?
The policy covers very different kinds of entities: government agencies, private companies or businesses, non-governmental entities and individual users. These entities may need to be handled differently depending on their nature. Therefore, while direct state action may be most appropriate to secure government agencies’ networks, it may be less appropriate in the context of purely private business.

For example, securing police records would involve the government directly purchasing or developing sufficiently secure technology. However, different private businesses and non-governmental entities may be left to manage their own security. Depending on its size, each entity may be differently placed to acquire sophisticated security systems. A good policy would encourage innovation by those with the capacity for it, while ensuring that others have access to reasonably sound technology, and that they use it. Grey areas might emerge in contexts where a private party manages critical infrastructure.

It will also be important to distinguish between smaller and larger organisations whilst creating obligations. Unless this distinction is made at the implementation stage, start-up businesses and civil society organisations may find requirements such as earmarking a budget for cyber security implementation or appointing a Chief Information Security Officer onerous. Additionally, the policy will need to translate into a regulatory solution that provides under-resourced entities with ready solutions to enable them to make their information systems secure, while encouraging larger entities with greater purchasing power to invest in procuring the best possible solutions.

Race to the Top
Security on the Internet works only if it stays one step ahead of the people trying to break in. An effective cyber-security policy must keep up with the rapid evolution of technology, and must never become obsolete. The standard-setting and review bodies will therefore need to be very nimble.

The policy contemplates working with industry and supporting academic research and development to achieve this. However, the actual manner in which resources are distributed and progress is monitored may make the crucial difference between a waste of public funds and the acquisition of capacity to achieve a reasonable degree of cyber security.

Additionally, the flow of public funds under this policy, particularly to purchase technology, should be examined very carefully to see whether it is justified. For example, if the government chooses to fund (even by way of subsidy) a private company's cyber-security research and development rather than an equivalent public university's endeavour, this decision should be scrutinized to see whether it was necessary. Similarly, if extensive public funds are spent training young people as a capacity-building exercise, we should watch to see how many of these people stay in India and how many leave, such that other countries end up benefiting from the Indian government's investment in them!

Investigation of Security Threats
Although much of the policy focuses on defensive measures that can be taken against security breaches, it is intended not only to cover investigation subsequent to an attack but also to pinpoint ‘potential cyber threats’ so that proactive measures may be taken.

The policy has outlined the need for a ‘Cyber Crisis Management Plan’ to handle incidents that impact ‘critical national processes or endanger public safety and security of the nation’. This portion of the policy will need to be watched closely to ensure that the language used is very narrow and allows absolutely no scope for misinterpretation or misuse that would affect citizens’ rights in any manner.

This caution will be necessary both in view of the manner in which restraints on freedom of speech permitted in the interests of public safety have been flagrantly abused, and because of the kind of paternalistic state intrusion that might be conceived to give effect to this.

Additionally, since the policy also mentions information sharing with internal and international security, defence, law enforcement and other such agencies, it will also be important to find out the exact nature of information to be shared. Of course, how the policy will be put into place will only become clear as the terms governing its various parts emerge. But one hopes the necessary internal direct action to ensure the government agencies’ information networks are secure is already well underway.

It is also to be hoped that the government chooses to take implementation of privacy rights at least as seriously as cyber-security. If some parts of cyber security involve ensuring that user data is protected, the decision about what data needs protection will be important to this exercise.

Additionally, although the policy discusses various enabling and standard-setting measures, it does not discuss the punitive consequences of failure to take reasonable steps to safeguard individuals’ personal data online. These consequences will also presumably form a part of the privacy policy, and should be put in place as early as possible.

You Have the Right to Remain Silent

by Nishant Shah last modified Jul 22, 2013 06:59 AM
Reflecting upon the state of freedom of speech and expression in India, in the wake of the shut-down of the political satire website narendramodiplans.com.

Nishant Shah's column was published in Down to Earth on July 17, 2013.


It took less than a day for narendramodiplans.com, a political satire website that had more than 60,000 hits in the 20 hours of its existence, to be taken down. It was a simple webpage that showed a smiling picture of Narendra Modi, the touted candidate for India's next Prime Ministerial campaign, flashing his now trademark 'V' for Vengeance Victory sign. At first glimpse it looked like another smart media campaign by the net-savvy minister, who has already made use of the social web quite effectively to connect with his constituencies and influence the younger voting population in the country. Below the image of Mr. Modi was a text that said, "For a detailed explanation of how Mr. Narendra Modi plans to run the nation if elected to the house as a Prime Minister and also for his view/perspective on 2002 riots please click the link below." The button, reminiscent of 'sale' signs on shops that offer permanent discounts, promised to reveal, once and for all, the puppy plight of Mr. Modi's politics and his plans for the country that he seeks to lead.

However, when one tried to click on the button, hoping at least for a manifesto that combined the powers of Machiavelli with the sinister beauty of Kafka, it proved to be an impossible task. The button wiggled, and jiggled, and slithered all over the page, running away from the mouse following it. Referencing the layers of evasive answers and the engineered public relations campaigns that try to obfuscate the answers to some of the most pointed questions that have been put to the Modi government through judicial and public forums, the button never stayed still long enough to actually reveal the promised answers. People who are familiar with the history of such political satire and protest online would immediately recognise that this wasn't the most original of ideas. In fact, it was borrowed from another website - http://www.thepmlnvision.com/ - that levelled similar accusations of lack of transparency and accountability on the part of Nawaz Sharif of Pakistan. Another instance, which is now also shut down, had a similar deployment, where the webpage claimed to give a comprehensive view into Rahul Gandhi's achievements, to question his proclaimed intentions of being the next prime minister. In short, this is an internet meme, where a simple web page and a little JavaScript allowed for a critical commentary on the future of the next elections and the strengthening battle between #feku and #pappu that has already taken on epic proportions on Twitter.

The early demise of these two websites (please do note, when you click on the links that the Nawaz Sharif website is still working) warns us of the tightening noose around freedom of speech and expression that politicos are responsible for in India. It has been a dreary last couple of years already, with the passing of the Intermediaries Liabilities Rules as an amendment to the IT Act of India, Dr. Sibal proposing to pre-censor the social web in a quest to save the face of erring political figures, teenagers being arrested for voicing political dissent, and artists being prosecuted for exercising their rights to question the state of governance in our country. Despite battles to keep the web an open space that embodies the democratic potentials and the constitutional rights of freedom of speech and expression in the country, it has been a losing fight to keep up with the ad hoc and dictatorial mandates that seem to govern the web.

We have no indication of why this latest piece of satirical expression, which should be granted immunity as a work of art, if not as an individual’s right to free speech, was suddenly taken down. The website now has a message that says, “I quit. In a country with freedom of speech, I assumed that I was allowed to make decent satire on any politician more particularly if it is constructive. Clearly, I was wrong.” The web is already abuzz with conspiracy theories, each sounding scarier than the other because they seem so plausible and possible in a country that has easily sacrificed our right to free speech and expression at the altar of political egos. And whether you subscribe to any of the theories or not, whether your sympathies lie with the BJP or with the UPA, whether or not you approve of the political directions that the country seems to be headed in, there is no doubt that you should be as agitated as I am, about the fact that we are in a fast-car to blanket censorship, and we are going there in style.

What happens online is not just about this one website or the one person or the one political party – it is a reflection on the rising surveillance and bully state that presumes that making voices (and sometimes people) invisible is enough to resolve the problems that they create. And what happens on the web is soon going to also affect the ways in which we live our everyday lives. So the next time you call some friends over for dinner and then sit arguing about the state of politics in the country, make sure your windows are all shut, you are wearing tin-foil hats and, if possible, direct all conversations to the task of finally finding Mamta Kulkarni. Because anything else that you say might either be censored or land you in a soup, and the only recourse you might have would be a website that shows the glorious political figures of the country, with a sign that says "To defend your right to free speech and expression, please click here". And you know that you are never going to be able to click on that sign. Ever.

CIS Cybersecurity Series (Part 8) - Jeff Moss

by Purba Sarkar last modified Jul 30, 2013 09:25 AM
CIS interviews Jeff Moss, Chief Security Officer for ICANN, as part of the Cybersecurity Series.

"Most consumers don't understand the privacy trade offs when they browse the web... the data that is being collected about them, the analytics that is being run against their buying behaviour, it is invisible... it is behind the scenes... and so it is very difficult for the consumer to make an informed decision." - Jeff Moss, Chief Security Officer, ICANN.

Centre for Internet and Society presents its eighth installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

In this installment, CIS interviews Jeff Moss. Jeff is the chief security officer for ICANN. He founded Black Hat Briefings and DEF CON, two of the most influential information security conferences in the world. In 2009, Jeff was sworn in as a member of the U.S. Department of Homeland Security Advisory Council (DHS HSAC), providing advice and recommendations to the Secretary of the Department of Homeland Security on matters related to domestic security.   

 
This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

Report on the 5th Privacy Round Table meeting

by Maria Xynou last modified Jul 26, 2013 08:24 AM
This report entails an overview of the discussions and recommendations of the fifth Privacy Round Table in Calcutta, on 13th July 2013.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC.


In 2013, the Centre for Internet and Society (CIS) in collaboration with the Federation of Indian Chambers of Commerce and Industry (FICCI), and the Data Security Council of India (DSCI), is holding a series of seven multi-stakeholder round table meetings on “privacy” from April 2013 to October 2013. The CIS is undertaking this initiative as part of their work with Privacy International UK on the SAFEGUARD project.

In 2012, the CIS and DSCI were members of the Justice A.P. Shah Committee which created the "Report of the Group of Experts on Privacy". The CIS has recently drafted a Privacy (Protection) Bill 2013, with the objective of contributing to privacy legislation in India. The CIS has also volunteered to champion the sessions/workshops on "privacy" in the meeting on Internet Governance proposed for October 2013.

At the roundtables the Report of the Group of Experts on Privacy, DSCI´s paper on “Strengthening Privacy Protection through Co-regulation” and the text of the Privacy (Protection) Bill 2013 will be discussed. The discussions and recommendations from the round table meetings will be presented at the Internet Governance meeting in October 2013.

The dates of the seven Privacy Round Table meetings are listed below:

  1. New Delhi Roundtable: 13 April 2013

  2. Bangalore Roundtable: 20 April 2013

  3. Chennai Roundtable: 18 May 2013

  4. Mumbai Roundtable: 15 June 2013

  5. Kolkata Roundtable: 13 July 2013

  6. New Delhi Roundtable: 24 August 2013

  7. New Delhi Final Roundtable and National Meeting: 19 October 2013

Following the first four Privacy Round Tables in Delhi, Bangalore, Chennai and Mumbai, this report entails an overview of the discussions and recommendations of the fifth Privacy Round Table meeting in Kolkata, on 13th July 2013.

Presentation by Mr. Reijo Aarnio – Finnish Data Protection Ombudsman

The fifth Privacy Round Table meeting began with a presentation by Mr. Reijo Aarnio, the Finnish Data Protection Ombudsman. Mr. Aarnio began his presentation by distinguishing privacy from data protection and by emphasizing the need to protect both equally within a legal framework. He proceeded to highlight that 96 percent of the Finnish community believes that data protection is necessary, especially since it is considered to play an essential role in enhancing the self-determination of the individual. Furthermore, Mr. Aarnio pointed out that the right to privacy in Finland is guaranteed under section 10 of the Finnish constitution.

The Finnish Data Protection Ombudsman argued that in order for India to gain European data protection adequacy, the implementation of a regulation for data protection in the country is a necessary prerequisite. Mr. Aarnio argued that although the draft Privacy (Protection) Bill 2013 is a decisive step in regulating the use of data, the interception of communications and surveillance in India, it lacks definitions of the data controller and the data subject, both of which should be legally specified.

In order to support his argument that India needs privacy legislation, the Ombudsman clarified the term “data protection” by stating that it relates to the following:

  • individual autonomy

  • the right to know

  • the right to live without undue interference

  • the right to be evaluated on the basis of correct and relevant information

  • the right to know the criteria automatic decision-making systems are based on

  • the right to trust data security

  • the right to receive assistance from independent authorities

  • the right to be treated in accordance with all other basic rights in a democracy

  • the right to have access to public documents

  • the freedom of speech

In addition to the above, Mr. Aarnio argued that the reason why data protection is important is because it ensures the respect for human dignity, individual autonomy and honor.

The Finnish Data Protection Ombudsman gave a brief overview of the development and history of data protection, citing the oath of Hippocrates, the great revolutions and World War II, throughout which data protection gained increasing significance. Mr. Aarnio pointed out that, as a result of the development and proliferation of technology, societies have evolved and data protection has become a major component of the contemporary Information Society. The Ombudsman stated that in the Information Society, information is money, and open data and big data are products which are being commercialised and commodified. Hence, in order to ensure that human rights are not commercialised and commodified in the process, it is necessary to establish legal safeguards which can prevent potential abuse.

Article 8 of the European Charter of Fundamental Rights guarantees the protection of personal data. Mr. Aarnio argued that the Parliament is the most important data protection authority in Europe and that privacy is legally guaranteed on three levels:

  • Protection of personal life: The Criminal Code (chapter 24) addresses and protects freedom of speech and secrecy regulations

  • Communication: Protection of content and traffic data

  • Data Protection: The Personal Data Act establishes the right to know and to influence the processing of one's data and the right to organise one's personal life, and regulates the automatic processing of personal data and the maintenance of registers

The Ombudsman also referred to Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.

Mr. Aarnio argued that in the contemporary ecosystem of the Information Society, countries need “Privacy by Design”, which entails the description of the processing of personal data and the evaluation of its lawfulness. In particular, the purpose for the collection and processing of data should be legally defined, as well as whether such data will be shared with third parties, disclosed and/or retained. The Ombudsman argued that India needs to define its data controllers and to legally specify their roles, in order to ensure that the management of data does not result in the infringement upon the right to privacy and other human rights.

The Finnish Data Protection Ombudsman concluded his presentation by stating that data security is not only a technological matter, but also – and in some cases, mostly – a legal issue, which is why India should enact the draft Privacy (Protection) Bill 2013.

Discussion of the draft Privacy (Protection) Bill 2013

Chapter I: Definitions

The discussion of the draft Privacy (Protection) Bill 2013 commenced with a debate on whether such a Bill is necessary at all, given that section 43 of the IT Act is considered (by participants at the round table) to regulate the protection of data. It was pointed out that although the rules framed under section 43 of the Information Technology Act provide some protection for data, these rules are inadequate: India currently lacks statutory provisions dealing with data protection, and Parliament does not have the right to vote on or amend rules, which means that it cannot amend the data protection rules framed under the IT Act. Since the rules under section 43 of the IT Act are not subject to parliamentary review, India needs a separate privacy statute. Hence, the round table reached a consensus on discussing the draft Privacy (Protection) Bill 2013.

Personal data is defined in the draft Privacy (Protection) Bill 2013 as any data which relates to a natural person, while sensitive personal data is defined as a subset of personal data, such as biometric data, medical history, sexual preference, political affiliation and criminal history. It was pointed out that race, religion and caste are not included in the Bill's definition for sensitive personal data because the Government of India refuses to acknowledge these types of information as personal data. According to the Government, the collection of such data is routine and there have been no cases in which such data has been breached, which is why race, religion and caste should not be included in the definition for sensitive personal information. However, the last caste census took place in 1931 and there has been no caste census since, because it is considered to be a sensitive issue. This fact, which contradicts the government's position, was pointed out during the round table meeting.

A participant argued that financial information should be included within the definition for sensitive personal data. This was countered by a participant who argued that India has the Credit Information Companies (Regulation) Act, which covers credit information and sets out specific provisions for the protection of credit data by banks and relevant companies. Yet the question of whether general financial information should be included in the definition for sensitive personal data was further discussed, and many participants supported its inclusion in the definition.

The question of whether IP addresses should be included in the definition for personal data was raised. The response to this question was that IP addresses should be included in the definition since they relate to the identification of a natural person. However, the question of whether a specific IP address is considered personal data, as many individuals use the Web through the same IP address, remained unclear. Other participants raised the question of whether unborn humans and deceased persons should have privacy rights. The response to this was that in India, only the court can decide if a deceased person can have the right to privacy.

The controversy between the UID project and the protection of biometric data under the definition for sensitive personal information was discussed in the round table. In particular, it was pointed out that the UID scheme's requirement of mass biometric collection in India contradicts the protection of such data under the Bill. As the UID scheme remains unregulated, it is unclear who will have access to the biometric data, with whom it will be shared, whether it will be disclosed and retained and, if so, for how long. All the questions which revolve around the implementation of the UID scheme and the use of the biometric data collected raise concerns as to the extent to which such data can realistically be protected under privacy legislation.

On this note, a participant mentioned that under EU regulation, an ID number is included in the definition for sensitive personal information, and it was recommended that the same be added to India's draft Privacy (Protection) Bill 2013. Furthermore, a participant recommended that fingerprints also be included in the definition for sensitive personal data, especially in light of the NPR and UID schemes.

A participant argued that passwords should also be included in the definition for sensitive personal data, as well as private keys which are used for encryption and decryption. It was pointed out that section 69 of the IT Act requires the disclosure of encryption keys upon request from authorities, which can potentially lead to the violation of privacy and other human rights. Hence the significance of protecting passwords and encryption keys which safeguard data was strongly emphasized, and it was argued that they should definitely be included in the definition for sensitive personal data. This position was countered by a participant who argued that the Government of India should have access to private encryption keys for national security purposes.

On the definition of sensitive personal data, it was emphasized that this term should relate to all data which can be used for discrimination, which is why it needs to be protected. It was further emphasized that it took Europe twelve years to reach a definition for personal data, which is why India still needs to look at the issue in depth and consider all the possible violations which may potentially arise from the non-regulation of various types of data. Most participants agreed that financial information, passwords and private encryption keys should be added to the definition for sensitive personal data.

The fifth round table entailed a debate on whether political affiliation should be included in the definition for sensitive personal data. In particular, one participant argued that political parties disclose the names of their members and that in many cases they are required to do so in order to show their sources of income. Hence, it was argued that political affiliation should not be included in the definition for sensitive personal data, since it is not realistic to expect political parties to protect their members' privacy. This was countered by other participants who argued that anonymity in political communications is important, especially when an individual is in a minority position, which is why political affiliation should be included in the definition for sensitive personal data.

The discussion on the definitions in the draft Privacy (Protection) Bill 2013 concluded with comments that the definition for surveillance excludes many types of surveillance. In particular, it was argued that the definition does not appear to cover artificial intelligence, screen shots and various other forms of surveillance, all of which should be regulated.

Chapter II: Right to Privacy

Section 4 of the draft Privacy (Protection) Bill 2013 states that all natural persons have a right to privacy, while section 5 of the Bill sets out exemptions to that right. On this note, it was pointed out during the round table that there is no universal definition of privacy, which makes the term challenging to define and to regulate. Furthermore, the rapid pace at which technology is proliferating was emphasized, along with its impact on the right to privacy. For example, it was mentioned that emails were not covered by privacy legislation in the past, and that legislation needs to be amended accordingly. The European Data Protection Directive was established in 1995 and does not regulate many privacy issues which arise through the Internet, which is why it is currently being reviewed. Similarly, it was argued that privacy legislation in India should encompass provisions for potential data breaches which may occur through the Internet and various forms of technology.

A participant argued that the draft Privacy (Protection) Bill 2013 should include provisions for data subjects which enable them to exercise their rights. In particular, it was argued that data subjects should have the right to access information collected and retained about them and the right to make corrections. The response to this comment was that the Bill may be split into two separate Bills, one of which would regulate data protection and the other the interception of communications and surveillance, while the data subject would be addressed extensively. Furthermore, participants raised questions about how to define the data controller and the data subject within the Indian context.

Other questions raised during the round table included whether spam should be addressed by the Bill. Several participants argued that spam should not be regulated, as it is not necessarily harmful to data subjects. Other participants argued that the issue of access to data should be addressed prior to the definition of privacy. Another argument was that commercial surveillance should not be conducted without restrictions, which is why it should not be included in the exemptions to the right to privacy. It was also pointed out that residential surveillance should be allowed, as long as the cameras are pointed inwards and do not capture footage of third parties outside the residence. On this note, it was argued that surveillance in the workplace should also be exempted from the right to privacy, as the workplace too can be considered the private property of the owner. Moreover, it was emphasized that the surveillance of specific categories of people should also be excluded from the exemptions to the right to privacy.

A participant argued that in some cases, NGOs may be collecting information for some “beneficial purpose” and that such cases should be excluded from the exemptions to the right to privacy. Other participants argued that in many cases, data needs to be collected for market research and that the Bill should regulate what applies in such cases. All such arguments were countered by a participant who argued that section 5 of the Bill on the exemptions to the right to privacy should be deleted, as it creates too many complications. This recommendation was backed up by the example of a husband capturing a photograph of his wife and then publishing the image without her consent.

During this discussion, a participant raised the question of to what extent the right to privacy applies to minors. This question was supported by the example of Facebook, where many minors have profiles but the extent to which this data is protected remains ambiguous. Furthermore, it was pointed out that it remains unclear whether privacy legislation can practically safeguard minors who choose to share their data online. A participant responded to these concerns by stating that Facebook is a data controller and has to comply with privacy law to protect its customers' data. It was pointed out that it does not matter if the data controller is a company or an NGO; in every case, the data controller is obliged to comply with data protection law and regulations.

Furthermore, it was pointed out that Facebook allows minors aged 13 and above to create a profile, while it remains unclear how minors can enforce their privacy rights. In particular, it remains unclear how the mediated collection of minors' data can be regulated, and it was recommended that this be addressed by the Bill. A participant replied that Indian laws rule in favour of minors, but that this simultaneously remains a grey area. In particular, it was pointed out that rules under section 43 of the Information Technology (IT) Act cover Internet access by minors, but this still remains an unclear area which needs further debate and analysis.

The question which prevailed at the end of the discussion of Chapter 2 of the Bill concerned social media and minors, and how minors' data can be protected when it is published immediately through social media platforms such as Facebook. Furthermore, it was recommended that the Bill address the practical operationalisation of the right to privacy within the Indian context.

Chapter III: Protection of Personal Data

The discussion of Chapter 3 of the draft Privacy (Protection) Bill 2013 on the protection of personal data commenced with a reference to the nine privacy principles of the Justice A.P. Shah Committee. The significance of the principles of notice and consent was outlined, as it was argued that individuals should have the right to be informed about the data collected about them, as well as the right to access such data and make corrections where necessary.

Collection of Personal Data

The discussion on the collection of personal data (as outlined in Section 6 of Chapter 3 of the Bill) commenced with a participant arguing that a company seeking to collect personal data should always have a stated function. In particular, a company selling technological products or services should not collect biometric data, for example, unless it serves a specified function. It was pointed out that data collection should be restricted to the specified purposes. For example, a hospital should be able to collect medical data because it relates to its stated function, but an online company which provides services should not be eligible to collect such data, as it deviates from its stated function.

During the discussion, it was emphasized that individuals should have the right to be informed when their data is being collected, which data is being collected, the conditions for the disclosure of such data and everything else that revolves around the use of their data once it has been collected. However, a participant questioned whether it is practically feasible for individuals to provide consent to the collection of their data every time it is collected, especially since the privacy policies of companies keep changing. Moreover, it was questioned whether companies can or should presume the consent of their customers once their privacy policy has changed. On this note, a participant argued that companies should be obliged to notify their customers every time their privacy policy changes and every time the purpose behind their data collection changes.

On the issue of consent for data collection, a participant argued that individuals should have the right to withdraw their consent, even after their data has been collected and in such cases, such data should be destroyed. This was countered by another participant who argued that it is not realistic to expect companies to acquire individual consent every time the purpose behind data collection changes, nor is it feasible to allow for the withdrawal of consent without probable cause.

The issue of indirect consent to the collection of personal data was raised and, in particular, several participants argued that the Bill should have provisions which would regulate circumstances where indirect consent can be obtained for the collection of personal data. Furthermore, it was emphasized that the Bill should also include a notice for all potential purposes of data collection which may arise in the future; if the purpose for data collection changes based on conditions specified, then companies should not be mandated to notify individuals. Moreover, a participant argued that the Bill should include provisions which would enable individuals to opt-in and/or opt-out from data collection.

On the issue of consent, it was further outlined that consent provides a legitimate ground for processing data and that the data subject should have the right to be informed prior to the collection of his or her data. However, it was emphasized that the draft Privacy (Protection) Bill 2013 is a very strict regulation, as consent cannot always be acquired prior to data collection; there are many cases where this is not practically feasible. It was pointed out that the European Data Protection Directive recognises that consent cannot always be acquired prior to data collection. The example of medical cases was mentioned, as patients may not always be capable of providing consent to data collection which may be necessary.

In particular, it was highlighted that the European Data Protection Directive includes provisions for the processing of personal data, as well as exceptions for when consent is not required prior to data collection. The Directive recognises the legitimate interest of the data controller, and data processing is based upon the provisions of privacy legislation. The outsourcing of data is regulated in the European Union, and it was recommended that India regulate it too. Following this comment, it was stated that the recent leaks on the NSA's surveillance raise the issues of non-consensual state collection of data and non-consensual private disclosure of data, and a brief debate in the round table revolved around these issues.

On the issue of mediated data collection, the situations in which collected data is mediated by third parties were analysed. It was recommended that the law be flexible enough to address the various types of cases where collected data is mediated, such as when a guardian needs to handle and take decisions about the data of a mentally disabled person being collected. However, it was pointed out that mediated data collection should be addressed sectorally, as a doctor, for example, would address mediated data in a different manner than a company. It was emphasized that specific cases – such as a parent taking a mediated decision on the collection of his or her child's data – should be enabled, whereas all other cases should be prohibited. Thus it was recommended that language addressing the mediated collection of data be included in the Bill.

A participant raised the question of whether there should be separate laws for the private collection of data and state collection of data; it was mentioned that this is the case in Canada. Another question raised was what happens when state collectors hire private contractors. The UID was brought up as an example of state collection of data in which private contractors have been hired and are involved in the process of data collection. This could potentially enable the collection of, and access to, data by unauthorised third parties to which individuals may not have given their consent. Thus it was strongly recommended that the Bill address such cases and prevent the unauthorised collection of and access to data.

The discussion on the collection of personal data ended with an interesting test case study for privacy: should the media have the right to disclose individuals' personal data? A debate revolved around this question and participants recommended that the Bill regulates the collection, processing, sharing, disclosure and retention of personal data by the media.

Retention of Personal Data

The discussion on the retention of personal data commenced with the statement that there are various exceptions to the retention of data in India, which are outlined in various court cases. It was pointed out that data should be retained in compliance with the law, but this is problematic as, on various occasions, a verbal order by a policeman can be considered adequate, which can potentially increase the probability of abuse. A question which was raised was whether an Act of Parliament should allow for the long-term storage of data, especially when there is inadequate data to support its long-term retention. It was pointed out that in some cases there are laws which allow for the storage of data for up to ten years, without the knowledge – let alone the consent – of the individual. Thus, the issue of data retention in India remains vague and should be addressed by the draft Privacy (Protection) Bill 2013.

Questions were raised on the duration of data retention periods and on whether there should be one general data retention law or several sectoral data retention laws. The participants disagreed on whether an Act of Parliament should regulate data retention or whether it should be regulated by sectoral authorities. A participant recommended “privacy by design” and stated that the question of data retention should be addressed by data controllers. Other participants raised the question of purpose limitation, especially for cases where data is re-retained after the end of its retention period. A participant recommended that requirements for the anonymisation of data once it has exceeded its retention period be established. However, this proposal was countered by participants who argued that the practical enforcement of the anonymisation of retained data is not feasible within India.
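To illustrate what anonymisation at the end of a retention period could look like in practice, the sketch below replaces direct identifiers with keyed pseudonyms and relies on destroying the key to break the link back to individuals. This is only one illustrative approach, not something prescribed by the Bill or discussed at the round table, and the record fields used here are hypothetical.

```python
import hashlib
import hmac
import secrets

def pseudonymise(records, identifier_fields, key):
    """Replace direct identifier fields with keyed pseudonyms (HMAC-SHA256)."""
    cleaned_records = []
    for record in records:
        cleaned = dict(record)
        for field in identifier_fields:
            value = str(cleaned.get(field, "")).encode("utf-8")
            cleaned[field] = hmac.new(key, value, hashlib.sha256).hexdigest()
        cleaned_records.append(cleaned)
    return cleaned_records

# Hypothetical records whose retention period has lapsed.
key = secrets.token_bytes(32)  # kept only for as long as re-linking remains lawful
expired = [{"name": "A. Subject", "phone": "9999999999", "city": "Kolkata"}]
print(pseudonymise(expired, ["name", "phone"], key))
# Destroying `key` afterwards removes the ability to re-identify the records.
```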

Destruction of Personal Data

The retention of personal data can be prevented once the data has been destroyed. However, participants argued that various types of data are collected through surveillance products which are controlled by private parties. In such cases, it was argued, it remains unclear how it can be verified that the data has indeed been destroyed.

A participant argued that the main problem with data destruction is that even if data has been deleted, it can still be recovered unless it has been thoroughly overwritten (the participant referred to the common claim that up to seven overwrite passes are needed); thus the question which arises is how individuals can know whether their data has been permanently destroyed or whether it is being secretly retrieved. Questions were raised on how the permanent retention of data can be prevented, especially when even deleted data can be recovered. Hence it was recommended that information security experts cooperate with data controllers and the Privacy Commissioner to ensure that data is permanently destroyed and/or that data is not accessed after the end of its retention period. Such experts would verify that data is actually being destroyed.

Another participant pointed out the difference between the wiping of data and the deletion of data. In particular, the participant explained that when data is merely deleted, only the reference to it is removed and the underlying data can potentially be recovered until it is overwritten by other data. Wiping, on the other hand, involves overwriting the data itself so that it can never be recovered. The participant recommended that the Bill explicitly state that data must be wiped, in order to ensure that data is not being indirectly retained.
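The distinction can be made concrete with a small sketch (an illustration only, not language from the Bill): ordinary deletion merely removes the file system's reference to the data, while a wipe overwrites the bytes before the file is removed. Even so, on SSDs and journaling or copy-on-write file systems an in-place overwrite may not physically erase every copy, which is one reason independent verification of destruction was recommended above.

```python
import os

def wipe_file(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents with random bytes, then unlink it.

    Caveat: on SSDs and journaling/copy-on-write file systems, overwriting in
    place does not guarantee that older physical copies of the data are gone.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite down to the storage device
    os.remove(path)

# Ordinary deletion, by contrast, only drops the directory entry;
# the underlying blocks may remain recoverable until they are reused:
# os.remove(path)
```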

Processing of Personal Data

The discussion on the processing of personal data began with the question of national archives. In particular, participants argued that if the processing of data is strictly regulated, that would restrict access to national archives, and the draft Privacy (Protection) Bill 2013 should address this issue.

Questions were raised on the non-consensual processing of personal data and on how individual consent should be acquired prior to the processing of personal data. It was pointed out that the Article 29 Working Party has published an Opinion on purpose limitation with regard to data processing, and it was recommended that a similar approach be adopted in India.

Furthermore, it was stated that IT companies process data from the EU and the U.S., but it remains unclear how individual consent can be obtained in such cases. A debate ensued on how to bind foreign data processors to meet Indian data protection requirements, as a minimum prerequisite to ensure that outsourced data is not breached. In light of the Edward Snowden leaks on NSA surveillance, many questions were raised on how Indian data outsourced and stored abroad can be protected.

It was highlighted during the round table that all data processing in India requires certification, but since the enforceability of the contracts relies on individuals, this raises issues of data security. Moreover, questions were raised on how Indian companies can protect the data of their foreign data subjects. Thus, it was recommended that the processing of data is strictly regulated through the draft Privacy (Protection) Bill 2013 to ensure that outsourced data and data processed in the country is not breached.

Security of Personal Data

On the issue of data security, the participants argued that the data subject should always be informed in cases when the confidentiality of their personal data is violated. Confidentiality is usually contractually limited, whereas secrecy is not, which is why both terms are included in the draft Privacy (Protection) Bill 2013. In particular, secrecy is usually used for public information, whereas confidentiality is not.

Participants argued that the Bill should include restrictions on the media, in order to ensure that the confidentiality and integrity of their sources' data is preserved. Several participants stated that the Bill should also include provisions for whistleblowers which would provide security and confidentiality for their data. The participants of the round table engaged in a debate on whether the media should be strictly regulated in order to ensure the confidentiality of their sources' data. On the one hand, it was argued that numerous data breaches have occurred as a result of the media mishandling their sources' data. On the other hand, it was stated that all duties of secrecy are subject to the public interest, which is why the media reports on them and which is why the media should not be restricted.

Disclosure of Personal Data

The discussion on the disclosure of personal data commenced with participants pointing out that the draft Privacy (Protection) Bill 2013 does not include requirements for consent prior to the disclosure of personal data, which may potentially lead to abuse. Questions were raised on the outsourcing of Indian data abroad and on the consequences of its foreign disclosure. Once data is outsourced, it remains unclear how the lawful disclosure or non-disclosure of data can be preserved, which is why it was recommended that the Bill addresses such issues.

A participant argued that there is a binding relationship between the data controller and the data subject and that disclosure should be regulated on a contractual level. Another participant raised the question of enforcement: How can regulations on the disclosure of personal data be enforced? The response to this question was that the law should focus on the data controller and that when Indian data is being outsourced abroad, the Indian data controller should ensure that the data subjects' data is not breached. However, other participants raised the question of how data can be protected when it is outsourced to countries where the rule of law is not strong and when the country is considered inadequate in terms of data protection.

With an increased transnational flow of information, questions arise on how individuals can protect their information. A participant recommended that it should be mandatory for companies to state in their contracts who they are outsourcing data to and whether such data will be disclosed to third parties. However, this proposal was countered by a participant who argued that even if this were enforced, it is still not possible to enforce the rights of an Indian data subject in a country which does not have a strong rule of law or which generally has weak legislation. A specific example was mentioned, of Infosys and Wipro Singapore having a contractual agreement under which Indian data is outsourced. It was pointed out that if such data is breached, it remains unclear whether the individual should address the issue to Wipro India, which law should apply in that case, and whether the companies should be liable.

A participant suggested that the data controller may disclose data without having acquired prior consent if the Government of India requests it. However, this was countered by a participant who argued that even in such a case, the question of regulating access to data still remains. Other participants argued that the Right to Information Act has been misused and that too much information is currently being disclosed. It was recommended that the Right to Information Act be amended and that the Bill include strict regulations for the disclosure of personal data.

Meeting Conclusion

The fifth Privacy Round Table meeting commenced with a presentation on privacy and data protection by Mr. Reijo Aarnio, the Finnish Data Protection Ombudsman, and proceeded with a discussion of the draft Privacy (Protection) Bill 2013. The participants engaged in a heated debate and provided recommendations for the definitions used in the Bill, as well as for the regulation of data protection. The recommendations for the improvement of the draft Privacy (Protection) Bill 2013 will be considered and incorporated in the final draft.

Privacy Round Table, Delhi

by Prasad Krishna last modified Aug 12, 2013 10:42 AM

Invite-Delhi.pdf — PDF document, 1157 kB (1185675 bytes)

More than a Hundred Global Groups Make a Principled Stand against Surveillance

by Elonnai Hickok last modified Jul 31, 2013 02:26 PM
For some time now there has been a need to update understandings of existing human rights law to reflect modern surveillance technologies and techniques.

Nothing could demonstrate the urgency of this situation more than the recent revelations confirming the mass surveillance of innocent individuals around the world.

To move toward that goal, today we’re pleased to announce the formal launch of the International Principles on the Application of Human Rights to Communications Surveillance. The principles articulate what international human rights law – which binds every country across the globe – requires of governments in the digital age. They speak to a growing global consensus that modern surveillance has gone too far and needs to be restrained. They also give benchmarks that people around the world can use to evaluate and push for changes in their own legal systems.

The product of over a year of consultation among civil society, privacy and technology experts, including the Centre for Internet and Society (read here, here, here and here), the principles have already been co-signed by over a hundred organisations from around the world. The process was led by Privacy International, Access, and the Electronic Frontier Foundation.

The release of the principles comes on the heels of a landmark report from the United Nations Special Rapporteur on the right to Freedom of Opinion and Expression, which details the widespread use of state surveillance of communications, stating that such surveillance severely undermines citizens’ ability to enjoy a private life, freely express themselves and enjoy their other fundamental human rights. And recently, the UN High Commissioner for Human Rights, Navi Pillay, emphasised the importance of applying human rights standards and democratic safeguards to surveillance and law enforcement activities.

"While concerns about national security and criminal activity may justify the exceptional and narrowly-tailored use of surveillance programmes, surveillance without adequate safeguards to protect the right to privacy actually risk impacting negatively on the enjoyment of human rights and fundamental freedoms," Pillay said.

The principles, summarised below, can be found in full at necessaryandproportionate.org. Over the next year and beyond, groups around the world will be using them to advocate for changes in how present laws are interpreted and how new laws are crafted.

We encourage privacy advocates, rights organisations, scholars from legal and academic communities, and other members of civil society to support the principles by adding their signature.

To sign, please send an email to [email protected], or visit https://www.necessaryandproportionate.org/about

Summary of the 13 principles

  • Legality: Any limitation on the right to privacy must be prescribed by law.
  • Legitimate Aim: Laws should only permit communications surveillance by specified State authorities to achieve a legitimate aim that corresponds to a predominantly important legal interest that is necessary in a democratic society.
  • Necessity: Laws permitting communications surveillance by the State must limit surveillance to that which is strictly and demonstrably necessary to achieve a legitimate aim.
  • Adequacy: Any instance of communications surveillance authorised by law must be appropriate to fulfill the specific legitimate aim identified.
  • Proportionality: Decisions about communications surveillance must be made by weighing the benefit sought to be achieved against the harm that would be caused to users’ rights and to other competing interests.
  • Competent judicial authority: Determinations related to communications surveillance must be made by a competent judicial authority that is impartial and independent.
  • Due process: States must respect and guarantee individuals' human rights by ensuring that lawful procedures that govern any interference with human rights are properly enumerated in law, consistently practiced, and available to the general public.
  • User notification: Individuals should be notified of a decision authorising communications surveillance with enough time and information to enable them to appeal the decision, and should have access to the materials presented in support of the application for authorisation.
  • Transparency: States should be transparent about the use and scope of communications surveillance techniques and powers.
  • Public oversight: States should establish independent oversight mechanisms to ensure transparency and accountability of communications surveillance.
  • Integrity of communications and systems: States should not compel service providers, or hardware or software vendors to build surveillance or monitoring capabilities into their systems, or to collect or retain information.
  • Safeguards for international cooperation: Mutual Legal Assistance Treaties (MLATs) entered into by States should ensure that, where the laws of more than one State could apply to communications surveillance, the available standard with the higher level of protection for users should apply.
  • Safeguards against illegitimate access: States should enact legislation criminalising illegal communications surveillance by public and private actors.

The Audacious ‘Right to Be Forgotten’

by Kovey Coles last modified Jul 31, 2013 10:08 AM
There has long been speculation over the permanence of our online presence. Posting excessively personal details, commenting in a way that later proves embarrassing, being caught in unflattering public photos: to our chagrin, all of these unfortunate situations often persist on the web, and can continue to haunt us in future years.

Perhaps less dire, what if someone decides that she no longer wants the history of her internet action stored in online systems?

So far, there has been confusion over what should be done, and what realistically can be done about this type of permanent presence on a platform as complex and international in scope as the internet. But now, the idea of a right to be forgotten may be able to define the rights and responsibilities in dealing with unwanted data.

The right to be forgotten is an interesting and highly contentious concept currently being debated in the new European Union Data Protection Regulations.[1]

The Data Protection Regulation Bill was proposed in 2012 by EU Commissioner Viviane Reding and stands to replace the EU’s previous Data Protection law, which was enacted in 1995. Referred to as the “right to be forgotten” (RTBF), article 17 of the proposal would essentially allow an EU citizen to demand that service providers “take all reasonable steps” to remove his or her personal data from the internet, as long as there is no “legitimate” reason for the provider to retain it.[1] Despite the evident emphasis on personal privacy, the proposition is surrounded by controversy and is facing resistance from many parties. Apparently, there is a range of concerns over the ramifications RTBF could bring.

Not only are major IT companies staunchly opposed to the daunting task of being responsible for the erasure of data floating around the web, but governments like the United States and even Great Britain are objecting to the proposal as well.[2],[3]

From a commercial aspect, IT companies and US lobbying forces view the concept of RTBF as a burden and a waste of resources for service providers to implement. Largely due to the RTBF clause, the new EU Data Protection proposal as a whole has witnessed intense, “unprecedented” lobbying by the largest US tech companies and US lobby groups.[4],[5] From a different angle, there are those like Great Britain, whose grievances with the RTBF lie in its overzealous aim and insatiable demands.[2] There are doubts as to whether a company will even be able to track down and erase all forms of the data in question. The British Ministry of Justice stated, "The UK does not support the right to be forgotten as proposed by the European commission. The title raises unrealistic and unfair expectations of the proposals."[2] Many experts share these feasibility concerns. The Council of European Professional Informatics Societies (CEPIS) wrote a short report on the ramifications of cloud computing practices in 2011, in which it confirmed, “It is impossible to guarantee complete deletion of all copies of data. Therefore it is difficult to enforce mandatory deletion of data. Mandatory deletion of data should be included into any forthcoming regulation of Cloud Computing services, but still it should not be relied on too much: the age of a ‘Guaranteed complete deletion of data’, if it ever existed has passed."[6]

Feasibility aside, the most compelling issue in the debate over RTBF is the demanding challenge of balancing and prioritizing parallel rights. When it comes to forced data erasure, conflicts between the right to be forgotten and freedom of speech and expression easily arise. Which right takes precedence over the other?

Some RTBF opponents fear that RTBF will hinder freedom of speech. They have a valid point. What is the extent of personal data erasure? Abuse of RTBF could result in some strange, Orwellian cyberspace where the mistakes or blemishes of society are all erased or constantly amended, and only positivity fills the internet. There are reasonable fears that a chilling effect may come into play once providers face the hefty noncompliance fines of the Data Protection law, and begin to automatically opt for customer privacy over considerations for freedom of expression. Moreover, what safeguards may be in place to prevent politicians or other public figures from removing bits of unwanted coverage?

Although these examples are extreme, considerations like these need to be made in the development of this law. With the amount of backlash from various entities, it is clear that a concept like the right to be forgotten could not exist as a simple, generalized law. It needs refinement.

Still, the concept of a RTBF is not without its supporters. Viktor Mayer-Schönberger, professor of Internet Governance at Oxford Internet Institute, considers RTBF implementation feasible and necessary, saying that even if it is difficult to remove all traces of an item, "it might be in Google's back-up, but if 99% of the population don't have access to it you have effectively been deleted."[7] Additionally, he claims that the undermining of freedom of speech and expression is "a ridiculous misstatement."[7] To him, the right to be forgotten is tied intricately to the important and natural process of forgetting things of the past.

Moreover, the Data Protection Regulation does mention certain exceptions for the RTBF, including protection for "journalistic purposes or the purpose of artistic or literary expression." [1] The problem, however, is the seeming contradiction between the RTBF and its own exceptions. In practice, it will be difficult to reconcile the powers granted by the RTBF with the limitations claimed in other sections of the Data Protection Regulation.

Currently, there are a few clean and straightforward implementations of RTBF. One would be the removal of mined user data which has been accumulated by service providers. Here, invoking the right would be possible once a person has deleted accounts or canceled contracts with a service (thereby fulfilling the notion that the service no longer has "legitimate" reason to retain the data). Another may be the case of personal data given by minors who later want their data removed, which is an important example mentioned in Reding’s original proposal.[4] These narrow cases are some of the only instances where RTBF may be used without fear of interference with other social rights. Broader implementations of the RTBF concept, in its current unrefined form, may create too many conflicts with other freedoms, especially freedom of expression.

Overall, the Right to Be Forgotten is a noble concept, born out of concern for the citizen being overpowered by the internet. As an early EU publication states, "The [RTBF] rules are about empowering people, not about erasing past events or restricting the freedom of the press."[8] But at this point, too many clear details seem to be lacking from the draft design of the RTBF. There is concern that without proper deliberation, the concept could lead to unforeseen and undesirable outcomes. Privacy is a fundamental right that deserves to be protected, but policy makers cannot blindly follow the ideals of one right to the point where it interferes with other aspects of society.

Fortunately, recent amendment proposals have attempted some refinement of the bill. Jeffrey Rosen writes in the Stanford Law Review about a certain key concept that could help legitimize the right, namely an amendment proposing that only personally contributed data may be rescinded.[9] This would help avoid interference with others’ rights to expression, and provide limitations on the extent of right to be forgotten claims. As Leslie Harris, president of the Center for Democracy and Technology wrote in the Huffington Post, amendments are needed which can specifically define personal data in the RTBF sense; thereby distinguishing which type of data is allowed to be removed.[10] In the upcoming months, the European Parliament will be considering such amendments to the proposal. This time will be crucial as it will determine if the development of the right to be forgotten will make it a viable option for the EU’s 500 million citizens.

But even after terms are defined and safeguards are established, this underlying philosophical question remains:

Should a person be able to reclaim the right to privacy after willingly giving it up in the first place?

The RTBF is obviously a contentious topic, one which may need to be gauged individually by nation states; it will soon be revealed whether the EU becomes the first to adopt the right. If the RTBF fails to pass in the European Parliament, I would hope that it at least serves to remind people of the permanence of the data which they add to the internet, further incentivizing careful consideration of what one yields to the web. Rights frequently evolve and expand to meet societal or technological advances. If we are to expand the concept of privacy, however, then we must do so with proper consideration, so that privacy may not gain disproportionate power over other rights, or vice versa.


[1]. http://bit.ly/WSZvHv

[2]. http://bit.ly/YxKaNJ

[3]. http://tcrn.ch/YdH82f

[4]. http://bit.ly/196E8qj

[5]. http://bit.ly/wJKWTZ

[6]. http://bit.ly/15aoknF

[7]. http://bit.ly/Z3JbRU

[8]. http://bit.ly/xfodhI

[9]. http://bit.ly/13uyda5

[10]. http://huff.to/16P2XIS

India's National Cyber Security Policy in Review

by Jonathan Diamond last modified Jul 31, 2013 10:40 AM
Earlier this month, the Department of Electronics and Information Technology released India’s first National Cyber Security Policy. Years in the making, the Policy sets high goals for cyber security in India and covers a wide range of topics, from institutional frameworks for emergency response to indigenous capacity building.

What the Policy achieves in breadth, however, it often lacks in depth. Vague, cursory language ultimately prevents the Policy from being anything more than an aspirational document. In order to translate the Policy’s goals into an effective strategy, a great deal more specificity and precision will be required.

The Scope of National Cyber Security

Where such precision is most required is in definitions. Having no legal force itself, the Policy arguably does not require the sort of legal precision one would expect of an act of Parliament, for example. Yet the Policy deals in terms plagued with ambiguity, cyber security not the least among them. In forgoing basic definitions, the Policy fails to define its own scope, and as a result it proves remarkably broad and arguably unfocused.

The Policy’s preamble comes close to defining cyber security in paragraph 5 when it refers to "cyber related incident[s] of national significance" involving "extensive damage to the information infrastructure or key assets…[threatening] lives, economy and national security." Here at least is a picture of cyber security on a national scale, a picture which would be quite familiar to Western policymakers: computer security practices "fundamental to both protecting government secrets and enabling national defence, in addition to protecting the critical infrastructures that permeate and drive the 21st century global economy."[*] The paragraph 5 definition of sorts becomes much broader, however, when individuals and businesses are introduced, and threats like identity theft are brought into the mix.

Here the Policy runs afoul of a common pitfall: conflating threats to the state or society writ large (e.g. cyber warfare, cyber espionage, cyber terrorism) with threats to businesses and individuals (e.g. fraud, identity theft). Although both sets of threats may be fairly described as cyber security threats, only the former is worthy of the term national cyber security. The latter would be better characterized as cyber crime. The distinction is an important one, lest cyber crime be “securitized,” or elevated to an issue of national security. National cyber security has already provided the justification for the much decried Central Monitoring System (CMS). Expanding the range of threats subsumed under this rubric may provide a pretext for further surveillance efforts on a national scale.

Apart from mission creep, this vague and overly broad conception of national cyber security risks overwhelming an as yet underdeveloped system with more responsibilities than it may be able to handle. Where cyber crime might be left up to the police, its inclusion alongside true national-level cyber security threats in the Policy suggests it may be handled by the new "nodal agency" mentioned in section IV. Thus clearer definitions would not only provide the Policy with a more focused scope, but they would also make for a more efficient distribution of already scarce resources.

What It Gets Right

Definitions aside, the Policy actually gets a lot of things right — at least as an aspirational document. It certainly covers plenty of ground, mentioning everything from information sharing to procedures for risk assessment / risk management to supply chain security to capacity building. It is a sketch of what could be a very comprehensive national cyber security strategy, but without more specifics, it is unlikely to reach its full potential. Overall, the Policy is much of what one might expect from a first draft, but certain elements stand out as worthy of special consideration.

First and foremost, the Policy should be commended for its commitment to “[safeguarding] privacy of citizen’s data” (sic). Privacy is an integral component of cyber security, and in fact other states’ cyber security strategies have entire segments devoted specifically to privacy. India’s Policy could stand to be more specific as to the scope of these safeguards, however. Does the Policy aim primarily to safeguard data from criminals? Foreign agents? Could it go so far as to protect user data even from its own agents? Indeed this commitment to privacy would appear at odds with the recently unveiled CMS. Rather than merely paying lip service to the concept of online privacy, the government would be well advised to pass legislation protecting citizens’ privacy and to use such legislation as the foundation for a more robust cyber security strategy.

The Policy also does well to advocate “fiscal schemes and incentives to encourage entities to install, strengthen and upgrade information infrastructure with respect to cyber security.” Though some have argued that such regulation would impose inordinate costs on private businesses, anyone with a cursory understanding of computer networks and microeconomics could tell you that “externalities in cybersecurity are so great that even the freest free market would fail”—to quote expert Bruce Schneier. In less academic terms, a network is only as strong as its weakest link. While it is true that many larger enterprises take cyber security quite seriously, small and medium-sized businesses either lack immediate incentives to invest in security (e.g. no shareholders to answer to) or more often lack the basic resources to do so. Some form of government transfer for cyber security related investments could thus go a long way toward shoring up the country’s overall security.

The Policy also “[encourages] wider usage of Public Key Infrastructure (PKI) within Government for trusted communication and transactions.” It is surprising, however, that the Policy does not mandate the usage of PKI. In general, the document provides relatively few details on what specific security practices operators of Critical Information Infrastructure (CII) can or should implement.
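To make the PKI recommendation concrete, the following is a minimal illustrative sketch (not drawn from the Policy) of signing and verifying a message with an RSA key pair using the third-party Python `cryptography` package. In a full PKI deployment the public key would additionally be bound to the sender's identity through a certificate issued by a certification authority.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a key pair; in a real PKI the public key would be certified by a CA.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"Official communication to be authenticated"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

# The sender signs the message with the private key.
signature = private_key.sign(message, pss, hashes.SHA256())

# The recipient verifies the signature with the sender's public key.
try:
    public_key.verify(signature, message, pss, hashes.SHA256())
    print("Signature valid: the message is authentic and unmodified.")
except InvalidSignature:
    print("Signature invalid: the message or signature has been tampered with.")
```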

Where It Goes Wrong

One troubling aspect of the Policy is its ambiguous language with respect to acquisition policies and supply chain security in general. The Policy, for example, aims to “[mandate] security practices related to the design, acquisition, development, use and operation of information resources” (emphasis added). Indeed, section VI, subsection A, paragraph 8 makes reference to the “procurement of indigenously manufactured ICT products,” presumably to the exclusion of imported goods. Although supply chain security must inevitably factor into overall cyber security concerns, such restrictive acquisition policies could not only deprive critical systems of potentially higher-quality alternatives but—depending on the implementation of these policies—could also sharpen the vulnerabilities of these systems.

Not only do these preferential acquisition policies risk mandating lower quality products, but it is unlikely they will be able to keep up with the rapid pace of innovation in information technology. The United States provides a cautionary tale. The U.S. National Institute of Standards and Technology (NIST), tasked with producing cyber security standards for operators of critical infrastructure, made its first update to a 2005 set of standards earlier this year. Other regulatory agencies, such as the Federal Energy Regulatory Commission (FERC), move at a marginally faster pace yet are nevertheless delayed by bureaucratic processes. FERC has already moved to implement Version 5 of its Critical Infrastructure Protection (CIP) standards, nearly a year before the deadline for Version 4 compliance. The need for new standards thus outpaces the ability of industry to effectively implement them.

Fortunately, U.S. cyber security regulation has so far been technology-neutral. Operators of Critical Information Infrastructure are required only to ensure certain functionalities, not to procure their hardware and software from any particular supplier. This principle ensures competition and thus security, allowing CII operators to take advantage of the most cutting-edge technologies regardless of name, model, etc. Technology neutrality does of course raise risks, such as those emphasized by the Government of India regarding Huawei and ZTE in 2010. Risk assessment must, however, remain focused on the technology in question and avoid politicization. India’s cyber security policy can be technology neutral as long as it follows one additional principle: trust but verify.

Verification may be facilitated by the use of free and open-source software (FOSS). FOSS provides security through transparency as opposed to security through obscurity, and thus enables more agile responses to security threats. Users can identify and patch bugs themselves, or otherwise take advantage of the broader user community for such fixes. Thus open-source software promotes security in much the same way that competitive markets do: by accepting a wide range of inputs.

Despite the virtues of FOSS, there are plenty of good reasons to run proprietary software, e.g. fitness for purpose, cost, and track record. Proprietary software makes verification somewhat more complicated but not impossible. Source code escrow agreements have recently gained some traction as a verification measure for proprietary software, even with companies like Huawei and ZTE. In 2010, the infamous Chinese telecommunications giants persuaded the Indian government to lift its earlier ban on their products by concluding just such an agreement. Clearly trust but verify is eminently practicable, and so, therefore, is technology neutrality.

What’s Missing

Level of detail aside, what is most conspicuously absent from the new Policy is any framework for institutional cooperation beyond 1) the designation of CERT-In “as a Nodal Agency for coordination of all efforts for cyber security emergency response and crisis management” and 2) the designation of the “National Critical Information Infrastructure Protection Centre (NCIIPC) to function as the nodal agency for critical information infrastructure protection in the country.” The Policy mentions additionally “a National nodal agency to coordinate all matters related to cyber security in the country, with clearly defined roles & responsibilities.” Some clarity with regard to roles and responsibilities would certainly be in order. Even among these three agencies—assuming they are all distinct—it is unclear who is to be responsible for what.

More confusing still is the number of other pre-existing entities with cyber security responsibilities, in particular the National Technical Research Organization (NTRO), which in an earlier draft of the Policy was to have authority over the NCIIPC. The Ministry of Defense likewise has bolstered its cyber security and cyber warfare capabilities in recent years. Is it appropriate for these to play a role in securing civilian CII? Finally, the already infamous Central Monitoring System, justified predominantly on the very basis of cyber security, receives no mention at all. For a government that is only now releasing its first cyber security policy, India has developed a fairly robust set of institutions around this issue. It is disappointing that the Policy does not more fully address questions of roles and responsibilities among government entities.

Not only is there a lack of coordination among government cyber security entities, but there is no mention of how the public and private sectors are to cooperate on cyber security information—other than oblique references to “public-private partnerships.” Certainly there is a need for information sharing, which is currently facilitated in part by the sector-level CERTs. More interesting, however, is the question of liability for high-impact cyber attacks. To whom are private CII operators accountable in the event of disruptive cyber attacks on their systems? This legal ambiguity must necessarily be resolved in conjunction with the “fiscal schemes and incentives” also alluded to in the Policy in order to motivate strong cyber security practices among all CII operators and the public more broadly.

Next Steps

India’s inaugural National Cyber Security Policy is by and large a step in the right direction. It covers many of the most pressing issues in national cyber security and lays out a number of ambitious goals, ranging from capacity building to robust public-private partnerships. To realize these goals, the government will need a much more detailed roadmap.

Firstly, the extent of the government’s proposed privacy safeguards must be clarified and ideally backed by a separate piece of privacy legislation. As Benjamin Franklin once said, “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” When it comes to cyberspace, the Indian people must demand both liberty and safety.

Secondly, the government should avoid overly preferential acquisition policies and allow risk assessments to be technologically rather than politically driven. Procurement should moreover be technology-neutral. Open source software and source code escrow agreements can facilitate the verification measures that make technology neutrality work.

Finally, to translate this policy into a sound strategy will necessarily require that India’s various means be directed toward specific ends. The Policy hints at organizational mapping with references to CERT-In and the NCIIPC, but the roles and responsibilities of other government agencies as well as the private sector remain underdetermined. Greater clarity on these points would improve inter-agency and public-private cooperation—and thus, one hopes, security—significantly.



[*]. Melissa E. Hathaway and Alexander Klimburg, “Preliminary Considerations: On National Cyber Security” in National Cyber Security Framework Manual, ed. Alexander Klimburg (Tallinn, Estonia: NATO Cooperative Cyber Defence Centre of Excellence, 2012), 13.

Guidelines for the Protection of National Critical Information Infrastructure: How Much Regulation?

by Jonathan Diamond last modified Aug 01, 2013 04:48 AM
July has been a busy month for cyber security in India. Beginning with the release of the country’s first National Cyber Security Policy on July 2 and followed just this past week by a set of guidelines for the protection of national critical information infrastructure (CII) developed under the direction of the National Technical Research Organization (NTRO), India has made respectable progress in its thinking on national cyber security.

Yet the National Cyber Security Policy, taken together with what little is known of the as-yet restricted guidelines for CII protection, raises troubling questions, particularly regarding the regulation of cyber security practices in the private sector. Whereas the current Policy suggests the imposition of certain preferential acquisition policies, India would be best advised to maintain technology neutrality to ensure maximum security.

According to Section 70(1) of the Information Technology Act, Critical Information Infrastructure (CII) is defined as a “computer resource, the incapacitation or destruction of which, shall have debilitating impact on national security, economy, public health or safety.” In one of the 2008 amendments to the IT Act, the Central Government granted itself the authority to “prescribe the information security practices and procedures for such protected system[s].” These two provisions form the legal basis for the regulation of cyber security within the private sector.

Such basis notwithstanding, private cyber security remains almost completely unregulated. According to the Intermediary Guidelines [pdf], intermediaries are required to report cyber security incidents to India’s national-level computer emergency response team (CERT-In). Other than this relatively small stipulation, the only regulation in place for CII exists at the sector level. Last year the Reserve Bank of India mandated that each bank in India appoint a chief information officer (CIO) and a steering committee on information security. The finance sector is also the only sector of the four designated “critical” by the Department of Electronics and Information Technology (DEIT) Cyber Security Strategy to have established a sector-level CERT, which released a set of non-compulsory guidelines [pdf] for information security governance in late 201

The new guidelines for CII protection seek to reorganize the government’s approach to CII. According to a Times of India article on the new guidelines, the NTRO will outline a total of eight sectors (including energy, aviation, telecom and National Stock Exchange) of CII and then “monitor if they are following the guidelines.” Such language, though vague and certainly unsubstantiated, suggests the NTRO may ultimately be responsible for enforcing the “[mandated] security practices related to the design, acquisition, development, use and operation of information resources” described in the Cyber Security Policy. If so, operators of systems deemed critical by the NTRO or by other authorized government agencies may soon be subject to cyber security regulation—with teeth.

To be sure, some degree of cyber security regulation is necessary. After all, large swaths of the country’s CII are operated by private industry, and poor security practices on the part of one operator can easily undermine the security of the rest. To quote security expert Bruce Schneier, “the externalities in cybersecurity are so great that even the freest free market would fail.” In less academic terms, networks are only as secure as their weakest links. While it is true that many larger enterprises take cyber security quite seriously, small and medium-sized businesses either lack immediate incentives to invest in security (e.g. no shareholders to answer to) or more often lack the basic resources to do so. Some form of government transfer for cyber security related investments could thus go a long way toward shoring up the country’s overall security.

Yet regulation may well extend beyond the simple “fiscal schemes and incentives” outlined in section IV of the Policy and “provide for procurement of indigenously manufactured ICT products that have security implications.” Such, at least, was the aim of the Preferential Market Access (PMA) Policy recently put on hold by the Prime Minister’s Office (PMO). Under pressure from international industry groups, the government has promised to review the PMA Policy, with the PMO indicating it may strike out clauses “regarding preference to domestic manufacturer[s] on security related products that are to be used by private sector.” If the government’s aim is indeed to ensure maximum security (rather than to grow an infant industry), it would be well advised to extend this approach to the Cyber Security Policy and the new guidelines for CII protection.

Although there is a national security argument to be made in favor of such policies—namely that imported ICT products may contain “backdoors” or other nefarious flaws—there are equally valid arguments to be made against preferential acquisition policies, at least for the private sector. First and foremost, it is unlikely that India’s nascent cyber security institutions will be able to regulate procurement in such a rapidly evolving market. Indeed, U.S. authorities have been at pains to set cyber security standards, especially in the past several years. Secondly, by mandating the procurement of indigenously manufactured products, the government may force private industry to forgo higher quality products. Absent access to source code or the ability to effectively reverse engineer imported products, buyers should make decisions based on the products’ performance records, not geo-economic considerations like country of origin. Finally, limiting procurement to a specific subset of ICT products also narrows the set of security vulnerabilities an attacker must contend with. Far from improving security, a smaller, more uniform set of vulnerabilities may simply make networks easier targets for the sorts of “debilitating” attacks the Policy aims to avert.

As India broaches the difficult task of regulating cyber security in the private sector, it must emphasize flexibility above all. On one hand, the government should avoid preferential acquisition policies, which risk a) overwhelming limited regulatory resources, b) saddling CII operators with subpar products, and/or c) narrowing the country’s attack surface to a smaller, more uniform set of vulnerabilities. On the other hand, the government should encourage certain performance standards through precisely the sort of “fiscal schemes and incentives” alluded to in the Cyber Security Policy. Regulation should focus on what technology does and does not do, not who made it or which rival government might have had a hand in its design. Ultimately, India should adopt a policy of technology neutrality, backed by the simple principle of trust but verify. Only then can it be truly secure.

CIS Cybersecurity Series (Part 9) - Saikat Datta

by Purba Sarkar last modified Aug 05, 2013 05:24 AM
CIS interviews Saikat Datta, Resident Editor of DNA, Delhi, as part of the Cybersecurity Series.

"Anonymous speech, in countries which have extremely severe systems of governments, which do not have freedom, etcetera, is welcome. But in a democracy like India, I do not see the need for anonymous speech because it is anyways guaranteed by the Constitution of India. So, no, I do not see the need for anonymity in an open and democratic state like India and I would be seriously worried if such a requirement comes up. Shouldn't I strive to be ideal? The ideal suggests that the constitution has guaranteed freedom of speech. Anonymity, for a time being may be acceptable to some people but I would like a situation where a person, without having to seek anonymity, can speak about anything and not be prosecuted by the state, or persecuted by society. And that is the ideal situation that I would like to strive for." - Saikat Datta, Resident Editor, DNA, Delhi.

Centre for Internet and Society presents its ninth installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

Saikat Datta is a journalist who began his career in December 1996 and has worked with several publications like The Indian Express, the Outlook magazine and the DNA newspaper. He is currently the Resident Editor of DNA, Delhi. Saikat has authored a book on India's Special Forces and presented papers at seminars organized by the Centre for Land Warfare Studies, the Centre for Air Power Studies and the National Security Guards. He has also been awarded the International Press Institute Award for investigative journalism, the National RTI award in the journalism category and the Jagan Phadnis Memorial Award for investigative journalism.

 

 
This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

'Ethical Hacker' Saket Modi Calls for Stronger Cyber Security Discussions

by Kovey Coles last modified Aug 05, 2013 01:11 PM
Twenty-two-year-old Saket Modi is the CEO and co-founder of Lucideus, a leading cyber security company in India which claims to have worked with 4 of the top 5 global e-commerce companies, 4 of the top 10 IT companies in the world, and 3 of the top 5 banks of the Asia Pacific.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC


At the Confederation of Indian Industry (CII) conference on July 13, titled “ACT – Achieving Cyber-Security Together,” Modi, the youngest speaker on the agenda, delivered an impromptu talk which lambasted the weaknesses of modern cyber security discussions, enlightened the audience on the capabilities and challenges of leading cyber security groups, and ultimately received a standing ovation from the crowd. As a later speaker commented, Modi’s controversial opinions and practitioner insight had "set the auditorium ablaze for the remainder of the evening". Since then, the Centre for Internet and Society (CIS) has had the pleasure of interviewing Saket Modi over Skype.

It is quite easy to find accounts of Saket Modi's introduction to hacking just by typing his name into a search engine. Faced with the pressure of failing, a teenage Saket discovered how to hack into his high school Chemistry teacher’s test and answer database. After successfully obtaining the answers, and revealing his wrongdoing to his teacher, the young man grew intrigued by the possibilities of hacking. "I thought, if I could do this in a couple hours, four hours, then what might I be able to do in four days, four weeks, four months?"

Nowadays, Modi describes himself and his Lucideus team as "ethical hackers", a term recently espoused by hacker groups in the public eye. As opposed to "hacktivists", who use hacking methods (including attacks) to achieve or bring awareness to political causes, ethical hackers claim to use their computer skills exclusively to support defenses. At first, the incorporation of ethics into a for-profit organization’s game plan may seem confusing, as it leaves room for key questions, such as how one determines which clients constitute ethical business. When asked, however, Modi clarifies that the ethics are not manifest in the entities Lucideus supports, but inherent in the choice to build defensive networks rather than use their skills for attack or debilitation. Nevertheless, questions remain as to whether supporting the cyber security of some entities can lead to the insecurity of others, for example, by strengthening agencies which engage in covert cyber espionage. On this point, Modi seems more ambivalent, saying "it depends on a case by case basis". But he still believes cyber security is a right that should be enjoyed by all, "entitled to [you] the moment you set foot on the internet".

As an experienced professional in the field who often gives input on major cyber policy decisions, Modi emphasizes the necessity of youth engagement in cyber security practice and policy. He calls his age bracket the “web generation,” those who have “grown with technology.” According to Modi, no one over 50 or 60 years of age can properly meet the current challenges of the cyber security realm. It is "a sad thing", he says, that those older leaders carry the most power in policy making, and that they often have problems with both the understanding and acceptance of modern technological capabilities. Among the public, businesses, and government alike, there are misconceptions about the importance of cyber security and the extent of modern cyber threats, threats which Modi and his company claim to combat regularly. "About 90 per cent of the crimes that take place in cyber space are because of lack of knowledge, rather than the expertise of the hacker,” he explains. Modi mentions a few basic misconceptions, as simple as, "if I have an anti-virus, my system is secured" or "if you have HTTPS certificate and SSL connection, your system is secured". “These are like wearing an elbow guard while playing cricket,” Modi says. “If the ball comes at the elbow then you are protected, but what about the rest of the body?”

This highlights another problem evident in India’s current cyber security scene: the lack of “quality institutes to produce good cyber security experts.” For example, Modi takes offence at there not being “a single institute which is providing cyber security at the undergraduate level [in India].” He alludes to the recently unveiled National Cyber Security Policy, specifically its call for five lakh cyber security experts in the coming years. He calls this “a big figure,” but agrees that there needs to be a lot more awareness throughout the nation. “You really have to change a lot of things,” he says, “in order to get the right things in the right place here in India.”

When considering citizen privacy in relation to cyber security, and the relationship between the two (be it direct or inverse), Saket Modi says the important factor is the governing body, because the issue ultimately comes down to trust. Citizens must trust the “right people with the right qualifications” to store and protect their sensitive data, and to respect privacy. Modi is no novice to the importance of personal data protection: his company works with a plethora of extremely sensitive information relating to both its clients and its clients’ clients, so it operates with due care lest it create a “wikileaks part two.”

On internationalization and cyber security, he views the connection between the two as natural and intrinsic. “Cyberspace has added a new dimension to humanity,” says Modi, explaining that former constructs of physical constraints and linear bounds no longer apply. International cooperation is especially pertinent, according to Modi, because the greatest challenge in catching today’s criminal hackers is their international anonymity, “the ability to jump from one country to the other in a matter of milliseconds.”

With the extent of the challenges facing cyber defense specialists, and with the somewhat disorderly current state of Indian cyber security, it is curious to see that Saket Modi has devoted himself to the "ethical" side of hacking. Why hasn’t he or the rest of the Lucideus team resorted to offensive hacking, given Modi’s claim that the majority of the world’s cyber attacks are committed by people who also fall between the ages of 15 and 24? Apparently, the answer is simple. “We believe in the need for ethical hacking,” he defends. “We believe in the purpose of making the internet safer.”

Ethical Issues in Open Data

by Kovey Coles last modified Aug 07, 2013 09:19 AM
On August 1, 2013, I took part in a web meeting organized and hosted by Tim Davies of the World Wide Web Foundation. The meeting, titled “Ethical issues in Open Data,” had an agenda focused on privacy considerations in the context of the open data movement.

The main panelists, Carly Nyst and Sam Smith from Privacy International, as well as Steve Song from the International Development Research Centre, were joined by roughly a dozen other privacy and development researchers from around the globe in the hour-long session.

The primary issue of the meeting was the concern that modern cross-analytic capabilities can de-anonymize data sets and reveal personally identifiable information (PII) in open data. Open data can constitute publicly available information such as budgets, infrastructure and population statistics, so long as the data meets the three open data characteristics: accessibility, machine readability, and availability for re-use. “Historically,” said Tim Davies, “public registers have been protected through obscurity.” However, both the capabilities of data analysts and the definition of personal data have continued to expand in recent years. This concern thus presents a conflict between researchers who advocate that governments release open data and researchers who emphasize privacy in the developing world.
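To make the de-anonymization concern concrete, the sketch below shows a toy "linkage attack" in Python: an ostensibly anonymized release is joined against a public register on shared quasi-identifiers. This is a minimal illustration with fabricated data, not a method discussed at the meeting; all names, fields and values are hypothetical.

```python
# Toy linkage attack: an "anonymized" release is re-identified by joining
# quasi-identifiers (postcode, birth year, gender) against a public register.
# All data below is fabricated for illustration.
anonymized_release = [
    {"postcode": "560001", "birth_year": 1975, "gender": "F", "diagnosis": "asthma"},
    {"postcode": "110001", "birth_year": 1982, "gender": "M", "diagnosis": "diabetes"},
]

public_register = [
    {"name": "A. Sharma", "postcode": "560001", "birth_year": 1975, "gender": "F"},
    {"name": "B. Rao", "postcode": "110001", "birth_year": 1982, "gender": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "gender")

def key(row):
    # The linking key is simply the tuple of quasi-identifier values.
    return tuple(row[q] for q in QUASI_IDENTIFIERS)

lookup = {key(person): person["name"] for person in public_register}
for record in anonymized_release:
    name = lookup.get(key(record))
    if name:
        print(f"{name} -> {record['diagnosis']}")
```

Even this trivial join recovers names for every "anonymous" record, which is why quasi-identifiers, and not just explicit names, matter when deciding what to publish.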

Steve Song, advisor to the IDRC's Information & Networks program, spoke of the potential collateral damage that comes with publishing ever more types of information. Song captured the imperative of the meeting in saying, “privacy needs to be a core part of open data conversation.” In his presentation, he gave a particularly interesting example of the tensions between public and private information. Following the infamous 2012 school shooting in Newtown, Connecticut, information on Newtown’s gun permit holders (made publicly available through America’s Freedom of Information Act) was aggregated into an interactive map which revealed the permit holders’ addresses. This quickly became problematic for the Newtown community, as the map not only singled out homes whose residents had exercised their right to bear arms but also indirectly revealed which homes were without firearm protection and thereby more vulnerable to theft and crime. The Newtown example clearly demonstrates the relationship (and conflict) between open data and privacy; it comes down to the conflict between the right to information and the right to privacy.

An apparent issue surrounding open data is its perceived binary nature. Many advocates view data as either open or not; any intermediary boundaries are merely forms of governments limiting data accessibility. Against this backdrop, a point raised by meeting attendee Raed Sharif aptly presented a counter-argument from the open data side. Sharif noted how, inversely, privacy conceptions may themselves pose a threat to open data: governments could take advantage of privacy arguments to justify their refusal to publish open reports.

However, Carly Nyst summarized the privacy concern in her remarks near the end of the meeting. She reasoned that the open data mission is viable if it is limited to generic data, i.e., data about infrastructure or other information that is in no way personal, thereby avoiding intrusions on individual privacy. Until anonymization techniques advance enough to withstand modern re-identification methods, publicly publishing PII may prove too risky. It was generally agreed during the meeting that open data is not inherently bad, and in fact its analysis and availability can be beneficial, but the threat of its misuse makes it dangerous. For the future of open data, researchers and advocates should perhaps consider more nuanced approaches to the concept in order to respect other ethical considerations, such as privacy.

FinFisher in India and the Myth of Harmless Metadata

by Maria Xynou last modified Aug 13, 2013 11:30 AM
In this article, Maria Xynou argues that metadata is anything but harmless, especially since FinFisher — one of the world's most controversial types of spyware — uses metadata to target individuals.
FinFisher in India and the Myth of Harmless Metadata

by John-Norris on Flickr

In light of PRISM, the Central Monitoring System (CMS) and other such surveillance projects in India and around the world, the question of whether the collection of metadata is “harmless” has arisen.[1] In order to examine this question, FinFisher[2] — surveillance spyware — has been chosen as a case study to briefly examine to what extent the collection and surveillance of metadata can potentially violate the right to privacy and other human rights. FinFisher has been selected as a case study not only because its servers have been recently found in India[3] but also because its “remote monitoring solutions” appear to be very pervasive even on the mere grounds of metadata.

FinFisher in India

FinFisher is spyware which has the ability to take control of target computers and capture even encrypted data and communications. The software is designed to evade detection by anti-virus software and has versions which work on mobile phones of all major brands.[4] In many cases, the surveillance suite is installed after the target accepts installation of a fake update to commonly used software.[5] Citizen Lab researchers have found three samples of FinSpy that masquerade as Firefox.[6]

FinFisher is a line of remote intrusion and surveillance software developed by Munich-based Gamma International. FinFisher products are sold exclusively to law enforcement and intelligence agencies by the UK-based Gamma Group.[7] A few months ago, it was reported that command and control servers for FinSpy backdoors, part of Gamma International´s FinFisher “remote monitoring solutions”, were found in a total of 25 countries, including India.[8]

The following map, published by the Citizen Lab, shows the 25 countries in which FinFisher servers have been found.[9]

Map

The above map shows the results of scanning for characteristics of FinFisher command and control servers.

FinFisher spyware was not found in the countries coloured blue, while green marks countries that did not respond. Countries where FinFisher was detected are shaded from the lightest orange to the darkest red on a scale of 1 to 6, with 1 representing the least active servers and 6 the most active. On this scale, India is marked a 3 in terms of actively using FinFisher.[10]

Research published by the Citizen Lab reveals that FinSpy servers were recently found in India, which indicates that Indian law enforcement agencies may have bought this spyware from Gamma Group and might be using it to target individuals in India.[11] According to the Citizen Lab, FinSpy servers in India have been detected through the HostGator operator and the first digits of the IP address are: 119.18.xxx.xxx. Releasing complete IP addresses in the past has not proven useful, as the servers are quickly shut down and relocated, which is why only the first two octets of the IP address are revealed.[12]
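As a rough illustration of what "scanning for characteristics" of a command and control server can involve, the sketch below probes a list of hosts and flags any whose response contains a known fingerprint. This is a minimal, hypothetical example and not the Citizen Lab's actual methodology; the signature, port and addresses are placeholders, since real fingerprints are deliberately not published in full.

```python
# Minimal sketch of fingerprint-based scanning (hypothetical, for illustration).
import socket

SIGNATURE = b"example-fingerprint"  # placeholder; real C&C fingerprints are not public


def probe(host, port=80, timeout=3.0):
    """Connect to host:port, send a simple request, and check the reply for the fingerprint."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.settimeout(timeout)
            sock.sendall(b"GET / HTTP/1.0\r\n\r\n")  # generic probe request
            response = sock.recv(1024)
            return SIGNATURE in response
    except (OSError, socket.timeout):
        return False


if __name__ == "__main__":
    candidates = ["192.0.2.10", "192.0.2.11"]  # documentation-range addresses only
    for host in candidates:
        print(host, "matches fingerprint" if probe(host) else "no match / unreachable")
```

In practice, researchers repeat this kind of probe across large address ranges and refine the fingerprint as operators change their servers' behaviour, which is also why published findings omit full IP addresses.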

The Citizen Lab's research reveals that FinFisher “remote monitoring solutions” were found in India, which, according to Gamma Group's brochures, include the following:

  • FinSpy: hardware or software which monitors targets that regularly change location, use encrypted and anonymous communications channels and reside in foreign countries. FinSpy can remotely monitor computers and encrypted communications, regardless of where in the world the target is based. FinSpy is capable of bypassing 40 regularly tested antivirus systems, of monitoring the calls, chats, file transfers, videos and contact lists on Skype, of conducting live surveillance through a webcam and microphone, of silently extracting files from a hard disk, and of conducting live remote forensics on target systems. FinSpy is hidden from the public through anonymous proxies.[13]
  • FinSpy Mobile: hardware or software which remotely monitors mobile phones. FinSpy Mobile enables the interception of mobile communications in areas without a network, and offers access to encrypted communications, as well as to data stored on the devices that is not transmitted. Some key features of FinSpy Mobile include the recording of common communications like voice calls, SMS/MMS and emails, the live surveillance through silent calls, the download of files, the country tracing of targets and the full recording of all BlackBerry Messenger communications. FinSpy Mobile is hidden from the public through anonymous proxies.[14]
  • FinFly USB: hardware which is inserted into a computer and which can automatically install the configured software with little or no user-interaction and does not require IT-trained agents when being used in operations. The FinFly USB can be used against multiple systems before being returned to the headquarters and its functionality can be concealed by placing regular files like music, video and office documents on the device. As the hardware is a common, non-suspicious USB device, it can also be used to infect a target system even if it is switched off.[15]
  • FinFly LAN: software which can deploy a remote monitoring solution on a target system in a local area network (LAN). Some of the major challenges law enforcement faces are mobile targets, as well as targets who do not open any infected files that have been sent via email to their accounts. FinFly LAN is not only able to deploy a remote monitoring solution on a target´s system in local area networks, but it is also able to infect files that are downloaded by the target, to send fake software updates for popular software, or to infect the target by injecting the payload into websites the target visits. Some key features of FinFly LAN include: discovering all computer systems connected to LANs, working in both wired and wireless networks, and remotely installing monitoring solutions through websites visited by the target. FinFly LAN has been used in public hotspots, such as coffee shops, and in the hotels of targets.[16]
  • FinFly Web: software which can deploy remote monitoring solutions on a target system through websites. FinFly Web is designed to provide remote and covert infection of a target system by using a wide range of web-based attacks. FinFly Web provides a point-and-click interface, enabling the agent to easily create a custom infection code according to selected modules. It provides fully-customizable web modules, it can be covertly installed into every website and it can install the remote monitoring system even if only the email address is known.[17]
  • FinFly ISP: hardware or software which deploys a remote monitoring solution on a target system through an ISP network. FinFly ISP can be installed inside the Internet Service Provider Network, it can handle all common protocols and it can select targets based on their IP address or Radius Logon Name. Furthermore, it can hide remote monitoring solutions in downloads by targets, it can inject remote monitoring solutions as software updates and it can remotely install monitoring solutions through websites visited by the target.[18]

Although FinFisher is supposed to be used for “lawful interception”, it has gained notoriety for targeting human rights activists.[19] According to Morgan Marquis-Boire, a security researcher and technical advisor at the Munk School and a security engineer at Google, FinSpy has been used in Ethiopia to target an opposition group called Ginbot.[20] Researchers have argued that FinFisher has been sold to Bahrain's government to target activists, and such allegations were based on an examination of malicious software which was emailed to Bahraini activists.[21] Privacy International has argued that FinFisher has been deployed in Turkmenistan, possibly to target activists and political dissidents.[22]

Many questions revolving around the use of FinFisher and its “remote monitoring solutions” remain open, as there is currently inadequate proof of whether this spyware is being used by law enforcement agencies to target individuals in the countries where command and control servers have been found, such as India.[23] However, FinFisher's brochures, which were circulated at the ISS World trade shows and leaked by WikiLeaks, do reveal some confirmed facts: Gamma International claims that its FinFisher products are capable of taking control of target computers, of capturing encrypted data and of evading mainstream anti-virus software.[24] Such products are exhibited at the world's largest surveillance trade show and probably sold to law enforcement agencies around the world.[25] This alone unveils a concerning fact: spyware so sophisticated that it evades even encryption and anti-virus software is currently on the market, and law enforcement agencies can potentially use it to target activists and anyone who does not comply with social conventions.[26] A few months ago, two Indian women were arrested after having questioned the shutdown of Mumbai for Shiv Sena patriarch Bal Thackeray's funeral.[27] Thus, it remains unclear what type of behaviour is targeted by law enforcement agencies and whether spyware such as FinFisher would be used in India to track individuals without a legally specified purpose.

Furthermore, India lacks privacy legislation which could safeguard individuals from potential abuse, while sections 66A and 69 of the Information Technology (Amendment) Act, 2008, empower Indian authorities with extensive surveillance capabilities.[28] While it remains unclear if Indian law enforcement agencies are using FinFisher spy products to unlawfully target individuals, it is a fact that FinFisher command and control servers have been found in India and that, if used, they could have severe consequences for individuals' right to privacy and other human rights.[29]

The Myth of Harmless Metadata

Over the last few months, it has been reported that the Central Monitoring System (CMS) is being implemented in India, through which all telecommunications and Internet communications in the country are being centrally intercepted by Indian authorities. This mass surveillance of communications in India is enabled by the absence of privacy legislation, and Indian authorities are currently capturing the metadata of communications.[30]

Last month, Edward Snowden leaked confidential U.S. documents on PRISM, the top-secret National Security Agency (NSA) surveillance programme that collects metadata from telecommunications and Internet communications. It has been reported that through PRISM, the NSA has tapped into the servers of nine leading Internet companies: Microsoft, Google, Yahoo, Skype, Facebook, YouTube, PalTalk, AOL and Apple.[31] While the extent to which the NSA is actually tapping into these servers remains unclear, it is certain that the NSA has collected metadata on a global level.[32] Yet, the question of whether the collection of metadata is “harmful” remains ambiguous.

According to the National Information Standards Organization (NISO), the term “metadata” is defined as “structured information that describes, explains, locates or otherwise makes it easier to retrieve, use or manage an information resource”. NISO claims that metadata is “data about data” or “information about information”.[33] Furthermore, metadata is considered valuable due to its following functions:

  • Resource discovery
  • Organizing electronic resources
  • Interoperability
  • Digital Identification
  • Archiving and preservation

Metadata can be used to find resources by relevant criteria, to identify resources, to bring similar resources together, to distinguish dissimilar resources and to give location information. Electronic resources can be organized through the use of various software tools which can automatically extract and reformat information for Web applications. Interoperability is promoted through metadata, as describing a resource with metadata allows it to be understood by both humans and machines, which means that data can automatically be processed more effectively. Digital identification is enabled through metadata, as most metadata schemes include standard numbers for unique identification. Moreover, metadata enables the archival and preservation of large volumes of digital data.[34]
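As a small, self-contained illustration of "data about data", the Python sketch below reads only the headers of an email message and ignores the body entirely. The message, addresses and header values are invented for the example and are not drawn from the NISO document.

```python
# Reading only the headers (metadata) of an email, without touching the body.
from email import message_from_string

raw = """From: alice@example.org
To: bob@example.org
Date: Mon, 05 Aug 2013 10:15:00 +0530
Subject: Meeting
Message-ID: <abc123@example.org>

The body (content data) is deliberately ignored below.
"""

msg = message_from_string(raw)

# Only the descriptive fields are extracted; the content itself is never read.
metadata = {field: msg[field] for field in ("From", "To", "Date", "Subject", "Message-ID")}
for field, value in metadata.items():
    print(f"{field}: {value}")
```

Even in this trivial case, the headers alone establish who wrote to whom, when, and about what subject, which is exactly the kind of structured information that surveillance systems can collect at scale.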

Surveillance projects, such as PRISM and India's CMS, collect large volumes of metadata, which include the numbers of both parties on a call, location data, call duration, unique identifiers, the International Mobile Subscriber Identity (IMSI) number, email addresses, IP addresses and browsed webpages.[35] However, the fact that such surveillance projects may not have access to content data may create a false sense of security.[36] When Microsoft released its report on data requests by law enforcement agencies around the world in March 2013, it revealed that most of the disclosed data was metadata, while relatively little content data was allegedly disclosed.[37]

Similarly, Google's transparency report reveals that the company disclosed large volumes of metadata to law enforcement agencies, while restricting its disclosure of content data.[38]

Such reports may provide a sense of security to the public, as they offer reassurance that the content of personal emails, for example, has not been shared with the government, but merely email addresses – which might be publicly available online anyway. However, is content data actually more “harmful” than metadata? Is metadata “harmless”? How much does metadata actually reveal?

The Guardian recently published an article which includes an example of how individuals can be tracked through their metadata. In particular, the example explains how an individual is tracked – despite using an anonymous email account – by logging in from various hotels' public Wi-Fi and by leaving trails of metadata that include times and locations. This example illustrates how an individual can be tracked through metadata alone, even when anonymous accounts are being used.[39]

Wired published an article which states that metadata can potentially be more harmful than content data because “unlike our words, metadata doesn't lie”. In particular, content data shows what an individual says – which may be true or false – whereas metadata shows what an individual does. While the validity of the content within an email may be debatable, it is undeniable that an individual logged into specific websites – if that is what that individual's IP address shows. Metadata, such as an individual's browsing habits, may provide a more thorough and accurate profile of that individual than the content of their email, which is why metadata can potentially be more harmful than content data.[40]

Furthermore, voice content is hard to process and written content in an email or chat communication may not always be valid. Metadata, on the other hand, provides concrete patterns of an individual's behaviour, interests and interactions. For example, metadata can potentially map out an individual's political affiliation, interests, economic background, institution, location, habits and the people that individual interacts with. Such data can potentially be more valuable than content data, because while the validity of email content is debatable, metadata usually provides undeniable facts. Not only is metadata more accurate than content data, but it is also ideally suited to automated analysis by a computer. As most metadata consists of numeric or structured fields, it can easily be analysed by data mining software, whereas content data is more complicated.[41]
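The point about automated analysis can be illustrated in a few lines of Python. The sketch below aggregates hypothetical call-detail records, containing no call content at all, to surface a person's most frequent contact and likely night-time location; the numbers, timestamps and tower names are fabricated for illustration only.

```python
# Hypothetical call-detail records: (caller, callee, start time, cell tower).
# Even without any call content, simple aggregation exposes a contact graph,
# daily routines and habitual locations.
from collections import Counter
from datetime import datetime

records = [
    ("98xxxx0001", "98xxxx0002", "2013-07-01 23:10", "tower_A"),
    ("98xxxx0001", "98xxxx0002", "2013-07-02 23:05", "tower_A"),
    ("98xxxx0001", "98xxxx0003", "2013-07-03 09:30", "tower_B"),
    ("98xxxx0001", "98xxxx0002", "2013-07-03 23:20", "tower_A"),
]

# Who is contacted most often?
contacts = Counter(callee for _, callee, _, _ in records)

# Which cell tower serves the caller late at night (a proxy for home location)?
night_towers = Counter(
    tower
    for _, _, ts, tower in records
    if datetime.strptime(ts, "%Y-%m-%d %H:%M").hour >= 22
)

print("Most frequent contact:", contacts.most_common(1))
print("Likely night-time location:", night_towers.most_common(1))
```

Scaled up to millions of records, the same kind of counting and joining is what data mining systems do automatically, which is why metadata lends itself to profiling far more readily than free-form content.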

FinFisher products, such as FinFly LAN, FinFly Web and FinFly ISP, provide solid proof that the collection of metadata can potentially be “harmful”. In particular, FinFly LAN can be deployed against a target system in a local area network (LAN) by infecting files that are downloaded by the target, by sending fake software updates for popular software or by injecting the payload into websites the target visits. The fact that FinFly LAN can remotely install monitoring solutions through websites visited by the target indicates that metadata alone can be used to acquire other sensitive data.[42]

FinFly Web can deploy remote monitoring solutions on a target system through websites. Additionally, FinFly Web can be covertly installed into every website and it can install the remote monitoring system even if only the email address is known.[43] FinFly ISP can select targets based on their IP address or Radius Logon Name. Furthermore, FinFly ISP can remotely install monitoring solutions through websites visited by the target, as well as inject remote monitoring solutions as software updates.[44] In other words, FinFisher products, such as FinFly LAN, FinFly Web and FinFly ISP, can target individuals, take control of their computers and their data, and capture even encrypted data and communications with the help of metadata alone.

The example of FinFisher products illustrates that metadata can potentially be as “harmful” as content data, if acquired unlawfully and without individual consent.[45] Thus, surveillance schemes, such as PRISM and India's CMS, which capture metadata without individuals' consent can potentially pose a major threat to the right to privacy and other human rights.[46] Privacy can be defined as the claim of individuals, groups or institutions to determine when, how and to what extent information about them is communicated to others.[47] Furthermore, privacy is at the core of human rights because it protects individuals from abuse by those in power.[48] The unlawful collection of metadata exposes individuals to the potential violation of their human rights, as it is not transparent who has access to their data, whether it is being shared with third parties or for how long it is being retained.

It is not clear if Indian law enforcement agencies are actually using FinFisher products, but the Citizen Lab did find FinFisher command and control servers in the country, which indicates a high probability that such spyware is being used.[49] This probability is highly concerning not only because these spy products have such advanced capabilities that they can even capture encrypted data, but also because India currently lacks privacy legislation which could safeguard individuals.

Thus, it is recommended that Indian law enforcement agencies be transparent and accountable if they are using spyware which can potentially breach citizens' human rights, and that privacy legislation be enacted into law. Lastly, it is recommended that all surveillance technologies be strictly regulated with regard to the protection of human rights and that Indian authorities adopt the principles on communication surveillance formulated by the Electronic Frontier Foundation and Privacy International.[50] The above could provide a decisive first step in ensuring that India is the democracy it claims to be.


[1]. Robert Anderson (2013), “Wondering What Harmless 'Metadata' Can Actually Reveal? Using Own Data, German Politician Shows Us”, The CSIA Foundation, http://bit.ly/1cIhu7G

[2]. Gamma Group, FinFisher IT Intrusion, http://bit.ly/fnkGF3

[3]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[4]. Michael Lewis, “FinFisher Surveillance Spyware Spreads to Smartphones”, The Star: Business, 30 August 2012, http://bit.ly/14sF2IQ

[5]. Marcel Rosenbach, “Troublesome Trojans: Firm Sought to Install Spyware Via Faked iTunes Updates”, Der Spiegel, 22 November 2011, http://bit.ly/14sETVV

[6]. Intercept Review, Mozilla to Gamma: stop disguising your FinSpy as Firefox, 02 May 2013, http://bit.ly/131aakT

[7]. Intercept Review, LI Companies Review (3) – Gamma, 05 April 2012, http://bit.ly/Hof9CL

[8]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[9]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[10]. Ibid.

[11]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[12]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[13]. Gamma Group, FinFisher IT Intrusion, FinSpy: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/zaknq5

[14]. Gamma Group, FinFisher IT Intrusion, FinSpy Mobile: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/19pPObx

[15]. Gamma Group, FinFisher IT Intrusion, FinFly USB: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/1cJSu4h

[16]. Gamma Group, FinFisher IT Intrusion, FinFly LAN: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/14J70Hi

[17]. Gamma Group, FinFisher IT Intrusion, FinFly Web: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/19fn9m0

[18]. Gamma Group, FinFisher IT Intrusion, FinFly ISP: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/13gMblF

[19]. Gerry Smith, “FinSpy Software Used To Surveil Activists Around The World, Reports Says”, The Huffington Post, 13 March 2013, http://huff.to/YmmhXI

[20]. Jeremy Kirk, “FinFisher Spyware seen Targeting Victims in Vietnam, Ethiopia”, Computerworld: IDG News, 14 March 2013, http://bit.ly/14J8BwW

[21]. Reporters without Borders: For Freedom of Information (2012), The Enemies of the Internet: Special Edition: Surveillance, http://bit.ly/10FoTnq

[22]. Privacy International, FinFisher Report, http://bit.ly/QlxYL0

[23]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, “You Only Click Twice: FinFisher's Global Proliferation”, The Citizen Lab, 13 March 2013, http://bit.ly/YmeB7I

[24]. Gamma Group, FinFisher IT Intrusion, FinSpy: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/zaknq5

[25]. Adi Robertson, “Paranoia Thrives at the ISS World Cybersurveillance Trade Show”, The Verge, 28 December 2011, http://bit.ly/tZvFhw

[26]. Gerry Smith, “FinSpy Software Used To Surveil Activists Around The World, Reports Says”, The Huffington Post, 13 March 2013, http://huff.to/YmmhXI

[27]. BBC News, “India arrests over Facebook post criticising Mumbai shutdown”, 19 November 2012, http://bbc.in/WoSXkA

[28]. Indian Ministry of Law, Justice and Company Affairs, The Information Technology (Amendment) Act, 2008, http://bit.ly/19pOO7t

[29]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[30]. Phil Muncaster, “India introduces Central Monitoring System”, The Register, 08 May 2013, http://bit.ly/ZOvxpP

[31]. Glenn Greenwald & Ewen MacAskill, “NSA PRISM program taps in to user data of Apple, Google and others”, The Guardian, 07 June 2013, http://bit.ly/1baaUGj

[32]. BBC News, “Google, Facebook and Microsoft seek data request transparency”, 12 June 2013, http://bbc.in/14UZCCm

[33]. National Information Standards Organization (2004), Understanding Metadata, NISO Press, http://bit.ly/LCSbZ

[34]. Ibid.

[35]. The Hindu, “In the dark about 'India's PRISM'”, 16 June 2013, http://bit.ly/1bJCXg3 ; Glenn Greenwald, “NSA collecting phone records of millions of Verizon customers daily”, The Guardian, 06 June 2013, http://bit.ly/16L89yo

[36]. Robert Anderson, “Wondering What Harmless 'Metadata' Can Actually Reveal? Using Own Data, German Politician Shows Us”, The CSIA Foundation, 01 July 2013, http://bit.ly/1cIhu7G

[37]. Microsoft: Corporate Citizenship, 2012 Law Enforcement Requests Report, http://bit.ly/Xs2y6D

[38]. Google, Transparency Report, http://bit.ly/14J7hKp

[39]. Guardian US Interactive Team, A Guardian Guide to your Metadata, The Guardian, 12 June 2013, http://bit.ly/ZJLkpy

[40]. Matt Blaze, “Phew, NSA is Just Collecting Metadata. (You Should Still Worry)”, Wired, 19 June 2013, http://bit.ly/1bVyTJF

[41]. Ibid.

[42]. Gamma Group, FinFisher IT Intrusion, FinFly LAN: Remote Monitoring & Infection Solutions, WikiLeaks: The Spy Files, http://bit.ly/14J70Hi

[43]. Gamma Group, FinFisher IT Intrusion, FinFly Web: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/19fn9m0

[44]. Gamma Group, FinFisher IT Intrusion, FinFly ISP: Remote Monitoring & Intrusion Solutions, WikiLeaks: The Spy Files, http://bit.ly/13gMblF

[45]. Robert Anderson, “Wondering What Harmless 'Metadata' Can Actually Reveal? Using Own Data, German Politician Shows Us”, The CSIA Foundation, 01 July 2013, http://bit.ly/1cIhu7G

[46]. Shalini Singh, “India's surveillance project may be as lethal as PRISM”, The Hindu, 21 June 2013, http://bit.ly/15oa05N

[47]. Cyberspace Law and Policy Centre, Privacy, http://bit.ly/14J5u7W

[48]. Bruce Schneier, “Privacy and Power”, Schneier on Security, 11 March 2008, http://bit.ly/i2I6Ez

[49]. Morgan Marquis-Boire, Bill Marczak, Claudio Guarnieri & John Scott-Railton, For Their Eyes Only: The Commercialization of Digital Spying, Citizen Lab and Canada Centre for Global Security Studies, Munk School of Global Affairs, University of Toronto, 01 May 2013, http://bit.ly/ZVVnrb

[50]. Elonnai Hickok, “Draft International Principles on Communications Surveillance and Human Rights”, The Centre for Internet and Society, 16 January 2013, http://bit.ly/XCsk9b

Freedom from Monitoring: India Inc Should Push For Privacy Laws

by Sunil Abraham last modified Aug 21, 2013 07:04 AM
More surveillance than absolutely necessary actually undermines the security objective.

This article by Sunil Abraham was published in Forbes India Magazine on August 21, 2013.


I think I understand why the average Indian IT entrepreneur or enterprise does not have a position on blanket surveillance. This is because the average Indian IT enterprise’s business model depends on labour arbitrage, not intellectual property. And therefore they have no worries about proprietary code or unfiled patent applications being stolen by competitors via rogue government officials within projects such as NATGRID, UID and, now, the CMS.

A sub-section of industry, especially the technology industry, will always root for blanket surveillance measures. The surveillance industry has many different players, ranging from those selling biometric and CCTV hardware to those providing solutions for big data analytics and legal interception systems. There are also more controversial players who provide spyware, especially those in the market for zero-day exploits. The cheerleaders for the surveillance industry are techno-determinists who believe you can solve any problem by throwing enough of the latest and most expensive technology at it.

What is surprising, though, is that other indigenous or foreign enterprises that depend on secrecy and confidentiality—in sectors such as banking, finance, health, law, ecommerce, media, consulting and communications—also don’t seem to have a public position on the growing surveillance ambitions of ‘democracies’ such as India and the United States of America. (Perhaps the only exceptions are a few multinational internet and software companies that have made some show of resistance and disagreement with the blanket surveillance paradigm.)

Is it because these businesses are patriotic? Do they believe that secrecy, confidentiality and, most importantly, privacy, must be sacrificed for national security? If that were true then it would not be a particularly wise thing to do, as privacy is the precondition for security. Ann Cavoukian, privacy commissioner of Ontario, calls it a false dichotomy. Bruce Schneier, security technologist and writer, calls it a false zero sum game; he goes on to say, “There is no security without privacy. And liberty requires both security and privacy.”

The reason why the secret recipe of Coca Cola is still secret after over 120 years is the same as the reason why a captured soldier cannot spill the beans on the overall war strategy. Corporations, like militaries, have layers and layers of privacy and secrecy. The ‘need to know’ principle resists all centralising tendencies, such as blanket surveillance. It’s important to note that targeted surveillance to identify a traitor or spy within the military, or someone engaged in espionage within a corporation, is pretty much essential. However, any more surveillance than absolutely necessary actually undermines the security objective. To summarise, privacy is a pre-condition to the security of the individual, the enterprise, the military and the nation state.

Most people complaining online about projects like the Central Monitoring System seem to think that India has no privacy laws. This is completely untrue: We have around 50 different laws, rules and regulations that aim to uphold privacy and confidentiality in various domains. Unfortunately, most of those policies are very dated and do not sufficiently take into account the challenges of contemporary information societies. These policy documents need to be updated and harmonised through the enactment of a new horizontal privacy law. A small minority will say that Section 43(A) of the Information Technology Act is the India privacy law. That is not completely untrue, but is a gross exaggeration. Section 43(A) is really only a data security provision and, at that, it does not even comprehensively address data protection, which is only a sub-set of the overall privacy regulation required in a nation.

What would an ideal privacy law for India look like? For one, it would protect the rights of all persons, regardless of whether they are citizens or residents. Two, it would define privacy principles. Three, it would establish the office of an independent and autonomous privacy commissioner, who would be sufficiently empowered to investigate and take action against both government and private entities. Four, it would define civil and criminal offences, remedies and penalties. And five, it would have an overriding effect on previous legislation that does not comply with all the privacy principles.

The Justice AP Shah Committee report, released in October 2012, defined the Indian privacy principles as notice, choice and consent, collection limitation, purpose limitation, access and correction, disclosure of information, security, openness and accountability. The report also lists the exemptions and limitations, so that privacy protections do not have a chilling effect on the freedom of expression and transparency enabled by the Right to Information Act.

The Department of Personnel and Training has been working on a privacy bill for the last three years. Two versions of the bill had leaked before the Justice AP Shah Committee was formed. The next version of the bill, hopefully implementing the recommendations of the Justice AP Shah Committee report, is expected in the near future. In a multi-stakeholder-based parallel process, the Centre for Internet and Society (where I work), along with FICCI and DSCI, is holding seven round tables on a civil society draft of the privacy bill and the industry-led efforts on co-regulation.

The Indian ITES, KPO and BPO sector should be particularly pleased with this development. As should any other Indian enterprise that holds personal information of EU and US nationals. This is because the EU, after the enactment of the law, will consider data protection in India adequate as per the requirements of its Data Protection Directive. This would mean that these enterprises would not have to spend twice the time and resources ensuring compliance with two different regulatory regimes.

Is the lack of enthusiasm for privacy in the Indian private sector symptomatic of Indian societal values? Can we blame it on cultural relativism, best exemplified by what Simon Davies calls “the Indian Train Syndrome, in which total strangers will disclose their lives on a train to complete strangers”? But surely, when email addresses are exchanged at the end of that conversation, they are not accompanied by passwords. Privacy is perhaps differently configured in Indian societies but it is definitely not dead. Fortunately for us, calls to protect this important human right are growing every day.

The Personal Data (Protection) Bill, 2013

by Prachi Arya last modified Aug 30, 2013 02:53 PM
Below is the text of the Personal Data (Protection) Bill, 2013 as discussed at the 6th Privacy Roundtable, New Delhi held on 24 August 2013. Note: This version of the Bill caters only to the Personal Data regime. The surveillance and privacy of communications regime was not discussed at the 6th Privacy Roundtable.

Personal Data (Protection) Bill.pdf — PDF document, 193 kB (198250 bytes)

Report on the Sixth Privacy Roundtable Meeting, New Delhi

by Prachi Arya last modified Aug 30, 2013 03:04 PM
In 2013 the Centre for Internet and Society (CIS) drafted the Privacy Protection Bill as a citizens' version of privacy legislation for India. Since April 2013, CIS has been holding Privacy Roundtables in collaboration with the Federation of Indian Chambers of Commerce and Industry (FICCI) and DSCI, with the objective of gaining public feedback on the Privacy Protection Bill and other possible frameworks for privacy in India. The following is a report on the Sixth Privacy Roundtable held in New Delhi on August 24, 2013.
Report on the Sixth Privacy Roundtable Meeting, New Delhi

A banner of the event with logos of all the organisers


This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC.


Introduction

A series of seven multi-stakeholder roundtable meetings on "privacy" was conducted by CIS in collaboration with FICCI from April 2013 to August 2013 under the Internet Governance initiative. DSCI joined CIS and FICCI as a co-organizer on April 20, 2013.

CIS was a member of the Justice A.P. Shah Committee which drafted the "Report of the Group of Experts on Privacy". CIS also drafted a Privacy (Protection) Bill 2013 (hereinafter referred to as ‘the Bill’), with the objective of establishing a well-protected privacy regime in India. CIS has also volunteered to champion the sessions/workshops on "privacy" in the final meeting on Internet Governance proposed for October 2013.

At the roundtables the Report of the Group of Experts on Privacy and the text of the Privacy (Protection) Bill 2013 will be discussed. The discussions and recommendations from the six round table meetings will be presented at the Internet Governance meeting in October 2013.

The dates of the seven Privacy Roundtable meetings are listed below:

  1. New Delhi Roundtable: April 13, 2013
  2. Bangalore Roundtable: April 20, 2013
  3. Chennai Roundtable: May 18, 2013
  4. Mumbai Roundtable: June 15, 2013
  5. Kolkata Roundtable: July 13, 2013
  6. New Delhi Roundtable: August 24, 2013
  7. New Delhi Final Roundtable and National Meeting: October 19, 2013

This Report provides an overview of the proceedings of the Sixth Privacy Roundtable (hereinafter referred to as 'the Roundtable'), conducted at FICCI, Federation House in Delhi on August 24, 2013. The Personal Data (Protection) Bill, 2013 was discussed at the Roundtable.

The Sixth Privacy Roundtable began with reflections on the evolution of the Bill. In its penultimate form, the Bill stands substantially changed as compared to its previous versions. For the purpose of this Roundtable, which drew participation largely from industry organizations and other entities that handle personal data, only the personal data regime was discussed. This debate was kept separate from the general and specific discussion relating to privacy, surveillance and interception of communications, as it was felt that greater expertise was required to deal adequately with such a vast and nuanced area. After further discussion with security experts, the provisions on surveillance and privacy of communications will be reincorporated, resulting in an omnibus privacy legislation. To reflect this narrowing of the Bill’s ambit in its current form, its title was changed from the more expansive Privacy (Protection) Bill to the Personal Data (Protection) Bill.

Chapter I – Preliminary

Section 2 of the first chapter enumerates various definitions, including ‘personal data’, defined as any data that can lead to identification, and ‘sensitive personal data’, a subset of personal data defined by way of a list. The main contentions arose in relation to the latter definition.

Religion and Caste

A significant modification is found in the definition of ‘sensitive personal data’, which has expanded to include two new categories, namely, (i) ethnicity, religion, race or caste, and (ii) financial and credit information. Although discussed previously, these two categories had hitherto been left out of the purview of the definition as they are fraught with issues of practicality. In the specific example of caste, the government has historically engaged in large-scale data collection for the purpose of the census, for example as conducted by the Ministry of Rural Development and the Ministry of Social Justice and Empowerment, Government of India. Further, in the Indian scenario, various statutory benefits accrue from caste identities under the aegis of affirmative action policies. Hence, categorizing caste as sensitive personal data may not be considered desirable. The problem is further exacerbated with respect to religion, as even a person’s name can be an indicator. In light of this, some issues under consideration were –

  • Whether religion and caste should be categorized as sensitive personal data or personal data?
  • Whether it is impracticable to include it in either category?
  • If included as sensitive personal data, how should it be implemented?

The majority seemed to lean towards including it under the category of sensitive personal data rather than personal data. It was argued that the categorization of some personal data as sensitive was done on the basis of a higher potential for profiling or discrimination. In the same vein, caste and religious identities were sensitive information, requiring the greater protection provided under section 16 of the Bill. Regarding the difficulties posed by revealing names, it was proposed that since a name was not an indicator by default, this consideration could not be used as a rationale to eliminate religion from the definition. Instead, it was suggested that programmes sensitizing the populace to the implications of names as indicators of religion or caste should be encouraged. With regard to the issue of the census, where caste information is collected, it was opined that the same could be done anonymously as well. The maintenance of public databases including such information by various public bodies was considered problematic for privacy, as they are often easily accessible and hence have a high potential for abuse. Overall, the conclusion was that the potential for abuse of such data could be better curtailed if greater privacy requirements were mandated for both private and public organizations. The collection of this kind of data should be done on a necessity basis and kept anonymous wherever possible. However, it was acknowledged that there were greater impracticalities associated with treating religion and caste as sensitive personal data. Further, the use and disclosure of indicative names was considered to be a matter of choice. Often caste information is revealed for affirmative action schemes, for example in rank lists for admissions or appointments. In such cases, it was considered counter-productive to discourage the beneficiary from revealing such information. Consequently, it was suggested that these categories could be regulated differently and qualified wherever required. The floor was then thrown open for discussing the other categories included under the definition of ‘sensitive personal data’.

Political Affiliation

Another contentious issue discussed at the Roundtable was the categorization of ‘political affiliation’ as ‘sensitive personal data’. A participant questioned the validity of including it in the definition, arguing that it was not an issue in India. Further, it was argued that one’s political affiliation was also subject to change and hence did not mandate the higher protection provided for sensitive personal data. Instead, if included at all, it should be categorized as ‘personal data’. This was countered by other participants, who argued that revealing such information should be a matter of choice and that, if this choice is not protected adequately, it may lead to persecution. In light of this, a change in one’s political affiliation in particular required greater protection, as it may leave one more vulnerable. Everyone was in agreement that the aggregation of this class of data, particularly when conducted by public and private organizations, was highly problematic, as evidenced by its historic use for targeting dissident groups. Further, it was accepted unanimously that this protection should not extend to public figures, as citizens have a right to know their political affiliation. However, although there was consensus on voting being treated as sensitive personal data, the same could not be reached for extending this protection to political affiliation.

Conviction Data

The roundtable also elicited a debate on conviction data being enumerated as sensitive personal data. The contention stemmed from the usefulness of maintaining this information as a matter of public record. Inter alia, the judicial practice of considering conviction history for repeat offenders, the need to consider this data before issuing passports, and the possibility of establishing a sex offenders registry in India were cited as examples.

Financial and Credit Information

From the outset, the inclusion of financial and credit information as sensitive personal data was considered problematic, as it would clash directly with existing legislation. Specifically, the Reserve Bank of India regulates all issues revolving around this class of data. However, it was considered expedient to categorize it in this manner due to the grave mismanagement associated with it, despite existing protections. In this regard, the handling of credit information was raised as an issue. Even though it is regulated under the Credit Information Companies (Regulation) Act, 2005, its implementation was found wanting by some participants. In this context, the harm sought to be prevented by its inclusion in the Bill was the unregulated sharing of credit-worthiness data with foreign banks and organs of the state. Informed consent was offered as the primary qualifier. However, some participants proposed that extending a strong regime of protection to such information would not be economically viable for financial institutions. Thus, it was suggested that this information should instead be categorized as personal data, with the aim of regulating unauthorized disclosures.

Conclusion

The debate on the definition of sensitive personal data concluded with the following suggestions and remarks:

  • The categories included under sensitive personal data should be subject to contextual provisions instead of blanket protection.
  • Sensitive personal data mandates greater protection with regard to storage and disclosure than personal data.
  • While obtaining prior consent is important for both kinds of data, obtaining informed consent is paramount for sensitive personal data.
  • Both classes of data can be collected for legitimate purposes and in compliance with the protection provided by law.

Chapter II – Regulation of Personal Data

This chapter of the Bill establishes a negative statement of a positive right under Section 3 along with exemptions under Section 4, as opposed to the previous version of the Bill, discussed at the fifth Privacy Roundtable, which established a positive right. Thus, in its current form, the Bill provides a stronger regime for the regulation of personal data. The single exemption provided under this part is for personal or domestic use.

The main issues under consideration with regard to this part were –

  • The scope of the protection provided
  • Whether the exemptions should be expanded or diminished.

A participant raised a doubt regarding the subject of the right. In response, it was clarified that the Bill was subject to existing Constitutional provisions and relevant case law. According to the apex court in Kharak Singh v. The State of U.P. (1964), the Right to Privacy arises from the Right to Life and Personal Liberty enshrined under Article 21 of the Constitution of India. Since the Article 21 right is applicable to all persons, the Right to Privacy has to be interpreted in conjunction with it. Consequently, the Right to Privacy will apply to both citizens and non-citizens in India. It would also extend to the information of foreigners stored by any entity registered in India and by any other entity having an Indian legal personality, irrespective of whether it is registered in India or not.

The next issue that arose at the Roundtable stemmed from the exemption provided under Section 4 of the Bill. A participant opined that excluding domestic use of such data was inadvisable, as such data was often used maliciously during domestic rows such as divorce. With regard to how ‘personal and domestic use’ was to be defined, it was proposed that the definition had to cater to existing cultural norms. In India, this entailed following existing community laws, which do not recognize nuclear families as a legal entity. It was also acknowledged that Joint Hindu Families had to be dealt with specially and that their connection with large businesses in India would have to be carefully considered.

Another question regarding exemptions brought up at the Roundtable was whether they should be broadened to include the information of public servants and the handling of all information by intelligence agencies. Similarly, some participants proposed that exemptions or exceptions should be provided for journalists, private figures involved in cases of corruption, politicians, private detective agencies etc. It was also proposed that public disclosure of information should be handled differently than information handled in the course of business.

Conclusion

The overall conclusion of the discussion on this Chapter was –

  • All exemptions and exceptions included in this Chapter should be narrowly tailored and specifically defined.
  • Blanket exemptions should be avoided. The specificities can be left to the Judiciary to adjudicate on as and when contentions arise.

Chapter III – Protection of Personal Data

This chapter seeks to regulate the collection, storage, processing, transfer, security and disclosure of personal data.

Collection of Personal Data

Sections 5, 6 and 7 of the Bill regulate the collection of personal data. While Section 5 establishes a broad bar on the collection of personal data, Sections 6 and 7 provide for deviations from the same, for collecting data with and without prior informed consent respectively.

Collection of Data with Prior Informed Consent

Section 6 establishes the obligation to obtain prior informed consent, sets out the regime for the same and, by way of two provisos, allows for withdrawal of consent, which may result in denial of certain services.

The main issues discerned from this provision involved (i) notice for obtaining consent, (ii) mediated data collection, and (iii) destruction of data.

Regarding notice, some participants observed that although it was a good practice, it was not always feasible. A participant raised the issue of the frequency of obtaining consent. It was observed that services that allow users to stay logged in, the storage of cookies and the like were considered benefits, which would be disrupted if consent had to be obtained at every stage or each time the service was used. To solve this problem, it was unanimously accepted that consent only had to be obtained once for the entirety of the service offered, except when the contract or terms and conditions were altered by the service provider. It was also decided that the entity directly conducting the collection of data was obligated to obtain consent, even if the collection was conducted on behalf of a 3rd party.

Mediated data collection proved to be a highly contentious issue at the Roundtable. The issue was determining the scope and extent of liability in cases where a mediating party collects data for a data controller about another subject, who may or may not be a user. In this regard, two scenarios were discussed – (i) a data subject uploading pictures of a 3rd party on social media sites like Facebook, and (ii) using mobile phone applications to send emails, which involves, inter alia, the sender, the phone manufacturer and the receiver. The ancillary issues recognized by participants in this regard were – (i) how would data acquired in this manner be treated if it could lead to the identification of the 3rd party, and (ii) whether destruction of user data due to withdrawal of consent amounts to destruction of general data, i.e. that of the 3rd party. The consensus was that there was no clarity on how such forms of data collection could be regulated, even though it seemed expedient to do so. The government’s inability to find a suitable solution was also brought to the table. In this regard, it was suggested by some participants that the Principle of Collection Limitation, as defined in the A.P. Shah Committee Report, would provide a basic protection. Further, the extent to which such collection would be exempted as personal use was suggested as a threshold. A participant observed that it would be technically unfeasible for the service provider to regulate such collection, even if it involved illicit data such as pornographic or indecent photographs. Further, it was opined that such oversight by the service provider could be undesirable, since it would result in the violation of the user’s privacy. Thus, any proposal for regulation had to balance the data subject’s rights with those of the 3rd party. In light of this, it was suggested that the mediating party should be made responsible for obtaining consent from the 3rd party.

Another aspect of this provision which garnered much debate was the proviso mandating destruction of data in case of withdrawal of consent. A participant stated the need for including broad exceptions, as destruction may not always be desirable. Regarding the definition of ‘destroy’, as provided for under Section 2, it was observed that it mandated the erasure or deletion of the data in its entirety. Instead, it was suggested that the same could be achieved by merely anonymising the information.

Collection of Data without Consent

Section 7 of the Bill outlines four scenarios which entail collection of personal data without prior consent, which are reproduced below -

“(a) necessary for the provision of an emergency medical service to the data subject;
(b) required for the establishment of the identity of the data subject and the collection is authorised by a law in this regard;
(c) necessary to prevent a reasonable threat to national security, defence or public order; or
(d) necessary to prevent, investigate or prosecute a cognisable offence”

Most participants at the Roundtable found that the list was too large in scope. The unqualified inclusion of prevention in the last two sub-clauses was found to be particularly problematic. It was suggested that Section 7(c) was entirely redundant, as its provisions could be read into Section 7(d). Furthermore, the inclusion of ‘national security’ as a basis for collecting information without consent was rejected almost unanimously. It was suggested that if it was to be included, then a qualification was desirable, allowing collection of information only when authorized by law. Some participants extended this line of reasoning to Section 7(c), as state agencies were already authorized to collect information in this manner. It was opined that including it under the Bill would reassert their right to do so in broader terms. For similar reasons, Section 7(b) was found objectionable as well. It was further suggested that if sub-clauses (b), (c) and (d) remained in the Bill, they should be subject to existing protections, for example those established by seminal cases such as Maneka Gandhi v. Union of India (1978) and PUCL v. Union of India (1997).

Storage and Processing of Personal Data

Section 8 of the Bill lays down a principle mandating the destruction of the information collected, following the cessation of the necessity or purpose for storage and provides exceptions to the same. It sets down a regime of informed consent, purpose specific storage and data anonymization.

The first amendment suggested for this provision was regarding the requirement of deleting the stored information ‘forthwith’. It was proposed by a participant that deleting personal data instantaneously had practical constraints and that a reasonability criterion should be added. It was also noticed that in the current form of the Bill, the exception for historical, archival and research purposes had been replaced by the more general phrase ‘for an Act of Parliament’. The previous definition was altered as the terms being used were hard to define. In response, a participant suggested a broader phrase which would include any legal requirement. Another participant argued that a broader phrase would need to be more specifically defined to avoid dilution.

Section 9 of the Bill sets out two limitations for processing data in terms of (i) the kind of personal data being processed and (ii) the purpose for the same. The third sub clause enumerates exceptions to the abovementioned principles in language similar to that found in Section 7.

With regard to the purpose limitation clause, it was suggested by many participants that the same should be broadened to include multiple purposes, as purpose swapping is widespread in existing practice and would be unfeasible and undesirable to curtail. Sub-clause 3 of this Section was critiqued for the same reasons as Section 7.

Section 10 restricts cross-border transfer of data. It was clarified that different departments of the same company, or of the same holding company, would be treated as different entities for the purpose of identifying the data processor. However, a concern was raised regarding the possibility of increased bureaucratic hurdles to the global transfer of data in case this section is read too strictly. At the same time, to provide adequate protection of the data subject’s rights, certain restrictions on the data controller and on the location of transfer were considered necessary.

The regime for disclosure of personal data without prior consent is provided for by Section 14. The provision did not specify the rank of the police officer in charge of passing orders for such disclosure. It was observed that a suitable rank had to be identified to ensure adequate protection. Further, it was suggested that the provision be broadened to include other competent agencies as well. This could be included by way of a schedule or subsequent notifications.

Conclusion

  • Mediated collection of data should be qualified on the basis of purpose and intent of collection.
  • The issue of cost to company (C2C) was not given adequate consideration in the Bill.
  • Procedures need to be laid down at all stages of handling personal data.
  • Special exemptions need to be provided for journalistic sources.

Meeting Conclusion

The Sixth Privacy Roundtable was the second to last of the stakeholder consultations conducted for the Citizens’ Personal Data (Protection) Bill, 2013. Various changes made to the Bill from its last form were scrutinized closely and suitable suggestions were provided. Further changes were recommended for various aspects of it, including definitions, qualifications and procedures, liability and the chapter on offences and penalties. The Bill will be amended to reflect multi-stakeholder suggestions and cater to various interests.

6th Privacy Roundtable

by Prachi Arya last modified Aug 30, 2013 08:15 AM
6th Privacy Roundtable

CIS Cybersecurity Series (Part 10) - Lawrence Liang

by Purba Sarkar last modified Sep 10, 2013 08:31 AM
CIS interviews Lawrence Liang, researcher and lawyer, and co-founder of Alternative Law Forum, Bangalore, as part of the Cybersecurity Series.

"The right to privacy and the right to free speech have often been understood as distinct rights. But I think in the ecology of online communication, it becomes crucial for us to look at the two as being inseparable. And this is not entirely new in India. But, interestingly, a lot of the cases that have had to deal with this question in the Indian context, have pitted one against the other. Now, India doesn't have a law for the protection of whistle-blowers. So how do we now think of the idea of whistle-blowers being one of the subjects of speech and privacy coming together? How do we use the strong pillars that have been established, in terms of a very rich tradition that Indian law has, on the recognition of free speech issues but slowly start incorporating questions of privacy?" - Lawrence Liang, researcher and lawyer, Alternative Law Forum. 

Centre for Internet and Society presents its tenth installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

Lawrence Liang is one of the co-founders of the Alternative Law Forum where he works on issues of intellectual property, censorship, and the intersection of law and culture. He is also a fellow with the Centre for Internet and Society and serves on its board.  

 
This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

Out of the Bedroom

by Nishant Shah last modified Sep 06, 2013 08:32 AM
We have shared it with our friends. We have watched it with our lovers. We have discussed it with our children and talked about it with our partners. It is in our bedrooms, hidden in sock drawers. It is in our laptops, in a folder marked "Miscellaneous". It is in our cellphones and tablets, protected under passwords. It is the biggest reason why people have learned to clean their browsing history and cookies from their browsers.

The article by Nishant Shah was published in the Indian Express on August 25, 2013.


Whether we go into surreptitious shops to buy unmarked CDs or trawl through Torrent and user-generated content sites in the quest of a video, there is no denying the fact that it has become a part of our multimedia life. Even in countries like India, where consumption and distribution of pornography are punished by law, we know that pornography is rampant. With the rise of the digital technologies of easy copy and sharing, and the internet which facilitates amateur production and anonymous distribution, pornography has escaped the industrial market and become one of the most intimate and commonplace practices of the online world.

In fact, if Google trend results are to be believed, Indians are among the top 10 nationalities searching for pornography daily. Even a quick look at our internet history tells us that it has all been about porn. The morphed pictures of a naked Pooja Bhatt adorned the covers of Stardust in the late 1990s, warning us that the true potential of Photoshop had been realised. The extraordinary sensation of the Delhi Public School MMS case which captured two underage youngsters in a grainy sexcapade announced the arrival of user-generated porn in a big way. The demise of Savita Bhabhi — India's first pornographic graphic novel — is still recent enough for us to remember that the history of the internet in India is book-ended by porn and censorship.

Recent discussions on pornography have been catalysed by a public interest litigation requesting for a ban on internet pornography filed in April by Kamlesh Vaswani. Whether Vaswani's observations on what porn can make us do stem from his own personal epiphany or his self-appointed role as our moral compass is a discussion that merits its own special space. Similarly, a debate on the role, function, and use of pornography in a society is complex, rich and not for today.

Instead, I want to focus on the pre-Web imagination of porn that Vaswani and his endorsers are trying to impose upon the rest of us. There is a common misunderstanding that all porn is the same porn, no matter what the format, medium and aesthetics of representations. Or in other words, a homogenising presumption is that erotic fiction and fantasies, pictures of naked people in a magazine, adult films produced by entertainment houses, and user-generated videos on the internet are the same kind of porn. However, as historical legal debates and public discussions have shown us, what constitutes porn is specific to the technologies that produce it. There was a time when DH Lawrence's iconic novel now taught in undergraduate university courses — Lady Chatterley's Lover — was deemed pornographic and banned in India. In more recent times, the nation was in uproar at the Choli ke peeche song from Khalnayak which eventually won awards for its lyrics and choreography.

In all the controversy, there has so far been a "broadcast imagination" of how pornography gets produced, consumed and distributed. There is a very distinct separation of us versus them when it comes to pornography. They produce porn. They distribute porn. They push porn down our throats (that was probably a poor choice of words) by spamming us and buying Google adwords to infect our search results. We consume porn. And all we need to do is go and regulate, like we do with Bollywood, the central management and distribution mechanism so that the flow of pornography can be curbed. This is what I call a broadcast way of thinking, where the roles of the performers, producers, consumers and distributors of pornography are all distinct and can be regulated.

However, within the murky spaces of the World Wide Web, the scenario is quite different. Internet pornography is not the same as availability of pornography on the internet. True, the digital multimedia space of sharing and peer-2-peer distribution has made the internet the largest gateway to accessing pornographic objects which are produced through commercial production houses. However, the internet is not merely a way of getting access to existing older forms of porn. The internet also produces pornography that is new, strange, unprecedented and is an essential part of the everyday experience of being digitally connected and networked into sociality.

The recent controversies about the former congressman from New York, Anthony Weiner, sexting — sending inappropriate sexual messages through his cellphone — gives us some idea of what internet porn looks like. It is not just something captured on a phone-cam but interactive and collaboratively produced. Or as our own Porngate, where two cabinet ministers of the Karnataka legislative assembly were caught surfing some good old porn on their mobile devices while the legislature was in session, indicated, porn is not something confined to the privacy of our rooms. Naked flashmobs, young people experimenting with sexual identities in public, and sometimes bizarre videos of a bus-ride where the camera merely captures the banal and the everyday through a "pornographic gaze" are also a part of the digital porn landscape. The world of virtual reality and multiple online role-playing games offer simulated sexual experiences that allow for human, humanoid, and non-human avatars to engage in sexual activities in digital spaces. Peer-2-peer video chat platforms like Chatroulette, offer random encounters of the naked kind, where nothing is recorded but almost everything can be seen.

The list of pornography produced by the internet — as opposed to pornography made accessible through the internet — is huge. It doesn't just hide in subcultural practices but resides on popular video-sharing sites like YouTube or Tumblr blogs. It vibrates in our cellphones as we connect to people far away from us, and pulsates on the glowing screens of our tablets as we get glimpses of random strangers and their intimate bodies and moments. An attempt to ban and censor this porn is going to be futile because it does not necessarily take the shape of a full narrative text which can be examined by others to judge its moral content. Any petition that tries to censor such activities is going to fall flat on its face because it fails to recognise that sexual expression, engagement and experimentation is a part of being human — and the ubiquitous presence of digital technologies in our life is going to make the internet a fair playground for activities which might seem pornographic in nature. In fact, trying to restrict and censor them, will only make our task of identifying harmful pornography — porn that involves minors, or hate speech or extreme acts of violence — so much more difficult because it will be pushed into the underbelly of the internet which is much larger than the searched and indexed World Wide Web.

Trying to suggest that internet pornography is an appendage which can be surgically removed from everyday cyberspace is to not understand the integral part that pornography and sexual interactions play in the development and the unfolding of the internet. The more fruitful effort would be to try and perhaps create a guideline that helps promote healthy sexual interaction and alerts us to undesirable sexual expressions which reinforce misogyny, violence, hate speech and non-consensual invasions of bodies and privacy. A blanket ban that tries to sweep all internet porn under a carpet is not going to work — it will just show up as a big bump, in places we had not foreseen.

An Interview with Suresh Ramasubramanian

by Elonnai Hickok last modified Sep 06, 2013 09:37 AM
Suresh Ramasubramanian is the ICS Quality Representative - IBM SmartCloud at IBM. The Centre for Internet and Society conducted an interview with him on cybersecurity and issues in the Cloud.
  1. You have done a lot of work around cybersecurity and issues in the Cloud. Could you please tell us of your experience in these areas and the challenges facing them?
    a. I have been involved in antispam activism since the late 1990s and have worked in ISP / messaging provider antispam teams since 2001. Since 2005, I have expanded my focus to include general cyber security and privacy, having written white papers on spam and botnets for the OECD, ITU and UNDP/APDIP. More recently, I have become an M3AAWG special advisor for capacity building and outreach in India.

    In fact, capacity building and outreach have been the focus of my career for a long time now. I have been putting relevant stakeholders from ISPs, government and civil society in India in touch with their counterparts around the world and, at a small level, enabling an international exchange of ideas and information around antispam and security.

    This was a challenge over a decade back when I was a newbie to antispam and it still is. People in India and other emerging economies, with some notable exceptions, are not part of the international communities that have grown in the area of cyber security and privacy.

    There is a prevalent lack of knowledge in this area, combined with gaps in local law and its enforcement. There is a tendency on the part of online criminals to target emerging and fast-growing economies as a rich source of potential victims for various forms of online crime, and sometimes as a safe haven against prosecution.
  2. In a recent public statement, Google said "Cloud users have no legitimate expectation of privacy." Do you agree with this statement?
    a. Let us put it this way. All email received by a cloud or other Internet service provider for its customers is automatically processed and data mined in one form or the other. At one level, this can be done for spam filtering and other security measures that are essential to maintain the security and stability of the service, and to protect users from being targeted by spam, malware and potential account compromises (a minimal, illustrative sketch of what such automated processing can look like follows at the end of this answer).

    The actual intent of automated data mining and processing should be transparently communicated to customers of a service through a clearly defined privacy policy. How such processing is deployed, and the “end use” to which the mined data is put, are key to agreeing or disagreeing with such a statement.

    It goes without saying that such processing must stay within the letter, scope and spirit of a company’s privacy policy, and must actually be structured to be respectful of user privacy.

    Especially where mined data is used to provide user advertising or for any other commercial purpose (such as being aggregated and resold), strict adherence to a well-written privacy policy is essential, as is periodic review of that policy and its implementation to examine compliance with the laws of all countries in which the company operates.

    There is way too much noise in the media for me to usefully add any more to this issue and so I will restrict myself to the purely general comments above.
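
    To make the idea of automated processing a little more concrete, below is a minimal, hypothetical sketch of keyword-based spam scoring in Python. The terms, weights and threshold are illustrative assumptions only, not the filtering logic of any actual provider.

    # A minimal, hypothetical sketch of automated mail processing for spam scoring.
    # The keyword weights and the threshold are illustrative assumptions, not any
    # provider's actual filtering rules.

    SUSPICIOUS_TERMS = {
        "lottery": 2.0,
        "wire transfer": 2.5,
        "verify your account": 3.0,
        "click here": 1.0,
    }
    SPAM_THRESHOLD = 3.0  # assumed cut-off for flagging a message


    def spam_score(message_body: str) -> float:
        """Sum the weights of suspicious terms found in the message body."""
        body = message_body.lower()
        return sum(weight for term, weight in SUSPICIOUS_TERMS.items() if term in body)


    def classify(message_body: str) -> str:
        """Label a message as 'spam' or 'ham' based on its score."""
        return "spam" if spam_score(message_body) >= SPAM_THRESHOLD else "ham"


    # Example: this message trips both "verify your account" and "lottery".
    print(classify("Please verify your account to claim your lottery prize."))  # spam

    Real-world filters are of course far more sophisticated, relying on statistical models, sender reputation and behavioural signals, which is exactly why the privacy policy governing such processing matters.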
  3. In what ways can the privacy of an individual be compromised on the cloud? What can be done to prevent such instances of compromise?
    a. All the recent headlines about companies mining their own users’ data, and yet more headlines about different countries deploying nationwide or even international lawful intercept and wiretap programs, aside – the single largest threat to individual privacy on the cloud is, and has been since years before the word “cloud” came into general use, the constant targeting of online users by online criminals with a variety of threats, including scams, phishing campaigns and data / account credential-stealing malware.

    Poor device security is another threat – one that becomes even more serious now that the long talked about “internet of things” seems set to become reality, with cars, baby monitors, even Bluetooth enabled toilets and, more dangerously, critical national infrastructure such as power plants and water utilities becoming accessible over the Internet while still running software that is basically insecure and architected with assumptions dating back to an era when there was no conception of, or need for, connecting these systems to the Internet.

    Someone in Bluetooth range with the appropriate Android application being able to automatically flush your toilet, and even download a list of the dates and times when you last used it, is personally embarrassing. Having your bank account broken into because your computer got infected with a virus is even more damaging. Someone able to access a dam’s control panel over the internet and remotely trigger the dam’s gates to open can cause far more catastrophic damage.

    The line between security and privacy, between normal business practice and unacceptable, even illegal behaviour, is sometimes quite thin and in a grey area that may be leveraged to the hilt for commercial and/or national security interests. However, scams, malware, exploits of insecure systems and similar threats are well on the wrong side of the “criminal” spectrum, and are a clear and present danger that cause far more than an embarrassing or personally damaging loss of privacy.
  4. How is the jurisdiction of the data on the cloud determined?
    This is a surprisingly thorny question. Normally, a company is based in a particular country and has an end user agreement / terms of service that makes its customers / users accept that country’s jurisdiction.

    However, a cloud based provider that does business around the world may, in practice, have to comply, to some extent at least, with that country’s local laws – at any rate, in respect of its users who are citizens of that country. And any cloud product sold to a local business or individual by a salesman from the vendor’s branch in the country would possibly fall under a contract executed in the country and would therefore be subject to local law.

    The level of compliance for data retention and disclosure in response to legal processes will possibly vary from country to country – ranging from flat refusals to cooperate (especially where a law enforcement request is for data about something that is quite legal in the country the cloud provider is based in) to actual compliance.

    In practice this may also depend on what is at stake for the cloud vendor in complying or refusing to comply with local laws – regardless of what the terms of use policies or contract assert about jurisdiction: the number of users the cloud vendor has in the country, the extent of its local presence, and how vulnerable its resident employees and executives are to legal sanctions or punishment.

    In the past, it has been observed that a practical balance [which may be based on business economics as much as it is based on a privacy assessment] may be struck by certain cloud vendors with a global presence, based on the critical mass of users they stand to gain or lose by complying with local law, and the risks they face if they comply, or conversely do not comply, with local laws. The decision may be to fight lawsuits or prosecutions in court – on charges of breaking local data privacy laws or of not complying with local law enforcement requests for handover of user data – or, in the worst case, to pull out of the country altogether.
  5. Currently, the big cloud owners are US corporations, yet US courts do not extend the same privacy rights to non-US citizens. Is it possible for countries to use the cloud and still protect citizen data from being accessed by foreign governments? Do you think a "National Cloud" is a practical solution?
    a. The “cloud” in this context is just “the internet”, and keeping local data local and within local jurisdiction is possible in theory at any rate. Peering can be used to keep local traffic local instead of having it do a roundtrip through a foreign country and back [where it might or might not be subject to another country’s intercept activities, no comment on that].

    A national cloud demands local infrastructure, including bandwidth, datacenters and so on, that meets the international standards of most global cloud providers. It then requires cloud based sites that provide a level of service, functionality and quality equivalent to that provided by an international cloud vendor. After that, it has to have usable privacy policies, and the country needs a privacy law, a sizeable amount of practical regulation to bolster the law, and a well-defined path for reporting and redress of data breaches. There are a whole lot of other technical and process issues to address before a national cloud becomes a reality, and even more before such a reality makes a palpable positive difference to user privacy.
  6. What audit mechanisms of security and standards exist for Cloud Service Providers and Cloud Data Providers?
    a. Plenty – some specific to the country and to the industry sector or kind of data the cloud handles. The Cloud Security Alliance has been working for quite a while on CloudAudit, a framework developed as part of a cross-industry effort to unify and automate the Assertion, Assessment and Assurance of cloud infrastructure and services.

    Different standards bodies and government agencies have all come out with their own sets of standards and best practices in this area (this article has a reasonable list - http://www.esecurityplanet.com/network-security/cloud-security-standards-what-you-should-know.html). Some standards you absolutely have to comply with for legal reasons.

    Compliance reasons aside, what is needed is a judicious mix of standards, and considerable amounts of adaptation in your processes to make those standards work for you and play well together.

    The standards all exist – what varies considerably, and is a major cause of data privacy breaches, are incomplete or ham-handed implementations of existing standards, attempts at “checkbox compliance” that simply implement a set of steps leading to a required certification, and a lack of continuing initiative to keep the data privacy and security momentum going once these standards have been “achieved”, till it is time for the next audit at any rate.
  7. What do you see as the big challenges for privacy in the cloud in the coming years?
    a. Not very much more than the exact same challenges that privacy in the cloud has faced over the past decade or more. The only difference is that every threat that existed before has been amplified, because the complexity of systems and the level of technology and computing power available to implement security – and to attempt to breach it – is exponentially higher than ever before, and set to increase as we go further down the line.
  8. Do you think encryption is the answer to snooping by private and public institutions?
    a. Encryption of data at rest and in transit is a key recommendation of any data privacy standard and cloud / enterprise security policy, and companies and users are strongly encouraged to deploy and use strong cryptography for personal protection (a minimal, illustrative sketch of encrypting data at rest follows at the end of this answer). But to call it “the answer” is sort of like the tale of the blind men and the elephant.

    There are multiple ways to circumvent encryption – social engineering to trick people into revealing data (which can be mitigated to some extent, or detected if it is tried on a large cross section of your userbase – it is something that security teams do have to watch for), or just plain coercion, which is much tougher to defend against.

    As a very popular XKCD cartoon that has been shared around social media and has been cited in multiple security papers says -

    “A crypto nerd’s imagination”

    “His laptop’s encrypted. Let us build a million dollar cluster to crack it”
    “No good! It is 4096 bit RSA”
    “Blast, our evil plan is foiled”

    “What would actually happen”
    “His laptop’s encrypted. Drug him and hit him with this $5 wrench till he tells us the password”
    “Got it”
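
    As a purely illustrative aside, here is a minimal sketch of what encrypting data at rest can look like, using the third-party Python 'cryptography' package; the package choice, file name and record contents are assumptions made for this example, not a prescription.

    # A minimal sketch of symmetric encryption of data at rest, using the
    # third-party 'cryptography' package (pip install cryptography).
    # The key handling, file name and record below are illustrative only;
    # in practice the key would live in a key-management system, not beside the data.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()
    fernet = Fernet(key)

    record = b"account: 1234567890, balance: 42000"

    # Encrypt before writing to disk ("data at rest").
    with open("record.enc", "wb") as fh:
        fh.write(fernet.encrypt(record))

    # Decrypt only when the record is actually needed again.
    with open("record.enc", "rb") as fh:
        restored = fernet.decrypt(fh.read())

    assert restored == record

    And, as the wrench scenario above suggests, even a correctly encrypted record is only as safe as the people and processes holding the key.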
  9. Spam is now consistently used to get people to divulge their personal data or otherwise compromise a person's financial information and perpetrate illegal activity. Can spam be regulated? If so, how?
    a. Spam has been regulated in several countries around the world. The USA has had laws against spam since 2003. So has Australia. Several other countries have laws that specifically target spam or use other statutes on their books to deal with crime (fraud, the sale of counterfeit goods, theft...) that happens to be carried out through the medium of spam.

    The problems here are the usual problems that plague international enforcement of any law at all. Spammers (and worse online criminals including those that actively employ malware) tend to pick jurisdictions to operate in where there are no existing laws on their activities, and generally take the precaution not to target residents of the country that they live in. Others send spam but attempt to, in several cases successfully, skate around loopholes in their country’s antispam laws.

    Still others fully exploit the anonymity that the Internet provides, with privately registered domain names, anonymizing proxy servers (when they are not using botnets of compromised machines), as well as a string of shell companies and complex international routing of revenue from their spam campaigns, to quickly take money offshore to a more permissible jurisdiction.

    Their other advantage is that law enforcement and regulatory bodies are generally short staffed and heavily tasked, so that even a spammer who operates in the open may continue his activities for a very long time before someone manages to prosecute him.

    Some antispam laws allow recipients of spam to sue the spammer in small claims courts – which, like regulatory action, has previously led to judgements being handed down against spammers, who have been fined or, where their spam has criminal aspects attracting local computer crime laws rather than mere violations of civil antispam laws, possibly imprisoned.
  10. There has been a lot of talk about the use of malware like FinFisher and its ability to compromise national security and individual security. Do you think regulation is needed for this type of malware - and if so, what type? Export controls? Privacy regulation? Use controls?
    a. Malware used by nation states as a part of their surveillance activities is a problem. It is further a problem if such malware is used by nation states that are not even nominally democratic and that have long standing records of human rights violations.

    Regulating or embargoing their sale is not going to help in such cases. One problem is that export controls on such software are not going to be particularly easy to enforce: countries on software export blacklists routinely manage to find newer and more creative ways to get around these controls and to purchase embargoed software and computing equipment of all kinds.

    Another problem is that such software is not produced only by legitimate vendors of lawful intercept gear. Criminals who write malware capable of, say, stealing personal data such as bank account credentials are perfectly capable of writing such software, and there is a thriving underground economy in the sale of malware and of the “take” from malware – personal data, credit cards and bank accounts – from which any rogue nation state can easily acquire products with equivalent functionality.

    This is going to apply even if legitimate vendors of such products are subject to strict regulations governing their sale and national laws exist regulating the use of such products. So while there is no reason not to regulate / provide judicial and regulatory oversight of their sale and intended use, it should not be seen as any kind of a solution to this problem.

    User education in privacy and access to secure computing resources is probably going to be the bedrock of any initiative that looks to protect user privacy – a final backstop to any technical / legal or other measure that is taken to protect them.

Privacy Law Must Fit the Bill

by Sunil Abraham last modified Sep 12, 2013 06:25 AM
The process of updating Indian privacy policy has gained momentum since the launch of the UID project and the leak of the Radia tapes. The Department of Personnel and Training has led the drafting of a privacy bill for the last three years. This bill will ideally articulate privacy principles, establish the office of the privacy commissioner and, most importantly, have an overriding effect over the 50-odd existing laws, rules and policies with privacy implications.
Privacy Law Must Fit the Bill

Sunil Abraham


The article was published in the Deccan Chronicle on September 9, 2013.


Given the harmonizing impact of the proposed privacy bill, we must ensure that rigorous debate and discussion happen before the bill is finalized; otherwise there may be terrible consequences.

Here is a short list of what can possibly go wrong:

One, the privacy bill ignores the massive power asymmetry in Indian societies, undermining the right to information – referred to in other jurisdictions as freedom of information or access to information. The power asymmetry can be addressed via a public interest test: the right to privacy would be the same for everyone except when public interest is at stake. This enables protection of the right to privacy to be inversely proportionate to power and, conversely, the requirement of transparency to be directly proportionate to power. In other words, the poor would have greater privacy than middle-class citizens, who in turn would have greater privacy than political and economic elites. Transparency requirements would be greatest for economic and political elites, lower for middle-class citizens, and lowest for the poor. If this is not properly addressed in the language of the bill, privacy activists will have undone the significant accomplishments of the right to information or transparency movement in India over the last decade.

Two, the privacy bill has a chilling effect on free speech. This can happen either by denying the speaker privacy, or by affording those who are spoken about too much privacy. For the speaker - Know Your Customer (KYC) and data retention requirements for the telecom and internet infrastructure necessary to participate in the networked public sphere can result in the death of anonymous and pseudonymous speech. Anonymous and pseudonymous speech must be protected as it is necessary for good governance, free media, a robust civil society, and vibrant art and culture in a democracy. For those spoken about - privacy is clearly required in certain cases to protect the victims of certain categories of crimes. However, the right to privacy could be abused by those occupying public office and those in public life to censor speech that is in the public interest. If, for example, a sportsperson does not publicly drink the aerated drink that he or she endorses in advertisements, then the public has a right to know.

Three, the privacy bill has a limited scope. Jurisprudence in India derives the right to privacy from the right to life and liberty through several key judgments, including Naz Foundation v. Govt. of NCT of Delhi decided by the Delhi High Court. The right to life and liberty under Article 21, unlike other constitutionally guaranteed fundamental rights, does not distinguish between citizens and non-citizens. As a consequence, the privacy bill must also protect residents, visitors and other persons who may never visit India but whose personal information may travel to India as part of the global outsourcing phenomenon. The obligations and safeguards under the privacy bill must also apply equally to the state and to the private sector entities that could potentially infringe upon the individual's right to privacy. Different levels of protection may be afforded to citizens, residents, visitors and everybody else. Government and private sector data controllers may be subject to different regulations – for example, an intelligence agency may not require the 'consent' of the data subject to collect personal information and may only provide 'notice' after the investigation has cleared the suspect of all charges.

Four, the privacy bill is expected to fix poorly designed technology. There are two diametrically opposite definitions of projects like NATGRID, CMS and UID. The government's definition is that all these systems will allow only for targeted interception and surveillance; the majority of civil society, however, believes that these systems will be used for blanket surveillance. If these systems are indeed built in a manner that supports blanket surveillance, then a legal band-aid in the form of a new law or provision that prohibits blanket surveillance will be a complete failure. The principle of 'privacy by design' is the only way to address this. For example, the shutters of digital cameras are silent, and this allows for a particular form of voyeurism known as the upskirt shot. Almost a decade ago, the Korean government enacted a law that requires camera and mobile phone manufacturers to ensure that an audio recording of a mechanical shutter is played every time the camera function is used. It is also illegal for the user to circumvent or disable this feature. In this example, the principle of notice is hardwired into the technology itself. To remix Spiderman's motto – with great power comes great temptation. We know that a rogue NTRO official installed a spy camera in the office toilet to record female colleagues and, most recently, that NSA officers confessed to spying on their love interests. If technology can be abused, it will be abused. Therefore legal safeguards are a poor substitute for technological safeguards; we need both simultaneously.

Five, the bill does not require compliance with internationally accepted privacy principles, including the ones discussed so far: 'consent', 'notice' and 'privacy by design'. Apart from human rights considerations, the most important imperative to modernize India's privacy laws is trade. We have a vibrant ITES, BPO and KPO sector which handles the personal information of foreigners, mostly from the North American and European continents. The Justice AP Shah committee in October 2012 identified the privacy principles required for India: notice, choice and consent, collection limitation, purpose limitation, access and correction, disclosure of information, security, openness and accountability. A privacy bill that does not include all these principles will increase the regulatory compliance overhead for Indian enterprises with foreign clients and for multinationals operating in India. There is also the risk that privacy regulators in these jurisdictions will ban outsourcing to Indian firms because our privacy laws are not adequate by their standards.

To conclude, it is not sufficient for India to enact a privacy law; it is essential that we get it right so that there are no unintended consequences for other equally important rights and dimensions of our democracy.

Transparency Reports — A Glance on What Google and Facebook Tell about Government Data Requests

by Prachi Arya last modified Sep 13, 2013 09:44 AM
Transparency Reports are a step towards greater accountability but how efficacious are they really?

Prachi Arya examines the transparency reports released by tech giants with a special focus on user data requests made to Google and Facebook by Indian law enforcement agencies.

The research was conducted as part of the 'SAFEGUARDS' project that CIS is doing with Privacy International and IDRC.


According to a recent comScore report, India has become the third largest Internet user base in the world, with nearly 74 million citizens online, behind only China and the United States. The report also reveals that Google is the preferred search engine for Indians and Facebook is the most popular social media website, followed by LinkedIn and Twitter. While users posting their photos on Facebook can limit viewership through privacy settings, there is not much they can do when the government seeks information on their profiles. All that can be said for sure in the post-Snowden world is that large-scale surveillance is a reality and that governments want to extend it to their citizens' online existence. In this Orwellian scenario, transparency reports provide a trickle of information on how much our government finds out about us.

The first transparency report was released by Google three years ago to provide an insight into ‘the scale and scope of government requests for censorship and data around the globe’. Since then, the issuance of such reports has increasingly become standard practice for tech giants. An Electronic Frontier Foundation report reveals that major companies that have followed Google’s lead include Dropbox, LinkedIn, Microsoft and Twitter, with Facebook and Yahoo! being the latest additions. Requests to Twitter and Microsoft from Indian law enforcement agencies were significantly fewer than requests to Facebook and Google. Twitter revealed that Indian law enforcement agencies made fewer than 10 requests, none of which resulted in the sharing of user information. Of the 418 requests made to Microsoft by India (excluding Skype), 88.5 per cent were complied with for non-content user data. The Yahoo! transparency report revealed that six countries surpassed India in terms of the number of user data requests. Indian agencies requested user data 1,490 times, covering 2,704 accounts, for both content and non-content data, and over 50 per cent of these requests were complied with.

The following is a compilation of what the latest transparency reports issued by Facebook and Google reveal.

"The information we share on the Transparency Report is just a sliver of what happens on the internet"
Susan Infantino, Legal Director for Google

Beginning with the July-December 2009 reporting period, Google has published biannual transparency reports:

  • Each report discloses traffic data for Google services globally and statistics on removal requests received from copyright owners or governments, as well as user data requests received from government agencies and courts. It also lays down the legal process to be followed by government agencies seeking data.
  • There was a 90 per cent increase in the number of content removal requests received by Google from India. The requests complied with included:
    • Restricting videos containing clips from the controversial movie “Innocence of Muslims” from view.
    • Many YouTube videos and comments as well as some Blogger blog posts being restricted from local view for disrupting public order in relation to instability in North East India.
  • For user data requests, the Google report details the number of requests and of users/accounts covered, as well as the percentage of requests which were partially or completely complied with. In India, user data requests more than doubled from 1,061 in the July-December 2009 period to 2,431 in the July-December 2012 period, while the compliance rate decreased from 79 per cent in the July-December 2010 period to 66 per cent in the latest report.
  • Jurisdictions outside the United States can seek disclosure using Mutual Legal Assistance Treaties or any ‘other diplomatic and cooperative arrangement’. Google also provides information on a voluntary basis when a request follows a valid legal process and is in consonance with international norms, U.S. law, the requesting country's laws and Google’s policies.

Facebook

    "We hope this report will be useful to our users in the ongoing debate about the proper standards for government requests for user information in official investigations."
    Colin Stretch, Facebook General Counsel

Facebook inaugurated its first ever transparency report last Tuesday with a promise to continue releasing these reports.

  • The ‘Global Government Requests Report’ provides information on the number of requests received by the social media giant for user/account information by country and the percentage of requests it complied with. It also includes operational guidelines for law enforcement authorities.
  • The report covers the first six months of 2013, up to June 30. In this period, India made 3,245 requests covering 4,144 users/accounts, and half of these requests were complied with.
  • Jurisdictions outside the United States can seek disclosure by way of mutual legal assistance treaty requests or letters rogatory. Legal requests can be in the form of search warrants, court orders or subpoenas. The requests are usually made in furtherance of criminal investigations, but no details about the nature of such investigations are provided.
  • Broad or vague requests are not processed. The requests are expected to include details of the law enforcement authority issuing the request and the identity of the user whose details are sought.

The Indian Regime

Sections 69 and 69B of the Information Technology (Amendment) Act, 2008 prescribe the procedure and set out safeguards for the Indian Government to request user data from corporates. According to section 69, authorized officers can issue directions to intercept, monitor or decrypt information for the following reasons:

  1. Sovereignty or integrity of India,
  2. Defence of India,
  3. Security of the state,
  4. Friendly relations with foreign states,
  5. Maintenance of public order,
  6. Preventing incitement to the commission  of any cognizable offence relating to the above, or
  7. For investigation of any offence.

Section 69B empowers authorized agencies to monitor and collect information for cyber security purposes, including ‘for identification, analysis and prevention of intrusion and spread of computer contaminants’. Additionally, there are rules under sections 69 and 69B that regulate interception under these provisions.

Information can also be requested through the Controller of Certifying Authorities under section 28 of the IT Act, which circumvents the stipulated procedure. If the request is not complied with, the intermediary may be penalized under section 44.

The Indian Government has been increasingly leaning towards greater control over online communications. In 2011, Yahoo! was slapped with a penalty of Rs. 11 lakh for not complying with a section 28 request, which called for the email information of a person on the grounds of national security, although the court subsequently stayed the Controller of Certifying Authorities' order. In the same year, the government called for the pre-screening of user content by internet companies and social media sites to ensure the deletion of ‘objectionable content’ before it was published. Similarly, the government has increasingly sought greater online censorship, using the Information Technology Act to arrest citizens for social media posts, comments and even emails criticizing the government.

What does this mean for Privacy?

The Google transparency report shows a year-on-year increase in government data requests. The reports published by Google and Facebook reveal that the number of government requests from India is second only to the United States. Further, more than 50 per cent of the requests from India have led to disclosure by nearly all the companies surveyed in this post, with Twitter being the single exception.
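For a rough side-by-side view, the sketch below simply tabulates the India figures quoted in this post. The reporting periods differ by company, so the comparison is indicative only, and the Twitter row uses the "fewer than 10 requests" figure as an upper bound.

    # Indicative comparison of the India figures quoted in this post.
    # Reporting periods differ by company, so the numbers are not directly comparable.

    india_requests = [
        # (company, user data requests, approx. % of requests complied with)
        ("Google",    2431, 66.0),   # July-December 2012 report
        ("Facebook",  3245, 50.0),   # January-June 2013 report, "half" complied with
        ("Microsoft",  418, 88.5),   # non-content data, excluding Skype
        ("Yahoo!",    1490, 50.0),   # "over 50 per cent" complied with
        ("Twitter",     10,  0.0),   # "fewer than 10" requests (upper bound), none complied with
    ]

    for company, requests, complied_pct in india_requests:
        disclosed = round(requests * complied_pct / 100)
        at_least_half = "yes" if complied_pct >= 50 else "no"
        print(f"{company:<10} requests={requests:<5} complied={complied_pct:5.1f}% "
              f"(~{disclosed} requests); at least half disclosed: {at_least_half}")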

Undeniably, transparency reports are important accountability mechanisms which reaffirm a company’s commitment to protecting its users’ privacy. However, basic statistics and vague information cannot lift the veil on the full scope of surveillance. Even though Google’s report has steadily moved towards more nuanced disclosure, it would only be meaningful if, inter alia, it included a break-up of the purposes behind the requests. Similarly, although Google has included a general outline of the legal process, more specifics need to be disclosed. For example, the report could provide statistics on notifications to indicate how often users under scrutiny are not notified. Such disclosures are important to enhance users’ understanding of when their data may be accessed and for what purposes, particularly when there is no prior or retrospective intimation. Until the report can provide comprehensive details about the kind of surveillance that websites and internet services are subjected to, it will be of very limited use. Its greatest limitation, however, may lie beyond its scope.

The monitoring regime envisioned under the Information Technology Act effectively lays down an overly broad system which may easily lead to abuse of power. Further, the Indian Government has become infamous for its attempts to control websites and social media. Now, with the Indian Government’s plan to establish the Central Monitoring System, the need for intermediaries to conduct interception may be done away with, giving the government unfettered access to user data and potentially rendering corporate transparency about data requests obsolete.

Privacy Meeting Brussels - Bangalore Slides

by Prasad Krishna last modified Sep 12, 2013 07:55 AM

PDF document icon presentation_vub_lsts_v3.pdf — PDF document, 1269 kB (1300025 bytes)

Privacy and Surveillance Talk by Sunil Abraham

by Prasad Krishna last modified Sep 13, 2013 09:47 AM

PDF document icon lecture_ccmg_2013september18.pdf — PDF document, 212 kB (217342 bytes)

The National Privacy Roundtable Meetings

by Bhairav Acharya last modified Mar 21, 2014 10:03 AM
The Centre for Internet & Society ("CIS"), the Federation of Indian Chambers of Commerce and Industry ("FICCI"), the Data Security Council of India ("DSCI") and Privacy International are, in partnership, conducting a series of national privacy roundtable meetings across India from April to October 2013. The roundtable meetings are designed to discuss possible frameworks for privacy in India.

This research was undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC.


Background: The Roundtable Meetings and Organisers

CIS is a Bangalore-based non-profit think-tank and research organisation with interests in, amongst other fields, the law, policy and practice of free speech and privacy in India. FICCI is a non-governmental, non-profit association of approximately 250,000 Indian bodies corporate. It is the oldest and largest organisation of businesses in India and represents a national corporate consensus on policy issues. DSCI is an initiative of the National Association of Software and Service Companies, a non-profit trade association of Indian information technology ("IT") and business process outsourcing ("BPO") concerns, which promotes data protection in India. Privacy International is a London-based non-profit organisation that defends and promotes the right to privacy across the world.

Privacy in the Common Law and in India

Because privacy is a multi-faceted concept, it has rarely been singly regulated. A taxonomy of privacy yields many types of individual and social activity to be differently regulated based on the degree of harm that may be caused by intrusions into these activities.[1]

The nature of the activity is significant; activities that are implicated by the state are attended by public law concerns and those conducted by private persons inter se demand market-based regulation. Hence, because the principles underlying warranted police surveillance differ from those prompting consensual collections of personal data for commercial purposes, legal governance of these different fields must proceed differently. For this and other reasons, the legal conception of privacy — as opposed to its cultural construction – has historically been diverse and disparate.

Traditionally, specific legislations have dealt separately with individual aspects of privacy in tort law, constitutional law, criminal procedure and commercial data protection, amongst other fields. The common law does not admit an enforceable right to privacy.[2] In the absence of a specific tort of privacy, various equitable remedies, administrative laws and lesser torts have been relied upon to protect the privacy of claimants.[3]

The question of whether privacy is a constitutional right has been the subject of limited judicial debate in India. The early cases of Kharak Singh (1964)[4] and Gobind (1975)[5] considered privacy in terms of physical surveillance by the police in and around the homes of suspects and, in the latter case, the Supreme Court of India found that some of the Fundamental Rights “could be described as contributing to the right to privacy” which was nevertheless subject to a compelling public interest. This inference held the field until 1994 when, in the Rajagopal case (1994),[6] the Supreme Court, for the first time, directly located privacy within the ambit of the right to personal liberty guaranteed by Article 21 of the Constitution of India. However, Rajagopal dealt specifically with a book; it did not consider the privacy of communications. In 1997, the Supreme Court considered the question of wiretaps in the PUCL case (1996)[7] and, while finding that wiretaps invaded the privacy of communications, it continued to permit them subject to some procedural safeguards.[8] A more robust statement of the right to privacy was made recently by the Delhi High Court in the Naz Foundation case (2009)[9] that de-criminalised consensual homosexual acts; however, this judgment is now in appeal.

Attempts to Create a Statutory Regime

The silence of the common law leaves the field of privacy in India open to occupation by statute. With the recent and rapid growth of the Indian IT and BPO industry, concerns regarding the protection of personal data to secure privacy have arisen. In May 2010, the European Union ("EU") commissioned an assessment of the adequacy of Indian data protection laws to evaluate the continued flow of personal data of European data subjects into India for processing. That assessment made adverse findings on the adequacy and preparedness of Indian data protection laws to safeguard personal data.[10]

Conducted amidst negotiations for a free trade agreement between India and the EU, the failed assessment potentially impeded the growth of India’s outsourcing industry that is heavily reliant on European and North American business.

Consequently, the Department of Electronics and Information Technology of the Ministry of Communications and Information Technology, Government of India, issued subordinate legislation under the rule-making power of the Information Technology Act, 2000 ("IT Act"), to give effect to section 43A of that statute. These rules – the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 ("Personal Data Rules")[11] — were subsequently reviewed by the Committee on Subordinate Legislation of the 15th Lok Sabha.[12] The Committee found that the Personal Data Rules contained clauses that were ambiguous, invasive of privacy and potentially illegal.[13]

In 2011, draft privacy legislation called the ‘Right to Privacy Bill, 2011’, which had been prepared within the Department of Personnel and Training ("DoPT") of the Ministry of Personnel, Public Grievances and Pensions, Government of India, was made available on the internet along with several file notings ("First DoPT Bill"). The First DoPT Bill contained provisions for the regulation of personal data, interception of communications, visual surveillance and direct marketing. The First DoPT Bill was referred to a Committee of Secretaries chaired by the Cabinet Secretary which, on 27 May 2011, recommended several changes, including re-drafts of the chapters relating to interception of communications and surveillance.

Aware of the need for personal data protection laws to enable economic growth, the Planning Commission constituted a Group of Experts under the chairmanship of Justice Ajit P. Shah, a retired Chief Justice of the Delhi High Court who delivered the judgment in the Naz Foundation case, to study foreign privacy laws, analyse existing Indian legal provisions and make specific proposals for incorporation into future Indian law. The Justice Shah Group of Experts submitted its Report to the Planning Commission on 16 October 2012 wherein it proposed the adoption of nine National Privacy Principles.[14] These are the principles of notice, choice and consent, collection limitation, purpose limitation, access and correction, disclosure of information, security, openness, and accountability. The Report recommended the application of these principles in laws relating to interception of communications, video and audio recordings, use of personal identifiers, bodily and genetic material, and personal data.

Criminal Procedure and Special Laws Relating to Privacy

While the Kharak Singh and Gobind cases first brought the questions of permissibility and limits of police surveillance to the Supreme Court, the power to collect information and personal data of a person is firmly embedded in Indian criminal law and procedure. Surveillance is an essential condition of the nation-state; the inherent logic of its foundation requires the nation-state to perpetuate itself by interdicting threats to its peaceful existence. Surveillance is a method by which the nation-state’s agencies interdict those threats. The challenge for democratic countries such as India is to find the optimal balance between police powers of surveillance and the essential freedoms of its citizens, including the right to privacy.

The regime governing the interception of communications is contained in section 5(2) of the Indian Telegraph Act, 1885 ("Telegraph Act") read with rule 419A of the Indian Telegraph Rules, 1951 ("Telegraph Rules"). The Telegraph Rules were amended in 2007[15] to give effect to, amongst other things, the procedural safeguards laid down by the Supreme Court in the PUCL case. However, India’s federal scheme permits States to also legislate in this regard. Hence, in addition to the general law on interceptions contained in the Telegraph Act and Telegraph Rules, some States have also empowered their police forces with interception functions in certain cases.[16] Ironically, even though some of these State laws invoke heightened public order concerns to justify their invasions of privacy, they establish procedural safeguards, based on the principle of probable cause, that surpass those of the Telegraph Rules.

In addition, further subordinate legislation issued to fulfil the provisions of sections 69(2) and 69B(3) of the IT Act permit the interception and monitoring of electronic communications — including emails — to collect traffic data and to intercept, monitor, and decrypt electronic communications.[17]

The proposed Privacy (Protection) Bill, 2013 and Roundtable Meetings

In this background, the proposed Privacy (Protection) Bill, 2013 seeks to protect privacy by regulating (i) the manner in which personal data is collected, processed, stored, transferred and destroyed — both by private persons for commercial gain and by the state for the purpose of governance; (ii) the conditions upon which, and procedure for, interceptions of communications — both voice and data communications, including both data-in-motion and data-at-rest — may be conducted and the authorities permitted to exercise those powers; and, (iii) the manner in which forms of surveillance not amounting to interceptions of communications — including the collection of intelligence from humans, signals, geospatial sources, measurements and signatures, and financial sources — may be conducted.

Previous roundtable meetings to seek comments and opinion on the proposed Privacy (Protection) Bill, 2013 took place at:

The roundtable meetings were multi-stakeholder events with participation from industry representatives, lawyers, journalists, civil society organizations and Government representatives. On average, 75 per cent of the participants represented industry concerns, 15 per cent represented civil society and 10 per cent represented regulatory authorities. The model followed at the roundtable meetings allowed for equal participation from all participants.


[1]. See generally, Dan Solove, “A Taxonomy of Privacy” University of Pennsylvania Law Review (Vol. 154, No. 3, January 2006).

[2]. Wainwright v. Home Office [2003] UKHL 53.

[3]. See A v. B plc [2003] QB 195; Wainwright v. Home Office [2001] EWCA Civ 2081; R (Ellis) v. Chief Constable of Essex Police [2003] EWHC 1321 (Admin).

[4]. Kharak Singh v. State of Uttar Pradesh AIR 1963 SC 1295.

[5]. Gobind v. State of Madhya Pradesh AIR 1975 SC 1378.

[6]. R. Rajagopal v. State of Tamil Nadu AIR 1995 SC 264.

[7]. People’s Union for Civil Liberties v. Union of India (1997) 1 SCC 30.

[8]. A Division Bench of the Supreme Court of India comprising Kuldip Singh and Saghir Ahmad, JJ, found that the procedure set out in section 5(2) of the Indian Telegraph Act, 1885 and rule 419 of the Indian Telegraph Rules, 1951 did not meet the “just, fair and reasonable” test laid down in Maneka Gandhi v. Union of India AIR 1978 SC 597 requisite for the deprivation of the right to personal liberty, from whence the Division Bench found a right to privacy emanated, guaranteed under Article 21 of the Constitution of India. Therefore, Kuldip Singh, J, imposed nine additional procedural safeguards that are listed in paragraph 35 of the judgment.

[9]. Naz Foundation v. Government of NCT Delhi (2009) 160 DLT 277.

[10]. The 2010 data adequacy assessment of Indian data protection laws was conducted by Professor Graham Greenleaf. His account of the process and his summary of Indian law can be found in Graham Greenleaf, "Promises and Illusions of Data Protection in Indian Law" International Data Privacy Law (47-69, Vol. 1, No. 1, March 2011).

[11]. The Rules were brought into effect vide Notification GSR 313(E) on 11 April 2011. CIS submitted comments on the Rules that can be found here – http://cis-india.org/internet-governance/blog/comments-on-the-it-reasonable-security-practices-and-procedures-and-sensitive-personal-data-or-information-rules-2011.

[12]. The Committee on Subordinate Legislation, a parliamentary ‘watchdog’ committee, is mandated by rules 317-322 of the Rules of Procedure and Conduct of Business in the Lok Sabha (14th edn., New Delhi: Lok Sabha Secretariat, 2010) to examine the validity of subordinate legislation.

[13]. See the 31st Report of the Committee on Subordinate Legislation that was presented on 21 March 2013.

[14]. See paragraphs 7.14-7.17 on pages 69-72 of the Report of the Group of Experts on Privacy, 16 October 2012, Planning Commission, Government of India.

[15]. See, the Indian Telegraph (Amendment) Rules, 2007, which were brought into effect vide Notification GSR 193(E) of the Department of Telecommunications of the Ministry of Communications and Information Technology, Government of India, dated 1 March 2007.

[16]. See, inter alia, section 14 of the Maharashtra Control of Organised Crime Act, 1999; section 14 of the Andhra Pradesh Control of Organised Crime Act, 2001; and, section 14 of the Karnataka Control of Organised Crime Act, 2000.

[17]. See, the Information Technology (Procedure and Safeguards for Monitoring and Collecting Traffic Data and Information) Rules, 2009 vide GSR 782 (E) dated 27 October 2009; and, Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009 vide GSR 780 (E) dated 27 October 2009.

Blocking of Websites

by Prasad Krishna last modified Sep 24, 2013 09:11 AM

PDF document icon Blocking of websites A1.pdf — PDF document, 2037 kB (2086287 bytes)

Freedom of Speech (Poster)

by Prasad Krishna last modified Sep 24, 2013 09:16 AM

PDF document icon Freedom of speech.pdf — PDF document, 1448 kB (1482956 bytes)

Intermediary Liability Poster

by Prasad Krishna last modified Sep 24, 2013 09:30 AM

PDF document icon intermediary 36x12.pdf — PDF document, 1566 kB (1604607 bytes)

Internet Governance Forum Poster

by Prasad Krishna last modified Sep 24, 2013 09:35 AM

PDF document icon IGF a2.pdf — PDF document, 11476 kB (11752118 bytes)

DNA Poster 1

by Prasad Krishna last modified Sep 24, 2013 10:12 AM

PDF document icon DNA 1.pdf — PDF document, 205 kB (210890 bytes)

DNA Poster 2

by Prasad Krishna last modified Sep 24, 2013 10:14 AM

PDF document icon DNA2.pdf — PDF document, 200 kB (205486 bytes)

UID Poster 1

by Prasad Krishna last modified Sep 24, 2013 10:15 AM

PDF document icon UID 1.pdf — PDF document, 187 kB (191529 bytes)

UID Poster 2

by Prasad Krishna last modified Sep 24, 2013 10:17 AM

PDF document icon UID 2.pdf — PDF document, 235 kB (241347 bytes)

Privacy Events Poster

by Prasad Krishna last modified Sep 24, 2013 10:19 AM

PDF document icon Privacy Events.pdf — PDF document, 37 kB (38448 bytes)

CIS and International Coalition Calls upon Governments to Protect Privacy

by Elonnai Hickok last modified Sep 25, 2013 07:21 AM
The Centre for Internet and Society (CIS) along with the International Coalition has called upon governments across the globe to protect privacy.

On September 20 in Geneva, CIS joined a huge international coalition in calling upon countries across the globe, including India, to assess whether national surveillance laws and activities are in line with their international human rights obligations.

The Centre for Internet and Society has endorsed a set of international principles against unchecked surveillance. The 13 Principles set out for the first time an evaluative framework for assessing surveillance practices in the context of international human rights obligations.

A group of civil society organizations officially presented the 13 Principles this past Friday in Geneva at a side event attended by Navi Pillay, the United Nations High Commissioner for Human Rights and the United Nations Special Rapporteur on Freedom of Expression and Opinion, Frank LaRue, during the 24th session of the Human Rights Council. The side event was hosted by the Permanent Missions of Austria, Germany, Liechtenstein, Norway, Switzerland and Hungary.

Elonnai Hickok, Programme Manager at the Centre for Internet and Society has noted that "the 13 Principles are an important first step towards informing governments, corporates, and individuals across jurisdictions, including India, about needed safeguards for surveillance practices and related policies to ensure that they are necessary and proportionate."

Navi Pillay, the United Nations High Commissioner for Human Rights, speaking at the Human Rights Council stated in her opening statement on September 9:

"Laws and policies must be adopted to address the potential for dramatic intrusion on individuals’ privacy which have been made possible by modern communications technology."

Navi Pillay, the United Nations High Commissioner for Human Rights, speaking at the event, said that:

"technological advancements have been powerful tools for democracy by giving access to all to participate in society, but increasing use of data mining by intelligence agencies blurs lines between legitimate surveillance and arbitrary mass surveillance."

Frank La Rue, the United Nations Special Rapporteur on Freedom of Expression and Opinion, made clear the case for a direct relationship between state surveillance, privacy and freedom of expression in his latest report to the Human Rights Council:

"The right to privacy is often understood as an essential requirement for the realization of the right to freedom of expression. Undue interference with individuals’ privacy can both directly and indirectly limit the free development and exchange of ideas. … An infringement upon one right can be both the cause and consequence of an infringement upon the other."

Speaking at the event, the UN Special Rapporteur remarked that:

"previously surveillance was carried out on targeted basis but the Internet has changed the context by providing the possibility for carrying out mass surveillance. This is the danger."

Representatives of the Centre for Internet and Society, Privacy International, the Electronic Frontier Foundation, Access, Human Rights Watch, Reporters Without Borders, the Association for Progressive Communications, and the Center for Democracy and Technology are all taking part in the event.

Find out more about the Principles at https://NecessaryandProportionate.org

Contacts

NGOs currently in Geneva for the 24th Human Rights Council:

Access
Fabiola Carrion: [email protected]

Association for Progressive Communication
Shawna Finnegan: [email protected]

Center for Democracy and Technology
Matthew Shears: [email protected]

Electronic Frontier Foundation
Katitza Rodriguez:  [email protected] - @txitua

Human Rights Watch
Cynthia Wong: [email protected]

Privacy International
Carly Nyst: [email protected]

Reporters Without Borders
Lucie Morillon: [email protected]
Hélène Sackstein: [email protected]

Signatories

Argentina
Ramiro Alvarez: [email protected]
Asociación por los Derechos Civiles

Argentina
Beatriz Busaniche: [email protected]
Fundación Via Libre

Colombia
Carolina Botero: [email protected]
Fundación Karisma

Egypt
Ahmed Ezzat: [email protected]
Afteegypt

Honduras
Hedme Sierra-Castro: [email protected]
ACI-Participa

India
Elonnai Hickok: [email protected]
Center for Internet and Society

Korea
Prof. Park:  [email protected]
Open Net Korea

Macedonia
Bardhyl Jashari: [email protected]
Metamorphosis Foundation for Internet and Society

Mauritania, Senegal, Tanzania
Abadacar Diop: [email protected]
Jonction

Portugal
Andreia Martins: [email protected]
ASSOCIAÇÃO COOLPOLITICS

Peru
Miguel Morachimo: [email protected]
Hiperderecho

Russia
Andrei Soldatov: [email protected]
Agentura.ru

Serbia
Djordje Krivokapic: [email protected]
SHARE Foundation

Western Balkans
Valentina Pellizer: [email protected]
Oneworldsee

Brasil
Marcelo Saldanha: [email protected]
IBEBrasil

Bangalore + Social Good

by Prasad Krishna last modified Sep 25, 2013 07:41 AM

PDF document icon The Social Good Summit.pdf — PDF document, 181 kB (185484 bytes)

The National Cyber Security Policy: Not a Real Policy

by Bhairav Acharya last modified Sep 25, 2013 09:49 AM
Cyber security in India is still a nascent field without an organised law and policy framework. Several actors participate in and are affected by India's still inchoate cyber security regime. The National Cyber Security Policy (NCSP) presented the government and other stakeholders with an opportune moment to understand existing legal limitations before devising a future framework. Unfortunately, the NCSP's poor drafting and meaningless provisions do not advance the field.

This article was published in the Observer Research Foundation's Cyber Security Monitor Vol. I, Issue.1, August 2013.


For some time now, law and policy observers in India have been noticing a definite decline in the quality of national policies emanating from the Central Government. Unlike legislation, which is notionally subject to debate in the Parliament of India, policies face no public evaluation before they are brought in to force. Since, unlike legislation, policies are neither binding nor enforceable, there has been no principled ground for demanding public deliberation of significant national policies. While Parliament’s falling standard of competence has been almost unanimously condemned, there has been nearly no criticism of the corresponding failure of the Centre to invigilate the quality of the official policies of its ministries. Luckily for the drafters of the National Cyber Security Policy (NCSP), the rest of the country has also mostly failed to notice its poor content.

The NCSP was notified into effect on 2 July 2013 by the Department of Electronics and Information Technology – which calls itself DeitY – of the Ministry of Communications and Information Technology. As far as legislation and legal drafting go, DeitY has a dubious record. In March 2013, in a parliamentary appraisal of subordinate law framed by DeitY, a Lok Sabha committee found ambiguity, invasions of privacy and potentially illegal clauses. Apprehensions about statutory law administered by DeitY have also found their way to the Supreme Court of India, where a constitutional challenge to certain provisions of the Information Technology Act, 2000 (IT Act) continues. On more than one occasion, owing to poor drafting, DeitY has been forced to issue advisories and press releases to clarify the meaning of its laws. Ironically, the legal validity of these clarifications is also questionable.

A national policy must set out, in real and quantifiable terms, the objectives of the government in a particular field within a specified time frame. To do that, the policy must provide the social, economic, political and legal context prevalent at the time of its issue as well as a normative statement of the factual conditions it seeks to achieve at the time of its expiry. Between these two points in time, the policy must identify and explain all the particular social, economic, political and legal measures it intends to implement to secure its success. Albeit concerned solely with economic growth, the Five-Year Plans – the Second and Tenth Plans in particular – are, without prejudice to their success or failure, examples of well-drafted policies. Against this background, the NCSP should be judged on the basis of how it addresses, in no particular order, national security, democratic freedoms, economic growth and knowledge development. Let us restrict ourselves to the first two issues.

There are broadly two intersections between national security and information technology; these are: (i) the security of networked communications used by the armed forces and intelligence services, and (ii) the storage of civil information of national importance. While the NCSP makes no mention of it, the adoption of the doctrine of network-centric warfare by the three armed forces is underway. Understanding the doctrine is simple – the intensive use of information technology to create networks of information aids situational awareness and enables collaboration, bestowing an advantage in combat. However, the doctrine is vulnerable to asymmetric attack using both primitive and highly sophisticated means. Pre-empting such attacks should be a primary policy concern; not so, apparently, for the NCSP which is completely silent on this issue. The NCSP is slightly more forthcoming on the protection of critical information infrastructure of a civil nature. Critical information infrastructure, such as the national power grid or the Aadhar database, is narrowly defined in section 70 of the IT Act where it is used to describe a protected system. Other provisions of the IT Act also deal with the protection of critical information infrastructure. The NCSP does not explain how these statutory provisions have worked or failed, as the case may be, so as to necessitate further mention in a policy document. For instance, section 70A of the IT Act, inserted in 2008, enables the creation of a national nodal agency to undertake research and development and other activities in respect of critical information infrastructure. Despite this, five years later, the NCSP makes a similar recommendation to operate a National Critical Information Infrastructure Protection Centre to undertake the same activities. In the absence of any meaningful explanation of intended policy measures, there is no reason to expect that the NCSP will succeed where an Act of Parliament has failed.

But, putting aside the shortcomings of its piece-meal provisions, the NCSP also fails to address high-level conceptual policy concerns. As information repositories and governance services delivered through information technology become increasingly integrated and centralised, the security of the information that is stored or distributed decreases. Whether by intent or error, if these consolidated repositories of information are compromised, the quantity of information susceptible to damage is greater, leading to higher insecurity. Simply put, if power transmission is centrally controlled instead of zonally, a single attack could black out the entire country instead of only a part of it. Or if the personal data of citizens is centrally stored, a single leak could compromise the privacy of millions of people instead of only hundreds. Therefore, a credible policy must, before it advocates greater centralisation of information, examine the merits of diffused information storage to protect national security. The NCSP utterly fails in this regard.
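The arithmetic behind this point can be illustrated with a toy model; this is a hypothetical sketch, not drawn from the NCSP or from any real system, and all figures are invented. If a dataset is split evenly across independent stores, a single successful breach exposes only one store's share of the records rather than all of them.

    # Hypothetical illustration of the centralisation argument: one breach of a
    # central store exposes every record, while one breach of a zonal store
    # exposes only that zone's share. All figures are invented for illustration.

    TOTAL_RECORDS = 1_200_000_000   # illustrative, roughly population-scale

    def exposure_per_breach(total_records: int, stores: int) -> int:
        """Records exposed if exactly one store is fully compromised, assuming
        records are split evenly and stores are compromised independently."""
        return total_records // stores

    for stores in (1, 5, 50):
        exposed = exposure_per_breach(TOTAL_RECORDS, stores)
        print(f"{stores:>3} store(s): a single breach exposes {exposed:,} records")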

Concerns short of national security, such as the maintenance of law and order, are also in issue because crime is often planned and perpetrated using information technology. The prevention of crime before it is committed and its prosecution afterwards are key policy concerns. While the specific context may vary depending on the nature of the crime – the facts of terrorism are different from those of insurance fraud – the principles of constitutional and criminal law continue to apply. However, the NCSP neither examines the present framework of cybersecurity-related offences nor suggests any changes in existing law. It merely calls for a “dynamic legal framework and its periodic review to address the cyber security challenges” (sic). This is self-evident; there was no need for a new national policy to make this discovery and, ironically, the NCSP fails to conduct the very periodic review that it envisages. This is worrying because the NCSP presented DeitY with an opportunity to review existing laws and learn from past mistakes. There are concerns that cybersecurity laws, especially relevant provisions of the IT Act and its rules, betray a lack of understanding of India’s constitutional scheme. This is exemplified by the insertion, in 2008, of section 66A into the IT Act, which criminalises the sending of annoying, offensive and inconvenient electronic messages without regard for the fact that free speech that is annoying is constitutionally protected.

In India, cybersecurity law and policy attempts to compensate for the state’s inability to regulate the internet by overreaching into and encroaching upon democratic freedoms. The Central Monitoring System (CMS) that is being assembled by the Centre is a case in point. Alarmed at its inability to be privy to private communications, the Centre proposes to build systems to intercept, in real time, all voice and data traffic in India. Whereas liberal democracies around the world require such interceptions to be judicially sanctioned, warranted and supported by probable cause, India does not even have statutory law to regulate such an enterprise. Given that, once completed, the CMS will represent the largest domestic interception effort in the world, the failure of the NCSP to examine the effect of such an exercise on daily cybersecurity is bewildering. This is made worse by the fact that the state does not possess the technological competence to build such a system by itself and is currently inviting tenders from private companies for equipment. The state’s incompetence is best portrayed by the activities of the Indian Computer Emergency Response Team (CERT-In), which was constituted under section 70B of the IT Act to respond to “cyber incidents”. CERT-In has repeatedly engaged in extra-judicial censorship and has ham-handedly responded to allegedly objectionable blogs or websites by blocking access to entire domains. Unfortunately, the NCSP, while reiterating the operations of CERT-In, attempts no evaluation of its activities, precluding the scope for any meaningful policy measures.

The NCSP’s poor drafting, meaningless provisions, deficiency of analysis and lack of stated measures render it hollow. Its notification into force adds little to the public or intellectual debate about cybersecurity and does nothing to further the trajectory of either national security or democratic freedoms in India. In fairness, this problem afflicts many other national policies. There is a need to return to the high intellectual and practical standards set by most national policies issued in the years following Independence.

India: Privacy in Peril

by Bhairav Acharya last modified Sep 25, 2013 09:56 AM
The danger of mass surveillance in India is for real. The absence of a regulating law is damning for Indians who want to protect their privacy against the juggernaut of state and private surveillance.
India: Privacy in Peril

The police browsing case details using the Crime and Criminal Tracking Network and Systems Technology in Hyderabad. Photo: K. RAMESH BABU


The article was originally published in the Frontline on July 12, 2013.


At the concluding scene of his latest movie, Superman disdainfully flings a surveillance drone down to earth in front of a horrified general. “You can’t control me,” he tells his military minder. “You can’t find out where I hang up my cape.” This exchange goes to the crux of surveillance: control. Surveillance is the means by which nation-states exercise control over people. If the logical basis of the nation-state is the establishment and maintenance of homogeneity, it is necessary to detect and interdict dissent before it threatens the boundedness and continuity of the national imagination. This imagination often cannot encompass diversity, so it constructs categories of others that include dissenters and outsiders. Admittedly, this happens less in India because the foundation of the Indian nation-state imagined a diverse society expressing a plurality of ideas in a variety of languages secured by a syncretic and democratic government that protected individual freedoms. Unfortunately, this vision is still to be realised, and the foundational idea of India continues to be challenged by poor governance, poverty, insurgencies and rebellion. Consequently, surveillance is, for the modern nation-state, a condicio sine qua non—an essential element without which it will eventually cease to exist. The challenge for democratic nation-states is to find the optimal balance between surveillance and the duty to protect the freedoms of its citizens.

History of wiretaps

Some countries, such as the United States, have assembled a vast apparatus of surveillance to monitor the activities of their citizens and foreigners. Let us review the recent controversy revealed by the whistle-blower Edward Snowden. In 1967, the U.S. Supreme Court ruled in Katz vs United States that wiretaps had to be warranted, judicially sanctioned and supported by probable cause. This resulted in the passage of the Wiretap Act of 1968 that regulated domestic surveillance. Following revelations that Washington was engaging in unrestricted foreign surveillance in the context of the Vietnam war and anti-war protests, the U.S. Congress enacted the Foreign Intelligence Surveillance Act (FISA) in 1978. FISA gave the U.S. government the power to conduct, without judicial sanction, surveillance for foreign intelligence information; and, with judicial sanction from a secret FISA court, surveillance of anybody if the ultimate target was a foreign power. Paradoxically, even a U.S. citizen could be a foreign power in certain circumstances. Domestically, FISA enabled secret warrants for specific items of information such as library book borrowers and car rentals.

Following the 9/11 World Trade Centre attacks, Congress enacted the Patriot Act of 2001, Section 215 of which dramatically expanded the scope of FISA to allow secret warrants to conduct surveillance in respect of “any tangible thing” that was relevant to a national security investigation. In exercise of this power, a secret FISA court issued secret warrants ordering a number of U.S. companies to share, in real time, voice and data traffic with the National Security Agency (NSA). We may never know the full scope of the NSA’s surveillance, but we know this: (a) Verizon Communications, a telecommunications major, was ordered to provide metadata for all telephone calls within and without the U.S.; (b) the NSA runs a clandestine programme called PRISM that accesses Internet traffic, such as e-mails, web searches, forum comments and blogs, in real time; and (c) the NSA manages a comprehensive data analysis system called Boundless Informant that intercepts and analyses voice and data traffic around the world and subjects them to automated pattern recognition. The documents leaked by Snowden allege that Google, Facebook, Apple, Dropbox, Microsoft and Yahoo! participate in PRISM, but these companies have denied their involvement.

India fifth-most monitored

How does this affect India? The Snowden documents reveal that India is the NSA’s fifth-most monitored country after Iran, Pakistan, Jordan and Egypt. Interestingly, China is monitored less than India. Several billion pieces of data from India, such as e-mails and telephone metadata, were intercepted and monitored by the NSA. For Indians, it is not inconceivable that our e-mails, should they be sent using Gmail, Yahoo! Mail or Hotmail, or our documents, should we be subscribing to Dropbox, or our Facebook posts, are being accessed and read by the NSA. Incredibly, most Indian governmental communication, including that of Ministers and senior civil servants, uses private U.S. e-mail services. We no longer enjoy privacy online. The question of suspicious activity, irrespective of the rubric under which suspicion is measured, is moot. Any use of U.S. service providers is potentially compromised since U.S. law permits intrusive dragnet surveillance against foreigners. This clearly reveals a dichotomy in U.S. constitutional law: the Fourth Amendment’s guarantees of privacy, repeatedly upheld by U.S. courts, protect U.S. citizens to a far greater extent than they do foreigners. It is natural for a nation-state to privilege the rights of its citizens over others. As Indians, therefore, we must clearly look out for ourselves.

Privacy and personal liberty

Unfortunately, India does not have a persuasive jurisprudence of privacy protection. In the Kharak Singh (1964) and Gobind (1975) cases, the Supreme Court of India considered the question of privacy from physical surveillance by the police in and around homes of suspects. In the latter case, the court found that some of the Fundamental Rights “could be described as contributing to the right to privacy”, which was subject to a compelling public interest. This insipid inference held the field until 1994 when, in the Rajagopal (“Auto Shankar”, 1994) case, the Supreme Court, for the first time, directly located privacy within the ambit of the right to personal liberty recognised by Article 21 of the Constitution. However, Rajagopal dealt specifically with the publication of an autobiography; it did not consider the privacy of communications. In 1997, the Supreme Court considered the question of wiretaps in the People’s Union for Civil Liberties (PUCL) case. While finding that wiretaps invaded the privacy of communications, it continued to permit them subject to some procedural safeguards which continue to be routinely ignored. A more robust statement of the right to privacy was made by the Delhi High Court in the Naz Foundation case (2011) that decriminalised consensual homosexual acts; however, there is an appeal against the judgment in the Supreme Court.

Legislative silence

Judicial vagueness has been compounded by legislative silence. India does not have a law to operationalise a right to privacy. Consequently, a multitude of laws permit daily infractions of privacy. These infractions have survived because they are diverse, dissipated and quite disorganised. However, the technocratic impulse to centralise and consolidate surveillance and data collection has, in recent years, alarmed many citizens. The state hopes, through enterprises such as the Central Monitoring System (CMS), the Crime and Criminals Tracking Network and System (CCTNS), the National Intelligence Grid (NATGRID), the Telephone Call Interception System (TCIS) and the Unique Identification Number (UID), to replicate the U.S. successes in surveillance and to monitor and profile all its citizens. However, unlike the U.S., India proposes to achieve this without an enabling law. Let us consider the CMS. No documents have been made available that indicate the scope and size of the CMS.

From a variety of police tenders for private equipment, it appears that the Central government hopes to put in place a system that will intercept, in real time, all voice and data traffic originating or terminating in India or being carried by Indian service providers. This data will be subject to pattern recognition and other automated tests to detect emotional markers, such as hate, compassion or intent. The sheer scale of this enterprise is intimidating; all communications in India’s many languages will be subject to interception and testing designed to detect different forms of dissent. This mammoth exercise in monitoring is taking place—it is understood that some components of the CMS are already operational—without statutory sanction. No credible authorities exist to supervise this exercise, no avenues for redress have been identified and no consequences have been laid down for abuse.

Statutory Surveillance

In a recent interview, Milind Deora, Minister of State for Communications and Information Technology, dismissed public scepticism of the CMS, saying that direct state access to private communications was better for privacy since it reduced dependence on the interception abilities of private service providers. This circular argument is both disingenuous and incorrect. No doubt, trusting private persons with the power to intercept and store the private data of citizens is flawed. The leaking of the Niira Radia tapes, which contain the private communications of Niira Radia taped on the orders of the Income Tax Department, testifies to this flaw. However, bypassing private players to enable direct state access to private communications will preclude leaks and, thereby, remove from public knowledge the fact of surveillance. This messy situation may be obviated by a regime of statutory regulation of warranted surveillance by an independent and impartial authority. This system is favoured by liberal democracies around the world but conspicuously resisted by the Indian government.

The question of privacy legislation was recently considered by a committee chaired by Justice Ajit Prakash Shah, a former judge of the Delhi High Court who sat on the Bench that delivered the Naz Foundation judgment. The Shah Committee was constituted by the Planning Commission for a different reason: the need to protect personal data that are outsourced to India for processing. The lack of credible privacy law, it is foreseen, will result in European and other foreign personal data being sent to other attractive processing destinations, such as Vietnam, Israel or the Philippines, resulting in the decline of India’s outsourcing industry. However, the Shah Committee also noted the absence of law sufficient to protect against surveillance abuses. Most importantly, the Shah Committee formulated nine national privacy principles to inform any future privacy legislation. In 2011, the Department of Personnel and Training (DoPT) of the Ministry of Personnel, Public Grievances and Pensions, the same Ministry entrusted with implementing the Right to Information Act, 2005, leaked a draft privacy Bill, marked ‘Secret’, on the Internet. The DoPT Bill received substantive criticism from the Attorney General and some government Secretaries for its clumsy drafting. A new version of the DoPT Bill is reported to have been drafted and sent to the Ministry of Law for consideration. This revised Bill, which presumably contains chapters to regulate surveillance, including the interception of communications, has not been made public.

The need for privacy legislation cannot be overstated. The Snowden affair reveals the extent of possible state surveillance of private communications. For Indians who must now explore ways to protect their privacy against the juggernaut of state and private surveillance, the absence of regulatory law is damning. Permitting, through public inaction, unwarranted and non-targeted dragnet surveillance by the Indian state without reasonable cause would be an act of surrender with far-reaching implications.

Information, they say, is power. Allowing governments to exercise this power over us without thought for the rule of law constitutes the ultimate submission possible in a democratic nation-state. And, since superheroes are escapist fantasies, without the prospect of good laws we will all be subordinate to a new national imagination of control and monitoring, surveillance and profiling. If allowed to come to pass, this will be a betrayal of the foundational idea of India as a free and democratic republic tolerant of dissent.


Bhairav Acharya is a constitutional lawyer practising in the Supreme Court of India. He advises the Centre for Internet & Society, Bangalore, on privacy law and other constitutional issues.

The Central Monitoring System: Some Questions to be Raised in Parliament

by Bhairav Acharya last modified Sep 25, 2013 10:30 AM
The following are some model questions to be raised in Parliament regarding the lack of transparency in the Central Monitoring System.

Preliminary

  • The Central Monitoring System (CMS) is a Central Government project to intercept communications, both voice and data, that are transmitted via telephones and the internet to, from and within India. Owing to the vast nature of this enterprise, the CMS cannot be succinctly described and the many issues surrounding this project are diverse. This Issue Brief will outline preliminary constitutional, legal and technical concerns that are presented by the CMS.
  • At the outset, it must be clearly understood that no public documentation exists to explain the scope, functions and technical architecture of the CMS. This lack of transparency is the single largest obstacle to understanding the Central Government’s motives in conceptualising and operationalising the CMS. This lack of public documentation is also the chief reason for the brevity of this Issue Brief. Without making public the policy, law and technical abilities of the CMS, there cannot be an informed national debate on the primary concerns posed by the CMS, i.e., the extent of envisaged state surveillance of Indian citizens and the safeguards, if any, to protect the individual right to privacy.

Surveillance and Privacy

  • Surveillance is necessary to secure political organisation. Modern nation-states, which are theoretically organised on the basis of shared national and societal characteristics, require surveillance to detect threats to these characteristics. In democratic societies, beyond the immediate requirements of national integrity and security, surveillance must be targeted at securing the safety and rights of individual citizens. This Issue Brief does not dispute the fact that democratic countries, such as India, should conduct surveillance to secure legitimate ends. Concerns, however, arise when surveillance is conducted in a manner unrestricted and unregulated by law; these concerns are compounded when a lack of law is accompanied by a lack of transparency.
  • Technological advancement leads to more intrusive surveillance. The evolution of surveillance in the United States resulted, in 1967, in the first judicial recognition of the right to privacy. In Katz v. United States, the US Supreme Court ruled that the privacy of communications had to be balanced with the need to conduct surveillance; and, therefore, wiretaps had to be warranted, judicially sanctioned and supported by probable cause. Katz expanded the scope of the Fourth Amendment of the US Constitution, which protects against unreasonable searches and seizures. Most subsequent US legal developments relating to the privacy of communications from surveillance originate in the Katz judgement. Other common law countries, such as the United Kingdom and Canada, have experienced similar judicial evolution to recognise that the right to privacy must be balanced with governance.


Right to Privacy in India

  • Unfortunately, India does not have a persuasive jurisprudence of privacy protection. In the Kharak Singh (1964) and Gobind (1975) cases, the Supreme Court of India considered the question of privacy from physical surveillance by the police in and around the homes of suspects. In the latter case, the Supreme Court found that some of the Fundamental Rights “could be described as contributing to the right to privacy”, which was nevertheless subject to a compelling public interest. This insipid inference held the field until 1994 when, in the Rajagopal (“Auto Shankar”) case, the Supreme Court, for the first time, directly located privacy within the ambit of the right to personal liberty recognised by Article 21 of the Constitution. However, Rajagopal dealt specifically with the publication of an autobiography; it did not consider the privacy of communications. In 1997, the Supreme Court considered the question of wiretaps in the PUCL case. While finding that wiretaps invaded the privacy of communications, it continued to permit them subject to some procedural safeguards which continue to be routinely ignored. A more robust statement of the right to privacy was made by the Delhi High Court in the Naz Foundation case (2009), which de-criminalised consensual homosexual acts; however, that judgment has been appealed to the Supreme Court.

Issues Pertaining to the CMS

  • While judicial protection from physical surveillance was cursorily dealt with in the Kharak Singh and Gobind cases, the Supreme Court of India directly considered the issue of wiretaps in the PUCL case. Wiretaps in India primarily occur on the strength of powers granted to certain authorities under section 5(2) of the Indian Telegraph Act, 1885. The Court found that the Telegraph Act, and Rules made thereunder, did not prescribe adequate procedural safeguards to create a “just and fair” mechanism to conduct wiretaps. Therefore, it laid down the following procedure to conduct wiretaps:

(a) the interception order should be issued by the relevant Home Secretary (this power is delegable to a Joint Secretary);
(b) the interception must be carried out exactly in terms of the order and not in excess of it;
(c) the issuing authority must determine whether the information could reasonably be secured by other means; and
(d) the interception shall cease after sixty (60) days.

  • Therefore, prima facie, any voice interception conducted through the CMS will be in violation of this Supreme Court judgement. The CMS will enforce blanket surveillance upon the entire country without regard for reasonable cause or necessity. This movement away from targeted surveillance to blanket surveillance without cause, conducted without statutory sanction and without transparency, is worrying.
  • Accordingly, the following questions may be raised, in Parliament, to learn more about the CMS project:
  1. Which statutes, Government Orders, notifications, etc. deal with the establishment and maintenance of the CMS?
  2. Which is the nodal agency in charge of implementing the CMS?
  3. What are the powers and functions of the nodal agency?
  4. What guarantees exist to protect ordinary Indian citizens from intrusive surveillance without cause?
  5. What are the technical parameters of the CMS?
  6. What are the consequences for misuse or abuse of powers by any person working in the CMS project?
  7. What recourse is available to Indian citizens against whom there is unnecessary surveillance or against whom there has been a misuse or abuse of power?

CYFY 2013 Event Brochure

by Prasad Krishna last modified Sep 26, 2013 06:49 AM

PDF document icon cyfy flyer prog.pdf — PDF document, 854 kB (875361 bytes)

Privacy Timeline

by Prasad Krishna last modified Sep 26, 2013 10:08 AM

PDF document icon Timeline.pdf — PDF document, 42 kB (43167 bytes)

Privacy Roundtable Delhi (October)

by Prasad Krishna last modified Sep 27, 2013 12:52 PM

PDF document icon Invite-Delhi.pdf — PDF document, 2395 kB (2453051 bytes)

Privacy Protection Bill (September 2013)

by Prasad Krishna last modified Sep 27, 2013 02:03 PM

PDF document icon Privacy (Protection) Bill - 20 Sep 2013.pdf — PDF document, 199 kB (204657 bytes)

Privacy (Protection) Bill, 2013: Updated Third Draft

by Bhairav Acharya last modified Oct 01, 2013 12:25 PM
The Centre for Internet and Society has been researching privacy in India since 2010 with the objective of raising public awareness around privacy, completing in-depth research, and driving privacy legislation in India. As part of this work, we drafted the Privacy (Protection) Bill, 2013.

This research is being undertaken as part of the 'SAFEGUARDS' project that CIS is undertaking with Privacy International and IDRC. The following is the latest version, with changes based on the Round Table held on August 24:


[Preamble]

CHAPTER I

Preliminary

1. Short title, extent and commencement. – (1) This Act may be called the Privacy (Protection) Act, 2013.

(2) It extends to the whole of India.

(3) It shall come into force on such date as the Central Government may, by notification in the Official Gazette, appoint.

2. Definitions. – In this Act and in any rules made thereunder, unless the context otherwise requires, –

(a) “anonymise” means, in relation to personal data, the removal of all data that may, whether directly or indirectly in conjunction with any other data, be used to identify the data subject;

(b) “appropriate government” means, in relation to the Central Government or a Union Territory Administration, the Central Government; in relation to a State Government, that State Government; and, in relation to a public authority which is established, constituted, owned, controlled or substantially financed by funds provided directly or indirectly –

(i) by the Central Government or a Union Territory Administration, the Central Government;

(ii) by a State Government, that State Government;

(c) “authorised officer” means an officer, not below the rank of a Gazetted Officer, of an All India Service or a Central Civil Service, as the case may be, who is empowered by the Central Government, by notification in the Official Gazette, to intercept a communication of another person or carry out surveillance of another person under this Act;

(d) “biometric data” means any data relating to the physical, physiological or behavioural characteristics of a person which allow their unique identification including, but not restricted to, facial images, finger prints, hand prints, foot prints, iris recognition, hand writing, typing dynamics, gait analysis and speech recognition;

(e) “Chairperson” and “Member” mean the Chairperson and Member appointed under sub-section (1) of section 17;

(f) “collect”, with its grammatical variations and cognate expressions, means, in relation to personal data, any action or activity that results in a data controller obtaining, or coming into the possession or control of, any personal data of a data subject;

(g) “communication” means a word or words, spoken, written or indicated, in any form, manner or language, encrypted or unencrypted, meaningful or otherwise, and includes visual representations of words, ideas, symbols and images, whether transmitted or not transmitted and, if transmitted, irrespective of the medium of transmission;

(h) “competent organisation” means an organisation or public authority listed in the Schedule;

(i) “data controller” means a person who, either alone or jointly or in concert with other persons, determines the purposes for which and the manner in which any personal data is processed;

(j) “data processor” means any person who processes any personal data on behalf of a data controller;

(k) “Data Protection Authority” means the Data Protection Authority constituted under sub-section (1) of section 17;

(l) “data subject” means a person who is the subject of personal data;

(m) “deoxyribonucleic acid data” means all data, of whatever type, concerning the characteristics of a person that are inherited or acquired during early prenatal development;

(n) “destroy”, with its grammatical variations and cognate expressions, means, in relation to personal data, to cease the existence of, by deletion, erasure or otherwise, any personal data;

(o) “disclose”, with its grammatical variations and cognate expressions, means, in relation to personal data, any action or activity that results in a person who is not the data subject coming into the possession or control of that personal data;

(p) “intelligence organisation” means an intelligence organisation under the Intelligence Organisations (Restriction of Rights) Act, 1985 (58 of 1985);

(q) “interception” or “intercept” means any activity intended to capture, read, listen to or understand the communication of a person;

(r) “personal data” means any data which relates to a natural person if that person can, whether directly or indirectly in conjunction with any other data, be identified from it and includes sensitive personal data;

(s) “prescribed” means prescribed by rules made under this Act;

(t) “process”, with its grammatical variations and cognate expressions, means, in relation to personal data, any action or operation which is performed upon personal data, whether or not by automated means including, but not restricted to, organisation, structuring, adaptation, modification, retrieval, consultation, use, alignment or destruction;

(u) “receive”, with its grammatical variations and cognate expressions, means, in relation to personal data, to come into the possession or control of any personal data;

(v) “sensitive personal data” means personal data as to the data subject’s –

(i) biometric data;

(ii) deoxyribonucleic acid data;

(iii) sexual preferences and practices;

(iv) medical history and health;

(v) political affiliation;

(vi) commission, or alleged commission, of any offence;

(vii) ethnicity, religion, race or caste; and

(viii) financial and credit information;

(w) “store”, with its grammatical variations and cognate expressions, means, in relation to personal data, to retain, in any form or manner and for any purpose or reason, any personal data;

(x) “surveillance” means any activity intended to watch, monitor, record or collect, or to enhance the ability to watch, record or collect, any images, signals, data, movement, behaviour or actions, of a person, a group of persons, a place or an object, for the purpose of obtaining information of a person;

and all other expressions used herein shall have the meanings ascribed to them under the General Clauses Act, 1897 (10 of 1897) or the Code of Criminal Procedure, 1973 (2 of 1974), as the case may be.

CHAPTER II

Regulation of Personal Data

3. Regulation of personal data. – Notwithstanding anything contained in any other law for the time being in force, no person shall collect, store, process, disclose or otherwise handle any personal data of another person except in accordance with the provisions of this Act and any rules made thereunder.

4. Exemption. – Nothing in this Act shall apply to the collection, storage, processing or disclosure of personal data for personal or domestic use.

CHAPTER III

Protection of Personal Data

5. Regulation of collection of personal data. – (1) No personal data of a data subject shall be collected except in conformity with section 6 and section 7.

(2) No personal data of a data subject may be collected under this Act unless it is necessary for the achievement of a purpose of the person seeking its collection.

(3) Subject to section 6 and section 7, no personal data may be collected under this Act prior to the data subject being given notice, in such form and manner as may be prescribed, of the collection.

6. Collection of personal data with prior informed consent. – (1) Subject to sub-section (2), a person seeking to collect personal data under this section shall, prior to its collection, obtain the consent of the data subject.

(2) Prior to a collection of personal data under this section, the person seeking its collection shall inform the data subject of the following details in respect of his personal data, namely: –

(a) when it will be collected;

(b) its content and nature;

(c) the purpose of its collection;

(d) the manner in which it may be accessed, checked and modified;

(e) the security practices, privacy policies and other policies, if any, to which it will be subject;

(f) the conditions and manner of its disclosure; and

(g) the procedure for recourse in case of any grievance in relation to it.

(3) Consent to the collection of personal data under this section may be obtained from the data subject in any manner or medium but shall not be obtained as a result of a threat, duress or coercion:

Provided that the data subject may, at any time after his consent to the collection of personal data has been obtained, withdraw the consent for any reason whatsoever and all personal data collected following the original grant of consent shall be destroyed forthwith:

Provided further that the person who collected the personal data in respect of which consent is subsequently withdrawn may, if the personal data is necessary for the delivery of any good or the provision of any service, decline to deliver that good or to provide that service to the data subject who withdrew his grant of consent.

7. Collection of personal data without prior consent. – Personal data may be collected without the prior consent of the data subject if it is –

(a) necessary for the provision of an emergency medical service to the data subject;

(b) required for the establishment of the identity of the data subject and the collection is authorised by a law in this regard;

(c) necessary to prevent a reasonable threat to national security, defence or public order; or

(d) necessary to prevent, investigate or prosecute a cognisable offence.

8. Regulation of storage of personal data. – (1) No person shall store any personal data for a period longer than is necessary to achieve the purpose for which it was collected or received, or, if that purpose is achieved or ceases to exist for any reason, for any period following such achievement or cessation.

(2) Save as provided in sub-section (3), any personal data collected or received in relation to the achievement of a purpose shall, if that purpose is achieved or ceases to exist for any reason, be destroyed forthwith.

(3) Notwithstanding anything contained in this section, any personal data may be stored for a period longer than is necessary to achieve the purpose for which it was collected or received, or, if that purpose has been achieved or ceases to exist for any reason, for any period following such achievement or cessation, if –

(a) the data subject grants his consent to such storage prior to the purpose for which it was collected or received being achieved or ceasing to exist;

(b) it is adduced for an evidentiary purpose in a legal proceeding; or

(c) it is required to be stored under the provisions of an Act of Parliament:

Provided that only that amount of personal data that is necessary to achieve the purpose of storage under this sub-section shall be stored and any personal data that is not required to be stored for such purpose shall be destroyed forthwith:

Provided further that any personal data stored under this sub-section shall, to the extent possible, be anonymised.

9. Regulation of processing of personal data. – (1) No person shall process any personal data that is not necessary for the achievement of the purpose for which it was collected or received.

(2) Save as provided in sub-section (3), no personal data shall be processed for any purpose other than the purpose for which it was collected or received.

(3) Notwithstanding anything contained in this section, any personal data may be processed for a purpose other than the purpose for which it was collected or received if –

(a) the data subject grants his consent to the processing and only that amount of personal data that is necessary to achieve the other purpose is processed;

(b) it is necessary to perform a contractual duty to the data subject;

(c) it is necessary to prevent a reasonable threat to national security, defence or public order; or

(d) it is necessary to prevent, investigate or prosecute a cognisable offence.

10. Transfer of personal data for processing. – (1) Subject to the provisions of this section, personal data that has been collected in conformity with this Act may be transferred by a data controller to a data processor, whether located in India or otherwise, if the transfer is pursuant to an agreement that explicitly binds the data processor to same or stronger measures in respect of the storage, processing, destruction, disclosure and other handling of the personal data as are contained in this Act.

(2) No data processor shall process any personal data transferred under this section except to achieve the purpose for which it was collected.

(3) A data controller that transfers personal data under this section shall remain liable to the data subject for the actions of the data processor.

11. Security of personal data and duty of confidentiality. – (1) No person shall collect, receive, store, process or otherwise handle any personal data without implementing measures, including, but not restricted to, technological, physical and administrative measures, adequate to secure its confidentiality, secrecy, integrity and safety, including from theft, loss, damage or destruction.

(2) Data controllers and data processors shall be subject to a duty of confidentiality and secrecy in respect of personal data in their possession or control.

(3) Without prejudice to the provisions of this section, a data controller or data processor shall, if the confidentiality, secrecy, integrity or safety of personal data in its possession or control is violated by theft, loss, damage or destruction, or as a result of any disclosure contrary to the provisions of this Act, or for any other reason whatsoever, notify the data subject, in such form and manner as may be prescribed, forthwith.

12. Regulation of disclosure of personal data. – Subject to section 10, section 13 and section 14, no person shall disclose, or otherwise cause any other person to receive, the content or nature of any personal data that has been collected in conformity with this Act.

13. Disclosure of personal data with prior informed consent. – (1) Subject to sub-section (2), a data controller or data processor seeking to disclose personal data under this section shall, prior to its disclosure, obtain the consent of the data subject.

(2) Prior to a disclosure of personal data under this section, the data controller or data processor, as the case may be, seeking to disclose the personal data, shall inform the data subject of the following details in respect of his personal data, namely: –

(a) when it will be disclosed;

(b) the purpose of its disclosure;

(c) the security practices, privacy policies and other policies, if any, that will protect it; and

(d) the procedure for recourse in case of any grievance in relation to it.

14. Disclosure of personal data without prior consent. – (1) Subject to sub-section (2), personal data may be disclosed without the prior consent of the data subject if it is necessary –

(a) to prevent a reasonable threat to national security, defence or public order; or

(b) to prevent, investigate or prosecute a cognisable offence.

(2) No data controller or data processor shall disclose any personal data unless it has received an order in writing from a police officer not below the rank of [___] in such form and manner as may be prescribed:

Provided that an order for the disclosure of personal data made under this sub-section shall not require the disclosure of any personal data that is not necessary to achieve the purpose for which the disclosure is sought:

Provided further that the data subject shall be notified, in such form and manner as may be prescribed, of the disclosure of his personal data, including details of its content and nature, and the identity of the police officer who ordered its disclosure, forthwith.

15. Quality and accuracy of personal data. – (1) Each data controller and data processor shall, to the extent possible, ensure that the personal data in its possession or control, is accurate and, where necessary, is kept up to date.

(2) No data controller or data processor shall deny a data subject whose personal data is in its possession or control the opportunity to review his personal data and, where necessary, rectify anything that is inaccurate or not up to date.

(3) A data subject may, if he finds personal data in the possession or control of a data controller or data processor that is not necessary to achieve the purpose for which it was collected, received or stored, demand its destruction, and the data controller shall destroy, or cause the destruction of, the personal data forthwith.

16. Special provisions for sensitive personal data. – Notwithstanding anything contained in this Act and the provisions of any other law for the time being in force –

(a) no person shall store sensitive personal data for a period longer than is necessary to achieve the purpose for which it was collected or received, or, if that purpose has been achieved or ceases to exist for any reason, for any period following such achievement or cessation;

(b) no person shall process sensitive personal data for a purpose other than the purpose for which it was collected or received;

(c) no person shall disclose to another person, or otherwise cause any other person to come into the possession or control of, the content or nature of any sensitive personal data, including any other details in respect thereof.

CHAPTER IV

The Data Protection Authority

17. Constitution of the Data Protection Authority. – (1) The Central Government shall, by notification, constitute, with effect from such date as may be specified therein, a body to be called the Data Protection Authority consisting of a Chairperson and not more than four other Members, to exercise the jurisdiction and powers and discharge the functions and duties conferred or imposed upon it by or under this Act.

(2) The Chairperson shall be a person who has been a Judge of the Supreme Court:

Provided that the appointment of the Chairperson shall be made only after consultation with the Chief Justice of India.

(3) Each Member shall be a person of ability, integrity and standing who has special knowledge of, and professional experience of not less than ten years in, privacy law and policy.

18. Term of office, conditions of service, etc. of Chairperson and Members. – (1) Before appointing any person as the Chairperson or Member, the Central Government shall satisfy itself that the person does not, and will not, have any such financial or other interest as is likely to affect prejudicially his functions as such Chairperson or Member.

(2) The Chairperson and every Member shall hold office for such period, not exceeding five years, as may be specified in the order of his appointment, but shall be eligible for reappointment:

Provided that no person shall hold office as the Chairperson or Member after he has attained the age of sixty-seven years.

(3) Notwithstanding anything contained in sub-section (2), the Chairperson or any Member may –

(a) by writing under his hand resign his office at any time;

(b) be removed from office in accordance with the provisions of section 19 of this Act.

(4) A vacancy caused by the resignation or removal of the Chairperson or Member under sub-section (3) shall be filled by fresh appointment.

(5) In the event of the occurrence of a vacancy in the office of the Chairperson, such one of the Members as the Central Government may, by notification, authorise in this behalf, shall act as the Chairperson till the date on which a new Chairperson, appointed in accordance with the provisions of this Act, to fill such vacancy, enters upon his office.

(6) When the Chairperson is unable to discharge his functions owing to absence, illness or any other cause, such one of the Members as the Chairperson may authorise in writing in this behalf shall discharge the functions of the Chairperson, till the date on which the Chairperson resumes his duties.

(7) The salaries and allowances payable to and the other terms and conditions of service of the Chairperson and Members shall be such as may be prescribed:

Provided that neither the salary and allowances nor the other terms and conditions of service of the Chairperson and any member shall be varied to his disadvantage after his appointment.

19. Removal of Chairperson and Members from office in certain circumstances. – The Central Government may remove from office the Chairperson or any Member, who –

(a) is adjudged an insolvent; or

(b) engages during his term of office in any paid employment outside the duties of his office; or

(c) is unfit to continue in office by reason of infirmity of mind or body; or

(d) is of unsound mind and stands so declared by a competent court; or

(e) is convicted for an offence which in the opinion of the President involves moral turpitude; or

(f) has acquired such financial or other interest as is likely to affect prejudicially his functions as a Chairperson or Member, or

(g) has so abused his position as to render his continuance in office prejudicial to the public interest.

20. Functions of the Data Protection Authority. – (1) The Chairperson may inquire, suo motu or on a petition presented to him by any person or by someone acting on his behalf, in respect of any matter connected with the collection, storage, processing, disclosure or other handling of any personal data and give such directions or pass such orders as are necessary, for reasons to be recorded in writing.

(2) Without prejudice to the generality of the foregoing provision, the Data Protection Authority shall perform all or any of the following functions, namely –

(a) review the safeguards provided by or under this Act and any other law for the time being in force for the protection of personal data and recommend measures for their effective implementation;

(b) review any measures taken by any entity for the protection of personal data and take such further action as it deems fit;

(c) review any action, policy or procedure of any entity to ensure compliance with this Act and any rules made hereunder;

(d) formulate, in consultation with experts, norms for the effective protection of personal data;

(e) promote awareness and knowledge of personal data protection through any means necessary;

(f) undertake and promote research in the field of protection of personal data;

(g) encourage the efforts of non-governmental organisations and institutions working in the field of personal data protection;

(h) publish periodic reports concerning the incidence of collection, processing, storage, disclosure and other handling of personal data;

(i) such other functions as it may consider necessary for the protection of personal data.

(3) Subject to the provisions of any rules prescribed in this behalf by the Central Government, the Data Protection Authority shall have the power to review any decision, judgement, decree or order made by it.

(4) In the exercise of its functions under this Act, the Data Protection Authority shall give such directions or pass such orders as are necessary for reasons to be recorded in writing.

(5) The Data Protection Authority may, in its own name, sue or be sued.

21. Secretary, officers and other employees of the Data Protection Authority. – (1) The Central Government shall appoint a Secretary to the Data Protection Authority to exercise and perform, under the control of the Chairperson, such powers and duties as may be prescribed or as may be specified by the Chairperson.

(2) The Central Government may provide the Data Protection Authority with such other officers and employees as may be necessary for the efficient performance of the functions of the Data Protection Authority.

(3) The salaries and allowances payable to and the conditions of service of the Secretary and other officers and employees of the Data Protection Authority shall be such as may be prescribed.

22. Salaries, etc. to be defrayed out of the Consolidated Fund of India. – The salaries and allowances payable to the Chairperson and Members and the administrative expenses, including salaries, allowances and pension, payable to or in respect of the officers and other employees of the Data Protection Authority shall be defrayed out of the Consolidated Fund of India.

23. Vacancies, etc. not to invalidate proceedings of the Data Protection Authority. – No act or proceeding of the Data Protection Authority shall be questioned on the ground merely of the existence of any vacancy or defect in the constitution of the Data Protection Authority or any defect in the appointment of a person acting as the Chairperson or Member.

24. Chairperson, Members and employees of the Data Protection Authority to be public servants. – The Chairperson and Members and other employees of the Data Protection Authority shall be deemed to be public servants within the meaning of section 21 of the Indian Penal Code, 1860 (45 of 1860).

25. Location of the office of the Data Protection Authority. – The offices of the Data Protection Authority shall be in [___] or any other location as directed by the Chairperson in consultation with the Central Government.

26. Procedure to be followed by the Data Protection Authority. – (1) Subject to the provisions of this Act, the Data Protection Authority shall have powers to regulate –

(a) the procedure and conduct of its business;

(b) the delegation to one or more Members of such powers or functions as the Chairperson may specify.

(2) In particular and without prejudice to the generality of the foregoing provisions, the powers of the Data Protection Authority shall include the power to determine the extent to which persons interested or claiming to be interested in the subject-matter of any proceeding before it may be allowed to be present or to be heard, either by themselves or by their representatives or to cross-examine witnesses or otherwise take part in the proceedings:

Provided that any such procedure as may be prescribed or followed shall be guided by the principles of natural justice.

27. Power relating to inquiries. – (1) The Data Protection Authority shall, for the purposes of any inquiry or for any other purpose under this Act, have the same powers as vested in a civil court under the Code of Civil Procedure, 1908 (5 of 1908), while trying suits in respect of the following matters, namely –

(a) the summoning and enforcing the attendance of any person from any part of India and examining him on oath;

(b) the discovery and production of any document or other material object producible as evidence;

(c) the reception of evidence on affidavit;

(d) the requisitioning of any public record from any court or office;

(e) the issuing of any commission for the examination of witnesses; and,

(f) any other matter which may be prescribed.

(2) The Data Protection Authority shall have power to require any person, subject to any privilege which may be claimed by that person under any law for the time being in force, to furnish information on such points or matters as, in the opinion of the Data Protection Authority, may be useful for, or relevant to, the subject matter of an inquiry and any person so required shall be deemed to be legally bound to furnish such information within the meaning of section 176 and section 177 of the Indian Penal Code, 1860 (45 of 1860).

(3) The Data Protection Authority or any other officer, not below the rank of a Gazetted Officer, specially authorised in this behalf by the Data Protection Authority may enter any building or place where the Data Protection Authority has reason to believe that any document relating to the subject matter of the inquiry may be found, and may seize any such document or take extracts or copies therefrom subject to the provisions of section 100 of the Code of Criminal Procedure, 1973 (2 of 1974), in so far as it may be applicable.

(4) The Data Protection Authority shall be deemed to be a civil court and when any offence as is described in section 175, section 178, section 179, section 180 or section 228 of the Indian Penal Code, 1860 (45 of 1860) is committed in the view or presence of the Data Protection Authority, the Data Protection Authority may, after recording the facts constituting the offence and the statement of the accused as provided for in the Code of Criminal Procedure, 1973 (2 of 1974), forward the case to a Magistrate having jurisdiction to try the same and the Magistrate to whom any such case is forwarded shall proceed to hear the complaint against the accused as if the case had been forwarded to him under section 346 of the Code of Criminal Procedure, 1973 (2 of 1974).

28. Decisions of the Data Protection Authority. – (1) The decisions of the Data Protection Authority shall be binding.

(2) In its decisions, the Data Protection Authority has the power to –

(a) require an entity to take such steps as may be necessary to secure compliance with the provisions of this Act;

(b) require an entity to compensate any person for any loss or detriment suffered;

(c) impose any of the penalties provided under this Act.

29. Proceedings before the Data Protection Authority to be judicial proceedings. – The Data Protection Authority shall be deemed to be a civil court for the purposes of section 195 and Chapter XXVI of the Code of Criminal Procedure, 1973 (2 of 1974), and every proceeding before the Data Protection Authority shall be deemed to be a judicial proceeding within the meaning of section 193 and section 228 and for the purposes of section 196 of the Indian Penal Code, 1860 (45 of 1860).

CHAPTER V

Regulation by Data Controllers and Data Processors

30. Co-regulation by Data Controllers and the Data Protection Authority. – (1) The Data Protection Authority may, in consultation with data controllers, formulate codes of conduct for the collection, storage, processing, disclosure or other handling of any personal data.

(2) No code of conduct formulated under sub-section (1) shall be binding on a data controller unless –

(a) it has received the written approval of the Data Protection Authority; and

(b) it has received the approval, by signature of a director or authorised signatory, of the data controller.

31. Co-regulation without prejudice to other remedies. – Any code of conduct formulated under this chapter shall be without prejudice to the jurisdiction, powers and functions of the Data Protection Authority.

32. Self-regulation by data controllers. – (1) The Data Protection Authority may encourage data controllers and data processors to formulate professional codes of conduct to establish rules for the collection, storage, processing, disclosure or other handling of any personal data.

(2) No code of conduct formulated under sub-section (1) shall be effective unless it is registered, in such form and manner as may be prescribed, by the Data Protection Authority.

(3) The Data Protection Authority shall, for reasons to be recorded in writing, not register any code of conduct formulated under sub-section (1) that is not adequate to protect personal data.

CHAPTER VI

Surveillance and Interception of Communications

33. Surveillance and interception of communication to be warranted. – Notwithstanding anything contained in any other law for the time being in force, no –

(i) surveillance shall be carried out, and no person shall order any surveillance of another person;

(ii) communication shall be intercepted, and no person shall order the interception of any communication of another person; save in execution of a warrant issued under section 36, or an order made under section 38, of this Act.

34. Application for issuance of warrant. – (1) Any authorised officer seeking to carry out any surveillance or intercept any communication of another person shall prefer an application for issuance of a warrant to the Magistrate.

(2) The application for issuance of the warrant shall be in the form and manner prescribed in the Schedule and shall state the purpose for which the warrant is sought.

(3) The application for issuance of the warrant shall be accompanied by –

(i) a report by the authorised officer of the suspicious conduct of the person in respect of whom the warrant is sought, and all supporting material thereof;

(ii) an affidavit of the authorised officer, or a declaration under his hand and seal, that the contents of the report and application are true to the best of his knowledge, information and belief, and that the warrant shall be executed only for the purpose stated in the application and shall not be misused or abused in any manner including to interfere in the privacy of any person;

(iii) details of all warrants previously issued in respect of the person in respect of whom the warrant is sought, if any.

35. Considerations prior to the issuance of warrant. – (1) No warrant shall issue unless the requirements of section 34 and this section have been met.

(2) The Magistrate shall consider the application made under section 34 and shall satisfy himself that the information contained therein sets out –

(i) a reasonable threat to national security, defence or public order; or

(ii) a cognisable offence, the prevention, investigation or prosecution of which is necessary in the public interest.

(3) The Magistrate shall satisfy himself that all other lawful means to acquire the information that is sought by the execution of the warrant have been exhausted.

(4) The Magistrate shall verify the identity of the authorised officer and shall satisfy himself that the application for issuance of the warrant is authentic.

36. Issue of warrant. – (1) Subject to section 34 and section 35, the Magistrate may issue a warrant for surveillance or interception of communication, or both of them.

(2) The Magistrate may issue the warrant in Chambers.

37. Magistrate may reject application for issuance of warrant. – If the Magistrate is not satisfied that the requirements of section 34 and section 35 have been met, he may, for reasons to be recorded in writing, –

(i) refuse to issue the warrant and dispose of the application;

(ii) return the application to the authorised officer without disposing of it;

(iii) pass any order that he thinks fit.

38. Order by Home Secretary in emergent circumstances. – (1) Notwithstanding anything contained in section 35, if the Home Secretary of the appropriate government is satisfied that a grave threat to national security, defence or public order exists, he may, for reasons to be recorded in writing, order any surveillance or interception of communication.

(2) An authorised officer seeking an order for surveillance or interception of communication under this section shall prefer an application to the Home Secretary in the form and manner prescribed in the Schedule and accompanied by the documents required under sub-section (3) of section 34.

(3) No order for surveillance or interception of communication made by the Home Secretary under this section shall be valid upon the expiry of a period of seven days from the date of the order.

(4) Before the expiry of a period of seven days from the date of an order for surveillance or interception of communication made under this section, the authorised officer who applied for the order shall place the application before the Magistrate for confirmation.

39. Duration of warrant or order. – (1) The warrant or order for surveillance or interception of communication shall specify the period of its validity and, upon its expiry, all surveillance and interception of communication, as the case may be, carried out in relation to that warrant or order shall cease forthwith:

Provided that no warrant or order shall be valid upon the expiry of a period of sixty days from the date of its issue.

(2) A warrant issued under section 36, or an order issued under section 38, for surveillance or interception of communication, or both of them, may be renewed by a Magistrate if he is satisfied that the requirements of sub-section (2) of section 35 continue to exist.

40. Duty to inform the person concerned. – (1) Subject to sub-section (2), before the expiry of a period of sixty days from the conclusion of any surveillance or interception of communication carried out under this Act, the authorised officer who carried out the surveillance or interception of communication shall, in writing in such form and manner as may be prescribed, notify, with reference to the warrant of the Magistrate, and, if applicable, the order of the Home Secretary, each person in respect of whom the warrant or order was issued, of the fact of such surveillance or interception and duration thereof.

(2) The Magistrate may, on an application made by an authorised officer in such form and manner as may be prescribed, if he is satisfied that the notification under sub-section (1) would –

(a) present a reasonable threat to national security, defence or public order, or

(b) adversely affect the prevention, investigation or prosecution of a cognisable offence,

for reasons to be recorded in writing addressed to the authorised officer, order that the person in respect of whom the warrant or order of surveillance or interception of communication was issued, not be notified of the fact of such interception or the duration thereof.

41. Security and duty of confidentiality and secrecy. – (1) No person shall carry out any surveillance or intercept any communication of another person without implementing measures, including, but not restricted to, technological, physical and administrative measures, to secure the confidentiality and secrecy of all information obtained as a result of the surveillance or interception of communication, as the case may be, including from theft, loss or unauthorised disclosure.

(2) Any person who carries out any surveillance or interception of any communication, or who obtains any information, including personal data, as a result of surveillance or interception of communication, shall be subject to a duty of confidentiality and secrecy in respect of it.

(3) Every competent organisation shall, before the expiry of a period of one hundred days from the enactment of this Act, designate as many officers as it deems fit as Privacy Officers who shall be administratively responsible for all interceptions of communications carried out by that competent organisation.

42. Disclosure of information. – (1) Save as provided in this section, no person shall disclose to any other person, or otherwise cause any other person to come into the knowledge or possession of, the content or nature of any information, including personal data, obtained as a result of any surveillance or interception carried out under this Act.

(2) Notwithstanding anything contained in this section, if the disclosure of any information, including personal data, obtained as a result of any surveillance or interception of any communication is necessary to –

(a) prevent a reasonable threat to national security, defence or public order, or

(b) prevent, investigate or prosecute a cognisable offence,

an authorised officer may disclose the information, including personal data, to any authorised officer of any other competent organisation.

CHAPTER VII

Offences and penalties

43. Punishment for offences related to personal data. – (1) Whoever, except in conformity with the provisions of this Act, collects, receives, stores, processes or otherwise handles any personal data shall be punishable with imprisonment for a term which may extend to [___] years and may also be liable to fine which may extend to [___] rupees.

(2) Whoever attempts to commit any offence under sub section (1) shall be punishable with the punishment provided for such offence under that sub-section.

(3) Whoever, except in conformity with the provisions of this Act, collects, receives, stores, processes or otherwise handles any sensitive personal data shall be punishable with imprisonment for a term which may extend to [increased for sensitive personal data] years and may also be liable to fine which may extend to [___] rupees.

(4) Whoever attempts to commit any offence under sub section (3) shall be punishable with the punishment provided for such offence under that sub-section.

44. Abetment and repeat offenders. – (1) Whoever abets any offence punishable under this Act shall, if the act abetted is committed in consequence of the abetment, be punishable with the punishment provided for that offence.

(2) Whoever, having been convicted of an offence under any provision of this Act is again convicted of an offence under the same provision, shall be punishable, for the second and for each subsequent offence, with double the penalty provided for that offence.

45. Offences by companies. – (1) Where an offence under this Act has been committed by a company, every person who, at the time the offence was committed, was in charge of, and was responsible to, the company for the conduct of the business of the company, as well as the company, shall be deemed to be guilty of the offence and shall be liable to be proceeded against and punished accordingly:

Provided that nothing contained in this sub-section shall render any such person liable to any punishment, if he proves that the offence was committed without his knowledge or that he had exercised all due diligence to prevent the commission of such offence.

(2) Notwithstanding anything contained in sub-section (1), where any offence under this Act has been committed by a company and it is proved that the offence has been committed with the consent or connivance of, or is attributable to any neglect on the part of any director, manager, secretary or other officer of the company, such director, manager, secretary or other officer shall be deemed to be guilty of that offence, and shall be liable to be proceeded against and punished accordingly.

46. Cognisance. – Notwithstanding anything contained in the Code of Criminal Procedure, 1973 (2 of 1974), the offences under section 43, section 44 and section 45 shall be cognisable and non-bailable.

47. General penalty. – Whoever, in any case in which a penalty is not expressly provided by this Act, fails to comply with any notice or order issued under any provisions thereof, or otherwise contravenes any of the provisions of this Act, shall be punishable with fine which may extend to [___] rupees, and, in the case of a continuing failure or contravention, with an additional fine which may extend to [___] rupees for every day after the first during which he has persisted in such failure or contravention.

48. Punishment to be without prejudice to any other action. – The award of punishment for an offence under this Act shall be without prejudice to any other action which has been or which may be taken under this Act with respect to such contravention.

CHAPTER VIII

Miscellaneous

49. Power to make rules. – (1) The Central Government may, by notification in the Official Gazette, make rules to carry out the provisions of this Act.

(2) In particular, and without prejudice to the generality of the foregoing power, such rules may provide for –

[__]

(3) Every rule made under this section shall be laid, as soon as may be after it is made, before each House of Parliament while it is in session for a period of thirty days which may be comprised in one session or in two successive sessions and if before the expiry of the session in which it is so laid or the session immediately following, both Houses agree in making any modification in the rule, or both Houses agree that the rule should not be made, the rule shall thereafter have effect only in such modified form or be of no effect, as the case may be, so however, that any such modification or annulment shall be without prejudice to the validity of anything previously done under that rule.

50. Bar of jurisdiction. – (1) On and from the appointed day, no court or authority shall have, or be entitled to exercise, any jurisdiction, powers or authority (except the Supreme Court and a High Court exercising powers under Article 32, Article 226 and Article 227 of the Constitution) in relation to matters specified in this Act.

(2) No order passed under this Act shall be appealable except as provided therein and no civil court shall have jurisdiction in respect of any matter which the Data Protection Authority is empowered by, or under, this Act to determine and no injunction shall be granted by any court or other authority in respect of any action taken or to be taken in pursuance of any power conferred by or under this Act.

51. Protection of action taken in good faith. – No suit or other legal proceeding shall lie against the Central Government, State Government, Data Protection Authority, Chairperson, Member or any person acting under the direction either of the Central Government, State Government, Data Protection Authority, Chairperson or Member in respect of anything which is in good faith done or intended to be done in pursuance of this Act or of any rules or any order made thereunder.

52. Power to remove difficulties. – (1) If any difficulty arises in giving effect to the provisions of this Act, the Central Government may, by order published in the Official Gazette, make such provisions, not inconsistent with the provisions of this Act, as appear to it to be necessary or expedient for removing the difficulty:

Provided that no such order shall be made under this section after the expiry of a period of three years from the commencement of this Act.

(2) Every order made under this section shall be laid, as soon as may be after it is made, before each House of Parliament.

53. Act to have overriding effect. – The provisions of this Act shall have effect notwithstanding anything inconsistent therewith contained in any other law for the time being in force.

US Privacy FTC

by Prasad Krishna last modified Sep 30, 2013 06:41 AM

PDF document icon US-FTC Privacy Overview (India 2013).pdf — PDF document, 1628 kB (1668070 bytes)

A Privacy Meeting with the Federal Trade Commission in New Delhi

by Elonnai Hickok last modified Oct 03, 2013 10:25 AM
On September 20, the Centre for Internet and Society held a roundtable meeting with Betsy Broder, Counsel for International Consumer Protection, and Sarah Schroeder, Attorney, Bureau of Consumer Protection, Federal Trade Commission (FTC), United States. The meeting took place at the Imperial, Janpath, New Delhi, and discussed both the U.S. framework for privacy and potential frameworks and challenges to privacy in India.

As a note, thoughts shared during the meeting represented personal perspectives, and did not constitute the official position of the Federal Trade Commission.

When explaining the U.S. regulatory framework for privacy, the FTC attorneys highlighted that the United States does not have comprehensive privacy legislation, as Europe does, but instead has sectoral laws that address different aspects of privacy. For example, the Fair Credit Reporting Act maintains the confidentiality of consumer credit report information, the Gramm-Leach-Bliley Act imposes privacy and security requirements on financial institutions, HIPAA applies to patient health information, and the Children’s Online Privacy Protection Act restricts the collection and posting of personal information from children. It was discussed that the sectoral model followed by the United States allows a nuanced balance to be struck between privacy protection and the market. It was noted, however, that some have critiqued the U.S. regulatory framework for lacking clear principles that apply to the commercial world and lay out strong privacy protections for the individual. In light of this, the White House is developing a Privacy Bill of Rights.

The Federal Trade Commission is an independent agency in the United States Government responsible for enforcing both consumer protection and competition laws. It is composed of five commissioners and a staff of roughly 1,000, which includes attorneys and economists. The FTC is primarily a law enforcement agency, but it also undertakes policy development through workshops and reports. Consumer education is another key function of the agency.

On the consumer protection side, Congress has directed the FTC to enforce the Federal Trade Commission Act, as well as some more specific statutes, such as those that protect consumers from unwanted telemarketing and that protect children online. Its main objectives are to protect consumer interests and to prevent fraud and unfair and deceptive business practices. The FTC carries out its privacy work through its consumer protection mission.

To understand the FTC’s role in relation to privacy, it is important to note that the FTC’s jurisdiction applies only to certain industries, as defined by Congress. Thus, for example, the FTC does not have jurisdiction over banks or telecommunications.

The most critical part of the FTC’s activities is its law enforcement function.  The FTC can investigate an organization if the staff believes that the entity may be involved in conduct that contravenes the FTC Act’s prohibition on unfair or deceptive practices, or another specific privacy law. The FTC has brought a number of privacy-related cases against major companies including Facebook, Google, ChoicePoint, and Twitter.  Many of these cases address new challenges brought about by rapidly changing technologies.

The vast majority of the FTC’s actions have been settled with consent judgments. When the statute that the FTC enforces allows for the imposition of a civil penalty, the FTC sets the penalty at a level that ensures it is fair and provides a deterrent, but does not impose a hardship on the company. As a civil enforcement agency, the FTC cannot seek criminal sanctions. While enforcement is the cornerstone of the FTC’s approach to privacy, the agency also supports self-regulation, where appropriate. In this system, the FTC does not pre-approve an organization’s practices or define principles that all companies should abide by, as it is felt that every organization is unique and has different needs and abilities, and assigning specific technical standards may stifle innovation.

The meeting also discussed how US privacy laws may apply to overseas companies where they provide services for US consumers or work on behalf of US companies. For example, under the Gramm-Leach-Bliley Act the FTC has created the Safeguards Rule, which speaks to how financial data held by financial institutions must be handled and protected. This Rule applies to companies overseas if the company is performing work for US companies or US consumers. In other words, a US company cannot avoid compliance by outsourcing its work to an offshore organization. Discussions during the meeting also focused on consent and the key role that context, accessibility and timing play in ensuring individuals have the ability to provide informed consent. Some of the attendees suggested that this practice could be greatly improved in India; for example, there are currently companies in India that provide consumers access to the company’s privacy policy only after an individual has consented and signed up for the service. When asked about the challenges to privacy that exist in India, many shared that, culturally, there is a different understanding of privacy in India than in many western countries.

Other thoughts included that the Indian government is currently imagining privacy regulation as either fluid and purely self-regulatory or enforced through strict legal provisions. Instead, the government needs to expand the possibilities for a regulatory framework for privacy in India in a way that allows for both strong legal enforcement and flexible standards. The right to be forgotten was also discussed, and it was mentioned that California has proposed a law that would allow individuals to request the deletion of information.

CPR South 1

by Prasad Krishna last modified Sep 30, 2013 10:58 AM

PDF document icon CPR South 1.pdf — PDF document, 221 kB (226687 bytes)

CPR South 2

by Prasad Krishna last modified Sep 30, 2013 11:17 AM

PDF document icon CPR South 2.pdf — PDF document, 163 kB (167757 bytes)

An Analysis of the Cases Filed under Section 46 of the Information Technology Act, 2000 for Adjudication in the State of Maharashtra

by Bhairav Acharya last modified Oct 01, 2013 03:29 PM
This is a brief review of some of the cases related to privacy filed under section 46 of the Information Technology Act, 2000 ("the Act") seeking adjudication for alleged contraventions of the Act in the State of Maharashtra.

Background

Section 46 of the Act grants the Central Government the power to appoint an adjudicating officer to hold an enquiry to adjudge, upon complaints being filed before that adjudicating officer, contraventions of the Act. The adjudicating officer may be of the Central Government or of the State Government [see section 46(1) of the Act], must have field experience with information technology and law [see section 46(3) of the Act] and exercises jurisdiction over claims for damages up to ₹5,00,00,000 [see section 46(1A) of the Act]. For the purpose of adjudication, the officer is vested with certain powers of a civil court [see section 46(5) of the Act] and must follow basic principles of natural justice while conducting adjudications [see section 46(2) of the Act]. Hence, the adjudicating officer appointed under section 46 is a quasi-judicial authority.

In addition, the quasi-judicial adjudicating officer may impose penalties, thereby vesting him with some of the powers of a criminal court [see section 46(2) of the Act], and award compensation, the quantum of which is to be determined after taking into account factors including unfair advantage, loss and repeat offences [see section 47 of the Act]. The adjudicating officer may impose penalties for any of the offences described in section 43, section 44 and section 45 of the Act; and, further, may award compensation for losses suffered as a result of contraventions of section 43 and section 43A. The text of these sections is reproduced in the Schedule below. Further law as to the appointment of the adjudicating officer and the procedure attendant on all adjudications was made by Information Technology (Qualification and Experience of Adjudicating Officers and the Manner of Holding Enquiry) Rules, 2003.[1]

It is clear that the adjudicating officer is vested with significant judicial powers, including the power to enforce certain criminal penalties, and is an important quasi-judicial authority.

Excursus

At the outset, it is important to understand the distinction between compensation and damages. Compensation is a sum of money awarded by a civil court, before or along with the primary decree, to indemnify a person for injury or loss. It is usually awarded to a person who has suffered a monetary loss as a result of the acts or omissions of another party. Its quantification is usually guided by principles of equity. [See Shantilal Mangaldas AIR 1969 SC 634 and Ranbir Kumar Arora AIR 1983 P&H 431]. On the other hand, damages are punitive and, in addition to restoring an indemnitee to wholeness, may be imposed to deter an offender, punish exemplary offences, and recover consequential losses, amongst other objectives. Damages that are punitive, while not judicially popular in India, are usually imposed by a criminal court in common law jurisdictions. They are distinct from civil and equitable actions. [See the seminal case of The Owners of the Steamship Mediana [1900] AC 113 (HL)].

Unfortunately, section 46 of the Act uses the terms “damage”, “injury” and “compensation” interchangeably without regard for the long and rich jurisprudence that finds them to be different concepts.

The Cases related to Privacy

In the State of Maharashtra, there have been a total of 47 cases filed under section 46 of the Act. Of these, 33 cases have been disposed of by the Adjudicating Officer and 14 are currently pending disposal. [2] At least three of these cases before the Adjudicating Officer deal with issues related to privacy of communications and personal data. They are:

Case Title | Forum | Date
Vinod Kaushik v. Madhvika Joshi | Shri Rajesh Aggarwal, Adjudicating Officer, ex-officio Secretary (IT), Government of Maharashtra | 10.10.2011
Amit D. Patwardhan v. Rud India Chains | Shri Rajesh Aggarwal, Adjudicating Officer, ex-officio Secretary (IT), Government of Maharashtra | 15.04.2013
Nirmalkumar Bagherwal v. Minal Bagherwal | Shri Rajesh Aggarwal, Adjudicating Officer, ex-officio Secretary (IT), Government of Maharashtra | 26.08.2013

In all three cases the Adjudicating Officer was called upon to determine and penalise unauthorised access to personal data of the complainants. In the Vinod Kaushik case, the complainants’ emails and chat sessions were accessed, copied and made available to the police for legal proceedings without the permission of the complainants. In the Amit Patwardhan and Nirmalkumar Bagherwal cases, the complainants’ financial information in the form of bank account statements were obtained from their respective banks without their consent and used against them in legal proceedings.

The Vinod Kaushik complaint was filed in 2010 for privacy violations committed between 2008 and 2009. The complaint was made against the complainant’s daughter-in-law – the respondent, who was estranged from her husband, the complainant’s son. The respondent had, independent of the proceedings before the Adjudicating Officer, instituted criminal proceedings alleging cruelty and dowry-related harassment against her estranged husband and the complainant. To support some of the claims made in the criminal proceedings, the respondent accessed the email accounts of her estranged husband and the complainant and printed copies of certain communications, both emails and chat transcripts. The complaint to the Adjudicating Officer was made in relation to these emails and chat transcripts that were obtained without the consent and knowledge of the complainant and his son. On 09.08.2010, the then Adjudicating Officer dismissed the complaint after finding that, owing to the marriage between the respondent and the complainant’s son, there was a relation of mutual trust between them that resulted in the complainant and his son consensually sharing their email account passwords with the respondent. This ruling was appealed to the Cyber Appellate Tribunal ("CyAT") which, in a decision of 29.06.2011, found irregularities in the complainant’s son’s privity to the proceedings and remanded the complaint to the Adjudicating Officer for re-adjudication. The re-adjudication, which was conducted by Shri Rajesh Aggarwal as Adjudicating Officer, resulted in a final order of 10.10.2011 ("the final order") that is the subject of this analysis. The final order found that the respondent had violated the privacy of the complainant and his son by her unauthorised access of their email accounts and sharing of their private communications. However, the Adjudicating Officer found that the intent of the unauthorised access – to obtain evidence to support a criminal proceeding – was mitigatory and hence ordered the respondent to pay only a small token amount in compensation, not to the complainants but instead to the State Treasury. The Delhi High Court, which was moved in appeal because the CyAT was non-functional, upheld the final order in its decision of 27.01.2012.

The Amit Patwardhan complaint was filed against the complainant’s ex-employer – the respondent, for illegally obtaining copies of the complainant’s bank account statement. The complainant had left the employ of the respondent to work with a competing business company but not before colluding with the competing business company and diverting the respondent’s customers to them. For redress, the respondent filed suit for a decree of compensation and led the complainant’s bank statements in evidence to prove unlawful gratification. Since the bank statements were obtained electronically by the respondent without the complainant’s consent, the jurisdiction of the Adjudicating Officer was invoked. In his order of 15.04.2013, Shri Rajesh Aggarwal, the Adjudicating Officer, found that the respondent had, by unlawfully obtaining the complainant’s bank account statements which constitute sensitive personal data, violated the complainant’s privacy. The Adjudicating Officer astutely applied the equitable doctrine of clean hands to deny compensation to the complainant; however, because the complainant’s bank was not a party to the complaint, the Adjudicating Officer was unable to make a ruling on the lack of action by the bank to protect the sensitive personal data of its depositors.

The Nirmalkumar Bagherwal complaint bears a few similarities to the preceding two cases. Like the Vinod Kaushik matter, the issue concerned the manner in which a wife, estranged but still legally married, accessed electronic records of personal data of the complainants; and, like the Amit Patwardhan matter, the object of the privacy violation was the bank account statements of the complainants that constitute sensitive personal data. The respondent was the estranged wife of one of the complainants who, along with his complainant father, managed the third complainant company. To support her claim for maintenance from the complainant and his family in an independent legal proceeding, the respondent obtained certain bank account statements of the complainants without their consent and, possibly, with the collusion of the respondent bank. After reviewing relevant law from the European Union and the United States, and observant of relevant sectoral regulations applicable in India including the relevant Master Circular of the Reserve Bank of India, and further noting preceding consumer case law on the subject, the Adjudicating Officer issued an order on 26.08.2013. The order found that the complainant’s right to privacy was violated by both the respondents but, while determining the quantum of compensation, distinguished between the respondents in respect of the degree of liability; the respondent wife was ordered to pay a token compensation amount while the respondent bank was ordered to pay higher compensation to each of the three complainants individually.

The high quality of each of the three orders bears specific mention. Despite the superb quality of the judgments of the Indian higher judiciary in the decades after independence, the overall quality of judgment-writing appears to have declined. [3] In the last decade, several Indian judges have called for higher standards of judgment writing from their fellow judges. [4] In this background, it is notable that Shri Rajesh Aggarwal, despite not being a member of the judiciary, has delivered well-reasoned, articulate and clear orders that are cognisant of legal issues and also easily understandable to a non-legal reader.

In each of these cases, the Adjudicating Officer has successfully navigated around the fact that none of the primary parties were interacting and transacting at arm’s length. In the Vinod Kaushik and Nirmalkumar Bagherwal matters, the primary parties were estranged but still legally married partners and in the Amit Patwardhan matter the parties were in an employer-employee relationship. The first Adjudicating Officer in the Vinod Kaushik matter failed, in his order of 09.08.2010, to appreciate that the individual communications of individual persons were privileged by an expectation of privacy, regardless of their relationship. Hence, despite acknowledging that the marital partners in that matter were in conflict with each other, and despite being told by one party that the other party’s access to those private communications was made without consent, the Adjudicating Officer allowed his non-judicial opinion of marriage to influence his order. This mistake was corrected when the matter was remanded for re-adjudication. In the re-adjudication, the new Adjudicating Officer correctly noted that the respondent wife could have chosen to approach the police or a court to follow the proper investigative procedure for accessing emails and other private communications of another person and that her unauthorised use of the complainant’s passwords amounted to a violation of their privacy.

Popular conceptions of different types of relationships may affect the (quasi) judicial imagination of privacy. In comparison to the Vinod Kaushik matter, the Nirmalkumar Bagherwal and Amit Patwardhan matters both dealt with unauthorised access to bank account statements, by a wife and by an ex-employer respectively. In any event, the same Adjudicating Officer presided over all three matters and correctly found that the facts in all three matters admitted to contraventions of the privacy of the complainants. The conjecture as to whether the first Adjudicating Officer in the Vinod Kaushik matter would have applied the same standard of family unity to unauthorised access of bank account statements by an estranged wife who was seeking maintenance remains untested. However, the reliance placed on the decision of the Delhi State Consumer Protection Commission in the matter of Rupa Mahajan Pahwa, [5] where the Commission found that unauthorised access to a bank pass book by an estranged husband violated the privacy of the wife, would suggest that judges clothe financial information with a standard of privacy higher than that given to emails.

Emails are a form of electronic communication. The PUCL case (Supreme Court of India, 1996),[6] while it did not explicitly deal with the standard of protection accorded to emails, held that personal communications were protected by an individual right to privacy that emanated from the protection of personal liberty guaranteed under Article 21 of the Constitution of India. Following the Maneka Gandhi case (Supreme Court of India, 1978),[7] it is settled that persons may be deprived of their personal liberty only by a just, fair and reasonable procedure established by law. As a result, interceptions of private communications that are protected by Article 21 may only be conducted in pursuance of such a procedure. This procedure exists in the form of the Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009, which came into effect on 27 October 2009 ("the Interception Rules"). The Interception Rules set out a regime for accessing private emails in certain conditions. The powers and procedure of Section 91 of the Code of Criminal Procedure ("CrPC") may also apply to obtain data at rest, such as emails stored in an inbox or sent-mail folder.

Finally, the orders of the Adjudicating Officer reveal a well-reasoned and progressive understanding of the law and principles relating to the quantification of compensation. By choosing to impose larger amounts of compensation on the bank that violated the privacy of the complainant in the Nirmalkumar Bagherwal matter, the Adjudicating Officer has indicated that institutions that hold sensitive personal data, such as financial information, are subject to a higher duty of care in relation to it. But, most importantly, the act of imposing monetary compensation for privacy violations is a step forward because, for the first time in India, it recognises that privacy violations are civil wrongs or injuries that demand compensation.


[1]. These Rules were issued vide GSR 220(E), dated 17 March 2003 and published in the Gazette of India, Extraordinary, Part II, Section 3(i). These Rules can be accessed here – http://it.maharashtra.gov.in/PDF/Qual_ExpAdjudicatingOfficer_Manner_of_Holding_Enquiry_Rules.PDF (visited on 30 September 2013).

[2]. These cases and statistics may be viewed here – http://it.maharashtra.gov.in/1089/IT-Act-Judgements (visited on 30 September 2013).

[3]. See generally, Upendra Baxi, “‘The Fair Name of Justice’: The Memorable Voyage of Chief Justice Chandrachud” in A Chandrachud Reader (Justice V. S. Deshpande ed., Delhi: Documentation Centre etc., 1985) and Rajeev Dhavan, “Judging the Judges” in Judges and the Judicial Power: Essays in Honour of Justice V. R. Krishna Iyer (Rajeev Dhavan and Salman Khurshid eds., London: Sweet & Maxwell, 1985).

[4]. See generally, Justice B. G. Harindranath, Art of Writing Judgments (Bangalore: Karnataka Judicial Academy, 2004); Justice T. S. Sivagnanam, The Salient Features of the Art of Writing Orders and Judgments (Chennai: Tamil Nadu State Judicial Academy, 2010); and Justice Sunil Ambwani, “Writing Judgments: Comparative Models”, presentation at the National Judicial Academy, Bhopal (2006), available here – http://districtcourtallahabad.up.nic.in/articles/writing%20judgment.pdf (visited on 29 Sep 2013).

[5]. Appeal No. FA-2008/659 of the Delhi State Consumer Protection Commission, decided on 16 October 2008.

[6]. (1997) 1 SCC 301.

[7]. (1978) 1 SCC 248.

CIS Cybersecurity Series (Part 11) - Anja Kovacs

by Purba Sarkar last modified Oct 15, 2013 03:25 PM
CIS interviews Anja Kovacs, researcher and activist, and director of the Internet Democracy Project, as part of the Cybersecurity Series.

"Having the cyber security debate become more and more important was a real challenge for civil society. I think in part because many of us who were focused on human rights aren't necessarily techies. And so, when you have a conversation with a government bureaucrat, and ask questions about the kind of decisions they decided to take, very often they will come up with a technical answer in response. And then, if you don't have that expertise, it is difficult to react. In the meantime though, I think it has become clear that this is one of the biggest issues in the internet field at the moment. It is also one of the big issues that is driving the desires of governments to have a bigger role to play in internet governance. So it is an area that is unavoidable for activists. What has happened slowly is that we have come to realize that the first thing, as in most other areas, is not the technical details, but principles, and those principles are fairly similar to how they are in many other fields." - Anja Kovacs, Internet Democracy Project

Centre for Internet and Society presents its eleventh installment of the CIS Cybersecurity Series. 

The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

In this installment, CIS speaks to Anja Kovacs, director of the Internet Democracy Project. Her work focuses on a wide range of questions regarding freedom of expression, cybersecurity and the architecture of Internet governance as they relate to the Internet and democracy. Anja is currently also a member of the Investment Committee of the Digital Defenders Partnership and of the interim Steering Group of Best Bits, a global network of civil society members.

(Bio from internetdemocracy.in) 

Internet Democracy Project homepage: http://internetdemocracy.in/

 

This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.


The India Privacy Monitor Map

by Maria Xynou last modified Oct 09, 2013 04:26 PM
The Centre for Internet and Society has started the first Privacy Watch in India! Check out our map which includes data on the UID, NPR and CCTNS schemes, as well as on the installation of CCTV cameras and the use of drones throughout the country.
The India Privacy Monitor Map

by gruntzooki on flickr

In a country of twenty-eight diverse states and seven union territories, it has remained unclear to what extent surveillance, biometric and other privacy-intrusive schemes are being implemented. We are trying to address this by mapping out data for every single state in India on the UID, CCTNS and NPR schemes, as well as on the installation of CCTV cameras and the use of Unmanned Aerial Vehicles (UAVs), otherwise known as drones.

In particular, the map in its current format includes data on the following:

UID: The Unique Identification Number (UID), also known as AADHAAR, is a 12-digit unique identification number which the Unique Identification Authority of India (UIDAI) is currently issuing for all residents in India (on a voluntary basis). Each UID is stored in a centralised database and linked to the basic demographic and biometric information of each individual. The UIDAI and AADHAAR currently lack legal backing.

NPR: Under the National Population Register (NPR), the demographic data of all residents in India is collected on a mandatory basis. The Unique Identification Authority of India (UIDAI) supplements the NPR with the collection of biometric data and the issue of the AADHAAR number.

CCTV: Closed-circuit television cameras which can produce images or recordings for surveillance purposes.

UAV: Unmanned Aerial Vehicles (UAVs), otherwise known as drones, are aircraft without a human pilot on board. The flight of a UAV is controlled either autonomously by computers in the vehicle or remotely by a pilot on the ground or in another vehicle. UAVs are used for surveillance purposes.

CCTNS: The Crime and Criminal Tracking Networks and Systems (CCTNS) is a nationwide networking infrastructure for enhancing efficiency and effectiveness of policing and sharing data among 14,000 police stations across India.

Our India Privacy Monitor Map can be viewed through the following link: http://cis-india.org/cisprivacymonitor

This map is part of on-going research and will hopefully expand to include other schemes and projects which are potentially privacy-intrusive. We encourage all feedback and additional data!

Interview with Big Brother Watch on Privacy and Surveillance

by Maria Xynou last modified Oct 15, 2013 02:24 PM
Maria Xynou interviewed Emma Carr, the Deputy Director of Big Brother Watch, on privacy and surveillance. View this interview and gain an insight on why we should all "have something to hide"!

For all those of you who haven't heard of Big Brother Watch, it's a London-based campaign group which was founded in 2009 to protect individual privacy and defend civil liberties.

Big Brother Watch was set up to challenge policies that threaten our privacy, our freedoms and our civil liberties, and to expose the true scale of the surveillance state. The campaign group has produced unique research exposing the erosion of civil liberties in the UK, looking at the dramatic expansion of surveillance powers, the growth of the database state and the misuse of personal information. Big Brother Watch campaigns to give individuals more control over their personal data, and hold to account those who fail to respect our privacy, whether private companies, government departments or local authorities.

Emma Carr joined Big Brother Watch as Deputy Director in February 2012 and has since been regularly quoted in the UK press. The Centre for Internet and Society interviewed Emma Carr on the following questions:

  1. How do you define privacy?

  2. Can privacy and freedom of expression co-exist? Why/Why not?

  3. What is the balance between Internet freedom and surveillance?

  4. According to your research, most people worldwide care about their online privacy – yet they give up most of it through the use of social networking sites and other online services. Why, in your opinion, does this occur and what are the potential implications?

  5. Should people have the right to give up their right to privacy? Why/Why not?

  6. What implications on human rights can mass surveillance potentially have?

  7. “I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally.” Please comment.

  8. Do we have Internet freedom?

 

VIDEO

Interview with Bruce Schneier - Internationally Renowned Security Technologist

by Maria Xynou last modified Oct 17, 2013 08:54 AM
Maria Xynou recently interviewed Bruce Schneier on privacy and surveillance. View this interview and gain an insight on why we should all "have something to hide"!

Bruce Schneier is an internationally renowned security technologist, called a "security guru" by The Economist.

He is the author of 12 books -- including Liars and Outliers: Enabling the Trust Society Needs to Survive -- as well as hundreds of articles, essays, and academic papers. His influential newsletter "Crypto-Gram" and his blog "Schneier on Security" are read by over 250,000 people. He has testified before Congress, is a frequent guest on television and radio, has served on several government committees, and is regularly quoted in the press.

Schneier is a fellow at the Berkman Center for Internet and Society at Harvard Law School, a program fellow at the New America Foundation's Open Technology Institute, a board member of the Electronic Frontier Foundation, an Advisory Board Member of the Electronic Privacy Information Center, and the Security Futurologist for BT -- formerly British Telecom.

The Centre for Internet and Society (CIS) interviewed Bruce Schneier on the following questions:

  1. Do you think India needs privacy legislation? Why/ Why not?

  2. The majority of India's population lives below the poverty line and barely has any Internet access. Is surveillance an elitist issue or should it concern the entire population in the country? Why/ Why not?

  3. “I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally.” Please comment.

  4. Can free speech and privacy co-exist? What is the balance between privacy and freedom of expression?

  5. Should people have the right to give up their right to privacy? Why/ Why not?

  6. Should surveillance technologies be treated as traditional arms/weapons? Why/ Why not?

  7. How can individuals protect their data (and themselves) from spyware, such as FinFisher?

  8. How would you advise young people working in the surveillance industry?

VIDEO

Interview with the Tactical Technology Collective on Privacy and Surveillance

by Maria Xynou last modified Oct 18, 2013 09:56 AM
The Centre for Internet and Society recently interviewed Anne Roth from the Tactical Technology Collective in Berlin. View this interview and gain an insight on why we should all "have something to hide"!

For all those of you who haven't heard of the Tactical Technology Collective, it's a Berlin and Bangalore-based non-profit organisation which aims to advance the skills, tools and techniques of rights advocates, empowering them to use information and communications to help marginalised communities understand and effect progressive social, environmental and political change.

Tactical Tech's Privacy & Expression programme builds the digital security awareness and capacity of human rights defenders, independent journalists, anti-corruption advocates and activists. The programme's activities range from awareness-raising comic films aimed at audiences new to digital security issues, to direct training and materials for high-risk defenders working in some of the world's most repressive environments.

Anne Roth works with Tactical Tech on the Privacy & Expression programme as a researcher and editor. Anne holds a degree in political science from the Free University of Berlin. She cofounded one of the first interactive media activist websites, Indymedia, in Germany in 2001 and has been involved with media activism and various forms of activist online media ever since. She has worked as a web editor and translator in the past. Since 2007 she has written a blog that covers privacy, surveillance, media, net politics and feminist issues.

The Centre for Internet and Society interviewed Anne Roth on the following questions:

  1. How do you define privacy?

  2. Can privacy and freedom of expression co-exist? Why/ Why not?

  3. What is the balance between Internet freedom and surveillance?

  4. According to research, most people worldwide care about their online privacy – yet they give up most of it through the use of social networking sites and other online services. Why, in your opinion, does this occur and what are the potential implications?

  5. Should people have the right to give up their right to privacy? Why/ Why not?

  6. What implications on human rights can mass surveillance potentially have?

  7. “I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally”. Please comment.

  8. Do we have Internet freedom?

VIDEO

Tweets from Bali IGF 2013

by Pranesh Prakash last modified Oct 28, 2013 09:09 AM
CIS is logging all tweets with the words "igf2013", "igf13", "igf", "bestbits", and "genderit" during the Internet Governance Forum taking place in Bali this week, and making them available in downloadable files.

To enable research by those who don't want to mess around with Twitter's APIs, we are making CSV files available to the general public. These files can be opened in any spreadsheet software (including web-based ones), or even in a text editor.
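As an illustration, here is a minimal sketch of how one of these CSV exports could be explored with Python's standard library; the file name and the "text" column used below are assumptions, since the exact export format may differ:

    import csv
    from collections import Counter

    # Hypothetical file name; substitute the actual downloaded CSV.
    FILENAME = "igf2013_tweets.csv"

    hashtag_counts = Counter()
    with open(FILENAME, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        for row in reader:
            text = row.get("text", "") or ""  # assumed name of the column holding the tweet text
            # Tally hashtags to get a rough picture of the conversation.
            for word in text.split():
                if word.startswith("#"):
                    hashtag_counts[word.lower().rstrip(".,:;!?")] += 1

    # Print the ten most frequent hashtags.
    for tag, count in hashtag_counts.most_common(10):
        print(count, tag)

Any spreadsheet tool will do the same job; the point is simply that no access to Twitter's APIs is needed to work with these archives.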

These files will be updated with the latest version at the end of each evening in Bali.

If you have any ideas as to other keywords we should capture, or about visualizations that we should engage in, do get in touch with pranesh AT cis-india DOT org.

Open Letter to Members of the European Parliament of the Civil Liberties, Justice and Home Affairs Committee

by Elonnai Hickok last modified Oct 23, 2013 05:00 AM
An open letter was sent to the Members of the European Parliament of the Civil Liberties, Justice and Home Affairs Committee on the proposed EU Regulation. The letter was part of an initiative that Privacy International and a number of other NGOs are undertaking.

Dear Members of the European Parliament of the Civil Liberties, Justice and Home Affairs Committee,

On behalf of The Centre for Internet and Society, Bangalore, India,  we are writing to express our support of the European Commission’s proposed General Data Protection Regulation (COM (2012) 11).

The legal framework established under the 1995 Data Protection Directive (95/46/EC) in Europe has positively influenced many existing privacy regimes worldwide, serving as a model legal framework in jurisdictions that are in the process of developing privacy regimes, including India. The positive impact of the Data Protection Directive shows the potential of the Regulation to become a global model for the protection of personal data. The Regulation seeks to address new scenarios that have arisen in the context of rapidly changing technologies and practices, increasing its potential for positively influencing privacy rights for individuals globally.

India is currently in the process of considering the enactment of privacy legislation, in part with the aim of ensuring adequate safeguards to enable and enhance information flows into India from countries around the world, including Europe. At the same time, India is seeking  Data Secure Status from the EU, on the basis of its current regime.

It is clear that the EU framework for data protection has a major influence on the current and emerging privacy regime in India. India is only one country of many that are in the beginning stages of developing a comprehensive privacy regime. Thus, we ask that you keep in mind how the Regulation will impact the rights of individuals in countries outside of Europe, particularly in countries that are in the process of developing privacy regimes.

We ask that you take into consideration the four following points that we believe need to be addressed in the Regulation to help ensure adequate protection of the rights of individuals in the European Union and around the world.

  1. Strengthen the principle of purpose limitation: The Regulation should incorporate a strong purpose limitation principle that strictly limits present and future uses of personal data to the purposes for which it was originally collected. Currently, Article 6(4) allows for the further processing of data when the processing is “not compatible with the one for which the personal data have been collected”. Though the provision establishes legal requirements, one of which must be met before information can be used for a further purpose, this has proven insufficient in the existing Directive. The current provision in the Regulation dilutes the principle of purpose limitation and weakens an individual’s ability to make informed decisions about their personal data.
  2. Define principles for interpretation of broad terms: The Regulation should create principles for interpreting broad terms such as “legitimate interest” and “public interest”. These vague terms are used throughout the Regulation, and create the potential for loopholes or abuse. Because these terms can be interpreted in many different ways, it is important to create a set of principles to guide their interpretation  by data protection authorities and courts to avoid inconsistent application and enforcement of the Regulation.
  3. Clarify the scope of the Regulation: The Regulation should clearly describe the jurisdictional scope and reach of its provisions. Currently Article 3(1) states that the Regulation will apply to the processing of data “in the context of the activities of an establishment of a controller or a processor in the Union”. The flow of information in the online environment, coupled with trends such as cloud computing, outsourcing, and cross-border business, creates a scenario where defining what constitutes the “context of the activities of an establishment” is difficult, and could lead to situations where personal data is not protected because its collection, use, or storage does not necessarily fall within the “context of the activities”.
  4. Address access by foreign intelligence bodies: In light of growing demands by law enforcement for access, use, and transfer of personal information for investigative purposes across jurisdictions, the Regulation should define the circumstances in which personal data protected by its provisions can be accessed and used by foreign intelligence bodies, and the procedure by which to do so. The Regulation should address challenges such as access by foreign intelligence bodies to data stored on the cloud and data that has passed through or is stored on foreign networks and servers.

Interview with Dr. Alexander Dix - Berlin Data Protection and Freedom of Information Commissioner

by Maria Xynou last modified Nov 06, 2013 09:29 AM
Maria Xynou recently interviewed Berlin's Data Protection and Freedom of Information Commissioner: Dr. Alexander Dix. View this interview and gain an insight on recommendations for better data protection in India!

Dr. Alexander Dix has been Berlin's Data Protection and Freedom of Information Commissioner since June 2005. He has more than 26 years of practical experience in German data protection authorities and previously served as Commissioner for the state of Brandenburg for seven years.

Dr. Dix is a specialist in telecommunications and media and has dealt with a number of issues regarding the cross-border protection of citizens’ privacy. He chairs the International Working Group on Data Protection in Telecommunications (“Berlin Group”) and is a member of the Article 29 Working Party of European Data Protection Supervisory Authorities. In this Working Party he represents the Data Protection Authorities of the 16 German States (Länder).

A native of Bad Homburg, Hessen, Dr. Alexander Dix graduated from Hamburg University with a degree in law in 1975. He received a Master of Laws degree from the London School of Economics and Political Science in 1976 and a Doctorate in law from Hamburg University in 1984. He has published extensively on issues of data protection and freedom of information. Inter alia he is a co-editor of the German Yearbook on Freedom of Information and Information Law.

The Centre for Internet and Society interviewed Dr. Alexander Dix on the following questions:

  1. What activities and functions does the Berlin data commissioner's office undertake?

  2. What powers does the Berlin data commissioner's office have? In your opinion, are these sufficient? Which powers have been most useful? If there is a lack, what would you feel is needed?

  3. How is the office of the Berlin Data Protection Commissioner funded?

  4. What is the organisational structure at the Office of the Berlin Data Protection Commissioner and the responsibilities of the key executives?

  5. If India creates a Privacy Commissioner, what structure/framework would you suggest for the office?

  6. What challenges has your office faced?

  7. What is the most common type of privacy violation that your office is faced with?

  8. Does your office differ from other EU data protection commissioner offices?

  9. How do you think data should be regulated in India?

  10. Do you support the idea of co-regulation or self-regulation?

  11. How can India protect its citizens' data when it is stored in foreign servers?

VIDEO

An Interview with Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party

by Elonnai Hickok last modified Oct 25, 2013 04:50 AM
The Centre for Internet and Society interviewed Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party.

What activities and functions does your office undertake?

The activities and functions of the Dutch data protection authority can roughly be divided into four categories: supervisory activities, giving advice on draft legislation, raising awareness and international tasks.

The Dutch DPA supervises the legislation applicable in the Netherlands with regard to the use of personal data. The most important law is the Dutch Data Protection Act, but the Dutch DPA also supervises for example the Acts governing data processing by police and justice as well as parts of the Telecoms Act.

The supervisory activities mainly consist of investigating, ex officio, violations of the law, with the focus on violations that are serious, structural and impact a large number of people. Where necessary, the Dutch DPA can use its sanctioning powers, including imposing a conditional fine, to enforce the law. The Dutch DPA can also decide to examine sector-wide codes of conduct that are submitted to it and provide its views in the form of a formal opinion.

In addition to investigations, the Dutch DPA advises the government, and sometimes the parliament, on draft legislation related to the processing of personal data. Following the Data Protection Act, the government is obliged to submit both primary and secondary legislation related to data processing to the DPA for advice.

As regards awareness-raising, next to publishing the results of the investigations, its views on codes of conduct and its advice on legislation, the Dutch DPA also issues guidelines, on its own initiative, explaining legal norms. Via its websites, the Dutch DPA provides more information to both data subjects and controllers on how data can and cannot be processed. Specifically for data subjects, self-empowerment tools – including standard letters to exercise their rights – are made available. Furthermore, they can contact the Dutch DPA daily via a telephone hotline.

Last but not least, the Dutch DPA participates in several International and European fora, including the Article 29 Working Party of which I am the Chair, the European and the International Conference of data protection and privacy commissioners, of whose Executive Committee I am also the Chair.

What powers does your office have? in your opinion are these sufficient? Which powers have been most useful? If there is a lack, what do you feel is needed?

The Dutch DPA has a broad range of investigative powers, including the power to order the controller to hand over all relevant information and to enter the premises of the controller unannounced. All organisations subject to the supervision of the Dutch DPA are obliged to cooperate.

The Dutch DPA also has a considerable range of sanctioning powers; it can, for example, order the suspension or termination of certain processing operations and can also impose a conditional fine. A bill is currently before Parliament to provide the Dutch DPA with fining powers as well.

Especially once the bill providing the Dutch DPA with fining powers is passed, I feel the powers will be sufficient, giving us all the necessary enforcement tools to ensure compliance with the law.

How is your office funded?

The Dutch DPA is funded through the government who, together with the parliament, each year determines the budget for the next year. The budget is drafted on the basis of a proposal from the Dutch DPA.

What is the organizational structure of your office and the responsibilities of the key executives?

The Dutch DPA consists of a college of commissioners and the supporting Secretariat, itself consisting of 6 departments and headed by the Director. The Dutch DPA has 2 supervision departments, one for the private and one for the public sector, a legal department, a communications department, an international department and a department providing the operational support.

If India creates a  framework of co-regulation, how would you suggest the overseeing body be structured?

Considering the many differences between India and the Netherlands - and Europe - this is a very hard question to answer. But whatever construction is chosen in India, it is of utmost importance to guarantee the independence of the supervisory authorit(y)(ies), who shall be provided with sufficient and scalable powers to be able to sanction violations.

What legal challenges has your office faced?

The biggest legal challenge we face at the moment is the new European legal framework currently being discussed. It is as yet uncertain whether and when this will enter into force, but it is clear that it will bring new challenges for our office.

What are the main differences between your offices?

Generally, I think that the differences between my office and the UK and Canadian offices mostly stem from our different legal and cultural backgrounds, especially the difference between the common law and codified law systems.

In addition, the norms and powers differ per supervisory authority. The Dutch DPA for example can enter a building without prior notice, while the ICO, if I understand correctly, can only enter with the consent of the supervised organisation.

I however prefer to look at the similarities and possibilities to overcome our differences, because I think that we all feel that providing a high level of data protection and ensuring user control are all of our main priorities.

Naturally, I am very curious to hear from Christopher and Chantal as well.

What are the most recent privacy developments for each of your respective offices?

The technological developments of the past decades and the increasing use of smartphones and tablets, have also made privacy developments necessary and have obliged us, as data protection authorities, to consider the rules and norms in this new environment.

What would you broadly recommend for a privacy legislation for India?

In my view the privacy legislation in India should in any case contain the basic principles of the protection of personal data, applicable to both the public and the private sector. Naturally with some exceptions for law enforcement purposes.

Furthermore, the Indian law should protect the imported data of citizens from other parts of the world as well, including the EU.

And as mentioned in my answer to question 5, it is of utmost importance that the Indian legislation guarantees the establishment of (a) completely independent supervisory authorit(y)(ies), provided with sufficient sanctioning powers, to supervise compliance with the legislation also of the government, including police and justice.

What India can Learn from the Snowden Revelations

by Elonnai Hickok last modified Oct 25, 2013 07:29 AM
Big Brother is watching, across cyberspace and international borders. Meanwhile, the Indian government has few safeguards in theory and fewer in practice. There’s no telling how prevalent or extensive Indian surveillance really is.

The title of the article was changed in the version published by Yahoo on October 23, 2013.


Since the ‘Snowden revelations’, which uncovered the United States government’s massive global surveillance through the PRISM program, there have been reactions aplenty to their impact.

The Snowden revelations highlighted the issue of human rights in the context of the existing cross-border and jurisdictional nightmare: the data of foreign citizens surveilled and harvested by agencies such as the National Security Agency through programs such as PRISM is not subject to the protections found in the laws of the surveilling country. Thus, the US government has the right to access and use the data, but has no responsibility for how the data will be used or for respecting the rights of the people from whom the data was harvested.

The Snowden revelations demonstrated that the biggest global surveillance efforts are now being conducted by democratically elected governments – institutions of the people, by the people, for the people – that are increasingly becoming suspicious of all people.

Adding irony to this worrying trend, Snowden sought asylum from many of the most repressive regimes: this dynamic speaks to the state of society today. The Snowden revelations also demonstrate how government surveillance is shifting from targeted surveillance, warranted for a specific reason and towards a specified individual, to blanket surveillance where security agencies monitor and filter massive amounts of information.

This is happening with few checks and balances for cross-border and domestic surveillance in place, and even fewer forms of redress for the individual. This is true for many governments, including India.

India’s reaction

After the first news of the Snowden revelations, the Indian Supreme Court agreed to hear a Public Interest Litigation requesting that foreign companies that shared the information with US security agencies be held accountable for the disclosure. In response to the PIL, the Supreme Court stated it did not have jurisdiction over the US government.

The response of the Supreme Court of India demonstrates the potency of jurisdiction in today’s global information economy in the context of governmental surveillance. Despite being upset at the actions of America’s National Security Agency (NSA), there is little direct legal action that any government or individual can take against the US government or companies incorporated there.

In the PIL, the demand that companies be held responsible is interesting and representative of a global debate, as it implies that in the context of governmental surveillance, companies have a responsibility to actively evaluate and reject or accept governmental surveillance requests. Although I do not disagree with this as a principle, in reality, this evaluation is a difficult step for companies to take.

For example, in India, under Section 69 of the Information Technology Act, 2000, service providers are penalized with up to seven years in prison for non-compliance with a governmental request for surveillance. The incentives for companies to actually reject governmental requests are minimal, but one factor that could possibly push companies to become more pronounced in their resistance to installing backdoors for the government and complying with governmental surveillance requests is market pressure from consumers.

To a certain extent, this has already started to happen. Companies such as Facebook, Yahoo and Google have created ‘transparency reports’ that provide – at different granularities – information about governmental requests and the company’s compliance or rejection of the same.

In India, P. Rajeev, Member of Parliament from Kerala, has started a petition asking that the companies disclose information on Indian data given to US security agencies. Although transparency by complying companies does not translate directly into regulation of surveillance, it allows the customer to make informed choices and decide whether a company’s level of compliance with governmental requests will impact his/her use of that service.

The PIL also called for the establishment of Indian servers to protect the privacy of Indian data. This solution has been voiced by many, including government officials. Though the creation of domestic servers would ensure that the US government does not have direct and unfettered access to Indian data, as it would require that foreign governments access Indian information through a formal Mutual Legal Assistance Treaty process, it does not necessarily enhance the privacy of Indian data.

As a note, India has MLATs with 34 countries. If domestic servers were established, the information would be subject to Indian laws and regulations.

Snooping

The Snowden Revelations are not the first instance to spark a discussion on domestic servers by the Government of India.

For example, in the back-and-forth between the Indian government and the Canadian company RIM, now BlackBerry, the company eventually set up servers in Mumbai and provided a lawful interception solution that satisfied the Indian government. The Indian government made similar demands from Skype and Google. In these instances, the domestic servers were meant to facilitate greater surveillance by Indian law enforcement agencies.

Currently in India there are a number of ways in which the government can legally track data online and offline. For example, the interception of telephonic communications is regulated by the Indian Telegraph Act, 1885, and relies on an order from the Secretary to the Ministry of Home Affairs. Interception, decryption, and monitoring of digital communications are governed by Section 69 of the Information Technology Act, 2000 and again rely on the order of the executive.

The collection and monitoring of traffic data is governed by Section 69B of the Information Technology Act and relies on the order of the Secretary to the government of India in the Department of Information Technology. Access to stored data, on the other hand, is regulated by Section 91 of the Code of Criminal Procedure and permits access on the authorization of an officer in charge of a police station.

The gaps in the Indian surveillance regime are many and begin with a lack of enforcement and harmonization of existing safeguards and protocols. Presently, India is in the process of formulating privacy legislation.

In 2012, a committee chaired by Justice AP Shah (of which the Center for Internet and Society was a member) wrote The Report of the Group of Experts on Privacy, which laid out nine national privacy principles meant to be applied to different legislation and sectors – including Indian provisions on surveillance.

The creation of domestic servers is just one example of how the Indian government has been seeking greater access to information flowing within its borders. New requirements for Indian service providers and the creation of projects that go beyond the legal limits of governmental surveillance in India enable greater access to details about an individual on a real-time and blanket basis.

For example, telecoms in India are now required to include user location data as part of the ‘call detail record’ and be able to provide the same to law enforcement agencies on request under provisions in the Unified Access Service and Internet Service Provider Licenses.

At the same time, the Government of India is in the process of putting in place a Central Monitoring System that would provide Indian security agencies the ability to directly intercept communications, bypassing the service provider.

Even if the Central Monitoring System were to adhere to the legal safeguards and procedures defined under the Indian Telegraph Act and the Information Technology Act, it could only do so partially, as both laws create a clear chain of custody that the government and service providers must follow – that is, the service provider is an integral component of the interception process.

If the Indian government implements the Central Monitoring System, it could remove governmental surveillance completely from the public eye. Bypassing the service provider allows the government to fully determine how much the public knows about surveillance. It also removes the market and any pressure that consumers could exert from insight provided by companies on the surveillance requests that they are facing.

Though the Indian government could (and should) be transparent about the amount and type of surveillance it is undertaking, currently there is no legal requirement for the government of India to disclose this information, and security agencies are exempt from the Right to Information Act. Thus, unless India has a Snowden somewhere in the apparatus, the Indian public cannot hope to get an idea of how prevalent or extensive Indian surveillance really is.

Policy vacuum

For any government, the surveillance of its citizens, to some degree, might be necessary. But the Snowden revelations demonstrate that there is a vacuum when it comes to surveillance policy and practices. This vacuum has permitted draconian measures of surveillance to take place and created an environment of mistrust between citizens and governments across the globe.

When governments undertake surveillance, it is critical that the purpose, necessity and legality of monitoring, and the use of the material collected are built into the regime to ensure it does not violate the human rights of the people surveilled, foreign or domestic.

In 2013, the International Principles on the Application of Human Rights to Communications Surveillance were drafted, in part, to address this vacuum. The principles seek to explain how international human rights law applies to surveillance of communications in the current digital and technological environment. They define safeguards to ensure that human rights are protected and upheld when governments undertake surveillance of communications.

When the Indian surveillance regime is measured against these principles, it appears to miss a number of them, and does not fully meet several others. In the context of surveillance projects like the Central Monitoring System, and in order to avoid an Indian version of the PRISM program, India should take into consideration the safeguards defined in the principles and strengthen its surveillance regime to ensure not only the protection of human rights in the context of surveillance, but to also establish trust in its surveillance regime and practices with other countries.


Elonnai Hickok is the Program Manager for Internet Governance at the Centre for Internet and Society, and leads its research on privacy.

(IMDEC) 2013

by Prasad Krishna last modified Oct 25, 2013 06:09 AM

PDF document icon Proposed-Program-IMDEC.pdf — PDF document, 120 kB (122998 bytes)

Mapping Digital Media Background Note

by Prasad Krishna last modified Oct 25, 2013 09:14 AM

PDF document icon Background note_MDM.pdf — PDF document, 447 kB (458684 bytes)

MDM Invite Poster

by Prasad Krishna last modified Oct 25, 2013 09:22 AM

PDF document icon MDM Invite_Poster.pdf — PDF document, 1104 kB (1130749 bytes)

MDM Press Invite

by Prasad Krishna last modified Oct 25, 2013 09:24 AM

PDF document icon MDM_Press Invite.pdf — PDF document, 775 kB (794198 bytes)

MDM Digital Media Press Release

by Prasad Krishna last modified Oct 25, 2013 09:30 AM

PDF document icon Press release_MDM Public Consultation (1).pdf — PDF document, 216 kB (221365 bytes)

Spy Files 3: WikiLeaks Sheds More Light On The Global Surveillance Industry

by Maria Xynou last modified Nov 14, 2013 04:21 PM
In this article, Maria Xynou looks at WikiLeaks' latest Spy Files and examines the legality of India's surveillance technologies, as well as their potential connection with India's Central Monitoring System (CMS) and implications on human rights.
Spy Files 3: WikiLeaks Sheds More Light On The Global Surveillance Industry

by RamyRaoof on flickr

Last month, WikiLeaks released “Spy Files 3”, a mass exposure of the global surveillance trade and industry. WikiLeaks first released the Spy Files – brochures, presentations, marketing videos and technical specifications on the global trade of surveillance technologies – in December 2011. Spy Files 3 supplements this with 294 additional documents from 92 global intelligence contractors.

So what do the latest Spy Files reveal about India?

When we think about India, the first issues that probably come to mind are poverty and corruption, while surveillance appears to be a more “Western” and elitist issue. However, while many other developing countries are excluded from WikiLeaks’ list of surveillance technology companies, India is once again on the list with some of the most controversial spyware.

ISS World Surveillance Trade Shows

The latest Spy Files include a brochure of ISS World 2013 - the so-called "wiretapper's ball" - the world's largest surveillance trade show. This year's ISS World Asia will take place in Malaysia during the first week of December, and law enforcement agencies from around the world will have another opportunity to view and purchase the latest surveillance technology. The leaked ISS World 2013 brochure includes a list of last year's global attendees. According to the brochure, 53% of the attendees were law enforcement agencies and individuals from the defense, public safety and interior security sectors, 41% were ISS vendors and technology integrators, while only 6% were telecom operators and private enterprises. The brochure boasts that 4,635 individuals from 110 countries attended the ISS World trade shows last year and that attendance is increasing.

The following table lists the Indian attendees at last year's ISS World:

Law Enforcement, Defense and Interior Security Attendees | Telecom Operators and Private Enterprises Attendees | ISS Vendors and Technology Integrators Attendees
Andhra Pradesh India Police | BT | AGC Networks
CBI Academy | Cogence Investment Bank | Aqsacom India
Government of India, Telecom Department | India Reliance Communications | ClearTrail Technologies
India Cabinet Secretariat | Span Telecom Pvt. Ltd. | Foundation Technologies
India Centre for Development of Telematics (C-DOT) | | Kommlabs
India Chandigarh Police | | Paladion Networks
India Defence Agency | | Polaris Wireless
India General Police | | Polixel Security Systems
India Intelligence Department | | Pyramid Cyber Security
India National Institute of Criminology | | Schleicher Group
India office LOKAYUKTA NCT DELHI | | Span Technologies
India Police Department, A.P. | | TATA India
India Tamil Nadu Police Department | | Tata Consultancy Services
Indian Police Service, Vigilance | | Telecommunications India
Indian Telecommunications Authority | | Vehere Interactive
NTRO India | |
SAIC Indian Tamil Nadu Police | |
Total: 17 | 4 | 15

According to the above table - which is based on data from the ISS World 2013 brochure released by WikiLeaks - the majority of Indian attendees at last year's ISS World were from the law enforcement, defense and interior security sectors. Fifteen Indian companies exhibited and sold their surveillance technologies to law enforcement agencies from around the world, and it is notable that India's popular ISP, Reliance Communications, attended the trade show too.

In addition to the ISS World 2013 brochure, Spy Files 3 also include a detailed brochure of a major Indian surveillance technology company: ClearTrail Technologies.

ClearTrail Technologies

ClearTrail Technologies is an Indian company based in Indore. The document titled Internet Monitoring Suite from ClearTrail Technologies boasts about the company’s mass monitoring, deep packet inspection, COMINT, SIGINT, tactical Internet monitoring, network recording and lawful interception technologies. ClearTrail’s Internet Monitoring Suite includes the following products:

1. ComTrail: Mass Monitoring of IP and Voice Networks

ComTrail is an integrated product suite for centralized interception and monitoring of voice and data networks. It is equipped with an advanced analysis engine for proactive analysis of thousands of connections and is integrated with various tools, such as Link Analysis, Voice Recognition and Target Location.

ComTrail is deployed within a service provider network, and its monitoring function correlates voice and data intercepts across diverse networks to provide a comprehensive intelligence picture. ComTrail supports the capture, recording and replay of virtually any type of voice and IP communication, including - but not limited to - Gmail, Yahoo, Hotmail, BlackBerry, ICQ and GSM voice calls.

Additionally, ComTrail intercepts data from any type of network - whether wireless, packet data, wireline or VoIP - and can decode hundreds of protocols and P2P applications, including HTTP, instant messengers, webmails, VoIP calls and MMS.

In short, ComTrail’s key features include the following:

- Equipped to handle millions of communications per day intercepted over high speed STM & Ethernet Links

- Doubles up as Targeted Monitoring System

- On demand data retention, capacity exceeding several years

- Instant Analysis across thousands of Terabytes

- Correlates Identities across multiple networks

- Speaker Recognition and Target Location

2. xTrail: Targeted IP Monitoring

xTrail is a solution for the interception, decoding and analysis of high-speed data traffic over IP networks, and it independently monitors ISP, GPRS and 3G networks. xTrail has been designed so that it can be deployed within minutes and enables law enforcement agencies to intercept and monitor targeted communications without degrading the service quality of the IP network. The product is capable of intercepting all types of networks - including wireline, wireless, cable, VoIP and VSAT networks - and acts as a black box for the "record and replay" of targeted Internet communications.

Interestingly enough, xTrail can filter based on a "pure keyword", a URL/domain with a keyword, an IP address, a mobile number or even just a user identity, such as an email ID, chat ID or VoIP ID. Furthermore, xTrail can be integrated with link analysis tools and can export data in a digital format which can allegedly be presented in court as evidence.

In short, xTrail’s key features include the following:

- Pure passive probe

- Designed for rapid field operations at ISP/GPRS/Wi-Max/VSAT Network Gateways

- Stand-alone solution for interception, decoding and analysis of multi Gigabit IP traffic

- Portable, trolley-based design for simplified logistics; can easily be deployed at and removed from any network location

- Huge data retention, rich analysis interface and tamper-proof court evidence

- Easily integrates with any existing centralized monitoring system for extended coverage

3. QuickTrail: Tactical Wi-Fi Monitoring

Some of the biggest IP monitoring challenges that law enforcement agencies face include cases when targets operate from public Internet networks and/or use encryption.

QuickTrail is a device designed to gather intelligence from public Internet networks when a target is operating from a cyber cafe, a hotel, a university campus or a free Wi-Fi zone. In particular, QuickTrail is equipped with multiple monitoring tools and techniques that can help intercept almost any wired, Wi-Fi or hybrid Internet network so that a target's communications can be monitored. QuickTrail can be deployed within seconds to intercept, reconstruct, replay and analyze the email, chat, VoIP and other Internet activities of a target. The device supports real-time monitoring and wiretapping of Ethernet LANs.

According to ClearTrail's brochure, QuickTrail is an "all-in-one" device which can intercept secured communications, obtain passwords with a "c-Jack" attack, alert on the activities of a target, support active and passive interception of Wi-Fi and wired LANs, and capture, reconstruct and replay traffic. It is noteworthy that QuickTrail can identify a target machine on the basis of an IP address, MAC ID, machine name, activity status and several other parameters. In addition, QuickTrail supports protocol decoding, including HTTP, SMTP, POP3 and HTTPS. The device also enables the remote and central management of field operations at geographically dispersed locations.

In short, QuickTrail’s key features include the following:

- Conveniently housed in a laptop computer

- Intercepts Wi-Fi and wired LANs in five different ways

- Breaks WEP, WPA/WPA2 to rip-off secured Wi-Fi networks

- Deploys spyware into a target’s machine

- Monitors Gmail, Yahoo and all other HTTPS-based communications

- Reconstructs webmails, chats, VoIP calls, news groups and social networks

4. mTrail: Off-The-Air Interception

mTrail offers active and passive "off-the-air" interception of GSM 900/1800/1900 MHz phone calls and data to meet law enforcement surveillance and investigation requirements. The mTrail passive interception system works in stealth mode, so there is no dependence on the network operator and the target is unaware that their communications are being intercepted.

The mTrail system has the capability to scale from interception of 2 channels (carrier frequencies) to 32 channels. mTrail can be deployed either in a mobile or fixed mode: in the mobile mode the system is able to fit into a briefcase, while in the fixed mode the system fits in a rack-mount industrial grade chassis.

Target location identification is supported using signal strength and target numbers such as the IMSI, TIMSI, IMEI or MSISDN, which makes it possible to listen to so-called "lawfully intercepted" calls in near real time, as well as to store all calls. Additionally, mTrail supports the interception of targeted calls from pre-defined suspect lists and the monitoring of SMS and protocol information.

In short, mTrail’s key features include the following:

- Designed for passive interception of GSM communications

- Intercepts Voice and SMS “off-the-air”

- Detects the location of the target

- Can be deployed as a fixed unit or mounted in a surveillance van

- No support required from GSM operator

5. Astra: Remote Monitoring and Infection framework

Astra is a remote monitoring and infection framework which incorporates both conventional and proprietary infection methods to ensure bot delivery to the targeted devices. It also offers a varied choice in handling the behavior of bots and ensuring non-traceable payload delivery to the controller.

The conventional methods of infection include physical access to a targeted device through exposed interfaces, such as CD-ROM, DVD and USB ports, as well as the use of social engineering techniques. However, Astra also supports bot deployment without requiring any physical access to the target device.

In particular, Astra can push a bot to any targeted machine sharing the same LAN (wired, Wi-Fi or hybrid). The SEED is a generic bot which can identify a target's location, log keystrokes, capture screenshots, capture the microphone, listen to Skype calls, capture webcams and search the target's browsing history. Additionally, the SEED bot can be remotely activated, deactivated or terminated as and when required. Astra allegedly provides an untraceable reporting mechanism that operates without using any proxies, which rules out the possibility of being traced by the target.

Astra’s key features include the following:

- Proactive intelligence gathering

- End-to-end remote infection and monitoring framework

- Follow the target, beat encryption, listen to in-room conversations, capture keystrokes and screen shots

- Designed for centralized management of thousands of targets

- A wide range of deployment mechanisms to optimize the success ratio

- Non-traceable, non-detectable delivery mechanism

- Intrusive yet stealthy

- Easy interface for handling most complex tasks

- Successfully tested against the current top 10 anti-virus products available in the market

- No third party dependencies

- Free from any back-door intervention

ClearTrail Technologies argues that it meets lawful interception regulatory requirements across the globe. In particular, the company claims that its products are compliant with ETSI and CALEA regulations and that they can also cater to region-specific requirements.

The latest Spy Files also include data on foreign surveillance technology companies operating in India, such as Telesoft Technologies, AGT International and Verint Systems. In particular, Verint Systems has its headquarters in New York and offices all around the world, including Bangalore in India. Founded in 1994 and run by Dan Bodner, Verint Systems produces a wide range of surveillance technologies, including the following:

- Impact 360 Speech Analytics

- Impact 360 Text Analytics

- Nextiva Video Management Software (VMS)

- Nextiva Physical Security Information Management (PSIM)

- Nextiva Network Video Recorders (NVRs)

- Nextiva Video Business Intelligence (VBI)

- Nextiva Surveillance Analytics

- Nextiva IP cameras

- CYBERVISION Network Security

- ENGAGE suite

- FOCAL-INFO (FOCAL-COLLECT & FOCAL-ANALYTICS)

- RELIANT

- STAR-GATE

- VANTAGE

While Verint Systems claims to comply with ETSI, CALEA and other worldwide lawful interception standards and regulations, it remains unclear whether such products actually help law enforcement agencies tackle crime and terrorism without violating individuals' right to privacy and other human rights. After all, Verint Systems has participated in ISS World trade shows, which exhibit some of the most controversial spyware in the world, used both to target individuals and for mass surveillance.

And what do the latest Spy Files mean for India?

Why is it even important to look at the latest Spy Files? Well, for starters, they reveal data about which Indian law enforcement agencies are interested in surveillance and which companies are interested in selling and/or buying the latest spy gear. And why is any of this important? I can think of three main reasons:

1. The Central Monitoring System (CMS)

2. Is any of this surveillance even legal in India?

3. Can such surveillance result in the violation of human rights?

Spy Files 3...and the Central Monitoring System (CMS)

Following the Mumbai 2008 terrorist attacks, the Telecom Enforcement, Resource and Monitoring (TREM) cells and the Centre for Development of Telematics (C-DOT) started preparing the Central Monitoring System (CMS). As of April 2013, this project is being manned by the Intelligence Bureau, while agencies which are planned to have access to it include the Research & Analysis Wing (RAW) and the Central Bureau of Investigation (CBI). ISP and Telecom operators are required to install the gear which enables law enforcement agencies to carry out the Central Monitoring System under the Unified Access Services (UAS) License Agreement.

The Central Monitoring System aims at centrally monitoring all telecommunications and Internet communications in India and its estimated cost is Rs. 4 billion. In addition to equipping government agencies with Direct Electronic Provisioning, filters and alerts on the target numbers, the CMS will also enable Call Data Records (CDR) analysis and data mining to identify personal information of the target numbers. The CMS supplements regional Internet Monitoring Systems, such as that of Assam, by providing a nationwide monitoring of telecommunications and Internet communications, supposedly to assist law enforcement agencies in tackling crime and terrorism.
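
To see why even the "metadata" involved in CDR analysis is sensitive, consider a toy illustration of what Call Data Records alone can reveal. The records below are invented and the analysis is deliberately simplistic; the point is only that relationships and daily routines can be inferred without ever reading the content of a call.

# Toy illustration of how much "harmless" call data records (CDRs) reveal.
# The records below are invented; no call content is needed to infer
# relationships and daily patterns.

from collections import Counter

cdrs = [
    # (caller, callee, hour_of_day, cell_tower)
    ("A", "B", 22, "T-home"),
    ("A", "B", 23, "T-home"),
    ("A", "C", 9,  "T-office"),
    ("A", "B", 22, "T-home"),
    ("A", "D", 13, "T-clinic"),
]

closest_contact = Counter(callee for _, callee, _, _ in cdrs).most_common(1)
night_location = Counter(t for _, _, h, t in cdrs if h >= 21).most_common(1)

print("Most frequent contact:", closest_contact)   # [('B', 3)]
print("Likely home tower:", night_location)        # [('T-home', 3)]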

However, data monitored and collected through the CMS will be stored in a centralised database, which could increase the probability of centralised cyber attacks and thus increase, rather than reduce, threats to national security. Furthermore, basic statistics indicate that the larger the volume of data screened, the larger the number of false matches, which could result in innocent people being charged with crimes they did not commit. And most importantly: the CMS currently lacks adequate legal oversight, which means that it remains unclear how monitored data will be used. The UAS License Agreement regarding the CMS mandates mass surveillance by requiring ISPs and telecom operators to enable the monitoring and interception of communications. However, targeted and mass surveillance through the CMS not only raises serious questions about its legality, but also creates the potential for abuse of the right to privacy and other human rights.
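
A rough, back-of-the-envelope calculation illustrates the false-match problem. The figures below (population screened, number of genuine targets, error rates) are assumptions chosen for illustration, not CMS specifications; the point is that even a very accurate matching system, applied to an entire population, flags far more innocent people than genuine targets.

# Illustrative calculation with assumed figures (not CMS specifications):
# a very accurate matching system still produces huge numbers of false
# positives when applied to an entire population.

population = 1_200_000_000      # approximate number of people screened
true_suspects = 1_000           # assumed number of genuine targets
false_positive_rate = 0.001     # assumed 0.1% false positive rate
true_positive_rate = 0.99       # assumed 99% detection rate

false_alarms = (population - true_suspects) * false_positive_rate
true_hits = true_suspects * true_positive_rate

# Probability that a flagged person is actually a genuine target
precision = true_hits / (true_hits + false_alarms)

print(f"False alarms: {false_alarms:,.0f}")   # ~1.2 million innocent people flagged
print(f"Precision:    {precision:.4%}")       # well below 0.1%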

Interestingly enough, the Indian law enforcement agencies which attended last year's ISS World trade shows are linked to the Central Monitoring System. In particular, last year's law enforcement, defense and interior security attendees include the Centre for Development of Telematics (C-DOT) and the Department of Telecommunications, both of which prepared the Central Monitoring System. The list of attendees also includes India's Intelligence Bureau, which is manning the CMS, as well as the agencies which will have access to the CMS: the Central Bureau of Investigation (CBI), the Research and Analysis Wing (RAW), the National Technical Research Organization (NTRO) and various other state police departments and intelligence agencies.

Furthermore, Spy Files 3 include a list of last year's ISS World security company attendees, which features several Indian companies. Again, interestingly enough, many of these companies may be supplying law enforcement with the technology to carry out the Central Monitoring System. ClearTrail Technologies, in particular, provides solutions for targeted and mass monitoring of IP and voice networks, as well as remote monitoring and infection frameworks - all of which would be well suited to aiding the Central Monitoring System.

In fact, ClearTrail states in its brochure that its ComTrail product is equipped to handle millions of communications per day, while its xTrail product can easily be integrated with any existing centralised monitoring system for extended coverage. And if that's not enough, ClearTrail's Astra is designed for the centralized management of thousands of targets. While there may not be any concrete proof that ClearTrail is indeed aiding the Central Monitoring System, the facts speak for themselves: ClearTrail is an Indian company which sells targeted and mass monitoring products to law enforcement agencies, and the Central Monitoring System is currently being implemented. What are the odds that ClearTrail is not equipping the CMS? And what are the odds that such technology is not being used for other mass electronic surveillance programmes, such as Lawful Intercept and Monitoring (LIM)?

Spy Files 3...and the legality of India’s surveillance technologies

ClearTrail Technologies' brochure - the only document on Indian surveillance technology leaked in the latest Spy Files - states that the company complies with ETSI and CALEA regulations. While it is clear that the company complies with U.S. and European regulations on the interception of communications to attract more customers in the international market, such regulations do not apply within India, which is also part of ClearTrail's market. Notably, ClearTrail does not mention any compliance with Indian regulations in its brochure. So let's have a look at them.

India has five laws which regulate surveillance:

1. The Indian Telegraph Act, 1885

2. The Indian Post Office Act, 1898

3. The Indian Wireless Telegraphy Act, 1933

4. The Code of Criminal Procedure (CrPc), 1973: Section 91

5. The Information Technology (Amendment) Act, 2008

The Indian Post Office Act does not cover electronic communications, and the Indian Wireless Telegraphy Act lacks procedures which would determine whether surveillance should be targeted or not. Neither the Indian Telegraph Act nor the Information Technology (Amendment) Act covers mass surveillance; both are limited to targeted surveillance. Moreover, targeted interception in India under these laws requires case-by-case authorization by either the home secretary or the secretary of the department of information technology. In other words, unauthorized, limitless mass surveillance is not technically permitted by law in India.

The Indian Telegraph Act mandates that the interception of communications can only be carried out on account of a public emergency or for public safety. However, in 2008, the Information Technology Act copied most of the interception provisions of the Indian Telegraph Act, but removed the preconditions of public emergency or public safety, and instead expanded the power of the government to order interception for the “investigation of any offense”.

The interception of Internet communications is mainly covered by the 2009 Rules framed under the Information Technology (Amendment) Act, 2008, of which Sections 69 and 69B are particularly noteworthy. Under these provisions, an Intelligence Bureau officer who leaks national secrets may be imprisoned for up to three years, while Section 69 not only allows for the interception of any information transmitted through a computer resource, but also requires users to disclose their encryption keys upon request or face a jail sentence of up to seven years.

While these laws allow for the interception of communications and are widely viewed as controversial, they do not technically permit the mass surveillance of communications. In other words, ClearTrail's products, such as ComTrail, which enable the mass interception of IP networks, lack legal backing. However, the Unified Access Services (UAS) License Agreement regarding the Central Monitoring System mandates mass surveillance and requires ISPs and telecom operators to comply.

Through the licenses of the Department of Telecommunications, Internet service providers, cellular providers and telecom operators are required to provide the Government of India with direct access to all communications data and content, even without a warrant, which is not permitted under the laws on interception. These licenses also require cellular providers to use 'bulk encryption' of less than 40 bits, which means that potentially any person can use off-the-air interception to monitor phone calls. However, such licenses do not regulate the capture of signal strength or target numbers such as the IMSI, TIMSI, IMEI or MSISDN, which can be captured through ClearTrail's mTrail product.
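
To put the 40-bit 'bulk encryption' ceiling in perspective, a short calculation shows how small such a keyspace is. The key-testing rate below is an assumed figure for commodity hardware, not a measurement of any particular interception product.

# Illustrative estimate of why a <=40-bit cipher offers little protection.
# The key-testing rate is an assumption, not a measured figure for any
# particular interception product.

key_bits = 40
keys_to_try = 2 ** key_bits            # ~1.1 trillion possible keys
keys_per_second = 1_000_000_000        # assumed: 1 billion keys tested per second

seconds = keys_to_try / keys_per_second
print(f"Keyspace: {keys_to_try:,} keys")
print(f"Exhaustive search at {keys_per_second:,} keys/s: ~{seconds / 60:.0f} minutes")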

More importantly, following allegations in 2011 that the National Technical Research Organization (NTRO) had been using off-the-air interception equipment to snoop on politicians, the Home Ministry issued a directive banning the possession or use of all off-the-air phone interception gear. The Indian Government then asked the Customs Department to provide an inventory of all such equipment imported over a ten-year period, and it was uncovered that as many as 73,000 pieces of equipment had been imported. Since then, the Home Ministry has informed the heads of law enforcement agencies that there is a complete ban on the use of such equipment and that all those who possess it and fail to inform the Government will face prosecution and imprisonment. In short, ClearTrail's mTrail product, which undertakes off-the-air phone monitoring, is illegal, and Indian law enforcement agencies are prohibited from using it.

ClearTrail's Astra product is capable of remote infection and monitoring, and can push a bot to any targeted machine sharing the same LAN. While India's ISP and telecommunications licenses provide some general regulation, they appear inadequate for regulating specific surveillance technologies which can target machines and monitor them remotely. Such licenses mandate mass surveillance, but wireless communications are legally almost entirely unregulated, which raises the question of whether the interception of public Internet networks is allowed. In other words, it is not clear whether ClearTrail's QuickTrail is technically legal. The UAS License Agreement mandates mass surveillance, and while the law does not prohibit it, it does not mandate mass surveillance either. This remains a grey area.

The issue of data retention also arises from ClearTrail's leaked brochure. In particular, ClearTrail states that ComTrail - which undertakes mass monitoring of IP and voice networks - offers on-demand data retention with a capacity exceeding several years, while xTrail - for targeted IP monitoring - can retain huge volumes of data which can potentially be used as evidence in court. However, India currently lacks privacy legislation which would regulate data retention, which means that data collected through ClearTrail's products could potentially be stored indefinitely.

Section 7 of the Information Technology (Amendment) Act, 2008, deals with the retention of electronic records. However, the section does not state a particular data retention period, nor does it specify who will have authorized access to data during its retention, who can authorize such access, whether retained data can be shared with third parties and, if so, under what conditions. Section 7 is thus incredibly vague and fails to regulate data retention adequately.
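
By way of contrast, the sketch below lists the kinds of parameters a well-specified retention rule would have to pin down. The values and role names are hypothetical placeholders, not provisions of the Act; the point is simply that none of these questions are answered by Section 7.

# Hypothetical sketch of the parameters a well-specified retention rule would
# need to define. The values below are placeholders for illustration only and
# do not appear anywhere in Section 7 of the IT (Amendment) Act, 2008.

from datetime import datetime, timedelta

RETENTION_POLICY = {
    "retention_period_days": 180,          # how long records may be kept
    "authorised_roles": {"nodal_officer"}, # who may access retained records
    "third_party_sharing": False,          # whether sharing is permitted at all
    "audit_log_required": True,            # every access must be logged
}

def must_be_deleted(record_created: datetime, policy=RETENTION_POLICY) -> bool:
    """Return True once a record has outlived the stated retention period."""
    age = datetime.utcnow() - record_created
    return age > timedelta(days=policy["retention_period_days"])

# Example: a record created 200 days ago would already be due for deletion.
print(must_be_deleted(datetime.utcnow() - timedelta(days=200)))  # True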

Data retention requirements for service providers are included in the ISP and UASL licenses and, while they clarify the types of data to be retained, they do not specify adequate conditions for data retention. Due to the lack of data protection legislation in India, it remains unclear how long data collected by companies such as ClearTrail would be stored, who would have authorized access to such data during its retention period, and whether such data would be shared with third parties or disclosed and, if so, under what conditions.

India currently lacks specific regulations for the use of these various types of technologies, which makes it unclear whether ClearTrail's spy products are technically legal. It is clear that ClearTrail's mass interception products, such as ComTrail, have no basis in law - since Indian laws allow only for targeted interception - yet they are mandated through the UAS License Agreement regarding the Central Monitoring System.

In short, the legality of ClearTrail's surveillance technologies remains ambiguous. While India's ISP and telecom licenses and the UAS License Agreement mandate mass surveillance, the laws - particularly the 2009 Information Technology Rules - mandate targeted surveillance and remain silent on the issue of mass surveillance. Technically, this makes mass surveillance neither legal nor illegal; it is a grey area. Furthermore, while India's Telegraph Act, Information Technology Act and 2009 Rules allow for the interception, monitoring and decryption of communications and for surveillance in general, they do not explicitly regulate the various types of surveillance technologies, but rather attempt to "legalize" them under the blanket term of surveillance.

One thing is clear: India's license agreements ensure that all ISPs and telecom operators are part of the surveillance regime. The lack of regulations for India's surveillance technologies appears to create a grey zone for the expansion of mass surveillance in the country. According to Saikat Datta, an investigative journalist, a senior telecom official stated:

"Do you really think a private telecom company can stand up to the government or any intelligence agency and cite law if they want to tap someone's phone?"



Spy Files 3...and human rights in India

The facts speak for themselves. The latest Spy Files confirm that the same agencies involved in the development of the Central Monitoring System (CMS) are also interested in the latest surveillance technology sold in the global market. Spy Files 3 also provide data on one of India’s largest surveillance technology companies, ClearTrail, which sells a wide range of surveillance technologies to law enforcement agencies around the world. And Spy Files 3 show us exactly what these technologies can do.

In particular, ClearTrail's ComTrail provides mass monitoring of IP and voice networks, which means that law enforcement agencies using it are capable of intercepting millions of communications every day through Gmail, Yahoo, Hotmail and other services, of correlating our identities across networks and of targeting our location. xTrail enables law enforcement agencies to monitor us based on our "harmless" metadata, such as our IP address, our mobile number and our email ID. Think our data is secure when we use the Internet from a cyber cafe? Well, QuickTrail proves us wrong, as it can assist law enforcement agencies in monitoring and intercepting our communications even when we are using public Internet networks.

And indeed, carrying a mobile phone is like carrying a GPS device, especially since mTrail provides law enforcement with off-the-air interception of mobile communications. Not only can mTrail target our location, listen to our calls and store our data, but it can also undertake passive off-the-air interception and monitor our voice calls, SMS and protocol information. Interestingly enough, mTrail also intercepts targeted calls from a predefined suspect list. The questions that arise, though, are: who is a suspect? How do we even know if we are suspects? In the age of the War on Terror, potentially anyone could be a suspect, and thus potentially anyone's mobile communications could be intercepted. After all, mass surveillance dictates that we are all suspicious until proven innocent.

And if anyone can potentially be a suspect, then potentially anyone can be remotely infected and monitored by Astra. Needing physical access to a targeted device is a conventional surveillance method of the past. Today, Astra can remotely push a bot to our laptops and listen to our Skype calls, capture our webcams, search our browsing history, identify our location and much more. And why is any of this concerning? Because, contrary to mainstream belief, we should all have something to hide!

Privacy protects us from abuse by those in power and safeguards our individuality and autonomy as human beings. If we are opposed to the idea of the police searching our home without a search warrant, we should be opposed to the idea of indiscriminate mass surveillance. After all, mass surveillance - especially the type undertaken by ClearTrail's products - can potentially result in the access, sharing, disclosure and retention of data far more valuable than that acquired by the police searching our home. Our credit card details, our photos, our acquaintances, our personal thoughts and opinions, and other sensitive personal information can usually be found on our laptops, which can constitute far more incriminating information than anything found in our homes.

And most importantly: even if we think that we have nothing to hide, it’s really not up to us to decide: it’s up to data analysts. While we may think that our data is “harmless”, a data analyst linking our data to various other people and search activities we have undertaken might indicate otherwise. Five years ago, a UK student studying Islamic terrorism for his Masters dissertation was detained for six days. The student may not have been a terrorist, but his data said this: “Young, male, Muslim... who is downloading Al-Qaeda’s training material” - and that was enough for him to get detained. Clearly, the data analysts mining his online activity did not care about the fact that the only reason why he was downloading Al-Qaeda material was for his Masters dissertation. The fact that he was a male Muslim downloading terrorist material was incriminating enough.

This incident reveals several concerning points. The first is that he was clearly already under surveillance prior to downloading Al-Qaeda's material. However, given that he did not have a criminal record and was "just a Masters student in the UK", there does not appear to have been any probable cause for his surveillance in the first place. Clearly he was on some suspect list on the premise that he is male and Muslim - which is a discriminatory approach. The second point is that, after this incident, some male Muslims may be more cautious about their online activity, for fear of being on some suspect list and eventually being prosecuted because their data shows that "they're a terrorist". Thus, mass surveillance today also appears to have implications for freedom of expression. The third point is that this incident reveals the extent of mass surveillance, since even a document downloaded by a Masters student is being monitored.

This case shows that innocent people can be placed under surveillance and prosecuted as a result of mass, indiscriminate surveillance. Anyone can potentially be a suspect today, and perhaps for the wrong reasons. It does not matter whether we think our data is "harmless"; what matters is who is looking at our data, when and why. Every bit of data potentially hides several other bits of information which we are not aware of, but which will be revealed through data analysis. We should always have something to hide, as that is the only way to protect ourselves from abuse by those in power.

In the contemporary surveillance state, we are all suspects, and mass surveillance technologies, such as the ones sold by ClearTrail, can pose major threats to our right to privacy, freedom of expression and other human rights. Probably the main reason for this is that surveillance technologies in India fall into a legal grey area. It is therefore recommended that India regulate the various types of surveillance technologies in compliance with the International Principles on the Application of Human Rights to Communications Surveillance.

Spy Files 3 show us why our human rights are in peril and why we should fight for our right to be free from suspicion.

 

This article was cross-posted in Medianama on 6th November 2013.

Re: The Human DNA Profiling Bill, 2012

by Bhairav Acharya last modified Oct 29, 2013 10:00 AM
This short note speaks to legal issues arising from the proposed Human DNA Profiling Bill, 2012 ("DBT Bill"), which was drafted under the aegis of the Department of Biotechnology of the Ministry of Science and Technology, Government of India, and which seeks to collect human DNA samples, profile them and store them. These comments are made clause-by-clause against the DBT Bill.

Note: Clause-by-clause comments on the Working Draft version of April 29, 2012 from the Centre for Internet and Society


  1. This short note speaks to legal issues arising from the proposed Human DNA Profiling Bill, 2012 ("DBT Bill") that was circulated within the Experts Committee constituted under the aegis of the Department of Biotechnology of the Ministry of Science and Technology, Government of India.
  2. This note must be read against the relevant provisions of the DBT Bill and, where indicated, together with the proposed Forensic DNA Profiling (Regulation) Bill, 2013 that was drafted by the Centre for Internet & Society, Bangalore ("CIS Bill"). These comments must also be read alongside the two-page submission titled “A Brief Note on the Forensic DNA Profiling (Regulation) Bill, 2013” ("CIS Note"). Whereas the aforesaid CIS Note raised issues that informed the drafting of the CIS Bill, this present note seeks to provide legal comments on the DBT Bill.
    Preamble
  3. The DBT Bill, in its current working form, lacks a preamble. No doubt, a preamble will be added later once the text of the DBT Bill is finalised. Instead, the DBT Bill contains an introduction. It must be borne in mind that the purpose of the legislation should be spelt out in the preamble since preambular clauses have interpretative value. [See, A. Thangal Kunju Musaliar AIR 1956 SC 246; Burrakur Coal Co. Ltd. AIR 1961 SC 954; and Arnit Das (2000) 5 SCC 488]. Hence, a preamble that states the intent of Parliament to create permissible conditions for DNA source material collection, profiling, retention and forensic use in criminal trials is necessary.
    Objects Clause
  4. An ‘objects clause,’ detailing the intention of the legislature and containing principles to inform the application of a statute, in the main body of the statute is an enforceable mechanism to give directions to a statute and can be a formidable primary aid in statutory interpretation. [See, for example, section 83 of the Patents Act, 1970 that directly informed the Order of the Controller of Patents, Mumbai, in the matter of NATCO Pharma and Bayer Corporation in Compulsory Licence Application No. 1 of 2011.] Therefore, the DBT Bill should incorporate an objects clause that makes clear that (i) the principles of notice, confidentiality, collection limitation, personal autonomy, purpose limitation and data minimisation must be adhered to at all times; (ii) DNA profiles merely estimate the identity of persons, they do not conclusively establish unique identity; (iii) all individuals have a right to privacy that must be continuously weighed against efforts to collect and retain DNA; (iv) centralised databases are inherently dangerous because of the volume of information that is at risk; (v) forensic DNA profiling is intended to have probative value; therefore, if there is any doubt regarding a DNA profile, it should not be received in evidence by a court; (vi) once adduced, the evidence created by a DNA profile is only corroborative and must be treated on par with other biometric evidence such as fingerprint measurements.
    Definitions
  5. The definition of “analytical procedure” in clause 2(1)(a) of the DBT Bill is practically redundant and should be removed. It is used only twice – in clauses 24 and 66(2)(p) which give the DNA Profiling Board the power to frame procedural regulations. In the absence of specifying the content of any analytical procedure, the definition serves no purpose.
  6. The definition of “audit” in clause 2(1)(b) is relevant for measuring the training programmes and laboratory conditions specified in clauses 12(f) and 27. However, the term “audit” is subsequently used in an entirely different manner in Chapter IX which relates to financial information and transparency. This is a conflicting definition. The term “audit” has a well-established use for financial information that does not require a definition. Hence, this definition should be removed.
  7. The definition of “calibration” in clause 2(1)(d) is redundant and should be removed since the term is not meaningfully used in the DBT Bill.
  8. The definition of “DNA Data Bank” in clause 2(1)(h) is unnecessary. The DBT Bill seeks to establish a National DNA Data Bank, State DNA Data Banks and Regional DNA Data Banks vide clause 32. These national, state and regional databases must be defined individually with reference to their establishment clauses. Defining a “DNA Data Bank”, exclusive of the national, state and regional databases, creates the assumption that any private individual can start and maintain a database. This is a drafting error.
  9. The definition of “DNA Data Bank Manager” in clause 2(1)(i) is misleading since, in the text of the DBT Bill, it is only used in relation to the proposed National DNA Data Bank and never in relation to the State and Regional Data Banks. If it is the intention of DBT Bill that only the national database should have a manager, the definition should be renamed to ‘National DNA Data Bank Manager’ and the clause should specifically identify the National DNA Data Bank. This is a drafting error.
  10. The definition of “DNA laboratory” in clause 2(1)(j) should refer to the specific clauses that empower the Central Government and State Governments to license and recognise DNA laboratories. This is a drafting error.
  11. The definition of “DNA profile” in clause 2(1)(l) is too vague. Merely the results of an analysis of a DNA sample may not be sufficient to create an actual DNA profile. Further, the results of the analysis may yield DNA information that, because of incompleteness or lack of information, is inconclusive. These incomplete bits of information should not be recognised as DNA profiles. This definition should be amended to clearly specify the contents of a complete and valid DNA profile that contains, at least, numerical representations of 17 or more loci of short tandem repeats that are sufficient to estimate biometric individuality of a person.
  12. The definition of “forensic material” in clause 2(1)(o) needs to be amended to remove the references to intimate and non-intimate body samples. If the references are retained, then evidence collected from a crime scene, where an intimate or non-intimate collection procedure was obviously not followed, will not fall within the scope of “forensic material”.
  13. The terms “intimate body sample” and “non-intimate body sample” that are defined in clauses 2(1)(q) and 2(1)(v) respectively are not used anywhere outside the definitions clause except for an inconsequential reference to non-intimate body samples only in the rule-making provision of clause 66(2)(zg). “Intimate body sample” is not used anywhere outside the definitions clause. Both these definitions are redundant and should be removed.
  14. The terms “intimate forensic procedure” and “non-intimate forensic procedure”, that are defined in clauses 2(1)(r) and 2(1)(w) respectively, are not used anywhere except for an inconsequential reference of non-intimate forensic procedure in the rule-making provision of clause 66(2)(zg). “Intimate forensic procedure” is not used anywhere outside the definitions clause. Both these definitions are redundant and should be removed.
  15. The term “known samples” that is defined in clause 2(1)(s) is not used anywhere outside the definitions clause and should be removed for redundancy.
  16. The definition of “offender” in clause 2(1)(y) is vague because it does not specify the offences for which an “offender” need be convicted. It is also linked to an unclear definition of the term “undertrial”, which does not specify the nature of pending criminal proceedings and, therefore, could be used to describe simple offences such as, for example, failure to pay an electricity bill, which also attracts criminal penalties.
  17. The term “proficiency testing” that is defined in clause 2(1)(zb) is not used anywhere in the text of the DBT Bill and should be removed.
  18. The definitions of “quality assurance”, “quality manual” and “quality system” serve no enforceable purpose since they are used only in relation to the DNA Profiling Board’s rule-making powers under clauses 18 and 66. Their inclusion in the definitions clause is redundant. Accordingly, these definitions should be removed.
  19. The term “suspect” defined in clause 2(1)(zi) is vague and imprecise. The standard by which suspicion is to be measured, and by whom suspicion may be entertained – whether police or others, has not been specified. The term “suspect” is not defined in either the Code of Criminal Procedure, 1973 ("CrPC") or the Indian Penal Code, 1860 ("IPC").
    The DNA Profiling Board
  20. Clause 3 of the DBT Bill, which provides for the establishment of the DNA Profiling Board, contains a sub-clause (2) which vests the Board with corporate identity. This vesting of legal personality in the DNA Profiling Board – when other boards and authorities, even ministries and independent departments, and even the armed forces do not enjoy this function – is ill-advised and made without sufficient thought. Bodies corporate may be corporations sole – such as the President of India, or corporations aggregate – such as companies. The intent of corporate identity is to create a fictional legal personality where none previously existed in order for the fictional legal personality to exist apart from its members, enjoy perpetual succession and to sue in its own legal name. Article 300 of the Constitution of India vests the Central Government with legal personality in the legal name of the Union of India and the State Governments with legal personality in the legal names of their respective states. Apart from this constitutional dispensation, some regulatory authorities, such as the Telecom Regulatory Authority of India ("TRAI") and the Securities and Exchange Board of India ("SEBI"), have been individually vested with legal personalities as bodies corporate to enable their autonomous governance and independent functioning, and to secure their ability to freely, fairly and impartially regulate the market free from governmental or private collusion. Similarly, some overarching national commissions, such as the Election Commission of India and the National Human Rights Commission ("NHRC"), have been vested with the power to sue and be sued in their own names. In comparison, the DNA Profiling Board is neither an independent market regulator nor an overarching national commission with judicial powers. There is no legal reason for it to be vested with a legal personality on par with the Central Government or a company. Therefore, clause 3(2) should be removed.
  21. The size and composition of the Board that is staffed under clause 4 is extremely large. Creating unwieldy and top-heavy bureaucratic authorities and investing them with regulatory powers, including the powers of licensing, is avoidable. The DBT Bill proposes to create a Board of 16 members, most of them from a scientific background and including a few policemen and one legal administrator. In its present form, the Board is larger than many High Courts but does not have a single legal member able to conduct licensing. Drawing from the experiences of other administrative and regulatory bodies in India, the size of the Board should be drastically reduced to no more than five members, at least half of whom should be lawyers or ex-judges. The change in the legal composition of the Board is necessary because the DBT Bill contemplates that it will perform the legal function of licensing that must obey basic tenets of administrative law. The current membership may be viable only if the Board is divested of its administrative and regulatory powers and left with only scientific advice functions. Moreover, stacking the Board with scientists and policemen appears to ignore the perils that DNA collection and retention pose to the privacy of ordinary citizens and their criminal law rights. The Board should have adequate representation from the human rights community – both institutional (e.g. NHRC and the State Human Rights Commissions) and non-institutional (well-regarded and experienced human rights activists). The Board should also have privacy advocates.
  22. Clauses 5(2) and 5(3) establish an unequal hierarchy within the Board by privileging some members with longer terms than others. There is no good reason for why the Vice-Chancellor of a National Law University, the Director General of Police of a State, the Director of a Central Forensic Science Laboratory and the Director of a State Forensic Science Laboratory should serve membership terms on the Board that are longer than those of molecular biologists, population geneticists and other scientists. Such artificial hierarchies should be removed at the outset. The Board should have one pre-eminent chairperson and other equal members with equal terms.
  23. The Chairperson of the Board, who is first mentioned in clause 5(1), has not been duly and properly appointed. Clause 4 should be modified to mention the appointment of the Chairperson and other Members.
  24. Clause 7 deals with the issue of conflict of interest in narrow cases. The clause requires members to react on a case-by-case basis to the business of the Board by recusing themselves from deliberations and voting where necessary. Instead, it may be more appropriate to require members to make full and public disclosures of their real and potential conflicts of interest, and then to grant the Chairperson the power to prevent such members from voting on interested matters. Failure to follow these anti-collusion and anti-corruption safeguards should attract criminal penalties.
  25. Clause 10 anticipates the appointment of a Chief Executive Officer of the Board who shall be a serving Joint Secretary to the Central Government. Clause 10(3) further requires this officer to be a scientist. This may not be possible because the administrative hierarchy of the Central Government may not contain a genetic scientist.
  26. The functions of the Board specified in clause 12 are overbroad. Advising ministries, facilitating governments, recommending the size of funds and so on – these are administrative and governance functions best left to the executive. Once the Board is modified to have sufficient legal and human rights representation, then the functions of the Board can non-controversially include licensing, developing standards and norms, safeguarding privacy and other rights, ensuring public transparency, promoting information and debate and a few other limited functions necessary for a regulatory authority.
    DNA Laboratories
  27. The provisions of Chapters V and VI may be simplified and merged.
    DNA Data Banks
  28. The creation of multiple indices in clause 32(4) cannot be justified and must be removed. The collection of biological source material is an invasion of privacy that must be conducted only in strict conditions when the potential harm to individuals is outweighed by the public good. This balance may only be struck when dealing with the collection and profiling of samples from certain categories of offenders. The implications of collecting and profiling DNA samples from corpses, suspects, missing persons and others are vast and have either not been properly understood or deliberately ignored. At this moment, the forcible collection of biological source material should be restricted to the categories of offenders mentioned in the Identification of Prisoners Act, 1920 ("Prisoners Act") with a suitable addition for persons arrested in connection with certain specified terrorism-related offences. Therefore, databases should contain only an offenders’ index and a crime scene index.
  29. Clause 32(6), which requires the names of individuals to be connected to their profiles, and hence accessible to persons connected with the database, should be removed. DNA profiles, once developed, should be anonymised and retained separate from the names of their owners.
  30. Clause 36, which allows international disclosures of DNA profiles of Indians, should be removed immediately. Whereas an Indian may have legal remedies against the National DNA Data Bank, he/she certainly will not be able to enforce any rights against a foreign government or entity. This provision will be misused to rendition DNA profiles abroad for activities not permitted in India. Similarly, as in data protection regimes around the world, DNA profiles should remain within jurisdictions with high privacy and other legal standards.
    Use
  31. The only legitimate purpose for which DNA profiles may be used is for establishing the identity of individuals in criminal trials and confirming their presence or absence from a certain location. Accordingly, clauses 39 and 40 should be re-drafted to specify this sole forensic purpose and also specify the manner in which DNA profiles may be received in evidence. For more information on this point, see the relevant provisions of the CIS Note and the CIS Bill.
  32. The disclosure of DNA profiles should only take place to a law enforcement agency conducting a valid investigation into certain offences and to courts currently trying the individuals to whom the DNA profiles pertain. All other disclosures of DNA profiles should be made illegal. Non-consensual disclosure of DNA profiles for the study of population genetics should specifically be made illegal. The DBT Bill does not prescribe stringent criminal penalties and other mechanisms to affix individual liability on individual scientists and research institutions for improper use of DNA profiles; it is therefore open to the criticism that it seeks to sacrifice individual rights of persons, including the fundamental right to privacy, without parallel remedies and penalties. Clause 40 should be removed in entirety.
  33. Clause 43 should be removed in entirety. This note does not contemplate the retention of DNA profiles of suspects and victims, except as derived from a crime scene.
  34. Clause 45 sets out a post-conviction right related to criminal procedure and evidence. This would fundamentally alter the nature of India’s criminal justice system, which currently does not contain specific provisions for post-conviction testing rights. However, courts may re-try cases in certain narrow circumstances when fresh evidence is brought forth that has a nexus to the evidence upon which the person was convicted and if it can be proved that the fresh evidence was not earlier adduced due to bias. Any other fresh evidence that may be uncovered cannot prompt a new trial. Clause 45 is implicated by Article 20(2) of the Constitution of India and by section 300 of the CrPC. The principle of autrefois acquit that informs section 300 of the CrPC specifically deals with exceptions to the rule against double jeopardy that permit re-trials. [See, for instance, Sangeeta Mahendrabhai Patel (2012) 7 SCC 721].

Concerns Regarding DNA Law

by Bhairav Acharya last modified Oct 29, 2013 10:09 AM
A long government process to draft a law to permit the collection, processing, profiling, use and storage of human DNA is nearing conclusion. There are several concerns with this government effort. Below, we present broad-level issues to be kept in mind while dealing with DNA law.

Background

The Department of Biotechnology released, on 29 April 2012, a working draft of a proposed Human DNA Profiling Bill, 2012 ("DBT Bill") for public comments. The draft reveals an effort to (i) permit the collection of human blood, tissue and other samples for the purpose of creating DNA profiles, (ii) license private laboratories that create and store the profiles, (iii) store the DNA samples and profiles in various large databanks in a number of indices, and (iv) permit the use of the completed DNA profiles in scientific research and law enforcement. The regulation of human DNA profiling is of significant importance to the efficacy of law enforcement and the criminal justice system, and correspondingly has a deep impact on the freedom of ordinary citizens from profiling and monitoring. Below, we highlight five important concerns to bear in mind before drafting and implementing DNA legislation.

Primary Issues

Purpose of DNA Profiling

DNA profiling serves two broad purposes: (i) forensic – to establish the unique identity of a person in the criminal justice system; and (ii) research – to understand human genetics and its contribution to anthropology, biology and other sciences. These two purposes take very different approaches to DNA profiling, and the issues and concerns attendant on them vary accordingly. Forensic DNA profiling is undertaken to afford either party in a criminal trial a better possibility of adducing corroborative evidence to prosecute, or to defend, an alleged offence. DNA, like fingerprints, is a biometric estimation of the individuality of a person. By itself, in the same manner that fingerprint evidence is only proof of the presence of a person at a particular place and not proof of the commission of a crime, DNA is merely corroborative evidence and cannot, on its own strength, result in a conviction or acquittal of an offence. Therefore, DNA and fingerprints, and the process by which they are collected and used as evidence, should be broadly similar.
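
A simple calculation illustrates why the number of loci examined matters for this "estimation of individuality". The per-locus match probability used below is an assumed round figure, not a real population-genetics value; the point is only that discriminating power grows multiplicatively with each independent locus, which is why profiling standards typically require a large number of loci.

# Illustrative only: assumes a 1-in-10 chance that two unrelated people match
# at any single locus (a made-up round figure, not a forensic statistic).
# Independent loci multiply, so the combined random match probability shrinks fast.

per_locus_one_in = 10   # assumed per-locus match odds

for loci in (5, 10, 17):
    combined_one_in = per_locus_one_in ** loci
    print(f"{loci:2d} loci: random match probability ~ 1 in {combined_one_in:,}")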

Procedural Integrity

Forensic DNA profiling results from biological source material that is usually collected from crime scenes or forcibly from offenders and convicts. Biological source material found at a crime scene is very rarely non-contaminated and the procedure by which it is collected and its integrity ensured is of primary legislative importance. To avoid the danger of contaminated crime scene evidence being introduced in the criminal justice system to pervert the course of justice, it is crucial to ensure that DNA is collected only from intact human cells and not from compromised genetic material. Therefore, if the biological source material found at a crime scene does not contain at least one intact human cell, the whole of the biological source material should be destroyed to prevent the possibility of compromised genetic material being collected to yield inconclusive results. Adherence to this basic principle will obviate the possibility of partial matches of DNA profiles and the resulting controversy and confusion that ensues.

Conditions of Collection

In India, the taking of fingerprints is chiefly governed by the Identification of Prisoners Act, 1920 ("Prisoners Act") and section 73 of the Indian Evidence Act, 1872 ("Evidence Act"). The Prisoners Act permits the forcible taking of fingerprints from convicts and suspects in certain conditions. The Evidence Act, in addition, permits courts to require the taking of fingerprints for the forensic purpose of establishing unique identity in a criminal trial. No provisions exist for consensual taking of fingerprints, presumably because of the danger of self-incrimination and general privacy concerns. Since, as discussed earlier, fingerprints and DNA are biometric measurements that should be treated equally to the extent possible, the conditions for the collection of DNA should be similar to those for the taking of fingerprints. Accordingly, there should be no legal provisions that enable other kinds of collection, including from volunteers and innocent people.

Retention of DNA

As a general rule applicable in India, the retention of biometric measurements must be supported by a clear purpose that is legitimate, judicially sanctioned and transparent. The Prisoners Act, which permits the forcible taking of fingerprints from convicts, also mandates the destruction of these fingerprints when the person is acquitted or discharged. The indefinite retention of people's biometric measurements is dangerous, susceptible to abuse and invasive of civil rights. Therefore, DNA profiles lawfully collected from crime scenes and offenders must be retained in strictly controlled databases with highly restricted access, for the forensic purpose of law enforcement only. DNA should not be held in databases that allow non-forensic use. Further, the indices within these databases should be watertight and mutually exclusive.

DNA Laboratories

The process by which DNA profiles are created from biological source material is of critical importance. Because of the evidentiary value of DNA profiles, the laboratories in which these profiles are created must be properly licensed, professionally managed and staffed by competent and impartial personnel. Therefore, the process by which DNA laboratories are licensed and permitted to operate is significant.

Interview with Caspar Bowden - Privacy Advocate and former Chief Privacy Adviser at Microsoft

by Maria Xynou last modified Nov 06, 2013 08:16 AM
Maria Xynou recently interviewed Caspar Bowden, an internationally renowned privacy advocate and former Chief Privacy Adviser at Microsoft. Read this exciting interview and gain an insight on India's UID and CMS schemes, on the export of surveillance technologies, on how we can protect our data in light of mass surveillance and much much more!
Caspar Bowden is an independent advocate for better Internet privacy technology and regulation. He is a specialist in data protection policy, privacy enhancing technology research, identity management and authentication. Until recently he was Chief Privacy Adviser for Microsoft, with particular focus on Europe and regions with horizontal privacy law.
From 1998-2002, he was the director of the Foundation for Information Policy Research (www.fipr.org) and was also an expert adviser to the UK Parliament for the passage of three bills concerning privacy, and was co-organizer of the influential Scrambling for Safety public conferences on UK encryption and surveillance policy. His previous career over two decades ranged from investment banking (proprietary trading risk-management for option arbitrage), to software engineering (graphics engines and cryptography), including work for Goldman Sachs, Microsoft Consulting Services, Acorn, Research Machines, and IBM.
The Centre for Internet and Society interviewed Caspar Bowden on the following questions:

 

1. Do you think India needs privacy legislation? Why / Why not?

 

Well I think it's essential for any modern democracy based on a constitution to now recognise a universal human right to privacy. This isn't something that would necessarily have occurred to the drafters of constitutions before the era of mass electronic communications, but this is now how everyone manages their lives and maintains social relationships at a distance, and therefore there needs to be an entrenched right to privacy – including communications privacy – as part of the core of any modern state.

2. The majority of India's population lives below the poverty line and barely has any Internet access. Is surveillance an elitist issue or should it concern the entire population in the country? Why / Why not?

 

Although the majority of people in India are still living in conditions of poverty and don't have access to the Internet or, in some cases, to any electronic communications, that's changing very rapidly. India has some of the highest growth rates in take-up of both mobile phones and mobile Internet, and so this is spreading very rapidly through all strata of society. It's becoming an essential tool for transacting with business and government, so it's going to be increasingly important to have a privacy law which guarantees rights equally, no matter what anyone's social station or situation. There's also, I think, a sense in which having a right to privacy based on individual rights is much preferable to some sort of communitarian approach to privacy, which has a certain philosophical following; but that model of privacy - that somehow, because of a community benefit, there should also be a sort of community sacrifice in individual rights to privacy - has a number of serious philosophical flaws which we can talk about.

3. "I'm not a terrorist and I have nothing to hide...and thus surveillance can't affect me personally." Please comment.

 

Well, it's hard to know where to begin. Almost everybody in fact has “something to hide”, if you consider all of the social relationships and the way in which you are living your life. It's just not true that there's anybody who literally has nothing to hide and in fact I think that it's rather a dangerous idea, in political culture, to think about imposing that on leaders and politicians. There's an increasing growth of the idea – now, probably coming from America – that political leaders (and even their staff - to get hired in the current White House) should open up their lives, even to the extent of requiring officials to give up their passwords to their social network accounts (presumably so that they can be vetted for sources of potential political embarrassment in their private life). This is a very bad idea because if we only elect leaders, and if we only employ bureaucrats, who do not accord any subjective value to privacy, then it means we will almost literally be electing (philosophical) zombies. And we can't expect our political leaders to respect our privacy rights, if we don't recognise that they have a right to privacy in their own lives also. The main problem with the “nothing to hide, so nothing to fear” mantra is that this is used as a rhetorical tool by authoritarian forces in government and society, who simply wish to take a more paternalistic and protective attitude. This reflects a disillusionment within the “deep state” about how democratic states should function.

Essentially, those who govern us are given a license through elections to exercise power with consent, but this entails no abrogation of a citizen's duty to question authority. Instead, that should be seen as a civic duty - providing the objections are reasonable. People actually know that there are certain things in their lives that they don't wish other people to know, but by indoctrinating the “nothing to hide” ideology, it inculcates a general tendency towards more conformism in society, by inhibiting critical voices.

4. Should people have the right to give up their right to privacy? Why / Why not?

 

In European data protection law there is an obscure provision which is particularly relevant to medical privacy, but almost never used in the area of so-called sensitive personal data, like political views or philosophical views. It is possible currently for European governments to legislate to override the ability of the individual to consent. So this might arise, for example, if a foreign company sets up a service to get people to consent to have their DNA analysed and taken into foreign databases, or generally where people might consent to a big foreign company analysing and capturing their medical records. I think there is a legitimate view that, as a matter of national policy, a government could decide that these activities were threatening to data sovereignty, or that was just bad public policy. For example, if a country has a deeply-rooted social contract that guarantees the ability to access medical care through a national health service, private sector actors could try to undermine that social-solidarity basis for universal provision of health care. So for those sorts of reasons I do think it's defensible for governments to have the ability in those sectors to say: “Yes, there are areas where people should not be able to consent to give up their privacy!”

But then going back to the previous answer, more generally, commercial privacy policies are now so complicated – well, they've always been complicated, but now are mind-blowingly devious as well - people have no real possibility of knowing what they're consenting to. For example, the secondary uses of data flows in social networks are almost incomprehensible, even for technologists at the forefront of research. The French Data Protection authorities are trying to penalize Google for replacing several very complicated privacy policies by one so-called unified policy, which says almost nothing at all. There's no possible way for people to give informed consent to this over-simplified policy, because it doesn't even tell anything useful to an expert. So again in these circumstances, it's right for a regulator to intercede to prevent unfair exploitation of the deceptive kind of “tick-box” consent. Lastly, it is not possible for EU citizens to waive or trade away their basic right to access (or delete) their own data in future, because this seems a reckless act and it cannot be foreseen when this right might become essential in some future circumstances. So in these three senses, I believe it is proper for legislation to be able to prevent the abuse of the concept of consent.

5. Do you agree with India's UID scheme? Why / Why not?

 

There is a valid debate about whether it's useful for a country to have a national identity system of some kind - and there are about three different ways that can be engineered technically. The first way is to centralise all data storage in a massive repository, accessed through remote terminal devices. The second way is a more decentralised approach with a number of different identity databases or systems which can interoperate (or “federate” with each other), with technical and procedural rules to enforce privacy and security safeguards. In general it's probably a better idea to decentralise identity information, because then if there is a big disaster (or cyber-attack) or data loss, you haven't lost everything. The third way is what's called “user-centric identity management”, where the devices (smartphones or computers) citizens use to interact with the system keep the identity information in a totally decentralised way.

Now the obvious objection to that is: “Well, if the data is decentralised and it's an official system, how can we trust that the information in people's possession is authentic?”. Well, you can solve that with cryptography. You can put digital signatures on the data, to show that the data hasn't been altered since it was originally verified. And that's a totally solved problem. However, unfortunately, not very many policy makers understand that and so are easily persuaded that centralization is the most efficient and secure design – but that hasn't been true technically for twenty years. Over that time, cryptographers have refined the techniques (the algorithms can now run comfortably on smartphones) so that user-centric identity management is totally achievable, but policy makers have not generally understood that. There is no technical reason a totally user-centric vision of identity architecture should not be realized. And yet the UID appears to be one of the most centralised large systems ever conceived.
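
[Editorial illustration, not a description of the UID or any deployed system: the sketch below shows the mechanism referred to above, using the third-party pyca/cryptography library with hypothetical names. An issuing authority signs an identity record once; anyone holding the authority's public key can later confirm that a copy held on a citizen's own device has not been altered, with no central database lookup.]

# Hypothetical sketch: digital signatures on decentralised, user-held identity data.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Issuing authority: generate a signing key pair and sign an identity record once.
authority_key = ed25519.Ed25519PrivateKey.generate()
record = b'{"name": "A. Citizen", "year_of_birth": 1980}'   # illustrative record
signature = authority_key.sign(record)

# The citizen keeps (record, signature) on their own device.

# Any verifier with the authority's published public key can detect tampering.
public_key = authority_key.public_key()
try:
    public_key.verify(signature, record)
    print("Record is authentic and unaltered since it was verified.")
except InvalidSignature:
    print("Record has been altered or was not issued by the authority.")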

There are still questions I don't understand about its technical architecture. For example, just creating an identity number by itself doesn't guarantee security, and it's a classic mistake to treat an identifier as an authenticator. In other words, it is very dangerous to use an identifier, or knowledge of an identifier - which could become public information, like the American social security number - as if it were a key to open up a system and give people access to their own private information. So it's not clear to me how the UID system is designed in that respect. It seems that by just quoting back a number, in some circumstances this will be the key to open up the system, to reveal private information, and that is an innately insecure approach. There may be details of the system I don't understand, but I think it's open to criticism on those systemic grounds.

And then more fundamentally, you have to ask what's the purpose of that system in society. You can define a system with a limited number of purposes – which is the better thing to do – and then quite closely specify the legal conditions under which that identity information can be used. It's much more problematic, I think, to try and just say that “we'll be the universal identity system”, and then you just try and find applications for it later. A number of countries tried this approach, for example Belgium around 2000, and they expected that, having created a platform for identity, many applications would follow and tie into the system. This really didn't happen, for a number of social and technical reasons which critics of the design had predicted. I suppose I would have to say that the UID system is almost the antithesis of the way I think identity systems should be designed, which should be based on quite strong technical privacy protection mechanisms - using cryptography - and where, as far as possible, you actually leave the custody of the data with the individual.

Another objection to this user-centric approach is “back-up”: what happens when you lose the primary information and/or your device? Well, you can anticipate that. You can arrange for this information to be backed-up and recovered, but in such a way that the back-up is encrypted, and the recovered copy can easily be checked for authenticity using cryptography.
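
[Editorial illustration of the back-up point above: with authenticated encryption, a recovered back-up either decrypts correctly or is rejected as tampered. A minimal sketch, assuming the pyca/cryptography library and hypothetical names and data.]

from cryptography.fernet import Fernet, InvalidToken

backup_key = Fernet.generate_key()       # kept by the individual, e.g. printed or on a token
fernet = Fernet(backup_key)

identity_data = b'{"name": "A. Citizen", "entitlement": "health card"}'   # illustrative data
encrypted_backup = fernet.encrypt(identity_data)   # safe to hand to any back-up service

# Recovery: decryption restores the data and simultaneously checks its authenticity.
try:
    restored = fernet.decrypt(encrypted_backup)
    print("Back-up recovered and verified:", restored.decode())
except InvalidToken:
    print("Back-up is corrupted or has been tampered with; do not trust it.")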

6. Should Indian citizens be concerned about the Central Monitoring System (CMS)? Why / Why not?


Well, the Central Monitoring System does seem to be an example of very large scale “strategic surveillance”, as it is normally called. Many western countries have had these for a long time, but normally only for international communications. Normally surveillance of domestic communications is done under a particular warrant, which can only be applied one investigation at a time. And it's not clear to me that that is the case with the Central Monitoring System. It seems that this may also be applicable to mass surveillance of communications inside India. Now we're seeing a big controversy in the U.S - particularly at the moment - about the extent to which their international strategic surveillance systems are also able to be used internally. What has happened in the U.S. seems rather deceptive; although the “shell” of the framework of individual protection of rights was left in place, there are actually now so many exemptions when you look in the detail, that an awful lot of Americans' domestic communications are being subjected to this strategic mass surveillance. That is unacceptable in a democracy.

There are reasons why, arguably, it's necessary to have some sort of strategic surveillance in international communications, but what Edward Snowden revealed to us is that in the past few years many countries – the UK, the U.S, and probably also Germany, France and Sweden – have constructed mass surveillance systems which knowingly intrude on domestic communications also. We are living through a transformation in surveillance power, in which the State is becoming more able to monitor and control the population secretively than ever before in history. And it's very worrying that all of these systems appear to have been constructed without the knowledge of Parliaments and without precise legislation. Very few people in government even seem to have understood the true mind-boggling breadth of this new generation of strategic surveillance. And no elections were fought on a manifesto asking “Do people want this or not?”. It's being justified under a counter-terrorism mantra, without very much democratic scrutiny at all. The long term effects of these systems on democracies are really uncharted territory.

We know that we're not in an Orwellian state, but the model is becoming more Kafkaesque. If one knows that this level of intensive and automated surveillance exists, then it has a chilling effect on society. Even if not very much is publicly known about these systems, there is still a background effect that makes people more conformist and less politically active, less prepared to challenge authority. And that's going to be bad for democracy in the medium term – not just the long term.

7. Should surveillance technologies be treated as traditional arms / weapons? If so, should export controls be applied to surveillance technologies? Why / Why not?


Surveillance technologies probably do need to be treated as weapons, but not necessarily as traditional weapons. One probably is going to have to devise new forms of export control, because tangible bombs and guns are physical goods – well, they're not “goods”, they're “bads” - that you can trace by tagging and labelling them, but many of the “new generation” of surveillance weapons are software. It's very difficult to control the proliferation of bits – just as it is with copyrighted material. And I remember when I was working on some of these issues thirteen years ago in the UK – during the so-called crypto wars – that the export of cryptographic software from many countries was prohibited. And there were big test cases about whether the source code of these programs was protected under the US First Amendment, which would prohibit such controls on software code. It was intensely ironic that in order to control the proliferation of cryptography in software, governments seemed to be contemplating the introduction of strategic surveillance systems to detect (among other things) when cryptographic software was being exported. In other words, the kind of surveillance systems which motivated the “cypherpunks” to proselytise cryptography were being introduced (partly) with the perverse justification of preventing the proliferation of that very cryptography!

In the case of the new, very sophisticated software monitoring devices (“Trojans”) which are being implanted into people's computers – yes, this has to be subject to the same sort of human rights controls that we would have applied to the exports of weapon systems to oppressive regimes. But it's quite difficult to know how to do that. You have to tie responsibility to the companies that are producing them, but a simple system of end-user licensing might not work. So we might actually need governments to be much more proactive than they have been in the past with traditional arms export regimes, and do much more to actively follow up on controls after export – checking whether these systems are only being used by the intended countries. As for the law enforcement agencies of democratic countries which are buying these technologies: the big question is whether law enforcement agencies are actually applying effective legal and operational supervision over the use of those systems. So, it's a bit of a mess! And I don't think the attempts that have been made so far to legislate in this area are sufficient.

8. How can individuals protect their data (and themselves) from spyware, such as FinFisher?

 

In democratic countries with a good system of the rule of law and supervision of law enforcement authorities, there have been cases – notably in Germany – where it has turned out that the police, using techniques like FinFisher, have actually disregarded legal requirements from court cases laying down the proper procedures. So I don't think it's good enough to assume that if one was doing ordinary lawful political campaigning, one would not be targeted by these weapons. So it's wise for activists and advocates to think about protecting themselves – of course, other professions as well who look after confidential information – because these techniques may also get into the hands of industrial spies, private detectives and, generally, people who are not subject to even the theoretical constraints of law enforcement agencies.

After Edward Snowden's revelations, we understand that all our computer infrastructure is much more vulnerable – particularly to foreign and domestic intelligence agencies – than we ever imagined. So for example, I don't use Microsoft software anymore – I think that there are techniques which are now being sold and available to governments for penetrating Microsoft platforms, and probably other major commercial platforms as well. So, I've made the choice, personally, to use free software – GNU/Linux, in particular – and it still requires more skill than most people are used to, but it is much, much easier than even a few years ago. So I think it's probably wise for most people to try and invest a little time getting rid of proprietary software if they care at all about societal freedom and privacy. I understand that using the latest, greatest smartphone is cool, as are the entertainment and convenience of the Cloud and tablets – but people should not imagine that they can keep those platforms secure.

It might sound a bit primitive, but I think people should have to go back to the idea that if they really want confidential communications with their friends, or if they are involved with political work, they have to think about setting aside one machine - which they keep offline and just use essentially for editing and encrypting/decrypting material. Once they've encrypted their work on their “air gap” machine, as it's called, then they can put their encrypted emails on a USB stick and transfer them to their second machine which they use to connect online (I notice Bruce Schneier is just now recommending the same approach). Once the “air gap” machine has been set up and configured, you should not connect that to the network – and preferably, don't connect it to the network, ever! So if you follow those sorts of protocols, that's probably the best that is achievable today.
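
[Editorial illustration of the "air gap" workflow: in practice one would normally use a standard tool such as OpenPGP/GnuPG, but the sketch below, using the pyca/cryptography library and hypothetical names, shows the essential idea. The offline machine encrypts a document to a correspondent's public key; only ciphertext is carried to the online machine on the USB stick.]

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The correspondent's key pair; their public key is copied onto the offline machine once.
recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

# --- On the offline ("air gap") machine ---
message = b"Draft notes: not for circulation."
session_key = Fernet.generate_key()                         # one-time symmetric key
ciphertext = Fernet(session_key).encrypt(message)           # encrypt the document
wrapped_key = recipient_public.encrypt(session_key, OAEP)   # wrap the key for the recipient
# Only (ciphertext, wrapped_key) go onto the USB stick; the plaintext never leaves this machine.

# --- On the correspondent's machine ---
recovered_key = recipient_private.decrypt(wrapped_key, OAEP)
print(Fernet(recovered_key).decrypt(ciphertext).decode())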

9. How would you advise young people working in the surveillance industry?

 

Young people should try and read a little bit into the ethics of surveillance and to understand their own ethical limits in what they want to do, working in that industry. And in some sense, I think it's a bit like contemplating a career in the arms industry. There are defensible uses of military weapons, but the companies that build these weapons are, at the end of the day, just corporations maximizing value for shareholders. And so, you need to take a really hard look at the company that you're working for or the area you want to work in and satisfy your own standard of ethics, and that what you're doing is not violating other people's human rights. I think that in the fantastically explosive growth of surveillance industries that we've seen over the past few years – and it's accelerating – the sort of technologies particularly being developed for electronic mass surveillance are fundamentally and ethically problematic. And I think that for a talented engineer, there are probably better things that he/she can do with his/her career.

    Mapping Digital Media: Broadcasting, Journalism and Activism in India: A Public Consultation

    by Samantha Cassar last modified Nov 07, 2013 03:38 AM
    Lawyers, researchers, journalists and activists gathered on Sunday, October 27, 2013 at the Bangalore International Centre in response to India’s country report on Mapping Digital Media, which examines citizens’ access to quality news and information across different industries, and the impact on media freedoms as a result of digitisation. Respondents examined themes related to regulation, journalism and activism, and engaging discussions took place among attendees.

    Respondents offering various perspectives spoke at the public consultation on different sections of the Mapping Digital Media: India report.


    On behalf of the event organizers, we invite you to view the report, available online for free access here: "Mapping Digital Media: India".


    The event organizers, Alternative Law Forum, The Centre for Internet & Society, and Maraa, held a public consultation at the Bangalore International Centre with the ultimate goal of informing and engaging the public on key themes of the Mapping Digital Media: India report, as a new knowledge base for better understanding India’s transitioning digital landscape. Many ideas about moving forward with the report’s findings also emerged, as prospective next steps following the report’s release.

    Respondents consisted of reputed media lawyers, researchers, journalists, activists and other media professionals. Each spoke in one of three panel discussions pertaining to different sections of the report: Policies, Laws and Regulators; Digital Activism; and Digital Journalism. Each speaker shed new light on key challenges confronting the emergent digital media landscape, with special focus given to broadcasting (radio and television), cable operations and newspapers (print and online) as each of these sectors undergoes digitisation.

    Opening

    Vibodh Parthasarathi, who had anchored the country report, started off the consultation by underscoring the report's objective of mapping the different sectors and seemingly disparate aspects of India's complex media landscape. Following this brief introduction to the report, Lawrence Liang, Co-founder and Partner of the Alternative Law Forum, set the stage by sharing the ultimate aim of the event: to speak collectively to the report so that we may gain a better understanding of an area that is otherwise opaque to most. Lawrence also brought to the forefront the report’s debunking of the idea of a digital divide in India, and its account of a rich media landscape.

    Policies, Laws and Regulators

    The consultation’s first panel discussion was started by Lawrence, who responded to the report from a legal perspective. Lawrence examines the role of the state in India’s rich media landscape, specifically in terms of the four values at its centre: freedom of speech and expression, access to infrastructure, the question of development, and the question of market regulation, all of which are tied together within the country report. Lawrence argues that we must arrive at quantitative measures for assessing the diversity and quantity of freedom of speech, but only after understanding the ecology in which freedom of speech operates, and he attempts to do so by examining drafted policies, policing measures, and market regulatory measures taken within the context of India.

    An engaging discussion took place following this panel’s speakers. Points made by event attendees included questions of how to scale up the citizen’s stake in media within a legal paradigm, as well as challenges to equity in media in terms of content.

    Digital Media and Society (Digital Activism)

    The discussion began with panelist Arjun Venkatraman, Co-founder of the Mojolab Foundation and of the digital activism platform Swara. Arjun engages in the digital media debate by speaking on behalf of members of civil society who act from within the digital divide, and he exposes the gaps within new modes of activism that arise out of a lack of understanding of how to engage with these new media. He also informed attendees of how to make cheap IVR-based voice portals, linking voice users to the web for under USD 200, as a means of leveraging users’ voices via unlicensed spectrum.

    Also contributing to the discussion on digital activism was Meera K, Co-founder of the Bangalore news publication Citizen Matters. Examining examples of new spaces that digital media has provided for the exchange of pluralistic views and alternative voices, Meera critiques different types of activism that have emerged, including social activism, political activism, and middle-class activism. She questions whether new media can be seen as a sufficient space for free speech with reference to various challenges, such as the polarization of debates, and also compares and contrasts the positive outcomes of new media campaigns, such as tangible, capitalized solutions, with their corresponding pitfalls.

    A debate amongst attendees followed in response to the question of assessing the value of media in terms of impact or size of public outreach, along with how content is generated and controlled.

    Digital Media and Journalism

    Independent journalist and media analyst Geeta Seshu got the conversation started on digital media and journalism by comparing the pitfalls of journalism in traditional media with the possibilities offered by digital journalism. Geeta argues that journalists have become devalued and are losing their footing within traditional media. She discussed the new forms of journalism, how news can be generated in an interactive and non-hierarchical manner, and the intersections of mainstream media and journalism. She questions the possibility of digital journalism existing on its own, without the influence or incorporation of principles of traditional media, and grapples with possibilities for providing a new model for doing so.

    The day’s last speaker was Subhash Rai, Associate Editor of the New Indian Express. Subhash offered a mainstream perspective and argued that we must look at traditional and mainstream forms of media as a starting point for emerging forms of journalism before we can begin to understand these journalism models better. At the same time, traditional and mainstream means of news dissemination can learn from digital media; however, we should not be quick to look away from the core of the entire picture, as traditional forms of media are still very strong in comparison.

    A discussion followed, surrounding questions posed by speakers and attendees, such as what digital journalism should look like and how the transition to new forms of media should be imagined. How information has changed with respect to its creation and consumption was debated as well.

    Moving Forward

    Before the conclusion of the public consultation, attendees and speakers discussed future directions for the country report. Many recommendations and ideas were generated, including suggestions for future public consultations, advocacy windows offered by the report, and ways to produce another iteration of the report. Prospective initiatives included online working groups to dive deeper into specific themes of the report, a hackathon where attendees would pool ideas together, and follow-up public consultations.

    Mapping Digital Media 2

    Participants brainstormed together on how to move the report’s findings forward. Many ideas were drafted, including a hackathon and online focus groups.

    The event's agenda went as follows:

    Time / Detail
    10.00 a.m. Introductory Remarks by Vibodh Parthasarathi, CCMG, Jamia
    10.15 a.m. - 11.30 a.m. Policies, Laws and Regulators
    Session Moderator – Ram Bhat
    Speakers – Lawrence Liang (ALF) and Mathew John (JGLS)
    11.30 a.m. - 11.45 a.m. Tea Break

    11.45 a.m. - 1.15 p.m. Digital Media and Society (Digital Activism)
    Session Moderator – Lawrence Liang
    Speakers – Arjun Venkatraman (Mojolab) and Meera K (Citizen Matters)

    1.15 p.m. - 2.00 p.m. Lunch Break
    2.00 p.m. - 3.15 p.m. Digital Media and Journalism
    Session Moderator – Vibodh Parthasarathi
    Speakers – Geeta Seshu (Free Speech Hub) and Subhash Rai (newindianexpress.com)
    3.15 p.m. - 4.00 p.m. The Way Ahead (Moving Forward)
    Moderated by Lawrence Liang

    Event Participants

    1. Rashmi Vallabhrajasyuva
    2. Meera K, Oorvani Foundation
    3. Samantha Cassar, CIS
    4. Sharath Chandra Ram, CIS
    5. Suresh Kumar, Artist
    6. Aruna Sekhar, Amnesty India
    7. Sriram Sharma, Part time Blogger
    8. Ammu Joseph, Independent Researcher
    9. Mathew John, Jindal Global Law School
    10. Swati Mehta, The Rules
    11. James North, The Rules
    12. Bhairav Acharya, Lawyer
    13. Deepa Kurup, The Hindu
    14. Abhilash N, Independent
    15. Deepu, Pedestrian Pictures
    16. Rashmi M, PhD Student at NIAS
    17. Jayanth S, LOCON Solutions Pvt Ltd.
    18. Nehaa Chaudhari, CIS
    19. Dinesh TB, Servelots
    20. Snehashish Ghosh, CIS
    21. Lawrence Liang, ALF
    22. Vibodh Parthasarathi, CCMG, Jamia
    23. Ram Bhat, Maraa
    24. Ashish Sen, AMARC
    25. Subhash Rai, New Indian Express
    26. Geeta Seshu, Free Speech Hub, The Hoot
    27. Arjun Venkatraman, Mojo Lab Foundation
    28. Raajen, Centre for Education and Documentation
    29. Ekta, Maraa
    30. Smarika Kumar, ALF

    Press Coverage

    1. Need to increase diversity in online journalism (The New Indian Express, October 28, 2013).
    2. Experts moot holistic approach to media laws (The Hindu, October 28, 2013).

    CIS Cybersecurity Series (Part 12) - Namita Malhotra

    by Purba Sarkar last modified Nov 18, 2013 10:03 AM
    CIS interviews Namita Malhotra, researcher and lawyer at Alternative Law Forum, Bangalore, as part of the Cybersecurity Series.

    "In a strange mix of how both capitalism and state control work, what is happening is that more and more of these places that one could access, for various reasons, whether it is for ones own pleasure or for political conversations, are getting further and further away from us. And I think that that mix of both corporate interests and state control is particularly playing a role in this regard." - Namita Malhotra, researcher and lawyer, Alternative Law Forum

    Centre for Internet and Society presents its twelfth installment of the CIS Cybersecurity Series. 

    The CIS Cybersecurity Series seeks to address hotly debated aspects of cybersecurity and hopes to encourage wider public discourse around the topic.

    Namita Malhotra is a researcher and lawyer at Alternative Law Forum (ALF). She has a keen interest in working on law, technology and media through legal research, cultural studies, new media practices and film making.

    ALF homepage: www.altlawforum.org


    This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

     

    First Look: CIS Cybersecurity documentary film

    by Purba Sarkar last modified Dec 17, 2013 08:16 AM
    CIS presents the trailer of its documentary film DesiSec: Cybersecurity & Civil Society in India

    The Centre for Internet and Society is pleased to release the trailer of its first documentary film, on cybersecurity and civil society in India. 

    The documentary is part of the CIS Cybersecurity Series, a work in progress which may be found here.

    DesiSec: Cybersecurity and Civil Society in India

    The trailer of DesiSec: Cybersecurity and Civil Society in India was shown at the Internet Governance Forum in Bali on October 24. It was a featured presentation at the Citizen Lab workshop, Internet Governance For The Next Billion Users.

    The transcript of the workshop is available here: http://www.intgovforum.org/cms/component/content/article/121-preparatory-process/1476-ws-344-internet-governance-for-the-next-billion-users 

    This work was carried out as part of the Cyber Stewards Network with aid of a grant from the International Development Research Centre, Ottawa, Canada.

    Seventh Privacy Round-table

    by Elonnai Hickok last modified Nov 20, 2013 09:58 AM
    On October 19, 2013, the Centre for Internet and Society (CIS) in collaboration with the Federation of Indian Chambers of Commerce and Industry, the Data Security Council of India, and Privacy International held a “Privacy Round-table” in New Delhi at the FICCI Federation House.

    The Round-table was the last in a series of seven, beginning in April 2013, which were held across India.

    Previous Privacy Round-tables were held in:

    • New Delhi: (April 13, 2013) with 45 participants;
    • Bangalore: (April 20, 2013) with 45 participants;
    • Chennai: (May 18, 2013) with 25 participants;
    • Mumbai: (June 15, 2013) with 20 participants;
    • Kolkata: (July 13, 2013) with 25 participants; and
    • New Delhi: (August 24, 2013) with 40 participants.

    Chantal Bernier, Assistant Privacy Commissioner of Canada; Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party; and Christopher Graham, UK Information Commissioner, were the featured speakers for this event.

    The Privacy Round-tables were organised to spark public dialogue and gain feedback on a privacy framework for India. To achieve this, the Privacy Protection Bill, 2013, drafted by the Centre for Internet and Society, Strengthening Privacy through Co-regulation by the Data Security Council of India, and the Report of the Group of Experts on Privacy by the Justice A.P. Shah committee were used as background documents for the Round-tables. As a note, after each Round-table, CIS revised the text of the Privacy Protection Bill, 2013 based on feedback gathered from the general public.

    The Seventh Privacy Round-table meeting began with an overview of the past round-tables, a description of the evolution of privacy legislation in India to date, and an overview of the Indian interception regime. In 2011, the Department of Personnel and Training drafted a Privacy Bill that incorporated provisions regulating data protection, surveillance, interception of communications, and unsolicited messages. Since 2010, India has been seeking data secure status from the European Union, and in 2012 a report was issued noting that the Reasonable Security Practices and Procedures and Sensitive Personal Data or Information Rules, found under section 43A of the Information Technology Act, were not sufficient to meet EU data secure adequacy. In 2012, the Report of the Group of Experts on Privacy, recommending a privacy framework for India, was published and accepted by the government, and the Department of Personnel and Training is presently responsible for drafting privacy legislation for India.


    Presentation: Jacob Kohnstamm, Dutch Data Protection Authority and Chairman of the Article 29 Working Party


    Jacob Kohnstamm made a presentation on the privacy framework in the European Union. In his presentation, Kohnstamm shared how history, such as the Second World War, shaped the present understanding and legal framework for privacy in the European Union, where privacy is seen as a fundamental human right. Kohnstamm also explained how, over the years, technological developments have made data gold, and subsequently, companies who process this data and create services that allow for the generation of more data are becoming monopolies. This has created an unbalanced situation for the individual consumer, whose data is being routinely collected by companies, and once it is collected, the individual loses control over it. Because of this asymmetric relationship, data protection regulations are critical to ensure that individual rights are safeguarded.

    Kohnstamm recognized the tension between stringent data protection regulations on the one hand, and security for the government and the provision of services by businesses on the other. However, he argued that the use of technology without regulation, whether for commercial or security reasons, can lead to harm. Thus, it is key that any regulation incorporate proportionality as a cornerstone of the use of these technologies, to ensure trust between the individual and the State, and between the individual and the corporation. This will also ensure that individuals are given the right of equality and the right to live free of discrimination. Kohnstamm went on to explain that any regulation needs to ensure that individuals are provided the necessary tools to control their data, that a robust supervisory authority is established with enough powers to enforce the provisions, and that checks and balances are put in place to safeguard against abuse.

    In response to a question asked about how the EU addresses the tension of data protection and national security, Kohnstamm clarified that in the EU, national security is left as a matter for member states to address but the main principles found in the EU Data Protection Directive also apply to the handling of information for national security purposes. He emphasized the importance of the creation of checks and balances. As security agencies are given additional and broader powers, they must also be subjected to stronger safeguards.

    Kohnstamm also discussed the history of the free trade agreement with India, and India’s request for data secure status. It was noted that currently the free trade agreement between India and the EU is stalled, as India has asked for data secure status. For the EU to grant this status, it must be satisfied that when European data is transferred to and processed in India, it is subject to the same level of protection as it would be if it were processed in the EU. Without privacy legislation in place, India’s present regime does not reflect the same level of protection as the EU regime. To find a way out of this ‘deadlock’, the EU and India have agreed to set up an expert group, with experts from both the EU and India, to find a way in which India’s regime can be modified to meet EU data secure adequacy. To date, no experts from the Indian side have been nominated and communicated to the EU.

    Key Points:

    1. Europe’s history has influenced the understanding and formulation of the right to privacy as a fundamental right.
    2. Any privacy regulation must have strong checks and balances in place and ensure that individuals are given the tools to control their data.
    3. India’s current regime does not meet EU data secure adequacy. Currently, the EU is waiting for India to nominate experts to work with the EU to find a way out of the ‘deadlock’.

    Discussion: National Security, Surveillance and Privacy


    When the discussion was opened up to the floor, it was noted that in India there is a tension between data protection and national security, as national security is always a blanket exception to the right to privacy. This tension has been discussed and debated by both democratic institutions in India and commercial entities. It was pointed out that though data protection is a new debate, national security is a debate that has existed in India for many years. It was also pointed out that currently there are not sufficient checks and balances on the powers given to Indian security agencies. One aspect of the Indian regime that has been heavily criticized is the power of the Secretary of the Home Ministry to authorize interception requests, as vesting the authorization power in the executive leaves little space between interested parties and the approval of interception orders, and could result in abuse or conflict of interest. With regards to the Indian interception regime, it was explained that currently there are five ways in which messages can be intercepted in India. Previously, the Law Commission of India had asked that amendments be made to both the Indian Post Office Act and the Indian Telegraph Act.

    Moving the discussion to the Privacy Protection Bill, 2013 by CIS: in Chapter V, “Surveillance and Interception of Communications”, clause 34, the authorization of interception and surveillance orders is left to a magistrate. Previously, the authorization of interception orders rested with the Privacy Commissioner, but this model was heavily critiqued in previous round-tables, and the authorizing authority has subsequently been changed to a magistrate. Participants pointed out that the Bill should specify the level of magistrate that will be responsible for the authorization of surveillance orders, and also raised the concern that the lower judiciary in India is not functioning adequately, as the courts are overwhelmed, thus creating the possibility for abuse. Participants also suggested that perhaps data protection and surveillance should be de-linked from each other and placed in separate bills. This echoes public feedback from previous round-tables.

    While discussing the safeguards needed in an interception and surveillance regime for India, it was pointed out that transparency about surveillance, by both the government and the service providers, is a key safeguard for ensuring the protection of privacy, as it would enable individuals to make educated decisions about the services they choose to use and the extent of governmental surveillance. The need to bring in a provision that incorporated the idea of "nexus of surveillance" was also highlighted. It was pointed out that in Canada, entities wanting to deploy surveillance in the name of public safety must take steps to prove nexus. For example, the organization must empirically prove that there is a need for a security requirement, demonstrate that only data that is absolutely necessary will be collected, show how the technology will be effective, prove that there is not a less invasive way to collect the information, demonstrate security measures in place to protect against loss and misuse, and have in place both internal and external oversight mechanisms. It was also shared that in Canada, security agencies are regulated by the Office of the Privacy Commissioner of Canada, as privacy and security are not seen as separate matters. In the Canadian regime, because security agencies have more powers, they are also subjected to greater oversight.

    Key Points:

    1. The Indian surveillance regime currently does not have strong enough safeguards.
    2. The concept of ‘nexus’ should be incorporated into the Privacy Protection Bill, 2013.
    3. A magistrate, through judicial oversight for interception and surveillance requests, might not be the most effective authority for this role in India.

    Presentation: Chantal Bernier, Assistant Privacy Commissioner of Canada


    In her presentation, Bernier noted that in the Canadian model there are multiple legislative initiatives that are separate but connected, all of which provide a legislative basis for the right to privacy. Furthermore, it was pointed out that there are two privacy legislations in Canada, one regulating the private sector and the other regulating the public sector. It has been structured this way as it is understood that the relationship between individuals and business is based on consent, while the relationship between individuals and the state is based on human rights. Aspects of privacy, such as consent, also differ between the public sector and the private sector. In her presentation, Bernier pointed out that privacy is a global issue and, because of this, it is critical that countries have privacy regimes that can speak to each other. This does not mean that the regimes must be identical, but they must at the least be inter-operable.

    Bernier described three main characteristics of the Canadian privacy regime including:

    1. It is comprehensive and applies to both the public and the private sectors.
    2. The right to privacy in Canada is constitutionally based and is a fundamental right as it is attached to personal integrity. This means that privacy is above contractual fairness. That said, the right to privacy must be balanced collectively with other imperatives.
    3. The Canadian privacy regime is principle based and not rule based. This flexible model allows for quick adaptation to changing technologies and societal norms. Furthermore, Bernier explained how Canada places responsibility and accountability on companies to respect, protect, and secure privacy in the way each company believes it can best meet those obligations. Bernier also noted that all companies are responsible and accountable for any data that they outsource for processing.

    Furthermore, any company that substantially deals with Canadians must ensure that the forum in which complaints and similar matters are heard is Canada. Under the Canadian privacy regime, accountability for data protection rests with the original data holder, who must ensure — through contractual clauses — that any information processed through a third party meets the Canadian level of protection. This means any company that deals with a Canadian company will be required to meet the Canadian standards for data protection.

    Speaking to the governance structure of the Office of the Privacy Commissioner in Canada, Bernier explained that the OPC is a completely independent office and reports directly to Parliament. The OPC hears complaints from both individuals and organizations. The OPC does not have any enforcement powers, such as fining a company, but does have the ability to "name" companies who are not in compliance with Canadian regulations, if it is in the public interest to do so. The OPC can perform audits at its discretion with respect to the public sector, and can perform audits on the private sector if it has reasonable grounds to investigate.

    Bernier concluded her presentation with lessons that have been learned from the Canadian experience including:

    1. The importance of having strong regulators.
    2. Privacy regulators must work and cooperate together.
    3. Privacy has become a condition of trade.
    4. In today’s age, issues around surveillance cannot be underestimated.
    5. Companies that have strong privacy practices now have a competitive advantage in today’s global market.
    6. Privacy frameworks must be clear and flexible.
    7. Oversight must be powerful to ensure proper protection of citizens in a world of asymmetry between individuals, corporations, and governments.

    Key Points:

    1. The Right to Privacy is a fundamental right in Canada.
    2. The Canadian privacy regime regulates the public sector and the private sector, but through two separate legislations.
    3. The OPC does not have the power to levy fines, but does have the power to conduct audits and investigations and ‘name’ companies who are not in compliance with Canadian regulations if it is in the public interest.

    Discussion: The Data Protection Authority


    Participants also discussed the composition of the Data Protection Authority as described in Chapter IV of the Privacy Protection Bill. It was pointed out that in the Bill, the Data Protection Authority might need to be made more independent. It was suggested that, to avoid having the office of the Data Protection Authority be filled with bureaucrats, the Bill should specify that the office must be staffed by individuals with IT experience, lawyers, judges, etc. On the other hand, it was cautioned that though this might be useful to some extent, it might not be helpful to be overly prescriptive, as there is no set profile of what composition of employees makes for a strong and effective Data Protection Authority. Instead, the Bill should ensure that the office of the Data Protection Authority is independent, accountable, and chosen by an independent selection board.

    When discussing possible models for the framework of the Data Protection Authority, it was pointed out that there are many models that could be adopted. Currently in India the commission model is not flexible, and many commissions that are set up are not effective due to funding constraints and internal bureaucracy. Taking that into account, in the Privacy Protection Bill, 2013, the Data Protection Authority could be established as a small regulator with an appellate body to hear complaints.

    Key Points:

    1. The Data Protection Authority established in the Privacy Protection Bill must be adequately independent.
    2. The composition of the Data Protection Authority should be diverse, and it should have the competence to address the dynamic nature of privacy.
    3. The Data Protection Authority could be established as a small regulator with an appellate body attached.

    Presentation: Christopher Graham, Information Commissioner, United Kingdom


    Christopher Graham, the UK Information Commissioner, spoke about the privacy regime in the United Kingdom and his role as the UK Information Commissioner. His office is responsible for both the UK Data Protection Act and the Freedom of Information Act. In this way, the right to know is not in opposition to the right to privacy, but is instead treated as an integral part of it.

    Graham said that his office also provides advice to data controllers on how to comply with the privacy principles found in the Data Protection Act, and his office has the power to impose fines of up to half a million pounds on non-compliant data controllers. Despite having this power, it is rarely used, as a smaller fine is usually sufficient for the desired effect. Yet, at the end of the day, whatever penalty is levied, it must be proportionate and risk based, i.e., selective, to be effective. In this way the regulatory regime should not be heavy handed but instead should be subtle and effective. In fact, one of the strongest regulators is the reality of the market place, where the price of not having strong standards is paid in innovation and economic growth. To this extent, Graham also pointed out that self-regulation and co-regulation are both workable models if there are strong enforcement mechanisms. Graham emphasized the fact that any data protection regime must go beyond, and cannot be limited to, security alone.

    Graham also explained that he has found that there is currently a lack of confidence in Indian partners. This is problematic as the Indian industry tries to grow with European partners. For example, he has been told that customers are moving banks because their previous bank’s back offices were located in India. Citing other examples of data breaches by Indian data controllers, such as a call centre merging the accounts of two customers and another call centre selling customer information, he explained that the lack of confidence in the Indian regime has real economic implications. Graham further explained that one difficulty that the office of the UK ICO is faced with is that India does not have an equivalent of the ICO. Thus, when a breach does happen, it is unclear who can be approached in India about the breach.

    Touching upon the issue of data adequacy with the EU, Graham noted that if data adequacy is a goal of India, the privacy principles as defined in the Directive and reflected in the UK Data Protection Act must be addressed in addition to security. In his presentation, Graham emphasized the importance of India amending its current regime if it wants data secure status, and spoke about the economic benefits for both Europe and India if India does in fact obtain data secure status. In response to a question about why it is so important that India amend its laws if, in effect, the UK has the ability to enforce the provisions of the UK Data Protection Act, Graham clarified that what matters most is the rule of law, and according to UK law and, more broadly, the EU Directive, companies cannot transfer information to jurisdictions that do not have recognized adequate levels of protection. Thus, if companies still wish to transfer information to India, this must be done through binding corporate rules.

    Another question put forth was about how the right to privacy differs from other human rights, and why countries require other countries to uphold the right to privacy to the same level when, for example, this is not practiced for other human rights such as children’s rights. In response, Graham explained that data belongs to the individual, and when it is transferred to another country it still belongs to the individual. Although the UK would like all countries to uphold the rights of children to the standard that it does, the UK is not exporting UK citizens’ children to India. Thus, as the Information Commissioner, he has a responsibility to protect his citizens’ data even when it leaves the UK jurisdiction. Graham explained further that in the history of Europe the misuse of data to do harm has been a common trend, which is why privacy is seen as a fundamental right, and why it is paramount that European data is subject to the same level of protection no matter what jurisdiction it is in. To understand why Europe requires countries to be ‘data secure’ before transferring data to them, India needs to recognise that privacy is a fundamental right that goes beyond security, and that when a company processes data it does not own that data; the individual owns the data and retains rights attached to it.

    Key Points:

    1. The UK Information Commissioner's Office regulates both the right to information and the right to privacy, and thus the two rights are seen as integral to each other.
    2. Penalties must be proportionate and scalable to the offense.
    3. Co-regulation and self-regulation can both be viable models for privacy, but enforcement is key to them being effective.

    Discussion: Collection of Data with Consent and Collection of Data without Consent


    Participants also discussed the collection of data with consent and the collection of data without consent found in Chapter III of the Bill. When asked about the circumstances in which informed consent should not be required, it was pointed out that in the Canadian model, the option to collect information without consent applies only to the public sector, and only where it is necessary for the delivery of a government service. In the private sector, all collection of information requires informed and meaningful consent. Yet collection of data without consent in the commercial context is an area Canada is still wrestling with, as there are instances, such as online advertising, where it is unreasonable to expect consent every time. It was also pointed out that in the European Directive, consent is only one of several grounds under which data can be collected. As part of the conversation on consent, it was noted that the Bill currently does not explicitly take into account consent for the transfer of information, nor does it address changing terms of service and whether companies must obtain consent again or whether providing notice to the individual is sufficient. The question of consent and the additional data generated through use of a service was also raised: for example, if an individual signs up for a mobile connection and initially provides information that the service provider stores in accordance with the privacy principles, does the service provider have an obligation to treat all data generated by the user while using the service in the same way? The exception of disclosure without consent was also raised, and it was pointed out that companies must disclose information to law enforcement when required to do so. For example, telecom service providers must now store the location data of all subscribers for up to six months and share it when requested by law enforcement.

    Key Points:

    1. There are instances where expecting companies to obtain informed consent for every collection of information is not reasonable. Alternative models, based for example on transparency, must be explored to address these situations.
    2. The Privacy Protection Bill should explicitly address transfer of information to other countries.
    3. The Privacy Protection Bill should address consent in the context of changing terms of service.

    Discussion: Penalties and Offences


    The penalties and offences prescribed in Chapter VI of the Privacy Protection Bill were discussed by participants, and many different opinions were voiced. For example, some participants held the opinion that offences and penalties should not exist in the Privacy Protection Bill, because in practice they are more likely than not to be ineffective; when litigating civil penalties, for instance, it takes a long time for the money to be realized. Others argued that in India, where enforcement of any law is often weak, strong, clear, and well defined criminal penalties are needed. Another comment raised the point that a distinction should be made between breaches of the law by data controllers and breaches by rogue individuals, as the type of violation differs. A breach by a data controller is often a matter of identifying the breach and putting in place strictures to ensure that it does not happen again, by holding the company accountable through oversight; whereas a breach by a rogue agent entails identifying the breach and the rogue agent, and creating a strong enough penalty to ensure that the violation is not repeated. Adding to this discussion, it was pointed out that, in the end, scalability is key to ensuring that penalties are proportional and effective. It was also noted that in the UK any fine that is levied is appealable, which builds in a system of checks and balances and ensures that companies and individuals are not subject to unfair or burdensome penalties.

    The possibility of incentivizing compliance, through rewards and distinctions, was also discussed. Some felt that incentivizing compliance would be more effective, as it would give companies distinct advantages for incorporating privacy protections, while others felt that incentives can be included but penalties cannot be excluded, otherwise the provisions of the Privacy Protection Bill 2013 will not be enforceable. It was also pointed out that in the context of India there should ideally be a mechanism to address the ‘leakages’ that happen in the system, i.e., corruption. Though this is difficult to achieve, regulations could take steps such as specifically prohibiting the voluntary disclosure of information by companies to law enforcement. Taking a sectoral approach to penalties was also suggested, as companies in different sectors face specific challenges and types of breaches. Another approach that could be implemented is specifying a time limit within which data controllers and commissioners must respond to complaints; this has worked for the implementation of the Right to Information Act in India, and it would be interesting to see how it plays out for the right to privacy. Throughout the discussion a number of possible ways to structure offences and penalties were suggested, but for all of them it was clear that it is important to be creative about the type of penalties and not rely only on financial penalties, as for many companies a fine has less of an impact than, for example, having to publicly disclose what happened around a data breach.

    Key Points:

    1. Penalties and offenses by companies vs. rogue agents should be separately addressed in the Bill.
    2. Instead of levying penalties, the Bill should include incentives to ensure compliance.
    3. Penalties for companies should go beyond fines and include mechanisms such as requiring the company to disclose to the public information about the breach.

    Discussion: Cultural Aspects of Privacy


    The cultural realities of India, and their impact on the perception of privacy in India, were discussed. It was pointed out that India has a history of colonization, multiple religions and languages, ethnic tensions, a communally based society, and a large population. All of these factors affect understandings, perceptions, practices, and the effectiveness of different frameworks around privacy in India. For example, the point was raised that given India’s cultural and political diversity, a principle-based model might be too difficult to enforce, as every judge, authority, and regulator will have a different perspective and agenda. Other participants pointed out that there is a lack of awareness around privacy in India, and that this will affect the effectiveness of the regulation. It was also highlighted that anecdotal claims that privacy in India is culturally different, such as the observation that on an Indian train everyone will ask you personal questions and that Indians therefore do not have a concept of privacy, cannot influence how a privacy law is framed for India.

    Key Points:

    1. India’s diverse culture will impact perceptions of privacy and the implementation of any privacy regulation.
    2. Given India’s diversity, a principle-based model might not be adequate.
    3. Though culture is important to understand and incorporate into the framing of any privacy regulation in India, anecdotal stories and broad assumptions about India’s culture and societal norms around privacy cannot influence how a privacy law is framed for India.

    Conclusion

    The seventh privacy round-table concluded with a conversation on NSA spying and the Snowden revelations. It was asked whether domestic servers could be an answer for protecting Indian data; participants agreed that domestic servers are only a band-aid solution. With regard to the Privacy Protection Bill, it was clarified that CIS is now in the process of collecting public comments on the Bill and will be submitting a revised version to the Department of Personnel and Training. Speaking to the privacy debate at large, it was emphasized that every stakeholder has an important voice and can shape the framing of a privacy law in India.

    Why 'Facebook' is More Dangerous than the Government Spying on You

    by Maria Xynou last modified Nov 23, 2013 08:38 AM
    In this article, Maria Xynou looks at state and corporate surveillance in India and analyzes why our "choice" to hand over our personal data can potentially be more harmful than traditional, top-down, state surveillance. Read this article and perhaps reconsider your "choice" to use social networking sites, such as Facebook.
    Why 'Facebook' is More Dangerous than the Government Spying on You

    by AJC1 on flickr

    Do you have a profile on Facebook? Almost every time I ask this question, the answer is ‘yes’. In fact, I think the number of people who have replied ‘no’ to this question can literally be counted on my right hand. But this is not an article about Facebook per se. It’s more about the ‘Facebooks’ of the world, and about people’s increasing “choice” to hand over their most personal data. More accurate questions are probably:

    “Would you like the Government to go through your personal diary? If not, then why do you have a profile on Facebook?”

    The Indian Surveillance State

    Following Snowden’s revelations, there has finally been more talk about surveillance. But what is surveillance?

    David Lyon - who directs the Surveillance Studies Centre - defines surveillance as “any collection and processing of personal data, whether identifiable or not, for the purposes of influencing or managing those whose data have been garnered”. Surveillance can also be defined as the monitoring of the behaviour, activities or other changing information of individuals or groups of people. However, this definition implies that individuals and/or groups of people are being monitored in a top-down manner, without this being their “choice”. But is that actually the case? To answer this question, let’s have a look at how the Indian government and corporations operating in India spy on us.

    State Surveillance

    The first things that probably come to mind when thinking about India from a foreigner’s perspective are poverty and corruption. Surveillance appears to be a “Western, elitist issue”, which mainly concerns those who have already solved their basic survival problems. In other words, the most mainstream argument I hear in India is that surveillance is not a real issue, especially since the majority of the population lives below the poverty line and does not even have Internet access. Interestingly enough though, the other day when I was walking around a slum in Koramangala, I noticed that most people have Airtel satellite dishes...even though they barely have any clean water!

    The point though is that surveillance in India is a fact, and the state plays a rather large role in it. In particular, Indian law enforcement agencies follow three steps in ensuring that targeted and mass surveillance is carried out in the country:

    1. They create surveillance schemes, such as the Central Monitoring System (CMS), which carry out targeted and/or mass surveillance

    2. They create laws, guidelines and license agreements, such as the Information Technology (Amendment) Act 2008, which mandate targeted and mass surveillance and which require ISPs and telecom operators to comply

    3. They buy surveillance technologies from companies, such as CCTV cameras and spyware, and use them to carry out targeted and/or mass surveillance

    While Indian law enforcement agencies don’t necessarily follow these steps in this precise order, they usually try to create surveillance schemes, legalise them and then buy the gear to carry them out.

    In particular, surveillance in India is regulated under five laws: the Indian Telegraph Act 1885, the Indian Post Office Act 1898, the Indian Wireless Telegraphy Act 1933, section 91 of the Code of Criminal Procedure 1973 (CrPC) and the Information Technology (Amendment) Act 2008. These laws mandate targeted surveillance, but remain silent on the issue of mass surveillance, which means that technically it is neither allowed nor prohibited and remains a legal grey area.

    While surveillance laws in India may not mandate mass surveillance, some of their sections are particularly concerning. Section 69 of the Information Technology (Amendment) Act 2008 allows for the interception of any information transmitted through a computer resource, while requiring users to disclose their private encryption keys or face a jail sentence of up to seven years. This is quite bizarre, given that encryption is one of the few means individuals have to keep their data private and protect themselves from surveillance.

    Section 44 of the Information Technology (Amendment) Act 2008 imposes stiff penalties on anyone who fails to provide requested information to authorities - which kind of reminds us of Orwell’s totalitarian regime in “1984”. Furthermore, section 66A of the same law states that individuals will be punished for sending “offensive messages through communication services”. However, the vagueness of this section raises huge concerns, as it remains unclear what defines an “offensive message” and whether this will have grave implications on the freedom of expression. The arrest of two Indian women last November over a Facebook post reminds us of this.

    Laws in India may not mandate mass surveillance, but guidelines and license agreements issued by the Department of Telecommunications do. In particular, the UAS License Agreement regarding the Central Monitoring System (CMS) not only mandates mass surveillance, but also attempts to legalise a mass surveillance scheme which aims to intercept all telecommunications and Internet communications in India. Furthermore, the Department of Telecommunications has issued numerous guidelines and license agreements for ISPs and telecom operators, which require them to not only be “surveillance-friendly”, but to also enable law enforcement agencies to tap into their servers on the grounds of national security. And then, of course, there’s the new National Cyber Security Policy, which mandates surveillance to tackle cyber-crime, cyber-terrorism, cyber-war and cyber-vandalism.

    As both a result and a prerequisite of these laws, the Indian government has created various surveillance schemes and teams to aid them. In particular, India’s Computer Emergency Response Team (CERT) is currently monitoring “any suspicious move on the Internet” in order to counter potential cyber attacks from hackers. While this may be useful for the purpose of preventing and detecting cyber-crime, it remains unclear how “any suspicious move” is defined and whether this inevitably enables mass surveillance, without individuals’ knowledge or consent.

    The Crime and Criminal Tracking Network and Systems (CCTNS) project is creating a nationwide networking infrastructure for enhancing the efficiency and effectiveness of policing and for sharing data among 14,000 police stations across the country. An estimated Rs. 2,000 crore has been allocated for the CCTNS project, and while it may increase the effectiveness of tackling crime and terrorism, it raises questions about the legality of data sharing and its potential implications for the right to privacy and other human rights - especially if such data sharing results in data being disclosed to or shared with unauthorised third parties.

    Similarly, the National Intelligence Grid (NATGRID) is an integrated intelligence grid that will link the databases of several departments and ministries of the Government of India so as to collect comprehensive patterns of intelligence that can be readily accessed by intelligence agencies. This was first proposed in the aftermath of the Mumbai 2008 terrorist attacks and while it may potentially aid intelligence agencies in countering crime and terrorism, enforced privacy legislation should be a prerequisite, which would safeguard our data from potential abuse.

    However, the most controversial surveillance scheme being implemented in India is probably the Central Monitoring System (CMS). While several states, such as Assam, already have Internet Monitoring Systems in place, the Central Monitoring System appears to raise even graver concerns. In particular, the CMS is a system through which all telecommunications and Internet communications in India will be monitored by Indian authorities. In other words, the CMS will be capable of intercepting our calls and of analyzing our data on social networking sites, while all such data would be retained in a centralised database. Given that India currently lacks privacy legislation, such a system would mostly be unregulated and would pose major threats to our right to privacy and other human rights. Given that data would be centrally stored, the system would create a type of “honeypot” for centralised cyber attacks. Given that the centralised database would have massive volumes of data for literally a billion people, the probability of error in pattern and profile matching would be high - which could potentially result in innocent people being convicted for crimes they did not commit. Nonetheless, mass surveillance through the CMS is currently a reality in India.

    And the even bigger question: how can law enforcement agencies mine the data of 1.2 billion people? How do they even carry out surveillance in practice? Well, that’s where surveillance technology companies come in. In fact, the surveillance industry in India is expanding massively - especially in light of new surveillance schemes which require advanced and sophisticated technology. According to CIS’ India Privacy Monitor Map - which is part of ongoing research - Indian law enforcement agencies use CCTV cameras in pretty much every single state in India. The map also shows that Unmanned Aerial Vehicles (UAVs), otherwise known as drones, are being used in most states in India, and the DRDO’s Netra - a lightweight drone, not much bigger than a bird - is particularly noteworthy.

    But Indian law enforcement agencies also buy surveillance software and hardware aimed at intercepting telecommunications and Internet communications. In particular, ClearTrail Technologies is an Indian company - based in Indore - which equips law enforcement agencies in India and around the world with surveillance software that can probably be compared with the “notorious” FinFisher. In short, there appears to be a tight collaboration between Indian law enforcement agencies and the surveillance industry, which is clearly on display at the ISS surveillance trade shows, otherwise known as “the wiretappers’ ball”.

    Corporate Surveillance

    When I ask people about corporate surveillance, the answer I usually get is: “Corporations only care about their profit - they don’t do surveillance per se”. And while that may be true, David Lyon’s definition of surveillance - as “any collection and processing of personal data, whether identifiable or not, for the purposes of influencing or managing those whose data have been garnered” - may indicate otherwise.

    Corporations, like Google, Amazon and Facebook, may not have an agenda for spying per se, but they do collect massive volumes of personal data and, in cases such as PRISM, allow law enforcement to tap into their servers. Once law enforcement agencies get hold of data collected by companies, such as Facebook, they then use data mining software - supplied by various surveillance technology companies - to process and mine the data. And how do companies, like Google and Facebook, make money off our personal data? By selling it to big buyers, such as law enforcement agencies.

    So while Facebook and all the ‘Facebooks’ of the world may not profit from surveillance per se, they do profit from collecting our personal data and selling it to third parties, which include law enforcement agencies. And David Lyon argues that surveillance involves the collection of personal data - which corporations, like Facebook, do - for the purpose of influencing and managing individuals. While this last point can certainly be debated, it is clear that corporations share the data they collect with third parties, which ultimately leads to the influencing or managing of individuals - directly or indirectly. In other words, the collection of personal data, in combination with its disclosure to third parties, is surveillance. So when we think about companies like Google or Facebook, we should not just think of businesses interested in their profit - but also of spying agencies. After all, if the product is free, you are the product.

    Now if we look at online corporations more closely, we can probably identify three categories:

    1. Websites through which we buy products and hand over our personal details - e.g. Amazon

    2. Websites through which we use services and hand over our personal details - e.g. flight booking sites

    3. Websites through which we communicate and hand over our personal details - e.g. Facebook

    And why could the above be considered “spying” at all? Because such corporations collect massive volumes of personal data and subsequently:

    - Disclose such data to law enforcement agencies

    - Allow law enforcement agencies to tap into their servers

    - Sell such data to “third parties”

    What’s notable about so-called corporate surveillance is that, in all cases, there is a common, key element: we consent to handing over our personal information. We are not forced to hand over our personal data when buying a book online, booking a flight ticket or using Facebook. Instead, we “choose” to hand over our personal data in exchange for a product or service. What significantly differentiates state surveillance from corporate surveillance is this factor of “choice”. While we may choose to hand over our most personal details to large online corporations, such as Google and Facebook, we do not have a choice when the government monitors our communications and collects and stores our personal data.

    State Surveillance vs. Corporate Surveillance

    Both Indian law enforcement agencies and corporations collect massive volumes of personal data. In fact, it is probably noteworthy to mention that Facebook, in particular, collects 20 times more data per day than the NSA in total. In addition, Facebook has claimed that it has received more demands from the US government for information about its users than from all other countries combined. In this sense, the corporate collection of personal data can potentially be more harmful than government surveillance, especially when law enforcement agencies are tapping into the servers of companies like Facebook. After all, the Indian government and all other governments would have very little data to analyse if it weren’t for such corporations.

    Surveillance is not just about “spying” or about “watching people” - it’s about much, much more. Observing people’s behaviour only really becomes harmful when the data observed is collected, retained, analysed, shared and disclosed to unauthorised third parties. In other words, surveillance is meaningful to examine because it involves the analysis of data, which in turn involves pattern matching and profiling, which can have actual, real-world implications - good or bad. But such analysis is not possible without access to large volumes of data - most of which is held by large corporations, like Facebook. The question, though, is: how do corporations collect such large volumes of personal data, which they subsequently share with law enforcement agencies? Simple: because we “choose” to hand over our data!

    Three years ago, when I was doing research on young people’s perspectives on Facebook, all of the interviewees replied that they feel in control of their personal data because they “choose” what they share online. While this may appear to be a valid point, the “choice” factor can certainly be debated. There are many reasons why people “choose” to hand over their personal data: to buy a product, to use a service, to communicate with peers, or because they feel socially pressured into using social networking sites. Nonetheless, it all really comes down to one main reason: convenience. Today, in most cases, we hand over our personal data online in exchange for products or services simply because it is more convenient to do so. And while that is understandable, at the same time we are exposing our data (and ultimately our lives) in the name of convenience.

    The irony in all of this is that, while many people reacted to Snowden’s revelations about NSA dragnet surveillance, most of these people probably have profiles on Facebook. Secret, warrantless government surveillance is undeniably intrusive, but at the end of the day, our profiles on Facebook - and on all the ‘Facebooks’ of the world - are what enabled it to begin with. In other words, if we didn’t choose to give up our personal data - especially without really knowing how it would be handled - these large databases would not exist and the NSA - and all the ‘NSAs’ of the world - would have had a much harder time gathering and analysing data.

    In short, the main difference between state and corporate surveillance is that the first is imposed in a top-down manner by authorities, while the second is a result of our “choice” to give up our data. While many may argue that it’s worse to have control imposed on you, I strongly disagree. When control and surveillance are imposed on us in a top-down manner, it is likely that we will perceive this - sooner or later - as a direct threat to our human rights, which means that we will probably resist it at some point. People usually react to what they perceive as a direct threat, whereas they rarely react to what does not directly affect them. For example, one may perceive murder or suicide as a direct threat due to the immediacy of its effect, whereas smoking may not be seen as an equally direct threat, because its consequences are indirect and usually only visible in the long term. It is somewhat like that with surveillance.

    University students have protested on the streets against the installation of CCTV cameras, but how many of them have profiles on social networking sites, such as Facebook? People may react to the installation of CCTV cameras because they appear to be a direct threat to their right to privacy. The irony, however, is that the real danger does not necessarily lie within some CCTV cameras, but rather within each person’s profile on a major commercial social networking site. At best, a CCTV camera will capture some images of us and, through that, track our location and possibly our acquaintances. What type of data is captured through a simple, “harmless” Facebook profile? The following probably includes only a tiny percentage of what is actually captured:

    - Personal photos

    - Biometrics (possibly through photos)

    - Family members

    - Friends and acquaintances

    - Habits, hobbies and interests

    - Location (through IP address)

    - Places visited

    - Economic standing (based on pictures, comments, etc.)

    - Educational background

    - Ideas and opinions (which may be political, religious, etc.)

    - Activities

    - Affiliations

    The above list could potentially go on and on, probably depending on how much - or what type - of data is disclosed by the individual. The interesting element to this is that we can never really know how much data we are disclosing, even if we think we control it. While an individual may argue that he or she chooses to disclose an x amount of data while retaining the rest, that individual may actually be disclosing a 10x amount of data. This is because every bit of data usually hides lots of other bits of data that we may not be aware of. It all really comes down to who is looking at our data, when, and why.

    For example, (fictional) Priya may choose to share on her Facebook profile (through photos, comments, or any other type of data) that she is female, Indian, a Harvard graduate and that her favourite book is Anarchism and Other Essays by Emma Goldman. At first glance, nothing appears to be “wrong” with what Priya is revealing and, in fact, she appears to care about her privacy by not revealing “the most intimate details” of her life. Moreover, one could argue that there is absolutely nothing “incriminating” about her data and that, on the contrary, it just reflects that she is a “shiny star” from Harvard. However, I am not sure a data analyst would be restricted to this data, or that data analysis would show the same “sparkly” image.

    In theory, the fact that Priya is an Indian who attended Harvard reveals another bit of information that Priya did not choose to share: her economic standing. Given that the majority of Indians live below the poverty line, there is a high probability that Priya belongs to India’s middle class - if not its elite. Priya may not have intentionally shared this information, but it was indirectly revealed through the bits of data that she did reveal: female, Indian and Harvard graduate. And while there may not be anything “incriminating” about the fact that she has a good economic standing, in India this usually also suggests strong political affiliation. That brings us to her other bit of information: her favourite author is a feminist anarchist. While that may seem like innocuous information, it may be crucial depending on the specific political actors in the country she is in and on the general political situation. If a data analyst were to map the data that Priya chose to share, along with all the friends and acquaintances she inevitably has through Facebook, that data analyst could probably tell a story about her. And the concerning part is that that story may or may not be true. But that doesn’t really matter.
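
    To make the inference point concrete, here is a minimal, purely hypothetical sketch in Python: the disclosed attributes, the inference rules and the inferred labels are all invented for illustration and do not describe any real analytics product.

        # A hypothetical illustration of how disclosed attributes can imply
        # undisclosed ones. The attributes, rules and labels are invented.
        disclosed = {
            "gender": "female",
            "nationality": "indian",
            "education": "harvard_graduate",
            "favourite_author": "emma_goldman",
        }

        # Each rule maps a condition on the disclosed data to an inferred attribute.
        inference_rules = [
            (lambda d: d.get("education") == "harvard_graduate",
             ("economic_standing", "likely middle class or elite")),
            (lambda d: d.get("favourite_author") == "emma_goldman",
             ("political_leaning", "possible anarchist/feminist sympathies")),
        ]

        # Collect every inferred attribute whose condition holds on the disclosed data.
        inferred = {key: label
                    for condition, (key, label) in inference_rules
                    if condition(disclosed)}

        print(inferred)

    Even this toy example shows how two "harmless" disclosed facts can yield two attributes the person never chose to share.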

    Today, governments don’t judge us and take decisions based on our version of our data, but based on what our data says about us. And perhaps, under certain political, social and economic circumstances, our “harmless” data could be more incriminating than we think. While an individual may express strong political views within a democratic regime, if that political system were to change in the future and become authoritarian, that individual would possibly appear suspicious in the eyes of the government - to say the least. This is where data retention plays a significant role.

    Most companies retain data indefinitely or for a long period of time, which means that future, potentially less democratic governments may have access to it. And the worst part is that we can never really know what data is being held about us, because in data analysis every bit of data may entail various other bits of data that we are not even aware of. So, when we “choose” to hand over our data, we don’t necessarily know what, or how much, we are choosing to disclose. This is why I agree with Bruce Schneier’s argument that people have an illusory sense of control over their personal data.

    Social network analysis software is specifically designed to mine the huge volumes of data collected through social networking sites, such as Facebook. It profiles individuals, creates “trees of communication” around them and matches patterns. In other words, this software tells a story about each and every one of us, based on our activities, interests, acquaintances, and all our other data. And, as mentioned before, such a story may or may not be true.
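
    As a rough illustration of the kind of analysis involved, the sketch below builds a tiny, invented “tree of communication” and ranks people by betweenness centrality, a standard measure for spotting brokers in a network. The names, interactions and the choice of the networkx library are assumptions made for the example, not a description of any actual surveillance product.

        # A toy "tree of communication": nodes are people, edges are observed
        # interactions (messages, tags, comments). All data here is invented.
        import networkx as nx

        interactions = [
            ("priya", "amit"), ("priya", "sara"), ("amit", "sara"),
            ("sara", "ravi"), ("ravi", "lena"), ("lena", "amit"),
        ]

        G = nx.Graph()
        G.add_edges_from(interactions)

        # Betweenness centrality highlights people who sit on many shortest
        # paths between others, i.e. likely information brokers.
        centrality = nx.betweenness_centrality(G)
        for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
            print(f"{person}: {score:.2f}")
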

    In data mining, behavioural statistics are used to analyse our data and to predict how we are likely to behave. When applied to national databases, this may potentially amount to predicting how masses or groups within the public are likely to behave, and subsequently to controlling them. If a data analyst can predict an individual’s future behaviour - with some probability - based on that individual’s data, the same could potentially occur on a mass, public level. As such, the danger within surveillance - especially corporate surveillance, through which we voluntarily disclose massive amounts of data about ourselves - is that it appears to come down to public control.
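
    As a simplified sketch of the statistical idea, the example below fits a logistic regression on a tiny, fabricated dataset to estimate the probability of some behaviour from two made-up features. The data, the features and the use of scikit-learn are assumptions for illustration only, not a claim about how any agency or company actually models behaviour.

        # Fabricated example: predict a binary "behaviour" from two numeric
        # features (e.g. posting frequency and network size). Values are invented.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        X = np.array([[2, 10], [5, 40], [1, 5], [8, 80], [3, 20], [9, 90]])
        y = np.array([0, 1, 0, 1, 0, 1])  # observed behaviour labels

        model = LogisticRegression().fit(X, y)

        # Estimated probability of the behaviour for a new, unseen individual.
        new_individual = np.array([[6, 50]])
        print(model.predict_proba(new_individual)[0, 1])
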

    According to security expert Bruce Schneier, data today is a byproduct of the Information Society. Unlike an Orwellian totalitarian state where surveillance is imposed in a top-down manner, surveillance today appears to exist largely because we indirectly choose and enable it (by handing over our data to online companies), rather than because it is imposed on us in a solely top-down manner. However, contemporary surveillance may potentially be far worse than that described in Orwell’s “1984”, because surveillance is publicly perceived to be an indirect threat - if it is considered a threat at all. People are more likely to resist a direct threat than an indirect one, which means that the possibility of mass violations of human rights as a result of surveillance is real.

    Hannah Arendt argued that a main prerequisite and component of totalitarian power is support by the masses. Today, surveillance appears to be socially integrated within societies, which indicates that contemporary power fueled by surveillance has mass support. While the argument that surveillance is being socially integrated can certainly be debated and requires in-depth research of its own, a few simple facts might be adequate to support it at this stage. Firstly, CCTV cameras are installed in most countries, yet there has been very little resistance - on the contrary, there appears to be a type of universal acceptance on the grounds of security. Secondly, different types of spy products exist in the market - such as spy Coca-Cola cans - which can be purchased by anyone online. Thirdly, countries all over the world carry out controversial surveillance schemes - such as the Central Monitoring System in India - yet public resistance to such projects is limited. And while one may argue that the above cases don’t necessarily prove that surveillance is being socially integrated, it is worth looking at a fourth fact: most people who have Internet access choose to share their personal data through social networking sites.

    Reality shows, such as Big Brother, which broadcast the surveillance of people’s lives and present it as a form of entertainment - when actually, I think it should be worrisome - appear to enable the social integration of surveillance. The very fact that we all probably - or, hopefully - know that Facebook can share our personal data with unauthorised third parties and - now, after the Snowden revelations - that governments can tap into Facebook’s servers, should be enough to convince us to delete our profiles. Yet, why do we still all have Facebook profiles? Perhaps because surveillance is socially integrated and perhaps because it is just convenient to be on Facebook. But that doesn’t change the fact that surveillance can potentially be a threat to our human rights. It just means that we perceive surveillance as an indirect threat and that we are unlikely to react to it.

    In the long term, what does this mean? Well, it seems we will probably become more accepting of authoritarian power, that we will get used to the idea of censoring our own thoughts and actions (for fear of getting caught by the CCTV camera on the street or by the spyware that may or may not be implanted in our laptops), and that ultimately we will be less politically active and more reluctant to challenge authority.

    What’s particularly interesting about surveillance today is that it is fueled and enabled by our freedom of speech and general Internet freedom. If we didn’t have any Internet freedom - or as much as we do - we would disclose less personal data, and surveillance would probably be more restricted. The more Internet freedom we have, the more personal data we will disclose on Facebook - and on all the ‘Facebooks’ of the world - and the more data will potentially be available to mine, analyse, share and generally incorporate into the surveillance regime. So in this sense, Internet freedom appears to be a type of prerequisite for surveillance, as contradictory and ironic as that may seem. No wonder the Chinese government has gone the extra mile in creating Chinese versions of Facebook and Twitter - it’s probably no coincidence.

    While we may blame governments for establishing surveillance schemes, ISPs and telecom service providers for complying with government license agreements that often mandate backdoors for spying on us, and security companies for creating the surveillance gear in the first place, at the end of the day we are all equally a part of this mess. If we hadn’t chosen to hand over our personal data to begin with, none of the above would have been possible.

    The real danger in the Digital Age is not necessarily surveillance per se, but our choice to voluntarily disclose our personal data.
