The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 31 to 45.
Big Tech’s privacy promise to consumers could be good news — and also bad news
http://editors.cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy
<b>Rajat Kathuria, Isha Suri write: Its use as a tool for market development must balance consumer protection, innovation, and competition.</b>
<p style="text-align: justify; ">In February, Facebook, rebranded as Meta, stated that its revenue in 2022 was anticipated to fall by $10 billion due to steps undertaken by Apple to enhance user privacy on its mobile operating system. More specifically, Meta attributed this loss to the new App Tracking Transparency feature, which requires apps to request permission from users before tracking them across other apps and websites or sharing their information with third parties. Through this change, Apple effectively shut the door on “permissionless” internet tracking and gave consumers more control over how their data is used. Meta alleged that this would hurt small businesses benefiting from access to targeted advertising services, and charged Apple with abusing its market power by using its app store to disadvantage competitors under the garb of enhancing user privacy.</p>
<hr />
<p style="text-align: justify; ">Access the full article published in the <a class="external-link" href="https://indianexpress.com/article/opinion/columns/big-tech-consumers-privacy-policy-7866701/">Indian Express</a> on April 13, 2022</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy'>http://editors.cis-india.org/internet-governance/blog/indian-express-rajat-kathuria-isha-suri-big-tech-consumers-privacy-policy</a>
</p>
Rajat Kathuria and Isha Suri | Internet Governance | Privacy | 2023-01-18 | Blog Entry

Inputs to the Report on the Non-Personal Data Governance Framework
http://editors.cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework
<b>This submission presents a response by researchers at the Centre for Internet and Society, India (CIS) to the draft Report on Non-Personal Data Governance Framework prepared by the Committee of Experts under the Chairmanship of Shri Kris Gopalakrishnan. The inputs are authored by Aayush Rathi, Aman Nair, Ambika Tandon, Pallavi Bedi, Sapni Krishna, and Shweta Mohandas (in alphabetical order), and reviewed by Sumandro Chattapadhyay.</b>
<h4>Text of submitted inputs: <a href="https://cis-india.org/raw/files/cis-inputs-to-report-on-non-personal-data-governance-framework" target="_blank">Read</a> (PDF)</h4>
<h4>Report by the Committee of Experts on Non-Personal Data Governance Framework: <a href="https://static.mygov.in/rest/s3fs-public/mygov_159453381955063671.pdf" target="_blank">Read</a> (PDF)</h4>
<hr />
<h2>Inputs</h2>
<h3>Clause 3.7 (v): The role of the Indian government in the operation of data markets</h3>
<p>While highlighting the potential for India to be one of the world’s top consumer and data markets, the clause also raises the concern that data monopolies may emerge. It envisions the role of the Indian government as a regulator of and a catalyst for domestic data markets.</p>
<p>In doing so, the clause does not acknowledge the proactive and dominant role that the Indian government already plays in the generation and reuse of data, a role grounded in existing data collection practices, in the compulsory sharing provisions of the Report, and in provisions that would continue under the Personal Data Protection Bill. In reality, the Indian government’s role is not just that of a catalyst but also that of a key player, potentially with monopolistic market power, in the domestic data market, especially given the ongoing data marketplace initiatives detailed in published policy and vision documents. [1]</p>
<h3>Clause 3.8 (iv): Introducing collective privacy</h3>
<p>The introduction of collective privacy has initiated an overdue discussion at the policy level to arrive at privacy formulations that account for limitations in the contemporary dominant social, legal and ethical paradigms of privacy premised on individual interests and personal harm. The notion of collective privacy has garnered contemporary attention with the rise of data processing technologies and business models that thrive on the collection and processing of aggregate information.</p>
<p>While the Report acknowledges that collective privacy is an evolving concept, it does not attempt to define either “collective” or what privacy could entail in the context of a collective. The postulation of collective privacy as a legally binding right is beset with challenges in both domestic and international legal frameworks. [2]</p>
<p>Central to these challenges is the representation of the group as an entity. While the Report illustrates harms that collective privacy could protect certain collectives against, these illustrated collectives are already recognised in law as rights-holding groups (society members, for example), and/or share pre-determined attributes (sexual orientation, for example).</p>
<p>The Report does not acknowledge that the very technological processes that may have rendered the articulation of collective privacy necessary are also designed to create ad-hoc, new sets of individuals or groups with shared attributes. [3] In doing so, the Report furthers an ontology of groups having intuitive, predetermined attributes that exist naturally or in law, whereas the intervention of data collection and processing technologies can determine shared group attributes afresh. Moreover, the Report treats predetermined attributes as static, ignoring a vast existing literature on the fluidity and intersectionality of the identities that individuals in groups occupy. [4] We fully appreciate the challenges these pose in determining the legal contours of collective privacy. But much of the Report’s recommendations are premised on the idea of a predetermined collective, rendering more granular exploration of these ideas urgent.</p>
<p>Further, the Report also puts forth a limited conception of privacy as a safeguard against data-related harms that may be caused to collectives. In doing so, it dilutes the conceptualisation of individual privacy as articulated in Justice K. S. Puttaswamy (Retd.) and Anr. vs Union Of India And Ors. Notwithstanding this dilution, the illustrations indicate only harms that may be caused by private actors. Any further recommendations should also envision the harms that may be caused by public data-driven processes, such as those incubated within the state machinery.</p>
<h3>Clause 4.1 (iii) and Recommendation 1: Defining Non-Personal Data</h3>
<p>The Report proposes a definition of non-personal data that includes (i) data that was never related to an identified or identifiable natural person, and (ii) aggregated, anonymised personal data such that individual events are “no longer identifiable”. In doing so, the Committee has attempted to extend protections to categories of data that fall outside the ambit of the Personal Data Protection Bill, 2019 (hereafter “PDP Bill”). The Report is cognizant of the fallible nature of anonymisation techniques but fails to indicate how these failures may be addressed.
The test of anonymisation for regarding data as non-personal requires further clarification. Anonymisation, in and of itself, is an ambiguous standard: scholarship has indicated that anonymised data may never be completely anonymous. [5] Despite this, the PDP Bill proposes a high, zero-risk threshold for the anonymisation of personal data, defined as “such irreversible process of transforming or converting personal data to a form in which a data principal cannot be identified”. On a plain reading, the Report appears to propose a lower threshold for the anonymisation requirements governing non-personal data. It is then unclear how non-personal data would differ from inferred data as described within the definition of personal data under the PDP Bill. This adds regulatory uncertainty, making it imperative for the Committee to articulate bright-line, risk-based principles and rules for the test of anonymisation. Such rules should also indicate the factors to be taken into account in determining whether anonymisation has occurred, and the timescale of reference for anonymisation outcomes. [6]</p>
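<p>The fragility of anonymisation discussed above can be illustrated with a small, hypothetical sketch (the dataset and function below are invented for illustration, not drawn from the Report): even after direct identifiers such as names are stripped, a handful of quasi-identifiers can still single out individuals, which is the re-identification risk documented in the scholarship cited at [5].</p>

```python
# Hypothetical sketch: removing direct identifiers does not guarantee
# anonymity. A combination of quasi-identifiers can still single out
# individuals in a dataset.
from collections import Counter

# Toy "anonymised" records: names removed, quasi-identifiers retained.
records = [
    {"pin_code": "110001", "birth_year": 1985, "gender": "F"},
    {"pin_code": "110001", "birth_year": 1985, "gender": "M"},
    {"pin_code": "560034", "birth_year": 1992, "gender": "F"},
    {"pin_code": "560034", "birth_year": 1992, "gender": "F"},
]

def k_anonymity(rows, quasi_identifiers):
    """Smallest group size over the given quasi-identifier combination.
    k == 1 means at least one record is uniquely identifiable."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

k = k_anonymity(records, ["pin_code", "birth_year", "gender"])
print(k)  # the first two records are each unique on gender, so k == 1
```

<p>A k of 1 over the full quasi-identifier set means at least one record maps to exactly one person. A meaningful anonymisation test would therefore need to specify which attributes count as quasi-identifiers and what minimum k (or equivalent risk measure) is acceptable, which is precisely the kind of bright-line, risk-based rule this submission asks the Committee to articulate.</p>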
<p>The recommendation also states that the data principal should "also provide consent for anonymisation and usage of this anonymized data while providing consent for collection and usage of his/her personal data". However, the framing of this recommendation fails to mention the responsibility of the data fiduciary to give the data principal notice of the intended usage of the anonymised data while seeking consent for anonymisation. The notice should clearly indicate that the data principal’s consent is based on their knowledge of how the anonymised data will be used.</p>
<h3>Clause 4.8 (i), (ii): Function of data custodians</h3>
<p>The Report does not make it clear who may perform the role of data custodians. The use of data fiduciary indicates the potential import of the definition of ‘data fiduciary’ as specified under Clause 3.13 of the PDP Bill. However, this needs to be further clarified.</p>
<h3>Clause 4.8 (iii): Data custodians’ “duty of care”</h3>
<p>As is outlined in the following section on data trustees, it can be difficult for a single entity to maintain a duty of care and act in the best interest of a community when that community consists of sub-communities that may be marginalised.</p>
<p>Further, ‘duty of care’, ‘best interest’, and ‘absence of harm’ are not sufficient standards for data processing by data custodians. Recommendations obligating data custodians to uphold the rights of data principals, including economic and fundamental rights, need to be incorporated into the framework.</p>
<h3>Clause 4.9: Data trustees</h3>
<p>The Committee’s suggestion that the “most appropriate representative body”, often either the corresponding government entity or a community body, should be the data trustee is reasonable at face value. However, in the absence of any clear principles defining what constitutes “most appropriate”, a number of potential issues can arise:</p>
<p><strong>Lack of means for selecting a data trustee:</strong> The Report notes that both private and public entities can be selected as data trustees, but offers no principles on how they are to be selected, i.e. whether they are to be directly selected by the members of a community, and if so, how. Any prescribed selection criteria or process has to keep in mind the following point regarding the potential lack of representation for marginalised communities that could arise from the direct selection of a data trustee by a group of people.</p>
<p><strong>Issues of having a single data trustee for large scale communities and when dealing with marginalised communities:</strong> The report assumes that in instances wherein a community is spread across a geographic region, or consists of multiple sub-communities, then the data trustee will be the closest shared government authority (for example, the Ministry of Health and Family Welfare, Government of India being the data trustee for data regarding diabetes among Indian citizens).</p>
<p><strong>This idea of a singular data trustee assumes that the ‘best interests’ of a community are uniform across that community. This can prove problematic, especially when dealing with data obtained from marginalised communities that forms part of a wider dataset.</strong> It is entirely possible that a smaller, disenfranchised community may have interests that are not aligned with those of the general majority. In such a situation, the Report is unclear as to whether the data trustee would have to ensure that the best interests of all groups are maintained, or whether it would be responsible only for the best interests of the largest number of people within that community.</p>
<p>There are power differentials between citizens, government agencies, and the other entities described by the Report. This places citizens at risk of abuse of power by government entities in their role as trustees, who are effectively being empowered through this policy framework rather than through a representative mechanism. It is recommended that data trustees be appointed by the relevant communities through clear and representative mechanisms. Additionally, any individual should be able to file complaints regarding the discharge of community trust by data trustees. This is necessary because any subsequent rights vested in the community can only be exercised through the data trustee, and become unenforceable in the absence of an appropriate data trustee.</p>
<p>Any legislation that arises on the basis of this report will therefore have to not only provide a means for selecting the data trustee, but also safeguards for ensuring that data collected from marginalised communities are used keeping in mind their specific best interests—with these best interests being informed through consultation with that community.</p>
<h3>Clause 4.10 (iii): Data trusts</h3>
<p>Section 4.10 (iii) notes that data custodians may voluntarily share data in these data trusts. However, it is unclear whether such sharing must be done with the express consent of the relevant data trustee.</p>
<h3>Clause 4.10 (iv): Mandatory sharing and competition</h3>
<p>The fundamental premise of a mandatory data sharing regime seems increasingly distant from its practical impacts. The EU, which earlier championed the cause, now seems reluctant to further it in the face of studies that point to the counterproductive impacts of such steps. Such mandates could apply to the huge volumes of first-party data that companies collect on their own assets, products, and services, even though such data are among the least likely to create barriers to entry or contribute to abuses of dominant positions. [7] A mandate is hence more likely to chill innovation and investment than to create a pro-competition environment. The velocity of big data also adds to the futility of such data sharing mandates. [8] It is recommended that a sectoral analysis of this mandate be undertaken instead of an overarching stipulation.</p>
<p>The Report suggests extensive data sharing without addressing the extent of the obligation on private players to submit to these requests and process them. Meta-data about the data collected may be made easily accessible under transparency mandates. However, access to the detailed underlying data will be difficult in most cases, given the current structure of entities operating in cyberspace, as evidenced in the EU by the lack of compliance with such mandates even when ordered by courts of law. Such a system can easily eliminate the comparative advantage of smaller players, enabling the growth of larger players with more money at their disposal while throttling the smaller ones. It could also have serious implications for data quality and integrity through the sharing of erroneous data, and access to superior-quality digital services in India may have to be compromised. If this regime is furthered without amendments to address these concerns, it might prove counterproductive.</p>
<h3>Clause 5.1 (iv): Grievance redressal against state’s role</h3>
<p>This clause acknowledges the vast potential for government authorities and other bodies to abuse their power as data trustees. In addition, it should provide for the setting up of impartial and accessible mechanisms through which citizens can complain against such abuse of power, and for appropriate penalties, including the removal of the data trustee.</p>
<h3>Chapter 7, Recommendation 5: Purpose of data-sharing</h3>
<p>Recommendation 5 leaves scope for “national security” as a sovereign purpose for data sharing. This continues the trend of including an overarching national security clause, as in the Personal Data Protection Bill, 2019. Provisions could instead enable access to data for sovereign purposes without such a broad definition, replacing it with constitutionally grounded terms that limit it to the confines laid down in the Constitution. This would effectively curb misuse of the provision and firmly embed the proposed regulation of non-personal data in constitutional ethos. It can also prevent future conflicts with fundamental rights.</p>
<p>Platform companies have leveraged their position in society to take on an ever-greater number of quasi-public functions, exercising new forms of unaccountable, transnational authority. It is not difficult to imagine this trend extending to non-platform companies, or being taken forward by these very entities, which also have access to a large share of non-personal data. A strict division between sovereign purposes and core public interest purposes seems difficult; it is nevertheless imperative to define both more clearly, since a broad-based definition may facilitate reduced accountability. Failing to separate government actions from sovereign purposes could entrench the power imbalance between the State and its people, while in the case of non-governmental entities it would facilitate the encroachment of government functions by private players. Neither case may serve the best interest of the data generators, or of the people at large.</p>
<h3>Clause 7.1 (i): Data needs of law enforcement</h3>
<p>Clause 7.1 (i) allows the acquisition of data governed by this framework for crime mapping, for devising anticipatory and preventive measures, and for investigations and law enforcement. While such access may be necessary for law enforcement in certain cases, it should be granted only with the express permission of a court of law. Blanket executive access creates a higher possibility of misuse by those involved in law enforcement.</p>
<h3>Clause 7.2 (iv): Use of health data as a pilot</h3>
<p>The clause suggests the use of health sector data as a pilot use-case. This is highly undesirable, because the larger part of health-sector data is inherently highly sensitive. The potential of such data, if misused, to harm data principals should act as a deterrent against using it as the pilot use-case. Moreover, the mass availability of health-sector data due to the pandemic creates further points of vulnerability that can be illegally monetised and misappropriated. It is recommended that this proposal be scrapped altogether.</p>
<h3>Clause 7.2 (iii): Power of government bodies</h3>
<p>As per this clause, data trustees or government bodies (who could also be acting as data trustees) can make requests for data sharing and place such data in appropriate data infrastructures or trusts. This presents a conflict of interest, as a government body can empower itself to be the data trustee. Such cases should be addressed within the scope of the framework.</p>
<h3>Clause 8.2 (vii): Level-playing field for all Indian actors</h3>
<p>Under this clause, the “Non-Personal Data Authority (Authority) will ensure a level playing field for all Indian actors to fulfil the objective of maximising Indian data’s value to the Indian economy”. The emphasis on ensuring a level playing field only for Indian actors, instead of a non-discriminatory platform for all concerned actors irrespective of nationality, risks violating India’s trade obligations under the WTO. WTO member states are essentially restricted from discriminating between products and services coming from different WTO Members, and between foreign and domestic products and services, unless they can avail of exceptions. There is also no clarity on what constitutes ‘Indian actors’: would a multinational corporation headquartered in a foreign State, but with subsidiaries in India, also come within the ambit of the term?</p>
<h3>Clause 8.2 (x): Composition of the Authority</h3>
<p>Clause 8.2 (x) states that the Authority will have some members with relevant industry experience. However, apart from this clause, the Report is silent on the composition of the Authority. The Report recognises that the Authority will need individuals and organisations with specialised knowledge (i.e. data governance, technology, and the latest research and innovation in the field of non-personal data); however, it does not mention the role of civil society organisations or the need for representation from such organisations in the Authority.</p>
<p>The Report frequently alludes to non-personal data being used in the best interest of the data principal; it is therefore essential that the composition of the Authority account for the inherent asymmetry of power between the data principal and the State. Considering that the Authority will also be responsible for the sharing of community data and for determining the code of conduct for such sharing, it is important that the Authority have adequate representation from civil society organisations, along with groups or individuals having the necessary technological and legal skills.</p>
<h3>Clause 8.2 (iii) and (vi): Roles and Responsibility of the Authority</h3>
<p>A majority of the datasets in the country comprise ‘mixed datasets’, i.e. datasets consisting of both personal and non-personal data. However, there is a lack of clarity about the coordination between the Data Protection Authority constituted under the PDP Bill and the Non-Personal Data Authority with regard to the regulation of such datasets. The Report refers to the European Union approach, under which the Non-Personal Data Regulation applies to the non-personal data of mixed datasets, but if the non-personal and personal parts are ‘inextricably linked’, the General Data Protection Regulation applies to the whole mixed dataset. It is unclear whether the Report proposes the same mechanism for the regulation of mixed datasets.</p>
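<p>The EU treatment of mixed datasets described above can be sketched as a simple decision rule. This is a hypothetical illustration only; the function name and regime labels are ours, not drawn from the Report or the Regulation.</p>

```python
def applicable_regime(has_personal: bool, has_non_personal: bool,
                      inextricably_linked: bool) -> str:
    """Sketch of the EU rule for mixed datasets: the Non-Personal Data
    Regulation covers the non-personal part, but if the personal and
    non-personal parts are 'inextricably linked', the GDPR governs the
    whole dataset."""
    if has_personal and has_non_personal:
        if inextricably_linked:
            return "GDPR (whole dataset)"
        return "split: GDPR / NPD Regulation"
    if has_personal:
        return "GDPR"
    return "NPD Regulation"

# A mixed dataset whose parts cannot be separated falls wholly under the GDPR.
print(applicable_regime(True, True, True))  # GDPR (whole dataset)
```

<p>The point of the sketch is that even this comparatively simple rule presupposes a test for ‘inextricably linked’, which is exactly the coordination question between the two authorities that the Report leaves open.</p>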
<p>Further, the contours of the enforcement role of the Committee should be clearly specified. Will the Committee also have penal powers, as prescribed for the Data Protection Authority under the PDP Bill? And will privacy concerns emanating from the risk of re-identification of anonymised data be addressed by the NPD Committee or by the DPA under the PDP Bill? Ideally, it should be specified that any such privacy concerns fall within the domain of the DPA, since such data has then been converted back into personal data, and the DPA would be empowered to deal with such issues.</p>
<h3>Endnotes</h3>
<p>[1] See Ministry of Health and Family Welfare. (2020). National Digital Health Blueprint. Government of India. <a href="https://main.mohfw.gov.in/sites/default/files/Final%20NDHB%20report_0.pdf">https://main.mohfw.gov.in/sites/default/files/Final%20NDHB%20report_0.pdf</a>; Tandon, A. (2019). Big Data and Reproductive Health in India: A Case Study of the Mother and Child Tracking System. <a href="https://cis-india.org/raw/big-data-reproductive-health-india-mcts">https://cis-india.org/raw/big-data-reproductive-health-india-mcts</a></p>
<p>[2] Taylor, L., Floridi, L., van der Sloot, B. eds. (2017) Group Privacy: new challenges of data technologies. Dordrecht: Springer.</p>
<p>[3] Mittelstadt, B. (2017). From Individual to Group Privacy in Big Data Analytics. Philos. Technol. 30, 475–494.</p>
<p>[4] See Taylor, L., Floridi, L., van der Sloot, B. eds. (2017) Group Privacy: new challenges of data technologies. Dordrecht: Springer; Tisne, M. (n.d). The Data Delusion: Protecting Individual Data Isn't Enough When The Harm is Collective. Stanford Cyber Policy Centre. <a href="https://cyber.fsi.stanford.edu/publication/data-delusion">https://cyber.fsi.stanford.edu/publication/data-delusion</a></p>
<p>[5] Rocher, L., Hendrickx, J.M. & de Montjoye, Y. (2019). Estimating the success of re-identifications in incomplete datasets using generative models. Nat Commun 10, 3069 . <a href="https://doi.org/10.1038/s41467-019-10933-3">https://doi.org/10.1038/s41467-019-10933-3</a></p>
<p>[6] Finck, M. & Pallas, F. (2020). They who must not be identified—distinguishing personal from non-personal data under the GDPR. International Data Privacy Law, 10 (1), 11–36. <a href="https://doi.org/10.1093/idpl/ipz026">https://doi.org/10.1093/idpl/ipz026</a></p>
<p>[7] European Commission (2020). Communication From The Commission To The European Parliament, The Council, The European Economic And Social Committee And The Committee Of The Regions: A European strategy for data. <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1593073685620&uri=CELEX:52020DC0066">https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1593073685620&uri=CELEX:52020DC0066</a></p>
<p>[8] Modrall, Jay. (2019). Antitrust risks and Big Data. Norton Rose Fullbright. <a href="https://www.nortonrosefulbright.com/en-in/knowledge/publications/64c13505/antitrust-risks-and-big-data">https://www.nortonrosefulbright.com/en-in/knowledge/publications/64c13505/antitrust-risks-and-big-data</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework'>http://editors.cis-india.org/raw/inputs-to-report-on-non-personal-data-governance-framework</a>
</p>
sumandro | Data Systems | Privacy | Researchers at Work | Digital Economy | Data Governance | Submissions | 2020-12-30 | Blog Entry

The PDP Bill 2019 Through the Lens of Privacy by Design
http://editors.cis-india.org/internet-governance/blog/the-pdp-bill-2019-through-the-lens-of-privacy-by-design
<b>This paper evaluates the PDP Bill based on the Privacy by Design approach. It examines the implications of the Bill in terms of the data ecosystem it may lead to, and of the visual interface design of digital platforms. The paper focuses on the notice and consent communication suggested by the Bill, and the role and accountability of design in its interpretation.</b>
<h2>Background</h2>
<p>The Personal Data Protection (PDP) Bill, 2019 was introduced in the Lok Sabha on December 11, 2019 by the Minister of Electronics and Information Technology. The Bill aims to provide for the protection of personal data of individuals, and establishes a Data Protection Authority for the same <a class="external-link" href="https://www.prsindia.org/billtrack/personal-data-protection-bill-2019">[1]</a>. The PDP Bill, 2019 contains several clauses that have implications for the visual design of digital products. These include the specific requirements for communication of notice and consent at various stages of the product. The Bill also introduces the Privacy by Design policy. Privacy by Design (PbD), as a concept, was proposed by Ann Cavoukian in the 1990s, with the purpose of approaching privacy from a design-thinking perspective <a class="external-link" href="https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf">[2]</a>. She describes this perspective as holistic, interdisciplinary, integrative, and innovative. The approach suggests that privacy must be incorporated into networked data systems and technologies by default <a class="external-link" href="https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf">[3]</a>. It challenges the practice of enhancing privacy as an afterthought. It expects privacy to be a default setting, and a proactive (not reactive) measure embedded into a design from its initial stage and throughout the life cycle of the product <a class="external-link" href="https://www.smashingmagazine.com/2019/04/privacy-ux-aware-design-framework/">[4]</a>. While PbD is a conceptual framework, its application can change the way digital platforms are created and the way in which people interact with them. From devising a business model to making technological decisions, PbD principles can make privacy integral to the processes and standards of a digital platform.</p>
<p>The PDP Bill states that data fiduciaries are required to prepare a Privacy by Design policy and have it certified by the Data Protection Authority. According to the Bill, the policy would contain the managerial, organisational, and business practices and technical systems designed to anticipate, identify, and avoid harm to the data principal <a class="external-link" href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf">[5]</a>. It would state whether the technology used in the processing of personal data is in accordance with certified standards. It would also comprise the ways in which privacy is protected throughout the stages of processing of personal data, and confirm that the interest of the individual is accounted for in each of these stages. Once the policy is certified by the Data Protection Authority, data fiduciaries are also required to publish it on their websites <a class="external-link" href="https://sflc.in/key-changes-personal-data-protection-bill-2019-srikrishna-committee-draft">[6]</a>. This forces data fiduciaries to envision privacy as a fundamental requirement and not an afterthought. Such a policy would have a huge impact on the way digital platforms are conceptualised, from both the technological and the design point of view. Its adoption by digital platforms would enable people to know whether their privacy is protected by the companies, and what steps are being taken for this purpose. Besides the explicit Privacy by Design policy, the PDP Bill, 2019 also recommends regulations for data minimisation, the establishment of the Data Protection Authority (DPA), and the development of a consent framework. These steps are also part of the Privacy by Design approach.</p>
<p>This paper evaluates the PDP Bill based on the Privacy by Design approach. The Bill’s scope includes both the conceptual and technological aspects of a digital platform, as well as the interface that the individual using the platform faces. The paper will hence analyse how the PbD approach is reflected in both these aspects. At the conceptual level, it will look at the data ecosystem that the Bill unwittingly creates, and at the interface level, it will critically analyse the Bill’s implications for notice and consent communication in digital products. This includes the several points of communication, or touchpoints, between a company and an individual using its service, as dictated by the Bill, and how they would translate into visual design. Visual design forms an integral part of digital platforms: it is the way in which platforms interact with individuals, and the choices individuals make are largely driven by the visual structuring and presentation of information on these platforms. Presently, the interface design of several platforms is used to perpetuate unethical data practices in the form of dark patterns. Dark patterns are deceptive user interface interactions, designed to mislead or trick users into doing something they don’t want to do <a class="external-link" href="https://uxdesign.cc/dark-patterns-in-ux-design-7009a83b233c">[7]</a>. The design of the notice and consent touchpoints can significantly influence the enforcement of this Bill, and how it benefits individuals. Moreover, digital platforms may technically follow the regulations yet still be manipulative through their interface design. Thus, the role and accountability of design becomes crucial in the interpretation of data protection regulations.</p>
<p> </p>
<p>The full paper can be read <a href="http://editors.cis-india.org/internet-governance/the-pdp-bill-2019-through-the-lens-of-privacy-by-design/at_download/file" class="external-link">here</a>.</p>
<p>[1] <a class="external-link" href="https://www.prsindia.org/billtrack/personal-data-protection-bill-2019">https://www.prsindia.org/billtrack/personal-data-protection-bill-2019</a> </p>
<p>[2] <a class="external-link" href="https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf">https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf</a></p>
<p>[3] <a class="external-link" href="https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf">https://iab.org/wp-content/IAB-uploads/2011/03/fred_carter.pdf</a></p>
<p>[4] <a class="external-link" href="https://www.smashingmagazine.com/2019/04/privacy-ux-aware-design-framework/">https://www.smashingmagazine.com/2019/04/privacy-ux-aware-design-framework/</a></p>
<p>[5] <a class="external-link" href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf">http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf</a></p>
<p>[6] <a class="external-link" href="https://sflc.in/key-changes-personal-data-protection-bill-2019-srikrishna-committee-draft">https://sflc.in/key-changes-personal-data-protection-bill-2019-srikrishna-committee-draft</a></p>
<p>[7] <a class="external-link" href="https://uxdesign.cc/dark-patterns-in-ux-design-7009a83b233c">https://uxdesign.cc/dark-patterns-in-ux-design-7009a83b233c</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-pdp-bill-2019-through-the-lens-of-privacy-by-design'>http://editors.cis-india.org/internet-governance/blog/the-pdp-bill-2019-through-the-lens-of-privacy-by-design</a>
</p>
No publisher · Saumyaa Naidu, Akash Sheshadri, Shweta Mohandas, and Pranav M Bidare; Edited by Arindrajit Basu, Shweta Reddy; With inputs from Amber Sinha · Design · Internet Governance · Data Protection · Privacy · 2020-11-13T07:51:03Z · Blog Entry
Fundamental Right to Privacy — Three Years of the Puttaswamy Judgment
http://editors.cis-india.org/internet-governance/blog/fundamental-right-to-privacy-three-years-of-the-puttaswamy-judgment
<b></b>
<p dir="ltr">Today marks three years since the Supreme Court of India recognised the fundamental right to privacy, but the ideals laid down in the Puttaswamy Judgment are far from being completely realized. Through our research, we invite you to better understand the judgment and its implications, and take stock of recent issues pertaining to privacy. </p>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Amber Sinha dissects the Puttaswamy Judgment through an analysis of the sources, scope and structure of the right, and its possible limitations. [<a href="https://cis-india.org/internet-governance/blog/the-fundamental-right-to-privacy-an-analysis">link</a>]</p>
</li></ol>
<ol start="2"><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Through a visual guide to the fundamental right to privacy, Amber Sinha and Pooja Saxena trace how courts in India have viewed the right to privacy since Independence, explain how key legal questions were resolved in the Puttaswamy Judgement, and provide an account of the four dimensions of privacy — space, body, information and choice — recognized by the Supreme Court. [<a href="https://cis-india.org/internet-governance/files/amber-sinha-and-pooja-saxena-the-fundamental-right-to-privacy-a-visual-guide/view">link</a>]</p>
</li></ol>
<ol start="3"><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Based on publicly available submissions, press statements, and other media reports, Arindrajit Basu and Amber Sinha track the political evolution of the data protection ecosystem in India, on EPW Engage. They discuss how this has, and will continue to impact legislative and policy developments. [<a href="https://www.epw.in/engage/article/politics-indias-data-protection-ecosystem">link</a>] </p>
</li></ol>
<ol start="4"><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">For the AI Policy Exchange, Arindrajit Basu and Siddharth Sonkar examine the Automated Facial Recognition Systems (AFRS), and define the key legal and policy questions related to privacy concerns around the adoption of AFRS by governments around the world. [<a href="https://aipolicyexchange.org/2019/12/26/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns/">link</a>]</p>
</li></ol>
<ol start="5"><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Over the past decade, reproductive health programmes in India have been digitising extensive data about pregnant women. In partnership with Privacy International, we studied the Mother and Child Tracking system (MCTS), and Ambika Tandon presents the impact on the privacy of mothers and children in the country. [<a href="https://cis-india.org/internet-governance/blog/privacy-international-ambika-tandon-october-17-2019-mother-and-child-tracking-system-understanding-data-trail-indian-healthcare">link</a>] </p>
</li></ol>
<ol start="6"><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">While the right to privacy can be used to protect oneself from state surveillance, Mira Swaminathan and Shubhika Saluja write about the equally crucial problem of lateral surveillance — surveillance that happens between individuals, and within neighbourhoods, and communities — with a focus on this issue during the COVID-19 crisis. [<a href="https://cis-india.org/internet-governance/blog/essay-watching-corona-or-neighbours-introducing-2018lateral-surveillance2019-during-covid201919">link</a>]</p>
</li></ol>
<ol start="7"><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Finally, take a dive into the archives of the Centre for Internet and Society to read our work, which was cited in the Puttaswamy judgment — essays by Ashna Ashesh, Vidushi Marda and Bhairav Acharya that challenged the notion that privacy is an inherently Western concept by locating constructs of privacy in classical Hindu [<a href="https://cis-india.org/internet-governance/blog/loading-constructs-of-privacy-within-classical-hindu-law">link</a>] and Islamic [<a href="https://cis-india.org/internet-governance/blog/identifying-aspects-of-privacy-in-islamic-law">link</a>] law; and Acharya’s article in the Economic and Political Weekly, which highlighted the need for privacy jurisprudence to reflect theoretical clarity and be sensitive to unique Indian contexts [<a href="https://cis-india.org/internet-governance/blog/economic-and-political-weekly-bhairav-acharya-may-30-2015-four-parts-of-privacy-in-india">link</a>].</p>
</li></ol>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/fundamental-right-to-privacy-three-years-of-the-puttaswamy-judgment'>http://editors.cis-india.org/internet-governance/blog/fundamental-right-to-privacy-three-years-of-the-puttaswamy-judgment</a>
</p>
No publisher · pranav · internet governance · Internet Governance · Privacy · 2020-08-24T07:46:10Z · Blog Entry
A Compilation of Research on the PDP Bill
http://editors.cis-india.org/internet-governance/blog/compilation-of-research-on-data-protection
<b>The most recent step in India’s initiative to create an effective and comprehensive data protection regime was the call for comments on the Personal Data Protection Bill, 2019, which closed last month. Leading up to the comments, CIS published numerous research pieces with the goal of providing a comprehensive overview of how this legislation would position India within the global data protection landscape and how the local situation has developed, as well as analysing its impact on citizens’ rights.</b>
<p> </p>
<p>In addition to general and clause-by-clause comments and recommendations, we
have compiled an annotated version of the Personal Data Protection
Bill, which lays out our <a class="external-link" href="https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019">commentary</a> in an easy-to-follow format.</p>
<p> </p>
<p><img src="https://cis-india.org/internet-governance/pdp-bill-compilation-post-image/" alt="Compilation of research on the PDP Bill" width="100%" /></p>
<p> </p>
<p>Below, you can find our other recent research on Data Protection:</p>
<p> </p>
<ul><li>Pallavi Bedi has put together a <a class="external-link" href="https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019">note</a> on the Divergence between EU’s General Data Protection Regulation (GDPR) and the Personal Data Protection Bill.</li></ul>
<div> </div>
<ul><li>In addition, Pallavi has also <a class="external-link" href="https://cis-india.org/internet-governance/blog/comparison-of-the-personal-data-protection-bill-with-the-general-data-protection-regulation-and-the-california-consumer-protection-act-2">contrasted</a> the Personal Data Protection Bill with the GDPR and California Consumer Protection Act, in the contexts of jurisdiction and scope, rights of the data principal, obligations of data fiduciaries, exemptions, data protection authority, and breach of personal data. </li></ul>
<div> </div>
<ul><li>On IAPP’s blog <em>Privacy Perspectives</em>, D. Shweta Reddy has <a class="external-link" href="https://iapp.org/news/a/grade-sheet-for-indias-adequacy-status/">assessed</a> whether the Personal Data Protection Bill 2019 is sufficient for India to receive adequacy status from the EU.</li></ul>
<div> </div>
<ul><li>Along with Justin Sherman, Arindrajit Basu has <a class="external-link" href="https://www.lawfareblog.com/key-global-takeaways-indias-revised-personal-data-protection-bill">outlined</a> the key global takeaways from the Personal Data Protection Bill 2019 on <em>Lawfare</em>.</li></ul>
<div> </div>
<ul><li>On <em>The Diplomat</em>, Arindrajit has also <a class="external-link" href="https://thediplomat.com/2020/01/the-retreat-of-the-data-localization-brigade-india-indonesia-and-vietnam/">traced</a> the narrowing localization provisions in India, as well as Vietnam and Indonesia, and studied the actors and geopolitical tussle that has shaped these provisions.</li></ul>
<div> </div>
<ul><li>Through a string of publicly available submissions, press statements, and other media reports, Arindrajit and Amber Sinha have <a class="external-link" href="https://www.epw.in/engage/article/politics-indias-data-protection-ecosystem">tracked</a> the political evolution of the data protection ecosystem in India, and how this has, and will continue to impact legislative and policy developments on <em>EPW Engage</em>.</li></ul>
<div> </div>
<ul><li>Gurshabad Grover and Tanaya Rajwade have <a class="external-link" href="https://thewire.in/tech/indias-privacy-bill-regulates-social-media-platforms">written</a> on <em>The Wire</em> about how the Personal Data Protection Bill regulates social media.</li></ul>
<div> </div>
<ul><li>Amber was also a guest on <em>Suno India’s <a class="external-link" href="https://www.sunoindia.in/cyber-democracy/personal-data-protection-bill-what-does-it-mean-for-your-right-to-privacy/">Cyber Democracy podcast</a></em>, with Srinivas Kodali, to discuss how the latest version of the Personal Data Protection Bill will impact the right to privacy.
</li></ul>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/compilation-of-research-on-data-protection'>http://editors.cis-india.org/internet-governance/blog/compilation-of-research-on-data-protection</a>
</p>
No publisher · pranav · internet governance · Internet Governance · Data Protection · Privacy · 2020-03-05T08:04:24Z · Blog Entry
Divergence between the General Data Protection Regulation and the Personal Data Protection Bill, 2019
http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019
<b></b>
<p>Our note on the divergence between the General Data Protection Regulation and the Personal Data Protection Bill can be downloaded as a PDF <a href="http://editors.cis-india.org/internet-governance/divergence-between-the-gdpr-and-pdp-bill-2019" class="internal-link" title="Divergence between the GDPR and PDP Bill 2019">here</a>.</p>
<p>The European Union’s General Data Protection Regulation (GDPR), replacing the 1995 EU Data Protection Directive, came into effect in May 2018. It harmonises data protection regulations across the European Union. In India, the Ministry of Electronics and Information Technology had constituted a Committee of Experts (chaired by Justice Srikrishna) to frame recommendations for a data protection framework in India. The Committee submitted its report and a draft Personal Data Protection Bill in July 2018 (2018 Bill). Public comments were sought on the bill till October 2018. The Central Government revised the Bill and introduced the revised version of the Personal Data Protection Bill (PDP Bill) on December 11, 2019 in the Lok Sabha.</p>
<p>The PDP Bill has incorporated certain aspects of the GDPR, such as requirements for notice to be given to the data principal, consent for the processing of data, and the establishment of a data protection authority. However, there are some differences, and in this note we have highlighted the areas of divergence between the two. The note covers only provisions common to the GDPR and the PDP Bill; it does not include the provisions on (i) the Appellate Tribunal, (ii) Finance, Account and Audit; and (iii) Non-Personal Data.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019'>http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019</a>
</p>
No publisher · Pallavi Bedi · Internet Governance · Data Protection · Privacy · 2020-02-21T11:08:50Z · Blog Entry
Announcing Selected Researchers: Welfare, Gender, and Surveillance
http://editors.cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance
<b>We published a Call for Researchers on January 10, 2020, to invite applications from researchers interested in writing a narrative essay that interrogates the modes of surveillance that people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations are put under as they seek sexual and reproductive health (SRH) services in India. We received 29 applications from over 10 locations in India in response to the call, and are truly overwhelmed by and grateful for this interest and support. We eventually selected applications by three researchers that we felt aligned best with the specific objectives of the project. Please find below brief profile notes of the selected researchers.</b>
<p> </p>
<h4>Call for Researchers: <a href="https://cis-india.org/jobs/researchers-welfare-gender-surveillance-call" target="_blank">URL</a></h4>
<hr />
<h2>Kaushal Bodwal</h2>
<p>Kaushal is pursuing his MPhil in Sociology at the Delhi School of Economics, University of Delhi. He completed his Master's in Sociology at the Centre for the Study of Social Systems, Jawaharlal Nehru University, after earning a BSc honours degree in Biomedical Sciences from Delhi University. He is one of the founding members of Hasratein: a queer collective, New Delhi. He has been an active spokesperson for queer and trans rights in India and has taken part in a number of panel discussions on the Trans Act 2019 on various campuses. He has also delivered a lecture series on Colonialism and Medicine at Ambedkar University, Kashmiri Gate, Delhi. His areas of interest are the sociology of medicine, gender and medicine, sexuality, religion and biomedical science, and intersex studies.</p>
<p><a href="https://kafila.online/2019/08/27/queerness-as-disease-a-continuing-narrative-in-21st-century-india-kaushal-bodwal/" target="_blank">Queerness as disease – a continuing narrative in 21st century India</a>, Kafila, 27 August 2019</p>
<p><a href="https://www.firstpost.com/india/what-it-means-to-be-a-queer-and-live-under-regime-bent-on-remaking-india-on-terms-of-their-tradition-writes-queer-scholar-trolled-by-right-wing-7915391.html" target="_blank">What it means to be queer under a regime bent on remaking India on its own ideological terms</a>, Firstpost, 17 January 2020</p>
<h2>Rosamma Thomas</h2>
<p>Rosamma has worked both as a reporter and as an editor of news reports with newspapers. She currently writes reports for NGOs while also undertaking freelance reporting assignments. She is based in Pune.</p>
<p><a href="http://iced.cag.gov.in/wp-content/uploads/2016-17/NTP%2007/article.pdf" target="_blank">India's mining state steps up fight to rein in killer silicosis</a>, The Times of India, 29 June 2016</p>
<p><a href="https://www.newsclick.in/doctor-may-have-found-early-marker-silicosis-who-will-fund-him" target="_blank">Doctor may have found early marker for silicosis, but who will fund him?</a>, Newsclick, 18 July 2019</p>
<p><a href="https://www.newsclick.in/Asbestos-Poisoning-Raghunath-Manwar-Fight-Safer-Work-Conditions" target="_blank">Asbestos poisoning: Raghunath Manwar’s fight for safer work conditions</a>, Newsclick, 9 January 2020</p>
<h2>Shreya Ila Anasuya</h2>
<p>Shreya is a writer, editor, journalist and performance artist currently based in Calcutta. Her fiction explores the places where myth, memory, history and the performing arts meet. As a journalist, her work explores gender, sexuality, politics, culture and history. She has been published in <em>The Wire</em>, <em>Caravan</em>, <em>Scroll</em>, <em>Mint Lounge</em>, <em>Deep Dives</em>, <em>GenderIT</em>, <em>Helter Skelter</em>, and many more. She is the editor of the digital publication <a href="https://medium.com/skin-stories" target="_blank"><em>Skin Stories</em></a>, housed at the non-profit Point of View. She is the writer and narrator of ‘Gul - a story in text, song and dance’ which has been performed in several cities in India. She was a Felix Scholar at SOAS, University of London, from where she has an MA in Anthropology. For a full portfolio, please click <a href="http://porterfolio.net/dervishdancing" target="_blank">here</a> or visit her <a href="https://www.shreyailaanasuya.com/" target="_blank">website</a>.</p>
<hr />
<p>This project is led by Ambika Tandon, Aayush Rathi, and Sumandro Chattapadhyay at the Centre for Internet and Society, and is supported by a grant from Privacy International.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance'>http://editors.cis-india.org/raw/announcing-selected-researchers-welfare-gender-and-surveillance</a>
</p>
No publisher · sumandro · Welfare Governance · Privacy · Gender · Research · Gender, Welfare, and Privacy · Researchers at Work · 2020-02-13T15:04:24Z · Blog Entry
Comments to the Personal Data Protection Bill 2019
http://editors.cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019
<b>The Personal Data Protection Bill, 2019 was introduced in the Lok Sabha on December 11, 2019. </b>
<p> </p>
<h4>Please view our general comments below, or download as PDF <a href="http://editors.cis-india.org/accessibility/blog/cis-general-comments-to-the-pdp-bill-2019" class="internal-link" title="CIS' General Comments to the PDP Bill 2019">here</a>.</h4>
<h4>Our comments and recommendations can be downloaded as PDF <a href="http://editors.cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019" class="internal-link" title="CIS Comments PDP Bill 2019">here</a>.</h4>
<h4>We have also prepared an annotated version of the Bill, where our detailed comments and recommendations can be viewed alongside the Bill, available as PDF <a href="http://editors.cis-india.org/accessibility/blog/annotated-ver-pdp-bill-2019" class="internal-link" title="Annotated ver PDP Bill 2019">here</a>.</h4>
<hr />
<h2>General Comments</h2>
<h3>1. Executive notification cannot abrogate fundamental rights <br /></h3>
<p>In 2017, the Supreme Court in K.S. Puttaswamy v Union of India [1] held the right to privacy to be a fundamental right. While this right is subject to reasonable restrictions, the restrictions have to meet a threefold requirement, namely (i) existence of a law; (ii) legitimate state aim; and (iii) proportionality. Under the 2018 Bill, the exemption of government agencies processing personal data from the provisions of the Bill in the ‘interest of the security of the State’ [2] was subject to a law being passed by Parliament. However, under Clause 35 of the present Bill, the Central Government is merely required to pass a written order exempting the government agency from the provisions of the Bill. Any restriction on the right to privacy will have to comply with the conditions prescribed in Puttaswamy I. An executive order issued by the Central Government authorising any agency of the government to process personal data does not satisfy the first requirement laid down by the Supreme Court in Puttaswamy I, as it is not a law passed by Parliament. The Supreme Court, while deciding upon the validity of Aadhaar in K.S. Puttaswamy v Union of India [3], noted that “an executive notification does not satisfy the requirement of a valid law contemplated under Puttaswamy. A valid law in this case would mean a law passed by Parliament, which is just, fair and reasonable. Any encroachment upon the fundamental right cannot be sustained by an executive notification.”</p>
<p> </p>
<h3>2. Exemptions under Clause 35 do not comply with the legitimacy and proportionality test</h3>
<p>The lead judgement in Puttaswamy I, while formulating the threefold test, held that restraints on privacy emanate from the procedural and content-based mandate of Article 21 [4]. The Supreme Court in Maneka Gandhi v Union of India [5] had clearly established that “mere prescription of some kind of procedure cannot ever meet the mandate of Article 21. The procedure prescribed by law has to be fair, just and reasonable, not fanciful, oppressive and arbitrary” [6]. The existence of a law is the first requirement; the second requirement is that of a ‘legitimate state aim’. As per the lead judgement, this requirement ensures that “the nature and content of the law which imposes the restriction falls within the zone of reasonableness mandated by Article 14, which is a guarantee against arbitrary state action” [7]. It is established that for a provision which confers upon the executive or administrative authority discretionary powers to be regarded as non-arbitrary, the provision should lay down clear and specific guidelines for the executive to exercise the power [8]. The third test to be complied with is that the restriction should be ‘proportionate’, i.e. the means adopted by the legislature are proportional to the object and needs sought to be fulfilled by the law. The Supreme Court in Modern Dental College & Research Centre v State of Madhya Pradesh [9] specified the components of the proportionality standard —</p>
<ol><li>A measure restricting a right must have a legitimate goal;</li>
<li>It must be a suitable means of furthering this goal;</li>
<li>There must not be any less restrictive, but equally effective alternative; and</li>
<li>The measure must not have any disproportionate impact on the right holder.</li></ol>
<p>Clause 35 provides extensive grounds for the Central Government to exempt any agency from the requirements of the Bill, but does not specify the procedure to be followed by the agency while processing personal data under this provision. It merely states that the ‘procedure, safeguards and oversight mechanism to be followed’ will be prescribed in the rules. The wide powers conferred on the Central Government without clearly specifying the procedure may be contrary to the threefold test laid down in Puttaswamy I, as it is difficult to ascertain whether a legitimate or proportionate objective is being fulfilled [10].</p>
<p> </p>
<h3>3. Limited powers of Data Protection Authority in comparison with the Central Government</h3>
<p>In comparison with the last version of the Bill, the Personal Data Protection Bill, 2018, prepared by the Committee of Experts led by Justice Srikrishna, the present Bill curtails the powers of the Data Protection Authority (Authority) to be created under it. Powers and functions that were originally intended to be performed by the Authority have now been allocated to the Central Government. For example:</p>
<ol><li>In the 2018 Bill, the Authority had the power to notify further categories of sensitive personal data. Under the present Bill, the Central Government in consultation with the sectoral regulators has been conferred the power to do so.</li>
<li>Under the 2018 Bill, the Authority had the sole power to determine and notify significant data fiduciaries; under the present Bill, however, the Central Government, in consultation with the Authority, has been given the power to notify social media intermediaries as significant data fiduciaries.</li></ol>
<p>In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate and resources. The political nature of personal data also requires that the governance of data, particularly the rule-making and adjudicatory functions performed by the Authority, is independent of the Executive.</p>
<p> </p>
<h3>4. No clarity on data sandbox</h3>
<p>The Bill contemplates a sandbox for “innovation in artificial intelligence, machine-learning or any other emerging technology in public interest.” A data sandbox is a non-operational environment where an analyst can model and manipulate data inside the data management system. Data sandboxes have been envisioned as a secure area where only a copy of the company’s or participant companies’ data is located [11]; in essence, a scalable platform that can be used to explore an enterprise’s information sets. Regulatory sandboxes, on the other hand, are controlled environments where firms can introduce innovations to a limited customer base within a relaxed regulatory framework, after which they may be allowed entry into the larger market upon meeting certain conditions. This purportedly encourages innovation by lowering entry barriers and protecting newer entrants from unnecessary and burdensome regulation. Regulatory sandboxes can be interpreted as a form of responsive regulation by governments seeking to encourage innovation: selected companies are permitted to experiment with solutions in an environment relatively free of most of the cumbersome regulations they would ordinarily be subject to, while still subject to appropriate safeguards and regulatory requirements. Such sandboxes, however, ordinarily relax burdens related to high barriers to entry (such as capital requirements for financial and banking companies) or regulatory costs. In this Bill, the relaxing of data protection provisions for data fiduciaries would instead lead to restrictions on the privacy of individuals. Limiting a fundamental right on the ground of ‘fostering innovation’ is not a constitutionally tenable position, and contradicts the primary objectives of a data protection law.</p>
<p> </p>
<h3>5. The primacy of ‘harm’ in the Bill ought to be reconsidered</h3>
<p>While a harms-based approach is necessary for data protection frameworks, such approaches should be restricted to the positive obligations, penal provisions and responsive regulation of the Authority. The Bill does not provide any guidance on either the interpretation of the term ‘harm’ [12] or the various activities covered within the definition of the term. Terms such as ‘loss of reputation or humiliation’ and ‘any discriminatory treatment’ set a subjective standard and are open to varied interpretations. This ambiguity in the definition will make it difficult for the data principal to demonstrate harm and for the DPA to take necessary action, as several provisions are based upon harm being caused or likely to be caused. Some of the significant provisions where ‘harm’ is a precondition for the provision to come into effect are —</p>
<ol><li>Clause 25: The data fiduciary is required to notify the Authority about a breach of personal data processed by it, if such breach is likely to cause harm to any data principal. The Authority, after taking into account the severity of the harm that may be caused to the data principal, will determine whether the data principal should be notified about the breach.</li>
<li>Clause 32 (2): A data principal can file a complaint with the data fiduciary for a contravention of any of the provisions of the Act which has caused or is likely to cause ‘harm’ to the data principal.</li>
<li>Clause 64 (1): A data principal who has suffered harm as a result of any violation of the provisions of the Act by a data fiduciary has the right to seek compensation from the data fiduciary.</li>
<li>Clause 16 (5): The guardian data fiduciary is barred from profiling, tracking or undertaking targeted advertising directed at children, and from undertaking any other processing of personal data that can cause significant harm to the child.</li></ol>
<p> </p>
<h3>6. Non-personal data should be outside the scope of this Bill</h3>
<p>Clause 91 (1) states that the Act does not prevent the Central Government from framing a policy for the digital economy, in so far as such policy does not govern personal data. The Central Government can, in consultation with the Authority, direct any data fiduciary to provide any anonymised personal data or other non-personal data to enable better targeting of delivery of services or formulation of evidence-based policies, in any manner as may be prescribed. It is concerning that a data protection bill has specifically carved out an exception for the Central Government to frame policies for the digital economy, and it seems to indicate that the government plans to freely use any and all anonymised and/or non-personal data resting with any data fiduciary that falls under the ambit of the Bill to support the digital economy, including for its growth, security, integrity, and prevention of misuse. It is unclear how the government, in practice, will be able to compel organisations to share this data. Further, there is a lack of clarity on the contours of the definition of non-personal data, and the Bill does not define the term. It is also unclear whether the Central Government can compel the data fiduciary to transfer or share all forms of non-personal data, and what rights and obligations data fiduciaries and data principals have over such data. Anonymised data refers to data which has ‘irreversibly’ been converted into a form in which the data principal cannot be identified. However, as several instances have shown, ‘irreversible’ anonymisation is not possible: in the United States, the home addresses of taxi drivers were uncovered, and in Australia, individual health records were mined from anonymised medical bills [13]. In September 2019, the Ministry of Electronics and Information Technology constituted an expert committee under the chairmanship of Kris Gopalakrishnan to study various issues relating to non-personal data and to deliberate over a data governance framework for the regulation of such data. The provision should be deleted, and the scope of the Bill should be limited to the protection of personal data and to providing a framework for the protection of individual privacy. Until the report of the expert committee is published, the Central Government should not frame any law or regulation on the access and monetisation of non-personal/anonymised data, nor create a blanket provision allowing it to request such data from any data fiduciary that falls within the ambit of the Bill. If the government wishes to use data resting with a data fiduciary, it must do so on a case-by-case basis and under formal and legal agreements with each data fiduciary.</p>
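The re-identification risk noted above can be illustrated with a minimal, hypothetical linkage attack: joining an “anonymised” dataset to a public register on shared quasi-identifiers (here PIN code, date of birth and gender) recovers identities. All records, field names and values below are invented for illustration; real attacks, such as those cited in [13], use the same join logic at scale.

```python
# Hypothetical linkage attack: re-identifying an "anonymised" dataset
# by joining it against a public register on shared quasi-identifiers.
# All records below are invented for illustration.

anonymised_health = [
    {"pin": "560001", "dob": "1984-03-12", "gender": "F", "diagnosis": "diabetes"},
    {"pin": "560034", "dob": "1990-07-01", "gender": "M", "diagnosis": "asthma"},
]

public_register = [
    {"name": "A. Kumar", "pin": "560034", "dob": "1990-07-01", "gender": "M"},
    {"name": "R. Iyer",  "pin": "560001", "dob": "1984-03-12", "gender": "F"},
    {"name": "S. Das",   "pin": "560001", "dob": "1975-11-30", "gender": "M"},
]

def reidentify(health_rows, register_rows):
    """Join the two datasets on the quasi-identifiers (pin, dob, gender)."""
    index = {(r["pin"], r["dob"], r["gender"]): r["name"] for r in register_rows}
    matches = []
    for row in health_rows:
        key = (row["pin"], row["dob"], row["gender"])
        if key in index:
            # The "anonymised" diagnosis is now tied back to a named person.
            matches.append({"name": index[key], "diagnosis": row["diagnosis"]})
    return matches

print(reidentify(anonymised_health, public_register))
```

The point of the sketch is that no field in the “anonymised” dataset is a direct identifier, yet the combination of quasi-identifiers is unique enough to link every record to a name, which is why the note argues that anonymisation cannot be treated as irreversible.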
<p> </p>
<h3>7. Steps towards greater decentralisation of power</h3>
<p>We propose the following steps towards greater decentralisation of powers and devolved jurisdiction —</p>
<ol><li>Creation of State Data Protection Authorities: A single centralised body may not be the appropriate form for such a regulator. We propose that, on the lines of the central and state commissions under the Right to Information Act, 2005, state data protection authorities be set up, which would be in a position to respond to local complaints and exercise jurisdiction over entities within their territories.</li>
<li>More involvement of industry bodies and civil society actors: In order to lessen the burden on the data protection authorities, it is necessary that there is active engagement with industry bodies, sectoral regulators, and civil society bodies engaged in privacy research. Currently, the Bill provides for the involvement of industry or trade associations, associations representing the interests of data principals, sectoral regulators or statutory Authorities, or departments or ministries of the Central or State Government in the formulation of codes of practice. However, it would be useful to also have more active participation of industry associations and civil society bodies in activities such as promoting awareness among data fiduciaries of their obligations under this Act, promoting measures, and undertaking research for innovation in the field of protection of personal data.</li></ol>
<p> </p>
<h3>8. The Authority must be empowered to exercise responsive regulation</h3>
<p>In a country like India, the challenge is to move rapidly from a state of little or no data protection law, and consequently an abysmal state of data privacy practices, to a strong data protection regulation and a powerful regulator capable of enabling robust data privacy practices. This requires a system of mechanisms supportive of the stakeholders in the data ecosystem, as well as systemic measures that enable the proactive detection of breaches. Further, keeping in mind the limited regulatory capacity in India, there is a need for the Authority to make use of different kinds of inexpensive and innovative strategies. We recommend that the following additional powers for the Authority be clearly spelt out in the Bill —</p>
<ol><li>Informal Guidance: It would be useful for the Authority to set up a mechanism on the lines of the Securities and Exchange Board of India (SEBI)’s Informal Guidance Scheme, which enables regulated entities to approach the Authority for non-binding advice on the position of law. Given that this is the first omnibus data protection law in India, and there is very little Indian jurisprudence on the subject, it would be extremely useful for regulated entities to get guidance from the regulator.</li>
<li>Power to name and shame: “Naming and shaming” refers to the practice of a DPA making public the names of organisations that have seriously contravened data protection legislation. The UK ICO and other DPAs recognise the power of publicity, as evidenced by their willingness to co-operate with the media. The ICO does not simply post monetary penalty notices (MPNs, or fines) on its website for journalists to find, but frequently issues press releases, briefs journalists, and uses social media. The ICO’s publicity statement on communicating enforcement activities states that the “ICO aims to get media coverage for enforcement activities.”</li>
<li>Undertakings: The UK ICO has also leveraged the threat of fines into an alternative enforcement mechanism, seeking contractual undertakings from data controllers to take certain remedial steps. Undertakings have significant advantages for the regulator. Since an undertaking is a more “co-operative” solution, a data controller is less likely to challenge it. An undertaking is simpler and easier to put in place, and the Authority can impose one quickly, as opposed to legal proceedings, which take longer.</li></ol>
<p> </p>
<h3>9. No clear roadmap for the implementation of the Bill</h3>
<p>The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified [14]. It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. The present Bill does not specify any such blueprint; it provides no details on when the Bill will be notified or the time period within which the Authority shall be established and specific rules and regulations notified. Considering that 25 provisions have been deferred to rules to be framed by the Central Government, and a further 19 provisions have been deferred to regulations to be notified by the Authority, the absence and/or delayed notification of such rules and regulations will impact the effective functioning of the Bill. The absence of any sunrise or sunset provision may disincentivise political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud; however, it was virtually a defunct body from 2011 onwards, when its last chairperson retired, and it was eventually merged with the Telecom Dispute Settlement and Appellate Tribunal in 2017. We recommend that the Bill clearly lay out a time period for the implementation of its different provisions, especially a time frame for the establishment of the Authority. This is important to give full and effective effect to the right to privacy of the individual. It is also important to ensure that individuals have an effective mechanism to enforce the right and seek recourse in case of any breach of obligations by the data fiduciaries.</p>
<p>For offences, we suggest a system of mail boxing, where provisions and punishments are enforced in a staggered manner for a period until the fiduciaries are aligned with the provisions of the Act. The Authority must ensure that data principals and fiduciaries have sufficient awareness of the provisions of this Bill before the provisions for punishment are brought into force. This will allow data fiduciaries to align their practices with the new legislation, and the Authority will also have time to define and determine the provisions that the Bill has left to it to define. Additionally, penalties for offences must initially be enforced in a staggered manner, combined with provisions such as warnings, so that first-time and mistaken offenders do not pay a high price. This will allay the concerns of smaller companies and startups that might avoid processing data for fear of paying penalties for offences.</p>
<p> </p>
<h3>10. Lack of interoperability</h3>
<p>In its current form, a number of the provisions in the Bill will make it difficult for India’s framework to be interoperable with other frameworks globally and in the region. For example, differences between the draft Bill and the GDPR can be found in the grounds for processing, data localisation frameworks, the framework for cross-border transfers, definitions of sensitive personal data, the inclusion of the undefined category of ‘critical data’, and the roles of the Authority and the Central Government.</p>
<p> </p>
<h3>11. Legal Uncertainty</h3>
<p>In its current structure, there are a number of provisions in the Bill that, when implemented, run the risk of creating an environment of legal uncertainty. These include: the lack of a definition of critical data, the lack of clarity in the interpretation of the terms ‘harm’ and ‘significant harm’, the ability of the government to define further categories of sensitive personal data, the inclusion of requirements for ‘social media intermediaries’, the inclusion of ‘non-personal data’, the framing of the requirements for data transfers, the bar on processing of certain forms of biometric data as defined by the Central Government, the relationship between a consent manager and other data fiduciaries, the inclusion of an AI sandbox, and the definition of the state. To ensure the greatest protection of individual privacy rights and personal data while also enabling innovation, it is important that any data protection framework is structured and drafted to provide as much legal certainty as possible.</p>
<p> </p>
<h3>Endnotes</h3>
<p>1. (2017) 10 SCC 641 (“Puttaswamy I”).</p>
<p>2. Clause 42(1) of the 2018 Bill states that “Processing of personal data in the interests of the security of the State shall not be permitted unless it is authorised pursuant to a law, and is in accordance with the procedure established by such law, made by Parliament and is necessary for, and proportionate to such interests being achieved.”</p>
<p>3. (2019) 1 SCC 1 (“Puttaswamy II”)</p>
<p>4. Puttaswamy I, supra, para 180.</p>
<p>5. (1978) 1 SCC 248.</p>
<p>6. Ibid para 48.</p>
<p>7. Puttaswamy I supra para 180.</p>
<p>8. State of W.B. v. Anwar Ali Sarkar, 1952 SCR 284; Satwant Singh Sawhney v A.P.O AIR 1967 SC1836.</p>
<p>9. (2016)7 SCC 353.</p>
<p>10. Dvara Research “Initial Comments of Dvara Research dated 16 January 2020 on the Personal Data Protection Bill, 2019 introduced in Lok Sabha on 11 December 2019”, January 2020, https://www.dvara.com/blog/2020/01/17/our-initial-comments-on-the-personal-data-protection-bill-2019/ (“Dvara Research”).</p>
<p>11. “A Data Sandbox for Your Company”, Terrific Data, last accessed on January 31, 2019, http://terrificdata.com/2016/12/02/3221/.</p>
<p>12. Clause 3(20) — “harm” includes (i) bodily or mental injury; (ii) loss, distortion or theft of identity; (ii) financial loss or loss of property; (iv) loss of reputation or humiliation; (v) loss of employment; (vi) any discriminatory treatment; (vii) any subjection to blackmail or extortion; (viii) any denial or withdrawal of service,benefit or good resulting from an evaluative decision about the data principal; (ix) any restriction placed or suffered directly or indirectly on speech, movement or any other action arising out of a fear of being observed or surveilled; or (x) any observation or surveillance that is not reasonably expected by the data principal.</p>
<p>13. Alex Hern “Anonymised data can never be totally anonymous, says study”, July 23, 2019 https://www.theguardian.com/technology/2019/jul/23/anonymised-data-never-be-anonymous-enough-study-finds.</p>
<p>14. Clause 97 of the 2018 Bill states: “(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2) The notified date shall be any date within twelve months from the date of enactment of this Act. (3) The following provisions shall come into force on the notified date: (a) Chapter X; (b) Section 107; and (c) Section 108. (4) The Central Government shall, no later than three months from the notified date, establish the Authority. (5) The Authority shall, no later than twelve months from the notified date, notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall, no later than twelve months from the notified date, issue codes of practice on the following matters: (a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45; (h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7) Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section. (8) The remaining provisions of the Act shall come into force eighteen months from the notified date.”</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019'>http://editors.cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019</a>
</p>
No publisherAmber Sinha, Elonnai Hickok, Pallavi Bedi, Shweta Mohandas, Tanaya RajwadeInternet GovernanceData ProtectionPrivacy2020-02-21T10:13:35ZBlog EntryCall for Researchers: Welfare, Gender, and Surveillance
http://editors.cis-india.org/jobs/researchers-welfare-gender-surveillance-call
<b>We are inviting applications for two researchers. Each researcher is expected to write a narrative essay that interrogates the modes of surveillance that people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations are put under as they seek sexual and reproductive health (SRH) services in India. The researchers are expected to undertake field research in the location they are based in, and reflect on lived experiences gathered through field research as well as their own experiences of doing field research. Please read the sections below for more details about the work involved, the timeline for the same, and the application process for this call.</b>
<p> </p>
<h4>Call for Researchers: <a href="https://github.com/cis-india/website/raw/master/docs/CIS_Researchers_WelfareGenderSurveillance_Call_20200110.pdf" target="_blank">Download</a> (PDF)</h4>
<hr />
<h3><strong>Description of the Work</strong></h3>
<p>Each researcher is expected to author a narrative essay that presents and reflects on lived experiences of people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations as they seek sexual and reproductive health (SRH) services in India. We expect the essay to contribute to a larger body of knowledge around the increasing focus on data-driven initiatives for public health provision in the country and elsewhere. Accordingly, the researcher may respond to any one or more than one of the following questions, within the context of the geographical focus as specified by the researcher:</p>
<ul>
<li>What are the modes of surveillance, especially in terms of generation and exploitation of digital data, experienced by people of marginalised gender identities and sexual orientations in India, as they avail of sexual and reproductive healthcare?</li>
<li>How are the lived experiences of underserved populations, such as people of marginalised gender identities and sexual orientations, shaped by gendered surveillance while accessing sexual and reproductive services?</li>
<li>What are the modes of governance and gender ideologies that have mediated the increasing datafication of such provision?</li></ul>
<p>We expect the researchers to a) draw on the Indian Supreme Court’s framing of privacy in India as a fundamental right, and its implications; and b) apply and/or build on feminist conceptualisations of privacy. Further, we expect the researchers to respond to the uncertain landscape of legal rights accessible to people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations, especially in the current context shaped by The Transgender Persons (Protection of Rights) Act, 2019.</p>
<p>The researchers will undertake field research in locations of their choice, conduct interviews and discussions with people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations seeking such services, and conduct formal and informal interviews with officials and personnel associated with public and private sector agencies involved in the provision of SRH services.</p>
<h3><strong>Eligibility and Application Process</strong></h3>
<h4>We specifically encourage people of LGBTHIAQ+ and gender non-conforming identities and sexual orientations to submit their applications for this call for researchers.</h4>
<p>We are seeking applications from individuals who:</p>
<ul>
<li>Are based in the place where field study is to be undertaken, for the duration of the study;</li>
<li>Are fluent in the main regional language(s) spoken in the city where the study will be conducted, and in English (especially written);</li>
<li>Preferably have a postgraduate degree (current students should also apply) in social or technical sciences, journalism, or legal studies (undergraduate degree-holders with research or work experience should also apply); and</li>
<li>Have previous research and writing experiences on issues at the intersection of sexual and reproductive health, gender justice and women’s rights, and health informatics or digital public health.</li></ul>
<p>Please send the following documents (in text or PDF formats) to <strong>raw@cis-india.org by Friday, January 24</strong> to apply for the researcher positions:</p>
<ul>
<li>Brief CV with relevant academic and professional information;</li>
<li>Two samples of academic/professional (published/unpublished) writing by the applicant; and</li>
<li>A brief research proposal (around 500 words) that should specify the scope (geographical and conceptual), research questions, and motivation of the essay to be authored by the applicant.</li></ul>
<p>All applicants will be informed of the selection decisions by Friday, January 31.</p>
<h3><strong>Timeline of the Work</strong></h3>
<p><strong>February 3-7</strong> CIS research team will have a call with each researcher to plan out the work to be undertaken by them</p>
<p><strong>February - March</strong> Researchers are to undertake field research, as proposed by the researchers and discussed with the CIS research team</p>
<p><strong>March 27</strong> Researchers are to submit a full draft essay (around 3,000 words)</p>
<p><strong>March 30 - April 3</strong> CIS research team will have call with each researcher to discuss the shared draft essays and make plans towards their finalisation</p>
<p><strong>May 15</strong> Researchers are to submit the final essay (around 5,000 words, without footnotes and references)</p>
<p>As part of this project, CIS will organise two discussion events in Bengaluru and New Delhi during April-June (tentatively). Event dates are to be decided in conversation with the researchers, and they will be invited to present their works in the same.</p>
<h3><strong>Remuneration</strong></h3>
<p>Each researcher will be paid a remuneration of Rs. 1,00,000 (inclusive of taxes) in two equal installments: the first on signing of the agreement in February 2020, and the second on submission of the final essay in May 2020.</p>
<p>We will also reimburse local travel expenses of each researcher up to Rs. 10,000, and translation and transcription expenses (if any) incurred by each researcher up to Rs. 10,000. These reimbursements will be made on the basis of expense invoices shared by the researcher.</p>
<h3><strong>Description of the Project</strong></h3>
<p>Previous research conducted by CIS on the subject of sexual and reproductive health (SRH) services in India observes that there is a complex web of surveillance, or ‘dataveillance’, around each patient as they avail of SRH services from the state. In this current project, we are aiming to map the ecosystem of surveillance around SRH services as their provision becomes increasingly ‘data-driven’, and explore its implications for patients and beneficiaries.</p>
<p>Through this project, we are interested in documenting the roles played by both the public and the private sector actors in this ecosystem of health surveillance. We understand the role of private sector actors as central to state provision of sexual and reproductive health services, especially through the institutionalisation of data-driven health insurance models, as well as through extensive privatisation of public health services. By studying semi-private, private, and public medical establishments including hospitals, primary/community health centres and clinics, we aim to develop a comparative analysis of surveillance ecosystems across the three establishment types.</p>
<p>This project is led by Ambika Tandon, Aayush Rathi, and Sumandro Chattapadhyay at the Centre for Internet and Society, and is supported by a grant from Privacy International.</p>
<h3><strong>Indicative Reading List</strong></h3>
<p><em>We are sharing below a short and indicative list of readings that may be useful for potential applicants</em>.</p>
<p>Aayush Rathi, <a href="https://www.epw.in/engage/article/indias-digital-health-paradigm-foolproof" target="_blank">Is India's Digital Health System Foolproof?</a> (2019)</p>
<p>Aayush Rathi and Ambika Tandon, <a href="https://www.epw.in/engage/article/data-infrastructures-inequities-why-does-reproductive-health-surveillance-india-need-urgent-attention" target="_blank">Data Infrastructures and Inequities: Why Does Reproductive Health Surveillance in India Need Our Urgent Attention?</a> (2019)</p>
<p>Ambika Tandon, <a href="https://cis-india.org/internet-governance/blog/ambika-tandon-december-23-2018-feminist-methodology-in-technology-research" target="_blank">Feminist Methodology in Technology Research: A Literature Review</a> (2018)</p>
<p>Ambika Tandon, <a href="https://cis-india.org/raw/big-data-reproductive-health-india-mcts" target="_blank">Big Data and Reproductive Health in India: A Case Study of the Mother and Child Tracking System</a> (2019)</p>
<p>Anja Kovacs, <a href="https://genderingsurveillance.internetdemocracy.in/theory/" target="_blank">Reading Surveillance through a Gendered Lens: Some Theory</a> (2017)</p>
<p>Lindsay Weinberg, <a href="https://www.westminsterpapers.org/articles/10.16997/wpcc.258/" target="_blank">Rethinking Privacy: A Feminist Approach to Privacy Rights after Snowden</a> (2017)</p>
<p>Nicole Shephard, <a href="https://www.apc.org/en/pubs/big-data-and-sexual-surveillance" target="_blank">Big Data and Sexual Surveillance</a> (2016)</p>
<p>Sadaf Khan, <a href="https://deepdives.in/data-bleeding-everywhere-a-story-of-period-trackers-8766dc6a1e00" target="_blank">Data Bleeding Everywhere: A Story of Period Trackers</a> (2019)</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/jobs/researchers-welfare-gender-surveillance-call'>http://editors.cis-india.org/jobs/researchers-welfare-gender-surveillance-call</a>
</p>
No publisherambikaWelfare GovernancePrivacyGenderGender, Welfare, and PrivacyResearchers at Work2020-02-13T15:05:37ZBlog EntryIETF106
http://editors.cis-india.org/internet-governance/news/ietf106
<b>Gurshabad Grover participated at IETF106, which was held in Singapore 16-22 November, 2019.</b>
<p class="moz-quote-pre">In the meeting of the Human Rights Protocol Considerations (hrpc) research group, I presented an update to draft-irtf-hrpc-guidelines-03 (Guidelines for Human Rights Protocol and Architecture Considerations), which is an Internet Draft adopted by the hrpc rg that he is co-editing with Niels ten Oever. <a class="external-link" href="https://datatracker.ietf.org/doc/draft-irtf-hrpc-guidelines/">More info here</a>.</p>
<p class="moz-quote-pre" style="text-align: justify; ">Among other working/research group meetings, I participated theTransport Layer Security (tls) and the Privacy Enhancements and Assessments research group (pearg) sessions. I also participated inseveral side meetings, including the Public Interest Technology Group(pitg) meeting.</p>
<p class="moz-quote-pre" style="text-align: justify; ">Agenda for the IETF and the different WGs/RG can be found on the <a class="external-link" href="https://datatracker.ietf.org/meeting/106/agenda">IETF website</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/ietf106'>http://editors.cis-india.org/internet-governance/news/ietf106</a>
</p>
No publisherAdminInternet GovernancePrivacy2019-12-15T06:14:02ZNews ItemPower over privacy: New Personal Data Protection Bill fails to really protect the citizen’s right to privacy
http://editors.cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy
<b>Nikhil Pahwa throws light on the new personal data protection bill.</b>
<p style="text-align: justify; ">The article by Nikhil Pahwa was <a class="external-link" href="https://timesofindia.indiatimes.com/blogs/toi-edit-page/power-over-privacy-new-personal-data-protection-bill-fails-to-really-protect-the-citizens-right-to-privacy/">published in the Times of India</a> on December 12, 2019. CIS report was mentioned.</p>
<hr />
<p style="text-align: justify; ">Earlier this year, in April, <a href="https://blog.trendmicro.com/trendlabs-security-intelligence/55m-registered-voters-risk-philippine-commission-elections-hacked/" rel="noopener noreferrer" target="_blank">a data breach</a> in the Election Commission of Philippines led to the leakage of personal information of over 55 million eligible voters on a searchable website: including names, addresses and date of birth. This was not the first data breach from the Election Commission. After the first, which took place in March 2016, where 340 GB of voter data was <a href="http://www.rappler.com/newsbreak/in-depth/127870-comelec-leak-identity-theft-scams-experts" rel="noopener noreferrer" target="_blank">published online by a group of hackers called LulzSec Pilipinas</a>, the National Privacy Commission of Philippines found that the Election Commission had violated the Data Privacy Act of 2012, and <a href="https://www.privacy.gov.ph/2017/01/privacy-commission-finds-bautista-criminally-liable-for-comeleak-data-breach/" rel="noopener noreferrer" target="_blank">recommended criminal prosecution of its chairman</a>, finding him liable when the agency failed to dispense its duty as a “personal information controller”.</p>
<p style="text-align: justify; ">It’s 2019, and that recommendation has still not been acted upon, because the National Privacy Commission of Philippines only has recommendatory powers for criminal prosecution. Meanwhile, data breaches continue at the Election Commission of Philippines.</p>
<p style="text-align: justify; ">Between 2017 and 2018, Aadhaar related personally identifiable data of several Indian citizens, including names, addresses, bank account numbers, in some cases pregnancy information and even religion and caste information of individuals, was published online by Indian government departments. The Centre for Internet and Society, in a report, estimated that <a href="https://www.medianama.com/2017/05/223-aadhaar-numbers-data-leak/" rel="noopener noreferrer" target="_blank">personally identifiable data for 130-135 million Indian citizens had been leaked</a>, thus putting them at risk. 210 government websites had made Aadhaar related data public, <a href="https://www.thehindu.com/news/national/210-govt-websites-made-aadhaar-details-public-uidai/article20555266.ece" rel="noopener noreferrer" target="_blank">UIDAI confirmed in response to an RTI in 2017</a>.</p>
<p style="text-align: justify; ">No one was held liable. There was no data protection law, no data protection authority, no criminal prosecution was recommended. Around that time, the Indian government was instead arguing in the Supreme Court that privacy isn’t a fundamental right under the Indian Constitution.</p>
<p style="text-align: justify; ">What we can learn from these two instances is that for the enforcement of a citizen’s right to privacy, and ensuring that no one takes the protection of data lightly, there needs to be a strong privacy law that holds even the government responsible, and above all, a strong data protection authority that is independent and has powers to penalise even government officials. On some of these counts, the Personal Data Protection Bill, 2019, disappoints.</p>
<p style="text-align: justify; ">First, members of the Data Protection Authority will no longer be appointed by independent entities from diverse backgrounds: where they were previously going to be appointed by a committee comprising the Chief Justice of India or a Supreme Court judge, the Cabinet secretary, and an independent expert, the power to appoint members to DPA now rests solely with government officials, including the appointment of adjudicating officers. In addition, the central government, in the interest of “national security, sovereignty, international relations and public order, can issue directions to DPA, which DPA will be bound by. Powers of DPA have also been reduced: while in the previous version of the bill, DPA had the sole power to categorise data as sensitive personal data, in the current version, the power rests with the central government, albeit in consultation with DPA. The central government will also notify any social media company as a significant data fiduciary, and not DPA. Only the central government can determine what critical personal data is, and not DPA.</p>
<p style="text-align: justify; ">This dependence on the government for appointments, functions and definitions, will invariably impact the independence of DPA, and even though the 2019 version of the bill gives it the authority to fine the state a maximum of Rs 5-15 crore, depending on the offence, i’d be surprised if this ever happens.</p>
<p style="text-align: justify; ">The bill does create significant exceptions for the state to acquire and process data, and an opportunity to create a base for surveillance reform in the country has been lost. The previous version of the bill had brought some sense of safety against mass surveillance, when it included the condition that processing of data by the government must be “necessary and proportionate”, drawing from Supreme Court’s historic right to privacy judgment. This is particularly important given that the bill also gives power to the government to exempt any agency from the provisions of the bill for processing of personal data, which includes acquiring data from any public or private entity.</p>
<p style="text-align: justify; ">Effectively, this means that government agencies may be exempt from any scrutiny by DPA, and can even collect data from third parties (for example, fin-tech companies, health-tech startups) without the user even knowing. Forget recommending criminal prosecution for mass surveillance, India’s DPA won’t even be able to fine a government agency for such a violation of the fundamental right to privacy. The government also has vast exceptions for data processing: “for the performance of any function of the state authorised by law”.</p>
<p style="text-align: justify; ">This aside, one of the more curious clauses in the bill is around non-personal data. The government, a few months ago, constituted a committee led by Infosys co-founder Kris Gopalakrishnan to look into the governance of non-personal data. Non-personal data, as the term suggests, is any data that is not related to an individual. In the bill, the government has given itself the right to acquire this data, which is essentially a company’s intellectual property, to “promote framing of policies for digital economy”. Why non-personal data finds a mention in a Personal Data Protection Bill is beyond comprehension, and this move will not inspire much confidence in businesses operating in India, when the state claims eminent domain over intellectual property.</p>
<p style="text-align: justify; ">It’s unfortunate minister Ravi Shankar Prasad is sending the bill to a select committee, given the fact that such significant changes to the bill should have led to another public consultation.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy'>http://editors.cis-india.org/internet-governance/news/the-times-of-india-december-12-2019-power-over-privacy</a>
</p>
No publisherNikhil PahwaInternet GovernancePrivacy2019-12-15T05:57:31ZNews ItemOutrage As Privileged IITians Use Tech To Spy On Sweepers
http://editors.cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers
<b>Some members of the housekeeping staff at IIT Ropar were put under round-the-clock surveillance during working hours for many days in February this year without their consent. IIT Ropar Director Prof S K Das has ordered a probe into the incident.</b>
<p style="text-align: justify; ">The article by Rachna Khaira was <a class="external-link" href="https://www.huffingtonpost.in/entry/outrage-as-privileged-iitians-use-tech-to-spy-on-sweepers_in_5df1bbc8e4b06a50a2e9e659">published in Huffington Post</a> on December 31, 2019. Aayush Rathi was quoted.</p>
<hr />
<p style="text-align: justify; ">The Indian Institute of Technology (IIT), Ropar is conducting a probe into the reported tagging and round the clock electronic surveillance of some housekeeping staff members as part of an experiment run by the Technology Business Incubation Foundation (TBIF) located at the IIT campus in February this year.</p>
<p style="text-align: justify; "><em>HuffPost India </em>has learnt that the TBIF, a tech incubator run within IIT Ropar, signed off on the “Sweepy” project in which housekeeping staff were given wristbands and brooms secretly embedded with tracking chips, without seeking the consent of the janitorial staff, or informing IIT Ropar management.</p>
<p style="text-align: justify; ">While the housekeeping staff were told the wristbands would record their pulse and heart beat, and that they should wear it while cleaning the campus, the tracking chips were used to track to assess if they were sweeping out hard-to-reach corners of the institute.</p>
<p style="text-align: justify; ">Prof. Sarit Kumar Das, Director IIT Ropar told HuffPost India that a three member committee comprising of Prof. Bijoy H Barua, Prof. Javed Agrewala and Prof. Deepak Kashyap has been set up to look into the matter.</p>
<p style="text-align: justify; ">“We at the IIT Ropar respect privacy and condemn any such violation made by any of our student or staff member,” said Prof. Das. “Before conducting any experiment on human beings, an approval has to be sought from the human ethics team constituted in our institution and they present a case to me after seeking a written consent from the people who would undergo the experiment. Only, after getting my approval, such an experiment can be conducted at the campus.”</p>
<h3 style="text-align: justify; ">Sweeping surveillance</h3>
<p style="text-align: justify; ">J K Sharma, the Chief Operating Officer of TBIF, told <em>HuffPost India</em> that his tech incubator deliberately misled the housekeeping staff about the true purpose of the wristband as they felt the housekeeping staff wouldn’t agree to wear such a device.</p>
<p style="text-align: justify; ">While elaborating more on the ‘Sweepy’ project, Sharma said that the project was based on an idea that came to the hostellers who were upset over the housekeeping staff for not cleaning their rooms.</p>
<p style="text-align: justify; ">“The sweepers were not working properly and despite reporting the matter several times to the authorities, they were not taking any cognisance. Perturbed, the students developed this programme in which the location of the sweeper can be recorded and monitored in a control room by a gadget tied to the sweeper’s wrist,” said Sharma.</p>
<p style="text-align: justify; ">He further added that a beacon records the activity of the sensor pasted to the broom or mop held by the sweeper and can monitor the area and the time in which it was used. The report was produced digitally on the screen.</p>
<p style="text-align: justify; ">Was a consent sought from the sweepers before tagging them?</p>
<p style="text-align: justify; ">“The testing was done in a secret manner as the housekeeping staff may not have given their consent for the trial. We tried it on three sweepers and while two of them were found working dedicatedly, one was found to have missed cleaning from few areas assigned to him,” said Sharma.</p>
<p style="text-align: justify; ">The findings were shared with the housekeeping supervisor who later directed his staff to do their duty more diligently.</p>
<p style="text-align: justify; ">The team working on the project however told <em>HuffPost India</em> that they secured the privacy of the housekeeping staff by removing the microphone from the gadgets tied to their wrists.</p>
<p style="text-align: justify; ">This technology does not have video feature and only monitors location of a moving object and is quite cheap as compared to the radio-frequency identification (RFID) technology that uses electromagnetic fields to automatically identify and track tags attached to objects.</p>
<blockquote class="pull-quote content-list-component" style="text-align: justify; ">“The testing was done in a secret manner as the housekeeping staff may not have given their consent for the trial. We tried it on three sweepers and while two of them were found working dedicatedly, one was found to have missed cleaning a few areas assigned to him.” (J K Sharma, Chief Operating Officer, Technology Business Incubation Foundation, IIT Ropar)</blockquote>
<p style="text-align: justify; ">Calling this an increasingly commonplace trend of covert spying on domestic workers without their knowledge, Ayush Rathi, Programme Officer, Centre for Internet and Society, said that the housekeeping staff was made to wear the gadget under a false pretense is telling.</p>
<p style="text-align: justify; ">“This is a classic example of how the access to privacy is stratified along the axes of class, caste and gender. And ties in closely with a key purpose of surveillance — that of exerting control over people’s bodies to conform to the surveiller’s ideas of right and wrong,” said Rathi.</p>
<p style="text-align: justify; ">He further added that in many ways, this story captures the zeitgeist of the 21st century. The is the essence of so much of what qualifies as innovation today is that they seek to find technological solutions to problems that are structural in nature.</p>
<p style="text-align: justify; ">“So, in this instance it is very evident that the objective sought to be achieved was not to merely ‘fix’ the problem of the housekeeping staff performing its duties well, but to solely hold them guilty for failing to do so,” said Rathi.</p>
<p style="text-align: justify; ">An alternate, albeit more tedious, approach would have been to speak with the workers and iron out the struggles they were facing at the workplace that were preventing them from performing their job well. Any solution could only have been prepared thereafter — he added.</p>
<p style="text-align: justify; ">As per Prof. Das, a major problem with the engineering students is that unlike medical students, 90 percent of their experiments are based on machines and not human beings.</p>
<p style="text-align: justify; ">“There is too much deficiency of the understanding of human psychology amongst engineering students. To curb this, we at the IIT have started a mandatory course on human ethics which is being taught by some of the renowned human psychology experts. Still sometimes, the violations gets reported,” said Prof. Das.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers'>http://editors.cis-india.org/internet-governance/news/huffignton-post-december-13-2019-rachna-khaira-outrage-as-privileged-iit-ians-use-tech-to-spy-on-sweepers</a>
</p>
No publisherRachna KhairaInternet GovernancePrivacy2019-12-15T05:33:21ZNews ItemWhatsApp spy attack and after
http://editors.cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after
<b>Bengaluru experts analyse the Pegasus snooping scandal, and provide advice on what you can do about the gaping holes in your mobile phone security.</b>
<p>The article by Theres Sudeep was published in <a class="external-link" href="https://www.deccanherald.com/metrolife/metrolife-your-bond-with-bengaluru/whatsapp-spy-attack-and-after-773955.html">Deccan Herald</a> on November 6, 2019. Aayush Rathi was quoted.</p>
<hr />
<p>Last week ended with a sensational piece of news: WhatsApp said spyware Pegasus was being used to hack into the phones of activists and journalists in India.</p>
<p style="text-align: justify; ">The software is the brainchild of the NSO Group, an Israeli company. WhatsApp has detected 1,400 instances of Pegasus being used in the latest wave of attacks between April 29 and May 10. WhatsApp has identified 100-plus cases targeting human rights defenders and journalists. About two dozen of these attacks were in India.</p>
<p style="text-align: justify; ">Among those whose security was reportedly compromised is Congress leader Priyanka Gandhi.The first question is who ordered this snooping. NSO claims they sell their technology only to government agencies for lawful investigation into crime and terrorism. Speculation is rife that there is government involvement in the snooping.</p>
<p style="text-align: justify; ">Vinay Srinivas, lawyer with Alternative Law Forum, Bengaluru, says,“The targets of the attack seem to be those who had critical things to say about the current government.”Referring to a tweet by journalist Arvind Gunasekar, Srinivas says there is clear proof that the government knew of the breach and its severity.The tweet includes a screenshot of a report from the CERT-IN (Indian Computer Emergency Response Team) website dated May 17.</p>
<p style="text-align: justify; ">It shows severity rating as “High”.WhatsApp says the vulnerability has now been patched and urged users to update the app. But a level of paranoia around smartphones and privacy has been created. Apar Gupta, executive director of the Internet Freedom Foundation, based in Delhi works towards internet freedom and privacy, says Pegasus,specially, is too expensive (it can cost up to eight million dollars a year to licence) to be used on ordinary citizens.</p>
<p style="text-align: justify; ">But not all spyware is expensive. “Multiple kinds are now commercially available and easy to procure. These can be used by an estranged lover or even a professional rival to find information about you,” he says. Jija Hari Singh, retired DGP and Karnataka’s first woman IPS officer, says Pegasus is one of the smaller players, and spyware akin to it has been around for three decades. “Monsters bigger than Pegasus are still snooping on us,” she says.</p>
<h3 style="text-align: justify; ">NOTHING TO HIDE?</h3>
<p style="text-align: justify; ">Many people fall back on the narrative of ‘I have nothing to hide, so I’m not worried’.Aayush Rathi, Programme Officer at the Centre for Internet and Society, says that this is a flawed premise: “It is like saying free speech is not important for you because you have nothing useful to say.”Gupta breaks down this rationale: “If a person has ‘nothing to hide’ then they should just unlock their phone and hand it over to any person who asks for it. But the minute such a demand is made they would feel uncomfortable.”This discomfort, he says, doesn’t come because they are doing something illegal but because they fear social judgement.“There is a level of intimacy in their conversations that they’d rather not share with anyone else,” he says.Many people believe only illegal activity leads to surveillance, but that is not the case.“Even the most inconsequential actions are being logged on digital devices, and much of this information can be monetised,” he says.The most tangible risks are financial fraud and identity theft, and spyware is also commonly used for corporate espionage.</p>
<h3 style="text-align: justify; ">UPDATE SECURITY</h3>
<p style="text-align: justify; ">So what must one do if one’s phone is spied on? In the case of Pegasus, Rathi says, “You would have received a communication from WhatsApp if you were targeted. Irrespective, you should update the application immediately as the latest update fixes the vulnerability.”Srinivas says legally the recourse available is the fundamental right to privacy. “Since the government doesn’t have any regulation in place to deal with this, the National Human Rights Commission will have to take it up,” he says.</p>
<p style="text-align: justify; ">Gupta advises precautions against preventable hacks. He advises a reading of online guides on surveillance self-defence, especially those by Electronic Frontier Foundation.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after'>http://editors.cis-india.org/internet-governance/news/deccan-herald-november-6-2019-theres-sudeep-whatsapp-spy-attack-and-after</a>
</p>
No publisherTheres SudeepInternet GovernancePrivacy2019-12-15T05:06:27ZNews ItemMaking Voices Heard: Privacy, Inclusivity, and Accessibility of Voice Interfaces in India
http://editors.cis-india.org/raw/making-voices-heard-project-announcement
<b>We believe that voice interfaces have the potential to democratise the use of the internet by addressing barriers such as accessibility concerns, the lack of reading and writing abilities on digital text interfaces, and the lack of options for people to interact with digital devices in their own languages. Through the Making Voices Heard project, supported by Mozilla Corporation, we will examine the current landscape of voice interfaces in India.</b>
<p> </p>
<img src="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_01.jpg" alt="null" width="30%" /> <img src="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_02.jpg" alt="null" width="30%" /> <img src="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_03.jpg" alt="null" width="30%" />
<p> </p>
<h4>Download the project announcement cards (shown above): <a href="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_01.jpg" target="_blank">Card 01</a>, <a href="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_02.jpg" target="_blank">Card 02</a>, and <a href="https://raw.githubusercontent.com/cis-india/website/master/img/CIS_Mozilla_MakingVoicesHeard_ProjectAnnouncement_03.jpg" target="_blank">Card 03</a></h4>
<hr />
<h3>Making Voices Heard: Project Announcement</h3>
<p>Although voice-enabled interfaces are being deployed, there is a need to understand how they are beneficial, and what the important knowledge gaps and challenges have been in their development, adoption, use, and regulation. Through the Making Voices Heard project, <a href="https://blog.mozilla.org/blog/2019/07/05/mozillas-latest-research-grants-prioritizing-research-for-the-internet/" target="_blank">supported by Mozilla Corporation</a>, we will be examining the current landscape of voice interfaces in India, and seek to address the following questions:</p>
<ul>
<li>What is the broad (sectoral and functional) typology of available voice interfaces in Indian languages? How widely are these voice interfaces (in Indian languages) used, and what barriers prevent their further adoption and use?<br /><br /></li>
<li>What are concerns related to privacy and data protection that emerge with the growth of voice interfaces? What kind of protocols for data processing may need to be built into the design of these interfaces?<br /><br /></li>
<li>How accessible are these interfaces for persons with disabilities (PWDs)? What kinds of accessibility features, especially for Indian languages, may need to be developed to ensure effective use of voice technologies by PWDs?<br /><br /></li>
<li>Where do challenges in these three areas intersect? For instance, is compromising on users’ privacy, including weak or missing data protection regulations, required to create comprehensive speech datasets that may help develop better accessibility features, and address linguistic barriers?</li></ul>
<p>To approach these questions, we have begun mapping the various developers and users of voice interfaces in India. In the next stage of the process, we will be looking at these interfaces through the lenses of privacy, language, accessibility, and design. To add to the mapping and questions, we will be conducting interviews and workshops with users, developers, designers, and researchers of voice interfaces in India, including the <a href="https://voice.mozilla.org/en" target="_blank">Common Voice</a> team at Mozilla.</p>
<p>We hereby invite researchers, developers and designers of voice interfaces to speak to us and help inform the study. You may contact Shweta Mohandas at shweta@cis-india.org.</p>
<p><em>- Shweta Mohandas, Saumyaa Naidu, Puthiya Purayil Sneha, and Sumandro Chattapadhyay (project team)</em></p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/making-voices-heard-project-announcement'>http://editors.cis-india.org/raw/making-voices-heard-project-announcement</a>
</p>
No publishershwetaVoice User InterfaceLanguagePrivacyAccessibilityResearchVoice Assisted InterfaceFeaturedResearchers at WorkMaking Voices Heard2019-12-18T12:10:05ZBlog EntryCyber law experts asks why CERT-In removed advisory warning about WhatsApp vulnerability
http://editors.cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability
<b>On the now-missing web page, CERT-In had provided a detailed explanation of the vulnerability, which could be exploited by an attacker by making a decoy voice call to a target.</b>
<p style="text-align: justify; ">The article by Megha Mandavia was <a class="external-link" href="https://tech.economictimes.indiatimes.com/news/internet/cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability/71881880">published in ET Tech.com</a> on November 4, 2019. Pranesh Prakash was quoted.</p>
<hr />
<p style="text-align: justify; ">Cyber law experts have asked the <a href="https://tech.economictimes.indiatimes.com/tag/government">government</a> to explain why the Indian computer emergency response team (<a href="https://tech.economictimes.indiatimes.com/tag/cert-in">CERT-In</a>) removed from its website two days ago an advisory it had put out in May warning users of a vulnerability that could be used to exploit <a href="https://tech.economictimes.indiatimes.com/tag/whatsapp">WhatsApp</a> on their smartphones.<br /><br />“This is merely further evidence that the explanation is to be provided by GoI (Government of India) instead of blame shifting and politicizing the issue,” said Mishi Choudhary, the legal director of the New York-based Software Freedom Law Center. “India is a surveillance state with no judicial oversight.”<br /><br />On the missing web page note, CERT-In had provided a detailed explanation of the vulnerability, which could be exploited by an attacker by making a decoy voice call to a target.<br /><br />It had warned WhatsApp users that the vulnerability could allow an attacker to access information on the system, such as logs, messages and photos, and could further compromise it. 
CERT-In rated the severity “high” and asked users to upgrade to the latest version of the app.<br /><br />It also listed links to hackernews and cyber security firm Check Point Software that pointed to the alleged involvement of Israeli cyber software firm NSO Group in the hacking of WhatsApp messenger.<br /><br />CERT-In Director-General Sanjay Bahl did not respond to ET’s mails or calls seeking clarity on why the advisory was pulled from its website.<br /><br />The Times of India reported first the development.<br /><br />The government had blamed WhatsApp for not informing it about the attack and asked the Facebook-owned company to respond by November 4.<br /><br />In response, WhatsApp sources pointed out that it had informed CERT-in in May about the vulnerability and updated in September that 121 Indian nationals were targeted using the exploit, ET reported on Sunday.<br /><br />“We should not read too much into it. It could just be bad website management. The vulnerability was public knowledge. It was reported by the Common Vulnerabilities and Exposures (CVE) organization in May,” said Pranesh Prakash, fellow at the Centre of <a href="https://tech.economictimes.indiatimes.com/news/internet">Internet</a> and Society, a non-profit organisation.<br /><br />The government has also questioned the timing of the disclosure, as it comes amid a request by it to the Supreme Court seeking three months to frame rules to curb misuse of social media in the country.<br /><br />The government has categorically told WhatsApp that it wants the platform to bring in a mechanism that would enable tracing of the origin of messages, a demand that the instant messaging platform has resisted citing privacy concerns.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability'>http://editors.cis-india.org/internet-governance/news/et-tech-megha-mandavia-november-4-2019-cyber-law-experts-asks-why-cert-in-removed-advisory-warning-about-whatsapp-vulnerability</a>
</p>
No publisherMegha MandaviaInternet GovernancePrivacy2019-11-15T00:48:00ZNews Item