
Clause 12 Of The Data Protection Bill And Digital Healthcare: A Case Study

Posted by Amber Sinha at Mar 01, 2022 03:07 PM
In light of the state’s emerging digital healthcare apparatus, how does Clause 12 alter the consent and purpose limitation model?

This blog post was originally published in Medianama on February 21, 2022. It is the second in a two-part series by Amber Sinha.


In the previous post, I looked at provisions on non-consensual data processing for state functions under the most recent version of recommendations by the Joint Parliamentary Committee on India’s Data Protection Bill (DPB). The true impact of these provisions can only be appreciated in light of ongoing policy developments and real-life implications.

To appreciate the significance of the dilutions in Clause 12, let us consider the Indian state’s range of schemes promoting digital healthcare. In July 2018, NITI Aayog, a central government policy think tank in India, released a strategy and approach paper (Strategy Paper) on the formulation of the National Health Stack, which envisions the creation of a federated, application programming interface (API)-enabled health information ecosystem. While the Ministry of Health and Family Welfare has focused on the creation of Electronic Health Records (EHR) Standards for India over the last few years and has also identified a contractor for the creation of a centralised Integrated Health Information Platform (IHIP), this Strategy Paper advocates a completely different approach, described as a Personal Health Records (PHR) framework. In 2021, the National Digital Health Mission (NDHM) was launched, under which a citizen has the option to obtain a digital health ID, a unique identifier that will carry all of a person’s health records.

A Stack Model for Big Data Ecosystem in Healthcare

A stack model, as envisaged in the Strategy Paper, consists of several layers of open APIs connected to each other, often tied together by a unique health identifier. The openness of the APIs allows public and private actors to build solutions on top of them that are interoperable with all parts of the stack. It is, however, worth considering both this ‘openness’ and the role that the state plays in it.
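To make the idea concrete, the sketch below illustrates, in TypeScript, what one layer of such a stack might look like: a personal health record keyed to a unique health identifier and exposed through an open API that any registered public or private actor could build against. The type names, fields, and endpoint are purely illustrative assumptions, not drawn from the Strategy Paper or the NDHM specifications.

```typescript
// Illustrative sketch only: these types and the endpoint are hypothetical,
// not taken from the National Health Stack or NDHM specifications.

// A unique health identifier ties records together across every layer of the stack.
type HealthId = string;

// A personal health record, as one layer of the stack might represent it.
interface PersonalHealthRecord {
  healthId: HealthId;          // the identifier linking this record to all others
  facilityId: string;          // the hospital, lab, or clinic that generated it
  recordType: "prescription" | "diagnostic-report" | "discharge-summary";
  createdAt: string;           // ISO 8601 timestamp
  payload: unknown;            // the clinical document itself
}

// Because the API is open, any registered actor (public or private) can call it.
// Absent strict access control, siloing, and encryption, a caller can traverse
// the stack using nothing more than the health identifier as the join key.
async function fetchRecords(healthId: HealthId): Promise<PersonalHealthRecord[]> {
  const res = await fetch(`https://api.example-health-stack.in/v1/records/${healthId}`);
  if (!res.ok) throw new Error(`Lookup failed: ${res.status}`);
  return res.json();
}
```

The point of the sketch is simply that a single identifier acts as the common key across every layer; whoever can call the open APIs can, without adequate safeguards, assemble a longitudinal profile of an individual.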

Even though the APIs are themselves open, they are part of a pre-decided technological paradigm, built by private actors and blessed by the state. Innovators can build on it, but the options available to them are limited by the information architecture created by the stack model. When such a technological paradigm is created for healthcare reform and health data, it poses additional challenges. By tying the stack to a unique identity without appropriate processes in place for access control, siloing of information, and encrypted communication, the model raises tremendous privacy and security concerns. The broad language of Clause 12 of the DPB needs to be read in this context.

Clause 12 allows non-consensual processing of personal data where it is necessary “for the performance of any function of the state authorised by law” in order to provide a service or benefit from the State. In the previous post, I highlighted the significance of relying only on ‘necessity’ to the exclusion of ‘proportionality’. We now need to consider that choice in light of the emerging digital healthcare apparatus being created by the state.

The National Health Stack and the National Digital Health Mission together envision an intricate system of data collection and exchange which, in a regulatory vacuum, would ensure unfettered access to sensitive healthcare data for both the state and private actors registered with the platforms. The Stack framework relies on repositories from which data may be accessed at multiple nodes within the system. Importantly, the Strategy Paper also envisions health data fiduciaries that facilitate consent-driven interaction between the entities that generate health data and the entities that want to consume health records in order to deliver services to the individual. The cast of characters includes the National Health Authority; healthcare providers and insurers who access the National Health Electronic Registries, which unify data from different programmes such as the National Health Resource Repository (NHRR), the NIN database, NIC, and the Registry of Hospitals in Network of Insurance (ROHINI); and private actors such as Swasth and iSpirt, who assist the Mission as volunteers. The currency that government and private actors are interested in is data.
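The consent-driven exchange the Strategy Paper envisions can be pictured as a consent artefact that a health data fiduciary holds between the entity generating the data and the entity that wants to consume it. The sketch below is a hypothetical rendering of such an artefact; the field names, purpose codes, and the check it performs are assumptions made for illustration, not the actual NDHM consent specification.

```typescript
// Hypothetical consent artefact: field names and purpose codes are assumed
// for illustration and do not reproduce the NDHM/National Health Stack specification.
interface ConsentArtefact {
  healthId: string;              // whose data is being shared
  dataGenerator: string;         // e.g. a hospital or diagnostic lab
  dataConsumer: string;          // e.g. an insurer or a health programme
  purpose: "treatment" | "insurance-claim" | "public-health-programme";
  recordTypes: string[];         // which categories of records may flow
  validUntil: string;            // ISO 8601 expiry of the consent
}

// A fiduciary would release data only when a request matches a live artefact.
function isRequestPermitted(
  artefact: ConsentArtefact,
  consumer: string,
  purpose: ConsentArtefact["purpose"],
  now: Date = new Date()
): boolean {
  return (
    artefact.dataConsumer === consumer &&
    artefact.purpose === purpose &&          // purpose limitation tied to consent
    now < new Date(artefact.validUntil)
  );
}
```

Clause 12 matters precisely because, for processing deemed necessary to a state function, no such artefact needs to exist at all: a purpose check of this kind is replaced by the state’s own assessment of necessity.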

The promised benefits of healthcare data in an anonymised and aggregated form range from disease surveillance and pharmacovigilance to health schemes management systems and nutrition management, benefits that have only been more acutely emphasised during the pandemic. However, the pandemic has also normalised the sharing of sensitive healthcare data with a variety of actors, without much thought given to much-needed data minimisation practices.

The potential misuses of healthcare data include greater state surveillance and control, as well as predatory and discriminatory practices by private actors, who can rely on Clause 12 to do away with even the pretence of informed consent so long as the processing of data is deemed necessary by the state and its private sector partners to provide any service or benefit.

Subclause (e) of Clause 12, which was added in the last version of the Bill drafted by MeitY and has been retained by the JPC, allows processing wherever it is necessary to undertake ‘any measures’ to provide medical treatment or health services during an epidemic, outbreak, or threat to public health. Yet again, the overly broad language used here is designed to ensure that the annoyances of informed consent can be easily brushed aside wherever the state intends to take any measure under any scheme related to public health.

Effectively, how does the framework under Clause 12 alter the consent and purpose limitation model? Data protection laws introduce an element of control by tying purpose limitation to consent: individuals consent to specified purposes, and data processors are required to respect that choice. Where there is no consent, the purposes of data processing are sought to be limited by the necessity principle in Clause 12. The state (or authorised parties) must be able to demonstrate necessity for the exercise of a state function, and data must only be processed for purposes which flow from that necessity. Unlike the consent model, however, this provides an opportunity to keep reinventing purposes for different state functions.

In the absence of a data protection law, data collected by one agency is shared indiscriminately with other agencies and used for multiple purposes beyond the one for which it was collected. The consent and purpose limitation model would have addressed this issue. But by setting such a low threshold for non-consensual processing, Clause 12 effectively legitimises this form of data processing.