
Guest Report: Bridging the Concerns with Recommending Aarogya Setu

Posted by Siddharth Sonkar at Jun 19, 2020 02:35 PM
Keywords: Aarogya Setu, Constitutionality, Digital Contact Tracing, Location Data, Personal Data Protection Bill, 2019, Exemptions, Personal Data, Sensitive Personal Data, Mosaic Theory, Surveillance, Privacy, Governing Law, Necessity, Intensity of Review, Disparate Impact, Proportionality


This report was edited and reviewed by Arindrajit Basu, Mira Swaminathan, and Aman Nair. Read the full report here.

EXECUTIVE SUMMARY

Aarogya Setu collects real-time location data of users every fifteen minutes to facilitate digital contact tracing during the pandemic. Among other functions, it colour-codes users to indicate the extent of risk they pose based on their health status, and it predicts hotspots that are more susceptible to COVID-19. Its forecasts have reportedly facilitated the identification of 650 COVID-19 clusters and the prediction of 300 emerging hotspots that may otherwise have been missed. In a welcome move, the source code of the application was recently made public. The initially-introduced mandate to use the application was reportedly diluted, and a Protocol supplementing the privacy policy with additional safeguards was released. Despite these steps in the right direction, some key concerns remain to be alleviated through engagement. This report seeks to engage constructively with these concerns, towards making the privacy safeguards governing the application's operability more consistent with international best practices.

First, the report maps the situations in which Aarogya Setu in fact remains mandatory (in Table 1). In these situations, there exists no restriction against private parties (e.g. employers, airlines) indirectly making its use mandatory: an employer, for instance, remains under an obligation to undertake due diligence towards ensuring that every employee uses the application. Consequently, there is no real choice in deciding whether to use the application; for practical purposes, its use remains indirectly mandatory. This indirect mandate impedes the exercise of meaningful consent, and could be addressed through a notification directing that no one be indirectly compelled to use the application. This part also acknowledges that even where a choice to opt out exists (e.g. in contexts where there is only an advisory and no indirect mandate), the choice is not meaningful due to the inability to examine the potential consequences of using the application.

Second, the report explains why the mandate to use the application raises concerns in the first place: namely, the absence of transparency beyond the publication of the source code. Open-sourcing the code does not necessarily result in meaningful algorithmic transparency in respect of the predictions made to determine appropriate health responses, since the processing in the models on the Government of India's server remains a black box. From the source code alone, people are unable to meaningfully verify the operability of the application. Algorithmic transparency enables people to make an informed decision about using the application by choice, and the ability to make an informed decision is critical to the right to privacy. The right to privacy does not just mean drawing boundaries or creating limitations against external interference; it also includes the public's right to know how an algorithm affects their lives. Given the centrality of transparency to the user's ability to exercise their privacy, it is important to publicize information about how predictions are made, beyond releasing the source code of Aarogya Setu. This part acknowledges the limitations of transparency: it can only facilitate the identification of privacy harms, not solve them by itself. Yet it re-emphasises the inter-relationship of transparency and privacy, highlighting how transparency recently became a basis for striking down a government-used algorithm, which indicates an incentive to increase transparency.

Third, the report reviews whether, based on the information already available from a combined reading of the privacy policy and the Protocol, the operability of the application is consistent with international best practices in protecting user privacy. This part begins with an analysis of the privacy policy and the Protocol, which indicate privacy concerns in relation to, inter alia, location data, followed by an explanation of why there exists a reasonable expectation of privacy over location data (to establish a privacy intrusion). This is followed by a structured application of the proportionality test to identify necessary modifications to the current framework:

  1. The 'legality' prong may be satisfied by a combined reading of the NDMA and the specificity in the delegated legislation, as has been done in the past, particularly in the context of location tracking. However, it is suggested (in the recommendations section) that a statutory legislation comprehensively governing the operability of the application be introduced, to ensure predictability and permanency in the governing framework, as has been done internationally. Moreover, determining appropriate health responses to the pandemic is indeed a legitimate interest sought to be achieved through the application.

  2. Given the limitations of traditional methods of contact tracing, digital contact tracing could be a suitable method of ascertaining appropriate health responses to the pandemic, subject to a comprehensive and regular review of evidence to verifiably evaluate its effectiveness. Since the use of the application seems likely in the long run, its efficacy needs to be backed by concrete evidence corroborating its accuracy and effectiveness, such as statistical data on the false positives and false negatives that result from the application.

  3. A combined reading of the Aarogya Setu privacy policy and the Protocol against the Fair Information Protection Principles ('FIPPs') indicates some inconsistencies with international best practices. Given these inconsistencies, the application may not be considered the least restrictive, and therefore necessary, form in which digital contact tracing can be conducted in India.

  4. Since the current framework seems more restrictive than necessary to facilitate digital contact tracing in India, a balancing of privacy and public health could lead to the conclusion that the application is not 'proportionate' to the potential privacy harms that can result from its use. While conducting this balancing exercise, privacy and public health should be viewed as complementary, not competing, interests. This conception would encourage courts to consider privacy concerns with sufficient intensity.

Based on this analysis, the report concludes that digital contact tracing may be undertaken provided the following conditions (detailed in the 'Recommendations' section) are conjunctively satisfied:

  1. Digital contact tracing should supplement (i.e. be in addition to) and not supplant (i.e. entirely replace) traditional methods of contact tracing, particularly for vulnerable groups (e.g. interviews where vulnerable groups, particularly marginalized women, do not have access to mobile phones);

  2. A statutory law should be introduced which strictly and comprehensively governs the scope of the application;

  3. The suitability of the application (with meaningful algorithmic transparency) should be corroborated by reliable and relevant statistical evidence (e.g. through closer scrutiny of the basis of predictive outcomes); and

  4. The privacy compromises involved in using the application should be minimally intrusive. This could be achieved by adding further robust safeguards through stronger restrictions on sharing the collected data.

(The author is a final-year undergraduate student of the National University of Juridical Sciences (NUJS), Kolkata, with a sustained interest in law, technology, and policy, graduating with the class of 2020.)