A Critique of Consent in Information Privacy
Notice and Consent as the cornerstone of privacy law
The privacy notice, the primary subject of this article, conveys all pertinent information, including risks and benefits, to the individual, who, armed with this knowledge, can make an informed choice about whether or not to participate.
Most modern laws and data privacy principles focus on individual control. In this context, the definition by the late Alan Westin, former Professor of Public Law and Government Emeritus at Columbia University, which characterises privacy as "the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others,"[1] is most apt. The idea of privacy as control is what finds articulation in data protection policies across jurisdictions, beginning with the Fair Information Practice Principles (FIPPs) from the United States.[2] Paul Schwartz, the Jefferson E. Peyser Professor at UC Berkeley School of Law and a Director of the Berkeley Center for Law and Technology, called the FIPPs the building blocks of modern information privacy law.[3] These principles trace their history to a report called 'Records, Computers and the Rights of Citizens'[4] prepared by an Advisory Committee appointed by the US Department of Health, Education and Welfare in 1973 in response to the increasing automation of data systems containing information about individuals. The Committee's mandate was to "explore the impact of computers on record keeping about individuals and, in addition, to inquire into, and make recommendations regarding, the use of the Social Security number."[5] The most important legacy of this report was the articulation of five principles which would not only play a significant role in privacy laws in the US but also inform data protection law in most privacy regimes internationally,[6] such as the OECD Privacy Guidelines, the EU Data Protection Principles, the FTC Privacy Principles, the APEC Privacy Framework, and the nine National Privacy Principles articulated by the Justice A. P. Shah Committee Report, which are reflected in the Privacy Bill, 2014 in India. Fred Cate, the C. Ben Dutton Professor of Law at the Indiana University Maurer School of Law, effectively summarises the import of all of these privacy regimes as follows:
"All of these data protection instruments reflect the same approach: tell individuals what data you wish to collect or use, give them a choice, grant them access, secure those data with appropriate technologies and procedures, and be subject to third-party enforcement if you fail to comply with these requirements or individuals' expressed preferences"[7]
This empowers the individual and allows them to weigh their own interests in exercising their consent. The allure of this paradigm is that, in one elegant stroke, it seeks to "ensure that consent is informed and free and thereby also to implement an acceptable tradeoff between privacy and competing concerns."[8] This system was originally intended to be only one of multiple ways in which data processing would be governed, along with other substantive principles such as data quality; however, it soon became the dominant and often the only mechanism.[9] In recent years, however, the emergence of Big Data and the nascent development of the Internet of Things have led many commentators to begin questioning the workability of consent as a principle of privacy.[10] In this article we will look closely at some of the issues with the concept of informed consent, and how these problems have become more acute in recent years. Following an analysis of these issues, we will conclude by arguing that consent, as the cornerstone of privacy law, may today be counter-productive, and that a rethinking of the principles-based approach to privacy may be necessary.
Problems with Consent
To a certain extent, some cognitive problems have always existed with informed consent, such as long and difficult-to-understand privacy notices,[11] although in the recent past these problems have become much more aggravated. Fred Cate points out that the FIPPs at their inception were broad principles which included both substantive and procedural aspects. However, as they were translated into national laws, the emphasis remained on the procedural aspects of notice and consent. From the idea of individual or societal welfare as the goal of privacy, the focus shifted to individual control.[12] With data collection occurring with every use of online services, and complex data sets being created, it is humanly impossible to exercise rational decision-making about the choice to allow someone to use our personal data. The thrust of Big Data technologies is that the value of data resides not in its primary purposes but in its numerous secondary purposes, where data is re-used many times over.[13] In that sense, the very idea of Big Data conflicts with the data minimization principle:[14] the idea is to retain as much data as possible for secondary uses. Since these secondary uses are, by their nature, unanticipated, this runs counter to the very idea of the purpose limitation principle.[15] The notice and consent requirement has simply led to a proliferation of long and complex privacy notices which are seldom read and even more rarely understood. We will articulate some issues with privacy notices which have always existed, and have only become exacerbated in the context of Big Data and the Internet of Things.
1. Failure to read/access privacy notices
The notice and consent principle relies on the ability of the individual to make an informed choice after reading the privacy notice. The purpose of a privacy notice is to act as a public announcement of an organization's internal practices on the collection, processing, retention and sharing of information, and to make the user aware of the same.[16] However, in order for this to happen, the individual must first be able to access the privacy notice in an intelligible format and read it. Privacy notices come in various forms, ranging from documents posted as privacy policies on a website, to click-through notices in a mobile app, to signs posted in public spaces informing about the presence of CCTV cameras.[17]
In order for the principle of notice and consent to work, privacy notices need to be made available in a language understood by the user. As per estimates, about 840 million people (11% of the world population) can speak or understand English; however, most privacy notices online are not available in the local languages of different regions.[18] Further, with the ubiquity of smartphones and the advent of the Internet of Things, constrained interfaces on mobile screens and wearables make privacy notices extremely difficult to read. It must be remembered that privacy notices often run into several pages, and smaller screens effectively ensure that most users do not read through them. Further, connected wearable devices often have "little or no interfaces that readily permit choices."[19] As more and more devices are connected, this problem will only get more pronounced. Imagine a world where refrigerators act as intermediaries, disclosing information to your doctor or supermarket: at what point does the data subject step in and exercise consent?[20]
Another aspect that needs to be understood is that, unlike earlier, when data collectors were few and far between and the user could theoretically make a rational choice taking into account the purpose of data collection, in the world of Big Data consent often needs to be provided while the user is trying to access services. In that context, click-through privacy notices, such as those required to access online applications, are treated simply as an impediment that must be crossed in order to get access to services. The fact that consent needs to be given in real time almost always results in users disregarding what the privacy notices say.[21]
Finally, some scholars have argued that while individual control over data may be appealing in theory, it merely gives an illusion of enhanced privacy, not the reality of meaningful choice.[22] Research demonstrates that the presence of the term 'privacy policy' leads people to the false assumption that if a company has a privacy policy in place, it automatically imposes substantive and responsible limits on how data is handled.[23] Joseph Turow, the Robert Lewis Shayon Professor of Communication at the Annenberg School for Communication, and his team, for example, have demonstrated how "[w]hen consumers see the term 'privacy policy,' they believe that their personal information will be protected in specific ways; in particular, they assume that a website that advertises a privacy policy will not share their personal information."[24] In reality, however, privacy policies are more likely to serve as liability disclaimers for companies than as any kind of guarantee of privacy for consumers. Most people tend to ignore privacy policies.[25] Cass Sunstein states that our cognitive capacity to make choices and take decisions is limited: when faced with an overwhelming number of choices, most of us do not read privacy notices and resort to default options.[26] The requirement to make choices, sometimes several times a day, imposes a significant burden on consumers as well as on the businesses seeking such consent.[27]
2. Failure to understand privacy notices
FTC Chairperson Edith Ramirez stated: "In my mind, the question is not whether consumers should be given a say over unexpected uses of their data; rather, the question is how to provide simplified notice and choice."[28] Privacy notices often come in the form of long legal documents, much to the detriment of the readers' ability to understand them. These policies are "long, complicated, full of jargon and change frequently."[29] Kent Walker lists five problems that privacy notices typically suffer from: a) overkill - long and repetitive text in small print; b) irrelevance - describing situations of little concern to most consumers; c) opacity - broad terms that reflect the truth that it is impossible to track and control all the information collected and stored; d) non-comparability - the simplification required to achieve comparability compromises accuracy; and e) inflexibility - failure to keep pace with new business models.[30] Erik Sherman reviewed twenty-three corporate privacy notices and mapped them against three indices which give the approximate level of education necessary to understand a text on a first read. His results show that most of the policies can only be understood on a first read by people at a grade level of 15 or above.[31] FTC Chairperson Timothy Muris summed up the problem with long privacy notices when he said, "Acres of trees died to produce a blizzard of barely comprehensible privacy notices."[32]
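Grade-level indices of the kind Sherman relied on are standard readability formulas; the Flesch-Kincaid Grade Level, for instance, estimates the US school grade needed to understand a text from average sentence length and average syllables per word. The following is a minimal sketch of that formula; the syllable counter is a crude heuristic and the sample strings are invented for illustration, not drawn from Sherman's study:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels,
    # discounting a trailing silent 'e'.
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Flesch-Kincaid Grade Level:
    #   0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) \
         + 11.8 * (syllables / len(words)) - 15.59

simple = "We collect your name. We share it with partners."
legalese = ("Notwithstanding the foregoing, personally identifiable "
            "information may be disseminated to affiliated third-party "
            "entities for legitimate operational purposes.")
print(flesch_kincaid_grade(simple))    # low single digits
print(flesch_kincaid_grade(legalese))  # well past college level
```

A grade above 12 corresponds to college-level reading, which is consistent with Sherman's finding that most policies demand a grade level of 15 or higher.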
Margaret Jane Radin, the former Henry King Ransom Professor of Law Emerita at the University of Michigan, provides a good definition of free consent: it "involves a knowing understanding of what one is doing in a context in which it is actually possible for one to do otherwise, and an affirmative action in doing something, rather than a merely passive acquiescence in accepting something."[33] There have been various proposals advocating a more succinct and simpler standard for privacy notices,[34] multi-layered notices,[35] or representing the information in the form of a table.[36] However, studies show only an insignificant improvement in consumers' understanding when privacy policies are represented in graphic formats like tables and labels.[37] It has also been pointed out that it is impossible to convey complex data policies in simple and clear language.[38]
3. Failure to anticipate/comprehend the consequences of consent
Today's infinitely complex and labyrinthine data ecosystem is beyond the comprehension of most ordinary users. Despite a growing willingness to share information online, most have no understanding of what happens to their data once they have uploaded it: where does it go? Who holds it? Under what conditions? For what purposes? How might it be used, aggregated, hacked, or leaked in the future? For the most part, these operations are "invisible, managed at distant centers, from behind the scenes, by unmanned powers."[39]
The perceived opportunities and benefits of Big Data have led to an acceptance of the indiscriminate collection of as much data as possible, as well as the retention of that data for unspecified future analysis. For many advocates, such practices are absolutely essential if Big Data is to deliver on its promises. Experts have argued that key privacy principles, particularly those of collection limitation, data minimization and purpose limitation, should not be applied to Big Data processing.[40] As mentioned above, in the case of Big Data, the value of the data collected often comes not from its primary purpose but from its secondary uses. Deriving value from datasets involves amalgamating diverse datasets and executing speculative and exploratory kinds of analysis in order to discover hidden insights and correlations that might previously have gone unnoticed.[41] As such, organizations today routinely reprocess data collected from individuals for purposes not directly related to the services they provide to the customer. These secondary uses of data are becoming increasingly valuable sources of revenue for companies as the value of data in and of itself continues to rise.[42]
Purpose Limitation
The principle of purpose limitation has served as a key component of data protection for decades. The purposes for which users' data will be processed should be stated at the time of collection and consent, and should be "specified, explicit and legitimate". In practice, however, the reasons given typically consist of phrases such as 'for marketing purposes' or 'to improve the user experience' that are vague and open to interpretation.[43]
Some commentators, whilst conceding that purpose limitation may not be possible in the era of Big Data, have instead attempted to emphasise the notion of 'compatible use' requirements. In the view of the Article 29 Working Party on the Protection of Individuals with regard to the Processing of Personal Data, for example, use of data for a purpose other than that originally stated at the point of collection should be subject to a case-by-case review of whether or not further processing for a different purpose is justifiable - i.e., compatible with the original purpose. Such a review may take into account, for example, the context in which the data was originally collected, the nature or sensitivity of the data involved, and the existence of relevant safeguards to ensure fair processing of the data and prevent undue harm to the data subject.[44]
On the other hand, Big Data advocates have argued that an assessment of legitimate interest, rather than compatibility with the initial purpose, is far better suited to Big Data processing.[45] They argue that the notion of purpose limitation has become outdated. Previously, data was collected largely as a by-product of the service it enabled: if, for example, we opted to use a service, the information we provided was for the most part necessary to enable the provision of that service. Today, however, the utility of data is no longer restricted to the primary purpose for which it is collected; it can be used to provide all kinds of secondary services and resources, reduce waste, increase efficiency and improve decision-making.[46] These kinds of positive externalities, Big Data advocates insist, are only made possible by the reprocessing of data.
Unfortunately for the notion of consent, the nature of these secondary purposes is rarely evident at the time of collection. Instead, the true value of the data can often only be revealed when it is amalgamated with other diverse datasets and subjected to various forms of analysis to help reveal hidden and non-obvious correlations and insights.[47] The uncertain and speculative value of data therefore means that it is impossible to provide "specific, explicit, and legitimate" details about how a given data set will be used or how it might be aggregated in future. Without this crucial information, data subjects have no basis upon which to make an informed decision about whether or not to provide consent. Robert Sloan and Richard Warner argue that it is impossible for a privacy notice to contain enough information to enable free consent: current data collection practices are highly complex, involving the collection of information at one stage for one purpose, which is then retained, analyzed, and distributed for a variety of other purposes in unpredictable ways.[48] Helen Nissenbaum points to the ever-changing nature of data flows and the cognitive challenges they pose: "Even if, for a given moment, a snapshot of the information flows could be grasped, the realm is in constant flux, with new firms entering the picture, new analytics, and new back end contracts forged: in other words, we are dealing with a recursive capacity that is indefinitely extensible."[49]
Scale and Aggregation
Today the quantity of data being generated is expanding at an exponential rate. From smartphones and televisions, trains and airplanes, sensor-equipped buildings and even the infrastructures of our cities, data now streams constantly from almost every sector and function of daily life, 'creating countless new digital puddles, lakes, tributaries and oceans of information'.[50] In 2011 it was estimated that the quantity of data produced globally would surpass 1.8 zettabytes; by 2013 that had grown to 4 zettabytes, and with the nascent development of the Internet of Things gathering pace, these trends are set to continue.[51] Big Data by its very nature requires the collection and processing of very large and very diverse data sets. Unlike other forms of scientific research and analysis, which utilize various sampling techniques to identify and target the types of data most useful to the research questions, Big Data seeks to gather as much data as possible in order to achieve full resolution of the phenomenon being studied - a task made much easier in recent years by the proliferation of internet-enabled devices and the growth of the Internet of Things. This goal of attaining comprehensive coverage, however, exists in tension with the key privacy principles of collection limitation and data minimization, which seek to limit both the quantity and variety of data collected about an individual to the absolute minimum.[52]
The dilution of the purpose limitation principle means that even those who understand privacy notices and are capable of making rational choices about them cannot conceptualize how their data will be aggregated and possibly used or re-used. Seemingly innocuous bits of data revealed at different stages can be combined to reveal sensitive information about an individual. Daniel Solove, the John Marshall Harlan Research Professor of Law at the George Washington University Law School, calls this the aggregation effect in his book The Digital Person. He argues that the ingenuity of data mining techniques, and the insights and predictions that can be drawn from them, render any cost-benefit analysis an individual could make ineffectual.[53]
4. Failure to opt-out
The traditional choice against the collection of personal data that users have had access to, at least in theory, is the option to 'opt out' of certain services. This draws from the free market theory that individuals exercise their free will when they use services and always have the option of opting out - an argument against regulation that relies instead on the collective wisdom of the market to weed out harms. The notion that the provision of data should be a matter of personal choice on the part of the individual, and that the individual can, if they so choose, decide to 'opt out' of data collection, for example by ceasing use of a particular service, is an important component of privacy and data protection frameworks. The proliferation of internet-enabled devices, their integration into the built environment, and the real-time nature of data collection and analysis, however, are beginning to undermine this concept. For many critics of Big Data, the ubiquity of data collection points, as well as the compulsory provision of data as a prerequisite for access to and use of many key online services, is making opting out of data collection not only impractical but in some cases impossible.[54]
Whilst sceptics may object that individuals are still free to stop using services that require data, as online connectivity becomes increasingly important to participation in modern life, the choice to withdraw completely is becoming less of a genuine choice.[55] Information flows not only from the individuals it is about but also from what other people say about them. Financial transactions made online or via debit/credit cards can be analysed to derive further information about the individual. If opting out makes you look anti-social, criminal, or unethical, the claim that we are exercising free will seems murky, and leads one to wonder whether we are dealing with coercive technologies.
Another issue with the consent and opt-out paradigm is the binary nature of the choice, which makes a mockery of the notion that consent can function as an effective tool of personal data management. What it effectively means is that one can either agree to the long privacy notice, or choose to abandon the desired service. "This binary choice is not what the privacy architects envisioned four decades ago when they imagined empowered individuals making informed decisions about the processing of their personal data. In practice, it certainly is not the optimal mechanism to ensure that either information privacy or the free flow of information is being protected."[56]
Conclusion: 'Notice and Consent' is counter-productive
There continues to be an unwillingness amongst many privacy advocates to concede that the concept of consent is fundamentally broken; as Simon Davies, a privacy advocate based in London, comments, 'to do so could be seen as giving ground to the data vultures', and risks further weakening an already dangerously fragile privacy framework.[57] Nevertheless, as we begin to transition into an era of ubiquitous data collection, the evidence is becoming stronger that consent is not simply ineffective, but may in some instances be counter-productive to the goals of privacy and data protection.
As already noted, the notion that privacy agreements produce anything like truly informed consent has long since been discredited; given this fact, one may ask for whose benefit such agreements are created. One may justifiably argue that, far from being for the benefit and protection of users, privacy agreements may in fact be fundamentally to the benefit of data brokers, who, having gained the consent of users, can act with near impunity in their use of the data collected. Thus, an overly narrow focus on the necessity of consent at the point of collection risks diverting our attention from the arguably more important issue of how our data is stored, analysed and distributed by data brokers following its collection.[58]
Furthermore, given the often complicated and cumbersome processes involved in gathering consent from users, some have raised concerns that the mechanisms put in place to garner consent could themselves morph into surveillance mechanisms. Davies, for example, cites the case of the EU Cookie Directive, which required websites to gain consent for the use of cookies. Davies observes how 'a proper audit and compliance element in the system could require the processing of even more data than the original unregulated web traffic. Even if it was possible for consumers to use some kind of gateway intermediary to manage the consent requests, the resulting data collection would be overwhelming'. Thus, in many instances there exists a fundamental tension between the requirement placed on companies to gather consent and the equally important principle of data minimization.[59]
Given the above issues with notice and informed consent in the context of information privacy, and the fact that it is counter-productive to the larger goals of privacy law, it is important to revisit the principles- or rights-based approach to data protection and consider a paradigm shift towards a risk-based approach: one that takes into account the actual threats of sharing data, rather than relying on what has proved to be an ineffectual system of individual control. We will deal with some of these issues in a follow-up to this article.
[1] Alan Westin, Privacy and Freedom, Atheneum, New York, 1967.
[2] FTC Fair Information Practice Principles (FIPP) available at https://www.it.cornell.edu/policies/infoprivacy/principles.cfm.
[3] Paul M. Schwartz, "Privacy and Democracy in Cyberspace," 52 Vanderbilt Law Review 1607, 1614 (1999).
[4] US Secretary's Advisory Committee on Automated Personal Data Systems, Records, Computers and the Rights of Citizens, available at http://www.justice.gov/opcl/docs/rec-com-rights.pdf
[5] Ibid.
[6] Marc Rotenberg, "Fair Information Practices and the Architecture of Privacy: What Larry Doesn't Get," available at https://journals.law.stanford.edu/sites/default/files/stanford-technology-law-review/online/rotenberg-fair-info-practices.pdf
[7] Fred Cate, The Failure of Fair Information Practice Principles, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972
[8] Robert Sloan and Richard Warner, Beyond Notice and Choice: Privacy, Norms and Consent, 2014, available at https://www.suffolk.edu/documents/jhtl_publications/SloanWarner.pdf
[9] Fred Cate and Viktor Mayer-Schoenberger, Notice and Consent in a World of Big Data, available at http://idpl.oxfordjournals.org/content/3/2/67.abstract
[10] Daniel Solove, Privacy Self-Management and the Consent Dilemma, 2013, available at http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2093&context=faculty_publications
[11] Ben Campbell, Informed Consent in Developing Countries: Myth or Reality, available at https://www.dartmouth.edu/~ethics/docs/Campbell_informedconsent.pdf
[12] Supra Note 7.
[13] Viktor Mayer-Schoenberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work and Think, John Murray, London, 2013, at 153.
[14] The data minimization principle requires organizations to limit the collection of personal data to the minimum extent necessary to achieve their legitimate purpose, and to delete data no longer required.
[15] Omer Tene and Jules Polonetsky, "Big Data for All: Privacy and User Control in the Age of Analytics," SSRN Scholarly Paper, available at http://papers.ssrn.com/abstract=2149364
[16] Florian Schaub, R. Balebako et al, "A Design Space for effective privacy notices" available at https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf
[17] Daniel Solove, The Digital Person: Technology and Privacy in the Information Age, NYU Press, 2006.
[19] Opening Remarks of FTC Chairperson Edith Ramirez, 'Privacy and the IoT: Navigating Policy Issues', International Consumer Electronics Show, Las Vegas, Nevada, January 6, 2015, available at https://www.ftc.gov/system/files/documents/public_statements/617191/150106cesspeech.pdf
[20] Simon Davies, Why the idea of consent for data processing is becoming meaningless and dangerous, available at http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/
[21] Supra Note 10.
[22] Supra Note 7.
[23] Chris Jay Hoofnagle & Jennifer King, Research Report: What Californians Understand About Privacy Online, available at http://ssrn.com/abstract=1262130
[24] Joseph Turow, Michael Hennessy, Nora Draper, The Tradeoff Fallacy, available at https://www.asc.upenn.edu/sites/default/files/TradeoffFallacy_1.pdf
[25] Saul Hansell, "Compressed Data: The Big Yahoo Privacy Storm That Wasn't," New York Times, May 13, 2002 available at http://www.nytimes.com/2002/05/13/business/compressed-data-the-big-yahoo-privacy-storm-that-wasn-t.html?_r=0
[26] Cass Sunstein, Choosing not to choose: Understanding the Value of Choice, Oxford University Press, 2015.
[27] For example, Acxiom processes more than 50 trillion data transactions a year. See http://www.nytimes.com/2012/06/17/technology/acxiom-the-quiet-giant-of-consumer-database-marketing.html?pagewanted=all&_r=0
[28] Supra Note 19.
[29] L. F. Cranor. Necessary but not sufficient: Standardized mechanisms for privacy notice and choice. Journal on Telecommunications and High Technology Law, 10:273, 2012, available at http://jthtl.org/content/articles/V10I2/JTHTLv10i2_Cranor.PDF
[30] Kent Walker, The Costs of Privacy, 2001 available at https://www.questia.com/library/journal/1G1-84436409/the-costs-of-privacy
[31] Erik Sherman, "Privacy Policies are great - for Phds", CBS News, available at http://www.cbsnews.com/news/privacy-policies-are-great-for-phds/
[32] Timothy J. Muris, Protecting Consumers' Privacy: 2002 and Beyond, available at http://www.ftc.gov/speeches/muris/privisp1002.htm
[33] Margaret Jane Radin, Humans, Computers, and Binding Commitment, 1999 available at http://www.repository.law.indiana.edu/ilj/vol75/iss4/1/
[34] Annie I. Anton et al., Financial Privacy Policies and the Need for Standardization, 2004 available at https://ssl.lu.usi.ch/entityws/Allegati/pdf_pub1430.pdf; Florian Schaub, R. Balebako et al, "A Design Space for effective privacy notices" available at https://www.usenix.org/system/files/conference/soups2015/soups15-paper-schaub.pdf
[35] The Center for Information Policy Leadership, Hunton & Williams LLP, "Ten Steps To Develop A Multi-Layered Privacy Notice" available at https://www.informationpolicycentre.com/files/Uploads/Documents/Centre/Ten_Steps_whitepaper.pdf
[36] Alan Levy and Manoj Hastak, Consumer Comprehension of Financial Privacy Notices, Interagency Notice Project, available at https://www.sec.gov/comments/s7-09-07/s70907-21-levy.pdf
[37] Patrick Gage Kelly et al., Standardizing Privacy Notices: An Online Study of the Nutrition Label Approach available at https://www.ftc.gov/sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00037/544506-00037.pdf
[38] Howard Latin, "Good" Warnings, Bad Products, and Cognitive Limitations, 41 UCLA Law Review available at https://litigation-essentials.lexisnexis.com/webcd/app?action=DocumentDisplay&crawlid=1&srctype=smi&srcid=3B15&doctype=cite&docid=41+UCLA+L.+Rev.+1193&key=1c15e064a97759f3f03fb51db62a79a5
[39] Jonathan Obar, Big Data and the Phantom Public: Walter Lippmann and the fallacy of data privacy self management, Big Data and Society, 2015, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2239188
[40] Viktor Mayer-Schoenberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work and Think, John Murray, London, 2013.
[41] Supra Note 15.
[42] Supra Note 40.
[43] Article 29 Working Party, (2013) Opinion 03/2013 on Purpose Limitation, Article 29, available at: http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf
[44] Ibid.
[45] It remains unclear however whose interest would be accounted, existing EU legislation would allow commercial/data broker/third party interests to trump those of the user, effectively allowing re-processing of personal data irrespective of whether that processing would be in the interest of the user.
[46] Supra Note 40.
[47] Supra Note 10.
[48] Supra Note 8.
[49] Helen Nissenbaum, A Contextual Approach to Privacy Online, available at http://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf
[50] D Bollier, The Promise and Peril of Big Data. The Aspen Institute, 2010, available at: http://www.aspeninstitute.org/sites/default/files/content/docs/pubs/The_Promise_and_Peril_of_Big_Data.pdf
[51] Meeker, M. & Yu, L., Internet Trends, Kleiner Perkins Caufield & Byers, 2013, available at http://www.slideshare.net/kleinerperkins/kpcb-internet-trends-2013
[52] Supra Note 40.
[53] Supra Note 17.
[54] Janet Vertesi, My Experiment Opting Out of Big Data Made Me Look Like a Criminal, 2014, available at http://time.com/83200/privacy-internet-big-data-opt-out/
[55] Ibid.
[57] Simon Davies, Why the idea of consent for data processing is becoming meaningless and dangerous, available at http://www.privacysurgeon.org/blog/incision/why-the-idea-of-consent-for-data-processing-is-becoming-meaningless-and-dangerous/
[58] Supra Note 10.
[59] Supra Note 57.