The Centre for Internet and Society
http://editors.cis-india.org
Global Technology Summit 2017
http://editors.cis-india.org/internet-governance/news/global-technology-summit-2017
<b>The 2017 Global Technology Summit will take place on December 7 and 8, 2017 at the Hotel Leela Palace, Bangalore. Sunil Abraham is a speaker at the event.</b>
<p style="text-align: justify; ">Link to the original published by Carnegie <a class="external-link" href="http://carnegieindia.org/2017/12/08/global-technology-summit-2017-event-5656?mkt_tok=eyJpIjoiTjJKbFlXWTBaakV3TVRVMSIsInQiOiJ1YkRmVHZHd2h2bVFOTzNEQm94YzRBYUtrWjFwNnhXMkJFSWNiSDE0QldRd3RsT3d1cXhyd2xrNGs4MjdUc2NTN3kyMm9wd28zWGgrcWFDVVBMXC90czhYQ0dSTzlPajRseGdzXC80WW4wWE9zMVR1N1pYY0pmdHBqZTRjSGphQWVRIn0%3D">here</a></p>
<hr style="text-align: justify; " />
<p style="text-align: justify; ">The inaugural edition of the <a href="http://carnegieindia.org/2016/12/07/global-technology-summit-2016-event-5407">Global Technology Summit</a> convened leading scholars, experts, and officials from more than ten countries for wide-ranging discussions on policy frameworks for technological innovation.</p>
<p style="text-align: justify; ">Building on that success, the 2017 edition will bring leading innovators, researchers, and entrepreneurs in cutting-edge technologies from around the world together with regulators, policy experts, and civil society actors this December in Bangalore.</p>
<p style="text-align: justify; ">The Summit will focus on new directions in technology policy, such as tech-diplomacy, data protection, and building an innovation ecosystem, as well as fields like digital finance, e-mobility, robotics, and smart cities, where massive technological transformation is likely in the coming years.</p>
<p><a class="external-link" href="http://cis-india.org/internet-governance/files/global-technology-summit-2017-agenda"><b>Agenda here</b></a></p>
<h3>Panel Description</h3>
<p style="text-align: justify; ">Navigating Big Data Challenges: Access to data, and the capability to analyse it, redefine the business moat for corporations and the governance opportunities for governments. Data dictates product and policy success. It also raises complex challenges. With ever-increasing hacks and vulnerabilities, data security continues to confound us. Data-driven businesses and governments also call into question core assumptions about privacy and individual reputation. Machine learning and deep learning systems, facilitated by data-crunching algorithms, can either be coded to discriminate or learn from human data sets and imbibe the very same prejudices. This panel will deliberate upon these varied challenges and explore possible policy frameworks to address them.</p>
<p style="text-align: justify; ">The panelists are:</p>
<ul>
<li>Ann Cavoukian</li>
<li>Rahul Matthan</li>
<li>Vishnu Shankar</li>
<li>Rob Sherman</li>
<li>Sunil Abraham</li>
</ul>
<p style="text-align: justify; ">Chaired by B.N. Srikrishna, former judge, Supreme Court of India</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/global-technology-summit-2017'>http://editors.cis-india.org/internet-governance/news/global-technology-summit-2017</a>
</p>
#NAMAprivacy: Data standards for IoT and home automation systems
http://editors.cis-india.org/internet-governance/news/medianama-october-18-2017-namaprivacy-data-standards-for-iot
<b>On 5th October, MediaNama held a #NAMAprivacy conference in Bangalore focused on Privacy in the context of Artificial Intelligence, Internet of Things (IoT) and the issue of consent, supported by Google, Amazon, Mozilla, ISOC, E2E Networks and Info Edge, with community partners HasGeek and Takshashila Institution. Part 1 of the notes from the discussion on IoT:</b>
<p style="text-align: justify; ">Link to the original published by Medianama on October 18 <a class="external-link" href="https://www.medianama.com/2017/10/223-namaprivacy-data-standards-for-iot/">here</a></p>
<hr />
<p style="text-align: justify; ">The second session of #NAMAprivacy in Bangalore dealt with data privacy in the Internet of Things (IoT) framework. All three panelists for the session – <b>Kiran Jonnalagadda from HasGeek; Vinayak Hegde, a big data consultant working with ZoomCar; and Rohini Lakshane, a policy researcher from CIS</b> – said that they were scared about the spread of IoT at the moment. This led to a discussion on the standards that will apply to IoT, still nascent at this stage, and how they could include privacy as well.</p>
<p style="text-align: justify; "><img class="size-full wp-image-176794 aligncenter" height="501" src="https://i2.wp.com/www.medianama.com/wp-content/uploads/IOT-panel-Namaprivacy-e1508321963437.jpg?resize=750%2C501&ssl=1" width="750" /></p>
<p style="text-align: justify; ">Hegde, a volunteer with the Internet Engineering Task Force (IETF), which was instrumental in developing internet protocols and standards such as DNS, TCP/IP and HTTP, said that the IETF recently took a political stand when it came to privacy. “One of the discussions in the IETF was whether security is really important. For a long time, the pendulum swung the other way and said that it’s important but not big enough a trade-off, until the bomb dropped with the Snowden revelations. <b>The IETF has always avoided taking any political stance. But for the first time, they did take a political position and published a request for comments which said: “Pervasive monitoring is an attack on the Internet”, and that has become a guiding principle for developing the standards,</b>” he explained.</p>
<p style="text-align: justify; ">He added that this led to the development of new standards which took privacy into consideration by default.</p>
<blockquote style="text-align: justify; ">
<p>“The repercussions have been pervasive across all the layers of the stack, whether it is DNS and the development of DNSSEC. The next version of HTTP does not actually mandate encryption, but if you look at all the implementations on the browser side, all of them without exception have incorporated encryption,” he added.</p>
</blockquote>
<p style="text-align: justify; "><img class="size-full wp-image-176747 aligncenter" height="500" src="https://i2.wp.com/www.medianama.com/wp-content/uploads/NAMA-Data-Protection-Bangalore-93-e1508322824147.jpg?resize=750%2C500&ssl=1" width="750" /></p>
<p style="text-align: justify; ">Rohini added that discussion around the upcoming 5G standard, where large-scale IoT will be deployed, also includes an increased emphasis on privacy. “It is essentially a lot of devices connected to the Internet and talking to each other and the user. The standards for security and privacy for 5G are being built and some of them are in the process of discussion. Different standard-setting bodies have been working on them, and there is a race of sorts among stakeholders, technology companies, etc., to get their tech into the standard,” she said.</p>
<p style="text-align: justify; ">“<b>The good thing about those is that they will have time to get security and privacy right. Here, I would like to mention <a href="https://ict-rerum.eu/">RERUM</a>, an acronym for Reliable, Resilient, and Secure IoT for smart cities, which is being piloted in the EU. </b>It essentially believes that security should include reliability and privacy by design. The pilot project was designed to let IoT applications consider security and privacy mechanisms early in the design, so that they could be balanced with reliability. Because once a standard or a mechanism is out, and you implement something as large as a smart city, it is very difficult to retrofit these considerations,” she explained.</p>
<p style="text-align: justify; "><img class="size-full wp-image-176796 aligncenter" height="499" src="https://i2.wp.com/www.medianama.com/wp-content/uploads/Rohini-Lakshane-CIS-Namaprivacy-e1508322694320.jpg?resize=750%2C499&ssl=1" width="750" /></p>
<h2 style="text-align: justify; ">Privacy issues in home automation and IoT</h2>
<p style="text-align: justify; ">Rohini pointed to a report illustrating the staggering amount of data that home automation will generate. “I was looking for figures, and I found an FTC report published in 2015 where one IoT company revealed in a workshop that it <b>provides home automation to fewer than 10,000 households, but all of them put together account for 150 million data points per day.</b> So that’s one data point every six seconds per household. And this is just IoT for home automation; there is also IoT for health and fitness, medical devices, personal safety, public transport, environment, connected cars, etc.”</p>
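<p style="text-align: justify; ">The per-household rate quoted above follows from straightforward arithmetic, which can be checked directly (a quick sketch; the variable names are ours):</p>

```python
# Sanity-check the FTC-report figures quoted above: 150 million data
# points per day spread across roughly 10,000 households.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

points_per_day = 150_000_000
households = 10_000

points_per_household = points_per_day / households          # 15,000 per day
seconds_per_point = SECONDS_PER_DAY / points_per_household  # 5.76 seconds

print(f"One data point every {seconds_per_point:.2f} s per household")
```

<p style="text-align: justify; ">86,400 seconds a day divided by 15,000 readings per household gives 5.76 seconds, confirming the “one data point every six seconds” figure.</p>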
<p style="text-align: justify; ">In this sort of situation, the data collected could be used for harms that users did not account for.</p>
<blockquote style="text-align: justify; ">
<p>“I received some data a couple of years back, and the data was from a water flowmeter. It was fitted to a villa in Hoskote, and the idea was simple: you could measure the water consumption in the villa and track it. When I received the data, I figured out that just by looking at the water consumption, you can see how many people are in the house, when they get up at night, when they go out, and when they are out of station. All of this data can be misused. The data is collected specifically to measure water consumption and find any leakages in the house. But it could be used for other purposes,” <b>Arvind P from Devopedia</b> said.</p>
</blockquote>
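<p style="text-align: justify; ">The kind of inference Arvind describes is easy to illustrate. The sketch below uses invented hourly flowmeter readings; any hour in which water is drawn betrays that someone is active in the house:</p>

```python
# Illustrative sketch of occupancy inference from a water flowmeter.
# The hourly readings (litres per hour) are invented for illustration.
hourly_flow = {
    0: 0, 1: 0, 2: 5, 3: 0, 4: 0, 5: 0,            # a 2 a.m. draw: someone is awake
    6: 40, 7: 60, 8: 30, 9: 0, 10: 0, 11: 0,       # morning routine, then the house empties
    18: 25, 19: 35, 20: 20, 21: 10, 22: 5, 23: 0,  # evening activity
}

def activity_windows(flow, threshold=1):
    """Return the hours in which water was drawn, i.e. someone was active."""
    return [hour for hour, litres in sorted(flow.items()) if litres > threshold]

# A meter installed only to track consumption and leaks nonetheless reveals
# wake-up times, evening routines, and when the house stands empty.
active = activity_windows(hourly_flow)
```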
<p style="text-align: justify; "><img class="size-full wp-image-176800 aligncenter" height="499" src="https://i1.wp.com/www.medianama.com/wp-content/uploads/Arvind-Devopedia-Namaprivcay-e1508323377344.jpg?resize=750%2C499&ssl=1" width="750" /></p>
<p style="text-align: justify; "><b>Pranesh Prakash, policy director at the Centre for Internet and Society (CIS)</b>, also gave the example of a Twitter handle called “should I be robbed now”, which correlates users’ vacation pictures to suggest that they could be robbed. “What we need to remember is that a lot of correlation analysis is not just about the analysis; it is also about the use and misuse of it. A lot of that use and misuse is non-transparent. Not a single company tells you how they use your data, but they do take rights over your data,” he added.</p>
<p style="text-align: justify; "><img class="size-full wp-image-176801 aligncenter" height="501" src="https://i1.wp.com/www.medianama.com/wp-content/uploads/Pranesh-Prakash-Namaprivacy-e1508324108535.jpg?resize=750%2C501&ssl=1" width="750" /></p>
<p style="text-align: justify; ">Vinayak Hegde also added that governments are using similar data-tracking methods, based on smart meters, to catch bitcoin miners in China and Venezuela.</p>
<p style="text-align: justify; ">“In China, there are all these bitcoin miners. I was reading this story about Venezuela, where bitcoin mining is outlawed. <b>The way they’re catching these bitcoin miners is by looking at their electricity consumption. Bitcoin mining uses a huge amount of power and computing capacity.</b> And people have come up with ingenious ways of getting around it. They will draw power from their neighbours or maybe from an industrial setting. This is a good example of a privacy-infringing activity.”</p>
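<p style="text-align: justify; ">The detection Hegde describes amounts to simple outlier analysis on consumption data. A minimal sketch, with invented meter readings and an arbitrary cutoff:</p>

```python
import statistics

# Illustrative sketch: flag smart meters whose daily consumption (kWh)
# is far above the neighbourhood norm. All readings are invented.
readings = {
    "meter_a": 11.2,
    "meter_b": 9.8,
    "meter_c": 10.5,
    "meter_d": 96.0,   # sustained heavy draw, e.g. mining hardware
    "meter_e": 12.1,
}

def flag_outliers(kwh_by_meter, z_cutoff=1.5):
    """Return meters whose consumption z-score exceeds the cutoff."""
    values = list(kwh_by_meter.values())
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [m for m, v in kwh_by_meter.items() if (v - mean) / stdev > z_cutoff]

suspicious = flag_outliers(readings)  # only meter_d stands out
```

<p style="text-align: justify; ">One caveat: a single extreme value inflates the standard deviation, which is why the cutoff here is deliberately low; at utility scale a robust measure such as the median absolute deviation would be a better choice.</p>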
<h2 style="text-align: justify; "><b>Pseudonymization</b></h2>
<p style="text-align: justify; "><b>Srinivas P, head of security at Infosys</b>, pointed out that a possible way to provide privacy in home automation systems could be the concept of pseudonymity. <b>Pseudonymization</b> is a procedure by which the most identifying fields within a data record are replaced by one or more artificial identifiers, or pseudonyms.</p>
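<p style="text-align: justify; ">As a rough illustration of the procedure (not any particular vendor’s scheme), the identifying fields of a record can be replaced with keyed-hash pseudonyms, with the key and any mapping back to real identities stored separately:</p>

```python
import hashlib
import hmac

SECRET_KEY = b"store-me-separately-and-rotate"  # illustrative key

def pseudonymize(record, identifying_fields=("name", "address")):
    """Replace identifying fields with short keyed-hash pseudonyms.

    A keyed hash (HMAC) rather than a plain hash means that someone who
    obtains the records cannot re-hash known names to reverse the mapping.
    """
    out = dict(record)
    for field in identifying_fields:
        if field in out:
            digest = hmac.new(SECRET_KEY, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:12]
    return out

reading = {"name": "A. Resident", "address": "12 Example Lane", "temp_c": 24.5}
safe = pseudonymize(reading)
# The sensor data survives intact; the direct identifiers become stable
# pseudonyms, so records about the same household can still be linked.
```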
<p style="text-align: justify; ">“There are a number of home automation systems similar to NEST, which is extensively used in Silicon Valley homes, that connect to various systems. For example, when you are approaching home, it will know when to switch on your heating system or AC based on the weather. It also has information on who stays in the house, in what room, and what time they sleep. And in the car, it gives a full real-time profile of the situation at home. It can be a threat if it is hacked. This is a very commonly discussed threat, and the question is how to introduce pseudonymity. When we use these identifiers, and when the connectivity happens, how do we ensure that the name and the user are not there? Pseudonymity can be introduced so that it becomes difficult for a hacker to decipher who this person is,” Srinivas added.</p>
<h2 style="text-align: justify; "><b>Ambient data collection</b></h2>
<p style="text-align: justify; ">With IoT, it has never been easier to capture ambient data. <b>Ambient data</b> <b>is information that lies in areas not generally accessible to the user.</b> An example of this is how users get traffic data from Internet companies. Kiran Jonnalagadda explained how this works:</p>
<blockquote style="text-align: justify; ">
<p>“When you look at traffic data on a street map, where is that data coming from? <b>It’s not coming from an app on the phone constantly transmitting data. It’s coming from the fact that cell phone towers record who is connecting to them, and if a cell phone tower is facing the road and has so many connections on it, you know that traffic is at a certain level in that area</b>. Now as a user of the map, you are talking to a company which produces this map, and it is not a telecom company. Someone who is using a phone is only dealing with a telecom company, so how does this data transfer happen, and how much user data is being passed on to the last-mile user who is actually holding the phone?”</p>
</blockquote>
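<p style="text-align: justify; ">The aggregation step Jonnalagadda describes can be sketched very simply: per-tower connection counts are reduced to a coarse traffic level before anything reaches the map user (the tower names and thresholds below are invented):</p>

```python
# Sketch of aggregating raw cell-tower connection counts into the
# non-identifying traffic levels a map user sees. Data is invented.
tower_connections = {
    "tower_mg_road": 480,
    "tower_ring_road": 90,
    "tower_airport_rd": 260,
}

def traffic_level(count, light=150, heavy=400):
    """Reduce a raw connection count to a coarse traffic level."""
    if count >= heavy:
        return "red"      # heavy traffic
    if count >= light:
        return "amber"    # moderate traffic
    return "green"        # light traffic

levels = {tower: traffic_level(n) for tower, n in tower_connections.items()}
# The map shows only red/amber/green; the per-phone records stay with
# whichever party performed this aggregation, which is exactly the question.
```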
<p style="text-align: justify; "><img class="size-full wp-image-176802 aligncenter" height="501" src="https://i0.wp.com/www.medianama.com/wp-content/uploads/Kiran-Namaprivacy-e1508324684657.jpg?resize=750%2C501&ssl=1" width="750" /></p>
<p style="text-align: justify; ">Jonnalagadda stressed the need for people to ask who is aggregating this ambient data.</p>
<p style="text-align: justify; ">“Now obviously, when you look at the map, you don’t get to see who is around you – that would be a clear privacy violation – you only get to see that traffic is at a certain level of density on the streets around you. But at what point does the aggregation of data happen, from an individually identifiable phone to just a red line or a green line indicating the traffic in an area? We also need to ask who is doing this aggregation. Is it happening at the telecom level? Is it happening at the map-provider level, and what kind of algorithms are required to decide that a particular phone on a cell phone network represents a moving vehicle or a pedestrian? Can a cell phone company do that, or does a map company do that? If you start digging and see at what point your data is being anonymized, who is responsible for anonymizing it, and whether that is the entity that is supposed to be doing it, we start realizing that it is a lot more complicated and a lot more pervasive than we thought it would be,” he said.</p>
<p style="text-align: justify; "><b>#NAMAprivacy Bangalore:</b></p>
<ul style="text-align: justify; ">
<li>Will Artificial Intelligence and Machine Learning kill privacy? [<a href="https://www.medianama.com/2017/10/223-namaprivacy-artificial-intelligence-privacy/">read</a>]</li>
<li>Regulating Artificial Intelligence algorithms [<a href="https://www.medianama.com/2017/10/223-namaprivacy-regulating-artificial-intelligence-algorithms/">read</a>]</li>
<li>Data standards for IoT and home automation systems [<a href="https://www.medianama.com/2017/10/223-namaprivacy-data-standards-for-iot/">read</a>]</li>
<li>The economics and business models of IoT and other issues [<a href="https://www.medianama.com/2017/10/223-namaprivacy-economics-and-business-models-of-iot/">read</a>]</li>
</ul>
<p style="text-align: justify; "><b>#NAMAprivacy Delhi:</b></p>
<ul style="text-align: justify; ">
<li>Blockchains and the role of differential privacy [<a href="https://www.medianama.com/2017/09/223-namaprivacy-blockchains-role-differential-privacy/">read</a>]</li>
<li>Setting up purpose limitation for data collected by companies [<a href="https://www.medianama.com/2017/09/223-namaprivacy-setting-purpose-limitation-data-collected-companies/">read</a>]</li>
<li>The role of app ecosystems and nature of permissions in data collection [<a href="https://www.medianama.com/2017/09/223-namaprivacy-role-app-ecosystems-nature-permissions-data-collection/">read</a>]</li>
<li>Rights-based approach vs rules-based approach to data collection [<a href="https://www.medianama.com/2017/09/223-namaprivacy-rights-based-approach-vs-rules-based-approach-data-collection/">read</a>]</li>
<li>Data colonisation and regulating cross border data flows [<a href="https://www.medianama.com/2017/09/223-namaprivacy-data-colonisation-and-regulating-cross-border-data-flows/">read</a>]</li>
<li>Challenges with consent; the Right to Privacy judgment [<a href="https://www.medianama.com/2017/09/223-consent-challenges-privacy-india-namaprivacy/">read</a>]</li>
<li>Consent and the need for a data protection regulator [<a href="https://www.medianama.com/2017/09/223-privacy-india-consent-data-protection-regulator-namaprivacy/">read</a>]</li>
<li>Making consent work in India [<a href="https://www.medianama.com/2017/09/223-privacy-india-consent-namaprivacy/">read</a>]</li>
</ul>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/medianama-october-18-2017-namaprivacy-data-standards-for-iot'>http://editors.cis-india.org/internet-governance/news/medianama-october-18-2017-namaprivacy-data-standards-for-iot</a>
</p>
Big Data for governance
http://editors.cis-india.org/internet-governance/news/telangana-today-november-8-2017-alekhya-hanumanthu-big-data-for-governance
<b>Recent times have witnessed an explosion of data as users leave a huge data footprint everywhere they go. Interestingly, this period has also seen a phenomenal increase in computing power coupled with a drop in storage costs.</b>
<p style="text-align: justify; ">The article by Alekhya Hanumanthu was published in <a class="external-link" href="https://telanganatoday.com/big-data-governance">Telangana Today</a> on November 4, 2017.</p>
<hr style="text-align: justify; " />
<p style="text-align: justify; ">India is now sitting on the data so generated and subjecting it to analytics for use in sectors such as insurance, education, healthcare and governance.</p>
<p style="text-align: justify; ">According to the Centre for Internet and Society (CIS), in 2015 the Narendra Modi government launched the Digital India Programme to ensure the availability of government services to citizens electronically by improving online infrastructure and Internet connectivity.</p>
<p style="text-align: justify; ">Amongst other things, e-Governance and e-Kranti intend to reform governance through technology and enable electronic delivery of services. Needless to say, this will involve large-scale digitisation, electronic collection of data from residents, and processing. The big data so created will help policymaking evolve into a data-backed, action-oriented exercise with accountability asserted where it is due.</p>
<h3 style="text-align: justify; ">Let’s take a look at some Big Data-based initiatives underway, according to Analytics India Magazine:</h3>
<p style="text-align: justify; "><b>Project Insight:</b> Undertaken by Indian tax agencies, Project Insight is a comprehensive analytics platform that encourages tax compliance while preventing non-compliance. Significantly, it will be used to detect fraud, support investigations and provide insights for policymaking. For instance, it will analyse a person’s social media activity to glean their spending and check whether it is commensurate with the tax they have paid during that year. Needless to say, this will also unearth sources of black money.</p>
<p style="text-align: justify; "><b>Economic Development Board in Andhra:</b> CORE, the CM Office Realtime Executive Dashboard, is an integrated dashboard established to monitor category-wise key performance indicators of various departments and schemes in real time. Users can check key performance indicators of various departments, schemes, initiatives, programmes, etc. With a panoply of service information ranging from women and child welfare to street-light monitoring, it has become an exemplary model of governance.</p>
<p style="text-align: justify; "><b>Geo-tagging of assets under the Mahatma Gandhi National Rural Employment Guarantee Act (MGNREGA):</b> Under the guidance of Narendra Modi, the Ministry of Rural Development started online monitoring of assets to check leakages. To achieve this, it tied up with ISRO and the National Informatics Centre to geo-tag MGNREGA assets. According to India Today, the assets created range from plantations, rural infrastructure and water harvesting structures to flood control measures such as check dams. To do this, a junior engineer takes a photo of an asset and uploads it, via a mobile app, to the Bhuvan web portal run by ISRO’s National Remote Sensing Centre. Once a photo is uploaded, the time and location are recorded automatically. Thus, the Government hopes to keep ironclad control of the resources thus deployed.</p>
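<p style="text-align: justify; ">The workflow described above reduces to a simple record structure: a field photo stamped with coordinates and an automatic timestamp at upload time. A minimal sketch (the field names are ours for illustration, not Bhuvan’s):</p>

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative sketch of a geo-tagged asset record of the kind the
# MGNREGA workflow produces. Field names are invented for illustration.
@dataclass
class AssetRecord:
    asset_type: str
    photo_file: str
    latitude: float
    longitude: float
    captured_at: str  # UTC timestamp, attached automatically at upload

def tag_asset(asset_type, photo_file, latitude, longitude):
    """Stamp an uploaded asset photo with location and the current time."""
    return AssetRecord(asset_type, photo_file, latitude, longitude,
                       datetime.now(timezone.utc).isoformat())

record = tag_asset("check dam", "dam_042.jpg", 17.385, 78.487)
```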
<p style="text-align: justify; "><b>CAG’s Centre for Data Management and Analytics:</b> According to the Comptroller and Auditor General of India, the CAG’s Centre for Data Management and Analytics (CDMA) will play a catalytic role in synthesising and integrating relevant data into the auditing process. According to an announcement by the National Informatics Centre (NIC), it aims to build capacity in the Indian Audit and Accounts Department in Big Data analytics to explore the data-rich environment at the Union and State levels. What’s more, this initiative puts the CAG of India amongst the international pioneers in institutionalising data analytics in government audit.</p>
<p style="text-align: justify; "><b>Task Force to spruce up employment data:</b> The data provided by the Labour Bureau is limited and not timely enough for policymakers to assess the need for job creation. To address this gap, the Government has set up a committee tasked with filling the employment data gap and ensuring the timely availability of reliable information on job creation. Thus the Government’s leadership has a direct view of where the employment gaps are, so that it can facilitate the creation of appropriate jobs.</p>
<h3 style="text-align: justify; ">What’s the big picture?</h3>
<p style="text-align: justify; ">Policymaking and governance in India have traditionally been rife with red tape, bureaucracy and corruption. A lack of accountability on the part of the government workforce not only affected the quantity and quality of work delivered but also invited corrupt practices and leakages. Big data is therefore a welcome change in direction.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/telangana-today-november-8-2017-alekhya-hanumanthu-big-data-for-governance'>http://editors.cis-india.org/internet-governance/news/telangana-today-november-8-2017-alekhya-hanumanthu-big-data-for-governance</a>
</p>
MediaNama - #NAMAprivacy: The Future of User Data (Delhi, Sep 6)
http://editors.cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6
<b>MediaNama is hosting a full day conference on "the future of user data in India", on the 6th of September 2017, which is particularly significant given the recent Supreme Court ruling on the fundamental right to privacy, and two government consultations: one at the TRAI, and another at MEITY. This discussion is supported by Facebook, Google, and Microsoft. Sumandro Chattapadhyay, Research Director, will participate as a speaker in the session titled "regulating storage, sharing and transfer of data."</b>
<p> </p>
<h4>Details</h4>
<p>Time: September 6th 2017, 9 am to 4:30 pm</p>
<p>Venue: Gulmohar Hall, India Habitat Centre, Lodhi Road (please enter from Gate #3)</p>
<p>Agenda: <a href="https://www.medianama.com/2017/08/223-agenda-namaprivacy-future-of-user-data/">https://www.medianama.com/2017/08/223-agenda-namaprivacy-future-of-user-data/</a></p>
<h4>Announced Speakers</h4>
<ul><li>Chinmayi Arun, Centre for Communication Governance at NLU Delhi</li>
<li>Malavika Raghavan, IFMR Finance Foundation</li>
<li>Renuka Sane, NIPFP</li>
<li>Smitha Krishna Prasad, Centre for Communication Governance at NLU Delhi</li>
<li>Ananth Padmanabhan, Carnegie India</li>
<li>Avinash Ramachandra, Amazon</li>
<li>Hitesh Oberoi, Naukri</li>
<li>Jochai Ben-Avie, Mozilla</li>
<li>Mrinal Sinha, Mobikwik</li>
<li>Murari Sreedharan, Bankbazaar</li>
<li>Sumandro Chattapadhyay, Centre for Internet and Society</li></ul>
<h4>Facilitators</h4>
<ul><li>Saikat Datta, Asia Times Online</li>
<li>Shashidar KJ, MediaNama</li>
<li>Nikhil Pahwa, MediaNama</li></ul>
<h4>Attendees</h4>
<p>We have confirmed 140+ attendees from: Adobe, Amber Health, Amazon, APCO Worldwide, Bank Bazaar, Bloomberg-Quint, Blume Ventures, Broadband India Forum, Business Standard, BuzzFeed News, CCOAI, CEIP, Change Alliance, Chase India, CIS, CNN News18, DEF, Deloitte, DNA, DSCI, E2E Networks, British High Commission, Eurus Network Services, FICCI, Firefly Networks, Flipkart, Forrester Research, Fortumo, DoT, MEITY, IAMAI, IBM, ICRIER, IFMR Finance Foundation, IIMC, Indian Law Institute, Indic Project, Info Edge, ISPAI, IT for Change, ITU-APT, Jamia Millia Islamia, Jindal Global Law School, Mimir Technologies, Mozilla, Newslaundry, NIPFP, Nishith Desai Associates, NIXI, NLU-Delhi, ORF, Paytm, PLR Chambers, PRS Legislative Research, Publicis Groupe, Quartz India, Reliance Jio, Reuters, Saikrishna & Associates, Scroll.in, SFLC.in, Spectranet, The Economic Times, The Indian Express, The Times of India, The Wire, Times Internet, Twitter, and more.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6'>http://editors.cis-india.org/internet-governance/news/medianama-namaprivacy-the-future-of-user-data-delhi-sep-6</a>
</p>
CISxScholars Delhi - Harsh Gupta - FAT ML for Lawyers and Lawmakers (June 29, 5:30 pm)
http://editors.cis-india.org/raw/cisxscholars-harsh-gupta-machine-learning-for-lawyers-and-lawmakers-20170629
<b>We are proud to announce that Harsh Gupta will discuss "FAT ML (Fairness, Accountability, and Transparency in Machine Learning) for Lawyers and Lawmakers" at the CIS office in Delhi on Thursday, June 29, at 5:30 pm. This will be a two-and-a-half-hour session: a 45-minute talk, a 15-minute break, another 45-minute talk, and then a discussion session. Please RSVP to raw@cis-india.org if you are joining us.</b>
<p> </p>
<p><em>CISxScholars are informal events organised by CIS for presentation, discussion, and exchange of academic research and policy analysis.</em></p>
<hr />
<h3><strong>FAT ML (Fairness, Accountability, and Transparency in Machine Learning) for Lawyers and Lawmakers</strong></h3>
<p>From tagging people in photos to determining the risk of loan defaults, the use of data-based tools is affecting more and more areas of our lives. In some areas such tools have been applied very successfully; in others they have been found not only to reflect the existing bias and discrimination in today's society but also to exaggerate it.</p>
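<p>One standard way to make the bias concern concrete is a disparate-impact check: comparing a model's favourable-outcome rates across groups. A minimal sketch with invented numbers (this is one common FAT ML metric, not necessarily the method covered in the talk):</p>

```python
# Illustrative disparate-impact check on a model's loan-approval decisions.
# All counts are invented for illustration.
decisions = {
    "group_a": {"approved": 80, "total": 100},  # 80% approval rate
    "group_b": {"approved": 40, "total": 100},  # 40% approval rate
}

def disparate_impact(dec, protected="group_b", reference="group_a"):
    """Ratio of approval rates; the 'four-fifths rule' flags values below 0.8."""
    rate = lambda g: dec[g]["approved"] / dec[g]["total"]
    return rate(protected) / rate(reference)

ratio = disparate_impact(decisions)  # 0.4 / 0.8 = 0.5
# A ratio of 0.5 falls well below the 0.8 rule of thumb: a signal that the
# model may be reproducing, or amplifying, bias present in its training data.
```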
<h3><strong>Harsh Gupta</strong></h3>
<p>Harsh Gupta is a recent graduate of IIT Kharagpur with a B.Sc and M.Sc in Mathematics and Computing, and will be joining JPMorgan Chase as a data scientist. His master's thesis was on "Discrimination Aware Machine Learning". He was also an intern at the Centre for Internet and Society during the summer of 2016.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/cisxscholars-harsh-gupta-machine-learning-for-lawyers-and-lawmakers-20170629'>http://editors.cis-india.org/raw/cisxscholars-harsh-gupta-machine-learning-for-lawyers-and-lawmakers-20170629</a>
</p>
Privacy in the Age of Big Data
http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data
<b>Personal data is freely accessible, shared and even sold, and those to whom this information belongs have little control over its flow.</b>
<p style="text-align: justify; ">The article was published in the <a class="external-link" href="http://www.asianage.com/india/all-india/100417/privacy-in-the-age-of-big-data.html">Asian Age</a> on April 10, 2017.</p>
<hr style="text-align: justify; " />
<p style="text-align: justify; ">In 2011, it was estimated that the quantity of data produced globally surpassed 1.8 zettabytes. By 2013, it had increased to 4 zettabytes. This is a result of digital services that involve constant data trails left behind by human activity. This expansion in the volume, velocity, and variety of data available, together with the development of innovative forms of statistical analytics on the data collected, is generally referred to as “Big Data”. Despite significant (though largely unrealised) promises about Big Data, which range from improved decision-making, increased efficiency and productivity to greater personalisation of services, concerns remain about the impact of such datafication of all human activity on an individual’s privacy. Privacy has evolved into a sweeping concept, including within its scope matters pertaining to control over one’s body, physical space in one’s home, protection from surveillance, and from search and seizure, protection of one’s reputation as well as one’s thoughts. This generalised and vague conception of privacy not only comes with great judicial discretion, it also thwarts a fair understanding of the subject. Robert Post called privacy a concept so complex and “entangled in competing and contradictory dimensions, so engorged with various and distinct meanings”, that he sometimes “despairs whether it can be usefully addressed at all”.</p>
<p style="text-align: justify; ">This also leaves the idea of privacy vulnerable to considerable suspicion and ridicule. However, while there is a lack of clarity over the exact contours of what constitutes privacy, there is general agreement over its fundamental importance to our ability to lead whole lives. In order to understand the impact of datafied societies on privacy, it is important to first delve into the manner in which we exercise our privacy. The prevalent ideas of privacy and data management can be traced to the Fair Information Practice Principles (FIPPs). These principles are the forerunners of most privacy regimes internationally, such as the OECD Privacy Guidelines, the APEC Privacy Framework, and the nine National Privacy Principles articulated by the Justice A.P. Shah Committee Report. All of these frameworks have rights to notice, consent and correction, and limits on how the data may be used, as their fundamental principles. This makes the data subject the decision-making agent, deciding where and when her/his personal data may be used, by whom, and in what way. The individual needs to be notified and his consent obtained before his personal data is used. If the scope of usage extends beyond what he has agreed to, his consent will be required again for the increased scope.</p>
<p style="text-align: justify; ">In theory, this system sounds fair. Privacy is a value tied to the personal liberty and dignity of an individual. It is only appropriate that the individual should be the one holding the reins and taking the large decisions about the use of his personal data. This makes the individual empowered and allows him to weigh his own interests in exercising his consent. The allure of this paradigm is that in one elegant stroke, it seeks to ensure that consent is informed and free and also to implement an acceptable trade-off between privacy and competing concerns. This approach worked well when the number of data collectors were less and the uses of data was narrower and more defined. Today’s infinitely complex and labyrinthine data ecosystem is beyond the comprehension of most ordinary users. Despite a growing willingness to share information online, most people have no understanding of what happens to their data.</p>
<p style="text-align: justify; ">The quantity of data being generated is expanding at an exponential rate. From smartphones and televisions, trains and airplanes, sensor-equipped buildings and even the infrastructures of our cities, data now streams constantly from almost every sector and function of daily life, “creating countless new digital puddles, lakes, tributaries and oceans of information”. The inadequacy of the regulatory approaches and the absence of a comprehensive data protection regulation is exacerbated by the emergence of data-driven business models in the private sector and the adoption of data-driven governance approach by the government. The Aadhaar project, with over a billion registrants, is intended to act as a platform for a number of digital services, all of which produce enormous troves of data. The original press release by the Central Government reporting the approval by the Cabinet of Ministers of the Digital India programme, speaks of “cradle to grave” digital identity as one of its vision areas.</p>
<p style="text-align: justify; ">While the very idea of the government wanting to track its citizens’ lives from cradle to grave is creepy enough in itself, let us examine for a minute what this form of datafied surveillance will entail. A host of schemes under Digital India shall collect and store information through the life cycle of an individual. The result, as we can see, is building databases on individuals, which when combined, will provide a 360 degree view into the lives of individuals. Alongside the emergence of India Stack, a set of APIs built on top of the Aadhaar, conceptualised by iSPIRT, a consortium of select IT companies from India, to be deployed and managed by several agencies, including the National Payments Corporation of India, promises to provide a platform over which different private players can build their applications.</p>
<p style="text-align: justify; ">The sum of these interconnected parts will lead to a complete loss of anonymity, greater surveillance and impact free speech and individual choice. The move towards a cashless economy — with sharp nudges from the government — could lead to lack of financial agencies in case of technological failures as has been the case in experiments with digital payments in Africa. Lack of regulation in emerging data driven sectors such as Fintech can enable predatory practices where right to remotely deny financial services can be granted to private sector companies. An architecture such as IndiaStack enables datafication of financial transactions in a way that enables linked and structured data that allows continued use of the transaction data collected. It is important to recognise that at the stage of giving consent, there are too many unknowns for us to make informed decisions about the future uses of our personal data. Despite blanket approvals allowing any kind of use granted contractually through terms of use and privacy policies, there should be legal obligations overriding this consent for certain kinds of uses that may require renewed consent.</p>
<p style="text-align: justify; "><b>Biometrics-based identification in UK: </b>In 2005, researchers from London School of Economics and Political Science came out with a detailed report on the UK Identity Cards Bill (‘UK Bill’) — the proposed legislation for a national identification system based on biometrics. The project also envisaged a centralised database (like India) that would store personal information along with the entire transaction history of every individual. The report pointed strongly against the centralising storage of information and suggested other alternatives such as a system based on smartcards (where biometrics are stored on the card itself) or offline biometric-reader terminals.</p>
<p style="text-align: justify; ">As per the report, the alternatives would also have been cheaper as neither required real-time online connectivity. In India, online authentication is a far greater challenge. According to Network Readiness Index, 2016, India ranks 91, whereas UK is placed eight. Poor Internet connectivity can raise a lot of problems in the future including paralysis of transactions. The UK identification project was subsequently discarded as a result of the privacy and cost considerations raised in this report.</p>
<h3 style="text-align: justify; ">Aadhaar: Privacy concerns</h3>
<ol style="text-align: justify; ">
<li>Once the data is collected through National Information Utilities, it will be privatised and controlled by private utilities.</li>
<li>Once an individual’s data is entered in the system, it cannot be deleted. That individual will have no control over it.</li>
<li>Aadhaar data (demographic details along with photographs) is shared with private entities, including telecom companies, under the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016, with the consent of the Aadhaar number holder, to fulfil their e-KYC requirements. The data is shared in encrypted form through a secure channel.</li>
<li>The Aadhaar Enabled Payment System (AEPS) has 119 banks live on it.</li>
<li>More than 33.87 crore transactions have taken place through AEPS, up from only 46 lakh in May 2014.</li>
<li>As on 30-9-2016, 78 government schemes were linked to Aadhaar.</li>
<li>The Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016, provides that no core-biometric information (fingerprints, iris scan) shall be shared with anyone for any reason whatsoever (Sec 29) and that the biometric information shall not be used for any purpose other than generation of Aadhaar and authentication.</li>
<li>Access to the data repository of the UIDAI, called the Central Identities Data Repository (CIDR), is provided to third parties or private companies.</li>
</ol>
<p style="text-align: justify; "><b>Central Monitoring System</b> (CMS) is already live in Delhi, New Delhi and Mumbai. Union minister Ravi Shankar Prasad revealed this in one of his replies in the Lok Sabha last year. CMS has been set up to automate the process of Lawful Interception & Monitoring of telecommunications.</p>
<p style="text-align: justify; "><b>Central Monitoring System</b> (CMS) is already live in Delhi, New Delhi and Mumbai. Union minister Ravi Shankar Prasad revealed this in one of his replies in the Lok Sabha last year. CMS has been set up to automate the process of Lawful Interception & Monitoring of telecommunications.</p>
<p style="text-align: justify; "><b>Lawful Intercept </b>and Monitoring (LIM) systems are used by the Indian Government to intercept records of voice, SMSes, GPRS data, details of a subscriber’s application and recharge history and call detail record (CDR) and monitor Internet traffic, emails, web-browsing, Skype and any other Internet activity of Indian users.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data'>http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data</a>
</p>
No publisher · amber · Internet Governance · Aadhaar · Big Data · Privacy · 2017-04-11T14:43:59Z · Blog Entry
Exploring Big Data for Development: An Electricity Sector Case Study from India
http://editors.cis-india.org/raw/exploring-big-data-for-development-an-electricity-sector-case-study-from-india
<b>This working paper by Ritam Sengupta, Dr. Richard Heeks, Sumandro Chattapadhyay, and Dr. Christopher Foster draws from the field study undertaken by Ritam Sengupta, and is published by the Global Development Institute, University of Manchester. The field study was commissioned by the CIS, with support from the University of Manchester and the University of Sheffield.</b>
<p> </p>
<h4>Download the working paper: <a href="http://hummedia.manchester.ac.uk/institutes/gdi/publications/workingpapers/di/di_wp66.pdf" target="_blank">PDF</a></h4>
<hr />
<h3><strong>Abstract</strong></h3>
<p>This paper presents exploratory research into “data-intensive development” that seeks to inductively identify issues and conceptual frameworks of relevance to big data in developing countries. It presents a case study of big data innovations in “Stelcorp”, a state electricity corporation in India. In an attempt to address losses in electricity distribution, Stelcorp has introduced new digital meters throughout the distribution network to capture big data, and organisation-wide information systems that store, process, and disseminate big data.</p>
<p>Emergent issues are identified across three domains: implementation, value and outcome. Implementation of big data has worked relatively well but technical and human challenges remain. The advent of big data has enabled some – albeit constrained – value addition in all areas of organisational operation: customer billing, fault and loss detection, performance measurement, and planning. Yet tens of millions of US dollars of investment in big data have brought no aggregate improvement in distribution losses or revenue collection. This can be explained by the wider outcome, with big data faltering in the face of external politics; in this case the electoral politics of electrification. Alongside this reproduction of power, the paper also reflects on the way in which big data has enabled shifts in the locus of power: from public to private sector; from labour to management; and from lower to higher levels of management.</p>
<p>A number of conceptual frameworks emerge as having analytical power in studying big data and global development. The information value chain model helps track both implementation and value-creation of big data projects. The design-reality gap model can be used to analyse the nature and extent of barriers facing big data projects in developing countries. And models of power – resource dependency, epistemic models, and wider frameworks – are all shown as helping understand the politics of big data.</p>
<hr />
<em>Cross-posted from <a href="http://www.gdi.manchester.ac.uk/research/publications/other-working-papers/di/di-wp66/">University of Manchester</a>.</em>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/exploring-big-data-for-development-an-electricity-sector-case-study-from-india'>http://editors.cis-india.org/raw/exploring-big-data-for-development-an-electricity-sector-case-study-from-india</a>
</p>
No publisher · sumandro · Big Data · Data Systems · Researchers at Work · Research · Featured · Publications · Big Data for Development · 2019-03-16T04:33:15Z · Blog Entry
The Fintech Disruption - Innovation, Regulation, and Transformation
http://editors.cis-india.org/internet-governance/news/the-fintech-disruption-innovation-regulation-and-transformation
<b>Sumandro Chattapadhyay attended an event organized by Carnegie India on March 28, 2017. The initiative was premised on the idea that inclusive and sustainable regulations require constant interaction between policymakers and industry.</b>
<p style="text-align: justify; ">Select senior level policymakers, leaders from the banking industry and dynamic start-up founders and innovators gathered for the meet-up. The intention is to follow up on the discussions and debates from the round-table and come out with a detailed report on Fintech Regulations based on the research and conversations with start-ups and other valuable stakeholders.</p>
<p><a class="external-link" href="http://cis-india.org/internet-governance/files/fintech-conference-agenda">See the conference agenda</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/the-fintech-disruption-innovation-regulation-and-transformation'>http://editors.cis-india.org/internet-governance/news/the-fintech-disruption-innovation-regulation-and-transformation</a>
</p>
No publisher · praskrishna · Internet Governance · Big Data · 2017-03-29T02:10:49Z · News Item
Benefits, Harms, Rights and Regulation: A Survey of Literature on Big Data
http://editors.cis-india.org/internet-governance/blog/benefits-harms-rights-and-regulation-survey-of-literature-on-big-data
<b>This survey draws upon a range of literature including news articles, academic articles, and presentations and seeks to disaggregate the potential benefits and harms of big data, organising them into several broad categories that reflect the existing scholarly literature. The survey also recognises the non-technical big data regulatory options which are in place as well as those which have been proposed by various governments, civil society groups and academics.</b>
<p>The survey was edited by Sunil Abraham, Elonnai Hickok and Leilah Elmokadem</p>
<hr />
<h3>Introduction</h3>
<p style="text-align: justify; ">In 2011, it was estimated that the quantity of data produced globally surpassed 1.8 zettabyte.By 2013 it had increased to 4 zettabytes. With the nascent development of the so-called ‘Internet of Things’ gathering pace, these trends are likely to continue. This expansion in the volume, velocity, and variety of data available, together with the development of innovative forms of statistical analytics, is generally referred to as “Big Data”; though there is no single agreed upon definition of the term. Although still in its initial stages, big data promises to provide new insights and solutions across a wide range of sectors, many of which would have been unimaginable even a decade ago.</p>
<p style="text-align: justify; ">Despite enormous optimism about the scope and variety of big data’s potential applications, many remain concerned about its widespread adoption, with some scholars suggesting it could generate as many harms as benefits. Most notably are the concerns about the inevitable threats to privacy associated with the generation, collection and use of large quantities of data. Concerns have also been raised regarding, for example, the lack of transparency around the design of algorithms used to process the data, over-reliance on big data analytics as opposed to traditional forms of analysis and the creation of new digital divides. The existing literature on big data is vast. However, many of the benefits and harms identified by researchers tend to focus on sector specific applications of Big Data analytics, such as predictive policing, or targeted marketing. Whilst these examples can be useful in demonstrating the diversity of big data’s possible applications, they do not offer a holistic perspective of the broader impacts of Big Data.</p>
<p style="text-align: justify; "><b><a class="external-link" href="http://cis-india.org/internet-governance/files/benefits-harms-rights-and-regulation-a-survey-of-literature-on-big-data">Click to read the full survey here</a><br /></b></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/benefits-harms-rights-and-regulation-survey-of-literature-on-big-data'>http://editors.cis-india.org/internet-governance/blog/benefits-harms-rights-and-regulation-survey-of-literature-on-big-data</a>
</p>
No publisherAmber Sinha, Vanya Rakesh, Vidushi Marda and Geethanjali JujjavarapuInternet GovernanceBig Data2017-03-23T02:17:56ZBlog EntryBig Data in Governance in India: Case Studies
http://editors.cis-india.org/internet-governance/blog/big-data-in-governance-in-india-case-studies
<b>This research seeks to understand the most effective way of researching Big Data in the Global South. Towards this goal, the research planned for the development of a Global South big data Research Network that identifies the potential opportunities and harms of big data in the Global South and possible policy solutions and interventions. </b>
<p style="text-align: justify; "><i>This work has been made possible by a grant from the John D. and Catherine T. MacArthur Foundation. The conclusions, opinions, or points of view expressed in the report are those of the authors and do not necessarily represent the views of the John D. and Catherine T. MacArthur Foundation</i>.</p>
<hr style="text-align: justify; " />
<h2 style="text-align: justify; ">Introduction</h2>
<p style="text-align: justify; ">The research was for a duration of 12 months and in form of an exploratory study which sought to understand the potential opportunity and harm of big data as well as to identify best practices and relevant policy recommendations. Each case study has been chosen based on the use of big data in the area and the opportunity that is present for policy recommendation and reform. Each case study will seek to answer a similar set of questions to allow for analysis across case studies.</p>
<h2 style="text-align: justify; ">What is Big Data</h2>
<p style="text-align: justify; ">Big data has been ascribed a number of definitions and characteristics. Any study of big data must begin with first conceptualizing defining what big data is. Over the past few years, this term has been become a buzzword, used to refer to any number of characteristics of a dataset ranging from size to rate of accumulation to the technology in use.<a href="#fn1" name="fr1">[1]</a></p>
<p style="text-align: justify; ">Many commentators have critiqued the term big data as a misnomer and misleading in its emphasis on size. We have done a survey of various definitions and understandings of big data and we document the significant ones below.</p>
<h3 style="text-align: justify; ">Computational Challenges</h3>
<p style="text-align: justify; ">The condition of data sets being large and taxing the capacities of main memory, local disk, and remote disk have been seen as problems that big data solves. While this understanding of big data focusses only on one of its features—size, other characteristics posing a computational challenge to existing technologies have also been examined. The (US) National Institute of Science and Technology has defined big data as data which “exceed(s) the capacity or capability of current or conventional methods and systems.” <a href="#fn2" name="fr2">[2]</a></p>
<p>These challenges are not merely a function of its size. Thomas Davenport provides a cohesive definition of big data in this context. According to him, big data is “data that is too big to fit on a single server, too unstructured to fit into a row-and-column database, or too continuously flowing to fit into a static data warehouse.” <a href="#fn3" name="fr3">[3]</a></p>
<h3 style="text-align: justify; ">Data Characteristics</h3>
<p style="text-align: justify; ">The most popular definition of big data was put forth in a report by Meta (now Gartner) in 2001, which looks at it in terms of the three 3V’s—volume<a href="#fn4" name="fr4">[4]</a>, velocity and variety. It is high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation.<a href="#fn5" name="fr5">[5] </a></p>
<p style="text-align: justify; ">Aside from volume, velocity and variety, other defining characteristics of big data articulated by different commentators are— exhaustiveness,<a href="#fn6" name="fr6">[6]</a> granularity (fine grained and uniquely indexical),<a href="#fn7" name="fr7">[7] </a>scalability,<a href="#fn8" name="fr8">[8] </a>veracity,<a href="#fn9" name="fr9">[9] </a>value<a href="#fn10" name="fr10">[10] </a>and variability.<a href="#fn11" name="fr11">[11] </a>It is highly unlikely that any data-sets satisfy all of the above characteristics. Therefore, it is important to determine what permutation and combination of these gamut of attributes lead us to classifying something as big data.</p>
<h3 style="text-align: justify; ">Qualitative Attributes</h3>
<p>Prof. Rob Kitchin has argued that big data is qualitatively different from traditional, small data. Small data have relied on sampling techniques for collection, have been limited in scope, temporality and size, and are “inflexible in their administration and generation.”<a href="#fn12" name="fr12">[12] </a></p>
<p style="text-align: justify; ">In this respect there are two qualitative attributes of big data which distinguish them from traditional data. First, the ability of big data technologies to accommodate unstructured and diverse datasets which hitherto were of no use to data processors is a defining feature. This allows the inclusion of many new forms of data from new and data heavy sources such as social media and digital footprints. The second attribute is the relationality of big data.<a href="#fn13" name="fr13">[13] </a></p>
<p style="text-align: justify; ">This relies on the presence of common fields across datasets which allow for conjoining of different databases. This attribute is usually a feature of not the size but the complexity of data enabling high degree of permutations and interactions within and across data sets.</p>
<h3 style="text-align: justify; ">Patterns and Inferences</h3>
<p style="text-align: justify; ">Instead of focussing on the ontological attributes or computational challenges of big data, Kenneth Cukier and Viktor Mayer Schöenberger define big data in terms of what it can achieve.<a href="#fn14" name="fr14">[14] </a></p>
<p style="text-align: justify; ">They defined big data as the ability to harness information in novel ways to produce useful insights or goods and services of significant value. Building on this definition, Rohan Samarajiva has categorised big data into non-behavioral big data and behavioral big data. The latter leads to insights about human behavior.<a href="#fn15" name="fr15">[15] </a></p>
<p style="text-align: justify; ">Samarajiva believes that transaction-generated data (commercial as well as non-commercial) in a networked infrastructure is what constitutes behavioral big data. Scope of Research The initial scope arrived at for this case-study on role of big data in governance in India focussed on the UID Project, the Digital India Programme and the Smart Cities Mission. Digital India is a programme launched by the Government of India to ensure that Government services are made available to citizens electronically by improving online infrastructure and by increasing Internet connectivity or by making the country digitally empowered in the field of technology.<a href="#fn16" name="fr16">[16] </a></p>
<p>The Programme has nine components, two of which focus on e-governance schemes. <b><a class="external-link" href="http://cis-india.org/internet-governance/files/big-data-compilation.pdf">Read More</a> </b>[PDF, 1948 Kb]</p>
<hr />
<p>[<a href="#fr1" name="fn1">1</a>]. Thomas Davenport, Big Data at Work: Dispelling the Myths, Uncovering the opportunities, Harvard Business Review Press, Boston, 2014.</p>
<p style="text-align: justify; ">[<a href="#fr2" name="fn2">2</a>]. MIT Technology Review, The Big Data Conundrum: How to Define It?, available at https://www. technologyreview.com/s/519851/the-big-data-conundrum-how-to-define-it/</p>
<p style="text-align: justify; ">[<a href="#fr3" name="fn3">3</a>]. Supra note 1.</p>
<p style="text-align: justify; ">[<a href="#fr4" name="fn4">4</a>]. What constitutes as high volume remains an unresolved matter. Intel defined Big Data volumes are emerging in organizations generating a median of 300 terabytes of data a week.</p>
<p style="text-align: justify; ">[<a href="#fr5" name="fn5">5</a>]. http://www.gartner.com/it-glossary/big-data/</p>
<p style="text-align: justify; ">[<a href="#fr6" name="fn6">6</a>]. Viktor Mayer Schöenberger and Kenneth Cukier, Big Data: A Revolution that will transform how we live, work and think” John Murray, London, 2013.</p>
<p style="text-align: justify; ">[<a href="#fr7" name="fn7">7</a>]. Rob Kitchin, The Data Revolution: Big Data, Open Data, Data Infrastructures and their consequences, Sage, London, 2014.</p>
<p style="text-align: justify; ">[<a href="#fr8" name="fn8">8</a>]. Nathan Marz and James Warren, Big Data: Principles and best practices of scalable realtime data systems, Manning Publication, New York, 2015.</p>
<p style="text-align: justify; ">[<a href="#fr9" name="fn9">9</a>]. Bernard Marr, Big Data: the 5 Vs everyone should know, available at https://www.linkedin. com/pulse/20140306073407-64875646-big-data-the-5-vs-everyone-must-know.</p>
<p style="text-align: justify; ">[<a href="#fr10" name="fn10">10</a>]. Id.</p>
<p style="text-align: justify; ">[<a href="#fr11" name="fn11">11</a>]. Eileen McNulty, Understanding Big Data: the 7 Vs, available at http://dataconomy.com/sevenvs-big-data/.</p>
<p style="text-align: justify; ">[<a href="#fr12" name="fn12">12</a>]. Supra Note 7.</p>
<p style="text-align: justify; ">[<a href="#fr13" name="fn13">13</a>]. Danah Boyd and Kate Crawford, Critical questions for big data. Information, Communication and Society 15(5): 662–679, available at https://www.researchgate.net/publication/281748849_Critical_questions_for_big_data_Provocations_for_a_cultural_technological_and_scholarly_ phenomenon</p>
<p style="text-align: justify; ">[<a href="#fr14" name="fn14">14</a>]. Supra Note 6.</p>
<p style="text-align: justify; ">[<a href="#fr15" name="fn15">15</a>]. Rohan Samarajiva, What is Big Data, available at http://lirneasia.net/2015/11/what-is-bigdata/.</p>
<p style="text-align: justify; ">[<a href="#fr16" name="fn16">16</a>]. http://www.digitalindia.gov.in/content/about-programme</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/big-data-in-governance-in-india-case-studies'>http://editors.cis-india.org/internet-governance/blog/big-data-in-governance-in-india-case-studies</a>
</p>
No publisher · Amber Sinha, Vanya Rakesh and Vidushi Marda; edited by Elonnai Hickok, Sumandro Chattapadhyay and Sunil Abraham · Internet Governance · Big Data · 2017-02-26T16:24:11Z · Blog Entry
Vidhi Doshi - Fingerprint Payments Prompt Privacy Fears in India (The Guardian)
http://editors.cis-india.org/internet-governance/news/vidhi-doshi-fingerprint-payments-prompt-privacy-fears-in-india-the-guardian
<b>This article by Vidhi Doshi on the use of Aadhaar-based payments by private companies in India was published by The Guardian on February 09, 2017. Sumandro Chattapadhyay is quoted in the article.</b>
<p>Originally published by <a href="https://www.theguardian.com/sustainable-business/2017/feb/09/fingerprint-payments-privacy-fears-india-banknotes">The Guardian</a>.</p>
<hr />
<p style="text-align: justify;">For two years, Indian officials have been trawling the country, from city slums to unelectrified villages, zapping eyeballs, scanning fingerprints and taking photographs.</p>
<p style="text-align: justify;">Last month, Indian shoppers started to see the results. With the launch of a government-backed fingerprint payment system, tied to India’s growing biometric data bank, registered citizens can – in theory at least – now pay for things with the touch of a finger.</p>
<p style="text-align: justify;">India’s extraordinary biometric database, named Aadhaar after a Hindi word for ‘foundation’, is the biggest of its kind in the world. It was initially sold to the public as a welfare delivery mechanism that would ensure the country’s 1.25bn citizens were each receiving the right quantity of subsidised rice or cooking fuel, while weeding out fraudsters.</p>
<p>But now this pool of more than a billion people’s biometric data is being used by banks, credit checking firms and other private companies to identify customers, raising questions about privacy and security.</p>
<p style="text-align: justify;">As one of his flagship policies, prime minister Narendra Modi pledged to create a “digital India” in which the country’s cash-centric economy would switch to credit and debit cards, squeezing the parallel economy of untaxed cash transactions and giving more citizens access to digital financial services.</p>
<p style="text-align: justify;">In a surprise television announcement last November, Modi announced the demonetisation of 500 and 1,000 rupee notes (around £6 and £12), wiping out 85% of the country’s circulating currency overnight.</p>
<p style="text-align: justify;">Two days later, when the banks reopened, long queues snaked around almost every branch, with millions lining up to open bank accounts for the first time. Many used their 12-digit Aadhaar number, linked to their biometric profile, to sign up. Within three weeks, 3m bank accounts had been opened using fingerprint verification, according to estimates.</p>
<p style="text-align: justify;">The moment marked a radical change for India’s banking system, under which applicants were traditionally required to file photocopies of passports or voter IDs. Banks could take weeks, sometimes months, to verify them. Now applicants’ encrypted biometric data can be sent to the Unique Identification Authority of India (UIDAI), a government agency, to be matched against their Aadhaar data, re-encrypted and sent back to the bank.</p>
<p style="text-align: justify;">Despite technical teething problems, the system is designed to allow very fast authorisation. “All this happens in a matter or two or three seconds,” explains Ajay Bhushan Pandey, UIDAI’s director general.</p>
<p style="text-align: justify;">For Pandey, the benefits are clear: paper documents are easy to forge and hard to verify, especially in India where until recently thousands of people still used handwritten passports. Not so biometric data.</p>
<h4>Privacy fears</h4>
<p style="text-align: justify;">Pandey emphasises that private banks and companies aren’t able to access the entire Aadhaar database, only to use the government interface, which allows them to verify identities.</p>
<p style="text-align: justify;">Nonetheless, many Indians are worried about the privacy implications. Sumandro Chattapadhyay, a director at the Centre for Internet and Society thinktank, is one of them.</p>
<p style="text-align: justify;">For starters, says Chattapadhyay, the law governing use of the biometric database, fast-tracked through parliament last year, is flimsy when it comes to the private sector. Since India lacks a general privacy or data protection law, this leaves corporate use of Aadhaar services effectively unregulated, he says.</p>
<p style="text-align: justify;">This is particularly worrying, says Chattapadhyay, because of the data-sharing possibilities opened up by Aadhaar. It makes it easier for companies not only to share information on individuals’ consumption and mobility habits, but also to link this data up with public records like the electoral register, he says. “Both lead to significant threats to privacy of individuals.”</p>
<p style="text-align: justify;">Chattapadhyay’s fear is that private companies could eventually gain access to government-held personal data, such as income or medical records, while the government could use company data like phone records to target specific individuals in political campaigns.</p>
<p style="text-align: justify;">Already companies are linking Aadhaar numbers with collected metadata. Credit-checking startup CreditVidya, for example, identifies clients using their biometric ID in combination with their internet browsing history and other data, to assign credit scores for users who have no record of loan repayments. Banks then store this processed metadata, for example whether or not someone’s Facebook name is consistent with the name on their bank account.</p>
<p style="text-align: justify;">Its founder Abhishek Agarwal admits there are risks for users: “[I]f someone managed to hack the bank’s security system, as well as the Aadhaar database, they could potentially be able to link your Facebook or LinkedIn data with your biometric information.” But he says this would be hard to do.</p>
<p style="text-align: justify;">Pandey insists the companies are carefully vetted before they can use Aadhaar authentication. But, like Agarwal, he acknowledges the system can never be 100% secure: ““I wouldn’t say it is impossible to break the system, but it is very, very difficult.”</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/vidhi-doshi-fingerprint-payments-prompt-privacy-fears-in-india-the-guardian'>http://editors.cis-india.org/internet-governance/news/vidhi-doshi-fingerprint-payments-prompt-privacy-fears-in-india-the-guardian</a>
</p>
Vidhi Doshi · 2017-02-13
Seminar on Understanding Financial Technology, Cashless India, and Forced Digitalisation (Delhi, January 24)
http://editors.cis-india.org/internet-governance/news/seminar-on-understanding-financial-technology-cashless-india-and-forced-digitalisation-delhi-jan-24-2017
<b>The Centre for Financial Accountability is organising a seminar on "Understanding Financial Technology, Cashless India, and Forced Digitalisation" on Tuesday, January 24, at YWCA, Ashoka Road, New Delhi. Sumandro Chattapadhyay will participate in the seminar and speak on the emerging architecture of FinTech in India, as being developed and deployed by UIDAI and NPCI.</b>
<p> </p>
<p><em>Cross-posted from <a href="https://letstalkfinancialaccountability.wordpress.com/2017/01/20/understanding-financial-technology-cashless-india-forced-digitalisation/">Centre for Financial Accountability</a>.</em></p>
<hr />
<h2>Programme Schedule</h2>
<h4>09:30 – Registration</h4>
<h4>10:00 – Introduction to the Seminar &amp; Setting the Context</h4>
<p>Madhuresh Kumar, National Alliance of People’s Movements</p>
<h4>10:15–11:30 – Session 1: Understanding the Political Context of FinTech</h4>
<p>B P Mathur, Former Dy CAG</p>
<p>Prabir Purkayastha, Free Software Movement of India and Knowledge Commons</p>
<p>C P Chandrasekhar, Centre for Economic Studies and Planning, JNU</p>
<h4>11:30–11:45 – Tea / Coffee Break</h4>
<h4>11:45–13:15 – Session 2: How will FinTech Impact the Poor, Labour, and the Banking Sector?</h4>
<p>Ashim Roy, New Trade Union of India</p>
<p>Nikhil Dey, Mazdoor Kisan Shakti Sangathan</p>
<p>Ravinder Gupta, General Secretary, State Bank of India Officers Association</p>
<h4>13:15–14:00 – Lunch</h4>
<h4>14:00–15:30 – Session 3: Understanding the Economic Context of FinTech</h4>
<p>Indira Rajaraman, Former Director, RBI</p>
<p>Tony Joseph, Sr. Journalist</p>
<h4>15:30–17:00 – Session 4: Understanding the Architecture of FinTech: Linkages to Aadhaar, IndiaStack, etc.</h4>
<p>Sumandro Chattapadhyay, the Centre for Internet and Society</p>
<p>Gopal Krishna, ToxicsWatch</p>
<h4>17:00 – Tea</h4>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/seminar-on-understanding-financial-technology-cashless-india-and-forced-digitalisation-delhi-jan-24-2017'>http://editors.cis-india.org/internet-governance/news/seminar-on-understanding-financial-technology-cashless-india-and-forced-digitalisation-delhi-jan-24-2017</a>
</p>
sumandro · 2017-01-23
Comments on the Report of the Committee on Digital Payments (December 2016)
http://editors.cis-india.org/internet-governance/blog/comments-on-the-report-of-the-committee-on-digital-payments-dec-2016
<b>The Committee on Digital Payments constituted by the Ministry of Finance and chaired by Ratan P. Watal, Principal Advisor, NITI Aayog, submitted its report on the "Medium Term Recommendations to Strengthen Digital Payments Ecosystem" on December 09, 2016. The report was made public on December 27, and comments were sought from the general public. Here are the comments submitted by the Centre for Internet and Society.</b>
<p> </p>
<h3><strong>1. Preliminary</strong></h3>
<p><strong>1.1.</strong> This submission presents comments by the Centre for Internet and Society (“CIS”) <strong>[1]</strong> in response to the report of the Committee on Digital Payments, chaired by Mr. Ratan P. Watal, Principal Advisor, NITI Aayog, and constituted by the Ministry of Finance, Government of India (“the report”) <strong>[2]</strong>.</p>
<h3><strong>2. The Centre for Internet and Society</strong></h3>
<p><strong>2.1.</strong> The Centre for Internet and Society, CIS, is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with diverse abilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, and open access), internet governance, telecommunication reform, digital privacy, and cyber-security.</p>
<p><strong>2.2.</strong> CIS is not an expert organisation in the domain of banking in general and payments in particular. Our expertise is in matters of internet and communication governance, data privacy and security, and technology regulation. We deeply appreciate, and are most inspired by, the Ministry of Finance’s decision to invite entities from both the finance and information technology sectors. This submission is consistent with CIS’ commitment to safeguarding the general public interest and the interests and rights of the various stakeholders involved, especially citizens and users. CIS is thankful to the Ministry of Finance for this opportunity to provide a general response to the report.</p>
<h3><strong>3. Comments</strong></h3>
<p><strong>3.1.</strong> CIS observes that the decision by the Government of India to withdraw the legal tender character of the old high denomination banknotes (that is, the Rs. 500 and Rs. 1,000 notes), declared on November 08, 2016 <strong>[3]</strong>, has generated <strong>unprecedented data about the user base and transaction patterns of digital payments systems in India, pushed to extreme use by the circumstances</strong>. The majority of this data is available with the National Payments Corporation of India and the Reserve Bank of India. CIS requests the authorities concerned to consider <strong>opening up this data for analysis and discussion by the public at large and experts in particular, before any specific policy and regulatory decisions are taken</strong> towards advancing digital payments proliferation in India. This is a crucial opportunity for the Ministry of Finance to embrace (open) data-driven regulation and policy-making.</p>
<p><strong>3.2.</strong> While the report makes a reference to the European General Data Protection Directive, it does not refer to any substantive provisions in the Directive that may be relevant to digital payments. Aside from the recommendation that privacy protections around the purpose limitation principle be relaxed so that payment service providers may process data to improve fraud monitoring and anti-money laundering services, the report is silent on the significant privacy and data protection concerns posed by digital payments services. <strong>CIS strongly warns that the existing data protection and security regulations under the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules are woefully inadequate in their scope and application to effectively deal with potential privacy concerns posed by digital payments applications and services.</strong> Some key privacy issues that must be addressed, either under a comprehensive data protection legislation or under sector-specific financial regulation, are listed below.</p>
<ul>
<li>The process of obtaining consent must be specific, informed, and unambiguous, and must involve a clear affirmative action by the data subject based upon a genuine choice, along with an option to opt out at any stage.</li>
<li>Data subjects should have a clear and easily enforceable right to access and correct their data.</li>
<li>Data subjects should have the right to restrict the usage of their data in circumstances such as inaccuracy of the data, use for an unlawful purpose, or the data no longer being required to fulfil the original purpose.</li>
</ul>
<p><strong>3.3.</strong> The initial recommendation of the report is to “[m]ake regulation of payments independent from the function of central banking” (page 22). This involves a fundamental transformation of the payment and settlement system in India and its regulation. <strong>We submit that a decision regarding a transformation of such scale and implications be taken only after a more comprehensive policy discussion, especially one involving a wider range of stakeholders</strong>. The report itself notes that “[d]igital payments also have the potential of becoming a gateway to other financial services such as credit facilities for small businesses and low-income households” (page 32). Thus, a clear functional, and hence regulatory, separation between the (digital) payments industry and the lending/borrowing industry may be neither effective nor desirable. Global experience tells us that digital transactions data, along with other alternative data, are fast becoming the basis for the provision of financial and other services by both banking and non-banking (payments) companies. We appeal to the Ministry of Finance to adopt a comprehensive and concerted approach to regulating the banking sector at large, enabling competition within it, and upholding consumers’ rights.</p>
<p><strong>3.4.</strong> The report recognises that “banking as an activity is separate from payments, which is more of a technology business” (page 154). Contemporary banking and payment businesses are both primarily technology businesses, in which information technology is deployed intimately to extract and process financial transaction data and to drive asset management decisions. Further, with payment businesses (such as pre-paid instruments) offering returns on deposited money via other means (such as cashbacks), and potentially competing and/or collaborating with established banks to use financial transaction data to drive lending decisions, including but not limited to micro-loans, it appears unproductive to create a separation between banking as an activity and payments as an activity merely in terms of the respective technology intensity of these sectors. <strong>CIS firmly recommends that regulation of these financial services and activities be undertaken in a technology-agnostic manner, and that similar regulatory regimes be applied to entities offering similar services irrespective of their technology intensity or choice</strong>.</p>
<p><strong>3.5.</strong> The report highlights two major shortcomings of the current regulatory regime for payments. First, “the law does not impose any obligation on the regulator to promote competition and innovation in the payments market” (page 153). It appears to us that the regulator’s role should not be to promote market expansion and innovation but to ensure and oversee competition. <strong>We believe that the current regulator should focus on regulating the existing market, and that the work of expanding the digital payments market in particular, and the digital financial services market in general, should be carried out by another government agency, as it would otherwise create a conflict of interest for the regulator.</strong> Second, the report mentions that the Payment and Settlement Systems Act does not “focus the regulatory attention on the need for consumer protection in digital payments” and then notes that a “provision was inserted to protect funds collected from customers” in 2015 (page 153). <strong>This indicates that the regulator already has the responsibility to ensure consumer protection in digital payments. The purview and modalities of this function will, of course, need discussion and revision as digital payments grow</strong>.</p>
<p><strong>3.6.</strong> The report identifies the high cost of cash as a key reason for the government’s policy push towards digital payments. Further, it mentions that a “sample survey conducted in 2014 across urban and rural neighbourhoods in Delhi and Meerut, shows that despite being keenly aware of the costs associated with transacting in cash, most consumers see three main benefits of cash, viz. freedom of negotiations, faster settlements, and ensuring exact payments” (page 30). It further notes that “[d]igital payments have significant dependencies upon power and telecommunications infrastructure. Therefore, the roll out of robust and user friendly digital payments solutions to unelectrified areas/areas without telecommunications network coverage, remains a challenge.” <strong>CIS greatly appreciates the discussion of the barriers to universal adoption and rollout of digital payments in the report, and appeals to the Ministry of Finance to undertake a more comprehensive study of the key investments required by the Government of India to ensure that digital payments become ubiquitously viable and satisfy the demands of India’s vast range of consumers</strong>. The estimates of the investment required to create a robust digital payment infrastructure, cited in the report, provide a sound basis for undertaking such studies.</p>
<p><strong>3.7.</strong> CIS is very encouraged to see the report highlighting that “[w]ith the rising number of users of digital payment services, it is absolutely necessary to develop consumer confidence on digital payments. Therefore, it is essential to have legislative safeguards to protect such consumers in-built into the primary law.” <strong>We second this recommendation and would like to add that financial transaction data should be governed under a common data protection and privacy regime, without making any distinction between data collected by banking and non-banking entities</strong>.</p>
<p><strong>3.8.</strong> We are, however, very discouraged to see the plainly incorrect use of the term “Open Access” in this report in the context of a payment system disallowing service when the client wants to transact money with a specific entity <strong>[4]</strong>. This is not an uncommon anti-competitive measure adopted by various platform players and service providers to prevent users from using competing products (such as not allowing competing apps in the app store controlled by one software company). <strong>Not only is “Open Access” an inappropriate term for the negation of such anti-competitive behaviour, its usage in this context undermines its accepted meaning and creates confusion regarding the recommendation being proposed by the report.</strong> The closest analogy to the recommendation of the report would perhaps be the principle of “network neutrality,” which stands for the network provider not discriminating between the data packets it processes, either in terms of price or speed.</p>
<p><strong>3.9.</strong> A major recommendation of the report involves the creation of “a fund from savings generated from cash-less transactions … by the Central Government,” which will use “the trinity of JAM (Jan Dhan, Adhaar, Mobile) [to] link financial inclusion with social protection, contributing to improved Social and Financial Security and Inclusion of vulnerable groups/ communities” (pages 160-161). <strong>This amounts to making Aadhaar a mandatory ID for the financial inclusion of citizens, especially the marginal and vulnerable, and is in direct contradiction of the government’s statements regarding the optional nature of the Aadhaar ID, as well as the orders of the Supreme Court on this topic</strong>.</p>
<p><strong>3.10.</strong> The report recommends that “Aadhaar should be made the primary identification for KYC with the option of using other IDs for people who have not yet obtained Aadhaar” (page 163) and further that “Aadhaar eKYC and eSign should be a replacement for paper based, costly, and shared central KYC registries” (page 162). <strong>Not only would these measures imply making Aadhaar a mandatory ID for undertaking any legal activity in the country, they also assume that the UIDAI has verified and audited the personal documents submitted by Aadhaar number holders during enrollment.</strong> A mandate for <em>replacement</em> of the paper-based central KYC agencies will only remove a much needed redundancy in the identity verification infrastructure of the government.</p>
<p><strong>3.11.</strong> The report suggests that “[t]ransactions which are permitted in cash without KYC should also be permitted on prepaid wallets without KYC” (pages 164-165). This seems to negate the reality that physical verification remains one of the most authoritative identity verification processes for a natural person, apart perhaps from DNA testing. <strong>Thus, establishing full equivalency of procedure between a presence-less transaction and one involving a physically present person making the payment will only amount to removing the relatively greater security precautions for the former, and will open up possibilities of fraud</strong>.</p>
<p><strong>3.12.</strong> In continuation of the previous point, the report recommends promotion of “Aadhaar based KYC where PAN has not been obtained” and making “quoting Aadhaar compulsory in income tax return for natural persons” (page 163). Both these measures imply a replacement of the PAN by Aadhaar in the long term, and a sharp reduction in the growth of new PAN holders in the short term. <strong>We appeal for this recommendation to be reconsidered, as the integration of all functionally separate national critical information infrastructures (such as PAN and Aadhaar) into a single unified and centralised system (such as Aadhaar) engenders massive national and personal security threats</strong>.</p>
<p><strong>3.13.</strong> The report suggests the establishment of “a ranking and reward framework” to recognise and encourage the best-performing state/district/agency in the proliferation of digital payments. <strong>It appears to us that such a framework will create an environment of competition among the entities concerned, which, apart from its benefits, may also have its costs. For example, incentivising the quick rollout of digital payment avenues by state governments and various government agencies may lead to implementation without sufficient planning, coordination with stakeholders, and precautions regarding data security and privacy</strong>. The provision of central support for digital payments should be carried out in an environment of cooperation, not competition.</p>
<p><strong>3.14.</strong> CIS welcomes the recommendation by the report to generate greater awareness about the cost of cash, including by ensuring that “large merchants including government agencies should account and disclose the cost of cash collection and cash payments incurred by them periodically” (page 164). It is, however, not clear to whom such periodic disclosures should be made. <strong>We would like to add here that awareness building must simultaneously make public how different entities shoulder these costs. Further, for reasons of comparison and evidence-driven policy making, it is necessary that data for the equivalent variables are also made open for digital payments: the total and disaggregated costs, and what proportion of these costs are shouldered by which entities</strong>.</p>
<p><strong>3.15.</strong> The report acknowledges that “[t]oday, most merchants do not accept digital payments” and goes on to recommend “that the Government should seize the initiative and require all government agencies and merchants where contracts are awarded by the government to provide at-least one suitable digital payment option to its consumers and vendors” (page 165). This requirement to offer a digital payment option would introduce an additional economic barrier for merchants bidding for government contracts. <strong>We appeal to the Ministry of Finance to reconsider this approach of raising the costs of non-digital payments to incentivise the proliferation of digital payments, and instead to lower the existing economic and other barriers that keep merchants away from digital payments</strong>. The adoption of digital payments must decrease, not increase, costs for merchants and end-users.</p>
<p><strong>3.16.</strong> As the report was submitted on December 09, 2016, and was made public only on December 27, 2016, <strong>it would have been much appreciated if at least a month-long window had been provided to study and comment on the report, instead of fifteen days</strong>. This is especially crucial as the recently implemented demonetisation, and the subsequent banking and fiscal policy decisions taken by the government, have rapidly transformed the state and dynamics of the payments system landscape in India in general, and digital payments in particular.</p>
<h3><strong>Endnotes</strong></h3>
<p><strong>[1]</strong> See: <a href="http://cis-india.org/">http://cis-india.org/</a>.</p>
<p><strong>[2]</strong> See: <a href="http://finmin.nic.in/reports/Note-watal-report.pdf">http://finmin.nic.in/reports/Note-watal-report.pdf</a> and <a href="http://finmin.nic.in/reports/watal_report271216.pdf">http://finmin.nic.in/reports/watal_report271216.pdf</a>.</p>
<p><strong>[3]</strong> See: <a href="http://finmin.nic.in/cancellation_high_denomination_notes.pdf">http://finmin.nic.in/cancellation_high_denomination_notes.pdf</a>.</p>
<p><strong>[4]</strong> Open Access refers to “free and unrestricted online availability” of scientific and non-scientific literature. See: <a href="http://www.budapestopenaccessinitiative.org/read">http://www.budapestopenaccessinitiative.org/read</a>.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/comments-on-the-report-of-the-committee-on-digital-payments-dec-2016'>http://editors.cis-india.org/internet-governance/blog/comments-on-the-report-of-the-committee-on-digital-payments-dec-2016</a>
</p>
Sumandro Chattapadhyay and Amber Sinha · 2017-01-12
New Media, personalisation and the role of algorithms
http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms
<b>In his much acclaimed book, The Filter Bubble, Eli Pariser explains how personalisation of services on the web works and laments that they are creating individual bubbles for each user, which run counter to the idea of the Internet as an inherently open place. While Pariser’s book looks at the practices of various large companies providing online services, he briefly touches upon the role of new media such as search engines and social media portals in news curation. Building upon Pariser’s unexplored argument, this article looks at the impact of algorithmic decision-making and Big Data in the context of news reporting and curation.</b>
<blockquote>
<div>
<div><em>Everything which bars freedom and fullness of communication sets up barriers that divide human beings into sets and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life. </em>—John Dewey</div>
</div>
</blockquote>
<p>Eli Pariser, in his book The Filter Bubble,[1] refers to the scholarship of Walter Lippmann and John Dewey as integral to the evolution of our understanding of the democratic and ethical duties of the Fourth Estate. Lippmann was disillusioned by the role of newspapers in propaganda for the First World War. He responded with three books in quick succession: Liberty and the News,[2] Public Opinion[3] and The Phantom Public.[4] Lippmann brought attention to the fact that the process of news reporting was conducted through privately determined and unexamined standards. The failure of the Fourth Estate to perform its democratic functions was, in Lippmann’s opinion, one of the prime factors responsible for the public not being an informed and rational entity. John Dewey, while rejecting Lippmann’s argument that matters of public policy can only be determined by inside experts with training and education, did acknowledge his critique of the media.</p>
<p>Pariser points to the creation of a wall between editorial decision-making and advertiser interests as the eventual result of the Lippmann and Dewey debate. While accepting that this division between the financial and reporting sides of media houses has not always been observed, Pariser emphasises that the fact that the standard exists is important.[5] Unlike traditional media, the new media, which relies on algorithmic decision-making for personalisation, is not subject to the same standards that try to mitigate the influence of commercial interests on editorial decisions, even while it performs many of the same functions as the traditional media.[6]</p>
<h3>How personalisation algorithms work</h3>
<p dir="ltr">Kevin Slavin, at his famous talk in the TEDGLobal Conference, characterised algorithms as “maths that computers use to decide stuff” and that it was infiltrating every aspect of our lives.[7] According to Slavin’s view, algorithms can be seen as control technologies and shape our world constantly through media and information systems, dynamically modifying content and function through these programmed routines. Search engines and social media platforms perpetually rank user-generated content through algorithms.[8]</p>
<p>Personalisation technologies have various advantages. They translate into more relevant content, which for service providers means more clicks and revenue, and for consumers, less time spent finding content.[9] However, they also lead to compromised privacy, lack of control, and reduced individual capability.[10] Search engines like Google use the famous PageRank algorithm, which, combined with geographical location and previous searches, yields the most relevant search results.[11] The PageRank algorithm uses various real-time variables dependent on both voluntary and involuntary user inputs. These variables include the number of clicks, the number of occurrences of the key terms, and the number of references by other credible pages. This data in turn determines the order of pages in search results and influences the way we perceive, understand, and analyse information.[12] Maps showing real-time traffic information retrieve data from laser and infrared sensors alongside the road and from users’ devices. Once this real-time data is combined with historical trends, these maps recommend routes to every user, thereby influencing traffic patterns.[13]</p>
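The link-analysis core of PageRank can be sketched as a simple power iteration; this is illustrative only, since, as noted above, the production ranking layers many further signals (location, history, clicks) on top of it.

```python
# Minimal PageRank sketch via power iteration. Illustrative only: the real
# ranking pipeline combines link analysis with many other signals, as the
# text notes. The tiny "web" below is an invented example.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}            # start uniform
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:                           # share rank across outlinks
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:                                  # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A three-page "web": C is cited by both A and B, so it ranks highest.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
```

The intuition matches the text: a page referenced by other well-ranked pages accumulates rank, which is one of the "involuntary inputs" that shape what users see first.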
<p>Even though this phenomenon of personalisation may appear to be new, it has been prevalent in society for ages.[14] The history of mass media culture clearly shows that personalisation has always been a method to increase market size, market reach, and customer satisfaction.[15] Newspapers have sections dedicated to special topics; radio and TV have channels dedicated to different interest groups, age groups, and consumers.[16] These personalised sections in a newspaper and personalised channels on radio and television don’t just provide greater satisfaction to readers, listeners, and viewers; they also provide targeted advertisement space for advertisers and content developers. However, digital footprints and the mass collection of data have made this phenomenon much more granular and detailed. The geographical location of an individual can tell a lot about their community, their culture, and other important traits local to a community.[17] This data further assists in personalisation. Current developments in technology not only help in better collection of data about personal preferences but also help in better personalisation.</p>
<p>Pariser mentions three ways in which the personalisation technologies of this day differ from those of the past. First, for the very first time, individuals are alone in the filter bubble. While in traditional forms of personalisation there were various individuals who shared the same frame of reference, now there is a separate set of filters governing the dissemination of content to each individual.[18] Second, the personalisation technologies are now entirely invisible, and there is little that consumers can do to control or modify them.[19] Third, the decision to be subject to these personalisation technologies is often not an informed choice. A good example of this is an individual’s geographical location.[20]</p>
<h3>The neutrality of New Media?</h3>
<p dir="ltr">More and more, we have noticed personalisation technologies having an impact on how we consume news on the Internet. Google News, Facebook’s News Feed which tries to put together a dynamic feed for both personal and global stories, and Twitter’s trending hashtag feature, have brought forward these services are key drivers of an emerging news ecosystem. Initially, this new media was hailed as a natural consequence of the Internet which would enable greater public participation, allow journalists to find more stories and engage with the readers directly. An illustration of the same could be seen in the way Internet based news media and social networking websites behaved in the aftermath of Israel’s attacks on a United Nations run school in Gaza strip. While much of the international Internet media covered the story, Israel’s home media did not cover the story. The only exception to this was the liberal Israeli news website Ha’aretz.[21] Network graph details of Twitter, for a few days immediately after the incident clearly show the social media manifestation of the event in the personalised cyberspace. It is clearly visible that when most of the word was re-tweeting news of this heinous act of Israel, Israeli’s hardly re-tweeted this news. In fact they were busty re-tweeting the news of rocket attacks on Israel.[22]</p>
<p>The use of social media in newsmaking was hailed by many scholars as symptomatic of the decentralisation characteristic of the Internet. It has been seen as a movement towards greater grassroots participation, negating the ‘gatekeeping’ role traditionally played by editors. Thomas Poell and José van Dijck punch holes in the theory of social media and other online technologies as mere facilitators of user participation and translators of user preferences through Big Data analytics.[23] They quote T. Gillespie’s work, which speaks of the narrative of these online services as platforms offering “open, neutral, egalitarian and progressive support for activity.”[24]</p>
<p>Pedro Domingos calls the overwhelming number of choices the defining problem of the information age, and machine learning and data analytics the largest part of its solution.[25] The primary function of algorithmic decision-making in the context of content consumption is to narrow down choices. Domingos is more optimistic about the impact of these technologies, saying that the “last step of the decision is usually still for humans to make, but learners intelligently reduce the choices to something a human can manage.”[26] On the other hand, Pariser is more circumspect about the coercive results of machine learning algorithms. Whichever way we lean, we have to accept that a large part of what personalisation algorithms do is select and prioritise content by categorising it on the basis of relevance and popularity.</p>
<p>Poell and van Dijck call this a new knowledge logic, one that in effect replaces human judgement (as earlier exercised by editors) with a kind of proxy decision-making based on data. Their main thesis is that there is little evidence to suggest that the latter is more democratic than the former, and that it creates new problems of its own. They go on to compare the practices of various services, including Facebook’s news feed and Twitter’s trending topics, and conclude that these prioritise breaking news stories over other kinds of content.[27] For instance, the trending topics algorithm depends not on the volume of tweets carrying a hashtag or term but on their velocity. It could be argued that, given this predilection, such algorithms will rarely prefer complex content. If we go by Lippmann and Dewey’s idea that the role of the Fourth Estate is to inform public debate and hold those in positions of power accountable, this aspect of Big Data algorithms does not correspond with that role.</p>
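The velocity-over-volume behaviour described above can be sketched in a few lines. This is a toy illustration, not Twitter’s actual algorithm: the scoring rule, time windows, and hashtags are all assumptions made for demonstration.

```python
# Illustrative sketch (assumption, not Twitter's real algorithm): score a
# hashtag by its recent velocity, i.e. its burst above its historical rate,
# rather than by its cumulative volume of tweets.
from collections import Counter

def trending_scores(tweet_log, window=2):
    """tweet_log: list of (hour, hashtag) pairs.
    Score = mentions in the most recent `window` hours, minus the
    mentions expected per window from the hashtag's overall rate."""
    latest = max(hour for hour, _ in tweet_log)
    earliest = min(hour for hour, _ in tweet_log)
    span = latest - earliest + 1
    recent = Counter(tag for hour, tag in tweet_log if hour > latest - window)
    total = Counter(tag for _, tag in tweet_log)
    scores = {}
    for tag, overall in total.items():
        baseline = overall * window / span           # expected mentions per window
        scores[tag] = recent.get(tag, 0) - baseline  # burst above baseline
    return scores

# A slow, steady hashtag accumulates more total tweets than a sudden burst,
# yet the burst scores higher because its recent velocity exceeds its baseline.
log = [(h, "#steady") for h in range(10)] * 3     # 30 tweets spread over 10 hours
log += [(9, "#burst")] * 8 + [(8, "#burst")] * 4  # 12 tweets in the last 2 hours
scores = trending_scores(log)
```

Here `#steady` has 30 total tweets to `#burst`’s 12, but `#burst` trends because all of its activity falls inside the recent window, which is exactly the predilection for breaking stories the authors describe.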
<h3>Quantified Audience</h3>
<p dir="ltr">Another aspect of the use of Big Data and algorithms in New Media that requires attention is that the networked infrastructure enables a quantified audience. C. W. Anderson, who has studied newsroom practices in the US, looked at the role played by audience quantification and rationalisation in shifting newswork practices. He concluded that journalists are increasingly less autonomous in their news decisions and more reliant on audience metrics as a supplement to news judgment.[28] Poell and van Dijck review the practices of leading publications such as the New York Times, the L.A. Times and the Huffington Post, and the degree to which audience metrics dictate editorial decisions. While the New York Times prioritises content on its social media portals based on the expectation of spikes in user traffic, the L.A. Times goes a step further by developing content specifically aimed at promoting greater social participation. Neither of these practices, though, compares to the reliance on SEO and SMO strategies of web-born news providers like the Huffington Post, whose traffic editors trawl the Internet for trending topics and popular search terms, and whose feedback dictates content creation.[29]</p>
<h3>Conclusion</h3>
<p dir="ltr">The above factors demonstrate that the idea of New Media enabling the Fourth Estate to perform its democratic functions does not take actual practices into account. This idea rests on the erroneous assumption that technology in general, and algorithms in particular, are neutral. While the emergence of New Media may have reduced the gatekeeping role played by editors, its strong prioritisation of content likely to be popular weakens the argument that it leads to more informed public discussion. As Pariser notes, traditional media scores over New Media in that there exists a standard separating editorial decision-making from advertiser interests. Though media houses flout this standard all the time, it remains a benchmark to aspire to and to measure providers against. New Media performs many of the same functions, and it may be time to evolve principles and ethical standards that take into account the need for it to perform these democratic functions.</p>
<h3>Endnotes </h3>
<p class="normal"><sup><sup>[1]</sup></sup> Eli Pariser, The Filter Bubble: What the Internet is
hiding from you (The Penguin Press, New York, 2011) </p>
<p dir="ltr"><span class="MsoFootnoteReference"><span class="MsoFootnoteReference">[2]</span></span> Walter Lippmann, Liberty and News (Harcourt, Brace
and Howe, New York 1920) available at<a href="https://archive.org/details/libertyandnews01lippgoog">https://archive.org/details/libertyandnews01lippgoog</a></p>
<p class="normal"><sup><sup>[3]</sup></sup> Walter Lippmann, Public Opinion (Harcourt, Brace and
Howe, New York 1920) available at <a href="http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html">http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html</a></p>
<p class="normal"><sup><sup>[4]</sup></sup> Walter Lippmann, The Phantom Public (Transaction
Publishers, New York, 1925)</p>
<p class="normal"><sup><sup>[5]</sup></sup> <em>Supra</em> Note
1 at 35.</p>
<p class="normal"><sup><sup>[6]</sup></sup> <em>Supra</em> Note
1 at 36.</p>
<p class="normal"><sup><sup>[7]</sup></sup> <a href="https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en">https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en</a></p>
<p class="normal"><sup><sup>[8]</sup></sup> Fenwick McKelvey, “Algorithmic Media Need Democratic
Methods: Why Publics Matter”, available at <a href="http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf">http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf</a>.</p>
<p class="normal"><sup><sup>[9]</sup></sup> <a href="http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1">http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1</a></p>
<p class="normal"><sup><sup>[10]</sup></sup> Helen Ashman, Tim Brailsford, Alexandra Cristea, Quan
Z Sheng, Craig Stewart, Elaine Torns and Vincent Wade, “The ethical and social
implications of personalization technologies for e-learning” available at <a href="http://www.sciencedirect.com/science/article/pii/S0378720614000524">http://www.sciencedirect.com/science/article/pii/S0378720614000524</a>.</p>
<p class="normal"><sup><sup>[11]</sup></sup> Sergey Brin and Lawrence Page, “The Anatomy of a
Large-Scale Hypertextual Web Search Engine” available at <a href="http://infolab.stanford.edu/pub/papers/google.pdf">http://infolab.stanford.edu/pub/papers/google.pdf</a>.</p>
<p class="normal"><sup><sup>[12]</sup></sup> Ian Rogers, “The Google Pagerank Algorithm and How It
Works” available at <a href="http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm">http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm</a>.</p>
<p class="normal"><sup><sup>[13]</sup></sup> Trygve Olson and Terry Nelson, “The Internet’s Impact
on Political Parties and Campaigns”, available at <a href="http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942">http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942</a>.</p>
<p class="normal"><sup><sup>[14]</sup></sup> Ian Witten, “Bias, privacy and personalisation on
the web”, available at <a href="http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf">http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf</a>.</p>
<p class="normal"><sup><sup>[15]</sup></sup> <em>Supra</em> Note
1 at 10.</p>
<p class="normal"><sup><sup>[16]</sup></sup> <a href="https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/">https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/</a></p>
<p class="normal"><sup><sup>[17]</sup></sup> Charles Heatwole, “Culture: A Geographical Perspective”
available at <a href="http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html">http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html</a>.</p>
<p class="normal"><sup><sup>[18]</sup></sup> <em>Supra</em> Note
1 at 10.</p>
<p class="normal"><sup><sup>[19]</sup></sup> <em>Id</em>.</p>
<p class="normal"><sup><sup>[20]</sup></sup> <em>Supra</em> Note
1 at 11.</p>
<p class="normal"><sup><sup>[21]</sup></sup> Paul Mason, “Why Israel is losing the social media
war over Gaza?” available at <a href="http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182">http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182</a>.</p>
<p class="normal"><sup><sup>[22]</sup></sup> Gilad Lotan, “Israel, Gaza, War &amp; Data: Social
Networks and the Art of Personalizing Propaganda”, available at <a href="http://www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html">www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html</a>.</p>
<p class="normal"><sup><sup>[23]</sup></sup> Thomas Poell and José van Dijck, “Social Media and
Journalistic Independence” in Media Independence: Working with Freedom or
Working for Free?, edited by James Bennett & Niki Strange. (Routledge,
London, 2015)</p>
<p class="normal"><sup><sup>[24]</sup></sup> T Gillespie, “The politics of ‘platforms’”, in New
Media &amp; Society (Volume 12, Issue 3).</p>
<p class="normal"><sup><sup>[25]</sup></sup> Pedro Domingos, The Master Algorithm: How the quest
for the ultimate learning machine will re-make the world (Basic Books, New
York, 2015) at 38.</p>
<p class="normal"><sup><sup>[26]</sup></sup> <em>Ibid</em> at 40.</p>
<p class="normal"><sup><sup>[27]</sup></sup> <em>Supra</em> Note
23.</p>
<p class="normal"><sup><sup>[28]</sup></sup> C W Anderson, “Between creative and quantified
audiences: Web metrics and changing patterns of newswork in local US newsrooms”,
available at <a href="https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms">https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms</a>.</p>
<p class="normal"><sup><sup>[29]</sup></sup> <em>Supra</em> Note 23.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms'>http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms</a>
</p>
Blog Entry, 2017-01-16
Workshop Report - UIDAI and Welfare Services: Exclusion and Countermeasures
http://editors.cis-india.org/internet-governance/blog/workshop-report-uidai-and-welfare-services-august-27-2016
<b>This report presents summarised notes from a workshop organised by the Centre for Internet and Society (CIS) on Saturday, August 27, 2016, to discuss, raise awareness of, and devise countermeasures to exclusion due to implementation of UID-based verification for and distribution of welfare services.</b>
<h2>Introduction</h2>
<p>The Centre for Internet and Society organised a workshop on "UIDAI and Welfare Services: Exclusion and Countermeasures" at the Institution of Agricultural Technologists in Bangalore on August 27 to discuss, raise awareness of, and devise countermeasures to exclusion caused by the implementation of UID-based verification for, and distribution of, welfare services <strong>[1]</strong>. This was a follow-up to the workshop “Understanding Aadhaar and its New Challenges” held at the Centre for Studies in Science Policy, JNU, Delhi on May 26 and 27, 2016 <strong>[2]</strong>. In this report we summarise the key concerns raised and the case studies presented by the participants at the workshop held on August 27, 2016.</p>
<h2>Implementation of the UID Project</h2>
<p><strong>Question of Consent:</strong> The Aadhaar Act <strong>[3]</strong> states that the consent of the individual must be taken at the time of enrollment and authentication, and that he/she must be informed of the purpose for which the data will be used. However, the Act does not provide an opt-out mechanism, and an individual is compelled to give consent in order to continue with enrollment or to complete an authentication.</p>
<p><strong>Lack of Adherence to Court Orders:</strong> Despite several Supreme Court orders stating that use of Aadhaar cannot be made mandatory for availing benefits and services, multiple state governments and departments have made it mandatory for a wide range of purposes: booking railway tickets <strong>[4]</strong>, linking below-poverty-line ration cards with Aadhaar <strong>[5]</strong>, school examinations <strong>[6]</strong>, and food security, pension and scholarship schemes <strong>[7]</strong>, to name a few.</p>
<p><strong>Misleading Advertisements:</strong> A concern was raised that individuals are being misled about the necessity of and purpose for enrollment in the project. For example, people have been told to enrol or risk being excluded from the system and denied services such as passports, bank accounts, NREGA work, salaries for government employees, and even vaccinations. Furthermore, although the Supreme Court has ordered that Aadhaar not be mandatory, people are being told that documentation or record-keeping cannot be done without a UID number.</p>
<p><strong>Hybrid Governance:</strong> The participants pointed out that with the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016 (hereinafter the Aadhaar Act, 2016) being partially enforced, the many instances of exclusion reported in the news demonstrate how the project is creating a case of hybrid governance, i.e., private corporations playing a significant role in governance. This can be seen in the number of private-sector entities, including many software and hardware companies, involved in Aadhaar's implementation.</p>
<p><strong>Lack of Transparency around Sharing of Biometric Data:</strong> How and why the Government is relying on biometrics for welfare schemes is unclear. There is no information on how the biometric data collected through the project is being used, or on its reliability as a means of authentication. There is also very little information about the companies enlisted to hold and manage the data and perform authentication.</p>
<p><strong>Possibility of Surveillance:</strong> Multiple petitions and ongoing cases have raised concerns regarding the possibility of surveillance, tracking, profiling, convergence of data, and the opaque involvement of private companies in the project.</p>
<p><strong>Denial of Information:</strong> An RTI request filed by one of the participants asking for the key contract of the project was refused under Section 8(1)(d) of the RTI Act, 2005. The participant argued that the provision did not apply, since the contract had already been awarded, and that any information that can be disclosed to Parliament should be disclosed to citizens. The Central Information Commission issued a letter stating that the contractual obligation was over and that a copy of the agreement could be duly shared. However, the participant discovered that certain pages, said to contain confidential information, were missing. On appeal, the Information Commissioner ordered the IC in Delhi to comply with the previous order; it was then communicated that limited financial information could be given, but not the missing pages. It was also revealed that the UIDAI was supposed to share biometric data with the NPR (by way of an MoU), but it has refused to give information, since the intention was to discontinue the NPR and have only the UIDAI collect data.</p>
<h2>Concerns Arising from the Report of the Comptroller and Auditor General of India (CAG) on Implementation of PAHAL (DBTL) Scheme</h2>
<p>A presentation on the CAG compliance audit report of PAHAL on LPG <strong>[8]</strong> revealed how society was made to believe that UID, and the collection and use of biometric data, would deal with the issue of duplication. The report found that multiple LPG connections share the same Aadhaar number or the same bank account number in the consumer database maintained by the OMCs, that consumers' bank account numbers were not accurately recorded, that Aadhaar numbers were improperly captured, and that IFSC codes were incorrectly seeded in the consumer database. The participants felt this was an example of how schemes introduced for social welfare do not necessarily benefit society and have, on the contrary, led to exclusion by design. In 2011, by way of the Liquefied Petroleum Gas (Regulation of Supply and Distribution) Amendment Order, 2011 <strong>[9]</strong>, the Ministry of Petroleum and Natural Gas made the Unique Identification Number (UID) under the Aadhaar project a must for availing LPG refills. This received considerable public pushback, which led to non-implementation of the order. In October 2012, despite the UIDAI stating that the number was voluntary, a number of services began requiring an Aadhaar number for accessing benefits. In September 2013, when the first court order on Aadhaar was passed <strong>[10]</strong>, oil marketing companies and the UIDAI approached the Supreme Court seeking to make it mandatory, which the Court refused. Later, in 2014, use of Aadhaar for subsidies was made mandatory. The participants further noted that the CAG report reveals how linking Aadhaar with welfare schemes has allowed duplication and led to ghost beneficiaries, with no information about who is actually receiving the subsidies. In Rajasthan, for example, people are being denied their pensions because they have been declared dead owing to the absence of information in the Aadhaar database.</p>
<p>It was said that the duplication statistics in the report show that the UIDAI (which claims to ensure de-duplication of beneficiaries) is not required for this purpose, and that de-duplication can be done without Aadhaar as well. Moreover, due to incorrect seeding of Aadhaar numbers, many people are being denied the subsidy, and there is no information on how many have been denied for this reason. Considering these facts from the audit report, the discussants concluded that the statistics reflect inflated claims by the UIDAI, and that the problems Aadhaar is said to address can be dealt with without it. In this context, it is important to understand that the data in the Aadhaar database may be wrong, and in e-governance it is the citizens who suffer. Also ignored is the fact that what is lost is not a cash subsidy but the use of the LPG cylinder itself for cooking. In addition, there is no data or way to check whether cylinders are being used for commercial purposes, as RTI responses from the oil companies say that no ghost identities have been detected.</p>
<h2>UID-linked Welfare Delivery in Rajasthan</h2>
<p>One speaker presented findings on people's experiences with UID-linked welfare services in Rajasthan, collected through a 100-day trip organised to speak to people across the state about problems related to welfare governance. The visit revealed that the people who most need benefits and subsidies are often excluded from the actual services, and it was highlighted that the paperless system is proving highly dangerous. One case discussed was that of a disabled labourer who was asked to get an Aadhaar card; during enrollment, the person standing next to him placed all five of his own fingers on the scanner for the biometric data collection. Because of this incorrect data, the labourer is denied all subsidies, as authentication fails every time he tries to avail them, and he has stopped receiving his entitlements. Though problems were anticipated, the misery of the people revealed the true extent of the problems arising from the project. In another case, an elderly woman living alone had not received the ration she is entitled to for the past eight months, because she could not go for Aadhaar authentication. When the ration shop was approached on her behalf, the dealers said they could not provide her ration without her thumb print for authentication. On persuading the dealer to provide the ration, since Aadhaar is not mandatory, it was discovered that the shop's records showed her as already receiving the ration, which was not the case. The lack of awareness that people are entitled to receive benefits irrespective of Aadhaar is thus being misused by dealers. This shows how the system has become a barrier for people, who are also unaware of the grievance redressal mechanism.</p>
<h2>Aadhaar and e-KYC</h2>
<p>In this session, the use of Aadhaar for e-KYC verification was discussed. The UID strategy document describes the idea of linking the UIDAI with money-enabled Direct Benefit Transfer (DBT) to beneficiaries, without any reason or justification for the same. One participant highlighted that the Reserve Bank of India (RBI) believed that making Aadhaar compulsory for e-KYC and several other banking services violated the money-laundering law as well as its own rules and standards; however, it later relaxed the rules to link Aadhaar with bank accounts, accepting Aadhaar for e-KYC with great reluctance as the Department of Revenue thought otherwise. Allowing bank accounts to be opened remotely using Aadhaar, without the customer being physically present, had been considered a dangerous idea. However, the restrictions placed by the RBI were suddenly done away with, and opening bank accounts remotely was enabled via e-KYC.</p>
<p>A speaker emphasised that with emerging FinTech services in India being tied with Aadhaar via India Stack, the following concerns are becoming critical:</p>
<ol><li>With the RBI enabling the creation of bank accounts remotely, it becomes difficult to track who performed the e-KYC and which bank did it, and to hold them accountable.<br /><br /></li>
<li>The Aadhaar Act, 2016 states that the UIDAI will not track the queries made and will only keep a record of the Yes/No authentication response. For example, the e-KYC to open a bank account can now be done with an Aadhaar number and biometric authentication; however, the request itself does not get recorded, and at the time of authentication an individual is simply told whether the request has matched or not by way of a Yes/No <strong>[11]</strong>. Though the UIDAI maintains the authentication record, this may act as an obstacle: if the information does not match what is in the Aadhaar database, the person cannot open a bank account and receives only a Yes/No in response to the request.<br /><br /></li>
<li>Further, there is a concern that the Aadhaar Enabled Payment System being implemented by the National Payments Corporation of India (NPCI) would effectively allow the source and destination of money flows to be hidden, enabling money laundering and bribery. This is possible because NPCI maintains a mapper in which each Aadhaar number is linked to only one bank account, the most recently seeded one, even though an individual's Aadhaar number can be linked to multiple bank accounts. When a transaction is made, the mapper records it only against that one account; if another transaction takes place through a different bank account, that record is not maintained by the NPCI mapper, since it records only transactions of the latest seeded account. This makes money laundering easier, as money now moves from Aadhaar number to Aadhaar number rather than from bank account to bank account.</li></ol>
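The record-keeping gap in the mapper described in point 3 can be illustrated with a toy sketch. This is an assumption-laden illustration of the behaviour the participants described, not NPCI's actual implementation; the class and identifiers are invented for demonstration.

```python
# Illustrative sketch (hypothetical, not NPCI's real system): a mapper that
# links each Aadhaar number to only the most recently seeded bank account,
# silently discarding earlier links.
class AadhaarMapper:
    def __init__(self):
        self._latest = {}  # aadhaar number -> latest seeded bank account

    def seed(self, aadhaar, account):
        # Re-seeding overwrites the previous link; no history is kept.
        self._latest[aadhaar] = account

    def resolve(self, aadhaar):
        # Transactions addressed to an Aadhaar number land in whichever
        # account was seeded last; earlier accounts are invisible here.
        return self._latest.get(aadhaar)

mapper = AadhaarMapper()
mapper.seed("9999-0001", "BANK-A-111")  # first account linked
mapper.seed("9999-0001", "BANK-B-222")  # second account re-seeded later

destination = mapper.resolve("9999-0001")  # only BANK-B-222 is visible
```

Because the mapper retains no trace of `BANK-A-111` once `BANK-B-222` is seeded, transfers routed by Aadhaar number leave no single record tying the number to all the accounts it has used, which is the audit gap flagged above.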
<h2>Endnotes</h2>
<p><strong>[1]</strong> See: <a href="http://cis-india.org/internet-governance/events/uidai-and-welfare-services-exclusion-and-countermeasures-aug-27">http://cis-india.org/internet-governance/events/uidai-and-welfare-services-exclusion-and-countermeasures-aug-27</a>.</p>
<p><strong>[2]</strong> See: <a href="http://cis-india.org/internet-governance/blog/report-on-understanding-aadhaar-and-its-new-challenges">http://cis-india.org/internet-governance/blog/report-on-understanding-aadhaar-and-its-new-challenges</a>.</p>
<p><strong>[3]</strong> See: <a href="https://uidai.gov.in/beta/images/the_aadhaar_act_2016.pdf">https://uidai.gov.in/beta/images/the_aadhaar_act_2016.pdf</a>.</p>
<p><strong>[4]</strong> See: <a href="http://scroll.in/latest/816343/aadhaar-numbers-may-soon-be-compulsory-to-book-railway-tickets">http://scroll.in/latest/816343/aadhaar-numbers-may-soon-be-compulsory-to-book-railway-tickets</a>.</p>
<p><strong>[5]</strong> See: <a href="http://www.thehindu.com/news/national/karnataka/linking-bpl-ration-card-with-aadhaar-made-mandatory/article9094935.ece">http://www.thehindu.com/news/national/karnataka/linking-bpl-ration-card-with-aadhaar-made-mandatory/article9094935.ece</a>.</p>
<p><strong>[6]</strong> See: <a href="http://timesofindia.indiatimes.com/india/After-scam-Bihar-to-link-exams-to-Aadhaar/articleshow/54000108.cms">http://timesofindia.indiatimes.com/india/After-scam-Bihar-to-link-exams-to-Aadhaar/articleshow/54000108.cms</a>.</p>
<p><strong>[7]</strong> See: <a href="http://www.dailypioneer.com/state-editions/cs-calls-for-early-steps-to-link-aadhaar-to-ac.html">http://www.dailypioneer.com/state-editions/cs-calls-for-early-steps-to-link-aadhaar-to-ac.html</a>.</p>
<p><strong>[8]</strong> See: <a href="http://www.cag.gov.in/sites/default/files/audit_report_files/Union_Commercial_Compliance_Full_Report_25_2016_English.pdf">http://www.cag.gov.in/sites/default/files/audit_report_files/Union_Commercial_Compliance_Full_Report_25_2016_English.pdf</a>.</p>
<p><strong>[9]</strong> See: <a href="http://petroleum.nic.in/docs/lpg/LPG%20Control%20Order%20GSR%20718%20dated%2026.09.2011.pdf">http://petroleum.nic.in/docs/lpg/LPG%20Control%20Order%20GSR%20718%20dated%2026.09.2011.pdf</a>.</p>
<p><strong>[10]</strong> See: <a href="http://judis.nic.in/temp/494201232392013p.txt">http://judis.nic.in/temp/494201232392013p.txt</a>.</p>
<p><strong>[11]</strong> Section 8(4) of the Aadhaar Act, 2016 states that "The Authority shall respond to an authentication query with a positive, negative or any other appropriate response sharing such identity information excluding any core biometric information."</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/workshop-report-uidai-and-welfare-services-august-27-2016'>http://editors.cis-india.org/internet-governance/blog/workshop-report-uidai-and-welfare-services-august-27-2016</a>
</p>
Blog Entry, 2019-03-16