
#NAMAprivacy: The economics and business models of IoT and other issues

by Admin — last modified Nov 08, 2017 02:09 AM
On 5th October, MediaNama held a #NAMAprivacy conference in Bangalore focused on Privacy in the context of Artificial Intelligence, Internet of Things (IoT) and the issue of consent, supported by Google, Amazon, Mozilla, ISOC, E2E Networks and Info Edge, with community partners HasGeek and Takshashila Institution.

The original article was published by MediaNama on October 18, 2017.


Part 1 of the notes from the discussion on IoT is here. Part 2:

The session on IoT shifted gears, and the participants spoke more about the economics and business models of IoT. Participants expressed concern that data could be linked to very private aspects of their lives and that businesses could build models around them. For example, data from fitness trackers could be linked to a user's insurance premiums, or sensors in a car that monitor a user's driving behaviour could be linked to motor insurance.

“I work for Zoomcar, and these are devices on which our lives depend, and they are collecting and reporting data. And that data can be used against you. So it is very hard to know what is fair and what is unfair. Someone mentioned insurance; I feel it is useful to collect a lot of data and decide on insurance based on your driving behaviour, and we have had markers for that. But is it fair to the user? The same kind of question crops up elsewhere, like in the US when it comes to healthcare,” Vinayak Hegde said.

An audience member pointed out that in such a scenario, privacy can help businesses rather than inhibit them, and cited a research study at UC Berkeley. “If I use a health tracking device, some of those devices can be valuable for health insurance companies, and using that data, they might increase the premiums. But I don’t know actually who might sell my data to someone,” he explained.

“Because I don’t know which tracking devices sell my data, I would rather not own the devices at all. That itself harms the entire health tracker industry. He (the researcher) defines privacy as contextual integrity. So a health tracking device is supposed to help me track my health, and is not supposed to be used by insurance people to determine my premium. If regulation mandates that contextual integrity, it helps that particular industry to avoid those feedback loops,” he explained.

Are fitness trackers in the hardware or services business?

Kiran Jonnalagadda of HasGeek added to the point on fitness trackers. He said that IoT companies are not in the hardware business; they are in the services business. “I had an unusual experience for the past one week: I was out in an area with no Internet connection. But I have two fitness trackers. I bought them mostly because I’m curious about how these companies operate and what they’re doing. And the difference between them is the way they think about things. Now both of these are capable of counting steps without an Internet connection… But they cannot show the step count on the phone they connect to until the data is sent to the Internet and brought back. So my phone would keep telling me that I am not moving and telling me to move, but the watch is saying that I am doing 20,000 steps a day and that I am trekking a lot,” he explained.

“For whatever reason, these companies have decided to operate in this manner where validation of data happens on the cloud and not on the device. You only get the most rudimentary data from your device and your phone is just a conduit and not a processing centre at all,” Jonnalagadda said.

He explained that both companies were in the device sales business and had not asked him for money for enabling this sort of Internet-based processing of data. “It calls into question: what is the model here? One could bring up the conspiracy theory that they’re selling my data and therefore they don’t worry about the cost of collecting data from me. The second is to be a little more charitable and say they recognize that if they piss me off, I won’t buy their device again. And then just assume that a device has a lifetime of just 2-3 years, and if you keep a person happy for 2-3 years, they will buy the next device from you again. What’s interesting is that ultimately it is not about devices; it is about services. And this is what I want to say about IoT: it is not about hardware at all, it is entirely about services. Without services, the entire business model of IoT breaks down. You do not get software updates, you get vulnerabilities, you get broken design, things stop working, and no one supports you.”

The economics of processing data locally on a device

Thejesh GN, co-founder of DataMeet, questioned the need for data to be processed on the Internet and asked whether the data would be better protected and have better privacy if it were processed locally. “Consider the fact that we have such powerful phones which are affordable and can do a lot of things without the Internet. I mean, the biggest constraint we had in IoT was that we didn’t have CPU, memory, or processing power. Given that, and the availability of edge devices, how long will we have economic cases where privacy can be sold as part of IoT? The processing can happen locally without the Internet 99% of the time, and requires the Internet only when there is messaging. This could be true for your fitness trackers that connect to your phone. Your phone has all the capabilities to do all the analysis and doesn’t need to go to the server,” he said.
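To illustrate the point that this kind of analysis needs no server round-trip, here is a minimal sketch of on-device step counting from raw accelerometer readings. This is an illustrative toy (a crude threshold-crossing peak detector), not any vendor's actual algorithm; the threshold and the sample trace are assumptions.

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps as upward crossings of an acceleration-magnitude threshold.

    samples: list of (x, y, z) accelerometer readings in m/s^2.
    A step is registered each time the magnitude rises above `threshold`
    after having been below it -- a crude peak detector.
    """
    steps = 0
    below = True  # whether the previous magnitude was below the threshold
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if below and mag > threshold:
            steps += 1
            below = False
        elif mag <= threshold:
            below = True
    return steps

# Synthetic trace: resting gravity (~9.8 m/s^2) with three impact spikes.
trace = [(0, 0, 9.8), (0, 0, 12.5), (0, 0, 9.8),
         (0, 0, 13.0), (0, 0, 9.7), (0, 0, 12.2), (0, 0, 9.8)]
print(count_steps(trace))  # -> 3
```

Everything here runs on the device itself; the network would only be needed to sync or share the resulting count, which is exactly the architecture being argued for.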

Pranesh Prakash of CIS countered him, saying that the economics of processing data in the cloud work out cheaper for companies.

“One, on local processing: this, I think, is a perennial problem, and it really is a question of economics versus principles. Free software is losing the battle against using other people’s computers for computing—cloud computing—because of economics. So you no longer own the software that you purchase, and even the hardware, very often, with IoT, might not actually be yours. It might come with a license; it might come with data that is tied to the company that is providing you the device. So the economics of this are, for me, clear: it’s much cheaper to do it on other computers than to do it locally,” Prakash explained.

He made a case for asserting individual users’ rights to privacy in this kind of scenario. “It is a question of principles. Should we allow for that, or should we use consumer protection laws and other kinds of laws to say that people who are purchasing devices ought to have greater control of the devices and the data that they produce?” he added.

Group privacy

The audience also suggested that privacy laws should not just look at protecting the rights of individuals, but should look at protecting the rights of groups as well. They raised concerns that even anonymized group data can be weaponized and cause harm.

“For example, if there are 10-15 of us in this room and given our detailed medical histories, I can find a correlation between some of that. And then I can use that data in some other form when I run a test to see if I am vulnerable to something or use it as a way to discriminate further down the line. As a group, privacy matters a lot because when we talk about devices, we are talking about individuals. Maybe you can target via ethnicity or by age or by class and that can also be weaponized,” an audience member suggested.

Vinayak Hegde gave an example of how weather data captured by IoT could cause harm to society at large. “If I’m using the weather sensor data, and because of global warming some places like Florida and the south of India are going to be extremely hot, I can use surge pricing for a person’s electricity. Again, I am not being targeted as an individual, but as a group, I am being targeted. And sensors are closing that loop really fast,” he explained.

Srinivas P, head of security at Infosys, gave another example, where gyroscopes in a phone could target a family. “Some companies in the US use the gyroscope in a phone to surreptitiously monitor TV viewing habits. The mobile phone gets activated, and over a period of time they can tweak the advertisements. It is an interesting example, because with TV, when you watch at home, you cannot pinpoint a user, because the TV is shared by a family. The person who watches the maximum amount of TV has their data circulated, and the ads will be tailored to them. The person who does not watch that much TV is baffled to see advertisements that are not relevant to them. So when you process the data, you assume that the TV belongs to one user, but the TV belongs to a group. And what if the viewing habits are so different that, once your privacy is violated, you don’t want your other family members to know what you are watching?” he added.

Perception of permissions for sensors

Rohini Lakshané of CIS raised an important point during the discussion: users have different perceptions about the sensors embedded in smartphones. She pointed out that users are generally unaware that accelerometers are sensors that capture data, and that most apps do not ask permission to use them. An accelerometer measures acceleration forces and is typically used to detect movement and vibration in devices such as fitness trackers.

“A researcher surveyed a control group and asked them whether they would be comfortable if their GPS data were taken and their camera made accessible. They were hugely uncomfortable. When the question came to the accelerometer on the phone, the respondents said, ‘we are not all that afraid.’ The accelerometer only measures acceleration. So the app which counts how many steps we have taken in a day uses the accelerometer, and there is no permission required for it. The accelerometer is still on the phone and is still generating data, and you don’t see it because you don’t interface with it directly,” she commented.

*

#NAMAprivacy Bangalore:

  • Will Artificial Intelligence and Machine Learning kill privacy? [read]
  • Regulating Artificial Intelligence algorithms [read]
  • Data standards for IoT and home automation systems [read]
  • The economics and business models of IoT and other issues [read]

#NAMAprivacy Delhi:

  • Blockchains and the role of differential privacy [read]
  • Setting up purpose limitation for data collected by companies [read]
  • The role of app ecosystems and nature of permissions in data collection [read]
  • Rights-based approach vs rules-based approach to data collection [read]
  • Data colonisation and regulating cross border data flows [read]
  • Challenges with consent; the Right to Privacy judgment [read]
  • Consent and the need for a data protection regulator [read]
  • Making consent work in India [read]