
The dark side of future tech: Where are we headed on privacy, security, truth?

2018 Year-End Special: We now live in a time when devices listen, chips track your choices, and governments can watch from behind a barcode. How do we navigate this world?

The article by Dipanjan Sinha was published in the Hindustan Times on December 29, 2018. Pranesh Prakash was quoted.


“One of the definitions of sanity is the ability to tell real from unreal. Soon we’ll need a new definition,” Alvin Toffler, author of the 1970 bestseller Future Shock, once said.

Privacy. Security. Freedom. Democracy. History. News — the lines between the real and unreal are blurring in each of these fields.

Fake news is helping decide elections; history is being rewritten as it happens; rumour has become identical in look, feel and distribution to the actual news.

Devices that listen, governments that watch you from behind a barcode, chips that track where you go, what you eat, how you feel — these used to be the stuff of dystopian novels.

In April, the world learnt of the Chinese government’s social credit system, a programme currently in the works that would employ private technology platforms and local councils to use personal data to assign a social score to every registered citizen.

Behave as the state wants you to, and you could get cheaper loans, easier access to education; it’s unclear what the consequences could be for those who do the opposite, but discredits are likely for bad behaviours that range from smoking in non-smoking zones to buying ‘too many’ video games and being critical of the government.

We’ve seen this before — totalitarian governments where the individual is under constant surveillance by a state that pretends this is for the greater good. But the last time we came across it, it was fiction — George Orwell’s 1984, set in a superstate where thought police took their orders from a totalitarian leader with a friendly name, Big Brother.

 

CATCH-22

“Just because you’re paranoid doesn’t mean they aren’t out to get you,” Joseph Heller wrote in Catch-22, a novel so layered that you’re never sure which bits are true. Who gets access to the data your phone collects? What is the government watching for, once it has assigned citizens unique IDs?

It feels good to be able to criticise China, still something of an anomaly in a global community that is largely democratic and free-market, but the UK had a National Identity Cards Act from 2006 to 2010; India has the Aadhaar project; Brazil has had the National Civil Identification document since 2017; Germany has had a national identity card since 2010; and Colombia has had one since 2013.

They’re collecting biometric data, assigning numbers to citizens and building national registers — with not much word on what’s in them, who has access, or how secure they are.

“To ask what the risk is with accumulating such big data is like asking what the risk is with computers. They are both embedded in our lives,” says Pranesh Prakash, a fellow at the thinktank Centre for Internet and Society.

Security is just the base layer in the pyramid of risks. There is also the risk of discrimination — whether in terms of benefits, employment, or something like marriage, Prakash says. There is the risk of bad data leading to worse discrimination; there is the risk of public profiling.

“The question here is about transparency,” Prakash says. “The questions of what the data contains, who it is accessed by or sold to, how much of it there is, and what the purpose is of collecting it — need to be clearly answered.”

OPERATION THEATRE

New questions are being asked in the field of medicine as well. Where do you draw the line on designer babies? Should parents get to edit the genes of their child-to-be? How much ought we to tinker — do you stop at mutations, or go on to decide hair colour and intellect?

As it becomes cheaper and easier to sequence DNA, the questions over the next steps — of interpreting and analysing the data — will become more complex, says K VijayRaghavan, principal scientific adviser to the government of India, and former director of the National Centre for Biological Sciences. “From here on, with the data deluge, deciding what and how to do it will become fiendishly complex. Especially as commercial interests become involved.”

We have rules and laws for the use of DNA information in research, but corresponding laws that regulate how one can use personal whole genome information in the public space are still being framed. “The data-privacy discussion will soon get to the genomic-data space,” VijayRaghavan says. “Data sharing is needed for patients to benefit. Yet data privacy is needed to prevent exploitative use. It’s a conundrum, and there are no easy answers.”