The Centre for Internet and Society
http://editors.cis-india.org
You auto-complete me: romancing the bot
http://editors.cis-india.org/raw/maya-indira-ganesh-you-auto-complete-me-romancing-the-bot
<b>This is an excerpt from an essay by Maya Indira Ganesh, written for and published as part of the Bodies of Evidence collection of Deep Dives. The Bodies of Evidence collection, edited by Bishakha Datta and Richa Kaul Padte, is a collaboration between Point of View and the Centre for Internet and Society, undertaken as part of the Big Data for Development Network supported by International Development Research Centre, Canada. </b>
<p> </p>
<h4>Please read the full essay on Deep Dives: <a href="https://deepdives.in/you-auto-complete-me-romancing-the-bot-f2f16613fec8" target="_blank">You auto-complete me: romancing the bot</a></h4>
<h4>Maya Indira Ganesh: <a href="https://bodyofwork.in/" target="_blank">Website</a> and <a href="https://twitter.com/mayameme" target="_blank">Twitter</a></h4>
<hr />
<p>I feel like Kismet the Robot.</p>
<p>Kismet is a flappy-eared animatronic head with oversized eyeballs and bushy eyebrows. Connected to cameras and sensors, it exhibits the six primary human emotions identified by psychologist Paul Ekman: happiness, sadness, disgust, surprise, anger, and fear.</p>
<p>Scholar Katherine Hayles says that Kismet was built as an ‘ecological whole’ to respond to both humans and the environment. ‘The community,’ she writes, ‘understood as the robot plus its human interlocutors, is greater than the sum of its parts, because the robot’s design and programming have been created to optimise interactions with humans.’</p>
<p>In other words, Kismet may have ‘social intelligence’.</p>
<p>Kismet’s creator Cynthia Breazeal explains this through a telling example. If someone comes too close to it, Kismet retracts its head, as if to suggest that its personal space is being violated or that it is shy. In reality, it is simply adjusting its camera so that it can properly see whatever is in front of it. But it is the human interacting with Kismet who interprets this retraction as the robot backing away to protect its personal space. Breazeal says, ‘Human interpretation and response make the robot’s actions more meaningful than they otherwise would be.’</p>
<p>In other words, humans interpret Kismet’s social intelligence as ‘emotional intelligence’...</p>
<p>Kismet was built at the start of a new field called affective computing, which is now branded as ‘emotion AI’. Affective computing analyses human facial expressions, gait, and stance, and maps them onto emotional states. Here is what Affectiva, one of the companies developing this technology, says about how it works:</p>
<p>‘Humans use a lot of non-verbal cues, such as facial expressions, gesture, body language and tone of voice, to communicate their emotions. Our vision is to develop Emotion AI that can detect emotion just the way humans do. Our technology first identifies a human face in real time or in an image or video. Computer vision algorithms then identify key landmarks on the face…[and] deep learning algorithms analyse pixels in those regions to classify facial expressions. Combinations of these facial expressions are then mapped to emotions.’</p>
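<p>The final stage of the pipeline Affectiva describes, where combinations of classified facial expressions are mapped to emotions, can be caricatured in a few lines of Python. This is a toy rule-based sketch for illustration only: the expression labels and the lookup table are invented, and real systems classify expressions with deep learning rather than a dictionary.</p>

```python
# Toy sketch of the last step of an 'emotion AI' pipeline:
# combinations of detected facial expressions are mapped to one of
# Ekman's six primary emotions. In a real system the expressions
# would come from computer-vision models; here they arrive as labels.

# Invented mapping from expression combinations to emotion labels.
EXPRESSION_TO_EMOTION = {
    frozenset({"smile", "cheek_raise"}): "happiness",
    frozenset({"brow_lower", "lip_press"}): "anger",
    frozenset({"brow_raise", "jaw_drop"}): "surprise",
    frozenset({"nose_wrinkle", "upper_lip_raise"}): "disgust",
    frozenset({"brow_raise", "lid_tighten"}): "fear",
    frozenset({"inner_brow_raise", "lip_corner_drop"}): "sadness",
}

def classify_emotion(expressions):
    """Map a set of detected facial expressions to an emotion label."""
    return EXPRESSION_TO_EMOTION.get(frozenset(expressions), "neutral")

print(classify_emotion(["smile", "cheek_raise"]))    # happiness
print(classify_emotion(["brow_raise", "jaw_drop"]))  # surprise
```

<p>The sketch makes the essay’s point concrete: whatever ‘emotion’ such a system reports is only a label assigned to a pattern of facial movements, not the inner state itself.</p>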
<p>But there is also a more sinister aspect to this digitised love-fest. Our faces, voices, and selfies are being used to collect data to train future bots to be more realistic. There is an entire industry of Emotion AI that harvests human emotional data to build technologies that we are supposed to enjoy because they appear more human. But it often comes down to a question of social control, because the same emotional data is used to track, monitor and regulate our own emotions and behaviours...</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/maya-indira-ganesh-you-auto-complete-me-romancing-the-bot'>http://editors.cis-india.org/raw/maya-indira-ganesh-you-auto-complete-me-romancing-the-bot</a>
</p>
Data bleeding everywhere: a story of period trackers
http://editors.cis-india.org/raw/sadaf-khan-data-bleeding-everywhere-a-story-of-period-trackers
<b>This is an excerpt from an essay by Sadaf Khan, written for and published as part of the Bodies of Evidence collection of Deep Dives. The Bodies of Evidence collection, edited by Bishakha Datta and Richa Kaul Padte, is a collaboration between Point of View and the Centre for Internet and Society, undertaken as part of the Big Data for Development Network supported by International Development Research Centre, Canada.</b>
<p> </p>
<h4>Please read the full essay on Deep Dives: <a href="https://deepdives.in/data-bleeding-everywhere-a-story-of-period-trackers-8766dc6a1e00" target="_blank">Data bleeding everywhere: a story of period trackers</a></h4>
<h4>Sadaf Khan: <a href="http://mediamatters.pk/the-team/" target="_blank">Media Matters for Democracy</a> and <a href="https://twitter.com/nuqsh" target="_blank">Twitter</a></h4>
<hr />
<p>...By now there are a number of questions buzzing around my head, most of them unasked. Are users comfortable with so much of their data being collected? Are there really algorithms that string together all this data into medically relevant trends? How reliable can these trends be when usage is erratic? Are period tracking apps pioneering fundamental elements of a future where medical aid is digital and reliable data is inevitably linked to the provision of medical services? And if so, are privacy and health soon to become conflicting rights?</p>
<p>I also want to find out how users understand data collection and privacy before giving apps consent to use their data and information as they will. Hareem says she gives apps informed consent. ‘If my data becomes a part of the statistics aiding medical research, why not? There is no harm in it. I am getting a good service, and if my data helps create a better understanding as a part of a larger statistical pool, they are welcome to use it.’</p>
<p>But is she really sure that this information will be used only as anonymised data for medical research? ‘Look at the kind of information that is being collected,’ she answers. ‘Dates, mood, consistency of mucus, basal temperature. What kind of use does one have for this data?’</p>
<p>Naila, in turn, says: ‘Honestly, I have never really thought about what happens to the data the application collects. Obviously I enter detailed information about my cycle and my moods and my sex life. But a), my account is under a fake name and b), even if it wasn’t, who would have any use for stuff like when my period starts and ends and what my mood or digestive system is like at any given moment?’</p>
<p>In fact, this sentiment is shared among all the women interviewed for this piece — what use would anyone have for this data?</p>
<p>As users, we often imagine our own data as anonymised within a huge dataset. But as users, we don’t have enough information about how our data is being used — or will be used in future. The open and at times vague language of a platform’s terms and conditions allows menstrual apps to use data in ways that I may not know of. Some apps continue to hold customer data even after an account is deleted. Even though I may technically ‘agree’ to the terms and conditions, is this fully informed consent?</p>
<p>One of the big concerns around this kind of medical information being collected is the potential for collaborations with big pharmaceutical companies and other health service providers. With apps sitting on a goldmine of users’ fertility and health information, health service providers might mine their data for potential consumers and reach out to them directly. While this resembles any targeted marketing campaign, the fact that the advertiser is likely to be offering medical services to women who are struggling with infertility, and are at their most vulnerable, raises altogether different ethical concerns.</p>
<p>And these apps and their businesses might grow in directions that users haven’t taken into consideration. Take Ovia’s feature that lets companies buy premium services for their employees. While it is packaged as a gesture of goodwill, it also means that an employer has access to extremely private and intimate medical information about their women employees. And while the dataset is anonymised, it is still possible to figure out the identity of users based on specific information. For example, how many women in any company are pregnant at any given time?...</p>
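<p>The re-identification risk is easy to demonstrate. The sketch below is a hypothetical example, with all names and numbers invented: an ‘anonymised’ count of pregnancies per team, combined with an ordinary staff directory, narrows the statistic down to a handful of people, and identifies someone exactly whenever the group is small enough.</p>

```python
# Hypothetical illustration of re-identification from 'anonymised'
# aggregate data. A pregnancies-per-team count, joined with a staff
# directory the employer already has, reveals who it refers to.

# 'Anonymised' aggregate shared with an employer (invented numbers).
pregnancies_by_team = {"Design": 1, "Engineering": 0, "Legal": 1}

# Ordinary staff directory (invented names).
directory = [
    {"name": "A. Khan", "team": "Design"},
    {"name": "B. Rahman", "team": "Design"},
    {"name": "C. Datta", "team": "Engineering"},
    {"name": "D. Sen", "team": "Legal"},
]

def candidates(team):
    """Everyone an aggregate statistic about `team` could refer to."""
    return [p["name"] for p in directory if p["team"] == team]

for team, count in pregnancies_by_team.items():
    if count:
        pool = candidates(team)
        print(f"{team}: {count} pregnancy among {len(pool)} staff: {pool}")
# Legal has a single member, so the 'anonymised' count identifies her.
```

<p>No individual record was ever released; the breach comes purely from combining a small anonymity set with data that is already public within the company.</p>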
<p>Pregnant a year after my miscarriage, I initially downloaded multiple apps in a bid to find a good fit. I don’t know which one of these was in communication with Facebook. But almost immediately, my Facebook timeline was littered with ads for baby stuff — clothes, shoes, bibs, prams, cribs, ointments for stretch marks, maternity wear, the works.</p>
<p>It makes me think of those old-school clockwork-style videos. You drop a ball and off it goes: making dominoes fall, knocking over pots and pans, setting in motion absurd, synchronised mechanisms. Similarly, I drop my data and watch it hurtle into my life, on to other platforms, off to vendors. Maybe to stalkers? To employers? Who knows.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/sadaf-khan-data-bleeding-everywhere-a-story-of-period-trackers'>http://editors.cis-india.org/raw/sadaf-khan-data-bleeding-everywhere-a-story-of-period-trackers</a>
</p>
Can data ever know who we really are?
http://editors.cis-india.org/raw/zara-rahman-can-data-ever-know-who-we-really-are
<b>This is an excerpt from an essay by Zara Rahman, written for and published as part of the Bodies of Evidence collection of Deep Dives. The Bodies of Evidence collection, edited by Bishakha Datta and Richa Kaul Padte, is a collaboration between Point of View and the Centre for Internet and Society, undertaken as part of the Big Data for Development Network supported by International Development Research Centre, Canada.</b>
<p> </p>
<h4>Please read the full essay on Deep Dives: <a href="https://deepdives.in/can-data-ever-know-who-we-really-are-a0dbfb5a87a0" target="_blank">Can data ever know who we really are?</a></h4>
<h4>Zara Rahman: <a href="https://www.theengineroom.org/people/zara-rahman/" target="_blank">The Engine Room</a>, <a href="https://zararah.net/" target="_blank">Website</a>, and <a href="https://twitter.com/zararah" target="_blank">Twitter</a></h4>
<hr />
<blockquote>If I didn’t define myself for myself, I would be crunched into other people’s fantasies for me and eaten alive.<br /><em>– <a href="https://www.blackpast.org/african-american-history/1982-audre-lorde-learning-60s/" target="_blank">Audre Lorde</a></em></blockquote>
<p>Digital data, and the technologies that allow us to gather it, can be used in another way too — to allow us to define for ourselves who we are, and what we are.</p>
<p>Amidst a growing political climate of fear, mistrust and competition for resources, activists and advocates working in areas that are stigmatised within their societies often need data to ‘prove’ that what they are working on matters. One way of doing this is by gathering data through crowdsourcing. Crowdsourced data isn’t ‘representative’, as statisticians say, but gathering data through unofficial means can be a valuable asset for advocates. For example, <a href="http://readytoreport.in/" target="_blank">data collating the experiences of women</a> who have reported incidents of sexual violence to the police in India, can then be used to advocate for better police responses, and to inform women of their rights. Deservedly or not, quantifiable data takes precedence over personal histories and lived experience in getting the much-desired currency of attention.</p>
<p>And used right, quantifiable data — whether it’s crowdsourced or not — can also be a powerful tool for advocates. Now, we can use quantifiable data to prove beyond a shadow of a doubt that disabled people, queer people, and people from lower castes face intersecting discrimination, prejudice, and systemic injustices in their lives. It’s an unnecessary repetition in a way, because anybody from those communities could have told reams upon reams of stories about discrimination — all without any need for counting.</p>
<p>Regardless, to play within this increasingly digitised system, we need to repeat what we’ve been saying in a new, digitally-legible way. And to do that, we need to collect data from people who have often only ever been de-humanised as data subjects.</p>
<p>Artist and educator Mimi Onuoha writes about <a href="https://points.datasociety.net/the-point-of-collection-8ee44ad7c2fa#.y0xtfxi2p" target="_blank">the challenges that arise while collecting such data</a>, from acknowledging the humans behind that collection to understanding that missing data points might tell just as much of a story as the data that has been collected. She outlines how digital data means that we have to (intentionally or not) make certain choices about what we value. And the collection of this data means making human choices solid, and often (though not always) making these choices illegible to others.</p>
<p>We speak of black boxes when it comes to <a href="https://www.propublica.org/article/breaking-the-black-box-what-facebook-knows-about-you" target="_blank">the mystery choices that algorithms make</a>, but the same could be said of the many human decisions that are made in categorising data too, whether that be choosing to limit the gender drop-down field to just ‘male/female’ as with Fitbits, or a variety of apps incorrectly assuming that all people who menstruate <a href="https://medium.com/@maggied/i-tried-tracking-my-period-and-it-was-even-worse-than-i-could-have-imagined-bb46f869f45" target="_blank">also want to know about their ‘fertile window’</a>. In large systems with many humans and machines at work, we have no way of interrogating why a category was merged or not, of understanding why certain anomalies were ignored rather than incorporated, or of questioning why certain assumptions were made.</p>
<p>The only thing we can do is to acknowledge these limitations, and try to use those very systems to our advantage, building our own alternatives or workarounds, collecting our own data, and using the data that is out there to tell the stories that matter to us.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/zara-rahman-can-data-ever-know-who-we-really-are'>http://editors.cis-india.org/raw/zara-rahman-can-data-ever-know-who-we-really-are</a>
</p>