The Centre for Internet and Society
http://editors.cis-india.org
A.I. Hype Cycles and Artistic Subversions
http://editors.cis-india.org/raw/ai-hype-cycles-and-artistic-subversions
<b>Gene Kogan will give a talk on "A.I. hype cycles and artistic subversions" on Friday, January 22, 2016 at the Centre for Internet and Society office, 6 pm - 8 pm.</b>
<p><img src="http://www.genekogan.com/images/style-transfer/ml_egypt_crab_maps.jpg" alt="Gene Kogan - Style Transfer - Mona Lisa" width="800" /></p>
<h6>Mona Lisa restyled by Egyptian hieroglyphs, the Crab Nebula, and Google Maps. <a href="http://www.genekogan.com/works/style-transfer.html">Style Transfer</a>. Gene Kogan.</h6>
<p style="text-align: justify;">Recent years have seen a resurgence of popular interest in machine learning and artificial intelligence, as emerging methods have set new scientific benchmarks and introduced classes of neural networks capable of imitating human behavior, among other impressive feats. More importantly, the study of these algorithms is rapidly crossing over into mainstream culture and industry as AI applications begin to inhabit more of our daily lives. Numerous initiatives have appeared that attempt to demystify these previously obscure research tracks and make them more accessible to the public. Open-source software like Torch, Theano, and TensorFlow has equipped amateurs with the same tools that are achieving state-of-the-art results in industry and academia.</p>
<p style="text-align: justify;">This talk will examine the most recent wave of artistic projects applying these methods in various cultural contexts, producing troves of machine-hallucinated text, images, sounds, and videos that demonstrate a previously unseen capacity for imitating human style and sensibility. These experimental works attempt to show that such machines can produce aesthetically meaningful media, while also challenging and subverting them to illuminate their most obscure and counterintuitive properties.</p>
<p>A recent article by the speaker about this: <a href="http://bit.ly/1OhFcQr">From Pixels to Paragraphs: How artistic experiments with deep learning guard us from hype</a>.</p>
<p>Relevant projects by the speaker that will be presented include: <a href="http://bit.ly/1RyUH76">Style Transfer</a>, <a href="http://bit.ly/1QDNxOI">A Book from the Sky 天书</a>, <a href="http://bit.ly/1QDNClo">Learning to Generate Text and Audio</a>, and <a href="http://bit.ly/1QDNG4D">Deepdream Prototypes</a>.</p>
<h2>Gene Kogan</h2>
<p style="text-align: justify;">Gene Kogan is an artist and programmer who is interested in generative systems and applications of emerging technology in artistic and expressive contexts. He writes code for live music, performance, and visual art. He contributes to numerous open-source software projects and frequently gives workshops and demonstrations on topics related to code and art.</p>
<p style="text-align: justify;">He is a contributor to openFrameworks, Processing, and p5.js, an adjunct professor at Bennington College and NYU, a former resident at Eyebeam Art & Technology Center, and a former Fulbright scholar in Bangalore, India, 2012-2013.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/ai-hype-cycles-and-artistic-subversions'>http://editors.cis-india.org/raw/ai-hype-cycles-and-artistic-subversions</a>
</p>
No publisher | sharath | Generative Art, Art, Practice, Machine Learning, Researchers at Work, Event, Artificial Intelligence | 2016-01-01T07:52:20Z | Event

Transformaking 2015: International Summit on Critical and Transformative Making, Yogyakarta
http://editors.cis-india.org/a2k/blogs/transformaking-2015-international-summit-on-critical-and-transformative-making-yogyakarta
<b>Transformaking 2015 brought together makers, scientists, hackers, bricoleurs, researchers, artists, designers and other interdisciplinary practitioners from across the globe in a series of residency and research programs, a symposium, an exhibition, a fair, and satellite projects. It was held from August 10 to September 20, 2015. Transformaking 2015 was organized by HONF Foundation & CATEC (Culture Arts Technology Empowerment Community) in partnership with the Centre for Internet & Society (CIS), Common Room, Crosslab, and Nicelab.</b>
<p style="text-align: justify;">More information on the event can be accessed on its <a class="external-link" href="http://transformaking.org/opencall">website</a>. I presented a talk, <a class="external-link" href="http://transformaking.org/program/symposium">Open Spectrum and Open Science – Policy and Future Opportunities</a>. I was also a speaker on the panel <a class="external-link" href="http://transformaking.org/program/symposium">Encouraging Innovations through Communication and Open Source Culture</a>, with fellow panelists Tom Rowlands (Future Everything), Gustav Hariman (Common Room, Bandung) and Colette Tron (Alphabetville), moderated by Sachet Manandhar of Karkhana Labs, Nepal.</p>
<hr />
<p style="text-align: justify;">As with many other societies, Indonesia has a distinct maker culture that goes back centuries. The rise of collective movements in network culture following the digital revolution — with associated terms such as DIY (do-it-yourself), DIWO (do-it-with-others), open source, and maker and hacker spaces — only reinvigorates and replicates traditional production practices at the grass-roots level: verbal passing of knowledge, both vertical (between generations) and horizontal (among community members); voluntary communal division of labour; and inventiveness to overcome limited infrastructure, driven by the need to find solutions for a better life rather than personal profit. Our forefathers were the genuine makers.<br /><br />The burgeoning maker movement has been receiving growing recognition as it demonstrates great potential to address concerns and provide innovative solutions at a local, citizen level where established socio-political systems fail. As makers and the associated maker culture come into contact with large industries, they run the risk of being reduced to commodities. A critical attitude is essential to keep the maker movement genuine, with lessons from our forefathers in mind, and to catalyze practices that create solutions and sustainable implementations in a process of transformative making — or Transformaking.</p>
<p style="text-align: justify;">The summit aimed to:</p>
<ul>
<li>Create a forum for all stakeholders to discuss views, practices, questions, and issues in the realm of critical making movement</li>
<li>Exhibit projects that create tangible, transformative solutions at a citizen level</li>
<li>Produce usable tools and define dissemination strategies for catalyzing local transformations globally</li></ul>
<hr />
<p>The following is a note on the Conception of the Summit:</p>
<h3>Conception of the Summit - Why 'Transformaking'?</h3>
<p style="text-align: justify;">The act of making is not new; it has been an ongoing process across centuries of humankind, from Neanderthal tools, the wheel, and cultural artifacts and practices to the modern-day space shuttle and modes of communication. Today’s networked knowledge society is catalyzing and affecting the process of making and knowledge production in interesting ways by mediating instantaneous access, dissemination and sharing of information among people across vast distances.</p>
<p style="text-align: justify;"><img src="http://editors.cis-india.org/home-images/Transformaking.png/@@images/c5d0eac0-51db-4a42-a514-286e593c1c32.png" alt="null" class="image-inline" title="Transformaking" /></p>
<p style="text-align: justify;">The notion of free labour accompanying rising participation in the gift economy of network culture is loaded with terms such as <em>DIY, Open Knowledge, Open Data, Free & Open Source</em>, which blur the lines of distinction between production & consumption and between labour & cultural expression, and which have transcended both the puritan new-left movement on one hand and neo-liberal free-market ideology on the other.</p>
<p style="text-align: justify;">There has evidently been a marked shift in the site of labour, from the factory to society, which autonomists have called ‘the social factory’ and which challenges the very notion of capitalism from the inside. In Pierre Lévy’s words, a shift from the Cartesian model of thought based on the singular cogito (I think) to a collective or plural cogitamus (we think) seems to be the unifying goal represented by various models and spaces for thinking, such as maker cultures, think tanks, maker movements, maker labs and hacker/maker spaces.</p>
<p style="text-align: justify;">This change in the process of making and knowledge production is further underlined by contextualized maker activity geared towards fueling change, thereby challenging traditional modes of production and consumption, creative and cultural expression, structures of societal organization, ownership, access, intellectual property and copyright regimes, models of participative democracy, citizen science and civic governance in a process of Transformative Making or –what we call – ‘<strong>Transformaking</strong>’.</p>
<p><strong>Transformaking: The International Summit on Critical and Transformative Making 2015</strong> shall bring together makers, hackers, bricoleurs, educators, researchers, theorists, artists and designers to:</p>
<ul style="text-align: justify;">
<li>Convene a symposium to self-reflect, debate and put forth views with regard to their respective practices, and to dissect the complexities and questions that surround critical and transformative making.</li>
<li>Mount an exhibition on critical making, featuring completed and contextualized projects and productions.</li>
<li>Produce a tangible outcome of the first international summit that collates diverse views, practices and usable tools, along with strategizing modes of academic publication and dissemination for furthering meaningful local transformations, globally.</li></ul>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/transformaking-2015-international-summit-on-critical-and-transformative-making-yogyakarta'>http://editors.cis-india.org/a2k/blogs/transformaking-2015-international-summit-on-critical-and-transformative-making-yogyakarta</a>
</p>
No publisher | sharath | Openness, Access to Knowledge | 2016-06-18T18:00:08Z | Blog Entry

NASA International Open Data Challenge 2015
http://editors.cis-india.org/openness/events/nasa-international-open-data-challenge-2015
<b>As part of the initial NASA Open Government 2.0 plan, the NASA International Open Data Challenge brings together the FOSS community, citizen scientists, open data practitioners, open hardware enthusiasts and students for collaborative problem solving, with the goal of producing relevant open-source solutions that address global needs applicable to both life on Earth and life in space.</b>
<p style="text-align: justify; ">On April 11 and 12, 2015, the event will be organized by the Centre for Internet and Society in collaboration with mentors from Team Indus, one of India's leading space technology start-ups. The event will start off with the following keynote and workshops at 9 a.m. on Saturday, April 11, 2015:</p>
<div style="text-align: justify; "><b>Pre-Hackathon Workshop: 9 a.m., Saturday, April 11, 2015</b></div>
<div style="text-align: justify; ">IBM Blue Mix Team + OpenCube Labs</div>
<div style="text-align: justify; ">(Big Data Analytics using Cloud Services like Bluemix/Heroku, with node.js implementation and Android APIs)</div>
<div style="text-align: justify; ">
<div><b>Keynote: Amar Sharma, 12 p.m., April 11, 2015</b></div>
<div>Amar is credited as the youngest and first Indian amateur astronomer to have an asteroid named after him <b>(380607 Sharma)</b>, in 2014, at the age of 29. He will talk about CCD astro-imaging and his travails and journey as a self-made astronomer and comet hunter.</div>
<div>We will then break off into teams to commence the hackathon, which will end on Sunday, April 12, 2015 at 6 p.m., after which teams will upload and present their solutions for local and global nominations.</div>
<div>Registration is free, and you are required to confirm participation at the link below:</div>
<div><a href="https://2015.spaceappschallenge.org/location/bangalore/">https://2015.spaceappschallenge.org/location/bangalore/</a></div>
</div>
<div style="text-align: justify; ">Participants are requested to bring their own laptop/computing devices.</div>
<hr />
<div style="text-align: justify; ">Please see last year's event, which focused on open science and big data, and the various open data solutions developed at CIS, to get an idea of what the event is about:</div>
<div style="text-align: justify; "><a href="https://2014.spaceappschallenge.org/location/bangalore/">https://2014.spaceappschallenge.org/location/bangalore/</a> This year, we will have a workshop on big data analytics conducted by IBM Bluemix Labs, followed by sessions on Heroku implementation and Android programming by friends of CIS from OpenCube Labs. These will follow a very special keynote by the first Indian amateur astronomer to have an asteroid named after him, joining the likes of Ramanujan and Vikram Sarabhai.</div>
<p>
For more details visit <a href='http://editors.cis-india.org/openness/events/nasa-international-open-data-challenge-2015'>http://editors.cis-india.org/openness/events/nasa-international-open-data-challenge-2015</a>
</p>
No publisher | sharath | Open Data, Event, Big Data, Openness | 2015-04-27T01:08:27Z | Event

From Open Citizen Radio Networks to the Race for .RADIO gTLD
http://editors.cis-india.org/telecom/open-citizen-radio-networks-to-race-for-.radio-gtld
<b>In light of the recent shutdown of the INDYMEDIA ATHENS server and its associated antagonistic Internet radio streaming services, Radio98FM and Radio Entasi, Sharath Chandra Ram takes a look at open radio networks run by citizen operators, as well as the politics around Internet radio and its growing potential as a medium for citizen activism.</b>
<p style="text-align: justify; ">On the afternoon of April 11, 2013, the president of the National Technical University of Athens (NTUA), Simos Simopolous, ordered the university’s Network Operations Centre (NOC) to pull the network plug on the IndyMedia Athens server, which shared the university’s network infrastructure. With it went down the Internet radio stream of Radio98FM, an independent radio station broadcasting from within NTUA, along with Radio Entasi. The takedown, as was later revealed, was ordered by the Minister of Public Order, Nikos Dendias, with MP Adonis Georgiadis of the New Democracy Party tweeting in praise of the Minister’s decision. (Translated tweet here: <a href="http://bit.ly/ZiBDuR">http://bit.ly/ZiBDuR</a>) Indymedia Athens continues to be accessible through the Tor network at <a href="http://gutneffntqonah7l.onion/">http://gutneffntqonah7l.onion/</a></p>
<p style="text-align: justify; ">The choice and use of broadcast networks in political and citizen uprisings has had a culturally specific side to it. The massive 2006 democracy movement in Nepal was fuelled entirely by pirate FM radio broadcasts, as most mountainous regions have no access to telephony, Internet or print news delivery services. More recently, the world saw the power of social media, YouTube and Twitter: in Iran, after the police killed student activist Neda, and later in the landmark crisis in Tunisia.</p>
<p style="text-align: justify; ">By combining the power of seamless accessibility that the audio medium provides by allowing the user to multi-task, along with the viral broadcasting ability of the Internet, we indeed have an effective tool for networked citizen science. Are there popular models that the community can emulate, and what are the barriers to entry in a trans-medial paradigm such as Internet audio re-transmission?</p>
<h3 style="text-align: justify; ">An Overview of Open Radio Networks</h3>
<p style="text-align: justify; ">Let’s take a quick peek at the wireless radio-VoIP service named <a href="http://www.echolink.org"><b>ECHOLINK</b></a>, which I have been fortunate to have had access to over the last decade. Available only to ‘licensed’ and verified amateur radio operators, one may rest assured that strict legalities have unfortunately made such an open and transparent trans-medial global networked infrastructure impossible for commercial deployment or for use by the common citizen.</p>
<p style="text-align: justify; ">The ECHOLINK network was made possible thanks to the relentless efforts of amateur radio operators from around the world. It has enabled numerous wireless VHF local repeaters and links around the globe to be accessible over the Internet from practically any remote machine/device connected to the Internet, for both transmission as well as reception.</p>
<table class="listing">
<tbody>
<tr>
<th><img src="http://editors.cis-india.org/home-images/LinkingExample.png" alt="Linking Example" class="image-inline" title="Linking Example" /></th>
</tr>
<tr>
<td><img src="http://editors.cis-india.org/home-images/Echolink.png" alt="Echolink" class="image-inline" title="Echolink" /></td>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">A memorable event was when I connected to a local Florida repeater from my bedroom’s PC and ended up conversing with an amateur radio operator who was driving through the Hurricane Katrina-flooded streets with a VHF handheld FM transceiver, a limited food supply and a gallon of reserve fuel canned in his back seat. Despite this, there was a sense of fraternity and calm in his crackling radio voice that made it to the Florida repeater and then all the way to my home station in Bangalore.</p>
<p style="text-align: justify; ">More recently, after the Fukushima tsunami and nuclear fallout, we observed that the Japanese government had jammed almost every radio repeater link, including the Emergency Amateur Radio Service in Japan, as they did not want any international contact to be made regarding the situation. Nevertheless, after hours of trying, I intercepted a number of conference link nodes in Japan with people passing information to each other about the deteriorating conditions in various prefectures. Below is a recorded excerpt from a conversation between two concerned citizens that I intercepted:</p>
<table class="listing">
<tbody>
<tr>
<th><iframe frameborder="no" height="166" scrolling="no" src="https://w.soundcloud.com/player/?url=http%3A%2F%2Fapi.soundcloud.com%2Ftracks%2F88260833" width="100%"></iframe><br /></th>
</tr>
</tbody>
</table>
<p style="text-align: justify; "><b>The Internet Radio Linking Project (IRLP)</b> was a precursor to the Echolink network, invented by Dave Cameron (Callsign: VE7LTD) who installed the first three Windows O/S based IRLP nodes in Vancouver, British Columbia in 1997, followed by a more reliable Linux Node (VE7RHS) in 1998, after which the IRLP network soon spread worldwide. Amateur Radio operators with a VHF handheld transceiver and a custom (also DIY) IRLP interface hardware could connect to any local node within their RF Range and by using particular DTMF codes could establish a connection to any other node in the world by referring to a global list of node numbers. (<a href="http://status.irlp.net/">http://status.irlp.net/</a>).</p>
<p style="text-align: justify; "><a href="http://www.echoirlp.net/">ECHOIRLP</a> enables radio network node owners to support both the IRLP and ECHOLINK networks on their repeater. A publicly open trans-medial network such as this would certainly transform global information dissemination and accessibility, citizen journalism, community networks, as well as disaster management.</p>
<h3 style="text-align: justify; ">Impedances and Emerging Trends in Commercial Radio Webcasting</h3>
<p style="text-align: justify; ">Similar radio network paradigms, albeit highly commercial, already exist within the mobile phone infrastructure, and with location-based services and audio databases like Spotify and audio detection apps like SoundHound on the rise, one could expect a huge boom in Internet radio services with contextual advertising on personal devices in the coming years. As with the press wars during the early 1930s in America, when newspapers viewed radio broadcasting as a formidable competitor, various impedances have kept Internet streaming away from the space that local wireless broadcasters and telecommunications networks have enjoyed for so long.</p>
<p style="text-align: justify; ">Unlike US or EU copyright law, Indian copyright law currently has no provision that clearly regulates Internet webcasting and radio retransmission on the Internet. In the United States, however, webcasting of copyrighted audio content, as well as Internet retransmission of over-the-air FM and AM radio broadcasts, is subject to a <b>per-performance royalty</b> and an <b>‘ephemeral’ license fee.</b> For the royalty calculation, transmission to each individual recipient is considered one ‘performance’. Estimating the market value of a ‘performance’ is tricky, however, and the standing example that served as a reference was the agreement reached between the RIAA (Recording Industry Association of America), which represents a majority of record labels that own copyrighted sound recordings, and Yahoo! Inc., then a major webcaster and Internet re-transmitter. The RIAA–Yahoo! agreement involved a lump sum payment of USD 1.25 million for the first 1.5 billion transmissions, which amounted to about 0.08 cents/performance. The initial proposal by the CARP (Copyright Arbitration Royalty Panel), however, set this at 0.14 cents/performance for pure Internet webcasts and 0.07 cents/performance for over-the-air retransmissions; this was later rejected and equalized to 0.07 cents/performance for both, after another recommendation by the Register of Copyrights was accepted by the Librarian of Congress.</p>
<p style="text-align: justify; ">In addition, an ephemeral license fee has to be paid by a webcaster, currently set at about 8.8 per cent of the gross performance fee. ‘Ephemeral recordings’ in traditional broadcasting refer to the temporary copies made of a phonorecord to facilitate transmission of the final studio mix. The twist in webcasting, however, is that the temporary server copies necessary for Internet retransmission are subject to this ephemeral license fee.</p>
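<p style="text-align: justify; ">The royalty arithmetic described above can be made concrete with a back-of-the-envelope sketch. The 0.07 cents/performance rate and the 8.8 per cent ephemeral fee are the figures cited in the text; the listener and song counts below are invented purely for illustration, and none of this is legal guidance.</p>

```python
# Illustrative US webcasting royalty arithmetic. Rates are the figures
# cited in the text; the usage volumes are hypothetical.

RATE_PER_PERFORMANCE = 0.0007  # 0.07 cents per performance, in dollars
EPHEMERAL_FEE_RATE = 0.088     # ephemeral fee: 8.8% of the gross performance fee

def monthly_webcast_royalty(avg_listeners, songs_per_month):
    """Each song delivered to each listener counts as one 'performance'."""
    performances = avg_listeners * songs_per_month
    performance_fee = performances * RATE_PER_PERFORMANCE
    ephemeral_fee = performance_fee * EPHEMERAL_FEE_RATE
    return performance_fee, ephemeral_fee

# A hypothetical small webcaster: 200 average listeners, 5,000 songs a month
perf, eph = monthly_webcast_royalty(200, 5000)
print(f"performance fee: ${perf:,.2f}, ephemeral fee: ${eph:,.2f}")
```

<p style="text-align: justify; ">At these made-up volumes the station would owe roughly $700 in performance royalties plus about $62 in ephemeral fees for the month, which shows how quickly per-performance accounting scales with audience size.</p>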
<p style="text-align: justify; ">Another limitation is bandwidth. Unlike wireless radio broadcasting, which has a radial spread over line of sight depending on the wattage of transmission, the number of listeners that a server’s Internet radio stream can serve simultaneously depends on the available bandwidth at the transmitting end. For instance, a 128 kbps homebrew audio stream transmitted over a 1 Mbps line using SHOUTcast or Icecast could probably support no more than seven or eight listeners, although the advantage that listeners may be geographically disparate cannot be overlooked.</p>
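<p style="text-align: justify; ">The unicast arithmetic behind that estimate is simple to make explicit. The sketch below assumes the server sends a full copy of the stream to every listener, with an optional fraction of the uplink reserved for protocol overhead and other traffic (that reservation is an assumption for illustration, not a figure from the text).</p>

```python
# Capacity of a unicast SHOUTcast/Icecast-style stream: every listener
# receives a full copy, so capacity = usable uplink / stream bitrate.

def max_listeners(uplink_kbps, stream_kbps, overhead=0.0):
    """Simultaneous listeners a unicast stream can serve.

    `overhead` reserves a fraction of the uplink for protocol overhead
    and other traffic (an illustrative assumption).
    """
    usable = uplink_kbps * (1.0 - overhead)
    return int(usable // stream_kbps)

print(max_listeners(1000, 128))                # 1 Mbps line, 128 kbps stream -> 7
print(max_listeners(1000, 128, overhead=0.2))  # with 20% of the line reserved -> 6
```

<p style="text-align: justify; ">The same formula explains why geographically dispersed listeners cost nothing extra: the binding constraint is the transmitting server’s uplink, not distance.</p>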
<p class="callout" style="text-align: justify; ">The possibility of having a central webspace that provides access to retransmitted streams of, say, every FM news channel across the world still remains unfeasible. The logical next step would be to install multiple repeater servers, constructed along the lines of the Echolink infrastructure, that can access Internet radio servers located in different parts of the world retransmitting both commercial FM broadcasts and independent radio broadcasts. Of course, this would only be possible with a community-funded initiative led by the global amateur radio community in tandem with commercial public service broadcasters who agree to sacrifice retransmission royalties in view of mass accessibility. This collaboration now seems very possible with the latest .RADIO gTLD community-based application that was filed by the EBU in 2013.</p>
<h3 style="text-align: justify; ">The .RADIO TLD competition</h3>
<p style="text-align: justify; ">With ICANN launching the gTLD program, a notable contest has started for ownership of the <b>.radio</b> gTLD. The latest applicant is the European Broadcasting Union (EBU), the largest international association of broadcasters, with supporters including the World Broadcasting Unions (WBU) and the Association Mondiale des Radiodiffuseurs Communautaires (AMARC). The EBU has filed a ‘community-based designation’ application, a move that has been actively supported by the International Amateur Radio Union (IARU) (<a href="http://goo.gl/H23YF">http://goo.gl/H23YF</a>), the founding fathers of the global amateur radio community. The EBU, created in 1950, is a not-for-profit association and one of the key sector members and technical advisors of the International Telecommunication Union. Its primary function has been to advocate and negotiate the interests of European public broadcasters.</p>
<p style="text-align: justify; ">But three other standard applications for the <b>.RADIO</b> domain had already been made to ICANN, as early as 2012, by BRS Media, Afilias and Tin Dale LLC, all of whom have decried the latest application by the EBU. BRS Media, as early as 1998, entered into an ingenious agreement with the Federated States of Micronesia (country code .FM) and Armenia (country code .AM) and began offering the pricey .FM and .AM domains to Internet radio broadcasters and media services. Afilias Inc., which owns the .MOBI and .INFO top-level domains and has employees and investors on the ICANN Board, has applied for 31 additional TLDs apart from .RADIO.</p>
<p style="text-align: justify; ">ICANN reviews each applicant on the basis of descriptions of their mission and purpose of interest in the .RADIO TLD. While all the others allow ‘open registrations’ of second-level .RADIO domain names by any organization, the EBU application entails a much more restrictive registration process, in which the initial round of registrations shall be limited to existing broadcasters, trademark owners, Internet radio, amateur radio broadcasters and radio professionals.</p>
<p style="text-align: justify; ">The support of AMARC as well as the International Amateur Radio Union (IARU) (<a href="http://goo.gl/H23YF">http://goo.gl/H23YF</a>) has helped the EBU fulfill an important ICANN pre-requisite for a community-based TLD application – that is, “to substantiate its status as representative of the community it names in application by submission of written endorsements in support of the application.”</p>
<p style="text-align: justify; ">Does this mean that we shall finally see the dawn of widely accessible Internet radio and digital retransmission of over-the-air broadcasts, with amateur radio networks working in tandem with commercial public service broadcasters? Will the EBU truly be a representative of the global broadcasting community, and will it treat US counterparts no differently from the EU and the rest of the world? And finally, what impact shall all this have on Internet governance, the dissemination of public opinion and citizen interventions? These are but some of the burning questions that shall surface in the near future.</p>
<hr />
<h3>Key References</h3>
<ul>
<li><a href="http://www.echolink.org">http://www.echolink.org</a></li>
<li><a href="http://www.irlp.net/">http://www.irlp.net/</a></li>
<li>Summary of the Determination of the Librarian of Congress on Rates and Terms for Webcasting and Ephemeral Recordings (<a href="http://goo.gl/xPEj8">http://goo.gl/xPEj8</a>)</li>
<li><a href="http://newgtlds.icann.org/en/">http://newgtlds.icann.org/en/</a></li>
<li><a href="http://www.arrl.org/">http://www.arrl.org/</a></li>
</ul>
<p>
For more details visit <a href='http://editors.cis-india.org/telecom/open-citizen-radio-networks-to-race-for-.radio-gtld'>http://editors.cis-india.org/telecom/open-citizen-radio-networks-to-race-for-.radio-gtld</a>
</p>
No publisher | sharath | Telecom | 2013-05-05T05:00:01Z | Blog Entry

What’s In a Name? — DNS Singularity of ICANN and The Gold Rush
http://editors.cis-india.org/internet-governance/blog/dns-singularity-of-icann-and-the-gold-rush
<b>March 2013 marked the 28th birthday of the first ever registered Internet domain, as well as the exigent launch of the Trademark Clearing House, disguised as a milestone in rights protection, by the Internet Corporation for Assigned Names and Numbers (ICANN) for its new gTLD program. Sharath Chandra Ram dissects the transitory role of ICANN from being a technical outfit to the Boardroom Big Brother of Internet governance.</b>
<hr />
<p><a class="external-link" href="http://trademark-clearinghouse.com/">Click to read</a> more about the <b>Trademark Clearing House</b>.</p>
<hr />
<p style="text-align: justify; ">As a non-profit organization established in agreement with the US Department of Commerce in 1998, the current arrangement of ICANN has come under serious question in recent years, with the United Nations wanting the ITU to oversee Internet governance, while Europe seeks more public participation in a decision-making process that currently comprises a majority of private stakeholders as ICANN board members with vested interests. In this post we shall look at a few instances that give room for thought about the regulatory powers and methods adopted by ICANN, as well as the reparatory measures taken to reaffirm its image as an able governing body amidst disputes over trademarks and fair competition, disputes that might actually call for a wider and more objective inclusion in future. An outline of the functional and structural arrangements of ICANN may be found at the <a class="external-link" href="http://goo.gl/FijE7">CIS Knowledge Repository page</a>.</p>
<h3 style="text-align: justify; ">The Business Model</h3>
<p style="text-align: justify; ">Earlier this month (March 15, 2013) was the 28<sup>th</sup> birthday of <a href="http://www.symbolics.com">symbolics.com</a>, the first domain name ever registered, in 1985, through the formal registration process (<a class="external-link" href="http://www.nordu.net/ndnweb/home.html">nordu.net</a> being the first domain name created by the registry, on January 1, 1985, for the first root server, nic.nordu.net). Symbolics, which spun off from the MIT AI Lab and specialized in building workstations running LISP, finally sold the domain for an undisclosed amount to XY.com, an Internet investment firm that has been proudly boasting about its acquired relic for over three years now. The golden days of fancy one-word domain names reselling at exorbitant prices are over, as Google’s page-ranking crawler now really looks at unique content and backlinks. Nevertheless, those with the same archaic view of a real estate agent still believe that a good domain name has a high ROI, and have managed to find naïve takers who will offer ridiculous amounts. One of many such examples is the plain-looking <a href="http://www.business.com">www.business.com</a>, which was bought initially for $150,000 and changed hands twice, going from $7.5 million to an absurd $345 million paid by R.H. Donnelley Inc., which soon filed for bankruptcy!</p>
<p style="text-align: justify; ">The top-level domain market, however, is consistently lucrative. A TLD registry on average receives $5 - $7 per domain registered under it. So the .COM registry run by VeriSign, which, as of 2013, has over 100 million registered domains, receives revenue of $500 to $700 million per year, of which a fraction is paid to ICANN periodically on a per-registration or per-renewal basis. Competing registrars and registries across TLDs, their revenue-generation practices, as well as the application process for new TLDs, gradually began to be regulated by ICANN in mysterious ways, as we will see in the following legal case studies.</p>
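<p style="text-align: justify; ">That revenue range follows directly from per-domain arithmetic. The sketch below uses only the figures already cited (the $5 - $7 registry fee and roughly 100 million .COM registrations); actual registry pricing and ICANN fee schedules vary by contract.</p>

```python
# Registry revenue back-of-the-envelope: fee per registered domain
# times the number of domains. Figures are those cited in the text.

def annual_registry_revenue(domains, fee_low, fee_high):
    """Return the (low, high) annual revenue range for a TLD registry."""
    return domains * fee_low, domains * fee_high

low, high = annual_registry_revenue(100_000_000, 5.0, 7.0)
print(f"${low / 1e6:,.0f}M to ${high / 1e6:,.0f}M per year")  # $500M to $700M per year
```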
<h3 style="text-align: justify; ">VeriSign vs. ICANN</h3>
<p style="text-align: justify; ">VeriSign began to operate the .COM and .NET TLDs after taking over Network Solutions Inc. and entering into a contractual agreement with ICANN in 2001. Let’s take a look at some methods used by VeriSign to garner internet traffic and registrant revenue that were clamped down on by ICANN, a clampdown that resulted in a lawsuit by plaintiff VeriSign claiming that ICANN prevented fair competition and revenue by impeding innovation.</p>
<p style="text-align: justify; "><i>Clamping of Site Finder & WLS</i>: In September 2003, VeriSign introduced a wildcard DNS service called Site Finder for all .com and .net domains. This meant that any user trying to access a non-existent domain name, often the result of a misspelling, no longer received a “domain not found” error but was instead redirected to a VeriSign website with adverts and links to affiliate registrars. In ICANN’s view, the redirection amounted to typosquatting on internet users: within a month, VeriSign’s traffic rose dramatically, moving it into the top 20 most visited websites on the web, as seen below in this archived image of Alexa’s 2003 traffic statistics (Courtesy: <a class="external-link" href="http://cyber.law.harvard.edu/">cyber.law.harvard.edu</a>).</p>
<table class="listing">
<tbody>
<tr>
<th style="text-align: center; "><img src="http://editors.cis-india.org/home-images/copy_of_DailyTraffic.png" alt="Daily Traffic" class="image-inline" title="Daily Traffic" /></th>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">Shortly after, in October 2003, ICANN issued a suspension ultimatum declaring Site Finder in violation of the 2001 .COM agreement. This was not the first time ICANN had clamped down on VeriSign’s ‘profiteering’ methods. In 2001, ICANN blocked VeriSign’s WLS (Wait Listing Service), which allowed a registrant (through selected participating affiliate registrars of VeriSign) to apply to register an already registered domain in the event that the registration was deleted – a nifty scheme considering the fact that about 25,000 domains are deleted every day!</p>
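The wildcard behavior behind Site Finder is easy to illustrate: a zone with a wildcard entry means a lookup of a non-existent name never fails, but instead resolves to the registry's own landing host. A toy sketch (the names and addresses are hypothetical, and a real resolver returns NXDOMAIN rather than None; this is not VeriSign's actual implementation):

```python
# Toy resolver: a zone as a dict, with an optional wildcard fallback.
def resolve(zone, name, wildcard=None):
    """Return the address for name, the wildcard address on a miss, or None."""
    if name in zone:
        return zone[name]
    return wildcard  # with Site Finder-style wildcarding, a miss never fails

zone = {"example.com": "93.184.216.34"}

# Without a wildcard, a typo'd domain simply fails to resolve.
print(resolve(zone, "exmaple.com"))                       # None
# With a wildcard, the same typo lands on the registry's own server.
print(resolve(zone, "exmaple.com", wildcard="10.0.0.1"))  # 10.0.0.1
```

This is why spam filters broke, as noted below: software that relied on failed lookups to detect bogus domains suddenly saw every name resolve.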
<h2 style="text-align: justify; ">Remarks and Submissions</h2>
<p style="text-align: justify; ">The long-drawn case of VeriSign Vs. ICANN ended on a reconciliatory note, with ICANN bringing the Site Finder service to a halt at the cost of VeriSign walking away happier with a free 5-year extension on the .COM registry agreement (2007, extended to 2012).</p>
<p style="text-align: justify; ">While the ingenious Site Finder service did pose a huge problem to spam filters, both the WLS and yet another service that VeriSign launched to allow registration of non-English language SLDs were also met with a cringe by ICANN.</p>
<p style="text-align: justify; "><b>Looking closer, however, one may realize that ICANN permitting a DNS root redirect service such as Site Finder for all TLD operators (with an acceptable template that also carried information about the 404 error besides other marketing options) would have been a first step towards a plausible scenario of multiple competing DNS roots across TLDs interacting with each other. Such a system is often argued by network theorists to be the most efficient and competitive model, one that would reduce the disjoint between the demand and supply of TLDs in a decentralized infrastructure, and that was definitely not in the best interest of ICANN’s monopolistic plan. Hence, this could be seen as a move by ICANN to nip Site Finder in the bud</b>.</p>
<p style="text-align: justify; ">Finally, as brought to public notice in more than one instance (name.Space Vs. ICANN, IOD Vs. ICANN), the vested interests of ICANN board members have come under glaring light. <b>Can an ICANN leadership consisting of members from the very same domain name business objectively deal with competing registry services and legal issues?</b> A conspicuous target has been chairperson Steve Crocker, who owns a consulting firm, Shinkuro, whose subtle investor is in fact AFILIAS INC., which runs the .INFO and .MOBI TLDs, provides backend services to numerous TLDs (.ORG, .ASIA, .AERO (aviation)), has applied for a further 31 new TLDs, and has its CTO, Ram Mohan, on the Board of Directors of ICANN. Also, ICANN Vice Chairman Bruce Tonkin is a Senior Executive at Australia’s largest domain name provider, Melbourne IT, and Peter Thrush, former chairman of the ICANN Board of Directors, is Executive Chairman of Top Level Domain Holdings, Inc., which filed 92 gTLD applications in 2012.</p>
<h3 style="text-align: justify; ">Trademark Protection and Domain Names</h3>
<p style="text-align: justify; ">Image Online Design (IOD) is a company that since 1996 has been providing Internet registry services using the trademark .WEB (trademark #3,177,334, including computer accessories) registered with the US Patent and Trademark Office (USPTO).</p>
<p style="text-align: justify; ">Its registry services, however, were not through the primary DNS root server maintained by ICANN, but through an alternate DNS root that required prospective users to manually change their browser settings in order to resolve .WEB domains registered through IOD. Despite not running the primary DNS root server for .WEB, by the year 2000 IOD had acquired about 20,000 registered .WEB customers.</p>
<p style="text-align: justify; ">The beacon of ‘hope’ arrived upon IOD in mid-2000 as ICANN (on the advice of its supporting organization, the GNSO) opened a call for proposals for registration of new TLDs, with a non-refundable deposit of $50,000 for an application to be considered. By then the importance of the .WEB TLD for e-commerce was well known amongst ICANN board members, with Louis Touton lobbying for his preferred applicant, AFILIAS INC., to be given the .WEB TLD, and others raising concerns about IOD’s preregistration of .WEB domains. One of the founding fathers of the internet, Vinton Cerf, the then Chairman of ICANN, took a benevolent stance: <i>"I'm still interested in IOD," he repeated over Touton's objections. "They've worked with .WEB for some time. To assign that to someone else given that they're actually functioning makes me uneasy," he said, prompting board member Linda Wilson to chime in, "I agree with Vint."</i> (<a href="http://goo.gl/d1v6X">http://goo.gl/d1v6X</a>, <a href="http://goo.gl/eV9Jd">http://goo.gl/eV9Jd</a>).</p>
<p style="text-align: justify; ">Finally, amidst all the contention, no one was offered the .WEB TLD, and ICANN announced that all applications not selected would remain pending, with applicants having the option of being re-considered when additional TLD selections were made in the future. That future arrived in 2012, when ICANN invited a new round of TLD applicants, this time with a whopping non-refundable deposit of $185,000 for a single application (one TLD per application, as opposed to the $50,000 fee in the year 2000 that allowed multiple TLD requests within the same application). While 7 new applicants for the .WEB TLD registered their interest, IOD considered its application to be still pending and did not join the new pool, which included AFILIAS INC. and GOOGLE.</p>
<p style="text-align: justify; ">The litigation of IOD Vs. ICANN ended in February 2013, with IOD pressing weak causes of action under “Trademark Infringement”, “Breach of Contract” and “Fair Dealing”, hinging on the fact that the initial $50,000 application was still pending and had never been officially rejected by ICANN. Further, there was no valid trademark infringement claim to be made, as there was no substantial likelihood of consumer confusion in the .WEB case.</p>
<h2 style="text-align: justify; ">Remarks and Submissions</h2>
<p style="text-align: justify; ">The IOD Vs. ICANN case not only increased concerns globally over the uncertainty associated with the ICANN application process for generic TLDs, along with questions regarding the objectivity of its board members, but at the same time alerted ICANN to take the necessary big-sister steps to ensure that it is well in the game.</p>
<p style="text-align: justify; ">The fact of the matter is that the USPTO does not provide trademark protection for the top-level domain industry, citing the reason that TLD trademarks do not provide a distinct service mark that can identify or differentiate the service of an applicant from others, and further cannot be used to ascertain the source of an applicant’s services. This view is flawed: by looking at a domain such as BBC.com, an informed person can easily say that VeriSign Inc. manages the service of directing a user to the correct location on the .COM registry. With the introduction of new gTLDs, perhaps BBC would shift its content to BBC.news, where the source may be an abstracted registrar while the nature of the service remains quite evident. As for registered trademarks, especially those whose infringement would result in substantial brand confusion for the customer, a TLD like .ibm or .bbc may well be granted to the owner of the trademark, who may then outsource registry services to a service provider. This would invert the current model by relegating the role of a TLD registry holder to that of a contracted service provider.</p>
<p style="text-align: justify; "><b>So the question is: should the US Department of Commerce, which contracted ICANN in the first place, have mediated with the USPTO to place the business of a registrar on par with other trades and businesses, and to modify its trademark infringement policies? And more importantly, will ICANN view this as introducing yet another key stakeholder to the gTLD assignment process?</b></p>
<p style="text-align: justify; "><b>The answer to the latter is already clear: ICANN, being at the top of its game, decided to take matters into its own hands and on March 26, 2013 launched</b> <a href="http://trademark-clearinghouse.com/"><b>http://trademark-clearinghouse.com/</b></a><b> with a new set of guidelines for accepted trademarks and a mechanism that allows trademark holders to submit their application to a central repository.</b></p>
<p style="text-align: justify; ">Accepted trademark holders shall be given priority to register domains under new gTLDs during the ‘sunrise’ period. Deloitte Enterprise Risk Services has been assigned the responsibility of evaluating submitted trademarks, while IBM shall maintain the actual database of trademarks from the latter half of 2013.</p>
<p style="text-align: justify; ">The tip of the iceberg is well in view. ICANN46 is currently being hosted in Beijing by the China Internet Network Information Center (CNNIC) from April 7 to 11, 2013, while parallel discussions will hopefully happen on all other global forums to re-consider a future of multiple competing DNS root servers and the healthy, decentralized competition it would bring.</p>
<hr />
<p><b> Key References</b></p>
<ol>
<li><a href="http://www.icann.org/en/news/litigation">http://www.icann.org/en/news/litigation</a></li>
<li><a href="http://cyber.law.harvard.edu/tlds/">http://cyber.law.harvard.edu/tlds/</a></li>
<li style="text-align: justify; ">Lynn, S. [2001] “Discussion Draft: A Unique, Authoritative Root for the DNS” Internet Corporation for Assigned Names and Numbers, 28 May, 2001.</li>
<li style="text-align: justify; ">Internet Architecture Board [2000] “IAB Technical Comment on the Unique DNS Root.” RFC 2826, Internet Society, May 2000.</li>
</ol>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/dns-singularity-of-icann-and-the-gold-rush'>http://editors.cis-india.org/internet-governance/blog/dns-singularity-of-icann-and-the-gold-rush</a>
</p>
Blog entry by sharath, filed under ICANN and Internet Governance, 2013-03-31.

Who Minds the Maxwell's Demon (Revisiting Communication Networks through the Lens of the Intermediary)
http://editors.cis-india.org/telecom/blog/who-minds-the-maxwells-demon
<b>A holistic reflection on information networks and their regulatory framework is possible only when the medium-specific boundary that has often separated the Internet and telecom networks begins to dissolve, objectively revealing points of contention in the communication network where the dynamics of network security and privacy are at large – namely, within the historic role of the intermediary at data/signal switching and routing nodes.</b>
<p style="text-align: justify; ">It is unfair to contextualize the history of the Internet without looking at how analog information networks like the cable and wireless telegraph and, later, the telephone almost coincidentally necessitated the invention of automated networks for remote machine control and peer-to-peer communication that promised to drastically reduce intermediary overheads. While the whole world was embroiled in patent wars over wired private networks, the first nodes of the ‘open’ internet were built in a two-week global meeting of computer scientists who were flown down simply to prepare for ‘a public exhibition’ of the ARPANET in 1971.</p>
<p style="text-align: justify; ">While India only received its first telephone, in New Delhi, late into the 20<sup>th</sup> century, “Telegraph Laws” to most of the Indian working class always remained an ominously urgent telegram bringing news of a dear one who had taken seriously ill. And so, on a lateral note, it is apt to bring to light the life of one Mr Almon Brown Strowger, for whom the idea of an automatic telephone exchange was born of the <b>‘business of death’.</b></p>
<p><i> </i></p>
<h3>The Automatic Telephone Exchange</h3>
<p style="text-align: justify; ">Almon Strowger was an undertaker based in Missouri, in a town where there was yet another undertaker, whose wife incidentally was an operator in the then manual telephone exchange. Strowger came to believe that the reason he received fewer phone calls was that his competitor’s wife preferentially routed callers seeking funeral services to her undertaker husband instead. Strowger conceived the initial idea in 1888 and patented ‘The Automatic Telephone Exchange’ in 1891 (<a href="http://goo.gl/oieIJ">http://goo.gl/oieIJ</a>).</p>
<table class="listing">
<tbody>
<tr>
<th><img class="image-inline" src="../../internet-governance/blog/resolveuid/8ec6c81ad81940739eb4fcaa67ad1da2" /></th>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">Popularly known as the ‘Strowger switch’, the Step-by-Step switch (SXS switch) consisted of two interfaces. One, at the customer’s end, used telegraph keys (and later a rotary dial) to send a train of electric current pulses corresponding to the digits 0-9 all the way to the exchange. The actual Strowger switch at the exchange used an electromechanical device that could move vertically to select one of 10 contacts, and then rotate to select one of another 10 in each row – a total of 100 choices. Consequently, the Strowger Automatic Telephone Exchange Company was formed in 1892 in Indiana, with about 75 subscribers. Strowger later sold his patents for $10,000 in 1898 to the Automatic Electric Company, a competitor of the Bell System’s Western Electric. His patents were eventually acquired by the Bell System for $2.5 million in 1916, showing just how much growth and investor interest the telephone industry had gained by then.</p>
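The two-motion selection described above is simple enough to sketch in code: each dialed digit arrives as a train of pulses, and two digits position the wipers vertically and rotationally to pick one of 100 contacts. A minimal simulation (the encoding of '0' as ten pulses follows rotary-dial convention):

```python
# Simulate a two-motion (10 x 10) Strowger-style selector.
def pulses_to_digit(n_pulses):
    """Decode a pulse train into a digit: 1..9 pulses -> 1..9, 10 pulses -> 0."""
    if not 1 <= n_pulses <= 10:
        raise ValueError("a digit is 1 to 10 pulses")
    return n_pulses % 10

def select_contact(first_train, second_train):
    """First digit steps the wipers vertically, second rotates them:
    one of 10 x 10 = 100 subscriber contacts."""
    row = pulses_to_digit(first_train)
    col = pulses_to_digit(second_train)
    return row, col

# Dialing '3' then '7' (3 pulses, then 7 pulses) reaches row 3, contact 7.
print(select_contact(3, 7))    # (3, 7)
print(select_contact(10, 10))  # (0, 0) -- dialing '00'
```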
<h3 style="text-align: justify; ">Switching Paradigms</h3>
<p style="text-align: justify; ">The architecture of global communication was headed in different directions, towards different ideals. Most media historians contrast these methodologies as ‘circuit switching’ and ‘packet switching’ – a connection-oriented, fault-intolerant system on one hand, and a connection-less, fault-tolerant protocol on the other – both of which were being developed concurrently. In reality, however, a major driving factor was the stakeholders backing the infrastructure of the rapidly growing communication industry, who were looking for growing returns on their investments. Hence these parallel ramifications may also be looked at through the lens of closed, proprietary, medium-specific networks versus an open, shared, medium-agnostic paradigm of information theory.</p>
<p style="text-align: justify; ">Circuit switching relied on an assured, dedicated connection between two nodes, and was especially patronized by an industry that saw telecommunication as the latest fad in urban luxury (a key factor in the distinction of suburban areas, as the affluent moved into urban areas that were ‘connected’ by telephone). Owners and manufacturers of the hardware infrastructure became the most significant stakeholders. The revenue model was based on the amount of time the network was used, and hence was popular in analog voice telephone networks. The entire bandwidth of the channel was made available for the duration of the session, along with a fixed delay between communicating nodes. Therefore, even if no information was being transmitted during a session, the channel would not be made available to anyone else waiting to use it unless released by the previous party. Early telephone exchanges relied on manual labour to facilitate switching until the automated exchange came about.</p>
<p style="text-align: justify; ">Packet switching, on the other hand, leaned towards a paradigm of shared bandwidth and resources and, more importantly, approached communication with complete disregard for the medium of transmission, be it wired or wireless. Furthermore, it also disregarded the content, modality and form of communication with an objectified, data-centric approach. Information to be transmitted was divided into structured “packets” or “capsules”. These packets were all ‘thrown’ into the shared network pool consisting of numerous other such packets, each with its own destination, to be carefully buffered, stored and forwarded by intermediary routers in the network. Apart from occasional packet loss, the time taken to send a message is indeterminate, being dependent on the overall traffic load on the network at any given time.</p>
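The store-and-forward idea just described can be reduced to a few lines: a message is split into addressed, numbered packets, a router buffers and forwards them one at a time, and reassembly at the destination tolerates arbitrary arrival order. This is an illustrative toy, not any real protocol:

```python
# Toy packet switching: split, route through a shared buffer, reassemble.
from collections import deque

def packetize(message, dest, size=4):
    """Split a message into numbered packets, each carrying its destination."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"dest": dest, "seq": n, "data": c} for n, c in enumerate(chunks)]

def reassemble(packets):
    """Order by sequence number; arrival order does not matter."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

# A router's shared buffer: packets from many senders may interleave freely.
buffer = deque(packetize("HELLO WORLD", dest="nodeB"))
received = []
while buffer:
    received.append(buffer.popleft())  # store-and-forward, one packet at a time

print(reassemble(received))  # HELLO WORLD
```

Because each packet carries its own destination and sequence number, the network can share its channels among all senders, in contrast to a circuit that is held for the whole session.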
<h3 style="text-align: justify; ">The Interface Message Processor and the ICCC ‘Hackathon’</h3>
<p style="text-align: justify; ">Plans forged on into the early 1960s towards the development of an open architecture to enable network communication between computer systems, culminating in the invention of the ‘interface message processor’ that promised to herald the coming of an era of packet switching by enabling the ARPANET (Advanced Research Projects Agency Network), the first wide area packet switched network – and precursor to the world wide web as we know it today.</p>
<p style="text-align: justify; ">While the Information Processing Techniques Office (IPTO) had previously contracted Larry Roberts, who in 1965 developed the first packet-switched network between two computers, the TX-2 at MIT and a Q-32 in California, a growing need was felt for a centralized terminal with access to multiple sites that would enable any computer to connect to any site. The first IMP was commissioned to be built by the engineering firm BBN (Bolt Beranek and Newman, founded by a professor-student trio from MIT).</p>
<table class="listing grid">
<tbody>
<tr>
<th><img class="image-inline" src="../../internet-governance/blog/resolveuid/b1a67e16e3314a0e854294ab95758314" /></th>
</tr>
</tbody>
</table>
<p>(The very first Interface Message Processor, by BBN. Courtesy: <a class="external-link" href="http://goo.gl/tvo8n">http://goo.gl/tvo8n</a>)</p>
<p style="text-align: justify; ">By 1971, the four original nodes of the ARPANET (viz., UCLA, the Stanford Research Institute, the University of Utah and the University of California at Santa Barbara) had expanded to 15 nodes, but the lack of a common host protocol meant that full-scale implementation and adoption of the ARPANET was far from complete. The time had come to let the public engage with the promising future that the Internet held. What ensued was the organization of the first public International Conference on Computer Communication (1972) (<a href="http://goo.gl/PFhtL">http://goo.gl/PFhtL</a>) under the umbrella of the IEEE Computer Society at the Hilton Hotel, Washington D.C. In many ways the event was the original version of a modern-day new media art ‘hackathon’, and involved about 50 computer scientists who were flown in from around the globe, alongside the likes of Vint Cerf and Bob Metcalfe. The deadline of a public demonstration provided the much-needed impetus to drive the network to functional completion. Exhibits included a variety of networked applications, like the famed dialogue between the ‘paranoid patient’ chatbot PARRY and the ‘doctor’ ELIZA, motion control of the LOGO ‘Turtle’ across the network, and remote access of digital files that were printed on paper locally. A milestone in distributed packet switching had been achieved, and the stage had been set to compete with the archaic paradigm of circuit-switched networks, even as delegates from AT&T (incidentally one of the funders of the event) watched on in the hope that the demonstration would run into a fatal glitch.</p>
<h3 style="text-align: justify; ">Who Minds the Maxwell's Demon</h3>
<p style="text-align: justify; ">It may not be boldly evident from the vast corpus of policy research surrounding the regulation of communication networks (be it the issues of network security, privacy, anonymity, surveillance or billing systems) that the key points in the control system, where these dynamics play out at large, are the interfacing nodes and data/signal switches – at transceiver nodes as well as intermediary nodes. This is further underlined by the historical fact that the invention of the automatic telephone exchange was fuelled by the necessity of ensuring unbiased circuit switching within the context of a networked business.</p>
<p style="text-align: justify; ">Just a glimpse at the number of patents that directly or indirectly refer to the Automatic Telephone Exchange patent brings to light myriad applications that range from “Linking of Personal Information Management Data”, “Universal Data Aggregation”, “Flexible Billing Architecture”, “Multiple Data Store Authentication” and “Managing User to User Contact using Inferred Presence Detection” to various paradigms surrounding distributed systems for cache defeat detection, most of which are part of the PUSH technology services that manage networked smartphone applications, from instant messaging to email access. Other proposed systems for spectrum management and dynamic bandwidth allocation, such as policy alternatives to spectrum auctions that entail frequency hopping at the transmitter level, shall invariably depend on a centralized automated intermediary who shall, in theory, have transparent access to data flow. The role of routing intermediaries with specialized access poses many interesting questions with regard to policy issues surrounding network privacy and security.</p>
<p style="text-align: justify; ">This brings us back to the seemingly comical reference this article makes to a mysterious entity named ‘Maxwell’s Demon’. A thought experiment proposed by James Clerk Maxwell involved a chamber of gas molecules at equilibrium, divided into two halves along with a ‘door’ controlled by the demon. The demon has the ability to ‘open’ the door to allow faster-than-average molecules to enter one side of the chamber while slower molecules end up on the other side, causing the former side to heat up while the latter gradually cools down, thereby establishing a temperature difference without doing any work, in violation of the 2<sup>nd</sup> Law of Thermodynamics. The parallel drawn in this article between networked switching intermediaries and Maxwell’s demon does not go beyond this simple functional similarity.</p>
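The thought experiment is simple enough to simulate: the demon inspects each molecule at the door and sorts fast ones to one side and slow ones to the other, producing a temperature gradient from an initially mixed gas by pure selection. A toy sketch (molecular speeds drawn uniformly at random, with a fixed seed for reproducibility):

```python
# Toy Maxwell's demon: sort molecules by speed across a partition.
import random

random.seed(42)
molecules = [random.uniform(0.0, 2.0) for _ in range(1000)]  # speeds
mean = sum(molecules) / len(molecules)

hot, cold = [], []
for speed in molecules:  # the demon 'opens the door' selectively
    (hot if speed > mean else cold).append(speed)

def avg(side):
    return sum(side) / len(side)

# The hot side's mean speed now exceeds the cold side's: a temperature
# difference created by selection alone, with no work done on the gas.
print(avg(hot) > avg(cold))  # True
```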
<p style="text-align: justify; ">However, for the ambitious reader, it may be interesting to note that ever since the invention of digital computers, scientists have actively pursued the paradox of Maxwell’s demon to revisit the physical fundamentals governing information theory and information processing, analyzing the thermodynamic costs of elementary information manipulation in digital circuits – a study that probably constantly engages Google as they pump water through steel tubes to cool their million servers.</p>
<p style="text-align: justify; ">We shall save all this for another day, but on yet another related note: every time, say, an email sent to an invalid address bounces back to your inbox from a “Mailer Daemon”, let it be known that the “daemon” of operating system terminology, an invisible background process over which the user has no control, in fact directly owes its etymology to the paradox of ‘Maxwell’s Demon’.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/telecom/blog/who-minds-the-maxwells-demon'>http://editors.cis-india.org/telecom/blog/who-minds-the-maxwells-demon</a>
</p>
Blog entry by sharath, filed under Telecom, 2013-03-05.

Mining the Web Collective
http://editors.cis-india.org/internet-governance/blog/mining-the-web-collective
<b>In March 2012, Dr Bruno Latour and his team from the Sciences Po Media Lab organized a workshop that assembled a selected group of researchers from India to explore methods of Controversy Mapping. It was hosted by Dr J. Srinivasan, Director of the Divecha Centre for Climate Change at the Indian Institute of Science, Bangalore, India.</b>
<p style="text-align: justify; ">While the context of this workshop focussed on deciphering and mapping opinions related to academic controversies surrounding climate change, the very same techniques of deploying digital tools to crawl through associated content on the websphere may be used to map any other controversy that has been actively influencing public and political opinion.</p>
<p style="text-align: justify; "><i>As one of the participants in the workshop, and in an attempt to make my interpretation as accessible as possible to a wider interdisciplinary audience, I present below my own assimilation and extrapolation of the musings and discussions that ensued. Further, I have drawn out limitations and future directions towards more viable paradigms that augment the mapping and democratization of public opinion.</i></p>
<p style="text-align: justify; ">The session opened with an outline of how new digital tools could aid researchers by enabling them to quickly see an individual entity’s data as well as its associated aggregates, and register all of this within a single view in real time. In contrast to the traditional methods of data collection through individual surveys, new digital methods can almost instantaneously bridge the gap between the individual and the collective, and help us answer the question that Latour poses in his most recent paper, which revisits social theory around the Tardean concept of reciprocally connected ‘monads’: <i>''.... is there an alternative to the common sense version that distinguishes atoms, interactions and wholes as successive sequences (whatever the order and the timing)? An alternative that should not oblige the inquirer to change gears from the micro to the macro levels ..... but remains fully continuous ...''</i> [Latour et al., 2012].</p>
<h3 style="text-align: justify; ">Encompassing the Collective</h3>
<p style="text-align: justify; ">The geometric basis of the universe as expressed by Edgar Allan Poe asserts that the ‘universe.. is a sphere of which the centre is everywhere and circumference nowhere’ (Eureka, p. 20). This is essentially a post-Euclidean conception of space, in line with the view of the early 20th-century physicist Alexander Friedmann, who posited that the ‘universe is not finite in space, but neither does space have any boundary’, so that the centre of the universe is relative to every single atom – hence to every single observer.</p>
<p style="text-align: justify; ">In many ways, the process of data collection and visualization carried out at the workshop tried its best to mimic this geometric basis of space. By starting with a single entity (say, mammals), the empiricist begins with nothing more than a named 'label'. One then extends the specification of this entity by populating a list with an increasing number of elements. This process of 'learning' about an entity is essentially an infinite one, as many abstract associations may be permitted to enter the list. However, the observer stops this iterative process at the point where he feels he has enough knowledge to describe the entity within the (seemingly finite) 'scope' of study. What we then have is a highly individualized point of view with respect to one entity, with a view of all its associated attributes.</p>
<p style="text-align: justify; ">It is worth noting here that the attributes themselves can be looked at as individualized entities, and vice versa, from their own viewpoints, depending on the way in which one navigates, thereby making the map invertible. For instance, while 'egg-laying' may be one of the attributes of a 'mammal', if we navigate to make 'egg-laying' our starting entity, its viewpoint can contain attributes like 'mammals' and 'birds'. This process is entirely different from the bottom-up approach of constructing a general view by combining individual counterparts. In fact, there is no one general view here, as the picture is an exploded graph emanating from a single entity's viewpoint, each to its own 'umwelt' [Kaveli et al., 2010].</p>
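The invertibility described here, where any attribute can itself be promoted to a starting entity, falls out naturally if the view is held as a map from entities to attribute sets together with its inverse. A minimal sketch using the mammal example from the text (the attribute names are illustrative):

```python
# An entity -> attributes map and its inverse: any attribute becomes an
# entity whose 'attributes' are the entities that carry it.
entities = {
    "mammal": {"warm-blooded", "egg-laying", "fur"},
    "bird":   {"warm-blooded", "egg-laying", "feathers"},
}

def invert(view):
    """Flip an entity -> attribute-set map into attribute -> entity-set."""
    inverse = {}
    for entity, attrs in view.items():
        for attr in attrs:
            inverse.setdefault(attr, set()).add(entity)
    return inverse

by_attribute = invert(entities)
print(sorted(by_attribute["egg-laying"]))  # ['bird', 'mammal']
# Inverting twice recovers the original view: the map is fully reversible.
print(invert(by_attribute) == entities)    # True
```

There is no privileged 'general view' in this structure: each key, entity or attribute, anchors its own exploded graph of associations.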
<h3 style="text-align: justify; ">(Re)formation of Opinion</h3>
<p style="text-align: justify; ">The formation of a fundamental percept in the human brain, for instance during the cognitive activity of reading a text, is in itself a bottom-up serial process in which individual words progressively make up semantic associations to form a meaningful structure (just as in this sentence), along with contextual association with previously acquired knowledge. This capacity limit on information processing [Marois and Ivanoff, 2005], a prerequisite for our highly focussed mechanism of attention, is the reason why we cannot capture the entire star map within a single glance at the night sky.</p>
<p style="text-align: justify; ">Somewhere down this iterative line, observing an entity without access to the entirety of its attributes leads to over-specification and an entanglement with isolated systems, thereby falling into a local maximum as opposed to a global solution. This is the basis of opinion formation: by envisaging the entity as a 'closed' object, it is transformed into a percept, open to interpretation and often in conflict with another, thereby resulting in a controversy.</p>
<p style="text-align: justify; ">One of the objectives of the controversy mapping workshop was to transform the 'immutable' percept surrounding a controversy into a visual map that all at once registers weblinked attributes surrounding it, to give us a possibly emergent and unbiased picture.</p>
<h3 style="text-align: justify; ">The Method to the Madness</h3>
<p style="text-align: justify; ">The process of framing a ‘controversial topic’ and collating the massive data and links on the internet that surround the topic can indeed be a cumbersome task. An informed approach is thus required in order to achieve a meaningful result.<br /><br />Firstly, one needs to consider reliable sources and means of knowledge production that provide enough fuel to kindle the analysis of the controversy. One needs to move on from casual matters of opinion or statements (such as “the cumulative effects of CFCs result in ozone layer depletion”) to identifying a hypothesis or theory that is being actively contested by academicians and experts through research and publication. This serves to outline an important preliminary sketch of the controversy that exists within the community.</p>
<p style="text-align: justify; ">Secondly, it is essential to remember that specialized researchers do not exist in self-centered isolation but often operate in tandem with multiple stakeholders, investors, donors, sponsors and a diverse audience that they cater to through articles, books, research projects and published journals. For instance, several theorists in the business of developing a so-called ‘language of critique’ often ensure through working group meetings that a selected group of researchers are on the ‘same page’, using a common vocabulary to spearhead submissions to prospective calls from popular journals. At other times, one may perceive a very direct link between the mainstream press and cutting-edge research. This group of allies and endorsers is an important constituent of the mapping process, as it provides key points of entry into the controversy.</p>
<p style="text-align: justify; ">Further, as more and more data relating to a controversy is accrued, one must decipher not only how the position of the controversy and its stakeholders is being dynamically shaped over time, but also extrapolate how and why its current position of uncertainty might evolve. This would involve identifying potential points of contention that could respark a debate over an issue that has reached near closure.</p>
<h3 style="text-align: justify; ">Mapping the Controversy around ‘Anthropocene’</h3>
<table class="listing">
<tbody>
<tr>
<th><img class="image-inline" src="../../accessibility/blog/resolveuid/8d81a93d91444d90a178646db01a002f/@@images/image/large" /></th>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">The topic chosen by my group (which consisted of scholars Neesha Dutt, Muthatha Ramanathan and Prasanna Kolte) was ‘Anthropocene’, a geo-chronological term informally introduced by Paul Crutzen, a Nobel laureate in atmospheric chemistry, at a dinner party. ‘Anthropocene’ apparently marks the post-industrial period as a time window representing the impact that human activities have had on earth’s ecological systems, thereby affecting climate change. The widespread acceptance and popularity of the word has even seen a move to officially recognize ‘Anthropocene’ as a geological unit of time, complemented by a number of dubious research projects that assume the ‘anthropocenic’ view of climate change. We used Navicrawler to populate a massive list of webpages that featured the keyword, along with the landing websites that each of those webpages points to. The context of each website was labelled manually based on its content; no automated text parsing and analysis was used. An interconnected visual graph structure was then obtained using Gephi, a network visualization software that implements ForceAtlas2, a force-directed graph layout algorithm [M. Bastian et al., 2009].</p>
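<p style="text-align: justify; ">The crawl-then-layout pipeline described above can be sketched in miniature. The snippet below is a minimal illustration, not the workshop's actual toolchain: the page names and links are invented, and it stands in for Gephi by implementing the classic Fruchterman-Reingold force-directed layout [Fruchterman and Reingold, 1991], a close relative of ForceAtlas2. Linked pages attract, all pages repel, and the result is that densely interlinked regions of the controversy settle into visible clusters.</p>

```python
import math
import random

# Hypothetical crawl output: directed links between pages on 'Anthropocene'
links = [
    ("crutzen-essay", "igbp-newsletter"),
    ("igbp-newsletter", "stratigraphy-commission"),
    ("science-blog", "crutzen-essay"),
    ("science-blog", "news-daily"),
    ("news-daily", "stratigraphy-commission"),
]
nodes = sorted({n for edge in links for n in edge})

def fruchterman_reingold(nodes, links, iterations=50, width=1.0):
    """Simple force-directed layout: repel all pairs, attract linked pairs."""
    k = width / math.sqrt(len(nodes))  # ideal edge length
    rng = random.Random(42)
    pos = {n: [rng.random(), rng.random()] for n in nodes}
    for step in range(iterations):
        disp = {n: [0.0, 0.0] for n in nodes}
        # Repulsive force between every pair of nodes
        for i, v in enumerate(nodes):
            for u in nodes[i + 1:]:
                dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
                dist = max(math.hypot(dx, dy), 1e-9)
                f = k * k / dist
                disp[v][0] += dx / dist * f; disp[v][1] += dy / dist * f
                disp[u][0] -= dx / dist * f; disp[u][1] -= dy / dist * f
        # Attractive force along each link
        for v, u in links:
            dx, dy = pos[v][0] - pos[u][0], pos[v][1] - pos[u][1]
            dist = max(math.hypot(dx, dy), 1e-9)
            f = dist * dist / k
            disp[v][0] -= dx / dist * f; disp[v][1] -= dy / dist * f
            disp[u][0] += dx / dist * f; disp[u][1] += dy / dist * f
        # Cap each move by a cooling temperature so the layout settles
        t = width * 0.1 * (1 - step / iterations)
        for n in nodes:
            d = max(math.hypot(*disp[n]), 1e-9)
            pos[n][0] += disp[n][0] / d * min(d, t)
            pos[n][1] += disp[n][1] / d * min(d, t)
    return pos

layout = fruchterman_reingold(nodes, links)
for name, (x, y) in sorted(layout.items()):
    print(f"{name:25s} ({x:+.2f}, {y:+.2f})")
```

<p style="text-align: justify; ">In the workshop, of course, the node list ran to hundreds of manually labelled sites and the layout was computed interactively inside Gephi rather than from a script.</p>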
<h3 style="text-align: justify; ">Future Directions</h3>
<p style="text-align: justify; ">Including a layer of geographical representation of the formation and spread of an opinion is a key direction in which opinion mining and controversy mapping are headed. A limiting factor while crawling articles over the web using currently available digital tools is the inaccurate representation of geographical source. An article posted on a popular science blog in India may actually be hosted on a server in California, and this fact is often hidden from our crawler.</p>
<p style="text-align: justify; ">Furthermore, apart from the geographical source of a web article, an interesting direction would be to employ geo-located public opinion interfaces to collect a sample set of public opinion on an issue, across diverse geographical locations, in real time. This would serve as a valuable layer to overlay onto the controversy web map.</p>
<p style="text-align: justify; ">Another constraint of the digital methods referred to herein is the medium-specific approach that does not look beyond the sample space of the internet. Listening to and analyzing social media dynamics on the internet and combing large data sets to churn out a report is not much of a challenge. Cross-media influences on public and political opinion have become increasingly clear, with television broadcasts and newspaper reports directly contributing to discussions on internet forums and websites. Take for instance Bluefin Labs, which started off within the Cognitive Machines group of the MIT Media Lab. Growing out of the Human Speechome Project, which used machine learning algorithms to map relationships between spoken words and their context, Bluefin Labs now applies the same technique to map internet comments and posts to the corresponding audio-visual stimuli in television broadcasts that caused those comments to be made on the web.</p>
<hr />
<h2 style="text-align: justify; ">Video</h2>
<p style="text-align: justify; "><b>Data visualization of connecting the social graph to the TV content graph</b></p>
<p><iframe frameborder="0" height="315" src="http://www.youtube.com/embed/xEZ2W5-l1Zo" width="320"></iframe></p>
<h3 style="text-align: justify; ">References</h3>
<ol>
<li style="text-align: justify; ">Cappi, Alberto (1994). "Edgar Allan Poe's Physical Cosmology". The Quarterly Journal of the Royal Astronomical Society 35: 177–192</li>
<li style="text-align: justify; ">Castells, M. (2000). Materials for an exploratory theory of the network society. British Journal of Sociology Vol. No. 51 Issue No. 1 (January/March 2000).</li>
<li style="text-align: justify; ">Poe, Edgar Allan (1848). Eureka: A Prose Poem.</li>
<li style="text-align: justify; ">Kull, Kalevi (2010). Umwelt. In: Cobley, Paul (ed.), The Routledge Companion to Semiotics. London: Routledge, 348–349.</li>
<li style="text-align: justify; ">Latour, B. et al (2012). “The Whole is Always Smaller Than Its Parts: A Digital Test of Gabriel Tarde’s Monads”. British Journal of Sociology (forthcoming). <a href="http://www.bruno-latour.fr/sites/default/files/123-WHOLE-PART-FINAL.pdf">http://www.bruno-latour.fr/sites/default/files/123-WHOLE-PART-FINAL.pdf</a></li>
<li style="text-align: justify; ">M. Bastian, S. Heymann, and M. Jacomy, “Gephi: an open source software for exploring and manipulating networks,” in International AAAI Conference on Weblogs and Social Media. Association for the Advancement of Artificial Intelligence, 2009.</li>
<li style="text-align: justify; ">M. E. J. Newman, “Analysis of weighted networks,” 2004, arxiv:cond-mat/0407503.</li>
<li style="text-align: justify; ">Reynolds, C. W. (1987) Flocks, Herds, and Schools: A Distributed Behavioral Model, in Computer Graphics, 21(4) (SIGGRAPH '87 Conference Proceedings) pp. 25-34.</li>
<li style="text-align: justify; ">Marois, Rene and Ivanoff, Jason (2005). Capacity limits of information processing in the brain. Trends in Cognitive Sciences, Vol. 9, No. 6, June 2005.</li>
<li style="text-align: justify; ">T. M. J. Fruchterman and E. M. Reingold, “Graph drawing by force-directed placement,” Softw: Pract. Exper., vol. 21 no. 11, pp. 1129–1164, Nov. 1991.</li>
</ol>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/mining-the-web-collective'>http://editors.cis-india.org/internet-governance/blog/mining-the-web-collective</a>
</p>
sharath | Internet Governance | 2013-01-06 | Blog Entry
Human Machine Interfaces: The History of an Uncertain Future
http://editors.cis-india.org/accessibility/blog/human-machine-interfaces-the-history-of-an-uncertain-future
<b>"Multimodal interfaces may be re-engineered much more easily now, and can transform the ways in which the physically, cognitively and sensorially disabled access information and interact with the digital world", says Sharath Chandra Ram.</b>
<p style="text-align: justify; ">Fundamental inspirations in digital information practices sprouted from the hypothetical electro-mechanical device ‘Memex’ proposed by the renowned scientist Vannevar Bush in 1945, who incidentally, as the graduate professor of Claude Shannon, also paved the way for digital circuit design theory. The Memex (Memory + Index) concept entailed a system in which a user could add associative trails to notes, books, communications and audio-visual experiences involving both himself and others. The Memex, in Bush’s view, was to create trails of links through temporal sequences of a person’s subjective experiences, accessible to him (and others) at any time: a sort of augmented and extended memory. So implausible was this ambitious proposal considered that the word ‘<b>vannevar</b>’ entered the hackers’ dictionary as a noun describing something unfeasibly fantastic and imaginative.</p>
<p class="callout" style="text-align: left; ">Wholly new forms of encyclopedias will appear, readymade with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.<a href="#fn1" name="fr1">[1]</a></p>
<p style="text-align: justify; ">The Memex idea had an immediate bearing on the conception of the World Wide Web and also influenced Ted Nelson’s coinage of the ‘hyperlink’, which mapped a single word in a document to other associative content. Douglas Engelbart, inspired by Bush’s essay, invented an interface that aided the very metaphor of pinpointed navigation through hyperlinks: the X-Y Position Indicator, which later came to be known to the world as the computer mouse. Not much has changed since then in the ways humans interact at the interface level. The WIMP paradigm (Windows, Icons, Menus and Pointers) has been here to stay.</p>
<p style="text-align: justify; ">The X-Y indicator that previously mapped motions made on a two-dimensional track pad onto the screen has simply been infused into touch screens. While this may have eased the process of visual design automation, could our interaction be more natural, expressive, immersive and creative? Our experience of the real world is multimodal: we communicate with others using our bodies, hands, visual cues and sound. Is there a way by which our interaction in the virtual world could closely mimic our real-world behavior?</p>
<p style="text-align: justify; ">The answer to the above questions came around the same time that the mouse was invented: Myron Krueger's Videoplace. Unarguably the first and finest immersive virtual reality, created way back in the 1970s, ‘Videoplace’ combined two cultural forces, the television (a purveyor of passive experience) and the computer (a symbol of forbidding technology), to create an expressive medium for communicating playfulness and active participation. Krueger argued that "computer art which ignores responsiveness is using the computer only for visual design automation, rather than as a basis for a new medium." Krueger used image processing and gestural interaction as early as the 70s to interact with virtual objects in the digital world, and he has inspired a whole generation of computer vision artists, including the likes of Golan Levin. If one recalls the seemingly futuristic gestural interface that Tom Cruise used in the film ‘Minority Report’: be assured, it’s already here! Jaron Lanier, a pioneer in virtual reality systems who headed the National Tele-Immersion Initiative, developed the entire working set of interfaces for the film.</p>
<p style="text-align: justify; ">It seems Krueger’s work remained in a niche closet due to the early commercialization and large-scale adoption of the X-Y mouse and touch devices. ‘User-centric design’ has become increasingly device-dependent and really only caters to enticing users towards information that the interface wants to disseminate, rather than letting the user engage with the interface intuitively.</p>
<p style="text-align: justify; ">Today, natural interfacing techniques are regaining much commercial interest. A landmark event was the massively viral <a class="external-link" href="http://www.youtube.com/watch?v=Jd3-eiid-Uw">YouTube video of Johnny Chung Lee</a> hacking the Nintendo Wiimote’s infrared sensor to track the head movement of a user in real time and provide an illusion of three-dimensional virtual reality. Within a year, Microsoft hired Lee to develop the Kinect camera for gestural interaction with its Xbox gaming console, and also bought all assets of 3DV Systems’ 3D sensor ‘ZCam’, until then the most affordable option available to new media artists. Within a week of the Kinect’s release, its drivers were hacked and exposed by the open source art community in response to Adafruit’s USD 2000 Kinect hack challenge.</p>
<p style="text-align: justify; ">With similar gestural devices by ASUS and the much-awaited Leap sensor, we are on the brink of a paradigm shift in the ways of accessing information that shall redefine concepts in human-computer interaction. Cognitive interface solutions by NeuroSky and Emotiv Systems have already paved the way for interactions and games activated by neuronal signals. The OpenEEG project has propelled research into open hardware schematics for brain-computer interaction.</p>
<p style="text-align: justify; ">The linear presentation of search engine results on a browser, across millions of pages, will change this decade, I predict, as the GUI transforms into a four-dimensional space layered in time, with relevant search results clustered onto a connected graph node structure and distanced based on their mutual relevance. This calls for a more natural interface that depends not on traditional keyboard-and-mouse interaction but on intelligent interfaces such as eye-tracking, gaze, gesture, speech and thought waves to sift through large databases that present themselves in totality, along with multimodal feedback, to the user.</p>
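<p style="text-align: justify; ">As a toy illustration of distancing results by mutual relevance: one simple way to turn a flat result list into a graph is to compute a pairwise similarity between result snippets and link any two results that cross a threshold. The sketch below assumes a bag-of-words cosine similarity; the snippets and the 0.3 threshold are invented for illustration, not drawn from any particular search engine.</p>

```python
import math
from collections import Counter

# Hypothetical search-result snippets for some query
results = {
    "r1": "anthropocene climate change human impact",
    "r2": "climate change and human impact on ecosystems",
    "r3": "force directed graph layout algorithms",
    "r4": "graph layout and network visualization algorithms",
}

def cosine(a, b):
    """Cosine similarity between two snippets as bag-of-words vectors."""
    va, vb = Counter(a.split()), Counter(b.split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

# Link any two results whose mutual relevance crosses a threshold,
# so related hits cluster together instead of sitting in a flat list
edges = []
names = list(results)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        w = cosine(results[a], results[b])
        if w > 0.3:
            edges.append((a, b, w))

for a, b, w in edges:
    print(f"{a} -- {b}  relevance {w:.2f}")
```

<p style="text-align: justify; ">On this toy data the two climate snippets link to each other and the two graph-layout snippets link to each other, yielding exactly the kind of clustered node structure envisioned above; a real system would of course use richer relevance signals than word overlap.</p>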
<p style="text-align: justify; ">While all this will transform the ways in which the specially-abled access digital information, such transparent interfaces shall also raise a number of policy questions related to privacy and, who knows, one day even freedom of thought!</p>
<p style="text-align: justify; ">On one hand, we would like to see the price of natural interfaces made affordable to the commoner; on the other, it will require us to unlearn traditional means of information interaction that we have become quite comfortably accustomed to. Until then, it is anyone’s guess what Microsoft’s recent acquisition of Skype, along with the desktop version of the Kinect, will turn bedroom and boardroom interactions into!</p>
<hr />
<p>[<a href="#fr1" name="fn1">1</a>]. Ironically sourced from a present-day Wikipedia article linking to Bush’s 1945 article in The Atlantic Monthly titled “As We May Think”. See <a href="http://goo.gl/4mZKx">http://goo.gl/4mZKx</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/accessibility/blog/human-machine-interfaces-the-history-of-an-uncertain-future'>http://editors.cis-india.org/accessibility/blog/human-machine-interfaces-the-history-of-an-uncertain-future</a>
</p>
sharath | Accessibility | 2013-01-04 | Blog Entry