Human Machine Interfaces: The History of an Uncertain Future
Many fundamental ideas in digital information practice trace back to the Memex, a hypothetical electro-mechanical device proposed by the renowned scientist Vannevar Bush in 1945. Bush, incidentally, supervised Claude Shannon's graduate work at MIT and thereby also helped pave the way for digital circuit design theory. The Memex (Memory + Index) concept entailed a system in which a user could add associative trails to notes, books, correspondence and audio-visual experiences, both his own and other people's. In Bush's view, the Memex would create trails of links through the temporal sequence of a person's subjective experiences, accessible to him (and to others) at any time: a sort of augmented and extended memory. So implausible was this ambitious proposal considered that the word 'vannevar' reportedly entered the lexicon as a noun for something unfeasibly fantastic and imaginative.
Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.[1]
The Memex idea had a direct bearing on the conception of the World Wide Web and also influenced Ted Nelson's coinage of the 'hyperlink', which maps a single word in a document onto other associative content. Douglas Engelbart, likewise inspired by Bush's essay, invented an interface that served this very metaphor of pinpointed navigation through hyperlinks: the X-Y position indicator, later known to the world as the computer mouse. Since then, not much has changed in how humans interact at the interface level. The WIMP paradigm (Windows, Icons, Menus and Pointers) is here to stay.
The X-Y indicator, which once mapped motion across a two-dimensional surface onto the screen, has simply migrated onto touch screens. While this may have eased the process of visual design automation, could our interaction be more natural, expressive, immersive and creative? Our experience of the real world is multimodal: we communicate with others using our bodies, our hands, visual cues and sound. Is there a way our interaction in the virtual world could more closely mimic our real-world behavior?
The answer to the above questions came around the same time that the mouse was invented: Myron Krueger's Videoplace. Arguably the first and finest immersive virtual reality system, created back in the 1970s, Videoplace combined two cultural forces, the television (a purveyor of passive experience) and the computer (a symbol of forbidding technology), into an expressive medium for playfulness and active participation. Krueger argued that "computer art which ignores responsiveness is using the computer only for visual design automation, rather than as a basis for a new medium." He was using image processing and gestural interaction to manipulate virtual objects in the digital world as early as the 1970s, and he has inspired a whole generation of computer-vision artists, including the likes of Golan Levin. If one recalls the seemingly futuristic gestural interface Tom Cruise used in the film 'Minority Report', be assured it is already here: Jaron Lanier, a pioneer of virtual reality systems who headed the National Tele-immersion Initiative, was among those who developed working prototypes of the film's interfaces.
It seems Krueger's work remained niche, overshadowed by the early commercialization and large-scale adoption of the X-Y mouse and touch devices. 'User-centric design' has become increasingly device-dependent, and too often it merely entices users toward the information the interface wants to disseminate rather than letting them engage with the interface intuitively.
Today, natural interfacing techniques are regaining commercial interest. A landmark event was the massively viral YouTube video of Johnny Chung Lee hacking the Nintendo Wiimote's infrared camera to track a user's head movement in real time and create the illusion of three-dimensional virtual reality. Within a year, Microsoft had hired Lee to work on the Kinect camera for gestural interaction with its Xbox gaming console, and it also bought the assets of 3DV Systems, whose ZCam 3D sensor had until then been the most affordable option available to new-media artists. Within a week of the Kinect's release, its drivers had been hacked and published by the open-source art community in response to Adafruit's USD 2000 open Kinect challenge.
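To make the idea behind Lee's hack concrete, here is a minimal Python sketch of the geometry involved. It is an illustration only, not Lee's actual code: the field-of-view, sensor-resolution and LED-spacing constants are assumed values, and a real implementation would read blob coordinates from the Wiimote and feed the estimated head position into an off-axis projection.

```python
import math

# Assumed, illustrative constants (not taken from Lee's implementation).
IR_CAMERA_FOV_X = math.radians(33)   # approximate horizontal field of view of the Wiimote IR camera
IR_SENSOR_WIDTH = 1024               # the Wiimote reports IR blob positions on a 1024x768 grid
IR_SENSOR_HEIGHT = 768
LED_PAIR_WIDTH_MM = 200              # assumed physical spacing of the two IR LEDs worn by the user

def head_position_from_ir(p1, p2):
    """Estimate head position (x, y, z in mm, relative to the sensor)
    from two tracked IR blob coordinates."""
    radians_per_pixel = IR_CAMERA_FOV_X / IR_SENSOR_WIDTH
    # Angular size of the LED pair as seen by the camera.
    pixel_dist = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    angle = radians_per_pixel * pixel_dist
    # Distance to the head follows from trigonometry on the known LED spacing.
    z = (LED_PAIR_WIDTH_MM / 2.0) / math.tan(angle / 2.0)
    # The midpoint of the two blobs gives the head's offset from the camera axis.
    mid_x = (p1[0] + p2[0]) / 2.0 - IR_SENSOR_WIDTH / 2.0
    mid_y = (p1[1] + p2[1]) / 2.0 - IR_SENSOR_HEIGHT / 2.0
    x = math.tan(radians_per_pixel * mid_x) * z
    y = math.tan(radians_per_pixel * mid_y) * z
    return x, y, z

# The renderer then shifts its virtual camera by (x, y, z) each frame so that
# the physical screen behaves like a window into the 3D scene.
print(head_position_from_ir((400, 380), (620, 390)))
```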
With similar gestural devices from ASUS and the much-awaited Leap Motion sensor, we are on the brink of a paradigm shift in the way we access information, one that will redefine concepts in human-computer interaction. Cognitive interface products from NeuroSky and Emotiv Systems have already paved the way for interactions and games driven by neural signals, and the OpenEEG project has propelled research into open hardware schematics for brain-computer interaction.
The linear presentation of search engine results across millions of browser pages will, I predict, change this decade: the GUI will transform into a four-dimensional space layered in time, with relevant search results clustered into a connected graph of nodes spaced according to their mutual relevance. This calls for a more natural interface that depends not on traditional keyboard-and-mouse interaction but on intelligent modalities such as eye tracking, gaze, gesture, speech and thought waves to sift through large databases that present themselves in their totality, with multimodal feedback to the user.
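As a rough illustration of what such relevance-based clustering might involve (my own sketch, not a description of any existing search engine), the following Python snippet scores pairwise relevance between a few hypothetical result snippets using TF-IDF cosine similarity and turns the scores into graph edges whose lengths shrink as relevance grows; a force-directed layout of such a graph would place related results close together.

```python
from itertools import combinations
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical result snippets; in practice these would come from a search backend.
results = {
    "r1": "memex associative trails and hypertext",
    "r2": "vannevar bush memex as we may think",
    "r3": "kinect depth camera gestural interaction",
    "r4": "wiimote infrared camera head tracking",
}

# Vectorise the snippets and measure pairwise relevance as cosine similarity.
ids = list(results)
tfidf = TfidfVectorizer().fit_transform(results[i] for i in ids)
sim = cosine_similarity(tfidf)

# Build graph edges: distance = 1 - similarity, so closely related results sit
# near each other when the graph is laid out in space.
THRESHOLD = 0.05  # assumed cut-off below which results are treated as unrelated
edges = []
for a, b in combinations(range(len(ids)), 2):
    if sim[a, b] > THRESHOLD:
        edges.append((ids[a], ids[b], 1.0 - sim[a, b]))

for a, b, dist in sorted(edges, key=lambda e: e[2]):
    print(f"{a} -- {b}: distance {dist:.2f}")
```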
While all of this will transform the ways in which the specially-abled access digital information, such transparent interfaces will also raise a number of policy questions about privacy and, who knows, one day perhaps even freedom of thought!
On the one hand, we would like to see natural interfaces priced within reach of the common user; on the other, they will require us to unlearn the traditional modes of information interaction we have grown quite comfortably accustomed to. Until then, it is anyone's guess what Microsoft's recent acquisition of Skype, along with the desktop version of the Kinect, will turn bedroom and boardroom interactions into!
[1] Ironically, sourced from a present-day Wikipedia article linking to Bush's 1945 essay "As We May Think" in The Atlantic Monthly. See http://goo.gl/4mZKx