From iris-controlled film to a new Disney experiment, sound transmission via human touch
IRIS-CONTROLLED FILM
The human iris is effectively an optical fingerprint, perfect for biometric systems to work their pattern-recognition magic. Airports in the UAE have been using iris-recognition tech to filter out unwelcome visitors, while Amsterdam airport Schiphol’s Privium programme offers iris recognition for a more streamlined travelling experience.
But the eye is much more than simply an alternate means of identification – “gaze-tracking” technology allows a user to control an on-screen cursor with eye movements. Swedish firm Tobii currently leads the market with its pricey EyeMobile unit, which runs its own Windows 8 app. Brazilian researcher Katia Vega has also developed an experimental eyeshadow that integrates eye movement with wearable conductive material.
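At its simplest, gaze-to-cursor control means mapping a stream of noisy eye-position estimates onto screen pixels and smoothing out the eye's natural jitter. The sketch below is illustrative only – the function name, coordinate convention and smoothing factor are assumptions, not any vendor's API – but exponential smoothing of this kind is a common way to stabilise a gaze-driven cursor.

```python
# Hypothetical sketch: turning noisy, normalised gaze samples into a
# stable on-screen cursor. Names and parameters are illustrative
# assumptions, not Tobii's (or anyone's) actual API.

def smooth_gaze_to_cursor(gaze_points, screen_w, screen_h, alpha=0.3):
    """Map normalised (0..1) gaze samples to pixel coordinates,
    damping eye jitter with an exponential moving average."""
    cursor = None
    path = []
    for gx, gy in gaze_points:
        x, y = gx * screen_w, gy * screen_h   # normalised -> pixels
        if cursor is None:
            cursor = (x, y)                   # first sample: jump straight there
        else:
            cx, cy = cursor                   # later samples: ease towards target
            cursor = (cx + alpha * (x - cx), cy + alpha * (y - cy))
        path.append((round(cursor[0]), round(cursor[1])))
    return path

# A gaze flicking from one corner towards the other moves the cursor
# only part of the way per sample, which is what keeps it steady.
trace = smooth_gaze_to_cursor([(0.0, 0.0), (1.0, 1.0)], 100, 100, alpha=0.5)
```

A higher `alpha` makes the cursor more responsive but shakier; a lower one makes it smoother but laggier – the same trade-off real gaze interfaces tune.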
The most nuanced cinematic exploration of this visionary new world will probably come from Mike Cahill, the writer and director behind 2011’s Another Earth. Cahill is supposedly working on a thriller revolving around ocular technology – he’s been putting out Twitter requests for information about iris conditions like sectoral heterochromia, and just last December took part in a Creators Project hackathon at the aptly named Eyebeam research centre. Perhaps the film will be a delicate examination of surveillance and a play on reflections, in the neurotic vein of Duncan Jones’s Moon. Manipulating devices with our eyes is the closest we’ll get to telekinetic powers, and the touchscreen revolution has conditioned consumers to expect a fully interactive, tactile interface from every new gadget – so it seems logical that, in the near future, we won’t have to lift a finger.
Text by Alexis Ong

THE FUTURE OF STORYTELLING: GAMING?
Keep reading, even if you were the only one who didn’t stand in an hour’s queue to buy Grand Theft Auto V on opening day. We are living in genre-melding and mind-blowing times in terms of creative visual narratives. And the latest immersive-reality techniques in gaming may provide the greatest hints as to what we can expect from the future of storytelling. According to a study by research consultancy Latitude, as technology becomes more advanced and accessible across multiple platforms, audiences are looking for a blurring of barriers between content and reality in layered yet cohesive executions. “There’s a largely untapped opportunity to allow people to tie stories directly into their own lives – bringing narratives ‘out of the screen’, so to speak, often through meaningful connections with characters,” commented Latitude’s Neela Sakaria.
Among the study’s most fascinating findings: 1) Transmedia is more than media-shifting: 82 per cent of those surveyed wanted complementary mobile apps for their TV-watching experience. 2) The real world is a platform: 52 per cent see the real world as another platform, where 3D technology and augmented reality are expected to link the digital and the physical. 3) Control: 79 per cent expressed a desire to become part of a story and interact with its main characters.
In other words, it’s only natural for us as viewers to expect increasingly higher standards of engagement, integration and synchronicity across all our daily viewing platforms. How? Maybe with something like an Igloo Vision immersive dome simulator, which uses five HD projectors to throw your visual content onto a massive wraparound 360-degree screen, or MSE Weibull’s omni-directional treadmill, which tracks and simulates any movement you make in virtual reality. Just imagine playing the hero in Die Hard, or watching a reality TV show like Survivor while simultaneously participating in it. Or consider realising your dream as a quirky minor character in Twin Peaks (reinventing the story, or experiencing multiple alternative endings) while Extra Dimensional Technologies’ ambient lighting recreates, in real time, the lighting changes appropriate to Lynch’s eerily glamorous atmosphere. The possibilities are endless...
Text by Christine Jun

Günter Seyfried’s bioart-work takes the principle of data moshing (pushing digital video files through software or machines never intended for media playback) and applies it to biology: he maps moving-image data into a gif, encodes the binary data as DNA, implants it into a bacterium’s genome, then extracts the DNA and converts it back into a gif. The result? Data-moshed biomedia! Seyfried’s project is a prototype expedition made possible by the gif format. Nature is the most creative analogue machine we have, and one we’ve so far been unable to document up close. In Seyfried’s work, digital media is data-moshed by the engines of evolution, producing biocinema never seen before.
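The binary-to-DNA step that the piece relies on can be sketched very simply: DNA has four bases, so each base can carry two bits. The mapping below is an illustrative assumption – Seyfried's actual encoding isn't specified, and real DNA data-storage schemes add error correction and avoid long runs of a single base – but it shows the round trip from gif bytes to a base sequence and back.

```python
# Minimal sketch of a two-bits-per-nucleotide encoding. The specific
# mapping (00=A, 01=C, 10=G, 11=T) is an assumption for illustration;
# it is not Seyfried's published scheme.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def bytes_to_dna(data: bytes) -> str:
    """Encode arbitrary bytes (e.g. a gif file) as a DNA base string."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(seq: str) -> bytes:
    """Decode a base string back into the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

header = b"GIF89a"                      # the magic bytes that open a gif file
encoded = bytes_to_dna(header)
assert dna_to_bytes(encoded) == header  # the round trip survives intact
```

In the artwork, of course, the interesting part is what this sketch leaves out: while the data sits in a living genome, mutation and repair can rewrite it, so the gif that comes back is not quite the gif that went in.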
Text by Stephen Fortune
Does the future of sound lie in our fingertips? Disney thinks so. The global conglomerate is making waves in the field of sound with a new technology dubbed “Ishin-Den-Shin” – a Japanese expression for communicating through unspoken mutual understanding. One person speaks into a microphone; the recording is converted into an inaudible signal that travels through the speaker’s body, so that touching another person’s ear turns it back into audible sound. Although it’s unclear how Disney plans to roll this out, sound is about to get physical.
Text by Trey Taylor