Dr. Geek: The $6 Million Contact Lens: The Marriage of Cybernetics and Augmented Reality
You’re in the middle of Hyderabad, India, on the hottest day of the year. You’re there on a combined business and pleasure trip, researching Bollywood culture for your book on globalized fandom. You’re looking for the easiest way to get to Ramoji Film City, a Bollywood answer to Disneyland. You would ask someone, but your previous attempts at handling the region’s languages, Telugu and Urdu, have not gone very well. Normally you would whip out your iPhone 4S and tap along to find directions. But the battery ran out, and the adapter you brought couldn’t adapt well to Indian weather. Luckily, you have your back-up system in place and working.
Ducking into a relatively quiet corner, you give a command to the Bluetooth receiver hooked over your ear: “Directions to nearest bus stop for bus to Ramoji Film City.” Within seconds, the world before your eyes changes. You still see the hustle and bustle, the lights and colors, of Hyderabad, only now there is more. The streets and buildings are outlined by a green grid, with their names lighting up as labels. Far in the distance a green triangle pulses slowly, and an arrow points away from you toward the marker. A distance gauge tells you there are 639 feet separating you from the marker. It’s as if the world of your favorite first-person shooter has been overlaid onto the world you are standing in. You step forward, the distance gauge begins ticking down, and you feel more confident than you have all day.
Augmented reality is the layering of digital information onto how we perceive the physical world. Originally developed in the 1990s, augmented reality is often associated with virtual reality. Virtual reality is the attempt to bring a person into a virtual environment via computer-controlled sensory inputs, chiefly sight, sound, and touch. Augmented reality brings that virtual environment out into the physical world. In both cases, our perception of reality is being modified, or augmented, by some type of computer mediation.
Residents of the United States are probably most familiar with such augmentation from the broadcasting of their favorite sports. For example, since 1998, US television networks have used a yellow line to indicate how far a team must go to achieve a “first down” in American football. Such representations of information are computer mediated to help the broadcast audience understand their team’s predicament, upping the excitement of watching the game. As a Green Bay Packers fan since 1992, I can remember the time before these augmentations appeared, and how helpful they have been for understanding what is going on during any particular game.
Augmented reality most commonly comes in one of two forms. One form uses specially encoded markers (typically QR codes) printed on paper, boards, or gaming pieces that, when seen through a computer-controlled camera, produce images and animations on the computer screen. GE’s ecomagination Smart Grid campaign requires you to print a code that produces the illusion of a holographic wind farm when the code is seen through your computer’s webcam. Georgia Tech’s Augmented Environments Lab and the Savannah College of Art and Design produced a board game that, when seen through an app-equipped smartphone, involves killing animated zombies in a downtown metropolis.
A tie-in game for James Cameron’s 3D film Avatar uses a series of encoded gaming pieces that, when seen through a computer’s camera, become animated weapons and warriors.
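To make the marker-based form concrete, here is a minimal sketch of the idea in Python, using OpenCV’s built-in QR-code detector. It only draws a green outline and a text label where a full application like the ones above would render a 3D wind farm or animated zombies; the webcam index and the fallback label are assumptions for illustration, not details of any of those products.

```python
# Minimal marker-based AR sketch: find a QR code in the webcam feed and
# draw an overlay where a real AR app would render 3D graphics.
import cv2
import numpy as np

detector = cv2.QRCodeDetector()
cap = cv2.VideoCapture(0)  # default webcam (assumed device index)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    data, points, _ = detector.detectAndDecode(frame)
    if points is not None:
        corners = points.reshape(-1, 2).astype(np.int32)
        # Outline the detected marker; a full AR app would project a
        # textured 3D model onto this quadrilateral instead.
        cv2.polylines(frame, [corners], isClosed=True,
                      color=(0, 255, 0), thickness=3)
        label = data if data else "marker"
        cv2.putText(frame, label, (int(corners[0][0]), int(corners[0][1])),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)

    cv2.imshow("augmented view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```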
In the opening hypothetical story to this piece, the augmentation involves adding digitally stored and retrieved information to the information we perceive through our senses, chiefly the sense of sight. A screen mediates our view of the world, and the digital information is rendered on that screen, creating the illusion that it has merged with the physical world. If the screen is linked with a camera, GPS receiver, or other sensor that tracks the screen’s position in relation to the physical world, then the information on the screen can change as the screen moves between the world and your eyes. That way, the information on the screen can be relevant to the world as we see it, when we see it, and where we see it. The information can serve various purposes, from learning about a historical event, to finding directions, to battling TIE Fighters as they attack Manhattan.
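The position-tracking side of that description boils down to a little spherical geometry. Here is a rough Python sketch, assuming we already have the device’s GPS fix and a target coordinate, of the numbers behind the distance gauge and direction arrow from the opening story; the coordinates below are made-up placeholders, not actual locations.

```python
# Sketch of the geometry behind a GPS-driven AR overlay: given the
# device's position and a target, compute the distance readout and the
# compass bearing the on-screen arrow should point along.
from math import radians, degrees, sin, cos, asin, atan2, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Return (distance in meters, initial bearing in degrees from north)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)

    # Haversine formula for great-circle distance.
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    dist = 2 * EARTH_RADIUS_M * asin(sqrt(a))

    # Initial bearing toward the target, normalized to 0-360 degrees.
    y = sin(dlam) * cos(phi2)
    x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlam)
    bearing = (degrees(atan2(y, x)) + 360) % 360
    return dist, bearing

# Placeholder coordinates: the device's current fix and a nearby bus stop.
here = (17.3850, 78.4867)
bus_stop = (17.3868, 78.4875)
meters, heading = distance_and_bearing(*here, *bus_stop)
print(f"{meters * 3.28084:.0f} feet ahead, bearing {heading:.0f} degrees")
```

A phone, or a pair of smart lenses, would redo this calculation every time the GPS and compass report a new reading, which is what makes the gauge tick down as you walk.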
Right now, a variety of apps have been designed to provide this type of augmented reality. Given the requirement that the screen sit between the person and the world, it is no surprise that most augmented reality shows up on mobile computing devices, such as tablets and smartphones. But what if you could have the portability of these devices, only more so?
For several years now, though it only received its fifteen minutes of media fame in November 2011, a research project at the University of Washington has been working on perfecting a particular type of smart clothing to bring us closer than we have ever been to having bionic eyes. Led by electrical engineering professor Babak Parviz, the team is attempting to build contact lenses with integrated circuitry, essentially cybernetic lenses. The circuits would contain radio receiver and transmitter arrays to link the lenses to databases of information, even if that database is just the smartphone or other mobile computing device the person is carrying. The circuitry would also provide an LED projector array to turn part of the lens into a screen on which the information is displayed.
Basically, they and other researchers around the world are doing work that could result in the type of computer displays the Terminator and RoboCop were built with, the types of displays augmented reality apps have been putting onto our mobile computing devices over the past several years. We’ve seen such speculative technology more recently envisioned, and perhaps influenced by this research, in the contact lenses alternately worn by Josh Holloway and Jeremy Renner in Mission: Impossible Ghost Protocol. In the most recent installment of the franchise, our heroes wear contact lenses with integrated circuits that allow them to capture images, like a still camera, and wirelessly transmit those images to a networked device. The lenses were only one of the many futuristic devices used in the movie, and one of three augmented reality techniques, alongside a holographic masking system and an AR system embedded in a car windshield; the last of these is also getting closer to arriving at a dealership near you.
This technology becomes possible because of the hyper-trend of miniaturization (see Moore’s Law), the increased interest in embedded smart clothing and accessories, the overall shift toward mobile and cloud computing, and improved screen-resolution technology. However, one hurdle for the University of Washington team was power. While batteries have greatly improved within the past decade, permitting longer-lasting mobile devices and better-performing electric cars, the researchers were stumped on how to provide the battery life necessary to fuel the lenses.
The answer appears to lie in the science of piezoelectrics. If you haven’t heard much about this science, you probably will in the future, as piezoelectrics come up whenever the discussion turns to creating newer forms of mobile technology, from touchscreens to batteries. The fundamental idea is that piezoelectric materials can convert mechanical energy, such as the kinetic energy your body produces as it moves, into electrical energy. While we probably need to make sure we don’t end up as batteries for a future race of AI robots, a la The Matrix, the science of piezoelectrics opens up the playing field for new ways to fuel our love of mobile computing gadgets while potentially reducing the damage done to the planet because of that love.
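To get a feel for the scale involved, here is a back-of-envelope sketch of the energy one piezoelectric element might yield per compression, using Q = d33 × F for the generated charge and E = Q² / (2C) for the stored energy. The material coefficient, force, and capacitance are generic, textbook-scale assumptions, not figures from the University of Washington project.

```python
# Back-of-envelope estimate of the energy harvested from one squeeze of
# a piezoelectric element. All values are illustrative assumptions.
D33 = 400e-12           # piezoelectric charge coefficient, C/N (typical PZT ceramic)
FORCE_N = 700           # force of a footstep, roughly one body weight in newtons
CAPACITANCE_F = 100e-9  # capacitance of the element in farads (assumed)

charge = D33 * FORCE_N                        # Q = d33 * F
energy_j = charge ** 2 / (2 * CAPACITANCE_F)  # E = Q^2 / (2C)

print(f"charge per press: {charge * 1e9:.0f} nC")
print(f"energy per press: {energy_j * 1e6:.2f} microjoules")
```

Even with these rough numbers, a single press yields well under a microjoule, which is why practical harvesting schemes lean on many elements and the constant, repetitive motion of a moving body rather than any single squeeze.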
The future is miniaturization and naturalization: further blurring the line between Man and Machine, and bringing our tools for engaging with the world into the naturalness of how we engage the world through our bodies. These “$6 Million Lenses” are an indication of the cybernetic future that awaits us as we seek easier ways of accessing the world of information that is physically and virtually all around us. So far, these lenses are prototypes, not yet capable of displaying very detailed information on their basic LED screens. But at least the rabbits they are being tested on do not have a problem wearing them. And in five years, perhaps we won’t, either.
Here’s Huffington Post with more information: http://www.huffingtonpost.com/2012/02/03/virtual-reality-contact-lenses_n_1252481.html