The Future of Augmented Reality is in our Hands with Haptics & Touch Screens
There are two questions that I’m often asked: ‘What’s in store for the future of AR?’ and ‘What would you like to see in the future of e-books and tablets?’
My answer to both is haptics and tactile feedback.
In August 2011 I had the pleasure of visiting the Magic Vision Lab at the University of South Australia and experiencing their AR haptics demo. Wearing a head-mounted display (HMD) and using a Phantom stylus, I could feel the scales of a virtual fish that appeared before me. I was able to touch virtual objects and receive tactile feedback, as though they were real, physical objects. This completely threw off my sense of the real: I had difficulty distinguishing between what was real and what was virtual. The experience signified an important shift for me in the medium of AR, because until then the only tactile component of AR was whatever physically existed in our environment.
Image: Haptics Demo from the Magic Vision Lab, University of South Australia.
I was fascinated by the sense of touch and tactile feedback paired with AR. However, I was left desiring a more direct interaction in this experience, without the HMD or stylus.
Enter Senseg’s touch technology for tablets, which premiered at the Consumer Electronics Show (CES) last week. If we can merge this technology with AR, I truly think it can be a game changer, pushing the medium forward in important new ways. To date, AR has been primarily a vision-based medium; we haven’t yet begun to augment touch, and we can’t ignore these other very ‘real’ senses much longer.
In the short video above, Dave Rice, VP of Senseg, discusses the technology as adding tactile effects to touch-screen displays, including smartphones, tablet computers, touch pads and gaming devices. He discusses the possibilities for gaming applications (I personally think this would be incredible to apply to storytelling as well) and describes a treasure hunt game in which a treasure chest is hidden and can only be found by feeling around on the screen. Dave says, “There were no visual cues there and that’s pretty exciting because now we can move to the world of feel to complement what you’re seeing, or to work independently from it and really create a new world to explore.”
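As a thought experiment, the “find it by feel” mechanic Dave describes could be sketched as a simple distance-to-target mapping: the closer your finger gets to the hidden treasure, the stronger the tactile effect. This is purely my own illustration of the idea, not Senseg’s actual API; all names and numbers here are hypothetical.

```python
import math

def haptic_intensity(touch_x, touch_y, target_x, target_y, radius=50.0):
    """Map the finger's distance from a hidden on-screen target to a
    tactile feedback strength in [0.0, 1.0]: strongest directly over
    the target, fading to zero beyond `radius` pixels."""
    dist = math.hypot(touch_x - target_x, touch_y - target_y)
    return max(0.0, 1.0 - dist / radius)

# Feeling around the screen: intensity rises as the finger nears the chest.
print(haptic_intensity(100, 100, 100, 100))  # directly over it -> 1.0
print(haptic_intensity(125, 100, 100, 100))  # 25 px away -> 0.5
print(haptic_intensity(300, 300, 100, 100))  # far away -> 0.0
```

In a real game the returned strength would drive the display’s tactile layer on each touch event, letting a player home in on the treasure with no visual cues at all.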
For me this perfectly describes the future of AR and its potential. I think about this last quote and how it applies to my recent AR Popup Book “Who’s Afraid of Bugs?”, the first AR book designed for iPad 2. For me the next step in this book is to be able to touch and feel the texture of the virtual spider that magically appears. Imagine petting the spider and feeling each tiny hair.
(Also with today’s announcement from Apple on iBooks textbooks for iPad, “a new kind of textbook that’s dynamic, current, engrossing, and truly interactive”, imagine how haptics and tactile feedback could change the future of education in e-books, as well as AR. Talk about ‘bringing the curriculum alive’.)
Bret Victor’s “A Brief Rant on the Future of Interaction Design” is an excellent, highly relevant article in which he asks us to aim for a “dynamic medium that we can see, feel, and manipulate”. Bret’s article immediately resonated with me when I read it in November, and I shared it with the AR community via Twitter as something important we needed to be aware of and really work towards.
Image: Bret Victor
Bret emphasized the use of our hands to feel and manipulate objects. He writes, “The sense of touch is essential to everything that humans have called ‘work’ for millions of years.”
“Now, take out your favorite Magical And Revolutionary Technology Device. Use it for a bit. What did you feel? Did it feel glassy? Did it have no connection whatsoever with the task you were performing?” Bret calls this technology, “Pictures Under Glass”, noting that it sacrifices “all the tactile richness of working with our hands”.
Bret links to research that’s been around for decades in haptics, tangible-user interface (TUI) and even Touchable Holography. He comments on how this research has always been marginalized, but that “maybe you can help”.
AND WE CAN. As Bret so wonderfully states, the most important thing about the Future is that it is a choice. As an AR industry and community, it is our choice as to how this medium evolves. “People choose which visions to pursue, people choose which research gets funded, people choose how they will spend their careers.”
Let’s do this. Kindly get in touch if you’re keen to collaborate!
Let’s also continue the conversation in the comments area below and on Twitter; find me, I’m @ARstories.
*Hat tip to Stephen Ancliffe for sharing the Senseg video with me.
UPDATE (March 6, 2012): The Next Web and The Guardian published articles on haptics/touch-feedback possibly being the secret feature of the iPad 3. Could AR’s tactile future be here very, very soon? Let’s hope so. My original post was published on January 19, 2012.
UPDATE (March 7, 2012): Sadly, haptics didn’t make it into Apple’s big “New iPad” announcement today. Let’s hope for touch-feedback capabilities in the next iPad. Now, that would truly be a magical device.