Her bio-alarm clock wakes her up gently; she feels renewed. Her glasses give her the stats on her sleep, syncing with the accelerometer in her smartwatch. She’s pleased with the 84% sleep quality rating. She takes a moment to scan her REM log, adding a voice note with the fantastical things she saw in her dreams.
She rises and begins the day with her yoga practice, to stretch mind and body. Her EEG readings are excellent, with a high spike in alpha brain wave activity. She opts for some music from her library based on her heart rate. She smiles when The xx come on over the speakers.
She heads into the shower, but not before removing her glasses (she snickers, thinking only ‘white men wearing Google Glass’ actually shower with them on).
Dressed, she walks into the kitchen to make a light breakfast. The Wall Street Journal, The New York Times, and Wired Magazine instantly light up before her, offering the day’s headlines and articles relevant to her week ahead. Her serendipity app brings in a few articles from news sources she might not otherwise read. She tags the one she likes ‘more like this’.
She fields a phone call from her boss while drinking her coffee and is able to reference both his recent email and an article in the morning’s paper, which, of course, impresses him.
She walks to work while her glasses provide her with shopping deals of the day. She receives a popup reminder of a friend’s upcoming birthday with a photo of the blouse her friend liked when they were last window shopping together.
She enters a crowded, all male conference room, comfortably taking a seat in the middle of the investors’ meeting. As she scans the room, her glasses use facial recognition to provide her with everyone’s name and LinkedIn profile. She greets every member by name, asking about their recent projects, ventures, and successes as provided to her by her glasses. They’re noticeably impressed. When she assumes lead on the group presentation, her glasses sync to her visual presentation. She had used her voice analyzer app last night to rehearse the presentation and tone. She walks the room with confidence and exits a dazzled crowd.
It’s been a long but fruitful day. She heads out for a cocktail after work. The doors swing open into a crowded lounge and she walks in alone, but not unnoticed. She checks tomorrow’s calendar with her glasses while ordering a drink. A handsome stranger approaches her, asking for her name. Her glasses immediately alert her that he’s 35 and a successful broker whose address matches his mother’s house. She smiles and walks away. Well, at least he didn’t pull the ‘So, are you a Gemini?’ line. She laughs to herself. Her attention shifts to a guy across the bar; he’s dressed casually, but locks eyes and smiles. Her glasses tell her he’s 31, a painter, and a volunteer at the children’s hospital art wing. She sends him a friend request with her glasses along with a message, “Always wanted to learn how to paint….Helen”, smiles, and walks away.
She returns home and checks her email on her glasses. She has several emails congratulating her on a fantastic presentation and requesting meetings. She replies by checking her availability on her calendar, which appears next to the message on her glasses. The painter from the bar replies, “Anytime ;)” She takes off her glasses and turns out the light.
[Hat tip and very gracious bows to James C. Nelson and Dr. Caitlin Fisher]
Augmented Reality eyewear and Google’s Glass will take us to new heights, quite literally: the first sequence in the “How It Feels [through Glass]” video was shot via Glass in a hot air balloon.
It was 155 years ago, in 1858, that the first aerial photograph was taken on a balloon flight over Paris, France by the artist Nadar (born Gaspard-Félix Tournachon, 1820–1910). A pioneer in the newly emerging medium of photography, Nadar also attempted underground photography, using artificial light to produce pictures of the catacombs and sewers of Paris. Nadar’s technical experiments and innovation took us, via his camera, to places that were previously inaccessible to photography, inspiring new ways of seeing and capturing our world.
AR eyewear and Glass offer this same opportunity at a time when AR is emerging as a new medium, which will give way to novel conventions, stylistic modes, and genres. Referencing Dr. Seuss’s book in the title of this article, AR also promises to transport us to wondrous, magical places we’ve yet to see.
This article is a follow-up to a post I wrote a year ago posing the questions: Will Google’s Project Glass change the way we see and capture the world in Augmented Reality (AR), and what kind of new visual space will emerge?
As both a practitioner and PhD researcher specializing in AR for nearly a decade, my interests are in how AR will come to change the way we see, experience, and interact with our world, with a focus on the emergence of a new media language of AR and storytelling.
I’ve previously identified Point-of-View (PoV) as one of “The 4 Ideas That Will Change AR”, noting the possibilities for new stylistic motifs to emerge based on this principle. I’d like to revisit the significance of PoV in AR at this time, particularly with the release of Google Glass Explorer Edition. PoV, more specifically, “Point-of-Eye”, is a characteristic of AR eyewear that is beginning to impact and influence contemporary visual culture in the age of AR.
Image: Google Glass
AR eyewear like “Glass” (2013) and Steve Mann’s “Digital Eye Glass” (EyeTap) (1981) are worn in front of the human eye, serving as a camera to both record the viewer’s environment and superimpose computer-generated imagery atop the present environment. With the position of the camera, such devices present a direct ‘Point-of-Eye’ (PoE), as Mann calls it, providing the ability to see through someone else’s eyes.
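At its core, the superimposition such devices perform is the classic ‘over’ compositing operation: each virtual pixel is blended atop the live camera pixel according to its opacity. A minimal, purely illustrative sketch in Python (real AR pipelines run this per frame on the GPU, with tracking and registration aligning the overlay to the world):

```python
def blend_pixel(camera_rgb, overlay_rgba):
    """Classic 'over' compositing of one virtual pixel atop one camera pixel.

    Per channel: out = alpha * overlay + (1 - alpha) * camera.
    """
    r, g, b, a = overlay_rgba
    alpha = a / 255.0
    return tuple(
        round(alpha * o + (1.0 - alpha) * c)
        for o, c in zip((r, g, b), camera_rgb)
    )

def superimpose(camera_frame, overlay_frame):
    """Composite an RGBA overlay frame atop an RGB camera frame, pixel by pixel."""
    return [
        [blend_pixel(c, o) for c, o in zip(cam_row, ov_row)]
        for cam_row, ov_row in zip(camera_frame, overlay_frame)
    ]

# A 1x2 'camera frame' of mid-grey pixels; the overlay draws one fully
# opaque red pixel and leaves the other fully transparent.
camera = [[(128, 128, 128), (128, 128, 128)]]
overlay = [[(255, 0, 0, 255), (0, 0, 0, 0)]]
result = superimpose(camera, overlay)
# result[0][0] is the virtual red pixel; result[0][1] is untouched camera grey
```

Where the overlay is transparent, the wearer simply sees the world; where it is opaque, the computer-generated imagery takes over — that is the whole trick of ‘augmenting’ a Point-of-Eye view.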
AR eyewear like Glass remediates the traditional camera, aligning our eye once again with the viewfinder, enabling hands-free PoE photography and videography. Eye am the camera.
Contemporary mass-market digital photography has us forever looking at a screen as we document an event, rather than seeing or engaging with the actual event. As comedian Louis C.K. so facetiously points out, we are continually holding up a screen to our faces, blocking our vision of the actual event with our digital devices. “Everyone’s watching a shitty movie of something that’s happening 10 feet away,” he says, while the “resolution on the actual thing is unbelievable”.
Glass presents an opportunity where your experience in that moment is documented as is without having to stop and grab your camera. Glass captures what you are seeing as you see it through PoE, very close to how you are seeing it. Google Co-Founder Sergey Brin states, “I think this can bring on a new style of photography that allows you to be more intimate with the world you are capturing, and doesn’t take you away from it.”
Image: Recording video with Google Glass, “Record What You See. Hands Free.”
I agree with Brin; Glass will bring on new stylistic modes and conventions through PoE, which also appears to be influencing other mediums outside of AR.
Take for instance the viral Instagram series “Follow Me” by Murad Osmann featuring photographs of his hand being led by his girlfriend to some of the world’s most iconic landmarks.
Photographs by Murad Osmann, “Follow Me” series, 2013.
(Similar in style to the above video recording visual from Google Glass of a ballerina taking and leading the viewer’s hand.)
The article “How Will Google Glass Change Filmmaking?” identifies two other examples in contemporary music videos: the viral first-person music video for the band Biting Elbows and the award-winning video for Cinnamon Chasers’ song “Luv Deluxe”.
In “The Cinema as a Model for the Genealogy of Media” (2002), André Gaudreault and Philippe Marion state, “The history of early cinema leads us, successively, from the appearance of a technological process, the apparatus, to the emergence of an initial culture, that of ‘animated pictures’, and finally to the constitution of an established media institution” (14). AR is currently in a transition period from a technological process to the emergence of an initial AR culture, one of ‘superimposed pictures’, with PoE as a characteristic of the AR apparatus that will impact stylistic modes, both inside and outside the medium, contributing to a larger Visual Culture.
Gaudreault and Marion identify the key players in this process as the inventors responsible for the medium’s appearance, camera operators for its emergence, and the first film directors for its constitution. ‘Camera operators’ around the world are beginning to contribute to AR’s emergence as a medium, and through this process, towards an articulation of a media language of AR. Mann, described as the father of wearable computing, has been a ‘camera operator’ since the 1990s. In 2013, Google Glass’s early adopter program selected 8,000 ‘camera operators’ to explore these possibilities, and Kickstarter proposals for PoE film projects, including both documentaries and dramas, have since appeared from directors. What new stories will the AR apparatus enable? Like cinema before it, what novel genres, conventions, and tropes will emerge in this new medium towards its constitution?
Let’s continue the conversation on Twitter: I’m @ARstories.
I’m greatly honoured to be named among the NEXT 100 Top Influencers of the Digital Industry in 2013! A heartfelt thank you to NEXT Berlin, and the independent jury of experts. Very flattered to be included on this wonderful list of trailblazers.
I’m very pleased to write about my big news this month: the announcement of my role as Chief Innovation Officer at Infinity Augmented Reality (AR) in New York City. I can’t wait to share what we’ve been working on at Infinity, stay tuned to www.infinityar.com often for updates. There are some very exciting things just around the corner.
Nearing the completion of my doctoral degree specializing in Augmented Reality, I will continue to share my PhD research and musings here at Augmented Stories. Also be on the lookout for more writing and future forecasting from me at Infinity, with a format dedicated to the future of AR we are calling “What If…?”. You can subscribe to the Infinity newsletter here, and we’ll share the details with you when we launch this special section very soon.
Moving from hand-held AR devices to eyewear, we can now become hands-free and fully immerse ourselves in the AR experience. We are no longer holding up a looking glass, such as a tablet or a smartphone, as a portal into another world; we are now truly seeing. We are there; it is our reality.
AR will enable us to experience, in real time, events and places we may not otherwise have access to. Imagine participating in a live concert or sporting event, or visiting the destination of your choice, all from the comfort of your own home through AR. The stereoscope of the Victorian era created the concept of the “armchair traveller”: the ability to ‘travel’ to distant lands from one’s home by looking at 3D photographs through a special viewing device. Film and television extended this notion of being transported to other worlds through the moving picture and live telecasts viewed on dedicated screens, and now, with AR, we can bring the world to us anywhere and anytime, appearing in our space and reality as a highly personalized experience.
AR is no longer a technology only accessible to a select few, but a mass medium that is being experienced by consumers on a daily basis.
Thank you each for your incredible support over the past 8 years and all of the warm congratulatory wishes on my new position. I can’t wait to share our augmented future with you. It’s going to be pretty marvellous.
My best to you each,
Nick Bilton of the New York Times reported yesterday that Apple is “experimenting with wristwatch-like devices made of curved glass”.
(View more Apple smart watch concepts on Mashable)
Bilton posed several questions about the watch including, “If the company does release such a product, what would it look like? Would it include Siri, the voice assistant? Would it have a version of Apple’s map software, offering real-time directions to people walking down the street? Could it receive text messages? Could it monitor a user’s health or daily activity?”
The big question on my mind is: Will Apple’s rumoured smart watch be Augmented Reality (AR) enabled? (Google already has a patent on a smart watch incorporating a flip up display, see below.)
After all, a truly ‘smart’ smart watch would be responsive to context. Enter AR. Creating contextual experiences is one of the unique capabilities that will distinguish AR from other technologies and mediums. The best AR scenarios will be context driven and engage users in meaningful and compelling experiences that are specific to the individual’s unique circumstance or environment.
These experiences need not be limited to AR-equipped devices such as tablets, smartphones, or eyewear. A watch is another viable contender.
Image: Google patent “Smart watch including flip up display”
In fact, Google patented a smart watch last year including a flip up display that appears to have AR functionalities. Could Apple be experimenting with something similar?
One of the examples in Google’s patent describes the figure above, which “includes an application where a user receives product information from the smart-watch”. The experience entails the user opening up the flip up portion and capturing an image of the desired product on the camera. The “inside display of the flip up portion may form an optical viewfinder for the camera. Therefore, the image may be seen on the inside display by the user.” The patent also states that “Product information may be retrieved in a variety of ways including, but not limited to, bar code scanning of the product or image analysis of the product.” Hello AR.
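The patent's product-information flow can be summarized as: frame the product in the flip-up viewfinder, capture an image, decode a bar code (or run image analysis), and look up the product. A toy sketch of that flow, where `decode_barcode` and `PRODUCT_DB` are hypothetical stand-ins rather than any real Google API:

```python
# Hypothetical product database keyed by bar code (e.g. an EAN-13 string)
PRODUCT_DB = {
    "0123456789012": {"name": "Espresso Maker", "price": "$79.99"},
}

def decode_barcode(captured_image):
    # Placeholder: a real decoder would locate and read the bar pattern
    # in the captured camera image. Here the 'image' is a dict stand-in.
    return captured_image.get("barcode")

def product_info(captured_image):
    """Retrieve product info from a captured image, per the patent's flow."""
    code = decode_barcode(captured_image)
    if code is None:
        return None
    return PRODUCT_DB.get(code)

# Simulate capturing an image whose bar code decodes successfully
info = product_info({"barcode": "0123456789012"})
# An image with no decodable bar code yields no product info
miss = product_info({})
```

The interesting part is less the lookup itself than where the result lands: on a wrist-worn display framing the physical product — contextual information anchored to the thing in front of you, which is exactly the AR promise.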
To return to Bilton’s NYT article, one of the questions he posed about Apple’s rumoured smart watch was, “Could it monitor a user’s health or daily activity?” This important question around context, and tailoring a unique response and experience to the individual wearer, made me think of a short video featuring Genevieve Bell, Intel’s Director of Interaction & Experience Research, on computing in the year 2020, in which she asks, “Will devices learn us?”
Bell describes a near future “where technology can start to anticipate our needs” by means of “personal objects” that know you and your behaviours, like catching the bus every Monday morning. She tells us it is safe to imagine that our devices will come to know us in a whole new way: “They’re going to be more intuitive about who we are. They are going to have a memory of us. And as such not be so much of an interaction but a relationship.” (Perhaps a “relationship” like that between Michael Knight and KITT in Knight Rider; on a side note, my mother referred to me as Knight Rider while I was using Siri on my iPhone.) We’ll have scenarios where personal technology knows and really sees us, moving from A.I. to “I”, anticipating and presenting custom-tailored experiences and information.
And let’s not forget the other non-eyewear AR device prototypes such as the EyeRing and Sixth Sense from MIT, as well as Google’s patent for ‘seeing with your hands’ (Read more about those projects here in my article, “The 4 Ideas That Will Change AR”).
Bilton’s article referenced Corning’s bendable “Willow Glass”; Corning is also the maker of “Gorilla Glass” used in the iPhone. I had the great pleasure of being invited to participate in Corning’s Advancing the Vision 2 at Stanford last Fall, an exchange of ideas and information on building the technologies of tomorrow.
One year ago today I wrote a guest post for The Creators Project naming Georges Méliès (1861–1938) the patron saint of Augmented Reality (AR), in an article celebrating what would have been the magician and filmmaker’s 150th birthday. Today, I tip my hat to Méliès again, honouring his creative genius and incredible technical contributions to film and special effects.
Méliès was a master of production, maintaining a sophisticated understanding of the medium of cinema and a fervour to innovate within this novel domain. He wrote in detail about the complexity and special care of composing and preparing scenes and sets, highly aware of how his work was analogous to theatre and photography, yet all the while completely attuned to the particular sensitivities and opportunities presented by this whole new medium. And this is one of the things that made Méliès’s work incredible: with great skill, knowledge, and understanding of the medium, he playfully pushed beyond existing conventions to invent completely new techniques specific to cinema, such as the substitution shot (also referred to as the stop trick).
Méliès was able to evolve his stage tricks as a magician into multiple exposures and superimposition in cinema, radically different from the ‘actuality films’ of the time (including the Lumière brothers’ films such as “Exiting the Factory” and “Arrival of a Train”, which are exactly as they sound); Méliès introduced a fantastical and transformational aesthetic of visibility/invisibility to the moving image, which, in his words, allowed “the impossible to be rendered visually”.
And so with AR we also see this dialectic of appearance and disappearance, of making the invisible visible. AR, a superimposition of virtual content atop the physical world in real time, is part of Méliès’s legacy of special effects, and it urgently demands creative and technical specialists of the medium if AR is to move beyond mimicking other media and truly come into its own. AR needs excellent experience designers, writers, directors, and so on, who, like Méliès, can maintain the wonderment of the medium to create magical and compelling experiences unlike anything we’ve ever seen before.
How do we do this? Some nudges here in a creative manifesto entitled, “An Explorer’s Guide to Augmented Reality’s Creative Future” and more on the “4 Ideas That Will Change AR” here.
Photo: KinEtre, Microsoft Research Cambridge
This year for Méliès’s birthday I would like to gift him this playful 3D animated dancing chair from Shahram Izadi’s team at Microsoft Research Cambridge (I think Méliès would have quite liked the horse too; be sure to watch the video). “KinEtre” allows users to scan physical objects with the Kinect and bring them to life by mapping their body movements to the newly created 3D object. As Méliès did in cinema, KinEtre can make inanimate objects (fantastically) animate. And why would you want to make a chair ‘walk’ in a human way, you ask? (Aside from making Borat’s joke into a reality?) Well, let’s think about the storytelling possibilities here. Anthropomorphism can play a strong role in future AR experiences to create magical and enchanted realities. Maybe KinEtre has legs (bad pun), maybe it doesn’t. The point is to constantly push ideas out there, to experiment, to prototype, and to iterate, iterate, iterate (a process central to my personal creative practice in AR, and one whose critical importance I learned while working at Bruce Mau Design in my pre-augmented life).
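The core idea of driving a scanned object with a tracked body can be sketched very simply: bind each vertex of the scanned mesh to a nearby skeleton joint and move it with that joint. The toy 2D version below is a drastic simplification I'm using for illustration; the actual KinEtre system uses proper mesh skinning with blended joint weights, not nearest-joint snapping:

```python
import math

def nearest_joint(vertex, joints):
    """Return the name of the joint closest to a vertex in the rest pose."""
    return min(joints, key=lambda name: math.dist(vertex, joints[name]))

def animate(vertices, joints_rest, joints_now):
    """Translate each vertex by the displacement of its nearest rest-pose joint."""
    moved = []
    for v in vertices:
        name = nearest_joint(v, joints_rest)
        dx = joints_now[name][0] - joints_rest[name][0]
        dy = joints_now[name][1] - joints_rest[name][1]
        moved.append((v[0] + dx, v[1] + dy))
    return moved

# Two chair vertices bound to 'hand' and 'foot' joints; raising the hand
# by one unit lifts only the vertex nearest to it.
joints_rest = {"hand": (0.0, 0.0), "foot": (10.0, 0.0)}
joints_now = {"hand": (0.0, 1.0), "foot": (10.0, 0.0)}
vertices = [(1.0, 0.0), (9.0, 0.0)]
result = animate(vertices, joints_rest, joints_now)
# → [(1.0, 1.0), (9.0, 0.0)]
```

Even this crude binding conveys why the effect feels magical: the object stops being geometry and starts being a puppet of your own body.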
The Kinect has been a wonderfully magical device for AR. One of its strengths is the NUI (Natural User Interface), making the technology invisible, letting it ‘disappear’ (continuing the Méliès theme here) to make for an engaging experience rooted in the physical environment, yet transporting the viewer into wondrous other worlds. Georges, tonight I look up at the moon and smile at your genius; I can’t wait to see how your legacy will continue to impact the marvellous medium of AR as we continue on this fantastical journey. Happiest birthday.
Méliès, Georges. “Cinematographic Views.” Translated by Stuart Liebman. October, vol. 29 (Summer 1984), p. 31.
It was a great honour to speak at the Ontario Augmented Reality Network (OARN) 2012 conference today alongside sci-fi guru and contributing editor of Wired Magazine Bruce Sterling, and Gene Becker from Samsung. The day was chock-full of wonderful presentations and critical conversations on advancing AR as a new medium.
I’ve had requests to share the ‘creative manifesto’ slides portion of my talk, “An Explorer’s Guide to Augmented Reality’s Creative Future”, and here they are, unicorn and all. This guide is intended as a source of inspiration, to light a creative fire under our AR rockets as we navigate this new terrain. The term “AR unicorn” was also coined during this talk, as a metaphor for an AR future we dare to imagine and make into a wondrous reality.
A big thank you to Kevin Kee & Karen Flindall for organizing a terrifically inspiring event and another warm thank you to the esteemed panelists and wonderfully engaging audience today! It was so great to bring it all home to Toronto; thank you for having me.
*Update: October 7: Bruce Sterling shared this great photo on Flickr he took of me in cyborg mode exercising my Terminator vision at OARN