Hands-on With Microsoft HoloLens: Augmented Reality That Doesn’t Make You Sick

Microsoft showed off some nifty demos of HoloLens, its augmented reality headset, last week during the company’s annual conference for developers in San Francisco. In a live, on-stage demo, Microsoft demonstrated how you could pin videos and calendars wherever you want in your house or play with a 3D-animated puppy.

The demos were impressive, but trying HoloLens for yourself is a whole different thing. On Thursday, I got a chance to do so, and I got an opportunity to develop a rudimentary app using the company’s tools.

Let me start with what seemed like the biggest shortcoming: HoloLens limits your field of vision to a smallish rectangle in which you can see the digital information projected onto the display. It felt constrained. I couldn’t see anything in my periphery; I had to be looking almost directly at the holographic objects to see them. This takes away all the immersive qualities you might be expecting from the videos Microsoft was showing off.

In the 90-minute session, a group of peppy, blue-shirted Microsoft “mentors” guided us through the building of 3D holograms using a custom software development kit built on the Unity game engine. We then added things like gesture and voice control as well as audio to the 3D models. Once the models were built in the development kit, we exported them to Microsoft Visual Studio and then loaded them onto the HoloLens through a USB port. With headsets on, we were able to do things like gesture with our finger to make holographic spheres fall onto other holographic objects or onto real objects like a couch or coffee table. We could also set the virtual spheres in motion with voice commands.
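The falling-sphere demo reduces to very simple physics: integrate gravity each frame and stop the sphere when it reaches a surface found by the environment scan. Below is a minimal sketch in plain Python, for illustration only; the real demo runs on Unity's physics engine, and `drop_sphere` is a made-up helper, not a HoloLens API.

```python
# Minimal sketch: a holographic sphere falls under gravity until it
# lands on the nearest scanned surface below it (e.g. a coffee table).
# Plain Python illustration -- not the actual Unity/HoloLens API.

GRAVITY = -9.81  # m/s^2
DT = 0.01        # simulation timestep, seconds

def drop_sphere(start_height, surface_height, radius=0.05):
    """Integrate the sphere's fall; stop at contact with the surface."""
    y, vy = start_height, 0.0
    while True:
        vy += GRAVITY * DT          # accelerate downward
        y += vy * DT                # move the sphere
        if y - radius <= surface_height:   # contact with scanned mesh
            return surface_height + radius  # rest on top of the surface
```

For example, a sphere released 1.5 m up over a table at 0.4 m comes to rest with its center at 0.45 m (table height plus radius).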

Despite the limitations of the field of vision, HoloLens is an impressive and promising gadget. I’ve had the chance to try a number of other augmented reality headsets (no, Magic Leap still won’t let me near whatever they’re doing), and the worst problem is always the latency between your head’s movements and the visuals in the glasses. There’s always some delay, and it makes me feel nauseous.

That’s a problem Microsoft really seems to have solved. Objects stick in space where they’re supposed to. It looks like they exist in the environment around you. When I turned my head, the holograms moved accordingly with no delay. It was the most seamless experience I have had with this kind of technology. After a fair amount of use in the 90-minute session, I felt perfectly fine.

A lot of this is aided by the robust array of cameras and sensors Microsoft has embedded in the frames of the headset. Like the Microsoft Kinect device for the Xbox game consoles, HoloLens maps your environment in real time. By knowing exactly what’s in front of you and where you are in the environment, the software is able to position objects much more steadily than other augmented reality glasses I’ve tried.

Microsoft is continuing to keep a tight lid on the details of the hardware, but you can see at least five cameras and some other sensors collecting visual information in the frames of the headset. Other headsets include accelerometers and gyroscopes to help sense where your head is positioned and how you’re moving around, and it’s likely Microsoft has included these kinds of sensors in the HoloLens as well.

During the demo, we were able to add wire meshing to the 3D scans of the environment. This allowed us to see how the glasses were picking up information and processing it in a real-time scan. People and furniture showed up as wire blobs that closely matched their real shape and size.
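The wire-blob view is the standard way depth scans become geometry: neighbouring depth samples are stitched into triangles, and the triangle edges form the wireframe. A toy Python sketch of that general technique follows; it is an assumption about how such meshing works in principle, since Microsoft hasn’t published its actual pipeline.

```python
# Sketch: turn a grid of depth-camera samples into a triangle mesh.
# Each 2x2 block of neighbouring samples is split into two triangles;
# drawing the triangle edges gives the wireframe overlay.
# Illustrative only -- not the HoloLens meshing pipeline.

def depth_grid_to_triangles(depth):
    """depth: 2D list of depth samples (rows x cols).
    Returns triangles as tuples of (row, col, depth) vertices."""
    tris = []
    for r in range(len(depth) - 1):
        for c in range(len(depth[0]) - 1):
            a = (r, c,     depth[r][c])
            b = (r, c + 1, depth[r][c + 1])
            d = (r + 1, c,     depth[r + 1][c])
            e = (r + 1, c + 1, depth[r + 1][c + 1])
            tris.append((a, b, d))  # upper-left triangle
            tris.append((b, e, d))  # lower-right triangle
    return tris
```

A grid of R×C samples yields 2·(R−1)·(C−1) triangles, which is why dense depth maps produce blobs that closely follow real shapes.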

In other augmented reality glasses I’ve tried, there’s usually only a single camera in the frame. That camera picks up what you’re looking at in order to pin objects in space, but the experience usually isn’t very good. Objects don’t stay pinned where they’re supposed to very well. After 10 minutes of wearing them, I usually have to take a break.

HoloLens also handles sound impressively. In the Unity software development kit, developers can pin audio to specific objects. Then, as the HoloLens user walks through a space the sound changes based on his proximity to various objects much like it would in a real environment. (The audio is delivered through two small speakers in the headset.)
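Conceptually, that proximity effect is just a distance-based gain applied to each pinned sound. Here is a hedged Python sketch of inverse-distance rolloff, a common spatial-audio model; `spatial_gain` is an illustrative helper, not Unity's actual API.

```python
# Sketch: gain of a sound pinned to a hologram falls off with the
# listener's distance, roughly as it would from a real source.
# Inverse-distance rolloff, clamped so gain never exceeds 1.0.
import math

def spatial_gain(listener, source, min_dist=1.0):
    """listener/source: (x, y, z) positions in meters."""
    dx, dy, dz = (s - l for l, s in zip(listener, source))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    return min(1.0, min_dist / max(dist, 1e-9))
```

Standing 2 m from a pinned sound halves its gain relative to standing within the reference distance, which is what makes walking around the room change what you hear.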

With interest in augmented reality and virtual reality taking flight in the past year, Microsoft is up against a growing list of big players that includes Google, Facebook and others. I’m not convinced this is something people would wear regularly around the house the way Microsoft shows in its demos, but companies are still searching for a use case for this technology. The strongest ideas are usually around enterprise applications that could do everything from helping someone fix a complicated piece of machinery to designing a building. Microsoft has built a unique piece of hardware that could play an important role in this emerging space.

We still don’t know how much the HoloLens will cost or even when it’ll be out. The hardware may still improve before the official launch, and Microsoft may address the issue with the limited field of view. At the conference, Microsoft put a good amount of attention on the device in hopes of getting developers excited about it. It may be that Microsoft isn’t quite sure what the killer apps for HoloLens are going to be and is hoping that developers will figure it out.

Sony’s $840 augmented reality glasses are real, just not pretty

Sony has toyed around with ideas like a clip-on headset to compete with Google Glass, but its initial entry into augmented reality wearables is this pair of glasses. It’s not a consumer product yet, but the SmartEyeGlass SED-E1 Developer Edition previewed a few months ago is coming to 10 countries in March, for $840 (US), €670, or ¥100,000. While we wait for Microsoft’s HoloLens and a revamped version of Glass, Sony is using “holographic waveguide technology” in 3mm AR lenses to put information directly in the wearer’s eyeline. A demo video will give you an idea of the capabilities, but it looks a lot more like Glass than HoloLens, with simple green monochrome text and diagrams displayed at up to 15fps. There’s also a 3MP camera tucked inside that can take still pictures or video, whose images developers can use in their apps.

As you’d hope, the glasses can connect to compatible Android smartphones over Bluetooth and are controlled via that little puck you see in the picture above — that’s also where the battery, speaker, microphone, NFC and touch controls are. Now that it’s almost ready to roll, we’ve got a full list of specs, including a battery life of 80 minutes with the display active while using the camera, and 150 minutes without. Sony says apps will let wearers access services like Facebook and Twitter, and it’s rolled out the first version of an SDK to get things moving for other apps. While early-adopting developers take a shot at the software (you can pre-order now in the US, Japan, Germany and the UK), we’ll wait for Sony to whip up something slightly more stylish for v2.

Project HoloLens: Our Exclusive Hands-On With Microsoft’s Holographic Goggles

It’s the end of October, when the days have already grown short in Redmond, Washington, and gray sheets of rain are just beginning to let up. In several months, Microsoft will unveil its most ambitious undertaking in years, a head-mounted holographic computer called Project HoloLens. But at this point, even most people at Microsoft have never heard of it. I walk through the large atrium of Microsoft’s Studio C to meet its chief inventor, Alex Kipman.


The headset is still a prototype being developed under the codename Project Baraboo, or sometimes just “B.” Kipman, with shoulder-length hair and severely cropped bangs, is a nervous inventor, shifting from one red Converse All-Star to the other. Nervous, because he’s been working on this pair of holographic goggles for five years. No, even longer. Seven years, if you go back to the idea he first pitched to Microsoft, which became Kinect. When the motion-sensing Xbox accessory was released, just in time for the 2010 holidays, it became the fastest-selling consumer gaming device of all time.
Right from the start, he makes it clear that Baraboo will make Kinect seem minor league.

Kipman leads me into a briefing room with a drop-down screen, plush couches, and a corner bar stocked with wine and soda (we abstain). He sits beside me, then stands, paces a bit, then sits down again. His wind-up is long. He gives me an abbreviated history of computing, speaking in complete paragraphs, with bushy, expressive eyebrows and saucer eyes that expand as he talks. The next era of computing, he explains, won’t be about that original digital universe. “It’s about the analog universe,” he says. “And the analog universe has a fundamentally different rule set.”

Translation: you used to compute on a screen, entering commands on a keyboard. Cyberspace was somewhere else. Computers responded to programs that detailed explicit commands. In the very near future, you’ll compute in the physical world, using voice and gesture to summon data and layer it atop physical objects. Computer programs will be able to digest so much data that they’ll be able to handle far more complex and nuanced situations. Cyberspace will be all around you.

What will this look like? Well, holograms.

http://video.wired.com/watch/introducing-the-hololens


First Impressions

That’s when I get my first look at Baraboo. Kipman cues a concept video in which a young woman wearing the slate gray headset moves through a series of scenarios, from collaborating with coworkers on a conference call to soaring, Oculus-style, over the Golden Gate Bridge. I watch the video, while Kipman watches me watch the video, while Microsoft’s public relations executives watch Kipman watch me watch the video. And the video is cool, but I’ve seen too much sci-fi for any of it to feel believable yet. I want to get my hands on the actual device. So Kipman pulls a box onto the couch. Gingerly, he lifts out a headset. “First toy of the day to show you,” he says, passing it to me to hold. “This is the actual industrial design.”

Oh Baraboo! It’s bigger and more substantial than Google Glass, but far less boxy than the Oculus Rift. If I were a betting woman, I’d say it probably looks something like the goggles made by Magic Leap, the mysterious Google-backed augmented reality startup that has $592 million in funding. But Magic Leap is not yet ready to unveil its device. Microsoft, on the other hand, plans to get Project HoloLens into the hands of developers by the spring. (For more about Microsoft and CEO Satya Nadella’s plans for Project HoloLens, read WIRED’s February cover story.)

Kipman’s prototype is amazing. It amplifies the special powers that Kinect introduced, using a small fraction of the energy. The depth camera has a field of vision that spans 120 by 120 degrees—far more than the original Kinect—so it can sense what your hands are doing even when they are nearly outstretched. Sensors flood the device with terabytes of data every second, all managed with an onboard CPU, GPU and first-of-its-kind HPU (holographic processing unit). Yet, Kipman points out, the computer doesn’t grow hot on your head, because the warm air is vented out through the sides. On the right side, buttons allow you to adjust the volume and to control the contrast of the hologram.

Microsoft's Lorraine Bardeen demonstrates HoloLens at the Windows 10 event at the company's headquarters in Redmond, Washington on Wednesday, Jan. 21, 2015.

Tricking Your Brain

Project HoloLens’ key achievement—realistic holograms—works by tricking your brain into seeing light as matter. “Ultimately, you know, you perceive the world because of light,” Kipman explains. “If I could magically turn the debugger on, we’d see photons bouncing throughout this world. Eventually they hit the back of your eyes, and through that, you reason about what the world is. You essentially hallucinate the world, or you see what your mind wants you to see.”

To create Project HoloLens’ images, light particles bounce around millions of times in the so-called light engine of the device. Then the photons enter the goggles’ two lenses, where they ricochet between layers of blue, green and red glass before they reach the back of your eye. “When you get the light to be at the exact angle,” Kipman tells me, “that’s where all the magic comes in.”

Thirty minutes later, after we’ve looked at another prototype and some more concept videos and talked about the importance of developers (you always have to talk about the importance of developers when launching a new product these days), I get to sample that magic. Kipman walks me across a courtyard and through the side door of a building that houses a secret basement lab. Each of the rooms has been outfitted as a scenario to test Project HoloLens.

A Quick Trip to Mars

The first is deceptively simple. I enter a makeshift living room, where wires jut from a hole in the wall where there should be a light switch. Tools are strewn on the West Elm sideboard just below it. Kipman hands me a HoloLens prototype and tells me to install the switch. After I put on the headset, an electrician pops up on a screen that floats directly in front of me. With a quick hand gesture I’m able to anchor the screen just to the left of the wires. The electrician is able to see exactly what I’m seeing. He draws a holographic circle around the voltage tester on the sideboard and instructs me to use it to check whether the wires are live. Once we establish that they aren’t, he walks me through the process of installing the switch, coaching me by sketching holographic arrows and diagrams on the wall in front of me. Five minutes later, I flip a switch, and the living room light turns on.

Another scenario lands me on a virtual Mars-scape. Kipman developed it in close collaboration with NASA rocket scientist Jeff Norris, who spent much of the first half of 2014 flying back and forth between Seattle and his Southern California home to help develop the scenario. With a quick upward gesture, I toggle from computer screens that monitor the Curiosity rover’s progress across the planet’s surface to the virtual experience of being on the planet. The ground is a parched, dusty sandstone, and so realistic that as I take a step, my legs begin to quiver. They don’t trust what my eyes are showing them. Behind me, the rover towers seven feet tall, its metal arm reaching out from its body like a tentacle. The sun shines brightly over the rover, creating short black shadows on the ground beneath its legs.


Norris joins me virtually, appearing as a three-dimensional human-shaped golden orb in the Mars-scape. (In reality, he’s in the room next door.) A dotted line extends from his eyes toward what he is looking at. “Check that out,” he says, and I squat down to see a rock shard up close. With an upward right-hand gesture, I bring up a series of controls. I choose the middle of three options, which drops a flag there, theoretically a signal to the rover to collect sediment.

After exploring Mars, I don’t want to remove the headset, which has provided a glimpse of a combination of computing tools that make the unimaginable feel real. NASA felt the same way. Norris will roll out Project HoloLens this summer so that agency scientists can use it to collaborate on a mission.

A Long Way Yet

Kipman’s voice eventually brings me back to Redmond. As I remove the goggles, he reminds me that it’s still early days for the project. This isn’t the kind of thing that will be, say, a holiday best seller. It’s a new interface, controlled by voice and gesture, and the controls have to work flawlessly before it will be commercially viable. I get that. I love voice controls, and I talk to Siri all the time. But half the time, she doesn’t give me a good answer and I have to pull up my keyboard to find what I’m looking for more quickly. Project HoloLens won’t have a keyboard. If the voice and gesture controls don’t work perfectly the first time, consumers will write it off. Quickly.

That said, there are no misfires during three other demos. I play a game in which a character jumps around a real room, collecting coins sprinkled atop a sofa and bouncing off springs placed on the floor. I sculpt a virtual toy (a fluorescent green snowman) that I can then produce with a 3-D printer. And I collaborate with a motorcycle designer Skyping in from Spain to paint a three-dimensional fender atop a physical prototype.

As I make my way through each, Kipman seems less nervous than when we began, but no less focused. It has been three hours since we met. In each scenario, he watches a screen that shows him what I am seeing, and he watches me trying to use his device for the first time. His eyebrows draw down in deep concentration as he checks to see if every calculation is perfect—noting the touch of my thumb and forefinger as I make an upward gesture, the words I reach for instinctively to instruct the computer. Seven years in, he is trying to see Project HoloLens as if for the first time. To see it through the eyes of a 30-something female New Yorker. But that is one thing his magical head-mounted holographic computer cannot do. At least not yet.

Microsoft’s Augmented Reality Ads Turn Bus Shelters into Video Game Arenas

For a new video game franchise, Microsoft re-engineers out-of-home advertising

What’s on your bus stop shelter wall? A mangled schedule or a torn map? Perhaps you, like many commuters, only venture inside when the wind is howling, the rain is pouring, or you’re desperately looking for a seat.

Microsoft is ready to change the bus stop from a shelter against inclement weather into a digital experience. To promote its newest shooter video game, Sunset Overdrive, Microsoft has outfitted three bus stops in San Francisco, London and Melbourne with augmented reality technology that immerses viewers in the game’s digital environment.

The advertisements look like regular digital screens from a distance, but at close range, the augmented reality display makes figures appear to move across the screen as if they are about to jump off it.

The ads, which are supported by Clear Channel Outdoor and media agency Empowering Media, will run for one month. Their augmented reality displays were produced by Grand Visual, a digital out-of-home agency that has previously incorporated augmented reality, motion graphics, and color recognition technology in outdoor advertising campaigns for Heineken, Pepsi, and Tropicana.

Sunset Overdrive, which was developed by Insomniac Games for Xbox One, is the first installment of a new fast-action shooting game franchise that pits players against mutant attackers. The outdoor innovation reflects Microsoft’s attempt to make a splash in a saturated video game market. By cutting through the noise of video game advertising, as well as the noise of the hectic street environment, Microsoft’s augmented reality billboards could not only establish a new franchise but also usher in a new era of re-engineered outdoor advertising.

Do We Want An Augmented Reality Or A Transformed Reality?

The Reality Boost

“We are moving into an era where we will, on a commercial scale, be taking our visual information in real time and integrating this with a wealth of external information to transform our daily lives. This will give us some degree of control over how we see the world, in the fundamental sense.

“For example, we might be offered information about people or objects as they pop into our field of view. Or it could introduce into our field of view things that don’t exist at all in the real world, or potentially filter out of our vision things that are in fact there, such as giant advertising billboards.” — Science 2.0

Bombarded by dazzling ad billboards, such as this iconic one at Sydney’s Kings Cross …

The Unspoken Future

Extrapolating from the recent history of technology gives us a glimpse of what the future of AR is likely to look like in the hands of the big tech companies.

First, the idea of the “app” will extend into the visual domain, giving us apps that aid us in all the things we already do: building a house, studying at a distance, traveling in a new city and even making love.

Second, the price for access to these new services and of having information at our fingertips is likely to involve surrendering ever more of our personal information. Critically, it will open up new markets for advertisers to promote their products and services in both tacit and explicit ways – an extension of the world of “advertising everywhere”.

 

… you might replace the ads with a beach scene or other images in your own transformed reality.

Transformed Reality

The name “augmented reality” gives it away. The vision of AR that we are seeing in the media and in press releases for products such as Google Glass is a vision of our world as we know it, but perhaps made a little easier through this technology.

In contrast, this technology, which can change what we sense in real time, has the potential to fundamentally change how we live. Do we have the imagination to dream about how, instead of merely augmenting reality, we could be aiming to transform it?

Now is the time to start dreaming about how the advent of ubiquitous AR could not merely augment society, but transform it for the better. — The Conversation