See how Google’s new ‘Project Tango’ smartphones sense the world

Computer vision applied to AR technology.


Google’s surprise reveal of Project Tango, a smartphone equipped with a variety of cameras and vision sensors that gives it a whole new perspective on the world around it, left us with quite a few questions about how the device actually works and what it’s for. Google says the Tango smartphone can capture a wealth of data never before available to app developers, including depth- and object-tracking and real-time 3D mapping. And it’s no bigger or more power-hungry than your typical smartphone. We sat down with Remi El-Ouazzane, CEO of Movidius, the company that developed some of the technology used in Tango, to get a better idea of what the device can do and what it means for applications of the future. We also got a chance to use the device Google will be delivering to developers next month.

Movidius has been working on computer vision technology for the past seven years. It developed the processing chips used in Project Tango, which Google paired with sensors and cameras to give the smartphone the same level of computer vision and tracking that formerly required much larger equipment. In fact, El-Ouazzane says the technology isn’t very different at all from what NASA’s Mars Exploration Rovers used to map the surface of Mars a decade ago, but instead of riding in a 400-pound vehicle, it fits in the palm of your hand.


The phone is equipped with a standard 4-megapixel camera paired with a special combination RGB and IR sensor and a lower-resolution image-tracking camera. Those image sensors give the smartphone much the same perspective on the world that you and I have, complete with spatial awareness and a perception of depth. They feed data to Movidius’ custom Myriad 1 low-power computer-vision processor, which then crunches the data and feeds it to apps through a set of APIs.
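Google hasn’t published Tango’s developer APIs yet, so the sketch below is purely illustrative: every type and method name is a hypothetical stand-in. It shows the shape of what that pipeline delivers to an app: a callback handing over a packed point cloud, which a toy consumer scans for the nearest object in view.

```java
// Illustrative only: Google hasn't released the Tango SDK, so these
// types are hypothetical stand-ins, not real API names.

/** Callback an app would implement to receive fused vision data. */
interface VisionListener {
    /** points holds packed (x, y, z) triples in meters, camera frame. */
    void onPointCloud(float[] points, long timestampNs);
}

/** Toy consumer: reports the distance to the nearest point in view. */
class NearestObstacle implements VisionListener {
    @Override
    public void onPointCloud(float[] points, long timestampNs) {
        if (points.length == 0) return;
        double nearest = Double.MAX_VALUE;
        for (int i = 0; i < points.length; i += 3) {
            double x = points[i], y = points[i + 1], z = points[i + 2];
            double d = Math.sqrt(x * x + y * y + z * z);
            if (d < nearest) nearest = d;
        }
        System.out.printf("nearest object: %.2f m%n", nearest);
    }
}
```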

IT’S LIKE HAVING THE MARS ROVER’S EYES IN THE PALM OF YOUR HAND

But what can you do with all of that data? That’s really up to app developers, and it’s the reason Google is giving out 200 of these prototype devices to developers in the coming weeks. The devices we saw were equipped with a few demonstration apps to show off some of the hardware’s capabilities. One app displayed a distance heat map on top of what the camera sees, layering blue over faraway objects and red over things that are close up. Another took the data from the image sensors, paired it with the device’s standard motion sensors and gyroscopes to map out paths of movement to within 1 percent accuracy, and then plotted them onto an interactive 3D map.
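That heat-map demo boils down to a per-pixel mapping from depth to color. Here’s a minimal sketch of one way to compute it; the half-meter-to-four-meter working range is our assumption, not a published spec.

```java
/** Depth heat map: red up close, fading to blue far away. */
final class DepthHeatMap {
    // Assumed working range; Google hasn't published the sensor's limits.
    static final float NEAR_M = 0.5f;
    static final float FAR_M = 4.0f;

    /** Maps one depth sample in meters to an opaque ARGB pixel. */
    static int depthToColor(float depthMeters) {
        // Normalize depth to [0, 1] across the assumed range, clamped.
        float t = (depthMeters - NEAR_M) / (FAR_M - NEAR_M);
        t = Math.max(0f, Math.min(1f, t));
        int red = Math.round(255 * (1 - t)); // near objects -> red
        int blue = Math.round(255 * t);      // far objects  -> blue
        return 0xFF000000 | (red << 16) | blue;
    }
}
```

Run that over every sample in a depth frame and you get the kind of overlay the demo showed.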

Perhaps the most impressive demo was an app that captured a 3D model of a scene in real time, drawing it on the display as you moved the device around the room. It’s pretty amazing to watch a smartphone sketch out a three-dimensional model of the table in front of you in just a few seconds.
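Google didn’t say how that demo works under the hood, but a common way to build such a model is to use the device’s tracked pose to move each depth frame into a shared world coordinate system, then fuse the points into one structure. Here’s a toy sketch of that fusion step, accumulating points into a 5 cm voxel set rather than a real mesh:

```java
import java.util.HashSet;
import java.util.Set;

/** Toy world model: fuses per-frame depth points into 5 cm voxels.
 *  A real reconstruction pipeline would build a textured mesh; this
 *  only shows how frames combine into one model as the device moves. */
class VoxelMap {
    private static final float VOXEL_M = 0.05f;
    private final Set<Long> occupied = new HashSet<>();

    /** points: packed (x, y, z) triples in the camera frame;
     *  pose: 4x4 row-major camera-to-world transform (assumed to come
     *  from the device's motion tracking). */
    void integrate(float[] points, float[] pose) {
        for (int i = 0; i < points.length; i += 3) {
            float x = points[i], y = points[i + 1], z = points[i + 2];
            // Move the point from camera space into world space.
            float wx = pose[0] * x + pose[1] * y + pose[2]  * z + pose[3];
            float wy = pose[4] * x + pose[5] * y + pose[6]  * z + pose[7];
            float wz = pose[8] * x + pose[9] * y + pose[10] * z + pose[11];
            occupied.add(key(wx, wy, wz));
        }
    }

    /** Packs quantized voxel coordinates into a single set key. */
    private static long key(float x, float y, float z) {
        long ix = (long) Math.floor(x / VOXEL_M) & 0x1FFFFF; // 21 bits each
        long iy = (long) Math.floor(y / VOXEL_M) & 0x1FFFFF;
        long iz = (long) Math.floor(z / VOXEL_M) & 0x1FFFFF;
        return (ix << 42) | (iy << 21) | iz;
    }
}
```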
