Google's Project Tango phone understands its environment like humans do

BY

Published 20 Feb 2014


Your phone knows a lot. It knows the time, its location on the globe, its orientation and movement, maybe even the ambient light and air pressure. But it doesn't really know its surroundings. Not like we humans do.

That's the problem Google's ATAP is trying to solve. The Advanced Technology and Projects group (official tagline: "We like epic shit.") just announced Project Tango, with the goal of giving mobile devices a "human-scale understanding of space and motion."

So what is it, exactly? Well, right now it's a custom 5-inch Android phone. It's sort of blocky and bulky, and very much looks like the prototype it is. This is not a consumer device. But this phone does something special. It's got a 4-megapixel camera, an additional motion-tracking camera, and a third depth-sensing camera. Together with a pair of processors optimized for computer vision calculations, the phone tracks its absolute position and orientation in 3D more than 250,000 times a second, all while making a virtual 3D map of whatever it's pointing at. The YouTube video released by Google's ATAP will show you how it works.
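If you want a rough mental model of what "tracking pose while building a map" means in code, here's a minimal, hypothetical sketch. This is not Google's actual Tango API, and names like Vec3, Quat, and Pose are made up for illustration: every tracker update gives the phone a 6-degree-of-freedom pose, and every depth frame gets transformed by that pose into one shared, world-space point cloud.

```kotlin
// Hypothetical sketch, not Google's actual Tango API: a 6-DoF pose plus a depth-fusion step.

data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    operator fun times(s: Double) = Vec3(x * s, y * s, z * s)
    fun cross(o: Vec3) = Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x)
}

// Orientation stored as a unit quaternion.
data class Quat(val w: Double, val x: Double, val y: Double, val z: Double) {
    // Rotate a point: v' = v + 2w(u x v) + 2(u x (u x v)), where u = (x, y, z).
    fun rotate(p: Vec3): Vec3 {
        val u = Vec3(x, y, z)
        val t = u.cross(p) * 2.0
        return p + t * w + u.cross(t)
    }
    // For a unit quaternion, the inverse rotation is just the conjugate.
    fun conjugate() = Quat(w, -x, -y, -z)
}

// One pose sample: where the phone is and which way it's facing.
// The prototype produces more than 250,000 of these per second.
data class Pose(val position: Vec3, val orientation: Quat) {
    // Move a point from the depth camera's frame into shared world coordinates.
    fun toWorld(pointInCamera: Vec3): Vec3 = orientation.rotate(pointInCamera) + position
}

// Fuse a depth frame into the growing 3D map, using the pose the phone had
// at the instant the frame was captured.
fun fuseDepthFrame(pose: Pose, depthPoints: List<Vec3>, worldMap: MutableList<Vec3>) {
    for (p in depthPoints) worldMap.add(pose.toWorld(p))
}
```

Doing that transform for hundreds of thousands of samples per second is presumably why the phone carries processors dedicated to computer-vision math in the first place.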

If it looks an awful lot like someone crammed an Xbox Kinect sensor into a smartphone, there's probably a good reason for that. Johnny Chung Lee, the project lead for Tango, was a key developer on the Kinect team at Microsoft before jumping ship for Google.

Note that this is still a very experimental endeavor. Interested developers can apply for a kit at the Project Tango site, but there are only 200 kits in total, so don't get your hopes up. The 5-inch Android phones come with the software stack necessary to do 3D positioning and mapping, but it's all at an early prototype stage and under active development. As the site says, "These experimental devices are intended only for the adventurous and are not a final shipping product."

This is an exciting area of advancement for phones, and frankly more useful than something like Google Glass. If your phone knew its absolute position and orientation in real time with low latency, and could map its environment in 3D, the possibilities for developers are boundless. Augmented reality apps and games could seamlessly meld a virtual world with the real one. A shopping app could literally walk you through the store to the exact position on the shelf where the item you want is located. You could navigate inside complex buildings the same way cars use turn-by-turn street navigation. You could look at a piece of furniture with your phone and virtually, and accurately, place it in your living room. Phone assistants could start to "see" the world the way you do. This could be one of the breakthrough technologies that gets us from Siri to something out of Her.
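Most of those augmented-reality ideas boil down to one operation: take a point anchored somewhere in the real world and use the phone's current pose to figure out where it should appear on screen. Here's a hedged sketch of that projection, assuming a simple pinhole camera model and reusing the hypothetical Vec3, Quat, and Pose types from the earlier snippet; fx, fy, cx, and cy are assumed camera intrinsics, not values from Tango.

```kotlin
// Hypothetical pinhole-camera projection, reusing Vec3, Quat, and Pose from the earlier sketch.
// fx/fy are focal lengths in pixels; (cx, cy) is the principal point (assumed values).
data class Intrinsics(val fx: Double, val fy: Double, val cx: Double, val cy: Double)

// Project a world-anchored point (say, one corner of a virtual couch) onto the screen,
// given the phone's current pose. Returns null if the point is behind the camera.
fun projectToScreen(worldPoint: Vec3, pose: Pose, k: Intrinsics): Pair<Double, Double>? {
    // World-to-camera is the inverse of Pose.toWorld: un-translate, then un-rotate.
    val camPoint = pose.orientation.conjugate().rotate(worldPoint - pose.position)
    if (camPoint.z <= 0.0) return null  // assumes a +z-forward camera convention
    val u = k.fx * camPoint.x / camPoint.z + k.cx
    val v = k.fy * camPoint.y / camPoint.z + k.cy
    return Pair(u, v)
}
```

Redraw the virtual object at the returned screen coordinates every frame and, as long as the pose stays accurate and low-latency, it looks glued to the floor of your actual living room.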

But there are challenges galore. Extra cameras and custom processors aren't cheap. The drain on battery life is likely to be terrible right now. The prototype phones are too thick and bulky for the average user. In addition to the mountain of software problems the team is working to solve, they're going to have to find a way to make all this stuff affordable, energy efficient, and compact. So don't expect full 3D world-mapping to be a standard feature of Android Lollipop (if that's the name of the next release), but maybe it'll get here in time for Marzipan.