
Google unveils Project Tango, smartphones that view the world in 3D

Google has unveiled a smartphone equipped with revolutionary 3D scanning technology.
Written by Charlie Osborne, Contributing Writer
 

Google has taken the wraps off Project Tango, a research effort designed to let our mobile devices perceive the world in 3D.

The tech giant's Advanced Technology and Projects (ATAP) team has produced a smartphone prototype that uses a vision processor chip to bring 3D motion tracking to a handheld device.

The goal of Project Tango is to "give mobile devices a human-scale understanding of space and motion." In other words, these devices will be aware of their environment and able to function in three-dimensional space. The hardware can make 250,000 3D measurements per second, thereby creating a real-time 3D map of its surroundings.
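To make the idea concrete, here is a minimal sketch of what "measurements per second become a 3D map" might look like in principle. This is not Google's pipeline; the sensor feed, pose values, and the integrate_frame helper are hypothetical, and the example simply transforms each frame's depth points by an estimated device pose and accumulates them into a world-frame point cloud.

```python
# Illustrative sketch only -- assumes a hypothetical depth-sensor feed and
# pose estimate; not Project Tango's actual implementation.
import numpy as np

def integrate_frame(global_map, points_camera, rotation, translation):
    """Move one frame of 3D measurements from the camera frame into the
    world frame using the device's estimated pose, then add them to the map."""
    # points_camera: (N, 3) points from the depth sensor
    # rotation: (3, 3) rotation matrix; translation: (3,) vector (device pose)
    points_world = points_camera @ rotation.T + translation
    global_map.append(points_world)
    return global_map

# Accumulate a few synthetic frames into a growing point cloud.
world_map = []
for _ in range(3):
    frame = np.random.rand(250_000 // 60, 3)        # roughly one frame's worth of points
    pose_rotation = np.eye(3)                        # placeholder pose estimate
    pose_translation = np.random.rand(3) * 0.01
    world_map = integrate_frame(world_map, frame, pose_rotation, pose_translation)

point_cloud = np.vstack(world_map)                   # the real-time 3D map, in spirit
print(point_cloud.shape)
```

A real system would also estimate the pose itself (visual-inertial odometry) and compress the raw points into a more compact representation, but the accumulate-and-transform step above captures the basic idea.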

Project Tango's potential applications could be endless. For example, a smartphone could give the visually impaired auditory cues about their surroundings, GPS maps could be revolutionized, and future games could be built around a scan of the physical world.

Johnny Lee, leader of the Tango team, said:

"Over the past year, our team has been working with universities, research labs, and industrial partners spanning nine countries around the world to harvest research from the last decade of work in robotics and computer vision, concentrating that technology into a unique mobile phone.
Now, we're ready to put early prototypes into the hands of developers that can imagine the possibilities and help bring those ideas into reality."

Dev kits are being handed out this March to parties interested in creating applications for the device. Google suggests that indoor and outdoor mapping, as well as games which use physical space, could be hot topics to pursue.

The ATAP team is a spin-off research lab that Google decided to hang on to when it sold Motorola to Lenovo.


Via: Google

This post was originally published on Smartplanet.com
