Tango (platform) explained

Tango
Author: Google
Developer: Google
Discontinued: Yes
Language: English
Genre: Computer vision

Tango (named Project Tango while in testing) was an augmented reality computing platform, developed and authored by the Advanced Technology and Projects (ATAP), a skunkworks division of Google. It used computer vision to enable mobile devices, such as smartphones and tablets, to detect their position relative to the world around them without using GPS or other external signals. This allowed application developers to create user experiences that include indoor navigation, 3D mapping, physical space measurement, environmental recognition, augmented reality, and windows into a virtual world.

The first product to emerge from ATAP, Tango was developed by a team led by computer scientist Johnny Lee, a core contributor to Microsoft's Kinect. In an interview in June 2015, Lee said, "We're developing the hardware and software technologies to help everything and everyone understand precisely where they are, anywhere."[1]

Google produced two devices to demonstrate the Tango technology: the Peanut phone and the Yellowstone 7-inch tablet. More than 3,000 of these devices had been sold as of June 2015,[2] chiefly to researchers and software developers interested in building applications for the platform. In the summer of 2015, Qualcomm and Intel both announced that they were developing Tango reference devices as models for device manufacturers who use their mobile chipsets.

At CES in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone featuring Tango technology in the summer of 2016, noting a price point under $500 and a form factor smaller than 6.5 inches. The two companies also announced an application incubator to ensure that apps would be available on the device at launch.

On December 15, 2017, Google announced that it would end support for Tango on March 1, 2018, in favor of ARCore.[3]

Overview

Tango was different from other contemporary 3D-sensing computer vision products in that it was designed to run on a standalone mobile phone or tablet and was chiefly concerned with determining the device's position and orientation within the environment.

The software worked by integrating three types of functionality:

- Motion tracking: using visual features of the environment, in combination with accelerometer and gyroscope data, to closely track the device's movements in space
- Area learning: storing environment data in a map that can be re-used later, shared with other Tango devices, and enhanced with metadata such as notes, instructions, or points of interest
- Depth perception: detecting distances, sizes, and surfaces in the environment

Together, these generate data about the device in "six degrees of freedom" (3 axes of orientation plus 3 axes of position) and detailed three-dimensional information about the environment.
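As a plain-math illustration (this is not Tango's actual API), a six-degree-of-freedom pose can be stored as a translation (the three axes of position) plus a unit quaternion (the three axes of orientation), and applied to map a point from the device's frame into the world frame:

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, qx, qy, qz = q
    vx, vy, vz = v
    # Standard identity: v' = v + w*t + (u x t), where u = (qx, qy, qz)
    # and t = 2 * (u x v).
    tx = 2.0 * (qy * vz - qz * vy)
    ty = 2.0 * (qz * vx - qx * vz)
    tz = 2.0 * (qx * vy - qy * vx)
    return (
        vx + w * tx + (qy * tz - qz * ty),
        vy + w * ty + (qz * tx - qx * tz),
        vz + w * tz + (qx * ty - qy * tx),
    )

def apply_pose(translation, orientation, point):
    """Map a point from the device frame into the world frame:
    rotate by the pose's orientation, then add its position."""
    rx, ry, rz = quat_rotate(orientation, point)
    tx, ty, tz = translation
    return (rx + tx, ry + ty, rz + tz)

# Device at world position (1, 2, 3), rotated 90 degrees about the
# vertical (z) axis.
half = math.radians(90.0) / 2.0
pose_t = (1.0, 2.0, 3.0)
pose_q = (math.cos(half), 0.0, 0.0, math.sin(half))

# A point one metre ahead of the device lands at (1, 3, 3) in the world.
world = apply_pose(pose_t, pose_q, (1.0, 0.0, 0.0))
```

The same transform, run in reverse, is what lets an app pin virtual content to a fixed world location while the device moves.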

Project Tango was also the first project to graduate from Google X, in 2012.[4]

Applications on mobile devices use Tango's C and Java APIs to access this data in real time. In addition, an API was also provided for integrating Tango with the Unity game engine; this enabled the conversion or creation of games that allow the user to interact and navigate in the game space by moving and rotating a Tango device in real space. These APIs were documented on the Google developer website.[5]
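The real-time delivery described above followed a callback pattern: an application registered a listener and received each new pose as it was computed. The sketch below is a schematic stand-in in Python (the names `PoseService` and `on_pose_available` are illustrative, not the actual C or Java API), showing how a game engine integration could drive a virtual camera from device poses:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Pose:
    timestamp: float                        # seconds since service start
    position: Tuple[float, float, float]    # metres, start-of-service frame
    orientation: Tuple[float, float, float, float]  # unit quaternion (w, x, y, z)

class PoseService:
    """Schematic stand-in for a pose-callback API (names are illustrative)."""

    def __init__(self):
        self._listeners: List[Callable[[Pose], None]] = []

    def on_pose_available(self, callback: Callable[[Pose], None]) -> None:
        """Register a callback invoked for every new pose estimate."""
        self._listeners.append(callback)

    def emit(self, pose: Pose) -> None:
        """In a real system the platform would call this internally
        as motion tracking produces new estimates."""
        for callback in self._listeners:
            callback(pose)

# A virtual camera that mirrors the device's real-world motion,
# as a Unity integration would do.
camera_path = []
service = PoseService()
service.on_pose_available(lambda pose: camera_path.append(pose.position))

# Simulate two pose updates: the device slides 5 cm along x.
service.emit(Pose(0.0, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0)))
service.emit(Pose(0.1, (0.05, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0)))
```

Because the game camera simply replays the device's pose stream, walking around the room becomes walking around the game space, which is the interaction model the Unity API enabled.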

Applications

Tango enabled apps to track a device's position and orientation within a detailed 3D environment, and to recognize known environments. This allowed the creation of applications such as in-store navigation, visual measurement and mapping utilities, presentation and design tools,[6] and a variety of immersive games. At Augmented World Expo 2015,[7] Johnny Lee demonstrated a construction game that built a virtual structure in real space; an AR showroom app that let users view a full-size virtual automobile and customize its features; a hybrid Nerf gun with a mounted Tango screen for dodging and shooting AR monsters superimposed on reality; and a multiplayer VR app that let several players converse in a virtual space, their avatars' movements matching their real-life movements.

Tango apps were distributed through Google Play. Google encouraged the development of more apps with hackathons, an app contest, and promotional discounts on the development tablet.

Devices

As a platform for software developers and a model for device manufacturers, Google created two Tango devices.

The Peanut phone

"Peanut" was the first production Tango device, released in the first quarter of 2014. It was a small Android phone with a Qualcomm MSM8974 quad-core processor and additional special hardware, including a fisheye motion-tracking camera, an "RGB-IR" camera for color imaging and infrared depth detection, and Movidius vision-processing units. A high-performance accelerometer and gyroscope were added after testing of several competing models in the MARS lab at the University of Minnesota.

Several hundred Peanut devices were distributed to early-access partners, including university researchers in computer vision and robotics, as well as application developers and technology startups. Google stopped supporting the Peanut device in September 2015, as by then the Tango software stack had evolved beyond the versions of Android that ran on the device.

The Yellowstone tablet

"Yellowstone" was a 7-inch tablet with full Tango functionality, released in June 2014 and sold as the Project Tango Tablet Development Kit. It featured a 2.3 GHz quad-core Nvidia Tegra K1 processor, 128 GB of flash memory, a 1920x1200-pixel touchscreen, a 4 MP color camera, a fisheye motion-tracking camera, an IR projector with an RGB-IR camera for integrated depth sensing, and 4G LTE connectivity.[8] The tablet has been officially unsupported by Google since May 27, 2017.[9]

Testing by NASA

In May 2014, two Peanut phones were delivered to the International Space Station as part of a NASA project to develop autonomous robots that navigate in a variety of environments, including outer space. The soccer-ball-sized, 18-sided polyhedral SPHERES robots were developed at the NASA Ames Research Center, adjacent to the Google campus in Mountain View, California. Andres Martinez, SPHERES manager at NASA, said, "We are researching how effective [Tango's] vision-based navigation abilities are for performing localization and navigation of a mobile free flyer on ISS."

Intel RealSense smartphone

The Intel RealSense smartphone was announced at the Intel Developer Forum in August 2015[10] and offered to the public through a developer kit beginning in January 2016.[11] It incorporated a RealSense ZR300 camera,[12] which had the optical features required for Tango, such as the fisheye camera.[13]

Lenovo Phab 2 Pro

The Lenovo Phab 2 Pro was the first commercial smartphone with Tango technology. The device was announced at the beginning of 2016, launched in August, and became available for purchase in the US in November. The Phab 2 Pro had a 6.4-inch screen, a Snapdragon 652 processor, and 64 GB of internal storage, with a 16-megapixel rear camera and an 8-megapixel front camera.

Asus Zenfone AR

The Asus Zenfone AR, announced at CES 2017,[14] was the second commercial smartphone with Tango technology. It ran Tango AR and Daydream VR on a Snapdragon 821 processor, with 6 GB or 8 GB of RAM and 128 GB or 256 GB of internal storage, depending on the configuration.


Notes and References

  1. Web site: Future Phones Will Understand, See the World . 3 June 2015 . 4 November 2015.
  2. Web site: Slamdance: inside the weird virtual reality of Google's Project Tango . 29 May 2015 .
  3. News: Google's Project Tango is shutting down because ARCore is already here. The Verge. 2017-12-16.
  4. Web site: 10 Bold Google X Projects Aiming for Tech Breakthroughs.
  5. https://developers.google.com/project-tango/ Google developer website.
  6. An augmented reality tool to detect design discrepancies: a comparison test with traditional methods. International Conference on Augmented Reality, Virtual Reality and Computer Graphics . 2019. 10.1007/978-3-030-25999-0_9 .
  7. http://augmentedworldexpo.com/ Augmented World Expo 2015
  8. Web site: Tango Tablet Development Kit User Guide Tango Google Developers. Google Developers. en. 2016-09-11.
  9. News: The launch of the Lenovo Phab 2 Pro was a major milestone for Tango, marking .... 2017-07-15. en.
  10. Web site: Google and Intel bring RealSense to phones with Project Tango dev kit.
  11. Web site: Intel's RealSense phone with Project Tango up for pre-order.
  12. Web site: Intel's RealSense smartphone developer kit now available to pre-order for $399. 7 January 2016.
  13. https://click.intel.com/media/ZR300-Product-Datasheet-Public-002.pdf Intel Realsense ZR300 Product Datasheet
  14. https://www.youtube.com/watch?v=rpmsmF3TNRU ASUS announcement at CES 2017