Google smartphones become brains of hovering robots at ISS

7 Jul, 2014 17:09 / Updated 10 years ago

NASA will employ Google smartphones with advanced 3D sensing and vision technology to control small, round, Star Wars-inspired hovering robots on the International Space Station.

The phones, part of Google’s Project Tango, will be used for NASA’s Synchronized Position Hold, Engage, Reorient, Experimental Satellites (SPHERES). The system may eventually assume chores from astronauts or potentially dangerous tasks outside the International Space Station (ISS).

The 5-inch handsets will accompany a cargo spacecraft scheduled for launch on July 11, according to Reuters.

Project Tango devices, first introduced by Google in February, use sensors to build visual maps of rooms using 3D scanning. Google believes the sensors, used in combination with advanced computer vision techniques, can revolutionize indoor navigation and gaming, among other opportunities.

NASA’s soccer-ball-sized SPHERES robots, guided by the Google handsets, will maneuver around the space station’s microgravity interior, moving about an inch per second via small spurts of carbon dioxide.

NASA first sent its SPHERES to the ISS in 2006, but the robots were capable of little more than precise movements. Engineers at NASA’s Ames Research Center in Mountain View, California – also home to Google’s headquarters – began looking for ways to boost the robots’ operational intelligence.

“We wanted to add communication, a camera, increase the processing capability, accelerometers and other sensors. As we were scratching our heads thinking about what to do, we realized the answer was in our hands,” SPHERES project manager Chris Provencher told Reuters. “Let’s just use smartphones.”

The SPHERES team upgraded regular smartphones with extra batteries and a shatter-proof display before sending the handsets to the ISS, where astronauts affixed them to the sides of the SPHERES robots. The phones gave the robots far greater sensing and visual capabilities, but still not enough to move fluidly around the station.

NASA eventually turned to Google’s experimental Project Tango for a boost to its SPHERES project.

Project Tango devices include motion-tracking cameras and an infrared depth sensor for mapping and precise movement. The sensors will be able to measure sharp angles in the space station while creating a 3D map, allowing the SPHERES to navigate from one module to another.

“This type of capability is exactly what we need for a robot that’s going to do tasks anywhere inside the space station,” Provencher said. “It has to have a very robust navigation system.”

NASA’s handsets have been designed so that the touchscreen and sensors face outward when affixed to the robots. In addition, the NASA-specific devices will have batteries sufficient for use in space, as well as plastic connectors that will replace the Velcro that held the phone and robot together.

Google has said it aims to make Project Tango a ubiquitous technology, particularly for retailers seeking to create 3D maps of their shops, and for gamers looking to transform their homes into virtual arenas.

“Project Tango strives to give mobile devices a human-like understanding of space and motion through advanced sensor fusion and computer vision, enabling new and enhanced types of user experiences – including 3D scanning, indoor navigation and immersive gaming,” said Google's Advanced Technology and Projects leader Johnny Lee earlier this year.

Accompanying its announcement in February, Google said of Project Tango: “What if you could capture the dimensions of your home simply by walking around with your phone before you went furniture shopping? What if directions to a new location didn’t stop at the street address? What if you never again found yourself lost in a new building?”

Earlier this year, Google selected 200 developers to build apps for the Project Tango handsets that achieve “indoor navigation/mapping, single/multiplayer games that use physical space, and new algorithms for processing sensor data.”