Robonaut—perception in space

Credit: Texas A&M University

To keep people out of harm's way, robots are often used to reach what human hands cannot, whether uncovering victims from rubble or bringing them safely to shore. These helpful hands can even reach a world far beyond our own – outer space.

Dr. Dezhen Song, a professor in the Department of Computer Science and Engineering at Texas A&M University, is working on a collaborative project with NASA's Johnson Space Center to develop localization and mapping algorithms for an astronaut robot, Robonaut, so that it can make better use of the crew's time and perform dangerous tasks in lieu of a human.

So that it can use the tools and facilities developed for human astronauts, Robonaut is being built with a human-like body configuration, including arms and hands. But because GPS signals are unavailable aboard the International Space Station (ISS), the current Robonaut prototype cannot localize itself.

Most tasks performed by Robonaut are limited to the vicinity of the robot. To enable further functionalities, such as transporting items in the ISS or performing panel maintenance, the robot needs to move around the station. This also means it must establish a mental map of the visited region and localize itself in the process. In the field of robotics, this is known as simultaneous localization and mapping (SLAM).
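In code, the core idea of SLAM can be sketched as a single state that holds both the robot's pose and the map it is building. The short Python sketch below is purely illustrative (the function names, motion model, and landmark layout are invented for this example, not taken from the team's system): dead reckoning moves the pose estimate, and range/bearing observations of mapped landmarks are what a SLAM system would use to correct the accumulated drift.

    import numpy as np

    # Minimal illustration: in SLAM the robot jointly estimates its own pose
    # and the positions of landmarks it has seen.
    # State vector: [robot_x, robot_y, robot_theta, lm1_x, lm1_y, ...]

    def predict(state, v, omega, dt):
        """Dead-reckoning motion update: move the robot, leave landmarks fixed."""
        x, y, theta = state[:3]
        x += v * np.cos(theta) * dt
        y += v * np.sin(theta) * dt
        theta += omega * dt
        state[:3] = x, y, theta
        return state

    def observe(state, lm_index):
        """Predicted range/bearing to a landmark, used to correct drift."""
        x, y, theta = state[:3]
        lx, ly = state[3 + 2 * lm_index: 5 + 2 * lm_index]
        dx, dy = lx - x, ly - y
        rng = np.hypot(dx, dy)
        bearing = np.arctan2(dy, dx) - theta
        return rng, bearing

    # One landmark at (2, 1); the robot starts at the origin facing +x.
    state = np.array([0.0, 0.0, 0.0, 2.0, 1.0])
    state = predict(state, v=1.0, omega=0.0, dt=1.0)
    print(observe(state, 0))   # range/bearing after moving 1 m forward

A real system maintains an uncertainty estimate alongside this state, so that repeated observations of the same landmark tighten both the map and the pose.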

"SLAM is part of the robot perception capability," Song said. "Our study is to try to bring better and more accurate information to the robot to facilitate its decision process so that more smart robots can be developed for different applications. If successful, we can significantly increase the robots' ability in handling different environments, which will have significant impact on manufacturing, daily life, defense and many other areas that can benefit from the increasing capability from mobile robots."

The lack of a reliable, low-cost SLAM capability has been an obstacle for many robotic applications in the past. A camera is a low-cost sensor compared to a laser range finder, but it comes with drawbacks: sensitivity to lighting, and the baseline limits that constrain stereo depth calculations.

Because a camera measures bearing rather than absolute size, it has difficulty measuring distance directly.

One way to combat this is to use two or more cameras separated by a known baseline to provide a distance reference; this is known as stereo vision. However, the overlap between the two cameras' fields of view is too limited to be directly useful. Robonaut's head is therefore actuated from side to side, scanning the surroundings to enlarge the effective field of view, and the neck's encoder readings let the team track the head's scanning motion.
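The geometry behind that baseline limit is simple to state. As a hedged illustration (the focal length and 7 cm baseline below are assumed values, not Robonaut's actual camera parameters), depth follows by similar triangles from the disparity between the two images:

    # Textbook stereo-depth calculation, for illustration only.
    # With two cameras a known baseline B apart, a point appearing at
    # horizontal pixel positions x_left and x_right has disparity
    # d = x_left - x_right, and depth Z = f * B / d.

    def stereo_depth(x_left_px: float, x_right_px: float,
                     focal_px: float = 600.0,     # assumed focal length, pixels
                     baseline_m: float = 0.07):   # assumed 7 cm baseline
        disparity = x_left_px - x_right_px
        if disparity <= 0:
            raise ValueError("point must be in front of both cameras")
        return focal_px * baseline_m / disparity

    # A 4-pixel disparity with this geometry puts the point about 10.5 m away;
    # such small disparities are why a short baseline limits useful range.
    print(stereo_depth(322.0, 318.0))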

An inertial measurement unit (IMU) installed in Robonaut delivers body-movement information and helps establish correspondence between views as Robonaut moves. The primary challenge of the project lies in fusing the multiple camera views with these other sensors, each with different and uncertain characteristics, to produce robust SLAM results.
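One standard way to combine a fast-but-drifting IMU with slower, more absolute camera fixes is a Kalman-style predict/update loop that weights each source by its uncertainty. The one-dimensional sketch below is only a schematic of that idea; the noise figures and update rates are invented for illustration and are not drawn from the Robonaut project.

    # Schematic 1-D sensor fusion: IMU prediction plus camera correction.

    def imu_predict(pos, vel, accel, dt, var, accel_var):
        """Propagate position with IMU acceleration; uncertainty grows."""
        pos += vel * dt + 0.5 * accel * dt**2
        vel += accel * dt
        var += accel_var * dt**2      # drift: variance accumulates over time
        return pos, vel, var

    def camera_update(pos, var, meas, meas_var):
        """Correct with a camera fix; the gain trades off the two variances."""
        gain = var / (var + meas_var)
        pos = pos + gain * (meas - pos)
        var = (1.0 - gain) * var
        return pos, var

    pos, vel, var = 0.0, 0.1, 0.01
    for _ in range(50):               # the IMU runs fast, so drift builds up
        pos, vel, var = imu_predict(pos, vel, 0.0, 0.02, var, accel_var=0.5)
    pos, var = camera_update(pos, var, meas=0.12, meas_var=0.005)  # camera fix
    print(round(pos, 3), round(var, 4))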

Interest in these developments extends beyond the space and aeronautics industry into one that is a bit more grounded.

"They are interested in using our motion-sensor based technology in detecting railway status for better and low-cost railway maintenance," Song said.

The project first came about in 2005, when the group developed SLAM algorithms for an autonomous motorcycle built for the Defense Advanced Research Projects Agency (DARPA) Grand Challenge.

Along with the NASA partnership, the group is collaborating with industry contacts and Texas A&M faculty on its Robonaut research. They are working with Dr. Tim Davis, a professor in the computer science and engineering department, to improve visual SLAM optimization algorithms using sparse matrices, and with Dr. Jun Zou, an associate professor in the electrical and computer engineering department, to develop a new line of ranging and communication sensors for underwater robots.
