The future of robotic helpers might seem like an idea straight out of “The Jetsons,” but an interdisciplinary robotics team at UT Austin is working to make it a reality by competing in RoboCup, a competition designed to expand the boundaries of artificial intelligence (AI).

In late July, the UT Villa team traveled to Nagoya, Japan, to compete for the first time against ten international teams in RoboCup 2017, bringing home third place in the @Home Domestic Standard Platform League.

RoboCup team
UT Austin RoboCup Team at the 2017 RoboCup competition in Nagoya, Japan.

UT originally entered two teams in the competition to receive the Toyota Human Support Robot (HSR); both Texas teams won an award, which qualified them to compete in RoboCup 2017. Luis Sentis, professor of aerospace engineering and engineering mechanics, and Andrea Thomaz, associate professor of electrical and computer engineering, led the Cockrell School of Engineering team; the Texas Computer Science team was led by faculty members Peter Stone, Raymond Mooney and Scott Niekum.

The two teams ultimately decided to join forces and compete as one multidisciplinary Texas team in the RoboCup@Home League in Japan. In all, 24 postdoctoral researchers, Ph.D., M.S. and undergraduate students worked together on RoboCup.

RoboCup began in 1997 as a robotic soccer tournament but has expanded to include events as varied as disaster rescue, a children’s league and the @Home League, which aims to develop service and assistive robot technology with high relevance for future personal domestic applications.

The @Home competition requires robots to complete household tasks such as unloading and sorting groceries, setting and cleaning a table, recognizing and interacting with people and maneuvering around obstacles in a realistic home setting.

“One of the main reasons we entered this competition is to create an advanced AI that could be used by students and faculty across departments,” Sentis said. “By building an AI as a joint effort, we benefit from the broad range of expertise at UT and the student-oriented multidisciplinary team effort.”

Teams in the @Home category each receive an identical robot kit – the HSR – a mobile manipulation service robot that standardizes the hardware and focuses the competition on advances in AI, software and control. A set of benchmark tests evaluates the robot’s abilities, and each team is judged on the programming and functionality it develops to bring the robot to life.

Zilker the robot
UT’s robot, Zilker, moves on wheels and has a robotic arm used to grasp objects. It uses laser sensors to navigate without bumping into obstacles, skeletal tracking to detect human movements and multiple microphones and cameras to help it recognize individual people’s voices and faces.

The robots are small, standing only about three feet tall.

During a demonstration in the Human Centered Robotics Laboratory before the competition, Zilker announced, “I can see Nicolas,” when it recognized one of its programmers, visiting mechanical engineering graduate student Nicolas Brissonneau.

“That was the voice that came with the robot,” Brissonneau said, commenting on the robot’s distinctly pleasant tone. “We thought it was kind of cute and decided to keep it.”

Zilker can recognize up to 9,000 3D objects and can navigate through a pre-programmed map, according to mechanical engineering Ph.D. student Minkyu Kim.

“We are using a learning-based object detection algorithm,” Kim said. “It’s basically based on the image. If we can detect the object in the image, we use a laser sensor or the depth camera to calculate the relative distance from the robot and perceive the location of the 3D object.”
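The pipeline Kim describes – detect an object in the 2D image, then use depth data to recover its 3D position – can be sketched as follows. This is a minimal illustration of the general technique, not the team’s actual code; the camera intrinsics and detector bounding box are hypothetical stand-ins.

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics (focal lengths fx, fy and
# principal point cx, cy); a real RGB-D camera's calibration would differ.
FX, FY = 525.0, 525.0
CX, CY = 319.5, 239.5

def localize(bbox, depth_image):
    """Back-project the center of a 2D detection into a 3D point.

    bbox: (u_min, v_min, u_max, v_max) pixel box from an image detector.
    depth_image: HxW array of depths in meters (e.g. from a depth camera).
    Returns the (x, y, z) position of the object in the camera frame.
    """
    # Center of the detection box in pixel coordinates.
    u = (bbox[0] + bbox[2]) / 2.0
    v = (bbox[1] + bbox[3]) / 2.0
    # Median depth inside the box is more robust than a single pixel,
    # which might land on the background or a sensor dropout.
    patch = depth_image[int(bbox[1]):int(bbox[3]), int(bbox[0]):int(bbox[2])]
    z = float(np.median(patch))
    # Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return (x, y, z)

# Example: a detection centered on the principal point, 1.5 m away.
depth = np.full((480, 640), 1.5)
print(localize((309.5, 229.5, 329.5, 249.5), depth))  # → (0.0, 0.0, 1.5)
```

In practice the resulting camera-frame point would still be transformed into the robot’s map frame before being used for grasping or navigation.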

The team visited the site in Japan two days before the competition to program Zilker’s map, but the locations of obstacles such as furniture remained a mystery for the robot to navigate.

Another challenge the team encountered was programming the robot to recognize regional objects when performing a task like sorting groceries.

“The products we were manipulating were Japanese products,” Brissonneau said. “We had no idea what even exists or what to expect. It’s way more difficult for the robot to recognize.”

The team learned a great deal from this competition, which Kim says will help support future robotics research at UT Austin.

“Through this competition, the whole team was united to produce valuable results,” Kim said. “In preparing for the competition, we implemented the basic structure and functions for navigation, manipulation and perception of the world environment. Now we can use the HSR as our research platform, which has a lot of basic functionality. In other words, we have established a stepping stone for conducting in-depth research in the future.”