Maruthi Akella, a professor in the Department of Aerospace Engineering and Engineering Mechanics, is working to create fast, agile, and autonomous unmanned aerial vehicles, or UAVs, by borrowing navigation skills from nature’s great flyers—birds.
“There is no GPS that’s guiding birds, and there’s no other infrastructure. They’re able to create their own navigation infrastructure using naturally endowed sensors,” Akella said.
Currently, UAVs need lots of outside help. Most function like high-tech model airplanes, requiring either a human pilot flying them by remote control or a precisely plotted GPS route.
To make UAVs more autonomous, Akella and his research team are developing software that applies the same skills birds use to fly through the skies: optical vision and learned experience, executed through machine intelligence.
His goal is to create autonomous UAVs that, without human pilots or GPS data, can rapidly navigate around obstacles in their environments and execute assigned tasks.
“What I mean by autonomy is that we dial down the human involvement as much as the application requires,” Akella said. “That’s not to say that the human is never in the picture, but we want to minimize the babysitting part of what we need to do when it comes to flying UAVs.”
Akella is also working to make the UAVs fly quickly, at about 20 meters per second while carrying about a pound of payload for 20 minutes of flight time.
Allowing the UAVs to see and rapidly respond to environments without human or GPS direction helps expand where they can function, and the efficiency of task execution, Akella said. The exteriors of most buildings block outdoor radio and GPS signals. And humans can be clumsy pilots, especially for fast vehicles operating within cluttered environments.
Akella envisions agile, autonomous UAVs in a variety of important roles: explorers on other planets, first-responders searching for signs of life inside of a collapsed building, or tools in war zones where GPS signals have been purposefully jammed.
To test these developments in vision-enabled autonomy, Akella and his team use a wide suite of UAV models, both commercial and built-in-house quadcopters and helicopters, ranging in size from a tea saucer to a large dinner plate.
Attached to each UAV are three basic components: a battery, a computer board that integrates flight algorithms, and a digital camera.
A UAV “sees” by continuously processing the image data collected by the onboard camera through programs that analyze it and tell the motors and propellers how to respond. Machine intelligence and learning algorithms help the UAV quickly recognize and respond to obstacles it has encountered before, so that processing time and power can be spent analyzing new or unfamiliar features in the environment.
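The idea of spending compute only on unfamiliar features can be sketched as a perceive-recognize-respond loop with a memory of known obstacles. The sketch below is purely illustrative; the class and function names (`ObstacleMemory`, `control_step`, and so on) are hypothetical stand-ins, not part of Akella's actual flight software.

```python
def feature_signature(region):
    """Reduce an image region to a hashable signature (stand-in for a
    real feature descriptor such as ORB or a learned embedding)."""
    return tuple(region)

class ObstacleMemory:
    """Cache of previously encountered obstacles. Recognizing a known
    obstacle is a cheap lookup, so the expensive analysis is reserved
    for unfamiliar features -- the behavior described above."""
    def __init__(self):
        self._known = {}

    def recall(self, region):
        return self._known.get(feature_signature(region))

    def learn(self, region, avoidance_command):
        self._known[feature_signature(region)] = avoidance_command

def expensive_analysis(region):
    """Stand-in for full image analysis of an unfamiliar region."""
    # A real system would run detection and planning here; this toy
    # version just steers away from very bright regions.
    return "steer_away" if max(region) > 200 else "hold_course"

def control_step(camera_frame_regions, memory):
    """One control cycle: look up each region in memory and analyze
    only the unfamiliar ones, remembering the result."""
    commands = []
    for region in camera_frame_regions:
        cmd = memory.recall(region)
        if cmd is None:                # unfamiliar: spend compute here
            cmd = expensive_analysis(region)
            memory.learn(region, cmd)  # cheap to recognize next time
        commands.append(cmd)
    return commands
```

On the second encounter with the same obstacle, `control_step` skips `expensive_analysis` entirely, which is the claimed savings in processing time and power.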
To increase processing and imaging power further, Akella is also working on developing UAVs that share data collection and processing duties as they fly together to execute a common goal, much like a flock of birds or a swarm of bees.
But while flocks of birds are usually limited to a single species, Akella is developing different models of UAVs to work together toward a common task. For example, small UAVs could speedily capture information on the environment and send the data to larger UAVs carrying out the majority of the processing. The processed data would then be sent back to the smaller UAVs to aid in navigation.
“The more heterogeneous the team is, in a way, your team is better,” Akella said.
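That scout-and-carrier division of labor might look roughly like the following sketch, where a small UAV forwards raw sensor data to a larger UAV with more compute and gets navigation hints back. The classes (`ScoutUAV`, `CarrierUAV`) and the obstacle test are hypothetical, invented here for illustration.

```python
class CarrierUAV:
    """Larger vehicle that carries out the heavy image processing."""
    def process(self, scout_id, raw_frame):
        # Stand-in for real obstacle detection on the raw sensor data:
        # flag any very bright pixels as obstacles.
        obstacles = [i for i, px in enumerate(raw_frame) if px > 200]
        return {"scout": scout_id, "obstacle_indices": obstacles}

class ScoutUAV:
    """Small, fast vehicle that gathers data but offloads processing."""
    def __init__(self, uav_id, carrier):
        self.uav_id = uav_id
        self.carrier = carrier

    def fly_step(self, raw_frame):
        # 1. Capture raw data and hand it off to the carrier.
        hints = self.carrier.process(self.uav_id, raw_frame)
        # 2. Use the returned hints to choose a maneuver.
        return "avoid" if hints["obstacle_indices"] else "proceed"
```

The point of the heterogeneity is visible in the split: the scout's `fly_step` does almost no work itself, so it can stay small and fast while still benefiting from the carrier's processing power.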
Akella’s vision-enabled autonomous UAVs are still in development. They have been tested in short outdoor flights, with indoor flight tests planned for later this year. He invites the public to watch these test flights so people can become comfortable with the technology and its possibilities.
“The outdoor field tests we’ve done in the past we always had a public invitation and we expect continuing that when we go indoors,” Akella said. “We are always happy to share our excitement with the broader public.”