U-M Receives $1M Grant for Bipedal First Responder Robots Project

Digit, a bipedal robot, is tested in the Ronald D. and Regina C. McNeil Walking Robotics Laboratory in the Ford Robotics Building at the University of Michigan in Ann Arbor. The university received a $1 million grant from the National Science Foundation for real-time robot navigation on bipedal systems (pictured) to be used alongside first responders and more. // Photograph by Joseph Xu/University of Michigan Engineering, Communications and Marketing

A new University of Michigan research project funded by a $1 million grant from the National Science Foundation will enable robots to navigate in real time without the need for a preexisting map of the terrain they traverse.

The goal of the project is to have robots work alongside first responders such as wildfire fighters.

The project seeks to take bipedal walking robots to a new level by equipping them to adapt on the fly to treacherous ground, dodge obstacles, and decide whether a given area is safe for walking. It could also enable robots to go into areas too dangerous for humans and lead to more intuitive prosthetics.

“I envision a robot that can walk autonomously through the forest here on North Campus (in Ann Arbor) and find an object we’ve hidden. That’s what’s needed for robots to be useful in search and rescue, and no robot right now can do it,” says Jessy Grizzle, principal investigator on the project and the Elmer G. Gilbert Distinguished University Professor of Engineering at U-M.

Grizzle will partner with Maani Ghaffari Jadidi, an assistant professor of naval architecture and marine engineering. The two will embrace an approach called “full-stack robotics,” integrating a series of new and existing pieces of technology into a single, open-source perception and movement system that can be adapted to robots beyond those used in the project itself.

“What full-stack robotics means is that we’re attacking every layer of the problem at once and integrating them together,” says Grizzle. “Up to now, a lot of roboticists have been solving very specific individual problems. With this project, we aim to integrate what has already been done into a cohesive system, then identify its weak points and develop new technology where necessary to fill in the gaps.”
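To make that layered picture concrete, here is a minimal sketch in Python of how a full-stack loop might be wired together, with each layer as a pluggable component. Everything here is a hypothetical illustration; the class, component names, and interfaces are ours, not the project's actual architecture.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class RobotStack:
    # Each layer is a pluggable component, so an improved perception or
    # planning module can drop in without rewriting the rest of the stack.
    perception: Callable   # raw sensor data -> terrain observations
    mapping: Callable      # observations + current map -> updated map
    planner: Callable      # map + goal -> footstep plan
    controller: Callable   # plan -> joint commands
    world_map: dict = field(default_factory=dict)

    def step(self, sensors, goal):
        """One pass through the full stack: sense, map, plan, act."""
        observations = self.perception(sensors)
        self.world_map = self.mapping(observations, self.world_map)
        plan = self.planner(self.world_map, goal)
        return self.controller(plan)

# Stand-in lambdas in place of real perception/planning/control modules.
stack = RobotStack(
    perception=lambda sensors: sensors,
    mapping=lambda obs, world: {**world, "latest": obs},
    planner=lambda world, goal: ["step toward", goal],
    controller=lambda plan: {"joint_commands": plan},
)
print(stack.step(sensors={"lidar": []}, goal=(3.0, 0.0)))
```

The point of the structure is the one Grizzle describes: each layer solves its own problem, but they are integrated behind a single loop, so a weak layer can be swapped out without disturbing the others.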

One area of focus will be mapping. The project aims to enable robots to build rich, multidimensional maps from real-time sensory input. The robots are expected to accomplish this mathematically, for example by calculating the standard deviation of ground-height variation in an area or estimating how slippery a surface is.
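As a rough illustration of that kind of calculation, the sketch below scores terrain cells by the standard deviation of the ground heights observed in a scan: a high value flags rough or uneven ground, a low value suggests walkable terrain. The grid size, function names, and sample data are assumptions for illustration, not the project's code.

```python
import math
from collections import defaultdict

CELL_SIZE = 0.25  # grid resolution in meters (assumed value)

def cell_key(x, y):
    """Map a world-frame (x, y) point to a coarse grid cell."""
    return (math.floor(x / CELL_SIZE), math.floor(y / CELL_SIZE))

def terrain_roughness(points):
    """Given (x, y, z) ground points from a depth sensor, return the
    standard deviation of height z within each grid cell."""
    heights = defaultdict(list)
    for x, y, z in points:
        heights[cell_key(x, y)].append(z)
    roughness = {}
    for cell, zs in heights.items():
        mean = sum(zs) / len(zs)
        variance = sum((z - mean) ** 2 for z in zs) / len(zs)
        roughness[cell] = math.sqrt(variance)
    return roughness

# Example: flat ground on the left, a jumbled surface on the right.
scan = [(0.1, 0.1, 0.00), (0.2, 0.1, 0.01), (0.1, 0.2, 0.00),
        (1.1, 0.1, 0.00), (1.2, 0.1, 0.15), (1.1, 0.2, -0.12)]
for cell, sigma in sorted(terrain_roughness(scan).items()):
    print(cell, round(sigma, 3))  # the right-hand cell scores far rougher
```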

The system is expected to include sophisticated perceptual tools that help robots gather data by analyzing their limbs' actions: a slip on an icy surface or a kick against a tree root, for example, would each generate a new data point. The system will also help robots negotiate loose ground and moving objects, such as rolling branches.
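A limb-based observation of this kind could be as simple as the hypothetical sketch below, which flags a stance foot that is still moving as a slip and folds that observation into a per-cell slipperiness score. The threshold, blending rate, and function names are invented for illustration and are not drawn from the project.

```python
SLIP_SPEED_THRESHOLD = 0.05  # m/s; a planted foot moving faster is slipping (assumed)

def detect_slip(in_stance, foot_speed):
    """A stance foot should be stationary; measurable motion implies slip."""
    return in_stance and foot_speed > SLIP_SPEED_THRESHOLD

def update_slipperiness(terrain_map, cell, slipped, rate=0.2):
    """Blend each contact observation into a per-cell slipperiness score
    in [0, 1], so repeated slips gradually mark the cell as hazardous."""
    prior = terrain_map.get(cell, 0.0)
    terrain_map[cell] = (1 - rate) * prior + rate * (1.0 if slipped else 0.0)

terrain_map = {}
# Two contacts in the same cell: one clean footfall, one slip on ice.
update_slipperiness(terrain_map, (4, 0), detect_slip(True, 0.01))
update_slipperiness(terrain_map, (4, 0), detect_slip(True, 0.30))
print(terrain_map)  # {(4, 0): 0.2} -> the score rises after the slip
```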

In addition to developing new technology, the team will collaborate with the U-M School of Education on outreach to a Detroit high school, sharing the project's materials with students and building interest in robotics.

“A shared understanding of the environment between humans and robots is essential, because the more a human team can see, the better they can interpret what the robot team is trying to accomplish,” says Ghaffari. “And that can help humans to make better decisions about what other resources need to be brought in or how the mission should proceed.”