Developing robots that can walk more naturally

The COMAN robot is designed to help humans. Here it is carrying a table with scientist Hamed Razavi. © 2017 EPFL / A. Herzog

Walking on two legs isn’t as easy as it seems. For robots and their designers, it is an even bigger challenge! Researchers at EPFL’s Biorobotics Laboratory are testing novel algorithms to improve humanoids’ ability to walk and interact with humans.

For humans, it comes perfectly naturally. But walking on two legs is actually a complicated task, requiring several muscles to perform delicate balancing acts. That’s why, in spite of years of major technological advancements in the field, humanoid robots are still far from being able to get around easily and reliably. Engineers at EPFL’s Biorobotics Laboratory are testing new walking algorithms on a platform called COMAN, short for COmpliant HuMANoid. This 95-cm-tall humanoid is designed specifically for studying walking – which is why it has no head.

COMAN was developed under the EU AMARSi project and is being used by several research teams. The EPFL team is looking specifically at the “brains” of the machine. “We developed algorithms that can improve the robot’s balance while it’s walking,” says Hamed Razavi, a research scientist at the Biorobotics Lab.

In harmony with symmetries

One of COMAN’s distinguishing features is its joints, which incorporate elastic elements that give it greater flexibility when performing different tasks. The EPFL team came up with a novel control algorithm for the robot, based on the symmetries present in the robot’s structure and dynamics and in the mathematical equations that describe them. “You could say we’re working in harmony with these symmetries rather than against them. As a result, we obtain a more natural and robust walking gait,” says Razavi.
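
The article doesn’t spell out the math, but one common way to express this kind of gait symmetry – offered here purely as an illustration, not as the lab’s actual controller – is the constraint that the left leg repeats the right leg’s joint trajectory half a gait period later. A minimal Python sketch of that check, with hypothetical joint-trajectory arrays:

```python
import numpy as np

def gait_symmetry_error(q_left, q_right, dt, period):
    """RMS deviation from left-right gait symmetry.

    Illustrative notion of symmetry (not the COMAN controller itself):
    a symmetric gait satisfies q_left(t) ~= q_right(t + T/2), i.e. the
    left leg replays the right leg's joint trajectory half a period later.

    q_left, q_right: joint-angle arrays of shape (n_samples, n_joints)
    dt: sampling interval in seconds
    period: gait period T in seconds
    """
    shift = int(round((period / 2) / dt))   # half a period, in samples
    diff = q_left[:-shift] - q_right[shift:]  # compare overlapping windows
    return float(np.sqrt(np.mean(diff ** 2)))  # asymmetry in radians
```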

The control algorithm continuously analyzes the data received from the robot – including its position, velocity and joint angles – and sends appropriate commands to the motors, telling them what to do in order to maintain the robot’s balance. “If someone pushes COMAN, for example, our algorithms will calculate exactly where its foot should land in order to counteract the perturbation,” says Razavi.
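
The article doesn’t give the controller’s equations, but the behaviour it describes – choosing where the foot should land to absorb a push – is classically captured by the “capture point” of the linear inverted pendulum model. The sketch below uses that textbook formula with made-up numbers, standing in for COMAN’s more sophisticated controller:

```python
import math

GRAVITY = 9.81  # m/s^2

def capture_point(com_pos, com_vel, com_height):
    """Foot placement that brings the robot to rest, per the linear
    inverted pendulum model: x_cp = x + x_dot * sqrt(z0 / g).

    com_pos: center-of-mass position along the walking direction (m)
    com_vel: center-of-mass velocity (m/s)
    com_height: constant pendulum height z0 (m) -- ~0.5 m is a
                hypothetical value for a 95-cm robot
    """
    omega = math.sqrt(GRAVITY / com_height)  # pendulum frequency (1/s)
    return com_pos + com_vel / omega

# A push adds forward velocity, so the foot must land further ahead.
print(capture_point(com_pos=0.0, com_vel=0.0, com_height=0.5))  # 0.00 m
print(capture_point(com_pos=0.0, com_vel=0.4, com_height=0.5))  # ~0.09 m
```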

Climbing stairs and opening doors

The algorithms are geared towards three types of real-world applications. The first is carrying out rescue missions in disaster scenarios. “In environments designed by humans – like a nuclear power plant where there are stairs to climb and doors to open – humanoid robots can get around more easily than robots with wheels,” says Razavi. The second is helping with tasks like carrying heavy boxes or moving objects (see box). And the third is creating exoskeletons for the disabled.

“Making the robots more stable is just the tip of the iceberg,” says Razavi. The next step is refining the algorithms so that the humanoids have a wider range of movement and can overcome obstacles and walk on irregular or sloped surfaces.

Humanoids helping humans

As part of this project, Jessica Lanini and Hamed Razavi studied how two people carrying an object together are able to walk, turn and speed up in a coordinated manner – without communicating with each other. Their findings, recently published in PLOS ONE, indicate that the two people automatically synchronize their steps, moving together like the legs of a single quadruped. Now the researchers plan to apply their results to humanoid robots.
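
A standard way to model this kind of mutual entrainment – familiar from the Biorobotics Lab’s work on coupled oscillators, though not claimed by the PLOS ONE paper itself – is a pair of phase oscillators, one per walker, that pull each other into step. A toy sketch:

```python
import math

def locked_phase_gap(freq_a, freq_b, coupling, steps=2000, dt=0.005):
    """Two coupled phase oscillators, one per walker.

    d(phi_a)/dt = 2*pi*freq_a + coupling * sin(phi_b - phi_a),
    and symmetrically for phi_b. With enough coupling, the phase
    difference locks near zero: the walkers fall into step even
    though their preferred cadences (freq_a, freq_b, in steps per
    second) differ. Illustrative only; the study measured real steps.
    """
    phi_a, phi_b = 0.0, math.pi / 2  # start out of step
    for _ in range(steps):
        phi_a += dt * (2 * math.pi * freq_a + coupling * math.sin(phi_b - phi_a))
        phi_b += dt * (2 * math.pi * freq_b + coupling * math.sin(phi_a - phi_b))
    return (phi_b - phi_a) % (2 * math.pi)

print(locked_phase_gap(freq_a=1.8, freq_b=1.9, coupling=4.0))  # ~0.08 rad: in step
```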

“Whether for manufacturing or natural disasters, we need robots that can interact with humans and help us carry heavy objects,” says Lanini. “But such robots don’t exist. That’s because, in order to operate safely and effectively, the robots would need to be able to make decisions and respond to unexpected circumstances.”

The researchers decided to observe humans, who perform these tasks better and more naturally than robots. They analyzed the way humans move and found that factors such as speed, force and hand position play a pivotal role in understanding “commands” like speeding up or stopping. The next step is modeling these observations in order to program the robots. “What exactly makes us realize we should slow down or turn? The applied force? A combination of force and speed? The boundary is not yet clear,” says Razavi.
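
Until that boundary is pinned down, one can still sketch the flavour of the problem: the robot reads interaction features like force and speed and maps them to a command. The thresholds below are entirely hypothetical placeholders – Lanini and Razavi’s point is precisely that the true decision rule is still unknown:

```python
def infer_partner_intent(force, speed):
    """Toy decision rule mapping interaction features to a command.

    force: interaction force along the walking direction, in newtons
           (positive = partner pulls forward)
    speed: current walking speed in m/s

    The 15 N and 0.1 m/s thresholds are made-up placeholders,
    not values measured in the study.
    """
    if force > 15.0:                       # strong pull forward
        return "speed up"
    if force < -15.0:                      # strong push backward
        return "slow down" if speed > 0.1 else "stop"
    return "keep pace"

print(infer_partner_intent(force=20.0, speed=1.0))    # -> speed up
print(infer_partner_intent(force=-20.0, speed=0.05))  # -> stop
```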

Clara Marc