It’s one thing to make a robot that can run around, but what about one that can do it blind? MIT’s new Cheetah 3 quadrupedal robot answers that question by covering rough terrain in leaps, pounces and gallops, and even navigating stairs – all without sight. Instead, the dog-sized, 90-lb (41 kg) automaton relies on internal sensors and algorithms to produce what the engineering team call “blind locomotion.”
As humans, we often have a very skewed view of what’s easy and what’s difficult. We think that doing calculus or playing chess at the Grandmaster level is really hard, but walking is really easy. That’s one of the reasons old science fiction stories are filled with robots that have the intellect of a ZX-80 but are perfectly capable of going for a stroll.
In fact, walking is extremely difficult to do well, even if the robot in question has four legs instead of two. It only seems easy to us because we can do it without thinking. One reason for this is that walking is literally something you can do with your eyes closed. Though we rely heavily on vision for navigation and avoiding obstacles, our sense of touch, combined with a mental image of our surroundings and an in-built capacity to tell where our limbs are, how much force is acting on them, and what they’re about to do, takes up a lot of the slack.
As part of developing the latest iteration of its Cheetah robot series, the MIT team led by associate professor of mechanical engineering Sangbae Kim is working on improving robotic locomotion by relying less on visual systems that are often inaccurate and slow in favor of tactile information from internal sensors. The Cheetah 3, which also has an expanded range of motion that lets it stretch backwards and forwards, and twist from side to side, has been programmed with two new algorithms to help it sense its environment and predict the movement of its legs.
According to MIT, the contact detection algorithm updates the Cheetah 3 on the status of each limb 20 times per second, using gyroscopes, accelerometers, and joint positions to determine how best to keep moving them. It does this by estimating three probabilities for each leg: the probability that the leg is making ground contact, the probability that the measured force means the leg has struck the ground, and the likelihood that the leg is still in mid-swing. This way, the robot can decide when to switch a leg from swinging to planting it and completing a step.
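MIT hasn’t published the algorithm itself, but the fusion step described above can be sketched as a toy probability calculation. Everything here – the function names, the odds-based fusion rule, and the 0.5 decision threshold – is an assumption for illustration, not the team’s actual implementation:

```python
def contact_probability(p_gait: float, p_force: float, p_kinematics: float) -> float:
    """Fuse three independent estimates of ground contact into one probability.

    p_gait:       probability of contact based on where the leg is in its gait cycle
    p_force:      probability that the measured force means the leg has hit the ground
    p_kinematics: probability of contact inferred from gyroscope, accelerometer,
                  and joint-position data

    Assumed fusion rule: treat the estimates as independent and multiply
    their odds, then convert back to a probability.
    """
    odds = 1.0
    for p in (p_gait, p_force, p_kinematics):
        p = min(max(p, 1e-6), 1.0 - 1e-6)  # clamp to avoid division by zero
        odds *= p / (1.0 - p)
    return odds / (1.0 + odds)


def leg_state(p_contact: float, threshold: float = 0.5) -> str:
    """Decide whether to treat the leg as planted (stance) or still in mid-swing."""
    return "stance" if p_contact >= threshold else "swing"


# Three strong contact signals reinforce each other into near-certainty:
print(leg_state(contact_probability(0.9, 0.9, 0.9)))  # stance
print(leg_state(contact_probability(0.1, 0.1, 0.1)))  # swing
```

The point of fusing several weak estimates rather than trusting any single sensor is that one noisy reading – a spurious force spike, say – can’t flip the decision on its own.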
What this means in practice is that the Cheetah 3 can not only gallop along, but it can compensate if it steps on some unexpected object or finds that it’s on a descending set of stairs. The robot can then calculate how to tilt and how to reposition its legs and when to plant them to maintain its balance, as was demonstrated when the team set it to running on a laboratory treadmill and climbing a flight of stairs – with obstacles such as wooden blocks and rolls of tape thrown in for good measure.
The second algorithm allows the Cheetah 3 to improve its walking by predicting how much force it needs to budget to each leg once it decides to take a step. The model-predictive control algorithm works by calculating the likely positions of the legs and body half a second into the future if various levels of force are applied to each limb. This not only keeps the robot generally stable, but can also help out if someone decides to be mean to it – as so often happens in robotics labs these days.
“Say someone kicks the robot sideways,” says Kim. “When the foot is already on the ground, the algorithm decides, ‘How should I specify the forces on the foot? Because I have an undesirable velocity on the left, so I want to apply a force in the opposite direction to kill that velocity. If I apply 100 newtons in this opposite direction, what will happen a half second later?’”
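Kim’s kick example can be boiled down to a minimal sketch: simulate a simple point mass half a second ahead under each candidate foot force, and pick the force whose predicted outcome best cancels the unwanted velocity. The one-dimensional model, the candidate force list, and the 1.2 m/s kick velocity are assumptions for illustration; only the 41 kg mass, the half-second horizon, and the 20 Hz update rate come from the article:

```python
MASS = 41.0    # kg, the Cheetah 3's reported weight
HORIZON = 0.5  # seconds to look ahead
DT = 0.05      # integration time step (assumed)


def predicted_velocity(v0: float, force: float) -> float:
    """Integrate a 1-D point mass forward over the horizon under a constant force."""
    v = v0
    for _ in range(int(round(HORIZON / DT))):
        v += (force / MASS) * DT
    return v


def best_force(v0: float, candidates) -> float:
    """Pick the candidate force whose predicted end velocity is closest to zero."""
    return min(candidates, key=lambda f: abs(predicted_velocity(v0, f)))


# A sideways kick leaves the body drifting left at 1.2 m/s (negative direction).
# Roughly 100 N applied the other way for half a second cancels it:
chosen = best_force(-1.2, candidates=[0.0, 50.0, 100.0, 150.0])
print(chosen)  # 100.0
```

The real controller optimizes forces for all four legs over the same kind of short horizon, replanning each cycle rather than committing to one force – but the core idea of scoring candidate forces by their predicted future effect is the same.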
To prove the point, the team gave the Cheetah 3 a hard time. As it ran on the treadmill, they took turns kicking it, shoving it, and even yanking its lead as it climbed the stairs. In each case, the algorithm allowed it to exert counter forces and compensate to maintain its balance.
Though the Cheetah 3 currently runs blind, the team has also fitted the robot with cameras to help it map its surroundings. The visual sensors are switched off for now so the team can concentrate on perfecting blind locomotion. Once the cameras are brought online, the algorithms will still be able to compensate if the cameras provide false data or the robot steps on something it can’t see. Eventually, the hope is to produce a robot that can go into dangerous and inaccessible places in place of humans.
“Cheetah 3 is designed to do versatile tasks such as power plant inspection, which involves various terrain conditions including stairs, curbs, and obstacles on the ground,” says Kim. “I think there are countless occasions where we [would] want to send robots to do simple tasks instead of humans. Dangerous, dirty, and difficult work can be done much more safely through remotely controlled robots.”
Source: New Atlas