Researchers at Virginia Tech are developing computer algorithms and sensors to help animal-inspired robots move more naturally, like their real-life counterparts.
Kaveh Hamed and his colleagues at VT's College of Engineering have launched four studies exploring how human and animal behavior can inform control software. Their goal is to develop programs that give mechanical limbs more natural movement.
Drawing Inspiration From Human And Animal Movement
Hamed first got the idea of translating human and animal movement into algorithms for robots from his personal life. He watched his one-year-old son learn to walk and his pet dog switch from a run to a trot whenever it approached him. All of these motions, he said, made him think about math.
"It seems like a simple problem," Hamed said.
"We do these things every day — we walk, run, climb stairs, step over gaps. But translating that to math and robots is challenging."
One of the team's projects focuses on applying bipedal (two-legged) locomotion to powered prosthetic legs. The VT researchers are working on decentralized control algorithms that could drive a powered prosthetic leg developed by colleagues at the University of Michigan.
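To illustrate what "decentralized" means here, the sketch below gives each joint its own feedback controller that uses only local measurements, with no central coordinator. It is a minimal illustration under our own assumptions (simple PD control, made-up gains and names), not the team's algorithm.

```python
# A minimal sketch of a decentralized control structure (illustrative only,
# not the VT team's algorithm): each joint runs its own local feedback law
# using only its own sensor readings, with no central coordinator.

from dataclasses import dataclass

@dataclass
class JointController:
    """Local PD controller for one joint; all names and gains are made up."""
    kp: float  # proportional gain on joint-angle error
    kd: float  # derivative gain damping joint velocity

    def torque(self, angle_ref: float, angle: float, velocity: float) -> float:
        # The commanded torque depends only on this joint's local state.
        return self.kp * (angle_ref - angle) - self.kd * velocity

# One controller per joint; swapping or retuning one joint's controller
# does not require changing the others, which is the appeal of the
# decentralized structure.
knee = JointController(kp=40.0, kd=2.0)
ankle = JointController(kp=25.0, kd=1.5)

print(knee.torque(angle_ref=0.3, angle=0.1, velocity=0.05))
print(ankle.torque(angle_ref=-0.1, angle=0.0, velocity=-0.02))
```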
Hamed and his team are also looking to leverage control algorithms, artificial intelligence, and sensors to improve the quadrupedal (four-legged) movement of robotic dogs.
The researchers took note of how most two- or four-legged robots lack movement that matches their real-life inspirations. They believe that even state-of-the-art machines still cannot exactly mimic the agility of animals such as dogs, cheetahs, and mountain lions. There seems to be a fundamental gap between the locomotion seen in robots and that of their biological counterparts.
Transforming Natural Movement Into Control Algorithms
For their work, Hamed and his colleagues are developing control algorithms that help recreate the agility and stability of animal movement. These will be combined with sensors arranged in a framework modeled on basic animal biology.
The researchers pointed to how humans and legged animals can still walk even with their eyes closed. In that case, locomotion is controlled primarily by the spinal cord, where oscillatory neurons communicate with one another to set the proper rhythmic motion. This function comes naturally to humans and animals, according to Hamed.
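A rough way to picture those oscillatory neurons is as a set of coupled phase oscillators, one per leg, that fall into rhythm purely through their coupling. The sketch below is our simplification (a Kuramoto-style model; the function, gains, and offsets are illustrative, not the researchers' equations), and the key point it shows is that the rhythm emerges with no external or visual input at all.

```python
# A toy central pattern generator: coupled phase oscillators that settle
# into a shared rhythm. This is our illustrative simplification, not the
# VT team's model; the rhythm emerges from the coupling alone, with no
# sensory (visual) input, much like walking with closed eyes.

import math

def step_cpg(phases, offsets, freq=2.0, coupling=4.0, dt=0.01):
    """Advance each leg's oscillator one time step; the coupling term
    pulls the legs toward their target phase offsets."""
    n = len(phases)
    new = []
    for i in range(n):
        dphi = 2 * math.pi * freq  # intrinsic stepping frequency
        for j in range(n):
            # Nudge oscillator i toward its desired offset relative to j.
            dphi += coupling * math.sin(
                phases[j] - phases[i] - (offsets[j] - offsets[i])
            )
        new.append((phases[i] + dphi * dt) % (2 * math.pi))
    return new

# Four legs with trot-like target offsets: diagonal pairs move together.
offsets = [0.0, math.pi, math.pi, 0.0]
phases = [0.1, 0.5, 2.0, 4.0]  # arbitrary starting phases
for _ in range(2000):
    phases = step_cpg(phases, offsets)
print([round(p, 2) for p in phases])  # diagonal pairs end up in phase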
However, when these legged organisms need to navigate more complex environments, such as a flight of stairs, they must use their vision, together with their brain, to interpret what they see.
By combining robust control algorithms with sensors, the researchers were able to reproduce this function in robotic dogs. The sensors gave the machines more control over their balance and motion, much like real animals.
The team also attached cameras and Light Detection and Ranging (LiDAR) sensors to give the robotic dogs machine vision, allowing them to better perceive potential obstacles around them.
With these features, Hamed and his team hope the robotic dogs will respond appropriately to their environment. The sensors take measurements of the robots' motion and surroundings, while the onboard computers calculate the robust control actions needed to help the machines navigate from point A to point B.
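In schematic terms, that sense-compute-act loop might look something like the sketch below. It is purely illustrative: the sensor stub, function names, and simple obstacle-avoidance rule are our own assumptions, far cruder than the robust controllers the team describes.

```python
# A highly simplified sense-compute-act loop (our illustration, not the
# team's controller). The "sensors" are stubbed with fake readings.

import math

def read_sensors():
    # Stand-in for onboard sensing (e.g. LiDAR and motion estimates):
    # returns the robot's position and the nearest obstacle's position.
    return {"pos": (1.0, 2.0), "obstacle": (2.0, 2.5)}

def control_step(goal, state, avoid_radius=1.0):
    """Steer toward the goal, veering away from any obstacle closer than
    avoid_radius. Returns a (heading, speed) command."""
    px, py = state["pos"]
    gx, gy = goal
    heading = math.atan2(gy - py, gx - px)
    ox, oy = state["obstacle"]
    if math.hypot(ox - px, oy - py) < avoid_radius:
        # Blend in a direction pointing away from the obstacle.
        away = math.atan2(py - oy, px - ox)
        heading = (heading + away) / 2
    speed = min(1.0, math.hypot(gx - px, gy - py))  # slow down near the goal
    return heading, speed

goal = (5.0, 5.0)  # "point B"
heading, speed = control_step(goal, read_sensors())
print(f"heading={math.degrees(heading):.1f} deg, speed={speed:.2f} m/s")
```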
The robotic dogs used in the experiments have successfully mimicked several different gaits of real animals. So far, they have been able to amble, trot, and run, even at sharp angles, with better agility, balance, and speed than other robots.
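For context on what distinguishes those gaits, quadruped gaits are commonly described by the phase offsets between legs within a stride cycle. The sketch below uses textbook-style offsets (our illustrative values, particularly for the amble, whose timing varies; these are not figures reported by the team) to show which legs are on the ground at a given moment.

```python
# Illustrative phase offsets for two quadruped gaits, as fractions of the
# stride cycle for legs (LF, RF, LH, RH). Textbook-style values, not the
# VT team's data; amble timings in particular vary between animals.

GAITS = {
    "amble": (0.25, 0.75, 0.0, 0.5),  # lateral-sequence, walk-like
    "trot":  (0.0, 0.5, 0.5, 0.0),    # diagonal pairs move together
}

def stance_legs(gait, t, duty=0.6):
    """Legs on the ground at normalized cycle time t in [0, 1), assuming
    each foot spends `duty` of the cycle in stance."""
    legs = ("LF", "RF", "LH", "RH")
    return [leg for leg, off in zip(legs, GAITS[gait])
            if (t - off) % 1.0 < duty]

for t in (0.0, 0.25, 0.5, 0.75):
    print(f"trot t={t}: {stance_legs('trot', t)}")
```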
The VT researchers are now working to incorporate AI into their control algorithms to help improve the machines' decision-making in real-world settings.