Google advances in robotics

Google researchers are using imitation learning to teach autonomous robots how to walk, turn, and move with greater agility.


What they did: Using motion-capture data recorded from various sensors attached to a dog, the researchers taught a quadruped robot named Laikago several different movements that are hard to achieve through traditional hand-coded robotic controls.

How they did it: First, they used the motion data from the real dog to construct simulations of each maneuver, including a dog trot, a side-step, and … a dog version of the classic ’80s dance move, the running man. (The last one was not, in fact, performed by the real dog itself. The researchers manually animated the simulated dog to dance to see if that would translate to the robot as well.) They then matched key joints on the simulated dog and the robot so that the simulated robot moved in exactly the same way as the animal. Using reinforcement learning, the simulated robot then learned to stabilize the movements and correct for differences in weight distribution and design. Finally, the researchers were able to transfer the final control algorithm to a physical robot in the lab, though some moves, like the running man, weren’t entirely successful.
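Roughly speaking, the pipeline retargets the dog's recorded joint trajectories onto the robot's joints and then rewards a reinforcement-learning policy for tracking the reference motion. The sketch below is a minimal, hypothetical illustration of that idea in plain NumPy; the function names (retarget, imitation_reward), the joint mapping, and the exponential tracking weight are assumptions made for illustration, not details taken from Google's system.

    # Hypothetical sketch of the motion-imitation idea described above:
    # map reference joint trajectories from the dog onto the robot's joints,
    # then score each simulated step by how closely the robot tracks them.
    # All names here are illustrative, not from the actual research code.

    import numpy as np

    def retarget(dog_angles: np.ndarray, joint_map: dict, scale: np.ndarray) -> np.ndarray:
        """Map key dog joints onto robot joints (assumed simple 1:1 mapping)."""
        robot_angles = np.zeros((dog_angles.shape[0], len(joint_map)))
        for dog_j, robot_j in joint_map.items():
            robot_angles[:, robot_j] = dog_angles[:, dog_j] * scale[robot_j]
        return robot_angles

    def imitation_reward(robot_angles: np.ndarray, ref_angles: np.ndarray, w: float = 5.0) -> float:
        """Reward is highest when the robot's pose matches the reference frame."""
        err = np.sum((robot_angles - ref_angles) ** 2)
        return float(np.exp(-w * err))

    if __name__ == "__main__":
        # Toy usage: a fake two-joint "trot" trajectory and a robot pose slightly off-target.
        t = np.linspace(0, 2 * np.pi, 100)
        dog_traj = np.stack([np.sin(t), np.cos(t)], axis=1)   # stand-in for mocap joint angles
        ref = retarget(dog_traj, joint_map={0: 0, 1: 1}, scale=np.ones(2))
        pose = ref[10] + 0.05                                  # simulated robot pose at step 10
        print(f"step reward: {imitation_reward(pose, ref[10]):.3f}")

In the actual work, the learned policy also has to compensate for the robot's different weight distribution and design before transfer to hardware, which this toy joint-angle comparison ignores.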

Why it matters: Teaching robots the complex and agile movements necessary to navigate the real world has been a long-standing challenge in the field. Imitation learning of this kind instead allows such machines to easily borrow the agility of animals and even humans.

Future work: Jason Peng, the lead author on the paper, says there are still a number of challenges to overcome. The robot's weight limits its ability to learn certain maneuvers, like big jumps or fast running. Additionally, capturing motion-sensor data from animals isn't always possible: it can be incredibly expensive and requires the animal's cooperation. (A dog is friendly; a cheetah, not so much.) The team plans to try using animal videos instead, which would make their technique far more accessible and scalable.

Source: https://www.technologyreview.com