Extreme Parkour with Legged Robots
Author
Cheng et al.
2024
Publication type
Journal article
Language
English
Keywords
Summary
Humans can perform parkour by traversing obstacles in a highly dynamic fashion requiring precise eye-muscle coordination and movement. Getting robots to do the same task requires overcoming similar challenges. Classically, this is done by independently engineering perception, actuation, and control systems to very low tolerances. This restricts them to tightly controlled settings such as a predetermined obstacle course in labs. In contrast, humans are able to learn parkour through practice without significantly changing their underlying biology. In this paper, we take a similar approach to developing robot parkour on a small low-cost robot with imprecise actuation and a single front-facing depth camera for perception which is low-frequency, jittery, and prone to artifacts. We show how a single neural net policy operating directly from a camera image, trained in simulation with large-scale RL, can overcome imprecise sensing and actuation to output highly precise control behavior end-to-end. We show our robot can perform a high jump on obstacles 2x its height, long jump across gaps 2x its length, do a handstand and run across tilted ramps, and generalize to novel obstacle courses with different physical properties.
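The abstract describes a single end-to-end policy that maps a noisy, front-facing depth image (together with the robot's own state) to precise joint-level commands. The snippet below is a minimal, hypothetical sketch of what such a depth-to-action policy could look like; the layer sizes, input shapes, and the `DepthParkourPolicy` name are illustrative assumptions, not the authors' actual architecture or training code.

```python
# Hypothetical sketch: an end-to-end policy mapping a depth image plus
# proprioception to joint targets, as described at a high level in the
# abstract. Shapes and layer sizes are assumptions for illustration only.
import torch
import torch.nn as nn

class DepthParkourPolicy(nn.Module):
    def __init__(self, proprio_dim: int = 48, num_joints: int = 12):
        super().__init__()
        # Small CNN encoder for a low-resolution, jittery depth frame (1x64x64 assumed).
        self.depth_encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.LazyLinear(128), nn.ReLU(),
        )
        # MLP head fuses the depth embedding with proprioceptive state and
        # outputs target joint positions for a low-level controller.
        self.head = nn.Sequential(
            nn.Linear(128 + proprio_dim, 256), nn.ELU(),
            nn.Linear(256, 128), nn.ELU(),
            nn.Linear(128, num_joints),
        )

    def forward(self, depth: torch.Tensor, proprio: torch.Tensor) -> torch.Tensor:
        z = self.depth_encoder(depth)                        # (B, 128)
        return self.head(torch.cat([z, proprio], dim=-1))    # (B, num_joints)

# Example forward pass with dummy inputs.
policy = DepthParkourPolicy()
depth = torch.randn(1, 1, 64, 64)    # one noisy depth frame
proprio = torch.randn(1, 48)         # joint angles, velocities, etc.
joint_targets = policy(depth, proprio)
print(joint_targets.shape)           # torch.Size([1, 12])
```

In the paper's setting such a policy would be trained in simulation with large-scale RL and then deployed directly from the onboard camera; the training loop itself is not shown here.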
Url