
Humanoid navigation planning using future perceptive capability

Philipp Michel, Joel Chestnutt, Satoshi Kagami, Koichi Nishiwaki, James Kuffner, Takeo Kanade
The Robotics Institute, Carnegie Mellon University, Pittsburgh, PA
8th IEEE-RAS International Conference on Humanoid Robots, 2008. Humanoids 2008

@inproceedings{michel2008humanoid,
   title={Humanoid Navigation Planning using Future Perceptive Capability},
   author={Michel, P. and Chestnutt, J. and Kagami, S. and Nishiwaki, K. and Kuffner, J. and Kanade, T.},
   booktitle={Humanoid Robots, 2008. Humanoids 2008. 8th IEEE-RAS International Conference on},
   pages={507--514},
   year={2008},
   organization={IEEE}
}

We present an approach to navigation planning for humanoid robots that aims to ensure reliable execution by augmenting the planning process to reason about the robot's ability to successfully perceive its environment during operation. By efficiently simulating the robot's perception system during search, our planner generates a metric, the so-called perceptive capability, that quantifies the 'sensability' of the environment in each state given the task to be accomplished. We have applied our method to the problem of planning robust autonomous walking sequences as performed by an HRP-2 humanoid. A fast GPU-accelerated 3D tracker is used for perception, with a footstep planner incorporating reasoning about the robot's perceptive capability. When combined with a controller capable of adaptively adjusting the height of swing leg trajectories, HRP-2 is able to navigate around obstacles and climb stairs in dynamically changing environments. Reasoning about the future perceptive capability ensures that sensing remains operational throughout the walking sequence and yields higher task success rates than perception-unaware planning.
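To give a feel for the idea described in the abstract, the following Python sketch shows one way a search-based planner could fold a perceptive-capability term into its edge cost: states whose simulated sensor view covers little of the task-relevant terrain are penalized during search. This is an illustrative, hypothetical example under simplified assumptions (a 2D grid, a toy forward-facing sensor model, and made-up names such as perceptive_capability and W_PERCEPTION), not the authors' actual footstep planner or perception system.

import heapq
import math

SENSOR_RANGE = 3          # cells the (hypothetical) forward-facing sensor can see
W_PERCEPTION = 4.0        # weight on the perception-awareness penalty

def simulated_view(state):
    """Cells visible from 'state' under a toy forward-facing sensor model."""
    x, y = state
    return {(x + dx, y + dy)
            for dx in range(0, SENSOR_RANGE + 1)
            for dy in range(-SENSOR_RANGE, SENSOR_RANGE + 1)
            if math.hypot(dx, dy) <= SENSOR_RANGE}

def task_relevant_cells(state, goal):
    """Cells on the straight-line corridor from 'state' toward 'goal'."""
    (x0, y0), (x1, y1) = state, goal
    n = max(abs(x1 - x0), abs(y1 - y0), 1)
    return {(round(x0 + (x1 - x0) * t / n), round(y0 + (y1 - y0) * t / n))
            for t in range(n + 1)}

def perceptive_capability(state, goal):
    """Fraction of task-relevant cells the robot could sense from 'state'."""
    relevant = task_relevant_cells(state, goal)
    return len(relevant & simulated_view(state)) / len(relevant)

def plan(start, goal, obstacles):
    """A*-style search over grid states; cost penalizes poorly sensable states."""
    def h(s):  # straight-line distance heuristic
        return math.hypot(goal[0] - s[0], goal[1] - s[1])
    frontier = [(h(start), 0.0, start, [start])]
    closed = set()
    while frontier:
        _, g, s, path = heapq.heappop(frontier)
        if s == goal:
            return path
        if s in closed:
            continue
        closed.add(s)
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (s[0] + dx, s[1] + dy)
            if nxt in obstacles or nxt in closed:
                continue
            # Penalty grows as the environment becomes harder to sense from 'nxt'.
            penalty = W_PERCEPTION * (1.0 - perceptive_capability(nxt, goal))
            g2 = g + 1.0 + penalty
            heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
    return None

if __name__ == "__main__":
    # Tiny usage example: plan around a small obstacle while favoring sensable states.
    print(plan((0, 0), (6, 0), obstacles={(3, 0), (3, 1)}))

The key design choice mirrored here is that perception is evaluated per candidate state during search, rather than checked after a path is found, so the returned plan already avoids states from which the remaining task cannot be adequately sensed.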
