{"id":2743,"date":"2011-02-06T12:38:55","date_gmt":"2011-02-06T12:38:55","guid":{"rendered":"http:\/\/hgpu.org\/?p=2743"},"modified":"2011-02-06T12:38:55","modified_gmt":"2011-02-06T12:38:55","slug":"gpu-accelerated-real-time-3d-tracking-for-humanoid-locomotion-and-stair-climbing","status":"publish","type":"post","link":"https:\/\/hgpu.org\/?p=2743","title":{"rendered":"GPU-accelerated real-time 3D tracking for humanoid locomotion and stair climbing"},"content":{"rendered":"<p>For humanoid robots to fully realize their biped potential in a three-dimensional world and step over, around or onto obstacles such as stairs, appropriate and efficient approaches to execution, planning and perception are required. To this end, we have accelerated a robust model-based three-dimensional tracking system by programmable graphics hardware to operate online at frame-rate during locomotion of a humanoid robot. The tracker recovers the full 6 degree-of-freedom pose of viewable objects relative to the robot. Leveraging the computational resources of the GPU for perception has enabled us to increase our tracker&#8217;s robustness to the significant camera displacement and camera shake typically encountered during humanoid navigation. We have combined our approach with a footstep planner and a controller capable of adaptively adjusting the height of swing leg trajectories. The resulting integrated perception-planning-action system has allowed an HRP-2 humanoid robot to successfully and rapidly localize, approach and climb stairs, as well as to avoid obstacles during walking.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>For humanoid robots to fully realize their biped potential in a three-dimensional world and step over, around or onto obstacles such as stairs, appropriate and efficient approaches to execution, planning and perception are required. 
To this end, we have accelerated a robust model-based three-dimensional tracking system by programmable graphics hardware to operate online at frame-rate [&hellip;]<\/p>\n","protected":false},"author":351,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[11,73,3],"tags":[444,1782,1791,20,247,402],"class_list":["post-2743","post","type-post","status-publish","format-standard","hentry","category-computer-science","category-computer-vision","category-paper","tag-cg","tag-computer-science","tag-computer-vision","tag-nvidia","tag-nvidia-geforce-7800-gtx","tag-video-tracking"],"views":1900,"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/hgpu.org\/index.php?rest_route=\/wp\/v2\/posts\/2743","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/hgpu.org\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hgpu.org\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hgpu.org\/index.php?rest_route=\/wp\/v2\/users\/351"}],"replies":[{"embeddable":true,"href":"https:\/\/hgpu.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2743"}],"version-history":[{"count":0,"href":"https:\/\/hgpu.org\/index.php?rest_route=\/wp\/v2\/posts\/2743\/revisions"}],"wp:attachment":[{"href":"https:\/\/hgpu.org\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2743"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hgpu.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2743"},{"taxonomy":"post_tag","embeddable":true,"href"
:"https:\/\/hgpu.org\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2743"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}