
Chair for Computer Aided Medical Procedures & Augmented Reality


L. Schwarz, A. Mkhitaryan, D. Mateus, N. Navab
Estimating Human 3D Pose from Time-of-Flight Images Based on Geodesic Distances and Optical Flow
IEEE Conference on Automatic Face and Gesture Recognition (FG), Santa Barbara, USA, March 2011

In this paper, we present a method for human full-body pose estimation from Time-of-Flight (ToF) camera images. Our approach consists of robustly detecting anatomical landmarks in the 3D data and fitting a skeleton body model using constrained inverse kinematics. Instead of relying on appearance-based features for interest point detection that can vary strongly with illumination and pose changes, we build upon a graph-based representation of the ToF depth data that allows us to measure geodesic distances between body parts. As these distances do not change with body movement, we are able to localize anatomical landmarks independent of pose. For differentiation of body parts that occlude each other, we employ motion information, obtained from the optical flow between subsequent ToF intensity images. We provide a qualitative and quantitative evaluation of our pose tracking method on ToF sequences containing movements of varying complexity.
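
The geodesic-distance idea at the core of the landmark detection can be illustrated with a small sketch. The Python fragment below is not the authors' implementation; it is a minimal illustration under assumed inputs (a ToF depth map in metres, a binary foreground mask, and a torso seed pixel), with an illustrative 5 mm pixel pitch and 5 cm depth-jump threshold. It builds a graph over foreground pixels, cuts edges across depth discontinuities, and returns the pixels with the largest geodesic distance from the torso as pose-invariant extremity candidates.

    import numpy as np
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import dijkstra


    def geodesic_landmarks(depth, mask, seed, max_jump=0.05, n_landmarks=5):
        """depth: HxW depth map in metres; mask: boolean foreground mask;
        seed: (row, col) of a foreground torso pixel. Returns up to n_landmarks
        pixels with maximal geodesic distance from the seed."""
        h, w = depth.shape
        node_id = -np.ones((h, w), dtype=int)
        node_id[mask] = np.arange(np.count_nonzero(mask))   # one graph node per foreground pixel

        src, dst, wgt = [], [], []
        for dr, dc in ((0, 1), (1, 0), (1, 1), (1, -1)):     # half of the 8-neighbourhood
            r0, r1 = max(0, -dr), h - max(0, dr)
            c0, c1 = max(0, -dc), w - max(0, dc)
            a = mask[r0:r1, c0:c1]                           # pixel
            b = mask[r0 + dr:r1 + dr, c0 + dc:c1 + dc]       # its neighbour
            both = a & b
            d_a = depth[r0:r1, c0:c1][both]
            d_b = depth[r0 + dr:r1 + dr, c0 + dc:c1 + dc][both]
            cost = np.hypot(np.hypot(dr, dc) * 0.005, d_a - d_b)  # ~5 mm pixel pitch (illustrative)
            keep = np.abs(d_a - d_b) < max_jump              # cut edges across depth discontinuities
            src.append(node_id[r0:r1, c0:c1][both][keep])
            dst.append(node_id[r0 + dr:r1 + dr, c0 + dc:c1 + dc][both][keep])
            wgt.append(cost[keep])

        n = np.count_nonzero(mask)
        graph = coo_matrix((np.concatenate(wgt),
                            (np.concatenate(src), np.concatenate(dst))), shape=(n, n))
        dist = dijkstra(graph, directed=False, indices=node_id[seed])

        # extremities (hands, feet, head) appear as geodesic maxima relative to the
        # torso, and these distances stay stable under articulation (pose-invariant)
        ys, xs = np.nonzero(mask)
        order = np.argsort(dist)[::-1]
        return [(ys[i], xs[i], dist[i]) for i in order if np.isfinite(dist[i])][:n_landmarks]

In the same spirit, the occlusion-disambiguation step could be prototyped with dense optical flow between consecutive ToF intensity frames (for example OpenCV's calcOpticalFlowFarneback), although the exact formulation used in the paper may differ.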
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.



