Depth-based hand tracking for RGBD Augmented C-arms
Thesis by:
Advisor: Nassir Navab
Supervision by: Séverine Habert
Due date:
Abstract
RGBD Augmented C-arms overlay the X-ray image onto a video image and, as additional information, provide depth at every color pixel. They are intended for use inside the operating room during surgery, showing the surgeon's actions in context with the X-ray image.
In most surgeries, the surgeon places his or her hands in the field of view of the device, since it covers the surgical field. Numerous works on depth-based hand tracking using machine learning have been published recently, ranging from random forests [1] to deep learning [2].
Tracking the joints of the surgeon's hands when they are placed in the field of view of the RGBD Augmented C-arm is of interest for several applications: gesture-controlled user interfaces (the surgeon can directly adjust the blending value of the X-ray with gestures), pose estimation, and diminished reality (since the hand can be simplified to a centerline representation).
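At its core, the overlay is a per-pixel alpha blend of the X-ray image with the color frame. A minimal sketch of this blending in Python with OpenCV (the file names are placeholders, and the registration of the two images is assumed rather than shown):

    import cv2

    def blend_overlay(color_frame, xray_image, alpha):
        # Blend the X-ray over the video image: alpha = 0.0 shows only
        # the video, alpha = 1.0 shows only the X-ray.
        # Assumes the X-ray is already registered to the camera view;
        # here it is merely resized to the camera resolution.
        h, w = color_frame.shape[:2]
        xray = cv2.resize(xray_image, (w, h))
        return cv2.addWeighted(color_frame, 1.0 - alpha, xray, alpha, 0.0)

    color = cv2.imread("color_frame.png")  # placeholder file name
    xray = cv2.imread("xray_image.png")    # placeholder file name
    overlay = blend_overlay(color, xray, alpha=0.5)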
Video from [2].
Tasks
The student will first conduct a literature search, identify existing code (several implementations are already available), and apply it to our dataset.
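As an illustration of what applying such code involves, depth-based hand pose estimators such as [2] typically expect a fixed-size, depth-normalized crop around the hand. A rough sketch of this preprocessing, assuming a depth frame in millimeters, a known hand center, and the camera focal length (the cube size and crop resolution are illustrative defaults, not those of a specific implementation):

    import numpy as np
    import cv2

    def crop_and_normalize(depth_mm, center_uv, center_z, fx,
                           cube_mm=250.0, out_size=128):
        # Crop a metric cube around the hand center and normalize depth
        # to [-1, 1], as commonly done before feeding a CNN.
        u, v = center_uv
        # Half-width of the cube projected into pixels at the hand depth.
        half_px = int(round((cube_mm / 2.0) * fx / center_z))
        x0, y0 = max(u - half_px, 0), max(v - half_px, 0)
        patch = depth_mm[y0:v + half_px, x0:u + half_px].astype(np.float32)
        # Clamp depth to the cube and map it linearly to [-1, 1];
        # border handling and missing-depth pixels are ignored for brevity.
        patch = np.clip(patch, center_z - cube_mm / 2.0,
                        center_z + cube_mm / 2.0)
        patch = (patch - center_z) / (cube_mm / 2.0)
        return cv2.resize(patch, (out_size, out_size))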
The student will then implement a simple gesture-based user interface for changing the blending value of the X-ray overlay.
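One possible realization, once the hand joints are tracked, is to map a continuous hand parameter such as the thumb-to-index fingertip distance onto the blending value. A hedged sketch (the pinch distance bounds are assumptions; the resulting alpha could be fed to a blending routine like the one sketched above):

    import numpy as np

    PINCH_MIN_MM, PINCH_MAX_MM = 20.0, 120.0  # assumed comfortable pinch range

    def alpha_from_pinch(thumb_tip_mm, index_tip_mm):
        # Map the 3D thumb-index distance linearly onto a blending value
        # in [0, 1]: a closed pinch hides the X-ray, an open hand shows
        # it fully.
        d = np.linalg.norm(np.asarray(thumb_tip_mm) - np.asarray(index_tip_mm))
        t = (d - PINCH_MIN_MM) / (PINCH_MAX_MM - PINCH_MIN_MM)
        return float(np.clip(t, 0.0, 1.0))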
Literature
[1] Sun, Xiao, et al. "Cascaded hand pose regression." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2015.
[2] Oberweger, Markus, Paul Wohlhart, and Vincent Lepetit. "Hands deep in deep learning for hand pose estimation." arXiv preprint arXiv:1502.06807 (2015).