
Chair for Computer Aided Medical Procedures & Augmented Reality


Design, Development and Evaluation of a Multimodal User Interface for Medical In-Situ Visualization

Thesis by: Samuel Kerschbaumer
Advisor: Nassir Navab
Supervised by: Christoph Bichlmeier

Abstract

In-situ visualization of medical image data using a head-mounted display allows virtual objects to be presented within the Augmented Reality (AR) scene from the viewer's natural point of view. A major problem, however, is manipulating the visualization: the user needs to adjust parameters to obtain the desired view of the region of interest. In addition, at different stages of intraoperative navigation procedures, the visualization of instruments and navigational information has to be adapted to the needs of the operating surgeon. Yet operating rooms leave almost no room for classical interfaces such as buttons, pedals, keyboards, and mice: all tools close to the operation site have to be sterile, and the space around the operating table is reserved for surgical equipment. This thesis develops new interface concepts for interacting with the AR scene and manipulating virtual objects. The optimal user interface has to exploit the advantages of AR without hindering the user with overly complex or space-consuming tools, which would drastically reduce the acceptance of AR in the operating room. As a result of this thesis, three input modalities, based on hand detection, a foot pedal, and voice recognition, are implemented, and a user study compares the three interfaces, showing their strengths and weaknesses.
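The abstract describes three interchangeable input modalities (hand detection, foot pedal, voice recognition) that all drive the same set of visualization parameters. The following minimal Python sketch shows one way such a modality-agnostic command layer could be structured; it is an illustration only, and all names in it (VisualizationState, CommandDispatcher, the example commands) are hypothetical rather than taken from the thesis's implementation.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class VisualizationState:
    """Parameters of the in-situ visualization that the user adjusts."""
    transparency: float = 0.5      # opacity of the virtual anatomy overlay
    slice_depth_mm: float = 0.0    # depth of a virtual cut plane
    show_instruments: bool = True  # whether tracked instruments are rendered


# A command is any function that mutates the visualization state.
Command = Callable[[VisualizationState], None]


class CommandDispatcher:
    """Maps abstract command names to actions, independent of the device.

    Each modality (hand detection, foot pedal, voice recognition) only
    translates its raw events into command names, so the AR scene logic
    stays identical no matter which interface the surgeon uses.
    """

    def __init__(self, state: VisualizationState) -> None:
        self.state = state
        self._commands: Dict[str, Command] = {}

    def register(self, name: str, action: Command) -> None:
        self._commands[name] = action

    def dispatch(self, name: str) -> None:
        action = self._commands.get(name)
        if action is not None:
            action(self.state)


def increase_transparency(s: VisualizationState) -> None:
    s.transparency = min(1.0, s.transparency + 0.1)


def toggle_instruments(s: VisualizationState) -> None:
    s.show_instruments = not s.show_instruments


if __name__ == "__main__":
    state = VisualizationState()
    dispatcher = CommandDispatcher(state)
    dispatcher.register("more_transparent", increase_transparency)
    dispatcher.register("toggle_instruments", toggle_instruments)

    # A voice recognizer would emit "more_transparent" for a spoken
    # phrase; a pedal press or a detected hand gesture would emit the
    # very same command name.
    dispatcher.dispatch("more_transparent")
    dispatcher.dispatch("toggle_instruments")
    print(state)

Decoupling command names from the devices that emit them is also what makes a comparative user study practical: each modality can be exchanged without touching the scene logic.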

Students.ProjectForm
Title: Design, Development and Evaluation of a Multimodal User Interface for Medical In-Situ Visualization
Abstract: see the abstract above.
Student: Samuel Kerschbaumer
Director: Nassir Navab
Supervisor: Christoph Bichlmeier, Stuart Holdstock
Type: Diploma Thesis
Area: Surgical Workflow, Computer-Aided Surgery, Medical Augmented Reality
Status: finished
Start: 2008/11/15
Finish: 2009/05/15
Thesis (optional):  
Picture:  

