Thesis by: Latifa Omary
Advisor: Nassir Navab
Supervision by: Christoph Bichlmeier
Medical Partner: Ben Ockert (Klinikum Innenstadt)
For preoperative planning, the physicians in charge of a particular patient meet to discuss the medical case and plan further steps of therapy. Some of the topics discussed in this meeting are based on the patient's medical imaging data, such as previously acquired CT or MRI scans displayed on monitors. However, navigation through the slice stacks of such volumetric data sets is performed by a single physician using standard input devices such as mouse and keyboard. The other physicians typically sit or stand behind the computer to examine the patient's anatomy within the region of interest and instruct the physician at the controls to manipulate the data, for example to browse forward and backward through the stack or to adjust parameters such as the contrast or brightness of the grey-scale images. Current user interfaces have several limitations, among them visualization restricted to 2D images and interaction restricted to a single user. The long communication chain between the various physicians and the one controlling the slice viewer makes the analysis of imaging data difficult and time-consuming.

This thesis aims at the development of a user interface for the collaborative visualization and analysis of medical imaging data, based on a tabletop system using FTIR (frustrated total internal reflection) technology. Its goal is to improve the collaborative work of a medical team of up to 6-8 persons. A large, horizontal, interactive, multi-touch capable display provides enough room for all participants of the meeting to gather around the table. During the collaborative discussion of the patient's imaging data, each of them can simultaneously interact with the data projected onto the table surface simply by touching it with their fingers. Moreover, the interactive and intuitive user interface is based on two sensor systems: a multi-touch sensitive table surface allows interaction in 2D space through finger gestures, while an optical tracking system mounted above the tabletop allows interaction in 3D space.
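The interaction described above (users browsing a slice stack and adjusting contrast/brightness by dragging their fingers on the table) can be illustrated with a minimal sketch. This is not the thesis implementation; the class and function names are hypothetical, and it only assumes that the FTIR surface reports finger positions in normalized table coordinates, as FTIR touch trackers commonly do (e.g. via the TUIO protocol).

```python
"""Illustrative sketch only: a minimal model of how multi-touch drag gestures
could drive slice browsing and window/level (contrast/brightness) adjustment
of a CT/MRI stack. All names here are hypothetical, not from the thesis."""

from dataclasses import dataclass


@dataclass
class TouchEvent:
    """One finger contact reported by the touch surface (normalized 0..1 coords)."""
    finger_id: int
    x: float
    y: float


class SliceViewer:
    """Per-user view onto a shared volume: slice index plus window/level."""

    def __init__(self, num_slices: int):
        self.num_slices = num_slices
        self.slice_index = num_slices // 2
        self.window = 400.0   # contrast: width of the displayed intensity range
        self.level = 40.0     # brightness: center of that range

    def browse(self, delta_slices: int) -> None:
        """Step forward/backward through the stack, clamped to valid indices."""
        self.slice_index = max(0, min(self.num_slices - 1,
                                      self.slice_index + delta_slices))

    def adjust_window_level(self, dx: float, dy: float) -> None:
        """Map horizontal drag to window (contrast), vertical drag to level."""
        self.window = max(1.0, self.window + dx * 1000.0)
        self.level += dy * 1000.0


def handle_drag(viewer: SliceViewer, fingers: list[TouchEvent],
                prev: dict[int, tuple[float, float]]) -> None:
    """Dispatch a drag gesture: one finger browses slices, two or more fingers
    change window/level. `prev` holds each finger's last known position."""
    if not fingers:
        return
    # Average motion of the fingers since their previously seen positions.
    dx = sum(f.x - prev.get(f.finger_id, (f.x, f.y))[0] for f in fingers) / len(fingers)
    dy = sum(f.y - prev.get(f.finger_id, (f.x, f.y))[1] for f in fingers) / len(fingers)

    if len(fingers) == 1:
        viewer.browse(int(round(dy * viewer.num_slices)))
    else:
        viewer.adjust_window_level(dx, dy)

    for f in fingers:
        prev[f.finger_id] = (f.x, f.y)


if __name__ == "__main__":
    viewer = SliceViewer(num_slices=200)
    prev: dict[int, tuple[float, float]] = {}
    # One-finger drag downward by 5% of the table height -> ~10 slices forward.
    handle_drag(viewer, [TouchEvent(1, 0.5, 0.50)], prev)
    handle_drag(viewer, [TouchEvent(1, 0.5, 0.55)], prev)
    print(viewer.slice_index, viewer.window, viewer.level)
```

In a multi-user setting, each participant around the table would own one such viewer, with touches dispatched to viewers by table region or by the overhead tracking system; that routing is omitted here for brevity.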
Students.ProjectForm | |
---|---|
Title: | A Tabletop Display as a Multi-Modal and Multi-User Interface for Collaborative Patient Data Analysis |
Student: | Latifa Omary |
Director: | Nassir Navab |
Supervisor: | Christoph Bichlmeier |
Type: | Diploma Thesis |
Area: | |
Status: | finished |
Start: | |
Finish: | 2008/11/15 |
Thesis (optional): | |
Picture: | |