ChristophBichlmeier

Chair for Computer Aided Medical Procedures & Augmented Reality
Lehrstuhl für Informatikanwendungen in der Medizin & Augmented Reality

THIS WEBPAGE IS DEPRECATED - please visit our new website

Christoph Bichlmeier (Dr. rer. nat., Dipl.-Inf. Univ.)

Theses and Projects under my (Co-)Supervision


Publications

2012
T. Blum, V. Kleeberger, C. Bichlmeier, N. Navab
mirracle: An Augmented Reality Magic Mirror System for Anatomy Education
IEEE Virtual Reality 2012 (VR), Orange County, USA, Mar. 4 - 8, 2012. The original publication is available online at ieee.org. (bib)
2010
C. Bichlmeier
Immersive, Interactive and Contextual In-Situ Visualization for Medical Applications
Dissertation at the Fakultät für Informatik, Technische Universität München, 2010 (online version available here) (bib)
C. Bichlmeier, E. Euler, T. Blum, N. Navab
Evaluation of the Virtual Mirror as a Navigational Aid for Augmented Reality Driven Minimally Invasive Procedures
The 9th IEEE and ACM International Symposium on Mixed and Augmented Reality, Seoul, Korea, Oct. 13 - 16, 2010. The original publication is available online at ieee.org. (bib)
P. Wucherer, C. Bichlmeier, M. Eder, L. Kovacs, N. Navab
Multimodal Medical Consultation for Improved Patient Education
Proceedings of Bildverarbeitung für die Medizin (BVM 2010), Aachen, Germany, March 2010 (bib)
M. Wieczorek, A. Aichert, O. Kutter, C. Bichlmeier, J. Landes, S.M. Heining, E. Euler, N. Navab
GPU-accelerated Rendering for Medical Augmented Reality in Minimally-Invasive Procedures
Proceedings of Bildverarbeitung für die Medizin (BVM 2010), Aachen, Germany, March 14-16, 2010 (bib)
2009
C. Bichlmeier, S.M. Heining, L. Omary, P. Stefan, B. Ockert, E. Euler, N. Navab
MeTaTop: A Multi-Sensory and Multi-User Interface for Collaborative Analysis of Medical Imaging Data
Interactive Demo (ITS 2009), Banff, Canada, November 2009 (bib)
C. Bichlmeier, S. Holdstock, S.M. Heining, S. Weidert, E. Euler, O. Kutter, N. Navab
Contextual In-Situ Visualization for Port Placement in Keyhole Surgery: Evaluation of Three Target Applications by Two Surgeons and Eighteen Medical Trainees
The 8th IEEE and ACM International Symposium on Mixed and Augmented Reality, Orlando, USA, Oct. 19 - 22, 2009. (bib)
C. Bichlmeier, M. Kipot, S. Holdstock, S.M. Heining, E. Euler, N. Navab
A Practical Approach for Intraoperative Contextual In-Situ Visualization
International Workshop on Augmented environments for Medical Imaging including Augmented Reality in Computer-aided Surgery (AMI-ARCS 2009), London, UK, September 2009 (bib)
C. Bichlmeier, S.M. Heining, M. Feuerstein, N. Navab
The Virtual Mirror: A New Interaction Paradigm for Augmented Reality Environments
IEEE Trans. Med. Imag., vol. 28, no. 9, pp. 1498-1510, September 2009 (bib)
B. Ockert, C. Bichlmeier, S.M. Heining, O. Kutter, N. Navab, E. Euler
Development of an Augmented Reality (AR) training environment for orthopedic surgery procedures
Proceedings of The 9th Computer Assisted Orthopaedic Surgery (CAOS 2009), Boston, USA, June, 2009 (bib)
2008
C. Bichlmeier, B. Ockert, S.M. Heining, A. Ahmadi, N. Navab
Stepping into the Operating Theater: ARAV - Augmented Reality Aided Vertebroplasty
The 7th IEEE and ACM International Symposium on Mixed and Augmented Reality, Cambridge, UK, Sept. 15 - 18, 2008. (bib)
O. Kutter, A. Aichert, C. Bichlmeier, J. Traub, S.M. Heining, B. Ockert, E. Euler, N. Navab
Real-time Volume Rendering for High Quality Visualization in Augmented Reality
International Workshop on Augmented environments for Medical Imaging including Augmented Reality in Computer-aided Surgery (AMI-ARCS 2008), New York, USA, September 2008 (bib)
C. Bichlmeier, B. Ockert, O. Kutter, M. Rustaee, S.M. Heining, N. Navab
The Visible Korean Human Phantom: Realistic Test & Development Environments for Medical Augmented Reality
International Workshop on Augmented environments for Medical Imaging including Augmented Reality in Computer-aided Surgery (AMI-ARCS 2008), New York, USA, September 2008 (bib)
F. Wimmer, C. Bichlmeier, S.M. Heining, N. Navab
Creating a Vision Channel for Observing Deep-Seated Anatomy in Medical Augmented Reality
Proceedings of Bildverarbeitung für die Medizin (BVM 2008), Munich, Germany, April 2008 (bib)
S.M. Heining, C. Bichlmeier, E. Euler, N. Navab
Smart Device: Virtually Extended Surgical Drill
Proceedings of The 8th Computer Assisted Orthopaedic Surgery (CAOS 2008), Hong Kong, China, June, 2008 (bib)
2007
C. Bichlmeier, S.M. Heining, M. Rustaee, N. Navab
Laparoscopic Virtual Mirror for Understanding Vessel Structure: Evaluation Study by Twelve Surgeons
The Sixth IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, Nov. 13 - 16, 2007. (bib)
C. Bichlmeier, F. Wimmer, S.M. Heining, N. Navab
Contextual Anatomic Mimesis: Hybrid In-Situ Visualization Method for Improving Multi-Sensory Depth Perception in Medical Augmented Reality
The Sixth IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, Nov. 13 - 16, 2007. (bib)
C. Bichlmeier, M. Rustaee, S.M. Heining, N. Navab
Virtually Extended Surgical Drilling Device: Virtual Mirror for Navigated Spine Surgery
Proceedings of Medical Image Computing and Computer-Assisted Intervention (MICCAI 2007), Brisbane, Australia, October/November 2007. (bib)
N. Navab, J. Traub, T. Sielhorst, M. Feuerstein, C. Bichlmeier
Action- and Workflow-Driven Augmented Reality for Computer-Aided Medical Procedures
IEEE Computer Graphics and Applications, vol. 27, no. 5, pp. 10-14, Sept/Oct, 2007 (bib)
C. Bichlmeier, T. Sielhorst, S.M. Heining, N. Navab
Improving Depth Perception in Medical AR: A Virtual Vision Panel to the Inside of the Patient
Proceedings of Bildverarbeitung für die Medizin (BVM 2007), Munich, Germany, March 2007 (bib)
N. Navab, M. Feuerstein, C. Bichlmeier
Laparoscopic Virtual Mirror - New Interaction Paradigm for Monitor Based Augmented Reality
Virtual Reality, Charlotte, North Carolina, USA, March 10-14, 2007 (bib)
2006
C. Bichlmeier, N. Navab
Virtual Window for Improved Depth Perception in Medical AR
International Workshop on Augmented Reality environments for Medical Imaging and Computer-aided Surgery (AMI-ARCS 2006), Copenhagen, Denmark, October 2006 (bib)
C. Bichlmeier, T. Sielhorst, N. Navab
The Tangible Virtual Mirror: New Visualization Paradigm for Navigated Surgery
International Workshop on Augmented Reality environments for Medical Imaging and Computer-aided Surgery (AMI-ARCS 2006), Copenhagen, Denmark, October 2006 (bib)
T. Sielhorst, C. Bichlmeier, S.M. Heining, N. Navab
Depth perception - a major issue in medical AR: Evaluation study by twenty surgeons
Proceedings of Medical Image Computing and Computer-Assisted Intervention (MICCAI 2006), Copenhagen, Denmark, October 2006, pp. 364-372. The original publication is available online at www.springerlink.com. (bib)
C. Bichlmeier
Advanced 3D Visualization for Intra Operative Augmented Reality
Diploma thesis (Diplomarbeit), Technische Universität München (bib)

Active Research Projects

NARVIS - navigated augmented reality visualization system

Advanced visualization is becoming increasingly important for the operating room of the future. The increasing number of available medical images must be presented to the surgery team in new ways in order to support them rather than overloading them with more information. In our project NARVIS we integrate an HMD-based (head mounted display) AR system into the operating room for 3D in-situ visualization of computed tomography (CT) images. The final system targets spinal surgery. The work is carried out in close collaboration with our project partners “Klinikum für Unfallchirurgie” at LMU, A.R.T. Weilheim, and Siemens Corporate Research in Princeton. The project is funded by the Bayerische Forschungsstiftung.
ARAV Augmented Reality Aided Vertebroplasty

In today’s ORs, more and more operations are performed as minimally invasive procedures. Surgical instruments are inserted through a tiny cut in the patient’s skin, the port to the inside of the patient. In some cases endoscope cameras record video images of the operation site that are presented on a monitor. As a consequence of this technique, the surgeon’s field of view is divided into several workspaces: the monitor, the patient, and medical imaging data presented on a third station. The missing direct view of the workspace complicates intuitive control of surgical tools. In contrast to open surgery, the physician has to collect information from several fields of view at the same time and fuse it mentally to create a complete model of the working space, the operation site. The minimally invasive intervention vertebroplasty was identified as a suitable medical application for bringing a head mounted display (HMD) into the OR to augment surgical instruments and medical imaging data. In-situ visualization with an HMD presents all available imaging data and navigational information in one field of view. The objective of vertebroplasty is the insertion of cement into weak and brittle vertebrae through a trocar for stabilization. In this case the view of the inside of the patient is not provided by an endoscope camera. However, since the operation is performed under a CT scanner, imaging data is continually updated to check the position of the trocar and the amount of inserted cement. The imaging data is presented on a monitor and has to be mapped mentally by the surgeon onto the real operation site.
Improving Depth Perception and Perception of Layout for In-Situ Visualization in Medical Augmented Reality

In-situ visualization in medical augmented reality (AR), using for instance a video see-through head mounted display (HMD) and an optical tracking system, enables a stereoscopic view of visualized CT data registered with the real anatomy of a patient. The data can be aligned with the required accuracy, and the surgeons do not have to analyze data on an external monitor or on images attached to a wall somewhere in the operating room. With such a medical AR system, surgeons get a direct view onto and also "into" the patient. Mental registration of medical imagery with the operation site is no longer necessary. In addition, surgical instruments can be augmented inside the human body. Bringing medical imagery and surgical instruments into the same field of action provides the most intuitive way to understand the patient's anatomy within the region of interest and allows for the development of completely new generations of surgical navigation systems.
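To make the registration idea concrete: the CT volume is brought into the HMD camera frame by chaining calibration and tracking transforms. The sketch below is only an illustration with hypothetical names (it is not taken from the actual system); it assumes 4x4 homogeneous matrices, an outside-in optical tracker reporting marker poses each frame, an offline hand-eye calibration between the HMD marker and the HMD camera, and a CT-to-patient-marker registration.

```python
import numpy as np

def invert_rigid(T):
    """Invert a 4x4 rigid transform (rotation + translation)."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def ct_to_camera(T_tracker_patient, T_tracker_hmd, T_hmd_camera, T_patient_ct):
    """
    Compose the transform mapping CT coordinates into the HMD camera frame:
        camera <- HMD marker <- tracker <- patient marker <- CT
    T_tracker_patient, T_tracker_hmd : marker poses measured each frame
    T_hmd_camera                     : hand-eye calibration (camera pose in the HMD marker frame)
    T_patient_ct                     : CT-to-patient-marker registration
    """
    T_camera_tracker = invert_rigid(T_hmd_camera) @ invert_rigid(T_tracker_hmd)
    return T_camera_tracker @ T_tracker_patient @ T_patient_ct
```

Every CT point transformed by this matrix (and then projected with the camera intrinsics) appears at the corresponding location on the real patient, which is what makes the mental registration step unnecessary.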
Unfortunately, this method of presenting medical data suffers from a serious drawback. Virtual imagery, such as a volume rendered spinal column, can only be displayed superimposed on real objects. If virtual entities of the scene are expected behind real ones, such as the virtual spinal column beneath the real skin surface, this results in incorrect perception of the viewed objects with respect to their distance to the observer. The strong visual depth cue of interposition is responsible for the misleading depth perception. This project aims at the development and evaluation of methods to improve depth perception for in-situ visualization in medical AR. Its intention is to provide an extended view onto the human body that allows an intuitive localization of visualized bones and tissue.
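One way to partially restore the interposition cue, in the spirit of the virtual window and contextual in-situ visualization approaches listed in the publications above, is to keep the video image of the skin opaque everywhere except for a smoothly fading focus region through which the virtual anatomy is seen. The following is only a minimal sketch with hypothetical names and a simple circular window, not the actual implementation:

```python
import numpy as np

def window_alpha(height, width, focus_center, focus_radius, falloff=0.25):
    """
    Per-pixel opacity for the video (skin) layer: transparent at the center of
    the focus region, opaque outside it, with a smooth transition band just
    inside the border so the virtual anatomy appears to lie behind a window
    in the skin. focus_center is the (row, col) projection of the region of interest.
    """
    rows, cols = np.mgrid[0:height, 0:width]
    dist = np.hypot(rows - focus_center[0], cols - focus_center[1])
    d = dist / focus_radius                      # 0 at center, 1 at window border
    t = np.clip((d - 1.0) / falloff + 1.0, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)               # smoothstep

def composite(video_rgb, virtual_rgb, alpha):
    """Blend the video image over the rendered virtual anatomy."""
    a = alpha[..., None]
    return a * video_rgb + (1.0 - a) * virtual_rgb
```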
Virtual Mirror: Interaction Paradigm for Augmented Reality Applications

Augmented Reality offers a higher degree of freedom to the programmer than classical visualization of volume data on a screen. The existing paradigms for interaction with 3D objects are not satisfactory for certain applications, since the majority of them rotate and move the object of interest. Such classic manipulation of virtual objects cannot be used while keeping real and virtual spaces in alignment within an AR environment. This project introduces a simple and efficient interaction paradigm that allows users to interact with 3D objects and visualize them from arbitrary viewpoints without disturbing the in-situ visualization or requiring the user to change the viewpoint. We present a virtual, tangible mirror as a new paradigm for interaction with 3D models. The concept borrows its visualization paradigm from the methodology used by dentists to examine the oral cavity without constantly changing their own viewpoint or moving the patient's head. The virtual mirror improves the understanding of complex structures, enables completely new concepts for navigational aid in different tasks, and provides the user with intuitive views of physically restricted areas.
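Geometrically, the tangible mirror can be thought of as a tracked plane about which the virtual scene is reflected before it is rendered into the mirror surface. A minimal sketch of the underlying reflection matrix (hypothetical names, assuming the tracker provides a point on the mirror plane and its normal in world coordinates; not the project's actual code):

```python
import numpy as np

def reflection_matrix(point_on_plane, normal):
    """
    4x4 reflection about the plane through `point_on_plane` with normal
    `normal` (world coordinates). Applying it to the scene's model matrix
    yields the mirrored view shown on the virtual mirror.
    """
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = -np.dot(n, point_on_plane)       # plane equation: n·x + d = 0
    R = np.eye(4)
    R[:3, :3] -= 2.0 * np.outer(n, n)    # I - 2nn^T
    R[:3, 3] = -2.0 * d * n
    return R
```

Note that a reflection flips the triangle winding order, so a renderer using back-face culling has to invert the culling direction when drawing the mirrored scene.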
Augmented Reality Supported Patient Education and Consultation

The project Augmented Reality Supported Patient Education and Consultation (Augmented Reality unterstützte Operationsaufklärung) aims at developing Augmented Reality (AR) supported communication tools for patient education. The development of the targeted systems involves disciplines ranging from image registration, human-computer interaction and in-situ visualization to instructional design and perceptual psychology. As a primary clinical application, we selected breast reconstruction in plastic surgery.
MeTaTop A Multi Sensory Table Top System for Medical Procedures

A tabletop system in medical environments can be used for interactive and collaborative analysis of patient data, but also as a multimedia user interface within the sterile space. For preoperative planning, the physicians in charge of a particular patient meet to discuss the medical case and plan further steps of therapy. With the tabletop system, they could collaboratively view and browse through all kinds of available medical imaging data. Alternatively, such a system could serve as a central interaction device for all kinds of equipment within the OR that requires user input but cannot be operated by the sterile surgeon. We believe that projecting all kinds of user and information interfaces onto a sterile glass surface would facilitate the clinical workflow.
This project is strongly related to the Tangible Interaction Surface for Collaboration between Humans project.
3D user interfaces for medical interventions

This work group aims at practical user interfaces for 3D imaging data in surgery and medical interventions. The usual monitor-based visualization and mouse-based interaction with 3D data do not offer acceptable solutions here. We therefore study the use of head mounted displays and advanced interaction techniques as alternative solutions. Issues such as depth perception in augmented reality environments and optimal data representation for a smooth and efficient integration into the surgical workflow are the focus of our research activities. Furthermore, appropriate ways of interacting within the surgical environment are investigated.
Laparoscope Augmentation for Minimally Invasive Liver Resection

In recent years, an increasing number of liver tumor indications have been treated by minimally invasive laparoscopic resection. Besides the restricted view, a major issue in laparoscopic liver resection is the precise localization of the vessels to be divided. To navigate the surgeon to these vessels, pre-operative imaging data can hardly be used due to intra-operative organ deformations caused by the application of the carbon dioxide pneumoperitoneum and by respiratory motion.

Therefore, we propose to use an optically tracked mobile C-arm providing cone-beam computed tomography imaging capability intra-operatively. After patient positioning, port placement, and carbon dioxide insufflation, the liver vessels are contrasted and a 3D volume is reconstructed during patient exhalation. Without any further need for patient registration, the volume can be directly augmented on the live laparoscope video. This augmentation provides the surgeon with essential aid in the localization of veins, arteries, and bile ducts to be divided or sealed.
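Conceptually, the augmentation amounts to projecting the reconstructed, contrasted volume into the laparoscope image using the tracked poses and the laparoscope's camera calibration. The following is only a hedged illustration with hypothetical names (pinhole model, lens distortion ignored), not the actual implementation:

```python
import numpy as np

def project_volume_point(p_volume, T_cam_volume, K):
    """
    Map a 3D point given in the cone-beam CT volume frame into laparoscope
    pixel coordinates.
      T_cam_volume : 4x4 transform, volume frame -> laparoscope camera frame,
                     obtained by chaining the optical-tracking poses of the
                     C-arm and the laparoscope with their calibrations.
      K            : 3x3 intrinsic matrix of the calibrated laparoscope.
    """
    p = T_cam_volume @ np.append(p_volume, 1.0)   # point in camera frame
    u = K @ p[:3]                                 # pinhole projection
    return u[:2] / u[2]                           # pixel coordinates
```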

Current research focuses on the intra-operative use and tracking of mobile C-arms as well as laparoscopic ultrasound, augmented visualization in the laparoscope view, and methods to synchronize the visualization with respiratory motion.

Teaching

Conference Activities

AMI-ARCS 2009: 5th Workshop on Augmented environments for Medical Imaging and Computer-aided Surgery (http://campwww.informatik.tu-muenchen.de/AMIARCS09/doku.php)

Award

Werner von Siemens Excellence Award 2006
for my diploma thesis Advanced 3D Visualization for Intra Operative Augmented Reality (.pdf)
supervised by Tobias Sielhorst & advised by Prof. Nassir Navab
Munich Business Plan Competition 2009 (Developer Stage) for the business plan on A Simulator System for Team-Oriented Surgical Education and Training of Complete Intraoperative Procedures, together with Philipp Stefan, MD Sandro Heining and Prof. Nassir Navab


Going Abroad for Research Visits

Simiosys

Beginning in October 2008, I spent three months in Orlando, Florida, at Christopher Stapleton's lab Simiosys to study the application of instructional design to Medical Augmented Reality technology.

Address in Orlando:
3280 Progress Drive
Orlando Florida 32826
Phone: 407 965 1356
Fax: 407 658 5059

Stuff




UsersForm
Title: Dr.
Firstname: Christoph
Middlename: Paul
Lastname: Bichlmeier
Picture: bichlmeierchristophicon.png
Birthday:  
Nationality: Bavaria
Languages: English, German, Spanish, Bavarian, Portuguese
Groups: Medical Imaging, Computer-Aided Surgery, Medical Augmented Reality, 3D Information Presentation, 3D Interaction
Expertise: Computer-Aided Surgery, Medical Augmented Reality, 3D Information Presentation, 3D Interaction
Position: External Collaborator
Status: Alumni
Emailbefore: christoph.bichlmeier
Emailafter: gmx.de
Room: NARVIS lab
Telephone: +49 89 5160 4368
Alumniactivity:  
Defensedate: 1 December 2010
Thesistitle: Immersive, Interactive and Contextual In-Situ Visualization for Medical Applications
Alumnihomepage:  
Personalvideo01:  
Personalvideotext01:  
Personalvideopreview01:  
Personalvideo02:  
Personalvideotext02:  
Personalvideopreview02:  

