IFL

Chair for Computer Aided Medical Procedures & Augmented Reality
Lehrstuhl für Informatikanwendungen in der Medizin & Augmented Reality

THIS WEBPAGE IS DEPRECATED - please visit our new website

IFL: Interdisziplinäres Forschungslabor (Interdisciplinary Research Laboratory)

Location:
Klinikum rechts der Isar
Ismaninger Str. 22
81675 München

Room: 01.01.3a-c (basement floor)
Phone: +49 (89) 4140-6457
Fax: +49 (89) 4140-6458
Twitter: @IFL_CAMP

General Scope:
IFL is a central research laboratory open for joint projects of the medical and non-medical groups of the Technische Universität München. Its goal is to encourage interdisciplinary work and discussion between Applied Sciences, Engineering and Medicine, in close contact with physicians, in order to generate new lines of applied research in Medical Technology and Computer- and Robot-Aided Medical Procedures. The laboratory serves as a space for collaborative projects that aim at the fast transfer of technological developments into clinical routine through an iterative approach, getting feedback from the end user - the clinician - as early as possible. Besides CAMP, current projects involve the departments of neurosurgery, general surgery, vascular surgery, nuclear medicine, urology, gynecology, radiology, plastic surgery, and trauma surgery. We have equipment relevant to a multi-disciplinary research environment, including an Intuitive Surgical da Vinci system, light-weight robotic arms, ultrasound systems, tracking systems and RGB cameras. We also provide a conference room with tele-conferencing facilities and a coffee and couch corner to relax.

IFL further serves as a showroom for cutting-edge research, keeping both the medical personnel at the Klinikum rechts der Isar and the wider community informed of the latest results. We thus obtain early feedback and encourage new collaborations.

Take a look inside.


Link to internal IFL page.

The IFL Laboratory

CAMP@IFL

The following links provide information about the organisation and location of IFL.

Ongoing CAMP Research Projects @ IFL

MedInnovate: From unmet clinical needs to solution concepts

Learn how to successfully identify unmet clinical needs within the clinical routine and work towards realistic solutions that address those needs. Students get to know tools that help them become successful innovators in medical technology. This includes all steps from needs finding and selection to defining appropriate solution concepts, including the development of first prototypes. Get introduced to the necessary steps for successful idea and concept creation and realize your project in interdisciplinary teams of physicists, information scientists and business majors. During the project phase, you are supported by coaches from both industry and medicine, allowing for direct and continuous exchange.
Advanced Robotics for Multi-Modal Interventional Imaging (RoBildOR)

This project aims at developing advanced methods for robotic image acquisition, enabling more flexible, patient- and process-specific functional and anatomical imaging within the operating theater. Using robotic manipulation, co-registered and dynamic imaging can be provided to the surgeon, allowing for optimal implementation of preoperative planning. In particular, this project is the first to develop concepts for intraoperative SPECT-CT, and it introduces intraoperative robotic ultrasound imaging based on CT trajectory planning, enabling registration with angiographic data. Together with distinguished partners from Bavarian industry, this project makes a fundamental contribution to developing safe, reliable, flexible, and multi-modal imaging technologies for the operating room of the future.
Computational Sonography

3D ultrasound imaging has high potential for various clinical applications, but often suffers from high operator-dependency and the directionality of the acquired data. State-of-the-art systems mostly perform compounding of the image data prior to further processing and visualization, resulting in 3D volumes of scalar intensities. This work presents computational sonography as a novel concept to represent 3D ultrasound as tensor fields instead of scalar fields, mapping a full and arbitrary 3D acquisition to the reconstructed data. The proposed representation compactly preserves significantly more information about the anatomy-specific and direction-dependent acquisition, facilitating both targeted data processing and improved visualization. We show the potential of this paradigm on ultrasound phantom data as well as on clinically acquired data of the femoral, brachial and antebrachial bones. Further investigation will, on the one hand, consider additional compact direction-dependent representations and, on the other hand, extend computational sonography from B-mode images to RF-envelope statistics, motivated by the statistical process of image formation. We will show the advantages of the proposed improvements on simulated, phantom and clinically acquired ultrasound data.
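
To make the difference between scalar and tensor-valued compounding concrete, the following minimal sketch (our own illustration, not the published model) accumulates, for every voxel, direction-weighted outer products of the beam direction instead of a single averaged intensity. The grid size, voxel indices and the sample structure are hypothetical.

```python
import numpy as np

# Each sample: (voxel index, intensity, beam direction) from a tracked freehand sweep.
# Scalar compounding averages intensities per voxel; the tensor representation keeps
# a 3x3 matrix per voxel that also encodes from which directions it was insonified.
grid_shape = (64, 64, 64)                       # hypothetical reconstruction grid
scalar_vol = np.zeros(grid_shape)
counts = np.zeros(grid_shape)
tensor_vol = np.zeros(grid_shape + (3, 3))

def add_sample(voxel, intensity, direction):
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    scalar_vol[voxel] += intensity                   # conventional accumulator
    counts[voxel] += 1
    tensor_vol[voxel] += intensity * np.outer(d, d)  # direction-dependent accumulator

# Example: the same voxel observed from two different probe orientations.
add_sample((32, 32, 32), 0.8, (0.0, 0.0, 1.0))
add_sample((32, 32, 32), 0.3, (0.0, 1.0, 0.0))

mean_intensity = scalar_vol[32, 32, 32] / counts[32, 32, 32]  # single scalar value
voxel_tensor = tensor_vol[32, 32, 32]                         # 3x3 tensor keeps directionality
```
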
EDEN2020: Enhanced Delivery Ecosystem for Neurosurgery

EDEN2020 (Enhanced Delivery Ecosystem for Neurosurgery) aims to develop the gold standard for one-stop diagnosis and treatment of brain disease by delivering an integrated technology platform for minimally invasive neurosurgery. The consortium comprises first-class industrial partners (Renishaw plc. and XoGraph ltd.), a leading clinical oncological neurosurgery team (Università di Milano, San Raffaele and Politecnico di Milano) led by Prof. Lorenzo Bello, and leading experts in shape sensing (Universitair Medisch Centrum Groningen) under the supervision of Prof. Dr. Sarthak Misra. The project is coordinated by Dr. Rodriguez y Baena, Imperial College London, whose team provides the core technology for the envisioned system, the bendable robotic needle. During the course of EDEN2020 this interdisciplinary team will work on the integration of 5 key concepts, namely (1) pre-operative MRI and diffusion-MRI imaging, (2) intra-operative ultrasound, (3) robot-assisted catheter steering, (4) brain diffusion modelling, and (5) a robot-assisted neurosurgical product (the Neuromate), into a pre-commercial prototype which meets the pressing demand for better and less invasive neurosurgery. Our chair focuses on the imaging components (i.e. (1) and (2)), targeting the real-time compensation of tissue movement and the accurate localization of the flexible catheters. We will further extend the findings of FP7 ACTIVE, in which we successfully combined pre-operative MRI with intra-operative US through deformable 3D-2D registration, making us well qualified for this role.
Freehand SPECT for Sentinel Lymph Node Localization

Nuclear medicine imaging modalities commonly assist in surgical guidance given their functional nature. However, when used in the operating room they present limitations. Pre-operative tomographic 3D imaging can only serve as a vague guidance intra-operatively, due to movement, deformation and changes in anatomy since the time of imaging, while standard intra-operative nuclear measurements are limited to 1D or (in some cases) 2D images with no depth information. To resolve this problem we propose the synchronized acquisition of the position, orientation and readings of gamma probes intra-operatively to reconstruct a 3D activity volume. In contrast to conventional emission tomography, here, in a first proof of concept, the reconstruction succeeds without requiring symmetry in the positions and angles of acquisition, which allows greater flexibility and thus opens the door towards 3D intra-operative nuclear imaging.
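
The reconstruction can be posed as a standard emission tomography inverse problem in which each synchronized probe pose contributes one row of the system matrix, with no requirement that the poses lie on a regular orbit. The sketch below is a generic MLEM iteration, not the exact freehand SPECT implementation; it assumes a precomputed system matrix `A` modelling the probe sensitivity at every recorded pose over the voxel grid.

```python
import numpy as np

def mlem(A, counts, n_iters=50):
    """Maximum-likelihood expectation maximization for emission tomography.

    A      : (n_measurements, n_voxels) system matrix; row i models the gamma
             probe sensitivity over the voxel grid at the i-th tracked pose.
    counts : (n_measurements,) probe readings synchronized with the poses.
    """
    x = np.ones(A.shape[1])                 # non-negative initial activity estimate
    sensitivity = A.sum(axis=0) + 1e-12     # normalization term A^T * 1
    for _ in range(n_iters):
        expected = A @ x + 1e-12            # forward projection of current estimate
        x *= (A.T @ (counts / expected)) / sensitivity
    return x

# Toy example with a random (hypothetical) system matrix and simulated counts.
rng = np.random.default_rng(0)
A = rng.random((200, 125))                  # 200 probe poses, 5x5x5 voxel grid flattened
truth = np.zeros(125); truth[62] = 10.0     # a single active "lesion" voxel
counts = rng.poisson(A @ truth)
activity = mlem(A, counts).reshape(5, 5, 5)
```
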
In-PSMA Radioguided Surgery

With the advent of 68Ga-HBED-PSMA PET hybrid imaging techniques, even small and atypically localized metastatic lesions of prostate cancer can be visualized. However, these lesions might not be easy to localize intraoperatively. The aim of this project is to evaluate the intraoperative detection of metastatic lesions using a gamma probe and freehand SPECT after injection of radioactively labelled PSMA ligands, in correlation with postoperative histological findings.
Inside-Out Tracking

Current tracking solutions routinely used in a clinical, potentially surgically sterile, environment are limited to mechanical, electromagnetic or classic optical tracking. The main limitations of these technologies are, respectively, the size of the arm, the influence of ferromagnetic parts on the magnetic field, and the required line of sight between the cameras and the tracking targets. These drawbacks limit the use of tracking in a clinical environment. The aim of this project is the development of so-called inside-out tracking, where one or more small cameras are fixed on clinical tools or robotic arms to provide tracking, both relative to other tools and to static targets.
These developments are funded from the 1st of January 2016 to the 31st of December 2017 by the ZIM project Inside-Out Tracking for Medical Applications (IOTMA).
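
In an inside-out setup, the pose of the tool-mounted camera relative to a static marker of known geometry can be recovered from a single image by solving a perspective-n-point problem. The sketch below is a minimal illustration using OpenCV's solvePnP; the marker geometry, camera intrinsics and detected pixel coordinates are placeholder values, not those of the actual system.

```python
import numpy as np
import cv2

# 3D corners of a known square marker in the marker frame (metres) - placeholder geometry.
marker_3d = np.array([[-0.02, -0.02, 0.0],
                      [ 0.02, -0.02, 0.0],
                      [ 0.02,  0.02, 0.0],
                      [-0.02,  0.02, 0.0]], dtype=np.float64)

# Corresponding corner detections in the tool-mounted camera image (pixels) - placeholders.
marker_2d = np.array([[310.0, 250.0],
                      [402.0, 248.0],
                      [405.0, 341.0],
                      [308.0, 343.0]], dtype=np.float64)

# Intrinsic calibration of the miniature camera - placeholder values.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(marker_3d, marker_2d, K, dist)
R, _ = cv2.Rodrigues(rvec)      # rotation of the marker expressed in the camera frame

# Pose of the camera (and hence of the tool it is mounted on) in the marker/world frame.
R_cam = R.T
t_cam = -R.T @ tvec
```
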
Intra-operative Human Computer Interaction and Usability Evaluations

Computerized medical systems play a vital role in the operating room, yet surgeons often face challenges when interacting with these systems during surgery. In this project we aim at analyzing and understanding the operating-room-specific aspects which affect the end-user experience. Besides operating-room-specific usability evaluation approaches, we also try to improve current intra-operative user interaction methodologies.
Kooperationsprojekt SFB 824 (3. Förderperiode) & BFS

The SFB824 (Sonderforschungsbereich 824: Central project for histopathology, immunohistochemistry and analytical microscopy) represents an interdisciplinary consortium which aims at the development of novel imaging technologies for the selection and monitoring of cancer therapy as an important support for personalized medicine. Z2, the central unit for comparative morphomolecular pathology and computational validation, provides integration, registration and quantification of data obtained from both macroscopic and (sub-)cellular in-vivo as well as ex-vivo imaging modalities with tissue-based morphomolecular readouts as the basis for the development and establishment of personalized medicine. In order to develop novel imaging technologies, the co-annotation and validation of image data acquired by preclinical or diagnostic imaging platforms via tissue-based quantitative morphomolecular methods is crucial. Light sheet microscopy will continue to close the gap between 3D data acquired by in-vivo imaging and 2D histological slices, especially focusing on tumor vascularization. The Multimodal ImagiNg Data Flow StUdy Lab (MINDFUL) is a central system for data management in preclinical studies developed within SFB824. Continuing the close collaboration of pathology, computer science and basic as well as translational researchers from SFB824 will allow Z2 to develop and subsequently provide a broad variety of registration and analysis tools for joint imaging- and tissue-based image standardization and quantification.

The goal of the BFS project ImmunoProfiling using Neuronal Networks (IPN2) is to develop a method based on neural networks and recent advances in deep learning to characterize a patient's tumor as a "hot" or "cold" tumor depending on the identified immunoprofile. Recent research has shown that many tumors are infiltrated by immuno-competent cells, and that the amount, type and location of the infiltrating lymphocytes in primary tumors provide valuable prognostic information. In contrast to a "cold" tumor, a "hot" tumor is characterized by an active immune system that has identified the tumor as a threat. This characterization provides the basis for selecting the therapy best suited for the individual patient.
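
As a rough illustration of what such a neural-network-based characterization could look like, the following sketch defines a small binary patch classifier in PyTorch. The architecture, input size and labels are purely hypothetical and do not describe the actual IPN2 method.

```python
import torch
import torch.nn as nn

class ImmunoProfileNet(nn.Module):
    """Toy CNN that maps an RGB tissue patch to 'hot' vs 'cold' logits."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)   # two classes: "hot", "cold"

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

model = ImmunoProfileNet()
patch = torch.randn(1, 3, 224, 224)          # dummy RGB tissue patch
logits = model(patch)                         # raw scores for the two classes
```
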
Prostate Fusion Biopsy

Transrectal ultrasound (TRUS) guided biopsy remains the gold standard for the diagnosis of prostate cancer. However, it suffers from low sensitivity, leading to an elevated rate of false negative results. On the other hand, the recent advent of PET imaging using a novel dedicated radiotracer, 68Ga-labelled PSMA (Prostate Specific Membrane Antigen), combined with MR provides improved pre-interventional identification of suspicious areas. Thus, MRI/TRUS fusion image-guided biopsy has evolved to be the method of choice to circumvent the limitations of TRUS-only biopsy. We propose a multimodal fusion image-guided biopsy framework that combines PET-MRI images with TRUS. Based on open-source software libraries, it is low-cost, simple to use and has minimal overhead in the clinical workflow. It is ideal as a research platform for the implementation and rapid bench-to-bedside translation of new image registration and visualization approaches.
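
As a rough sketch of how such a fusion could be bootstrapped with open-source libraries, the snippet below runs a rigid mutual-information registration between a pre-interventional MR volume and a reconstructed 3D TRUS sweep using SimpleITK. File names and parameters are placeholders; the actual framework relies on automatic deformable registration on top of such an initialization (see the IPCAI 2015 publication listed below).

```python
import SimpleITK as sitk

# Placeholder file names for the reconstructed 3D TRUS sweep and the MR volume.
fixed = sitk.ReadImage("trus_3d.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("mr_t2.nii.gz", sitk.sitkFloat32)

# Rough rigid alignment of the volume centers as an initialization.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=32)  # multimodal metric
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.2)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(initial, inPlace=False)

transform = reg.Execute(fixed, moving)

# Resample the MR (and, analogously, PET-derived target maps) into the TRUS frame.
mr_in_trus = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0,
                           moving.GetPixelID())
```
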
Stain Separation and Structure-Preserving Color Normalization for Histological Images

Staining and scanning of tissue samples for microscopic examination is fraught with unwanted variations that affect their color appearance. Sources of these variations include differences in the raw materials and manufacturing techniques of stain vendors, the staining protocols of labs, and the color responses of digital scanners. Color normalization of stained biopsies and tissue microarrays helps pathologists and computational pathology software when comparing different tissue samples. However, techniques used for natural images, such as histogram matching, fail to utilize the unique properties of stained tissue samples and produce undesirable artifacts. Tissue samples are stained with only a few reagents (frequently only two -- hematoxylin and eosin, or H&E) and most tissue regions bind to only one stain or the other, thus producing sparse density maps composed of only a few components. This underlying structure of sparse stain density is biomedically important. We use these properties of stained tissue to propose a technique for stain separation and color normalization. Based on sparse non-negative matrix factorization (sparseNMF), we estimate the prototype color and density map of each stain in an unsupervised manner to perform stain separation. To color-normalize a given source image, we combine its stain density maps with the stain color prototypes of a target image whose appearance was preferred by pathologists. In this way, the normalized image preserves the biological structure encoded in the stain density of the source image. Both the proposed sparseNMF stain separation and the color normalization technique yield higher correlation with ground truth than the state of the art. They are also rated qualitatively higher than other techniques by a group of pathologists. We further propose a computationally faster extension of this technique for large whole-slide images that selects an appropriately small sample of patches to compute the color prototypes of each stain instead of using the entire image. The fast scheme achieves a 20-fold acceleration, which not only greatly enhances analysis efficiency, but also makes clinical application practically feasible.
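
The core of the approach -- factorizing the optical density of an RGB image into non-negative stain color prototypes and per-pixel density maps, then recombining the source densities with the target's color prototypes -- can be sketched as follows. This is a simplified stand-in using scikit-learn's plain NMF rather than the sparsity-regularized formulation of the paper; image inputs are placeholders.

```python
import numpy as np
from sklearn.decomposition import NMF

def optical_density(rgb):
    """Convert an RGB image (uint8, H x W x 3) to optical density (pixels x 3)."""
    flat = rgb.reshape(-1, 3).astype(np.float64)
    return -np.log10((flat + 1.0) / 256.0)

def stain_decompose(rgb, n_stains=2):
    """Factorize OD ~ density @ colors with non-negativity (H&E: n_stains = 2)."""
    od = optical_density(rgb)
    # The paper uses a sparsity-regularized NMF; plain NMF is used here for brevity.
    model = NMF(n_components=n_stains, init='nndsvda', max_iter=500)
    density = model.fit_transform(od)   # (pixels x n_stains) stain density maps
    colors = model.components_          # (n_stains x 3) stain color prototypes
    return density, colors

def normalize(source_rgb, target_rgb):
    """Recombine source densities with target color prototypes (structure-preserving)."""
    dens_src, col_src = stain_decompose(source_rgb)
    dens_tgt, col_tgt = stain_decompose(target_rgb)
    # Note: in practice the components must first be matched across the two
    # decompositions (e.g. by their color) so that hematoxylin maps to hematoxylin.
    scale = (np.percentile(dens_tgt, 99, axis=0) /
             (np.percentile(dens_src, 99, axis=0) + 1e-8))
    od_norm = (dens_src * scale) @ col_tgt
    rgb_norm = 256.0 * np.power(10.0, -od_norm) - 1.0
    return np.clip(rgb_norm, 0, 255).astype(np.uint8).reshape(source_rgb.shape)
```
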
SUPRA: Software Defined Ultrasound Processing for Real-Time Applications

SUPRA is an open-source pipeline for fully software-defined ultrasound processing for real-time applications. Covering everything from beamforming to the output of B-mode images, SUPRA can help to improve the reproducibility of results and allows for full customization of the image acquisition workflow. Including all processing stages of a common ultrasound pipeline, it can be executed in 2D and 3D on consumer GPUs in real-time. Even on hardware as small as the CUDA-enabled Jetson TX2, SUPRA allows for 2D imaging in real-time.
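
To give a flavor of the earliest stage of such a software-defined pipeline, the following NumPy sketch performs naive delay-and-sum receive beamforming for a single scanline of a linear array. It is a didactic illustration with made-up parameters, not SUPRA's GPU implementation.

```python
import numpy as np

def delay_and_sum(channel_data, element_x, fs, c, scanline_x, depths):
    """Naive delay-and-sum receive beamforming for one scanline.

    channel_data : (n_elements, n_samples) raw RF data of one transmit event
    element_x    : (n_elements,) lateral element positions [m]
    fs           : sampling frequency [Hz], c : speed of sound [m/s]
    scanline_x   : lateral position of the scanline [m], depths : (n_depths,) [m]
    """
    n_elements, n_samples = channel_data.shape
    out = np.zeros(len(depths))
    for i, z in enumerate(depths):
        # Two-way path: straight down to depth z, then back to each receiving element.
        rx_dist = np.sqrt((element_x - scanline_x) ** 2 + z ** 2)
        delays = (z + rx_dist) / c                       # seconds
        idx = np.clip((delays * fs).astype(int), 0, n_samples - 1)
        out[i] = channel_data[np.arange(n_elements), idx].sum()
    return out

# Made-up acquisition parameters for illustration only.
fs, c = 40e6, 1540.0
element_x = np.linspace(-0.019, 0.019, 128)              # 128-element linear array
rf = np.random.randn(128, 2048)                          # dummy channel data
depths = np.linspace(0.005, 0.04, 512)
scanline = delay_and_sum(rf, element_x, fs, c, scanline_x=0.0, depths=depths)
envelope = np.abs(scanline)                              # log-compress for B-mode display
```
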

You can access the code on our github page: https://github.com/IFL-CAMP/supra
Additional information can be found in our work on SUPRA:
R. Göbl, N. Navab, C. Hennersperger, SUPRA: Open Source Software Defined Ultrasound Processing for Real-Time Applications, eprint arXiv:1711.06127, Nov 2017, under review for IPCAI 2018.
The development of SUPRA was partly funded by the European Horizon 2020 project EDEN2020.

Past CAMP Research Projects @ IFL

Please click here to find out about past CAMP projects at IFL.

Recent Publications

2017
B. Busam, T. Birdal, N. Navab
Camera Pose Filtering with Local Regression Geodesics on the Riemannian Manifold of Dual Quaternions
International Conference on Computer Vision Workshop (ICCVW) on Multiview Relationships in 3D Data, Venice, Italy, October 2017 [oral]. (bib)
J. Wuestemann, F. Pinto, M. Mesri, P. Matthies, J. Neba, M. Pech, M. J. Tapner, M. C. Kreissl, M. Lassmann, O. S. Grosser
Interventional Real-Time Quantification of 90Y-Microspheres Distribution in Selective Internal Radiotherapy
European Association of Nuclear Medicine (EANM) Congress, Vienna, Austria, October 2017 (bib)
P. Matthies, M. Mesri, F. Pinto, J. Wuestemann, O. S. Grosser
Patient specific scatter reduction in SIRT gamma camera images
International Conference on Monte Carlo Techniques for Medical Applications, Naples, Italy, October 2017 (bib)
R. Kojcev, A. Khakzar, B. Fuerst, O. Zettinig, C. Fakhry, R. DeJong, J. Richmon, R. Taylor, E. Sinibaldi, N. Navab
On the Reproducibility of Expert-Operated and Robotic Ultrasound Acquisitions
International Journal of Computer Assisted Radiology and Surgery / International Conference on Information Processing in Computer-Assisted Interventions (IPCAI), Barcelona, June 2017.
O. Zettinig, J. Rackerseder, B. Lentes, T. Maurer, K. Westenfelder, M. Eiber, B. Frisch, N. Navab
Preconditioned Intensity-Based Prostate Registration using Statistical Deformation Models
IEEE International Symposium on Biomedical Imaging (ISBI), Melbourne, April 2017. (bib)
M. Riva, C. Hennersperger, F. Milletari, A. Katouzian, F. Pessina, B. Gutierrez-Becker, A. Castellano, N. Navab, L. Bello
3D intra-operative ultrasound and MR image-guidance: pursuing an ultrasound-based management of brainshift to enhance neuronavigation
International Journal of Computer Assisted Radiology and Surgery, in press. (bib)
R. Göbl, S. Virga, J. Rackerseder, B. Frisch, N. Navab, C. Hennersperger
Acoustic window planning for ultrasound acquisition
International Journal of Computer Assisted Radiology and Surgery / 8th International Conference on Information Processing in Computer-Assisted Interventions (IPCAI), Barcelona, Spain, June 2017.
The original publication is available online at link.springer.com
(bib)
O. Zettinig, B. Frisch, S. Virga, M. Esposito, A. Rienmüller, B. Meyer, C. Hennersperger, Y.-M. Ryang , N. Navab
3D Ultrasound Registration-based Visual Servoing for Neurosurgical Navigation
International Journal of Computer Assisted Radiology and Surgery, Volume 12, Issue 9, pp 1607–1619, 2017.
The original publication is available online at link.springer.com
(bib)
2016
B. Frisch, O. Zettinig, B. Fuerst, S. Virga, C. Hennersperger, N. Navab
Collaborative Robotic Ultrasound: Towards Clinical Application
Radiological Society of North America Annual Meeting, Chicago, USA, 2016 (bib)
B. Busam, M. Esposito, B. Frisch, N. Navab
Quaternionic Upsampling: Hyperspherical Techniques for 6 DoF Pose Tracking
International Conference on 3D Vision (3DV), Stanford University, California, USA, October 2016 (bib)
C. Hennersperger, B. Fuerst, S. Virga, O. Zettinig, B. Frisch, T. Neff, N. Navab
Towards MRI-Based Autonomous Robotic US Acquisitions: A First Feasibility Study
IEEE Transactions on Medical Imaging, vol. 36, iss. 2, 2017
The original publication (open access) is available online at ieeexplore.ieee.org
(bib)
J. Gardiazabal, P. Matthies, J. Vogel, B. Frisch, N. Navab, S. I. Ziegler, T. Lasser
Flexible Mini Gamma Camera Reconstructions of Extended Sources using Step and Shoot and List Mode
Medical Physics 43(12):6418-6428, 2016. (bib)
M. Esposito, B. Busam, C. Hennersperger, J. Rackerseder, N. Navab, B. Frisch
Multimodal US-Gamma Imaging using Collaborative Robotics for Cancer Staging Biopsies
International Journal of Computer Assisted Radiology and Surgery (bib)
R. Kojcev, B. Fuerst, O. Zettinig, J. Fotouhi, S.C. Lee, B. Frisch, R. Taylor, E. Sinibaldi, N. Navab
Dual-Robot Ultrasound-Guided Needle Placement: Closing the Planning-Imaging-Action Loop
International Journal of Computer Assisted Radiology and Surgery / International Conference on Information Processing in Computer-Assisted Interventions (IPCAI), Heidelberg, June 2016.
The original publication is available online at link.springer.com
(bib)
O. Zettinig, B. Fuerst, R. Kojcev, M. Esposito, M. Salehi, W. Wein, J. Rackerseder, B. Frisch, N. Navab
Toward Real-time 3D Ultrasound Registration-based Visual Servoing for Interventional Navigation
IEEE International Conference on Robotics and Automation (ICRA), Stockholm, May 2016.
The original publication is available online at ieeexplore.ieee.org
(bib)
C. Bluemel, P. Matthies, K. Herrmann, S. P. Povoski
3D Scintigraphic Imaging and Navigation in Radioguided Surgery: Freehand SPECT Technology and its Clinical Applications
Expert Review of Medical Devices, 2016 (bib)
2015
B. Busam, M. Esposito, S. Che'Rose, N. Navab, B. Frisch
A Stereo Vision Approach for Cooperative Robotic Movement Therapy
International Conference on Computer Vision Workshop (ICCVW), Santiago, Chile, December 2015 [oral]. (bib)
P. Matthies, B. Frisch, J. Vogel, T. Lasser, M. Friebe, N. Navab
Inside-Out Tracking for Flexible Hand-held Nuclear Tomographic Imaging
IEEE Nuclear Science Symposium and Medical Imaging Conference, San Diego, USA, November 2015 (bib)
J. Gardiazabal, B. Frisch, P. Matthies, J. Vogel, S. I. Ziegler, N. Navab, T. Lasser
List-Mode Reconstruction for Continuous Freehand SPECT Acquisitions
IEEE Nuclear Science Symposium and Medical Imaging Conference, San Diego, USA, November 2015 (bib)
B. Frisch, et al.
First Results with an Interventional Handheld PET
IEEE Nuclear Science Symposium and Medical Imaging Conference, San Diego, USA, November 2015 (bib)
M. Esposito, B. Busam, C. Hennersperger, J. Rackerseder, A. Lu, N. Navab, B. Frisch
Cooperative Robotic Gamma Imaging: Enhancing US-guided Needle Biopsy
Proceedings of the 18th International Conference on Medical Image Computing and Computer Assisted Interventions (MICCAI), Munich, Germany, October 2015 [oral]. (bib)
O. Zettinig, A. Shah, C. Hennersperger, M. Eiber, C. Kroll, H. Kübler, T. Maurer, F. Milletari, J. Rackerseder, C. Schulte-zu-Berge, E. Storz, B. Frisch, N. Navab
Multimodal Image-Guided Prostate Fusion Biopsy based on Automatic Deformable Registration
International Journal of Computer Assisted Radiology and Surgery / 6th International Conference on Information Processing in Computer-Assisted Interventions (IPCAI), Barcelona, June 2015.
The original publication is available online at link.springer.com
(bib)
A. Shah, O. Zettinig, E. Storz, T. Maurer, M. Eiber, N. Navab, B. Frisch
Challenges in Multimodal Image-guided Targeted Prostate Biopsy
Hamlyn Symposium on Medical Robotics, London, UK, June 2015 (bib)
B. Frisch, T. Maurer, A. Okur, T. Weineisen, H. Kübler, N. Navab, HP Wester, M. Schwaiger, M. Eiber
Freehand SPECT for 111In-PSMA-I&T radioguided lymphadenectomy in prostate cancer patients
Society of Nuclear Medicine and Molecular Imaging Annual Meeting, Baltimore, USA, 2015 (bib)
B. Frisch, E. Storz, O. Zettinig, A. Shah, H. Kübler, N. Navab, HP Wester, M. Schwaiger, M. Eiber, T. Maurer
PET/MRI/TRUS image fusion guided prostate biopsy: development of a research platform and initial clinical results
Society of Nuclear Medicine and Molecular Imaging Annual Meeting, Baltimore, USA, 2015 (bib)
J. Gardiazabal, J. Vogel, P. Matthies, M. Wieczorek, B. Frisch, N. Navab, S. I. Ziegler, T. Lasser
Fully 3D thyroid imaging with mini gamma cameras
Proceedings of Fully3D, Newport, USA, June 2015 (bib)
T. Maurer, T. Weineisen, HP Wester, A. Okur, G. Weirich, H. Kübler, M. Schwaiger, J. Gschwend, B. Frisch, M. Eiber
PSMA-radioguided surgery: Introducing molecular surgery in patients with recurrent prostate cancer
2015 Annual Meeting of the American Urological Association, New Orleans, USA, 2015 (bib)
E. Storz, A. Shah, O. Zettinig, M. Eiber, H.-J. Wester, H. Kübler, J. Gschwend, M. Schwaiger, B. Frisch, T. Maurer
PSMA-PET/MRI-guided fusion biopsy for the detection of prostate cancer
Annual Congress of European Association of Urology (EAU), Madrid, March 2015 (bib)
T. Maurer, T. Weineisen, HP Wester, A. Okur, G. Weirich, H. Kübler, M. Schwaiger, J. Gschwend, B. Frisch, M. Eiber
Introduction of PSMA-radioguided surgery in patients with recurrent prostate cancer: taking salvage lymphadenectomy to the next level?
Annual Congress of European Association of Urology (EAU), Madrid, March 2015 (bib)
T. Lasser, J. Gardiazabal, M. Wieczorek, P. Matthies, J. Vogel, B. Frisch, N. Navab
Towards 3D thyroid imaging using robotic mini gamma cameras
Bildverarbeitung für die Medizin, Lübeck, Germany, March 2015 (bib)
A. Hartl, D. I. Shakir, T. Lasser, S. I. Ziegler, N. Navab
Detection models for freehand SPECT reconstruction
Physics in Medicine and Biology 60(3):1031-1046, 2015 (bib)


