MaTRUSMROrganMotion

Chair for Computer Aided Medical Procedures & Augmented Reality


Real time image based correction of organ motion for TRUS-MR fusion

Advisor: Nassir Navab
Supervision by: Peter Maday

Project Description

Needle biopsy of the prostate under trans-rectal ultrasound (TRUS) guidance is an established technique for the diagnosis of prostate cancer. To improve the low specificity of random biopsy, information about the suspected location of tumor cells needs to be incorporated into the procedure. Prostate-specific PET tracers and the advent of PET-MR scanners have opened up the possibility of simultaneously imaging the anatomical structure and mapping the areas most likely affected by cancer. Using such equipment directly for biopsy guidance is, however, not practical.

In a recent work [1], a real-time guidance system was developed that delivers a fused view of the live 3D US stream and the preoperative volumes. As a first step, large-scale tissue deformations due to differences in patient positioning are corrected using deformable registration between the 3D US and the preoperative MR volume. During the tracking phase, changes in the imaging pose are monitored with a probe-mounted electromagnetic tracker to generate the live fused view. A shortcoming of this setup is that only changes in the TRUS pose are accounted for; slight deformation of the surrounding tissue is unavoidable and introduces artifacts in the overlay.
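To illustrate the overlay step only, the following is a minimal sketch, not the implementation of [1]: given a rigid pose derived from the probe-mounted EM tracker, the registered preoperative MR volume is resampled into the live US frame. The volume sizes, the pose values, and the variable names are illustrative assumptions.

    # Minimal sketch (assumed setup, not the system of [1]): resample the registered
    # preoperative MR volume into the live US frame using a tracked rigid pose.
    import numpy as np
    from scipy.ndimage import affine_transform

    # Registered preoperative MR volume (placeholder data)
    mr_volume = np.random.default_rng(0).random((128, 128, 128))

    # 4x4 rigid transform mapping live US voxel coordinates to MR voxel coordinates,
    # as would be derived from the calibrated EM-tracker pose (placeholder values)
    us_to_mr = np.eye(4)
    us_to_mr[:3, 3] = [4.0, -2.5, 1.0]

    # For each US voxel, look up the MR intensity at the mapped coordinate
    # (affine_transform maps output coordinates to input coordinates)
    mr_in_us_frame = affine_transform(
        mr_volume,
        us_to_mr[:3, :3],           # linear part of the mapping
        offset=us_to_mr[:3, 3],     # translation part of the mapping
        output_shape=(96, 96, 96),  # live US volume size (placeholder)
        order=1,                    # trilinear interpolation
    )

    # A fused view can then be generated, e.g. by alpha blending with the live US frame
    print(mr_in_us_frame.shape)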

The goal of this thesis is to develop an image-based tracking solution with real-time capabilities that accounts for the currently unmodeled organ deformations in the TRUS-MR fusion setup.

The simultaneous model-based segmentation and tracking method of [2] could serve as a starting point for the investigation. The real-time slice-to-volume registration algorithm should be extended to work with the 3D US stream by using a sparse set of cross-section slices [3]. This approach only accounts for rigid motion, so additional steps are required to enable deformation modeling. The theoretical framework is closely related to work in computer vision [4], for which deformable extensions based on shape modeling are available [5,6]. Appropriate extensions should yield a suitable solution for real-time deformable organ tracking in TRUS-MR fusion.
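As a rough illustration of the rigid slice-to-volume idea (a minimal sketch, not the algorithm of [2,3]), a sparse set of cross-section slices can be registered to a volume by optimizing a 6-DoF rigid transform that minimizes an intensity difference between the slice samples and the resampled volume. The synthetic volume, slice geometry, and the SSD metric below are illustrative assumptions.

    # Minimal sketch (assumed setup): rigid registration of a sparse set of
    # cross-section slices against a preoperative volume via intensity SSD.
    import numpy as np
    from scipy.ndimage import map_coordinates
    from scipy.optimize import minimize
    from scipy.spatial.transform import Rotation

    rng = np.random.default_rng(0)
    volume = np.zeros((64, 64, 64))
    volume[20:44, 20:44, 20:44] = 1.0              # synthetic "organ" block (placeholder for the MR)
    volume += 0.05 * rng.standard_normal(volume.shape)

    def slice_grid(origin, axis_u, axis_v, size=32):
        # Voxel coordinates of a size x size planar cross section
        u, v = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
        pts = origin + u[..., None] * axis_u + v[..., None] * axis_v
        return pts.reshape(-1, 3)

    def sample(vol, pts):
        # Trilinear sampling of the volume at floating-point voxel coordinates
        return map_coordinates(vol, pts.T, order=1, mode="nearest")

    # Sparse set of cross sections through the target (three near-orthogonal planes)
    slices = [
        slice_grid(np.array([32.0, 16.0, 16.0]), np.array([0, 1, 0.0]), np.array([0, 0, 1.0])),
        slice_grid(np.array([16.0, 32.0, 16.0]), np.array([1, 0, 0.0]), np.array([0, 0, 1.0])),
        slice_grid(np.array([16.0, 16.0, 32.0]), np.array([1, 0, 0.0]), np.array([0, 1, 0.0])),
    ]

    def transform(pts, params):
        # 6-DoF rigid transform: rotations in degrees, translations in voxels
        R = Rotation.from_euler("xyz", params[:3], degrees=True).as_matrix()
        return pts @ R.T + params[3:]

    # Simulate "live US" slice intensities displaced by a small rigid motion
    true_params = np.array([3.0, -2.0, 1.0, 1.5, -2.0, 0.5])
    observed = [sample(volume, transform(pts, true_params)) for pts in slices]

    def cost(params):
        # Sum-of-squared-differences between observed slices and the resampled volume
        return sum(np.mean((sample(volume, transform(pts, params)) - obs) ** 2)
                   for pts, obs in zip(slices, observed))

    result = minimize(cost, x0=np.zeros(6), method="Powell")
    print("recovered rigid parameters:", np.round(result.x, 2))

A deformable extension, as targeted by this thesis, would replace the 6-DoF parameterization with a shape or deformation model along the lines of [5,6].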

Tasks

  1. Real-time rigid 3D TRUS to MR registration [2,3]
  2. Real-time deformable 3D TRUS to MR registration [4,5,6]
  3. Quantitative Validation

Requirements

Contact

If you are interested in the project or have any questions, please contact Peter Maday.

References


