
Chair for Computer Aided Medical Procedures & Augmented Reality

THIS WEBPAGE IS DEPRECATED - please visit our new website

Zhongliang Jiang

Contact

Zhongliang Jiang, MSc

Email: zl.jiang@tum.de

Computer Aided Medical Procedures & Augmented Reality
Fakultät für Informatik
Technische Universität München
Boltzmannstr. 3
85748 Garching b. München
Room: MI 03.13.041

Klinikum Rechts der Isar
Interdisciplinary Research Laboratory (IFL)
Ismaninger Str. 22
81675 München
Room: Building 501 - 01.01.3a-c
Phone: +49 89 4140 6457

  • Google Scholar

News

  • 30 Jun (2021): 1 paper accepted to IEEE Robotics and Automation Letters (presented at IROS2021)
  • 23 Jun (2021): 1 paper accepted for publication in IEEE Transactions on Industrial Electronics
  • 28 Feb (2021): 1 paper accepted to ICRA2021
  • 21 Oct (2020): 1 paper accepted for publication in IEEE Transactions on Industrial Electronics
  • 30 May (2020): 1 paper accepted for publication in IEEE Transactions on Medical Robotics and Bionics
  • 07 Jan (2020): 1 paper accepted to IEEE Robotics and Automation Letters (presented at ICRA2020)
  • 03 Oct (2018): Joined CAMP


Research Interests

  • Medical Robotics: automatic robotic ultrasound acquisition and diagnosis systems
  • Robotic Learning: reinforcement learning, inverse reinforcement learning, and imitation learning to learn operation skills from doctors
  • Robotic Control: MPC, shared control, human-robot interaction, and fuzzy control to ensure safety and accomplish complex medical tasks
  • Image Processing: RGB-D camera image processing, surface registration, and ultrasound image segmentation


Bachelor and Master Theses

If you are interested in the areas of medical robotics, robot control, and robotic learning, you are always welcome to drop me an email.

Available
Master Thesis: Automatic Robotic Ultrasound Scan -- Stage II
(Zhongliang Jiang; Dr. Mingchuan Zhou, Prof. Nassir Navab)

Finished
Master Thesis: Automatic Robotic Ultrasound Scan
(Zhongliang Jiang; Dr. Mingchuan Zhou, Prof. Nassir Navab)

  • Learning-Based Method to Place the Ultrasound Probe in Normal Direction of Contact Surface
  • Reinforcement Learning for Fully Automatic Robotic Ultrasound Examination


Teaching


Publications

2021
Z. Jiang, H. Wang, Z. Li, M. Grimm, M. Zhou, U. Eck, S. V. Brecht, T. C. Lueth, T. Wendler, N. Navab
Motion-Aware Robotic 3D Ultrasound
2021 IEEE International Conference on Robotics and Automation (The video is available on YouTube)
The first two authors contributed equally to this paper.
(bib)
2020
Z. Jiang, M. Grimm, M. Zhou, Y. Hu, J. Esteban, N. Navab
Automatic Force-Based Probe Positioning for Precise Robotic Ultrasound Acquisition
IEEE Transactions on Industrial Electronics (bib)
Z. Jiang, L. Lei, Y. Sun, X. Qi, Y. Hu, B. Li, N. Navab, J. Zhang
Model-Based Compensation of Moving Tissue for State Recognition in Robotic-Assisted Pedicle Drilling
IEEE Transactions on Medical Robotics and Bionics (bib)
Z. Jiang, M. Grimm, M. Zhou, J. Esteban, W. Simson, G. Zahnd, N. Navab
Automatic Normal Positioning of Robotic Ultrasound Probe based only on Confidence Map Optimization and Force Measurement
IEEE Robotics and Automation Letters (presented at ICRA2020) (The video is available on YouTube) (bib)

Automatic Normal Positioning of Robotic Ultrasound Probe based only on Confidence Map Optimization and Force Measurement

The presented method aims at acquiring good US image quality by optimizing the orientation of the robotic ultrasound (US) probe, i.e., aligning the central axis of the US probe with the tissue's surface normal at the point of contact in order to improve sound propagation within the tissue. In this work, this is done using estimated contact forces and the real-time US image. To the best of our knowledge, this is the first paper combining force estimates and real-time US images to estimate the optimal probe orientation.
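For intuition about the image-based part of this idea, below is a minimal sketch (not the method from the paper) of how an in-plane misalignment cue could be read off a precomputed US confidence map: if the confidence mass is laterally centred, the probe axis is taken to be aligned with the surface normal within the imaging plane. The function name and the normalisation are hypothetical.

```python
import numpy as np

def inplane_misalignment(confidence_map: np.ndarray) -> float:
    """Hypothetical in-plane misalignment cue from a US confidence map.

    confidence_map: 2D array with values in [0, 1], rows = depth,
                    columns = lateral position in the image.
    Returns a signed offset in [-1, 1]; values near 0 mean the confidence
    mass is laterally centred, used here as a proxy for the probe axis
    being aligned with the surface normal in the imaging plane.
    """
    lateral_profile = confidence_map.sum(axis=0)   # confidence per column
    total = lateral_profile.sum()
    if total == 0:
        return 0.0                                 # empty map: no cue
    cols = np.arange(confidence_map.shape[1])
    barycenter = (cols * lateral_profile).sum() / total
    center = (confidence_map.shape[1] - 1) / 2.0
    return float((barycenter - center) / center)
```

A controller could then tilt the probe within the imaging plane until this offset approaches zero, while the out-of-plane correction is driven by the force measurements mentioned above.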

Automatic Force-Based Probe Positioning for Precise Robotic Ultrasound Acquisition

In this paper, we propose a method to automatically position a US probe orthogonal to the tissue surface, thereby improving sound propagation and enabling the robotic US system (RUSS) to reach predefined orientations relative to the surface normal at the contact point. The method relies on the derivation of the underlying mechanical model. Two rotations around orthogonal axes are carried out while the contact force is recorded. The force data are then fed into the model to estimate the normal direction, so the probe orientation can be computed without requiring visual features. The method is applicable to both convex and linear probes and has been evaluated on a phantom with varying tilt angles and on multiple human tissues (forearm, upper arm, lower back, and leg).
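As a minimal sketch of one such fan motion, the snippet below uses a simplifying assumption (not the mechanical model derived in the paper) that the force component along the probe axis peaks when the probe is normal to the surface, and refines the peak angle by quadratic interpolation. All names and the uniform angle spacing are illustrative.

```python
import numpy as np

def estimate_normal_angle(angles: np.ndarray, axial_forces: np.ndarray) -> float:
    """Hypothetical simplification of the force-based normal search.

    angles:       probe tilt angles (rad) sampled during one fan motion
    axial_forces: contact-force component along the probe axis (N)

    Assumes the axial force is locally maximal when the probe axis
    coincides with the surface normal, and assumes uniform angle spacing.
    """
    i = int(np.argmax(axial_forces))
    if 0 < i < len(angles) - 1:
        # quadratic interpolation around the discrete maximum
        y0, y1, y2 = axial_forces[i - 1:i + 2]
        denom = y0 - 2 * y1 + y2
        offset = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
        return float(angles[i] + offset * (angles[i + 1] - angles[i]))
    return float(angles[i])

# Running this once per fan motion (around two orthogonal axes) yields the
# two tilt corrections that together point the probe along the estimated normal.
```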

Motion-Aware Robotic 3D Ultrasound

Robotic three-dimensional (3D) ultrasound (US) imaging has been employed to overcome drawbacks of traditional US examinations, such as high inter-operator variability and lack of repeatability. However, object movement during the scan remains an issue: unexpected motion degrades the quality of the 3D compounding. Furthermore, intentionally repositioning the object, e.g., adjusting a limb to display the entire limb artery tree, is not supported by current robotic US systems. To address this challenge, we propose a vision-based robotic US system that monitors object motion and automatically updates the sweep trajectory, providing 3D compounded images of the target anatomy seamlessly.
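For intuition only, here is a minimal sketch of the trajectory-update step, assuming the detected object motion is available as a rigid transform (e.g., from registering RGB-D surface point clouds captured before and after the movement). The paper's actual pipeline, including updating the probe orientation along the sweep, is more involved; the function and example values below are hypothetical.

```python
import numpy as np

def update_sweep_trajectory(waypoints: np.ndarray,
                            R: np.ndarray,
                            t: np.ndarray) -> np.ndarray:
    """Re-map the remaining sweep waypoints after detected object motion.

    waypoints: (N, 3) planned probe positions in the robot base frame
    R, t:      rigid transform (3x3 rotation, 3-vector translation)
               describing the detected object motion
    """
    return (R @ waypoints.T).T + t

# Example: the object shifted 2 cm along x and rotated 10 degrees about z.
theta = np.deg2rad(10.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.02, 0.0, 0.0])
plan = np.array([[0.40, 0.10, 0.20], [0.40, 0.12, 0.20], [0.40, 0.14, 0.20]])
print(update_sweep_trajectory(plan, R, t))
```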
