Typically, the solutions of these calibration methods are first estimated with numerical solvers before a non-linear optimization refines them into an accurate result. Depending on the linear system and the applied solver (SVD, eigenvalue decomposition, etc.), these estimates are strongly influenced by the input data, which is usually gathered by the user/developer of the AR system. Inexperienced users easily collect large amounts of data that is unsuitable for the underlying calibration process and therefore obtain only disappointing calibration results, which in turn degrade the AR applications. It often takes new users/developers in the field of Augmented Reality a lot of time and many trials to become aware of the common mistakes in these AR calibration scenarios.
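As an illustration of this two-stage pattern, the sketch below estimates the hand-eye transform X in AX = XB with a closed-form linear step: the rotation via an SVD-based axis alignment (in the spirit of the classical Tsai/Lenz and Park/Martin formulations) and the translation via linear least squares. The function name, the NumPy/SciPy dependencies, and the particular linear formulation are illustrative assumptions, not necessarily the exact method treated in the thesis; the returned estimate would then serve as the starting point of the non-linear refinement mentioned above.

```python
import numpy as np
from scipy.spatial.transform import Rotation


def hand_eye_linear(A_list, B_list):
    """Closed-form estimate of X in A @ X = X @ B from relative-motion pairs.

    A_list, B_list: equally long lists of 4x4 homogeneous transforms
    (e.g. relative robot motions and the corresponding camera motions).
    """
    # Rotation part: R_A R_X = R_X R_B implies that R_X maps the rotation
    # axis of each B motion onto the axis of the corresponding A motion.
    # Aligning the two axis sets is an orthogonal Procrustes problem,
    # solved in closed form with an SVD.
    a_axes = np.array([Rotation.from_matrix(A[:3, :3]).as_rotvec() for A in A_list])
    b_axes = np.array([Rotation.from_matrix(B[:3, :3]).as_rotvec() for B in B_list])
    M = a_axes.T @ b_axes
    U, _, Vt = np.linalg.svd(M)
    R_X = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt

    # Translation part: (R_A - I) t_X = R_X t_B - t_A, stacked over all
    # motion pairs and solved by linear least squares. Note that nearly
    # parallel rotation axes make this system ill-conditioned -- exactly
    # the kind of unsuitable input data an untrained user tends to record.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(A_list, B_list)])
    t_X, *_ = np.linalg.lstsq(C, d, rcond=None)

    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X  # linear estimate, to be refined by non-linear optimization
```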
The goal of this thesis is to build a system that supports a common calibration procedure via an adequate online visualization framework. The online system should be built such that it receives sensor information via websockets and uses this information to display clear instructions that guide the user towards optimal calibration movements and results. This should improve the estimation in at least one of the above-listed common calibration methods and spare inexperienced AR users/developers the frustration of working with complex sensor setups.
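For the websocket transport, a minimal server sketch is given below. It assumes the Python `websockets` package (version 10 or newer, where the connection handler takes a single argument) and a hypothetical JSON message layout with a `rotvec` field; the coverage heuristic is a deliberately naive placeholder for the actual guidance logic, not the approach of the thesis.

```python
import asyncio
import json

import websockets  # assumed transport library; any websocket server would do


async def handle_sensor(websocket):
    """Receive streamed sensor poses and reply with a simple guidance hint."""
    rotations_seen = []
    async for message in websocket:
        pose = json.loads(message)  # assumed layout: {"rotvec": [...], "t": [...]}
        rotations_seen.append(pose["rotvec"])
        # Naive placeholder heuristic: ask for more rotational variety until
        # enough samples have been collected, then tell the user to stop.
        hint = ("rotate the sensor around a new axis"
                if len(rotations_seen) < 20
                else "coverage sufficient, hold still")
        await websocket.send(json.dumps({"instruction": hint,
                                         "samples": len(rotations_seen)}))


async def main():
    # Handler signature with a single argument requires websockets >= 10.1.
    async with websockets.serve(handle_sensor, "localhost", 8765):
        await asyncio.Future()  # run until cancelled


if __name__ == "__main__":
    asyncio.run(main())
```

A browser-based visualization could connect to this endpoint and render the returned instruction next to the live camera or tracker view.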
ProjectForm | |
---|---|
Title: | User Supporting Visualisation Methods for Hand-Eye Calibration |
Abstract: | |
Student: | Michael Boxhammer |
Director: | Gudrun Klinker |
Supervisor: | Christian Waechter |
Type: | DA/MA/BA |
Area: | Industrial Tracking, Computer Vision, Industrial Augmented Reality, Medical Augmented Reality |
Status: | finished |
Start: | |
Finish: | |
Thesis (optional): | |
Picture: | |