
Chair for Computer Aided Medical Procedures & Augmented Reality
Lehrstuhl für Informatikanwendungen in der Medizin & Augmented Reality


3rd Joint Advanced Student School (JASS 2005)

Course 3 - Ubiquitous Tracking for Augmented Reality

Prof. Boris Kudryashov, State Univ. of Aerospace Instrumentation St. Petersburg
Prof. Ivan V. Andronov, Prof. Alexander Pastor, St. Petersburg State University
Prof. G. Klinker, Ph.D., Dipl.-Inf. Martin Bauer, Technische Universität München


This course's topic

Augmented Reality (AR) applications are highly dependent on accurate and precise tracking data. Since current tracking technologies do not always provide such information everywhere in real time, application developers must combine several trackers so that the weaknesses of one tracker are compensated by another. The result is a sensor network. Sensor networks can be used to inform applications about the current position and orientation of the objects they are concerned with. Furthermore, such pose data can be evaluated with respect to several criteria of quality.

Currently, most AR applications use their own customized solution to this problem. Typically, these solutions are hardly reusable in other systems. This inhibits the development of large-scale sensor networks because there are no standard interfaces between these technologies. In this course, we look at what is needed to form ubiquitous tracking environments consisting of several sensor networks.

Course organization

The course is held as a seminar; every participant has to give a presentation of approximately 60 minutes. Active participation in discussions related to the course's topic is the main part of the course's learning experience. Every participant also has to write a 10-page paper on his or her topic; the collected papers will be published as a technical report.

General Reading for all Participants

These papers should be read by all participants not yet familiar with the concepts presented therein, to establish a common basis of understanding to build on.

Troels Frimor: Introduction & Concepts of Ubiquitous Tracking


In providing the possibility for wide-area Augmented Reality applications, combining sensor information into complicated tracking setups becomes more and more of an issue. Ubiquitous Tracking introduces a formal framework that addresses the problems of describing these setups. The framework defines three different kinds of spatial relationship graphs to create an abstract view of the relations between objects and sensors. This gives such a general and flexible way to describe spatial relations that it can be used for all kinds of setups, including those of existing applications.
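To make the graph idea concrete, here is a minimal sketch of a spatial relationship graph in Python: nodes stand for sensors and objects, edges carry 4x4 homogeneous transforms, and indirect relations are derived by composing transforms along a path. The class and method names are illustrative, not the actual Ubitrack API.

import numpy as np

class SpatialRelationshipGraph:
    def __init__(self):
        self.edges = {}  # (source, target) -> 4x4 homogeneous transform

    def add_relation(self, source, target, transform):
        """Store a measured pose of `target` in the frame of `source`."""
        self.edges[(source, target)] = transform
        # The inverse relation is implied by the same measurement.
        self.edges[(target, source)] = np.linalg.inv(transform)

    def relative_pose(self, source, target):
        """Derive an indirect relation by composing transforms along a
        path found with breadth-first search."""
        frontier = [(source, np.eye(4))]
        visited = {source}
        while frontier:
            node, accumulated = frontier.pop(0)
            if node == target:
                return accumulated
            for (a, b), t in self.edges.items():
                if a == node and b not in visited:
                    visited.add(b)
                    frontier.append((b, accumulated @ t))
        return None  # no known spatial relation

# Example: an optical tracker sees a marker rigidly attached to a camera.
srg = SpatialRelationshipGraph()
srg.add_relation("tracker", "marker", np.eye(4))   # measured pose (placeholder)
srg.add_relation("marker", "camera", np.eye(4))    # calibrated offset (placeholder)
print(srg.relative_pose("tracker", "camera"))      # composed indirect relation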

Georgi Nachev: Existing Architectures & Systems


This work presents some existing systems for Ubiquitous Tracking: OpenTracker, which implements a static dataflow model for streams of sensor readings; DWARF, a framework for component-based peer-to-peer systems; VRPN, a static network-transparent abstraction between applications and pre-defined trackers; and Trackd, a commercial system from VRCO Inc., which abstracts the tracking from the other applications and offers a common API.

Basti Grembowietz: Tracker Alignment: Algorithms and Procedures


Given two sensors rigidly connected to each other, tracker alignment is used to determine the transformation between the two. Once correctly calibrated, the measurements made by the sensors can be related to each other. A short scenario: a camera can be tracked by attaching a fiducial to it that is tracked by another device; knowledge of the pose of the camera can then be combined with the images obtained by the camera to generate three-dimensional models of the perceived scene. This paper presents techniques for obtaining the transformation between two coordinate systems obtained from trackers, starting with the classical approach by Tsai, a nonlinear two-step algorithm, and ending with a linear one-step algorithm by Daniilidis that makes use of dual quaternions.
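The following is a compact sketch of the two-step formulation of this alignment (hand-eye) problem A_i X = X B_i, assuming paired relative motions are given as (rotation matrix, translation vector) tuples. The SVD-based rotation fit and the variable names are illustrative choices, not the exact procedures of Tsai or Daniilidis.

import numpy as np

def rotation_axis(R):
    """Rotation axis from the skew-symmetric part of R (assumes a generic angle)."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def hand_eye(motions_a, motions_b):
    """Estimate X from A_i X = X B_i given paired relative motions (R, t)."""
    axes_a = np.array([rotation_axis(Ra) for Ra, _ in motions_a])
    axes_b = np.array([rotation_axis(Rb) for Rb, _ in motions_b])
    # Step 1: rotation R_X aligning the axes of the B motions onto those of A.
    H = axes_b.T @ axes_a
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R_x = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # Step 2: translation from the stacked system (R_Ai - I) t_X = R_X t_Bi - t_Ai.
    M = np.vstack([Ra - np.eye(3) for Ra, _ in motions_a])
    rhs = np.hstack([R_x @ tb - ta for (_, ta), (_, tb) in zip(motions_a, motions_b)])
    t_x, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return R_x, t_x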

Katharina Pentenrieder: The Kalman Filter and its Extensions


Sensor fusion aims to conveniently integrate data from different sensors in order to compute a dynamic system's state. Here, problems arise out of uncertainty: in general, the knowledge about the system is incomplete and the given data is corrupted by noise. Hence the state of the system can only be estimated.

The Kalman Filter is a very robust and popular approach to stochastic estimation from noisy measurements. This paper therefore concentrates on the ideas of Kalman Filtering for linear and non-linear process models and on the application of the filter to the task of combining different data sets. Furthermore, some extensions of the Kalman Filter that are useful for sensor fusion are presented: the SCAAT method and the Federated Kalman Filter.
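As a minimal illustration of the linear case, the sketch below implements one predict/update cycle, assuming a constant state-transition model F, a measurement model H, and Gaussian noise covariances Q and R; the constant-velocity example parameters are made up for illustration.

import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle: returns the new state estimate and covariance."""
    # Predict: propagate the state and its uncertainty through the process model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: weight the noisy measurement z against the prediction.
    y = z - H @ x_pred                        # innovation
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: 1-D constant-velocity model fused with position-only measurements.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[1e-2]])
x, P = np.zeros(2), np.eye(2)
for z in [0.11, 0.22, 0.28, 0.41]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)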

Kirill Yourkov: Adaptive transform of the color space in image compression


BMP (bitmap) is one of the main formats for color image representation. The standard pixel representation in this format is RGB, which uses 24 bits per pixel: every color component (red, green, blue) is represented by one byte. Since the components in this representation are correlated, compressing them independently is redundant. A linear transform is usually applied in the first stage of image and video compression, most commonly the nonsingular RGB-to-YUV linear transform with fixed coefficients. This transform decreases the correlation between the color components and improves energy localization. In general this transform is not optimal, since its coefficients are fixed, so one can instead choose the coefficients of the linear transform for a given image on the basis of some optimality criterion. Information on the applied transform then depends on the image and must be transmitted to the decoder. The author introduces such an optimality criterion; the optimal transform coefficients are found according to this criterion and analyzed. Then an algorithm for efficient transmission of information about the transform is proposed and evaluated.
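For reference, this is a sketch of the fixed-coefficient baseline: the standard (BT.601-style) RGB-to-YUV linear transform applied per pixel. The adaptive, image-dependent choice of coefficients discussed in the paper is not reproduced here.

import numpy as np

# Rows give Y, U, V as linear combinations of R, G, B (fixed coefficients).
RGB_TO_YUV = np.array([
    [ 0.299,    0.587,    0.114  ],   # luma Y
    [-0.14713, -0.28886,  0.436  ],   # chroma U
    [ 0.615,   -0.51499, -0.10001],   # chroma V
])

def rgb_to_yuv(image):
    """image: H x W x 3 array of R, G, B values; returns the Y, U, V planes."""
    return image @ RGB_TO_YUV.T

def yuv_to_rgb(yuv):
    """Inverse of the fixed transform (the matrix is nonsingular)."""
    return yuv @ np.linalg.inv(RGB_TO_YUV).T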

Alexander Chuikov: Decimation of color-difference components by wavelet filtering.


Usually all pixels in color images are represented via a color space transform, e.g., RGB-to-YUV. After such a transform the color-difference components U and V are redundant, and they are decimated to decrease this redundancy. Standard decimation (as in the 4:2:0 format) reduces the number of U,V components by a factor of four. Such decimation may be considered as the result of low-pass filtering with the Haar wavelet filter. We generalize this approach by considering wavelet filtering with more complex filters. As an example, we consider the application of low-pass 3/5-wavelet filtering of the color-difference components to provide the wavelet decimation. Such filtering improves compression without visible degradation of the images. We present our investigation results in this area.
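The Haar-filter view of the standard decimation can be sketched as follows: averaging each 2x2 block of a chroma plane and keeping one sample per block. The 3/5-wavelet filtering proposed in the paper is not reproduced here; this only illustrates the baseline interpretation.

import numpy as np

def haar_decimate(chroma):
    """chroma: 2D array with even dimensions; returns the half-resolution plane."""
    h, w = chroma.shape
    blocks = chroma.reshape(h // 2, 2, w // 2, 2)
    # Averaging over each 2x2 block is the Haar low-pass output
    # up to a constant normalization factor.
    return blocks.mean(axis=(1, 3))

def haar_upsample(decimated):
    """Nearest-neighbour reconstruction used at the decoder side of this sketch."""
    return np.repeat(np.repeat(decimated, 2, axis=0), 2, axis=1)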

Christian Buckl: Real-Time: The Zerberus System


This talk suggests a development model for the implementation of augmented reality applications. The Zerberus System, originally designed for the development of safety-critical real-time applications, provides a tool chain and offers, among other things, automatic code generation. Through its platform independence, the use of commercial off-the-shelf hardware, and the acceleration of the development process, costs can be reduced. This paper makes suggestions on how the Zerberus System can be used in the context of augmented reality.

Alexej Minin: Holography


Holography dates from 1947, when the British/Hungarian scientist Dennis Gabor developed the theory of holography while working to improve the resolution of an electron microscope. Gabor, who characterized his work as "an experiment in serendipity" that was "begun too soon," coined the term hologram from the Greek words holos, meaning "whole," and gramma, meaning "message." Thus holograms can be produced not only with visible light but also with particles, which have wave properties. Holograms using electrons (considered in their wave manifestation, not as particles) provide sharp pictures, but because the electrons cannot penetrate far into a solid sample, the imaging process is usually restricted to surface regions. Holograms using x-rays can penetrate much farther, but their limitation is that the method improves as the square of the atomic number; therefore x-ray holography is not very good for materials made of light elements. Holograms with neutrons are different: rather than scattering from the electrons in the atoms of the sample, neutrons scatter only from the nuclei, which are 100,000 times smaller than the atoms in which they reside. In an experiment carried out with a beam of neutrons from a reactor at the Institut Laue-Langevin in Grenoble, a group of scientists has produced, for the first time, an atomic-scale map of a crystal.

Irina Bobkova: Planes and Homographies for Augmented Reality


(abstract missing)

Denis Bessonov: Context coding of overlapped DCT coefficients


The two-dimensional DCT (Discrete Cosine Transform) is the orthogonal transform most commonly applied in image compression today. The main steps of image compression are: DCT, quantization of the transform coefficients, scanning the block in zig-zag order, and encoding the zero runs and nonzero coefficients. However, to achieve high-quality compression the quantization steps have to be chosen small enough; in this case the zero runs become very short or disappear altogether, and this method of encoding them becomes inefficient. In order to get better performance at high bit rates it is reasonable to use context encoding on a coefficient-by-coefficient basis (i.e., when there are no zero runs). But the generic DCT does not give much correlation between adjacent blocks; in addition, it distorts the original distribution of the transform coefficients because of the rectangular window used in transforming the separate blocks. Therefore an overlapped DCT with a sinusoidal window covering two neighboring blocks has been proposed and investigated, together with a context model for encoding the coefficients. The author presents an analysis showing that compression is improved by up to 10% at 40-44 dB PSNR compared with the standard non-overlapped DCT.
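The non-overlapped baseline that the paper compares against can be sketched as follows: an 8x8 block DCT, uniform quantization, and a zig-zag scan. The overlapped DCT with a sinusoidal window and the context model for the coefficients are not reproduced here, and the block size and quantization step are illustrative.

import numpy as np
from scipy.fft import dctn, idctn

BLOCK = 8

def zigzag_order(n=BLOCK):
    """JPEG-style zig-zag ordering of the n x n coefficient positions."""
    return sorted(((i, j) for i in range(n) for j in range(n)),
                  key=lambda ij: (ij[0] + ij[1],
                                  ij[0] if (ij[0] + ij[1]) % 2 else ij[1]))

def encode_block(block, step):
    """2-D DCT, uniform quantization and zig-zag scan of one 8x8 block."""
    coeffs = dctn(block.astype(float), norm='ortho')
    quantized = np.round(coeffs / step).astype(int)
    return [quantized[i, j] for i, j in zigzag_order()]

def decode_block(scanned, step):
    """Inverse of encode_block: reorder, dequantize, inverse DCT."""
    quantized = np.zeros((BLOCK, BLOCK))
    for value, (i, j) in zip(scanned, zigzag_order()):
        quantized[i, j] = value
    return idctn(quantized * step, norm='ortho')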

Gordana Stojceska: Sensor Fusion: Particle Filters


In recent years, particle filters have found widespread application in domains with noisy sensors, such as computer vision and robotics, as well as in many other technological fields. Particle filters are powerful tools for Bayesian state estimation in non-linear systems. The key idea of particle filters is to approximate a posterior distribution over unknown state variables by a set of particles drawn from this distribution. This paper addresses the basic definition and methods of particle filters: particle filters are insensitive to costs that might arise from the approximate nature of the particle representation; their only criterion for generating a particle is the posterior likelihood of a state. The paper also gives a short introduction to Bayesian filters, continues with the best-known methods, and discusses the advantages and disadvantages of particle filters as one of the most widely used tracking methods. A short overview of particle filter applications in the area of tracking people is also included. A minimal sketch of the bootstrap variant follows the reading list below.

  • David A. Forsyth, Jean Ponce, Tracking with Nonlinear Models, chapter that did not make it into the book "Computer Vision: A Modern Approach"
  • M. N. Rosenbluth and A.W. Rosenbluth, Monte Carlo calculation of the average extension of molecular chains, Journal of Chemical Physics 23 (2) 1956.
  • N. J. Gordon, D. J. Salmond, and A. F. M. Smith, Novel approach to nonlinear/non-Gaussian Bayesian state estimation, IEE Proceedings on Radar and Signal Processing, 140, 1993
  • A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.
  • B. Ristic, S. Arulampalam, N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
  • M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, A tutorial on particle filters for online nonlinear/non-gaussian Bayesian tracking, IEEE Transactions on Signal Processing, 50 (2) 2002.
  • Matt Rosencrantz, Geoffrey Gordon, Sebastian Thrun, Decentralized Sensor Fusion with Distributed Particle Filters
  • Carine Hue, Jean-Pierre Le Cadre, Patrick Pérez, A Particle Filter to Track Multiple Objects
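The sketch below shows the bootstrap particle filter for a simple 1-D tracking problem, assuming a random-walk motion model and Gaussian measurement noise; the model parameters and the multinomial resampling scheme are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_noise=0.1, measurement_noise=0.5):
    """One predict/weight/resample cycle over a set of 1-D state particles."""
    # Predict: propagate each particle through the (random-walk) motion model.
    particles = particles + rng.normal(0.0, motion_noise, size=particles.shape)
    # Weight: likelihood of the noisy measurement under each particle.
    likelihood = np.exp(-0.5 * ((measurement - particles) / measurement_noise) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights (multinomial).
    indices = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[indices], np.full(len(particles), 1.0 / len(particles))

# Example: track a target drifting towards 1.0 from noisy position readings.
particles = rng.normal(0.0, 1.0, size=500)
weights = np.full(500, 1.0 / 500)
for z in [0.2, 0.4, 0.55, 0.8, 1.0]:
    particles, weights = particle_filter_step(particles, weights, z)
print(np.average(particles, weights=weights))   # posterior mean estimate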

Instructors: Gudrun Klinker, Ivan Andronov, Boris Kudryashov, Martin Bauer


Other Resources

Web Links

Individual Presentations

Technical Report

A technical report containing all the topics of this course will be published soon.



