
Chair for Computer Aided Medical Procedures & Augmented Reality
Lehrstuhl für Informatikanwendungen in der Medizin & Augmented Reality

THIS WEBPAGE IS DEPRECATED - please visit our new website

Forschungsgruppe Augmented Reality (FAR)


Prof. Gudrun Klinker, Ph.D.

Augmented Reality (AR) is a newly emerging technology by which a user's view of the real world is augmented with additional information from a computer model. With AR, users can access information without having to leave their workplace. They can manipulate and examine real objects and simultaneously receive additional information about them or the task at hand. Using AR technology, the information is presented three-dimensionally, integrated into the real world. Exploiting people's visual and spatial skills to navigate a three-dimensional world, Augmented Reality thus constitutes a particularly promising new user interface paradigm. Read more...

Check out our YouTube channel AugmentedRealityTUM.

People

Photo: AR Group, Hitzhofen, March 2019 (not all members pictured)

  • Alumni and Collaborators: list

Research Issues

Our research focuses on bringing AR technology into real applications. To this end, we look not only at "classical AR" solutions but also consider suitable variations, thereby combining AR with concepts of mobile and ubiquitous computing. This leads to the concept of Ubiquitous Augmented Reality (UAR).

We investigate (not necessarily disjoint) issues with respect to:

  • Sensing:
    To track users and mobile objects in the real world, suitable sensors need to be registered, calibrated and combined in real time (i.e., at video rate). We use, test, evaluate and enhance sensing technology, with particular emphasis on establishing sound characterizations of measurement noise.
  • Ubiquitous Tracking (Sensor Fusion):
    AR-based tracking strives for high precision and speed. Location-based tracking support focuses on generality, flexibility, wide range of use, and low cost. We work towards a framework for sensor fusion that allows flexible, dynamically configurable combinations of several trackers, using location-based approaches to generalize and back up more brittle and constrained high-quality tracking technology (see the sketch after this list).

  • 3D Information Presentation:
    AR presents computer-based information three-dimensionally embedded into the real surroundings of a user. Visualizations can be shown on head-mounted displays (HMDs), mobile monitors (tablet PCs, PDAs, mobile phones, or intelligent instruments), nearby stationary monitors (desktops, walls), or as direct projections onto real objects. We conceive, investigate and compare such presentation techniques, as well as flexible, context-dependent combinations thereof.
  • 3D Interaction:
    When computer-based information is embedded within the real environment of users, they can react to it directly within that real-world context - by manipulating real objects in addition to gesturing, speech, 3D pointing, and "standard" human-computer interaction via so-called "WIMP" (desktop-based) user interfaces. We investigate suitable combinations of such interfaces - with particular emphasis on direct 3D human-computer interaction embedded in the real world (tangible UIs).
  • HCI in Cars:
    User-centered driver assistance with minimalistic interactions for active safety. The focus lies on embedded Augmented Reality and its use for the assistance part of all in-vehicle information systems. To improve driving performance, the driver should remain in the control loop of the driving task rather than being taken out of it to gather information from secondary sources such as warning or navigation systems and displays.
  • Multitouch Displays:
    Recently, interaction devices based on natural gestures have become increasingly widespread, e.g. with Jeff Han's work (watch on YouTube), Microsoft Surface or the Apple iPhone. These devices particularly support multi-touch interaction, thereby enabling a single user to work with both hands, or even several users to work in parallel. In this project, we explore applications of multi-touch surfaces, both large and small. We are building and evaluating new hardware concepts as well as developing software to take advantage of the new interaction modalities.

  • System Architectures for Ubiquitous Augmented Reality:
    In order to provide ubiquitous tracking, information presentation and interaction facilities, the underlying system architecture needs to be very flexible, adaptive and extensible. We have built an ad-hoc, peer-to-peer framework, which we are evaluating, extending and, where possible, simplifying. We are also exploring how to base the system on concepts and tools to specify, customize and analyze desired system behavior.
  • Industrial Augmented Reality:
    We actively pursue close relationships with industry partners, who provide us with information about the real-world requirements and trade-offs of transferring AR technology into real applications.
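
A minimal, illustrative sketch of the fusion idea described in the "Ubiquitous Tracking" item above: a precise but brittle tracker (e.g. marker-based) is combined with a coarse but always-available location estimate by inverse-variance weighting, so the application keeps a pose estimate even when the precise tracker drops out. The 1-D positions, the variances and the fuse helper are assumptions for illustration only; they are not part of any framework described above.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Measurement:
        position: float   # 1-D position for simplicity (metres)
        variance: float   # measurement noise variance (metres^2)

    def fuse(precise: Optional[Measurement],
             coarse: Optional[Measurement]) -> Optional[Measurement]:
        """Inverse-variance weighting of two position estimates.

        If the precise (e.g. marker-based) tracker drops out, the coarse
        (e.g. room-level) estimate is returned unchanged, so the application
        always has an estimate, just with larger uncertainty.
        """
        if precise is None:
            return coarse
        if coarse is None:
            return precise
        w1 = 1.0 / precise.variance
        w2 = 1.0 / coarse.variance
        fused_pos = (w1 * precise.position + w2 * coarse.position) / (w1 + w2)
        fused_var = 1.0 / (w1 + w2)
        return Measurement(fused_pos, fused_var)

    # Marker tracker visible (millimetre-level noise) plus a coarse
    # room-level estimate: the fused result stays close to the precise
    # measurement, and degrades gracefully when the marker is lost.
    print(fuse(Measurement(2.013, 1e-4), Measurement(2.30, 0.25)))
    print(fuse(None, Measurement(2.30, 0.25)))  # fallback to coarse estimate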

Projects

Find all FAR Research Projects

Open Theses

Find all Open Theses

Publications

Read our full FAR Publication List.

Habilitation

Dissertations



