PublicationDetail

Chair for Computer Aided Medical Procedures & Augmented Reality


M. Tönnis, A. Benzina, G. Klinker
Exploring and Controlling Multidimensional Parameter Spaces
Proceedings of the International Conference on Simulation Technology (SimTech), Stuttgart, Germany, June 14-17, 2011.

Increasing HPC capabilities and state-of-the-art simulation techniques place growing demands on the user interface. The number of factors that influence the properties of a simulation grows, and these factors need to be controlled. Besides the three spatial dimensions, often tens of parameters need to be controlled, some even requiring simultaneous adjustment. Simulation data is often visualized in immersive environments such as CAVEs. There, a gap emerges between visual spatial exploration and the control of further parameters. While handheld devices allow changing the viewpoint, the demand for functionality to control other simulation parameters has rarely been met in an integrated device.

We propose and develop an integrated approach for controlling all simulation parameters with a single handheld device. Neither a second person to control parameters nor a switch between the visualization and the management computer is necessary, reducing round-trip cycle time. Our interaction concept clearly separates motion control from other controls. Motion control is defined by the specific state in which the pose of the mobile device w.r.t. an initial position is held. While motion control is active, the spatial difference is used to compute the transformational displacement of the viewpoint. Different handheld devices can be employed, either equipped with built-in sensors or tracked by the visualization environment. Touch input capabilities of the displays can either provide additional motion control mechanisms or be used for the integrated parameter adjustments.

As easy-to-use navigation techniques are key factors for successful interaction in virtual environments, we specifically focus on the traveling task first. Several questions arise for traveling tasks, keeping in mind that the device should also serve additional tasks: Which size and form factor of the device supports handy control while providing sufficient space for further interaction elements? Which metaphors are best suited for viewpoint control? How many degrees of freedom (DOF) are necessary for viewpoint control to provide the highest flexibility of motion with good usability?

To gain insight into these questions, two motion control systems have been built and tested. The first system employs a handheld ultra-mobile PC and the second a mobile phone; both are equipped with touch displays. Two metaphors for viewpoint control have been implemented, a car-steering-like metaphor and an airplane-like metaphor. A user study showed that both metaphors are applicable and could even be combined into one system. A further study revealed the use of a third metaphor, a so-called looking-glass metaphor, which now needs to be integrated.
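As an illustration of the clutch-like motion control described above (viewpoint displacement derived from the device pose relative to the pose captured when motion control is engaged), the following Python sketch shows one way such a mechanism could be structured. The class name, the gain parameter, and the use of 4x4 pose matrices are illustrative assumptions, not the authors' implementation.

    import numpy as np

    class ClutchedMotionControl:
        """Sketch: while motion control is engaged, the device pose relative
        to the pose captured at engagement drives the viewpoint displacement."""

        def __init__(self, gain=1.0):
            self.gain = gain          # assumed scaling from device motion to viewpoint motion
            self.initial_pose = None  # 4x4 pose captured when motion control engages

        def engage(self, device_pose):
            # Capture the reference pose when the user activates motion control.
            self.initial_pose = np.asarray(device_pose, dtype=float)

        def release(self):
            # Deactivate motion control; further device motion is ignored.
            self.initial_pose = None

        def viewpoint_delta(self, device_pose):
            # Return the 4x4 transform to apply to the viewpoint
            # (identity while motion control is inactive).
            if self.initial_pose is None:
                return np.eye(4)
            # Relative transform of the device w.r.t. its pose at engagement.
            delta = np.asarray(device_pose, dtype=float) @ np.linalg.inv(self.initial_pose)
            # Scale only the translational part; rotation passes through unchanged.
            delta[:3, 3] *= self.gain
            return delta

In such a design, the same mechanism would work whether the pose comes from built-in device sensors or from external tracking by the visualization environment; only the source of device_pose changes.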
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by the authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.



