Transitional Augmented Reality navigation for live captured scenes

Markus Tatzgern, Raphael Grasset, Denis Kalkofen, Dieter Schmalstieg

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

23 Citations (Scopus)


Augmented Reality (AR) applications require knowledge about the real-world environment in which they are used. This knowledge is often gathered while developing the AR application and stored for future uses of the application. Consequently, changes to the real world lead to a mismatch between the previously recorded data and the real world. New capturing techniques based on dense Simultaneous Localization and Mapping (SLAM) not only allow users to capture real-world scenes at run-time but also enable them to capture changes in the world. However, instead of using previously recorded and prepared scenes, users must interact with an unprepared environment. In this paper, we present a set of new interaction techniques that support users in handling captured real-world environments. The techniques present virtual viewpoints of the scene based on a scene analysis and provide natural transitions between the AR view and the virtual viewpoints. We demonstrate our approach with a SLAM-based prototype that allows us to capture a real-world scene, and we describe example applications of our system.
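The abstract does not specify how the transitions between the live AR view and a virtual viewpoint are computed; a common way to realize such a "natural transition" is to interpolate the camera pose, blending position linearly and orientation spherically. The sketch below is a minimal illustration of that idea, assuming poses are given as a position vector plus a unit quaternion (x, y, z, w); all function and parameter names are hypothetical and not taken from the paper.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0 = np.asarray(q0, dtype=float)
    q1 = np.asarray(q1, dtype=float)
    dot = np.dot(q0, q1)
    if dot < 0.0:            # flip one quaternion to take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly parallel: linear blend, then renormalize
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def transition_pose(ar_pos, ar_quat, vp_pos, vp_quat, t):
    """Blend the live AR camera pose toward a virtual viewpoint.

    t = 0 returns the AR view, t = 1 the virtual viewpoint; intermediate
    values of t trace a smooth transition path between the two.
    """
    pos = (1.0 - t) * np.asarray(ar_pos, dtype=float) + t * np.asarray(vp_pos, dtype=float)
    quat = slerp(ar_quat, vp_quat, t)
    return pos, quat
```

Animating t from 0 to 1 over a fixed duration (for example with an ease-in/ease-out curve) would yield the kind of continuous camera motion that keeps the user oriented when leaving the egocentric AR view for an overview viewpoint.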

Original language: English
Title of host publication: 2014 IEEE Virtual Reality, VR 2014 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Print): 9781479928712
Publication status: Published - 24 Apr 2014
Externally published: Yes
Event: 21st IEEE Virtual Reality Conference, VR 2014 - Minneapolis, MN, United States
Duration: 29 Mar 2014 - 2 Apr 2014

Publication series

Name: Proceedings - IEEE Virtual Reality


Conference: 21st IEEE Virtual Reality Conference, VR 2014
Country/Territory: United States
City: Minneapolis, MN
