TY - GEN
T1 - Adaptive user perspective rendering for handheld augmented reality
AU - Mohr, Peter
AU - Tatzgern, Markus
AU - Grubert, Jens
AU - Schmalstieg, Dieter
AU - Kalkofen, Denis
PY - 2017/4/6
Y1 - 2017/4/6
N2 - Handheld Augmented Reality commonly implements some variant of magic lens rendering, which turns only a fraction of the user's real environment into AR, while the rest of the environment remains unaffected. Since handheld AR devices are commonly equipped with video see-through capabilities, AR magic lens applications often suffer from spatial distortions, because the AR environment is presented from the perspective of the camera of the mobile device. Recent approaches counteract this distortion by estimating the user's head position and rendering the scene from the user's perspective. To this end, approaches usually apply face-tracking algorithms to the front camera of the mobile device. However, this demands high computational resources and therefore commonly degrades the performance of the application beyond the already high computational load of AR applications. In this paper, we present a method to reduce the computational demands of user perspective rendering by applying lightweight optical flow tracking and an estimation of the user's motion before head tracking is started. We demonstrate the suitability of our approach for computationally limited mobile devices, and we compare it to device perspective rendering, to head-tracked user perspective rendering, as well as to fixed point-of-view user perspective rendering.
AB - Handheld Augmented Reality commonly implements some variant of magic lens rendering, which turns only a fraction of the user's real environment into AR, while the rest of the environment remains unaffected. Since handheld AR devices are commonly equipped with video see-through capabilities, AR magic lens applications often suffer from spatial distortions, because the AR environment is presented from the perspective of the camera of the mobile device. Recent approaches counteract this distortion by estimating the user's head position and rendering the scene from the user's perspective. To this end, approaches usually apply face-tracking algorithms to the front camera of the mobile device. However, this demands high computational resources and therefore commonly degrades the performance of the application beyond the already high computational load of AR applications. In this paper, we present a method to reduce the computational demands of user perspective rendering by applying lightweight optical flow tracking and an estimation of the user's motion before head tracking is started. We demonstrate the suitability of our approach for computationally limited mobile devices, and we compare it to device perspective rendering, to head-tracked user perspective rendering, as well as to fixed point-of-view user perspective rendering.
KW - Augmented Reality
KW - User Perspective Rendering
UR - http://www.scopus.com/inward/record.url?scp=85018996029&partnerID=8YFLogxK
U2 - 10.1109/3DUI.2017.7893336
DO - 10.1109/3DUI.2017.7893336
M3 - Conference contribution
AN - SCOPUS:85018996029
T3 - 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017 - Proceedings
SP - 176
EP - 181
BT - 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017 - Proceedings
PB - Institute of Electrical and Electronics Engineers
T2 - 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017
Y2 - 18 March 2017 through 19 March 2017
ER -