TY - JOUR
T1 - Drone-Augmented Human Vision
T2 - Exocentric Control for Drones Exploring Hidden Areas
AU - Erat, Okan
AU - Isop, Werner Alexander
AU - Kalkofen, Denis
AU - Schmalstieg, Dieter
PY - 2018/4
Y1 - 2018/4
N2 - Drones allow exploring dangerous or impassable areas safely from a distant point of view. However, flight control from an egocentric view in narrow or constrained environments can be challenging. Arguably, an exocentric view would afford a better overview and, thus, more intuitive flight control of the drone. Unfortunately, such an exocentric view is unavailable when exploring indoor environments. This paper investigates the potential of drone-augmented human vision, i.e., of exploring the environment and controlling the drone indirectly from an exocentric viewpoint. If used with a see-through display, this approach can simulate X-ray vision to provide a natural view into an otherwise occluded environment. The user's view is synthesized from a three-dimensional reconstruction of the indoor environment using image-based rendering. This user interface is designed to reduce the cognitive load of the drone's flight control. The user can concentrate on the exploration of the inaccessible space, while flight control is largely delegated to the drone's autopilot system. We assess our system with a first experiment showing how drone-augmented human vision supports spatial understanding and improves natural interaction with the drone.
AB - Drones allow exploring dangerous or impassable areas safely from a distant point of view. However, flight control from an egocentric view in narrow or constrained environments can be challenging. Arguably, an exocentric view would afford a better overview and, thus, more intuitive flight control of the drone. Unfortunately, such an exocentric view is unavailable when exploring indoor environments. This paper investigates the potential of drone-augmented human vision, i.e., of exploring the environment and controlling the drone indirectly from an exocentric viewpoint. If used with a see-through display, this approach can simulate X-ray vision to provide a natural view into an otherwise occluded environment. The user's view is synthesized from a three-dimensional reconstruction of the indoor environment using image-based rendering. This user interface is designed to reduce the cognitive load of the drone's flight control. The user can concentrate on the exploration of the inaccessible space, while flight control is largely delegated to the drone's autopilot system. We assess our system with a first experiment showing how drone-augmented human vision supports spatial understanding and improves natural interaction with the drone.
KW - Drone
KW - HoloLens
KW - Mixed reality
KW - Pick-and-place
KW - X-ray
UR - http://www.scopus.com/inward/record.url?scp=85041687703&partnerID=8YFLogxK
U2 - 10.1109/TVCG.2018.2794058
DO - 10.1109/TVCG.2018.2794058
M3 - Article
C2 - 29543162
AN - SCOPUS:85041687703
SN - 1077-2626
VL - 24
SP - 1437
EP - 1446
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
IS - 4
ER -