Affiliation:
1. Institute of Robotics and Mechatronics, German Aerospace Center (DLR), Wessling, Germany
2. Research Center on Software Technologies and Multimedia Systems, Universidad Politécnica de Madrid, Madrid, Spain
3. Laboratory of Image Synthesis and Analysis (LISA), Université Libre de Bruxelles (ULB), Brussels, Belgium
Abstract
Besides haptics, the visual channel provides the most essential feedback to the operator in teleoperation setups. For optimal performance, the view of the remote scene must provide 3D information, be sharp, and be of high resolution. Head‐mounted displays (HMDs) are applied to improve the operator's immersion in the remote environment. So far, however, no near‐eye display technology has been available that provides a natural view of objects within the typical manipulation distance (up to 1.2 m). The main limitation is a mismatch between the 3D distance and the focal distance of the visualized objects (the vergence‐accommodation conflict) in displays with a fixed focal distance. This conflict potentially leads to eye strain after extended use. Here, we apply a light‐field HMD that provides close‐to‐continuous depth information to the user, thus avoiding the vergence‐accommodation conflict. Furthermore, we apply a time‐of‐flight sensor to generate a 2.5D environment model. The displayed content is processed with image‐based rendering, allowing 6‐degree‐of‐freedom head motion in the visualized scene. The main objective of the presented study is to evaluate the effects of view perspective and light‐field rendering on performance and workload in a teleoperation setup. The reduction of visual effort for the user is confirmed in an abstract depth‐matching task.
Funder
Horizon 2020 Framework Programme
Subject
Electrical and Electronic Engineering; Atomic and Molecular Physics, and Optics; Electronic, Optical and Magnetic Materials