Abstract
Understanding the extent to which, and the conditions under which, scene detail affects spatial perception accuracy can inform the responsible use of sketch-like rendering styles in applications such as immersive architectural design walkthroughs using 3D concept drawings. This paper reports the results of an experiment that provides important new insight into this question using a custom-built, portable video-see-through (VST) conversion of an optical-see-through head-mounted display (HMD). Participants made egocentric distance judgments by blind walking to the perceived location of a real physical target in a real-world outdoor environment under three conditions of HMD-mediated scene detail reduction: full detail (raw camera view), partial detail (Sobel-filtered camera view), and no detail (complete background subtraction), as well as in a control condition of unmediated real-world viewing through the same HMD. Despite significant differences in participants' ratings of visual and experiential realism across the three video-see-through rendering conditions, we found no significant difference in the distances walked between these conditions. Consistent with prior findings, participants underestimated distances to a significantly greater extent in each of the three VST conditions than in the real-world condition. The lack of any clear penalty to task performance accuracy, not only from the removal of scene detail but also from the removal of all contextual cues to the target location, suggests that participants may be relying nearly exclusively on context-independent information such as angular declination when performing the blind-walking task. This observation highlights the limitations of using blind walking to the perceived location of a target on the ground to make inferences about people's understanding of the 3D space of the virtual environment surrounding the target. For applications like immersive architectural design, where we seek to verify the equivalence of the 3D spatial understanding derived from virtual immersion and real-world experience, additional measures of spatial understanding should be considered.
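The three VST detail conditions described above amount to simple per-frame image operations. As a rough illustration only (this is not the authors' actual rendering pipeline; the function name, kernel size, and use of OpenCV are assumptions), a minimal per-frame sketch in Python might look like this:

```python
import cv2
import numpy as np

def render_condition(frame_bgr: np.ndarray, condition: str) -> np.ndarray:
    """Illustrative sketch of the three VST detail-reduction conditions.

    'full'    -> raw camera view, passed through unchanged
    'partial' -> Sobel-filtered edge image (sketch-like, reduced detail)
    'none'    -> complete background subtraction (here a blank frame;
                 the actual system kept the physical target visible)
    """
    if condition == "full":
        return frame_bgr
    if condition == "partial":
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        # Gradient magnitude from horizontal and vertical Sobel responses.
        gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
        mag = cv2.magnitude(gx, gy)
        edges = np.uint8(np.clip(mag, 0, 255))
        return cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
    # 'none': remove all contextual scene detail.
    return np.zeros_like(frame_bgr)
```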
Original language | English (US) |
---|---|
Title of host publication | Proceedings - 2021 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2021 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 336-344 |
Number of pages | 9 |
ISBN (Electronic) | 9780738125565 |
DOIs | |
State | Published - Mar 1 2021 |
Event | 28th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2021 - Virtual, Lisboa, Portugal; Duration: Mar 27 2021 → Apr 3 2021 |
Publication series
Name | 2021 IEEE Virtual Reality and 3D User Interfaces (VR) |
---|---|
Conference
Conference | 28th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2021 |
---|---|
Country/Territory | Portugal |
City | Virtual, Lisboa |
Period | 3/27/21 → 4/3/21 |
Bibliographical note
Funding Information: This research was supported by the National Science Foundation through grants II-NEW 1305401 and CHS: Small 1526693, and by the Linda and Ted Johnson Digital Design Consortium Endowment.
Publisher Copyright:
© 2021 IEEE.
Keywords
- Non-photorealistic rendering
- Spatial perception
- Virtual reality