This paper reports preliminary investigations into the extent to which future directional intention can be reliably inferred from head pose and eye gaze during locomotion. Such findings could inform the more effective implementation of realistic, detailed animation for dynamic virtual agents in interactive first-person crowd simulations in VR, as well as the design of more efficient predictive controllers for redirected walking. In three studies, with a total of 19 participants, we placed people at the base of a T-shaped virtual hallway environment and collected head position, head orientation, and gaze direction data as they set out to perform a hidden-target search task across two rooms situated at right angles to the end of the hallway. Participants wore an nVisor ST50 HMD equipped with an Arrington Research ViewPoint eye tracker; positional data were tracked using a 12-camera Vicon MX40 motion capture system. The hidden-target search task served to blind participants to the actual focus of our study, which was to gain insight into how effectively head position, head orientation, and gaze direction data might predict people's eventual choice of which room to search first. Our results suggest that eye gaze data have the potential to provide additional predictive value over 6DOF head tracking data alone, despite the relatively limited field of view of the display we used.
| Original language | English (US) |
| Title of host publication | Proceedings of the ACM Symposium on Applied Perception, SAP 2016 |
| Publication series | Proceedings of the ACM Symposium on Applied Perception, SAP 2016 |
| Publisher | Association for Computing Machinery, Inc |
| Number of pages | 8 |
| State | Published - Jul 22 2016 |
| Event | ACM Symposium on Applied Perception, SAP 2016 - Anaheim, United States |
| Duration | Jul 22 2016 → Jul 23 2016 |
Bibliographical note (Funding Information):
This work was supported by the National Science Foundation through grant CRI-1305401 Virtual Reality Infrastructure and Technology Development to Support Architectural Education and Basic Research in Immersive Design, Embodied Interaction, Spatial Cognition. The first of the initial pilot studies was conducted in partnership with Deepa Dongapure. We are grateful to Peng Liu for his assistance with the environment modeling and body tracking, and to all of our participants for their patient and dedicated efforts.
© 2016 ACM.
Keywords:
- Gaze direction
- Virtual environments