Predicting destination using head orientation and gaze direction during locomotion in VR

Jonathan Gandrud, Victoria Interrante

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

14 Scopus citations

Abstract

This paper reports preliminary investigations into the extent to which future directional intention might be reliably inferred from head pose and eye gaze during locomotion. Such findings could help inform the more effective implementation of realistic detailed animation for dynamic virtual agents in interactive first-person crowd simulations in VR, as well as the design of more efficient predictive controllers for redirected walking. In three different studies, with a total of 19 participants, we placed people at the base of a T-shaped virtual hallway environment and collected head position, head orientation, and gaze direction data as they set out to perform a hidden target search task across two rooms situated at right angles to the end of the hallway. Subjects wore an nVisor ST50 HMD equipped with an Arrington Research ViewPoint eye tracker; positional data were tracked using a 12-camera Vicon MX40 motion capture system. The hidden target search task was used to blind participants to the actual focus of our study, which was to gain insight into how effectively head position, head orientation, and gaze direction data might predict people's eventual choice of which room to search first. Our results suggest that eye gaze data have the potential to provide additional predictive value over the use of 6DOF head-tracked data alone, despite the relatively limited field of view of the display we used.
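The core idea — predicting which room a walker will search first from head and gaze yaw — can be illustrated with a toy sketch. This is not the authors' method; the weighting scheme, threshold, and sign convention below are illustrative assumptions only.

```python
def predict_first_room(head_yaw_deg: float, gaze_yaw_deg: float,
                       w_head: float = 0.4, w_gaze: float = 0.6) -> str:
    """Toy destination predictor (illustrative, not from the paper).

    Convention (assumed): yaw in degrees, 0 = straight down the hallway,
    positive = toward the right-hand room, negative = toward the left.
    Gaze is weighted more heavily than head orientation to reflect the
    abstract's suggestion that gaze adds predictive value over head
    tracking alone; the specific weights are arbitrary.
    """
    cue = w_head * head_yaw_deg + w_gaze * gaze_yaw_deg
    return "right" if cue >= 0 else "left"


# Example: head turned slightly right, gaze strongly left -> gaze dominates.
print(predict_first_room(5.0, -20.0))   # gaze outweighs head here
```

In practice a predictor like this would be fit to the recorded tracking data (e.g. via logistic regression over a time window before the turn) rather than hand-weighted.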

Original language: English (US)
Title of host publication: Proceedings of the ACM Symposium on Applied Perception, SAP 2016
Publisher: Association for Computing Machinery, Inc
Pages: 31-38
Number of pages: 8
ISBN (Electronic): 9781450343831
DOIs
State: Published - Jul 22 2016
Event: ACM Symposium on Applied Perception, SAP 2016 - Anaheim, United States
Duration: Jul 22 2016 - Jul 23 2016

Publication series

Name: Proceedings of the ACM Symposium on Applied Perception, SAP 2016

Other

Other: ACM Symposium on Applied Perception, SAP 2016
Country/Territory: United States
City: Anaheim
Period: 7/22/16 - 7/23/16

Bibliographical note

Funding Information:
This work was supported by the National Science Foundation through grant CRI-1305401 Virtual Reality Infrastructure and Technology Development to Support Architectural Education and Basic Research in Immersive Design, Embodied Interaction, Spatial Cognition. The first of the initial pilot studies was conducted in partnership with Deepa Dongapure. We are grateful to Peng Liu for his assistance with the environment modeling and body tracking, and to all of our participants for their patient and dedicated efforts.

Publisher Copyright:
© 2016 ACM.


Keywords

  • Gaze direction
  • Locomotion
  • Virtual environments
