Looking At or Through? Using Eye Tracking to Infer Attention Location for Wearable Transparent Displays

Mélodie Vidal, David H. Nguyen, Kent Lyons

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

12 Scopus citations

Abstract

Wearable near-eye displays pose interesting challenges for interface design. These devices present the user with a duality of visual worlds, with a virtual window of information overlaid onto the physical world. Because of this duality, we suggest that the wearable interface would benefit from understanding where the user's visual attention is directed. We explore the potential of eye tracking to address this problem, and describe four eye tracking techniques designed to provide data about where the user's attention is directed. We also propose attention-aware user interface techniques that demonstrate the potential of the eyes for managing the user interface of wearable displays.
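
The abstract does not detail the four eye tracking techniques themselves. Purely as an illustrative sketch, and not the authors' method, one way a binocular eye tracker could distinguish "looking at" the display overlay from "looking through" it to the physical world is to estimate fixation depth from vergence and compare it with the display's virtual image distance. All names, constants, and the vergence heuristic below are assumptions made for illustration.

```python
from dataclasses import dataclass
import math

# Assumed geometry (not from the paper): interpupillary distance and the
# virtual image distance of the near-eye display.
IPD_M = 0.063            # interpupillary distance in metres (assumed)
DISPLAY_DEPTH_M = 2.0    # virtual image distance of the display (assumed)
DEPTH_TOLERANCE_M = 0.5  # how close fixation depth must be to the display plane

@dataclass
class BinocularSample:
    """One binocular gaze sample: horizontal gaze angle of each eye, in radians."""
    left_yaw: float   # positive = rotated toward the nose (left eye)
    right_yaw: float  # positive = rotated toward the nose (right eye)

def fixation_depth(sample: BinocularSample) -> float:
    """Estimate fixation depth from vergence using the small-angle relation depth ~ IPD / vergence."""
    vergence = sample.left_yaw + sample.right_yaw
    if vergence <= 0.0:
        return math.inf  # parallel or diverging eyes: effectively fixating at infinity
    return IPD_M / vergence

def looking_at_display(sample: BinocularSample) -> bool:
    """Classify a sample as attention on the display (True) or on the physical world (False)."""
    return abs(fixation_depth(sample) - DISPLAY_DEPTH_M) <= DEPTH_TOLERANCE_M

if __name__ == "__main__":
    near = BinocularSample(left_yaw=0.016, right_yaw=0.016)  # ~2 m fixation depth
    far = BinocularSample(left_yaw=0.003, right_yaw=0.003)   # ~10 m fixation depth
    print(looking_at_display(near))  # True: plausibly attending to the overlay
    print(looking_at_display(far))   # False: plausibly attending to the world
```

In practice such a heuristic would need per-user calibration and smoothing over fixations rather than classifying single samples, and it is only one of several possible cues for inferring attention location.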

Original language: English (US)
Title of host publication: ISWC 2014 - Proceedings of the 2014 ACM International Symposium on Wearable Computers
Publisher: Association for Computing Machinery
Pages: 87-90
Number of pages: 4
ISBN (Electronic): 9781450329699
DOIs
State: Published - Sep 13 2014
Externally published: Yes
Event: 18th ACM International Symposium on Wearable Computers, ISWC 2014 - Seattle, United States
Duration: Sep 13 2014 – Sep 17 2014

Publication series

Name: Proceedings - International Symposium on Wearable Computers, ISWC
ISSN (Print): 1550-4816

Conference

Conference: 18th ACM International Symposium on Wearable Computers, ISWC 2014
Country/Territory: United States
City: Seattle
Period: 9/13/14 – 9/17/14

Bibliographical note

Publisher Copyright:
© 2014 ACM.

Keywords

  • attention detection
  • eye movements
  • eye tracking
  • head-mounted displays (HMD)
  • user interface management
