Information-based analysis of visual cues in human guidance

Andrew Feit, Berenice Mettler

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Perception plays a central role in human motion guidance skills such as precision rotorcraft landing or driving a car. This paper investigates the information available in visual cues, relative to a first-person motion guidance task, in terms of sensory-motor guidance primitives. Human subjects performed a motion guidance task in a 3D simulation system, using only visual information. Guidance primitive patterns were identified from recorded subject motion and perception behavior. Information transfer between visual measurements and vehicle motion is quantified from the resulting data to identify both model-based and nonrepresentational guidance strategies. Results show that subjects use separate guidance primitive modes during distinct phases of a trajectory. This understanding of perceptual and guidance primitive elements in human behavior has applications ranging from improved augmented cuing for human pilots to advancing artificial perception in autonomous vehicle systems.
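The abstract's central quantity, information transfer between visual measurements and vehicle motion, can be illustrated with a mutual-information estimate between two time series. The sketch below is generic and hypothetical, not the authors' implementation: it uses a simple 2D-histogram estimator on synthetic signals, and the `mutual_information` helper and toy data are assumptions for illustration only.

```python
# Illustrative sketch: histogram estimate of mutual information I(X; Y)
# between a visual-cue signal X and a vehicle-motion signal Y, in bits.
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate I(X; Y) in bits from paired samples via 2D binning."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()             # joint probability p(x, y)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    nz = pxy > 0                          # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Toy data: a motion signal partly driven by the cue carries more
# information about the cue than an independent signal does.
rng = np.random.default_rng(0)
cue = rng.normal(size=5000)
motion = cue + 0.5 * rng.normal(size=5000)
print(mutual_information(cue, motion))    # substantially > 0
```

Note that plain histogram estimators are biased upward for small samples; studies of this kind typically use bias-corrected or nearest-neighbor estimators, so treat this only as a conceptual illustration.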

Original language: English (US)
Title of host publication: 72nd American Helicopter Society International Annual Forum 2016
Subtitle of host publication: Leveraging Emerging Technologies for Future Capabilities
Publisher: American Helicopter Society
Pages: 3422-3434
Number of pages: 13
ISBN (Electronic): 9781510825062
State: Published - Jan 1 2016

Publication series

Name: Annual Forum Proceedings - AHS International
Volume: 4
ISSN (Print): 1552-2938

Cite this

Feit, A., & Mettler, B. (2016). Information-based analysis of visual cues in human guidance. In 72nd American Helicopter Society International Annual Forum 2016: Leveraging Emerging Technologies for Future Capabilities (pp. 3422-3434). (Annual Forum Proceedings - AHS International; Vol. 4). American Helicopter Society.

Information-based analysis of visual cues in human guidance. / Feit, Andrew; Mettler, Berenice.

72nd American Helicopter Society International Annual Forum 2016: Leveraging Emerging Technologies for Future Capabilities. American Helicopter Society, 2016. p. 3422-3434 (Annual Forum Proceedings - AHS International; Vol. 4).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Feit, A & Mettler, B 2016, Information-based analysis of visual cues in human guidance. in 72nd American Helicopter Society International Annual Forum 2016: Leveraging Emerging Technologies for Future Capabilities. Annual Forum Proceedings - AHS International, vol. 4, American Helicopter Society, pp. 3422-3434.
Feit A, Mettler B. Information-based analysis of visual cues in human guidance. In 72nd American Helicopter Society International Annual Forum 2016: Leveraging Emerging Technologies for Future Capabilities. American Helicopter Society. 2016. p. 3422-3434. (Annual Forum Proceedings - AHS International).
Feit, Andrew ; Mettler, Berenice. / Information-based analysis of visual cues in human guidance. 72nd American Helicopter Society International Annual Forum 2016: Leveraging Emerging Technologies for Future Capabilities. American Helicopter Society, 2016. pp. 3422-3434 (Annual Forum Proceedings - AHS International).
@inproceedings{050463dad7804d9281a9a2a3f86a938e,
title = "Information-based analysis of visual cues in human guidance",
abstract = "Perception plays a central role in human motion guidance skills such as precision rotorcraft landing or driving a car. This paper investigates the information available in visual cues, relative to a first-person motion guidance task, in terms of sensory-motor guidance primitives. Human subjects performed a motion guidance task in a 3D simulation system, using only visual information. Guidance primitive patterns were identified from recorded subject motion and perception behavior. Information transfer between visual measurements and vehicle motion is quantified from the resulting data to identify both model-based and nonrepresentational guidance strategies. Results show that subjects use separate guidance primitive modes during distinct phases of a trajectory. This understanding of perceptual and guidance primitive elements in human behavior has applications ranging from improved augmented cuing for human pilots to advancing artificial perception in autonomous vehicle systems.",
author = "Andrew Feit and Berenice Mettler",
year = "2016",
month = "1",
day = "1",
language = "English (US)",
series = "Annual Forum Proceedings - AHS International",
publisher = "American Helicopter Society",
pages = "3422--3434",
booktitle = "72nd American Helicopter Society International Annual Forum 2016",
address = "United States",
}

TY - GEN

T1 - Information-based analysis of visual cues in human guidance

AU - Feit, Andrew

AU - Mettler, Berenice

PY - 2016/1/1

Y1 - 2016/1/1

N2 - Perception plays a central role in human motion guidance skills such as precision rotorcraft landing or driving a car. This paper investigates the information available in visual cues, relative to a first-person motion guidance task, in terms of sensory-motor guidance primitives. Human subjects performed a motion guidance task in a 3D simulation system, using only visual information. Guidance primitive patterns were identified from recorded subject motion and perception behavior. Information transfer between visual measurements and vehicle motion is quantified from the resulting data to identify both model-based and nonrepresentational guidance strategies. Results show that subjects use separate guidance primitive modes during distinct phases of a trajectory. This understanding of perceptual and guidance primitive elements in human behavior has applications ranging from improved augmented cuing for human pilots to advancing artificial perception in autonomous vehicle systems.

AB - Perception plays a central role in human motion guidance skills such as precision rotorcraft landing or driving a car. This paper investigates the information available in visual cues, relative to a first-person motion guidance task, in terms of sensory-motor guidance primitives. Human subjects performed a motion guidance task in a 3D simulation system, using only visual information. Guidance primitive patterns were identified from recorded subject motion and perception behavior. Information transfer between visual measurements and vehicle motion is quantified from the resulting data to identify both model-based and nonrepresentational guidance strategies. Results show that subjects use separate guidance primitive modes during distinct phases of a trajectory. This understanding of perceptual and guidance primitive elements in human behavior has applications ranging from improved augmented cuing for human pilots to advancing artificial perception in autonomous vehicle systems.

UR - http://www.scopus.com/inward/record.url?scp=85001720476&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85001720476&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85001720476

T3 - Annual Forum Proceedings - AHS International

SP - 3422

EP - 3434

BT - 72nd American Helicopter Society International Annual Forum 2016

PB - American Helicopter Society

ER -