Perception plays a central role in human motion guidance skills such as precision rotorcraft landing or driving a car. This paper investigates the information available in visual cues, relative to a first-person motion guidance task, in terms of sensory-motor guidance primitives. Human subjects performed a motion guidance task in a 3D simulation system, using only visual information. Guidance primitive patterns were identified from recorded subject motion and perception behavior. Information transfer between visual measurements and vehicle motion was quantified from the resulting data to identify both model-based and nonrepresentational guidance strategies. Results show that subjects use separate guidance primitive modes during distinct phases of a trajectory. This understanding of the perceptual and guidance primitive elements of human behavior has applications ranging from improved augmented cuing for human pilots to advancing artificial perception in autonomous vehicle systems.