Multimodal perception of reachability expressed through locomotion

Bruno Mantel, Benoît G. Bardy, Thomas A. Stoffregen

Research output: Contribution to journal › Article › peer-review



Using a locomotor task, we investigated the information that supports perception of whether an object is within reach. Participants adjusted their own position relative to a fixed target, by stepping or by propelling a wheelchair, until they judged the target to be within reach. The to-be-reached object was presented in virtual reality. The display of the target was driven in real time as a function of the observer's movement, thus depicting a stationary virtual object at a definite distance solely through the relation between optical and nonoptical patterns of stimulation. We asked participants to judge the distance they could reach with their unaided hand or while holding a rod that extended their effective reach. They could see neither their body nor the rod, thereby limiting the available visual information about reachability. As expected, despite the limited information available, participants' locomotor adjustments were influenced by (a) their simulated distance from the target, (b) their arm length, and (c) the presence or absence of the rod. The type of locomotion (stepping or wheelchair) had little influence; however, judgment accuracy was influenced by participants' initial simulated distance from the target. We compare performance in our locomotor judgment task with that of previous studies that used different methods for measuring perceived reaching ability, and we discuss the perceptual information that could have supported performance within the framework of the global array.

Original language: English (US)
Pages (from-to): 192-211
Number of pages: 20
Journal: Ecological Psychology
Issue number: 3
State: Published - Jul 2010

