Somatosensory anticipation of curvature in a haptic virtual environment

Julian J. Tramper, Stephen Stephens, Martha Flanders

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

The human visuomotor system uses predictive mechanisms to allow the eye or hand to efficiently follow a moving target. The long-term goal of the present study is to determine whether the somatosensory system has similar capabilities. Subjects used the right arm to move the index fingertip inside virtual tubes shaped as large elliptical objects positioned in the frontal plane. The virtual ellipses had three different aspect ratios and two different tilts, and some had flattened portions inserted in one of three regions. Each of the 24 virtual shapes was presented only once to each subject, but the subject explored each one by moving through five consecutive laps. Performance improved more across the laps when subjects were allowed to stay in constant contact with the walls of the tube, rather than attempting to stay off the walls. However, even with this continuous haptic feedback, subjects could not precisely anticipate the timing of an upcoming flattened region. Thus, similar to recent results for visually guided eye movements, it appears that it is difficult for the haptic guidance system to time the anticipation of an upcoming event.

Original language: English (US)
Title of host publication: Haptics Symposium 2012, HAPTICS 2012 - Proceedings
Pages: 183-186
Number of pages: 4
State: Published - 2012
Event: 2012 IEEE Haptics Symposium, HAPTICS 2012 - Vancouver, BC, Canada
Duration: Mar 4 2012 - Mar 7 2012

Publication series

Name: Haptics Symposium 2012, HAPTICS 2012 - Proceedings

Other

Other: 2012 IEEE Haptics Symposium, HAPTICS 2012
Country/Territory: Canada
City: Vancouver, BC
Period: 3/4/12 - 3/7/12

Keywords

  • active sensing
  • touch
