Head and eye movements of normal hearing and hearing impaired participants during three-party conversations

  • Hao Lu (Creator)
  • Martin McKinney (Creator)
  • Tao Zhang (Creator)
  • Andrew J Oxenham (Creator)

Dataset

Description

The data include head movements, eye gaze, and speech segments recorded from 10 young normal-hearing listeners, 10 older normal-hearing listeners, and 10 older hearing-impaired listeners during three-party group conversations. Each conversation lasted about 25 minutes while different levels of background noise were played: no noise, and 50, 60, and 70 dB SPL noise. The data are released as supplemental material for the publications listed under "Referenced by" below, so that other researchers working on gaze-guided hearing aids can test their models on them.
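
As an illustration only (not part of the dataset itself), the Python sketch below lays out the participant groups and noise conditions described above; the group and condition labels are placeholders chosen for this example, not identifiers used in the data files.

    # Participant groups and background-noise conditions described above.
    # Labels are illustrative placeholders, not names from the data files.
    GROUPS = {
        "young_normal_hearing": 10,      # participants per group
        "older_normal_hearing": 10,
        "older_hearing_impaired": 10,
    }
    NOISE_LEVELS_DB_SPL = [None, 50, 60, 70]  # None = no background noise

    for group, n_participants in GROUPS.items():
        for level in NOISE_LEVELS_DB_SPL:
            condition = "quiet" if level is None else f"{level} dB SPL"
            print(f"{group}: {n_participants} participants, noise = {condition}")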

Description
Detailed documentation of the data is included as readme.txt. Information about the participants and related settings is stored in a txt file. The head/eye movements and labeled speech segments are stored in separate folders, which are compressed into a single zip file; please extract all folders from the zip file.
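
For convenience, a minimal Python sketch of unpacking the archive is shown below. The archive and output directory names used here are placeholders (the actual file names are documented in readme.txt), so adjust the paths to match the downloaded files.

    from pathlib import Path
    import zipfile

    # Placeholder archive name; use the actual zip file name from the download.
    archive = Path("head_eye_movement_data.zip")
    out_dir = Path("head_eye_movement_data")

    # Extract all folders from the zip file, as described above.
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(out_dir)

    # List the extracted top-level folders (head/eye movement and speech-segment data).
    for entry in sorted(out_dir.iterdir()):
        if entry.is_dir():
            print(entry.name)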

Funding information
Sponsorship: Starkey Hearing Technologies

Referenced by
Lu, H., McKinney, M., Zhang, T., & Oxenham, A. J. (2018). Tracking eye and head movements in natural conversational settings: Effects of hearing loss and background noise level. The Journal of the Acoustical Society of America, 143(3), 1743. https://doi.org/10.1121/1.5035688
Lu, H., McKinney, M. F., Zhang, T., & Oxenham, A. J. (2021). Investigating age, hearing loss, and background noise effects on speaker-targeted head and eye movements in three-way conversations. The Journal of the Acoustical Society of America, 149(3), 1889. https://doi.org/10.1121/10.0003707
Date made available: Mar 4, 2021
Publisher: Data Repository for the University of Minnesota
