Abstract
Although beamforming algorithms for hearing aids can enhance performance, the wearer's head may not always face the target talker, potentially limiting real-world benefits. This study aimed to determine the extent to which eye tracking improves the accuracy of locating the current talker in three-way conversations and to test the hypothesis that eye movements become more likely to track the target talker with increasing background noise levels, particularly in older and/or hearing-impaired listeners. Conversations between a participant and two confederates were held around a small table in quiet and with background noise levels of 50, 60, and 70 dB sound pressure level, while the participant's eye and head movements were recorded. Ten young normal-hearing listeners were tested, along with ten older normal-hearing listeners and eight hearing-impaired listeners. Head movements generally undershot the talker's position by 10°-15°, but head and eye movements together predicted the talker's position well. Contrary to our original hypothesis, no major differences in listening behavior were observed between the groups or between noise levels, although the hearing-impaired listeners tended to spend less time looking at the current talker than the other groups, especially at the highest noise level.
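To make the abstract's central geometric point concrete: the talker's direction was predicted well only when head orientation and eye-in-head angle were combined, since the head alone undershot by 10°-15°. Below is a minimal Python sketch of that additive relation. The talker azimuths, sign conventions, and numeric values are illustrative assumptions, not values taken from the paper.

```python
# Sketch of how head and eye angles combine into a room-referenced gaze
# estimate, as described qualitatively in the abstract. All numbers here
# are hypothetical, not values from the paper or dataset.
import numpy as np

# Hypothetical azimuths of the two confederate talkers (degrees).
TALKER_AZIMUTHS = np.array([-30.0, 30.0])

def gaze_azimuth(head_yaw_deg: np.ndarray, eye_in_head_deg: np.ndarray) -> np.ndarray:
    """Gaze direction in room coordinates = head orientation + eye-in-head angle."""
    return head_yaw_deg + eye_in_head_deg

def nearest_talker(gaze_deg: np.ndarray) -> np.ndarray:
    """Index of the talker whose azimuth is closest to each gaze sample."""
    return np.argmin(np.abs(gaze_deg[:, None] - TALKER_AZIMUTHS[None, :]), axis=1)

# Example: the head undershoots a talker at +30 deg by ~12 deg,
# and the eyes make up the remainder.
head = np.array([18.0, 17.5, 19.0])
eyes = np.array([11.0, 12.0, 10.5])
print(gaze_azimuth(head, eyes))                   # ~[29. 29.5 29.5], close to +30 deg
print(nearest_talker(gaze_azimuth(head, eyes)))   # [1 1 1] -> the +30 deg talker
```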
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1889-1900 |
| Number of pages | 12 |
| Journal | Journal of the Acoustical Society of America |
| Volume | 149 |
| Issue number | 3 |
| DOIs | |
| State | Published - Mar 1 2021 |
Bibliographical note
Funding Information: This research was supported by Starkey Laboratories. We are grateful to Nathaniel Helwig for comments, Andrew Byrne for technical support, PuiYii Goh for data analysis, and Peggy Nelson for sharing the background noise audio. Deidentified data collected during the experiment have been shared through the Data Repository for the University of Minnesota (DRUM) and can be obtained from https://doi.org/10.13020/mef8-q570. The shared data include the head and eye movements, extracted from the eye tracker video recordings, and the time stamps of speech segments extracted from the audio recordings. To maintain participant confidentiality, the raw audio and video recordings are not available.
Publisher Copyright:
© 2021 Acoustical Society of America.
PubMed: MeSH publication types
- Journal Article
- Research Support, Non-U.S. Gov't
Fingerprint
Dive into the research topics of 'Investigating age, hearing loss, and background noise effects on speaker-targeted head and eye movements in three-way conversations'. Together they form a unique fingerprint.

Datasets
- Head and eye movements of normal hearing and hearing impaired participants during three-party conversations
Lu, H., McKinney, M., Zhang, T. & Oxenham, A. J., Data Repository for the University of Minnesota, Mar 4 2021
DOI: 10.13020/mef8-q570, https://hdl.handle.net/11299/218998
Dataset
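As an illustration of one analysis the shared DRUM dataset supports (the fraction of time the combined head-plus-eye gaze rests on the current talker during each speech segment, the measure on which the hearing-impaired group differed), here is a minimal Python sketch. The file names, column names, and tolerance are hypothetical assumptions; consult the repository at doi:10.13020/mef8-q570 for the actual data layout.

```python
# Illustrative sketch: fraction of time the combined head + eye gaze points
# at the current talker during each speech segment. File and column names
# are hypothetical; see the DRUM repository for the real layout.
import pandas as pd

gaze = pd.read_csv("gaze.csv")          # hypothetical columns: time_s, head_yaw_deg, eye_azimuth_deg
segments = pd.read_csv("segments.csv")  # hypothetical columns: start_s, end_s, talker_azimuth_deg

# Room-referenced gaze direction = head orientation + eye-in-head angle.
gaze["gaze_deg"] = gaze["head_yaw_deg"] + gaze["eye_azimuth_deg"]

TOLERANCE_DEG = 10.0  # assumed angular window counted as "looking at" a talker

dwell = []
for seg in segments.itertuples():
    in_seg = gaze[(gaze["time_s"] >= seg.start_s) & (gaze["time_s"] < seg.end_s)]
    if len(in_seg):
        on_talker = (in_seg["gaze_deg"] - seg.talker_azimuth_deg).abs() <= TOLERANCE_DEG
        dwell.append(on_talker.mean())

if dwell:
    print(f"Mean fraction of time on current talker: {sum(dwell) / len(dwell):.2f}")
```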