Speech perception under eye-controlled and head-controlled directional microphones in a dynamic ‘Cocktail Party’
Hearing-impaired people have difficulty understanding speech in noisy environments. Although hearing aids with directional microphones reduce background noise and thus improve speech intelligibility, the benefit does not always translate to real listening scenarios. One reason is that standard directional microphones amplify sound only from in front of the listener, which makes listening to off-axis signals even more difficult than with omnidirectional microphones. To address this problem, the directivity of the microphones could be controlled by the eye gaze of the hearing aid user. In the current experiment, hearing-impaired and normal-hearing subjects listened to single words, presented at a rate of one word every 1.5 seconds, coming from 30 degrees to either the left or the right, simulating a turn-taking conversation. The participants' task was to repeat the words out loud. The stimuli were presented through a loudspeaker ring, and the directivity pattern of the virtual hearing aids, either head-controlled or eye-controlled, was simulated on-line using head tracking and eye tracking to control the gain of each loudspeaker channel. The stimuli were presented in four conditions: the combinations of the two directional steering methods and two acoustic beam widths. Preliminary results indicate a benefit of the eye-controlled technology over the head-controlled technology; however, the benefit was diminished in those subjects who preferred to re-orient using pronounced head movements. These data indicate that patterns of behavior could be critical for understanding speech benefits in situations like dynamic ‘cocktail parties’.
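The on-line simulation described above steers a directional gain pattern across the loudspeaker ring according to the tracked head or gaze direction. A minimal sketch of such per-channel gain steering is given below; the function name, the Gaussian beam shape, and the specific beam widths are illustrative assumptions, not the study's actual signal processing.

```python
import math

def beam_gains(steer_deg, speaker_azimuths_deg, beam_width_deg):
    """Per-loudspeaker linear gains for a simulated directional beam.

    The beam is centered on the steering direction (from head or eye
    tracking), with a Gaussian fall-off whose full width at half maximum
    is beam_width_deg. This is a hypothetical sketch, not the study's
    published processing chain.
    """
    sigma = beam_width_deg / 2.355  # convert FWHM to a standard deviation
    gains = []
    for az in speaker_azimuths_deg:
        # wrap the angular difference into [-180, 180) degrees
        diff = (az - steer_deg + 180.0) % 360.0 - 180.0
        gains.append(math.exp(-0.5 * (diff / sigma) ** 2))
    return gains

# talkers at +/-30 degrees, beam steered (e.g. by gaze) to the left talker
print(beam_gains(-30.0, [-30.0, 30.0], 45.0))
```

With this kind of steering, the attended talker's channel keeps unity gain while the off-axis talker is attenuated, and a narrower beam width attenuates off-axis channels more strongly, mirroring the two beam-width conditions in the experiment.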
Funding — Work supported by the Oticon Foundation, the Medical Research Council (grants #U135097131 and #MC_UU_00010/4), and the Chief Scientist Office – Scottish Government.