Learning to judge distance of nearby sounds in reverberant and anechoic environments

 

Norbert Kopčo1,2, Matt Schoolmaster1, and Barbara Shinn-Cunningham1

1Hearing Research Center, Boston University, Boston, MA, USA

2Dept. of Cybernetics & AI, Technical University, Košice, Slovakia

 

Previous studies have shown that the accuracy of distance judgments for nearby sources improves over time when the listener is in a reverberant environment, but not in an anechoic space. The improvement observed in rooms may be the result of the listener learning (through experience) how to interpret reverberation cues and map these cues to different distances in a particular room. The present study evaluates whether such "room learning" is disrupted when reverberation cues vary over the course of the experiment. Results of two auditory distance perception experiments are reported. In the first experiment, perceived distance was measured for listeners whose position in a real room was varied from session to session. In the second, distance perception was measured using virtual auditory space (VAS) techniques to simulate sounds for different listener locations in a reverberant room and in anechoic space. In the real room, listeners appeared to get better at judging source distance despite the fact that their location in the room varied from session to session. However, in the VAS study, intermingling sounds simulated in a room with sounds simulated in anechoic space led to a dramatic reduction in overall performance as well as a reduction in the amount of improvement observed with experience. In the limit, when the simulated room was varied on a trial-by-trial basis, no learning was observed. These results suggest that listeners can generalize "room learning" across different listener locations within a single room, but not across dramatically different acoustic environments.