Neural representation of room acoustic simulations
Abstract:
Our understanding of the neural processing of spatial navigation has been dominated by vision; however, auditory spatial cues are also fundamental to how we perceive the world around us. Advances in spatial acoustic simulation now provide a high level of perceptual plausibility and allow acoustic parameters to be manipulated systematically, enabling investigation of the role of auditory space in complex cognitive tasks. However, it is unclear how certain properties of such room acoustic simulations affect the neural processing of these virtual spaces. In this study, listeners performed an auditory distance perception task with headphone auralizations while we collected brain activity data using sparse multiband functional magnetic resonance imaging (fMRI). Three room auralizations were generated: a recording of the actual MRI room, a perceptually plausible simulation, and a simplified simulation, the latter two generated using the acoustic simulator RAZR (Wendt et al. 2014). Although all auralizations activated primary auditory brain regions, the differences in brain activity between them were strong enough for above-chance decoding. By analyzing the acoustic differences between the auralizations, we then determined which acoustic features drove these differences in brain activity. Identifying differences in brain activity based on spatial acoustic parameters can inform us which components of such simulations are relevant for the brain's processing of spatial acoustic information.