Basic Article Information

  • Title: Experience with crossmodal statistics reduces the sensitivity for audio-visual temporal asynchrony
  • Authors: Boukje Habets; Patrick Bruns; Brigitte Röder
  • Journal: Scientific Reports
  • Electronic ISSN: 2045-2322
  • Publication year: 2017
  • Volume: 7
  • Issue: 1
  • DOI: 10.1038/s41598-017-01252-y
  • Language: English
  • Publisher: Springer Nature
  • Abstract: Bayesian models propose that multisensory integration depends on both sensory evidence (the likelihood) and priors indicating whether or not two inputs belong to the same event. The present study manipulated the prior for dynamic auditory and visual stimuli to co-occur and tested the predicted enhancement of multisensory binding as assessed with a simultaneity judgment task. In an initial learning phase participants were exposed to a subset of auditory-visual combinations. In the test phase the previously encountered audio-visual stimuli were presented together with new combinations of the auditory and visual stimuli from the learning phase, audio-visual stimuli containing one learned and one new sensory component, and audio-visual stimuli containing completely new auditory and visual material. Auditory-visual asynchrony was manipulated. A higher proportion of simultaneity judgments was observed for the learned cross-modal combinations than for new combinations of the same auditory and visual elements, as well as for all other conditions. This result suggests that prior exposure to certain auditory-visual combinations changed the expectation (i.e., the prior) that their elements belonged to the same event. As a result, multisensory binding became more likely despite unchanged sensory evidence of the auditory and visual elements.