Article Information

  • Title: Multisensory visuo-tactile context learning enhances the guidance of unisensory visual search
  • Authors: Siyi Chen; Zhuanghua Shi; Hermann J. Müller
  • Journal: Scientific Reports
  • Electronic ISSN: 2045-2322
  • Year of publication: 2021
  • Volume: 11
  • DOI: 10.1038/s41598-021-88946-6
  • Language: English
  • Publisher: Springer Nature
  • Abstract: Does multisensory distractor-target context learning enhance visual search over and above unisensory learning? To address this, we had participants perform a visual search task under both uni- and multisensory conditions. Search arrays consisted of one Gabor target that differed from three homogeneous distractors in orientation; participants had to discriminate the target’s orientation. In the multisensory session, additional tactile (vibration-pattern) stimulation was delivered to two fingers of each hand, with the odd-one-out tactile target and the distractors co-located with the corresponding visual items in half the trials; the other half presented the visual array only. In both sessions, the visual target was embedded within identical (repeated) spatial arrangements of distractors in half of the trials. The results revealed faster response times to targets in repeated versus non-repeated arrays, evidencing ‘contextual cueing’. This effect was enhanced in the multisensory session; importantly, it emerged even when the visual arrays were presented without concurrent tactile stimulation. Drift–diffusion modeling confirmed that contextual cueing both increased the rate at which task-relevant information was accumulated and decreased the amount of evidence required for a response decision. Importantly, multisensory learning selectively enhanced the evidence-accumulation rate, expediting target detection even when the context memories were triggered by visual stimuli alone.
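
For context, the drift–diffusion modeling referenced in the abstract rests on the standard evidence-accumulation formulation; the equation below is the textbook version (with drift rate v, boundary separation a, starting point z, and diffusion noise σ), not necessarily the authors' exact parameterization or fitting procedure:

\[
dx(t) = v\,dt + \sigma\,dW(t), \qquad x(0) = z, \qquad \text{respond when } x(t) \text{ first reaches } 0 \text{ or } a.
\]

On this reading, a larger drift rate v corresponds to a faster accumulation of task-relevant information, while a smaller boundary separation a corresponds to less evidence being required before a response decision is made.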