
Article Information

  • Title: A geometric framework for understanding dynamic information integration in context-dependent computation
  • Authors: Xiaohan Zhang; Shenquan Liu; Zhe Sage Chen
  • Journal: iScience
  • Print ISSN: 2589-0042
  • Year: 2021
  • Volume: 24
  • Issue: 8
  • Pages: 1-24
  • DOI: 10.1016/j.isci.2021.102919
  • Language: English
  • Publisher: Elsevier
  • Abstract: The prefrontal cortex (PFC) plays a prominent role in flexible cognitive functions and working memory, yet the underlying computational principle remains poorly understood. Here, we trained a rate-based recurrent neural network (RNN) to explore how context rules are encoded, maintained across a seconds-long mnemonic delay, and subsequently used in a context-dependent decision-making task. The trained networks replicated key features observed experimentally in rodent and monkey PFC, such as mixed selectivity, sequential neuronal activity, and rotational dynamics. To uncover the high-dimensional neural dynamical system, we further proposed a geometric framework to quantify and visualize population coding and sensory integration in a temporally defined manner. We employed dynamic epoch-wise principal component analysis (PCA) to define multiple task-specific subspaces and task-related axes, and computed the angles between the task-related axes and these subspaces. In low-dimensional neural representations, the trained RNN first encoded the context cues in a cue-specific subspace, then maintained the cue information in a stable low-activity state that persisted through the delay epoch, and finally formed line attractors for sensory integration along low-dimensional neural trajectories to guide decision-making. We demonstrated via intensive computer simulations that the geometric manifolds encoding the context information were robust to varying degrees of weight perturbation in both space and time. Overall, our analysis framework provides clear geometric interpretations and quantification of information coding, maintenance, and integration, yielding new insight into the computational mechanisms of context-dependent computation.
  • Graphical abstract: omitted.
  • Highlights: Units with mixed selectivity emerged in context-dependent computation; neural sequences emerged in the trained RNN during the cue delay; task-specific neural trajectories were distinguished in low-dimensional subspaces; sensory integration formed dynamic fixed points and line attractors.
  • Subject areas: Neuroscience; Cognitive neuroscience; Biocomputational method
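
The abstract's central quantitative step, fitting an epoch-specific PCA subspace to population activity and measuring the angle between a task-related axis and that subspace, can be sketched as follows. This is a minimal illustration, not the authors' code: the helper names (epoch_subspace, angle_to_subspace), the array shapes, and the synthetic data standing in for RNN activity are all assumptions for demonstration.

```python
# Minimal sketch of the epoch-wise PCA / subspace-angle idea described in the abstract.
import numpy as np
from numpy.linalg import svd, norm

def epoch_subspace(rates, n_components=3):
    """Orthonormal basis (n_units x n_components) of the top principal
    components of one epoch's activity.
    rates: (n_samples, n_units) firing rates pooled over time and conditions."""
    X = rates - rates.mean(axis=0, keepdims=True)  # mean-center before PCA
    _, _, Vt = svd(X, full_matrices=False)         # rows of Vt are PCA axes
    return Vt[:n_components].T                     # columns span the epoch subspace

def angle_to_subspace(axis, basis):
    """Angle (degrees) between a task-related axis and a subspace.
    axis: (n_units,) e.g. a context or choice axis; basis: (n_units, k) orthonormal."""
    axis = axis / norm(axis)
    proj = basis @ (basis.T @ axis)          # orthogonal projection onto the subspace
    cos_theta = np.clip(norm(proj), 0.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

# Example with synthetic data in place of trained-RNN population activity.
rng = np.random.default_rng(0)
n_units = 100
cue_epoch_rates = rng.normal(size=(200, n_units))  # hypothetical cue-epoch activity
context_axis = rng.normal(size=n_units)            # hypothetical task-related axis
basis = epoch_subspace(cue_epoch_rates, n_components=3)
print(f"Angle between context axis and cue-epoch subspace: "
      f"{angle_to_subspace(context_axis, basis):.1f} deg")
```

A small angle would indicate that the task-related axis lies largely within the epoch's dominant activity subspace, which is the kind of geometric relationship the paper quantifies across task epochs.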