
Article Information

  • Title: Overcoming catastrophic forgetting in neural networks
  • Authors: James Kirkpatrick; Razvan Pascanu; Neil Rabinowitz; et al.
  • Journal: Proceedings of the National Academy of Sciences
  • Print ISSN: 0027-8424
  • Electronic ISSN: 1091-6490
  • Publication year: 2017
  • Volume: 114
  • Issue: 13
  • Pages: 3521-3526
  • DOI: 10.1073/pnas.1611835114
  • Language: English
  • Publisher: The National Academy of Sciences of the United States of America
  • Abstract: The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence. Until now neural networks have not been capable of this, and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models. We show that it is possible to overcome this limitation and train networks that can maintain expertise on tasks that they have not experienced for a long time. Our approach remembers old tasks by selectively slowing down learning on the weights important for those tasks. We demonstrate our approach is scalable and effective by solving a set of classification tasks based on a hand-written digit dataset and by learning several Atari 2600 games sequentially. (An illustrative sketch of this approach follows the keyword list below.)
  • Keywords: synaptic consolidation; artificial intelligence; stability plasticity; continual learning; deep learning
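
The approach summarized in the abstract is the paper's elastic weight consolidation (EWC): after training on a task A, each weight theta_i is anchored at its learned value with a quadratic penalty scaled by an importance estimate F_i (the diagonal of the Fisher information), so that training on a new task B minimizes L_B(theta) + sum_i (lambda/2) * F_i * (theta_i - theta_A_i)^2. The PyTorch sketch below is a minimal illustration of that loss, not the authors' implementation; the names diagonal_fisher, ewc_penalty, and anchor, and the value of lam, are assumptions made here for illustration.

    import torch

    def diagonal_fisher(model, data_loader, loss_fn):
        # Rough per-weight importance estimate for the old task: the mean
        # squared gradient over the task's data, i.e. a diagonal Fisher
        # approximation (an assumption of this sketch, computed per batch).
        fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        model.eval()
        for x, y in data_loader:
            model.zero_grad()
            loss_fn(model(x), y).backward()
            for n, p in model.named_parameters():
                if p.grad is not None:
                    fisher[n] += p.grad.detach() ** 2
        return {n: f / len(data_loader) for n, f in fisher.items()}

    def ewc_penalty(model, fisher, anchor, lam=400.0):
        # (lambda/2) * sum_i F_i * (theta_i - theta_A_i)^2: slows learning
        # on the weights that mattered for the previously learned task.
        # lam is an illustrative hyperparameter, not a value from the paper.
        penalty = 0.0
        for n, p in model.named_parameters():
            penalty = penalty + (fisher[n] * (p - anchor[n]) ** 2).sum()
        return 0.5 * lam * penalty

    # Usage sketch: after finishing task A,
    #   fisher = diagonal_fisher(model, loader_A, loss_fn)
    #   anchor = {n: p.detach().clone() for n, p in model.named_parameters()}
    # then, while training on task B,
    #   loss = loss_fn(model(x_b), y_b) + ewc_penalty(model, fisher, anchor)

The penalty leaves unimportant weights (small F_i) free to adapt to the new task while important ones stay near their old values, which is the selective slowing of learning that the abstract describes.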