
Article Information

  • Title: Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization
  • Authors: Chunyuan Zhang; Qingxin Zhu; Xinzheng Niu
  • Journal: Computational Intelligence and Neuroscience
  • Print ISSN: 1687-5265
  • Online ISSN: 1687-5273
  • Year: 2016
  • Volume: 2016
  • DOI: 10.1155/2016/2305854
  • Publisher: Hindawi Publishing Corporation
  • Abstract: By combining sparse kernel methods with least-squares temporal difference (LSTD) algorithms, the feature dictionary can be constructed automatically and better generalization ability can be obtained. However, previous kernel-based LSTD algorithms do not consider regularization, and their sparsification processes are batch or offline, which hinders their widespread application to online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning; (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise; (iii) recursive least squares, which eliminates matrix-inversion operations and reduces computational complexity; (iv) a sliding-window approach, which avoids caching all history samples and reduces the computational cost; and (v) fixed-point subiteration with online pruning, which makes L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms.
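To illustrate technique (iii) from the abstract, the sketch below shows how a recursive least-squares TD update avoids explicit matrix inversion by maintaining the inverse of the LSTD matrix with a Sherman–Morrison rank-1 update. This is a minimal, generic RLS-TD step over linear features, not the authors' kernelized algorithm; the function name, feature encoding, and initialization constant are illustrative assumptions.

```python
import numpy as np

def rls_td_update(theta, P, phi, reward, phi_next, gamma=0.95):
    """One recursive least-squares TD step (generic sketch, not the paper's
    kernelized variant). P approximates the inverse of the LSTD matrix
    A = sum phi (phi - gamma*phi')^T, updated via Sherman-Morrison so the
    cost per step is O(d^2) instead of the O(d^3) of a matrix inversion."""
    u = phi - gamma * phi_next        # temporal-difference feature vector
    Pphi = P @ phi
    denom = 1.0 + u @ Pphi            # scalar correction term
    k = Pphi / denom                  # gain vector
    theta = theta + k * (reward - u @ theta)  # correct toward the TD target
    P = P - np.outer(k, u @ P)        # rank-1 downdate of the inverse
    return theta, P

# Tiny usage example: two one-hot states, gamma = 0, fixed rewards 1 and 2,
# so the value estimate should converge to the per-state rewards.
d = 2
theta = np.zeros(d)
P = np.eye(d) / 0.01                  # large initial P (small regularizer)
for _ in range(50):
    for s, r in ((0, 1.0), (1, 2.0)):
        phi = np.eye(d)[s]
        theta, P = rls_td_update(theta, P, phi, r, np.zeros(d), gamma=0.0)
```

With `gamma = 0` the fixed point is simply the mean reward per state, which makes the convergence easy to check by hand; the paper's algorithms replace the fixed feature map with an online-sparsified kernel dictionary and add the regularization terms described in the abstract.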
Copyright © National Center for Philosophy and Social Sciences Documentation