
Article Information

  • Title: LTR-MDTS structure - a structure for multiple dependent time series prediction
  • Authors: Pecev, Predrag; Racković, Miloš
  • Journal: Computer Science and Information Systems
  • Print ISSN: 1820-0214
  • Electronic ISSN: 2406-1018
  • Year: 2017
  • Volume: 14
  • Issue: 2
  • Pages: 467-490
  • Publisher: ComSIS Consortium
  • Abstract: The research presented in this paper models a neural network structure and an appropriate training algorithm suited to the prediction and deduction of multiple dependent time series. The basic idea is to use neural networks to predict the synchronized movement of basketball referees during a basketball action. Representing the time series arising from this problem with a traditional multilayer perceptron (MLP) leads to a paradoxical backward time lapse effect, in which certain input and hidden-layer nodes influence output nodes that correspond to earlier moments in time. This paper describes the conducted research and analyzes different methods of overcoming this problem. The paper is essentially split into two parts. The first part covers efforts put into training set configuration for standard MLP backpropagation neural networks, in order to reduce the backward time lapse effect that certain input and hidden-layer nodes have on output nodes. The second part focuses on the results provided by a new neural network structure called LTR-MDTS. The LTR-MDTS design builds on a standard MLP from which certain synapses are removed, left to right, to eliminate the aforementioned backward time lapse effect on the output nodes. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. OI174023: "Intelligent techniques and their integration into wide-spectrum decision support"]
  • Keywords: MLP; Multiple Dependent Time Series; LTR-MDTS structure; training parameter influence; Neural Network Configuration; training set configuration and optimization
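The left-to-right synapse removal described in the abstract can be illustrated with a small sketch. This is not the authors' implementation; all sizes, names, and the mask-based construction are illustrative assumptions. The idea is that the output for time step t must depend only on inputs at steps up to t, which a block lower-triangular connectivity mask enforces:

```python
import numpy as np

# Illustrative sketch (not the paper's code): an MLP whose synapses are
# masked so that the output for time step t depends only on inputs at
# steps <= t, removing the "backward time lapse" effect. Sizes are
# arbitrary assumptions for the example.

T = 4             # number of time steps
d_in, d_h = 3, 5  # input features per step, hidden units per step

rng = np.random.default_rng(0)

def ltr_mask(t_steps, rows_per, cols_per):
    """Block lower-triangular mask: block (i, j) is kept only when
    j <= i, so step i receives signal only from steps up to i."""
    m = np.zeros((t_steps * rows_per, t_steps * cols_per))
    for i in range(t_steps):
        for j in range(i + 1):
            m[i*rows_per:(i+1)*rows_per, j*cols_per:(j+1)*cols_per] = 1.0
    return m

# Weight matrices with left-to-right connectivity only.
W1 = rng.standard_normal((T * d_h, T * d_in)) * ltr_mask(T, d_h, d_in)
W2 = rng.standard_normal((T, T * d_h)) * ltr_mask(T, 1, d_h)

def forward(x):
    h = np.tanh(W1 @ x)  # hidden layer: step i sees inputs of steps <= i
    return W2 @ h        # one output value per time step

x = rng.standard_normal(T * d_in)
y = forward(x)

# Perturbing only the last step's inputs leaves earlier outputs unchanged,
# i.e. no backward time lapse effect.
x2 = x.copy()
x2[-d_in:] += 10.0
y2 = forward(x2)
assert np.allclose(y[:-1], y2[:-1])
```

The assertion at the end checks the defining property: changing the inputs of the final time step cannot alter the outputs of any earlier time step, which is exactly what removing the left-to-right synapses is meant to guarantee.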