Journal: International Journal of Advances in Soft Computing and Its Applications
Print ISSN: 2074-8523
Year: 2015
Volume: 7
Issue: 3
Publisher: International Center for Scientific Research and Studies
Abstract: In large-dataset classification, the number of attributes commonly grows over time, and many dynamic learning strategies have been proposed, such as ensemble networks and incremental neural networks. An ensemble network is a learning paradigm in which many neural networks are jointly used to solve a problem. The relationship between the ensemble and its component neural networks is analyzed in the context of classification within an integrated framework. This analysis reveals that it may be better to use many neural networks rather than a single incremental neural network. Most ensemble approaches use entirely different classifiers for prediction; an appropriate neural network can then be selected from the set of available ensemble members. To this end, a Distributed Reordering Technique (DRT) is proposed. DRT is an enhanced algorithm based on distributed randomization over different neural networks. Weights are randomly assigned to the networks and then evolved, so that each neural network can be characterized by a fitness value reflecting its contribution to a better result. The integrated ensemble-network framework is supported by the selection of the neural networks, based on their outputs and weights, that make up the ensemble. An experimental study shows that, compared with ensemble approaches such as Bagging, DRT can generate a neural network with enhanced performance and stronger generalization ability. Furthermore, the use of DRT for neural network classifiers is practical and relevant to classification systems for large data and can be applied to different large data dimensions in future work.
Keywords: Ensemble Network; Incremental Learning; Large Data; Neural Network
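The abstract's core idea of randomly weighting ensemble members and then selecting a subset by fitness can be illustrated with a toy sketch. The actual DRT algorithm, its fitness measure, and its reordering rule are not specified in the abstract, so every function and parameter below (`make_threshold_classifier`, `select_ensemble`, the `weight × fitness` score, `keep=3`) is an illustrative assumption, not the authors' method.

```python
import random

def make_threshold_classifier(threshold):
    """A toy stand-in for a trained network: predicts 1 if x >= threshold."""
    return lambda x: 1 if x >= threshold else 0

def fitness(clf, data):
    """Fraction of (x, label) pairs the classifier predicts correctly."""
    return sum(clf(x) == y for x, y in data) / len(data)

def select_ensemble(classifiers, data, rng, keep=3):
    """Randomly weight each member, score it by weight * fitness,
    reorder by score, and keep the top `keep` members."""
    scored = []
    for clf in classifiers:
        w = rng.random()                     # randomly assigned weight
        scored.append((w * fitness(clf, data), w, clf))
    scored.sort(key=lambda t: t[0], reverse=True)
    return [(w, clf) for _, w, clf in scored[:keep]]

def ensemble_predict(members, x):
    """Weighted vote over the selected members."""
    vote = sum(w * (1 if clf(x) == 1 else -1) for w, clf in members)
    return 1 if vote >= 0 else 0
```

Usage: given labeled data with a true decision threshold of 0.5 and a pool of candidate threshold classifiers, `select_ensemble` keeps the highest-scoring members and `ensemble_predict` combines them by weighted vote.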