Journal: Journal of Theoretical and Applied Information Technology
Print ISSN: 1992-8645
Online ISSN: 1817-3195
Year: 2016
Volume: 92
Issue: 1
Publisher: Journal of Theoretical and Applied Information Technology
Abstract: Ensemble methods, or multiple classifier systems that combine the decisions of many base classifiers, have been shown to outperform any single classifier. Despite their ability to produce the highest classification accuracy, ensemble methods suffer significantly from the large number of base classifiers they carry. In our previous work, we proposed a novel soft set based method to prune base classifiers from a heterogeneous ensemble committee and demonstrated its ability to remove a substantial number of classifiers while still producing the highest prediction accuracy. However, that pruning method only suggests a subset of relevant classifiers; the search for the best, optimized subset of classifiers was not yet considered. In this paper, we extend our research by proposing a new soft ensemble selection and optimization method that finds the best subset of the pruned classifiers by checking all combinations of them. The proposed method is systematically evaluated on the Customer Churn dataset taken from the UC Irvine Machine Learning Repository. The results show that the proposed method is able to find the minimum number of classifiers in the ensemble repository while maintaining or improving classification performance.
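The abstract describes selecting the best subset by checking all combinations of the pruned classifiers. The paper's own algorithm is not reproduced here; the following is only a minimal illustrative sketch of that exhaustive combination search, assuming integer-encoded class labels, a dictionary of already-pruned fitted classifiers, majority voting as the combination rule, and accuracy on a held-out validation set as the selection criterion. The names `select_best_subset`, `majority_vote`, and `pruned_classifiers` are hypothetical and not taken from the paper.

```python
from itertools import combinations
import numpy as np

def majority_vote(predictions):
    # predictions: list/array of shape (n_classifiers, n_samples)
    # containing integer-encoded class labels (assumption).
    preds = np.asarray(predictions)
    # For each sample, return the most frequent predicted label.
    return np.apply_along_axis(
        lambda col: np.bincount(col).argmax(), axis=0, arr=preds
    )

def select_best_subset(pruned_classifiers, X_val, y_val):
    """Exhaustively check all combinations of pruned classifiers and
    return the smallest subset with the highest majority-vote accuracy
    (a sketch of the combination checking described in the abstract)."""
    # Cache validation predictions of every pruned classifier.
    all_preds = {name: clf.predict(X_val)
                 for name, clf in pruned_classifiers.items()}
    names = list(pruned_classifiers)
    best_subset, best_acc = None, -1.0
    # Enumerate subsets from smallest to largest, so at equal accuracy
    # the smaller ensemble is kept.
    for size in range(1, len(names) + 1):
        for subset in combinations(names, size):
            combined = majority_vote([all_preds[n] for n in subset])
            acc = float(np.mean(combined == y_val))
            if acc > best_acc:
                best_subset, best_acc = subset, acc
    return best_subset, best_acc
```

Because every combination of the pruned pool is evaluated, the search is exponential in the number of remaining classifiers; this is only practical after pruning has already reduced the pool to a small number of candidates, which is consistent with the two-stage (prune, then select) process the abstract outlines.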