Journal: Neural Information Processing: Letters and Reviews
Electronic ISSN: 1738-2532
Year: 2008
Volume: 12
Issue: 1-3
Pages: 21-30
Publisher: Neural Information Processing
Abstract: In hierarchical learning machines such as neural networks, Bayesian learning provides better generalization performance than maximum likelihood estimation. However, its accurate approximation with the Markov chain Monte Carlo (MCMC) method requires a huge computational cost. The exchange Monte Carlo (EMC) method was proposed as an improvement on the MCMC method. Although it has proven effective not only in Bayesian learning but also in many other fields, the mathematical foundation of the EMC method has not yet been established. In our previous work, we derived the asymptotic behavior of the average exchange ratio, which is used as a criterion for designing the EMC method. In this paper, we verify the accuracy of our theoretical result through simulations of Bayesian learning in linear neural networks, and propose a method for checking the convergence of the EMC method based on this result.
Keywords: Markov Chain Monte Carlo Method, Exchange Monte Carlo Method, Exchange Ratio
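
For reference, the following is a minimal sketch of the exchange Monte Carlo (replica exchange) procedure and of the average exchange ratio mentioned in the abstract. The double-well energy function, temperature ladder, step size, and iteration counts below are illustrative assumptions and are not taken from the paper; in the paper's setting the energy would be the Bayesian-learning error term of a linear neural network.

import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Toy 1-D double-well energy; a stand-in for the Bayesian error function.
    return (x ** 2 - 1.0) ** 2

# Inverse temperatures (beta), from the hottest replica to beta = 1.
betas = np.geomspace(0.05, 1.0, num=8)
K = len(betas)

x = rng.normal(size=K)          # one state per replica
n_sweeps, step = 5000, 0.5
swap_attempts = np.zeros(K - 1)
swap_accepts = np.zeros(K - 1)

for _ in range(n_sweeps):
    # Metropolis update within each replica at its own temperature.
    for k in range(K):
        prop = x[k] + step * rng.normal()
        dE = energy(prop) - energy(x[k])
        if np.log(rng.random()) < -betas[k] * dE:
            x[k] = prop

    # Exchange step: attempt swaps between neighbouring temperatures,
    # accepted with probability min(1, exp((b_{k+1}-b_k)(E_{k+1}-E_k))).
    for k in range(K - 1):
        swap_attempts[k] += 1
        delta = (betas[k + 1] - betas[k]) * (energy(x[k + 1]) - energy(x[k]))
        if np.log(rng.random()) < delta:
            x[k], x[k + 1] = x[k + 1], x[k]
            swap_accepts[k] += 1

# Average exchange ratio between adjacent replicas: the quantity whose
# asymptotic behavior is analyzed in the paper as a design/convergence criterion.
exchange_ratio = swap_accepts / swap_attempts
print("average exchange ratios:", np.round(exchange_ratio, 3))

Monitoring these per-pair exchange ratios, and comparing their observed values with a theoretical prediction, is the kind of diagnostic the abstract refers to; the temperature ladder is typically adjusted so that the ratios stay away from zero for all adjacent pairs.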