Journal: American Journal of Computational Mathematics
Print ISSN: 2161-1203
Online ISSN: 2161-1211
Year: 2013
Volume: 3
Issue: 1
Pages: 16-26
DOI: 10.4236/ajcm.2013.31003
Publisher: Scientific Research Publishing
Abstract: We present a new derivative-free optimization algorithm based on sparse grid numerical integration. The algorithm applies to smooth nonlinear objective functions whose gradients are unavailable and whose evaluations are expensive. The new algorithm has: 1) a unique starting-point strategy; 2) an effective global search heuristic; and 3) consistent local convergence. These are achieved through a uniform use of sparse grid numerical integration. Numerical experiments indicate that the algorithm is accurate and efficient, and that it benchmarks favourably against several state-of-the-art derivative-free algorithms.
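The abstract describes the algorithm only at a high level, so the following Python snippet is a minimal sketch of the underlying idea rather than the authors' method: estimating a descent direction purely from function values by applying a quadrature rule to a Gaussian-smoothed surrogate of the objective. For readability it uses a full tensor-product Gauss-Hermite grid in two dimensions; the paper's sparse (Smolyak-type) grid, starting-point strategy, and global search heuristic are not reproduced here. All names and parameters (smoothed_gradient, derivative_free_descent, sigma, deg, step) are hypothetical.

```python
import numpy as np
from itertools import product

def smoothed_gradient(f, x, sigma, deg=5):
    """Derivative-free estimate of the gradient of a Gaussian-smoothed
    surrogate f_sigma(x) = E[f(x + sigma*u)], u ~ N(0, I), via the
    identity grad f_sigma(x) = E[f(x + sigma*u) * u] / sigma.
    The expectation is approximated with a tensor-product Gauss-Hermite
    grid; a Smolyak sparse grid would replace it in higher dimensions."""
    d = len(x)
    nodes, weights = np.polynomial.hermite_e.hermegauss(deg)
    weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the N(0,1) measure
    grad = np.zeros(d)
    for idx in product(range(deg), repeat=d):  # exponential in d; sparse grids avoid this
        u = nodes[list(idx)]
        w = np.prod(weights[list(idx)])
        grad += w * f(x + sigma * u) * u / sigma
    return grad

def derivative_free_descent(f, x0, sigma=0.5, step=0.1, iters=50):
    """Toy descent loop driven only by function values (hypothetical;
    not the paper's three-part algorithm)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * smoothed_gradient(f, x, sigma)
        sigma *= 0.95                          # gradually sharpen the smoothing
    return x

if __name__ == "__main__":
    # smooth nonlinear test function with minimum near (1, -2)
    f = lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2 + 0.1 * np.sin(3.0 * z[0])
    print(derivative_free_descent(f, [0.0, 0.0]))
```

The smoothing radius sigma plays the role of a trust region: shrinking it over the iterations trades global exploration for local accuracy, which loosely mirrors the global-search-then-local-convergence structure mentioned in the abstract.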