Abstract: We investigate the asymptotic behavior of Bayesian posterior distributions under independent and identically distributed (i.i.d.) misspecified models. More specifically, we study the concentration of the posterior distribution on neighborhoods of f⋆, the density in the model that is closest in the Kullback–Leibler sense to the true density f0. We note, through examples, the need for conditions beyond the usual Kullback–Leibler support assumption. We then investigate consistency with respect to a general metric under three assumptions, each based on a notion of divergence measure, and apply these results to a weighted L1-metric in both convex and non-convex models.
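For reference, f⋆ is the Kullback–Leibler projection of f0 onto the model; writing the model class as \mathcal{F} and the dominating measure as \mu (notation assumed here for illustration, not taken from the original), it can be expressed as
\[
f^{\star} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{F}} \, \mathrm{KL}(f_0 \,\|\, f)
\;=\; \operatorname*{arg\,min}_{f \in \mathcal{F}} \int f_0 \log \frac{f_0}{f}\, d\mu .
\]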