Journal: Proceedings of the National Academy of Sciences
Print ISSN: 0027-8424
Online ISSN: 1091-6490
Year: 2020
Volume: 117
Issue: 22
Pages: 12004-12010
DOI: 10.1073/pnas.1920913117
Publisher: The National Academy of Sciences of the United States of America
Abstract: A catalytic prior distribution is designed to stabilize a high-dimensional “working model” by shrinking it toward a “simplified model.” The shrinkage is achieved by supplementing the observed data with a small amount of “synthetic data” generated from a predictive distribution under the simpler model. We apply this framework to generalized linear models, where we propose various strategies for specifying a tuning parameter that governs the degree of shrinkage, and we study the resulting theoretical properties. In simulations, posterior estimation using such a catalytic prior outperforms maximum likelihood estimation under the working model and is generally comparable with, or superior to, existing competitive methods in terms of the frequentist prediction accuracy of point estimation and the coverage accuracy of interval estimation. Catalytic priors have simple interpretations and are easy to formulate.
Keywords: Bayesian priors; synthetic data; stable estimation; predictive distribution; regularization
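
The abstract describes the catalytic-prior construction operationally: draw synthetic observations from a predictive distribution under a simplified model, then down-weight them relative to the observed data so that the synthetic sample contributes a fixed total weight (the tuning parameter governing shrinkage). What follows is a minimal Python sketch of this idea for logistic regression, under stated assumptions: the simplified model is taken to be an intercept-only logistic model, synthetic covariates are resampled column-wise from the observed design matrix, and the synthetic sample size M and total prior weight tau are illustrative defaults, not values from the paper.

# Minimal sketch of posterior-mode estimation under a catalytic prior for
# logistic regression. The intercept-only "simplified model", the column-wise
# resampling of synthetic covariates, and the defaults M=400, tau=4.0 are
# illustrative assumptions, not the paper's exact specification.
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(beta, X, y, w):
    """Weighted negative log-likelihood of a logistic regression model."""
    eta = X @ beta
    # np.logaddexp(0, eta) computes log(1 + exp(eta)) stably.
    return np.sum(w * (np.logaddexp(0.0, eta) - y * eta))

def catalytic_fit(X, y, M=400, tau=4.0, seed=0):
    """Weighted MLE on observed data plus down-weighted synthetic data.

    X is assumed to contain an intercept column of ones.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Simplified model: intercept-only logistic, i.e. a (smoothed) proportion.
    p0 = (y.sum() + 1.0) / (n + 2.0)
    # Synthetic covariates: resample each column of X independently.
    X_syn = np.column_stack([rng.choice(X[:, j], size=M) for j in range(p)])
    # Synthetic responses drawn from the simplified model's predictive law.
    y_syn = rng.binomial(1, p0, size=M).astype(float)
    # Observed points get weight 1; synthetic points share total weight tau,
    # so the prior acts like tau extra observations from the simplified model.
    X_all = np.vstack([X, X_syn])
    y_all = np.concatenate([y, y_syn])
    w_all = np.concatenate([np.ones(n), np.full(M, tau / M)])
    res = minimize(neg_log_lik, np.zeros(p), args=(X_all, y_all, w_all),
                   method="BFGS")
    return res.x

# Usage: a small-n, moderate-p setting where the plain MLE can be unstable
# or fail to exist (e.g., under data separation); the catalytic fit remains
# finite because the synthetic data regularize the likelihood.
rng = np.random.default_rng(1)
n, p = 30, 10
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = rng.normal(scale=0.5, size=p)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)
print(catalytic_fit(X, y))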