
Article Information

  • Title: Central limit theorems for entropy-regularized optimal transport on finite spaces and statistical applications
  • Authors: Jérémie Bigot; Elsa Cazelles; Nicolas Papadakis
  • Journal: Electronic Journal of Statistics
  • Print ISSN: 1935-7524
  • Year: 2019
  • Volume: 13
  • Issue: 2
  • Pages: 5120-5150
  • DOI: 10.1214/19-EJS1637
  • Language: English
  • Publisher: Institute of Mathematical Statistics
  • Abstract: The notion of entropy-regularized optimal transport, also known as Sinkhorn divergence, has recently gained popularity in machine learning and statistics, as it makes the use of smoothed optimal transport distances feasible for data analysis. The Sinkhorn divergence allows the fast computation of an entropically regularized Wasserstein distance between two probability distributions supported on a finite metric space of (possibly) high dimension. For data sampled from one or two unknown probability distributions, we derive the distributional limits of the empirical Sinkhorn divergence and its centered version (the Sinkhorn loss). We also propose a bootstrap procedure that yields new test statistics for measuring the discrepancies between multivariate probability distributions. Our work is inspired by the results of Sommerfeld and Munk [33] on the asymptotic distribution of the empirical Wasserstein distance on finite spaces with unregularized transportation costs. Incidentally, we also analyze the asymptotic distribution of entropy-regularized Wasserstein distances when the regularization parameter tends to zero. Simulated and real datasets are used to illustrate our approach. (A computational sketch of the Sinkhorn iteration appears after this record.)
  • Keywords: Optimal transport; Sinkhorn divergence; central limit theorem; bootstrap; hypothesis testing
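
The entropy-regularized transport distance described in the abstract is usually computed with Sinkhorn's matrix-scaling iterations. The following is a minimal Python/NumPy sketch of that standard fixed-point iteration on a finite space; the function name sinkhorn, the toy 1-D histograms, the squared-Euclidean ground cost, and the choice to report the transport cost <P, C> of the regularized plan are illustrative assumptions, not the authors' implementation or exact divergence convention.

import numpy as np

def sinkhorn(a, b, C, eps, n_iter=1000):
    """Entropy-regularized OT between histograms a and b with ground cost C.

    A minimal sketch of the standard Sinkhorn fixed-point iteration,
    not the authors' implementation. Returns the transport plan P and
    the transport cost <P, C> of the regularized plan.
    """
    K = np.exp(-C / eps)                 # Gibbs kernel
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)                  # scale rows to match marginal a
        v = b / (K.T @ u)                # scale columns to match marginal b
    P = u[:, None] * K * v[None, :]      # plan diag(u) K diag(v)
    return P, float(np.sum(P * C))

# Toy example: two histograms on a regular 1-D grid (illustrative data).
x = np.linspace(0.0, 1.0, 50)
C = (x[:, None] - x[None, :]) ** 2       # squared-Euclidean ground cost
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
P, W_eps = sinkhorn(a, b, C, eps=0.01)

The parameter eps controls the trade-off the abstract alludes to: smaller values bring the regularized cost closer to the unregularized Wasserstein distance (the regime the paper analyzes as the regularization parameter tends to zero), at the price of slower and numerically less stable iterations.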