
Article Information

  • Title: Convergence of Stochastic Vector Quantization and Learning Vector Quantization with Bregman Divergences
  • Authors: Christos N. Mavridis; John S. Baras
  • Journal: IFAC PapersOnLine
  • Print ISSN: 2405-8963
  • Year: 2020
  • Volume: 53
  • Issue: 2
  • Pages: 2214-2219
  • DOI: 10.1016/j.ifacol.2020.12.006
  • Language: English
  • Publisher: Elsevier
  • Abstract: Stochastic vector quantization methods have been extensively studied in supervised and unsupervised learning problems as algorithms that are online, data-driven, interpretable, robust, and fast to train and evaluate. Being prototype-based methods, they depend on a dissimilarity measure; when the mean value is used as the representative of each cluster, it is both necessary and sufficient that this measure belong to the family of Bregman divergences. In this work, we investigate the convergence properties of stochastic vector quantization (VQ) and its supervised counterpart, Learning Vector Quantization (LVQ), using Bregman divergences. We employ the theory of stochastic approximation to study the conditions on the initialization and the Bregman divergence generating functions under which the algorithms converge to desired configurations. These results formally support the use of Bregman divergences, such as the Kullback-Leibler divergence, in vector quantization algorithms.
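The abstract describes online VQ/LVQ updates whose dissimilarity measure is a Bregman divergence and whose convergence follows from stochastic approximation. Below is a minimal Python sketch of this setting, not the authors' implementation: the generalized Kullback-Leibler divergence stands in for the Bregman generator, the 1/t step-size schedule is one choice satisfying the Robbins-Monro conditions, and all function names and the winner-take-all update rule are illustrative assumptions.

```python
import numpy as np

def kl_divergence(x, mu, eps=1e-12):
    # Generalized KL divergence d_phi(x, mu) with phi(z) = sum(z*log(z)),
    # a standard member of the Bregman family (assumes nonnegative data).
    x, mu = np.clip(x, eps, None), np.clip(mu, eps, None)
    return float(np.sum(x * np.log(x / mu) - x + mu))

def stochastic_vq(data, n_prototypes, n_steps, divergence=kl_divergence, seed=0):
    # Unsupervised stochastic VQ: the winning prototype (minimal Bregman
    # divergence to the sample) moves toward the sample with a decaying
    # step size. For any Bregman divergence the cluster centroid is the
    # arithmetic mean, which is why this simple additive update applies.
    rng = np.random.default_rng(seed)
    protos = data[rng.choice(len(data), n_prototypes, replace=False)].copy()
    for t in range(1, n_steps + 1):
        x = data[rng.integers(len(data))]
        w = int(np.argmin([divergence(x, mu) for mu in protos]))
        alpha = 1.0 / t  # Robbins-Monro: sum(alpha)=inf, sum(alpha^2)<inf
        protos[w] += alpha * (x - protos[w])
    return protos

def stochastic_lvq(data, labels, protos, proto_labels, n_steps,
                   divergence=kl_divergence, seed=0):
    # Supervised counterpart (an LVQ1-style rule): attract the winner if
    # its class label matches the sample's label, repel it otherwise.
    rng = np.random.default_rng(seed)
    protos = protos.copy()
    for t in range(1, n_steps + 1):
        i = rng.integers(len(data))
        x, y = data[i], labels[i]
        w = int(np.argmin([divergence(x, mu) for mu in protos]))
        alpha = 1.0 / t
        sign = 1.0 if proto_labels[w] == y else -1.0
        protos[w] += sign * alpha * (x - protos[w])
    return protos
```

The paper's contribution is the analysis of when such iterations converge; the sketch above only mirrors the algorithmic form, with convergence depending on the initialization and the Bregman generating function as studied in the paper.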
  • Keywords: learning algorithms; stochastic approximation; convergence proofs