
Article Information

  • Title: β-Variational Classifiers Under Attack
  • Authors: Marco Maggipinto; Matteo Terzi; Gian Antonio Susto
  • Journal: IFAC PapersOnLine
  • Print ISSN: 2405-8963
  • Year: 2020
  • Volume: 53
  • Issue: 2
  • Pages: 7903-7908
  • DOI:10.1016/j.ifacol.2020.12.1979
  • Language: English
  • Publisher: Elsevier
  • Abstract: Deep neural networks have gained much attention in recent years thanks to breakthroughs in the field of Computer Vision. However, despite their popularity, they have been shown to provide limited robustness in their predictions. In particular, it is possible to synthesise small adversarial perturbations that imperceptibly modify a correctly classified input, causing the network to confidently misclassify it. This has led to a plethora of methods that attempt to improve robustness or to detect the presence of such perturbations. In this paper, we analyse β-Variational Classifiers, a class of models that not only solve a specific classification task but also provide a generative component able to generate new samples from the input distribution. In more detail, we study their robustness and detection capabilities, together with some novel insights on the generative part of the model.
  • Keywords: Adversarial Training; Computer Vision; Deep Learning; Machine Learning; Robustness
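The adversarial perturbations the abstract refers to can be illustrated with a minimal sketch. This is a hypothetical example (not the paper's β-Variational setup) using the fast gradient sign method on a toy logistic-regression classifier; all names, weights, and the epsilon value are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps a score to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, y, w, b, epsilon):
    """Shift x by epsilon in the direction that increases the loss.

    For a logistic model p = sigmoid(w.x + b) with cross-entropy loss,
    the gradient of the loss w.r.t. the input is (p - y) * w, so the
    FGSM perturbation is epsilon * sign((p - y) * w).
    """
    p = sigmoid(w @ x + b)       # predicted probability of class 1
    grad_x = (p - y) * w         # gradient of cross-entropy w.r.t. x
    return x + epsilon * np.sign(grad_x)

# Toy model and a point correctly classified as class 1 (illustrative values)
w = np.array([2.0, -1.0])
b = 0.0
x = np.array([1.0, 0.5])
p_clean = sigmoid(w @ x + b)     # > 0.5: correctly classified

# After a small sign-based perturbation, the same point is misclassified
x_adv = fgsm_perturb(x, y=1.0, w=w, b=b, epsilon=0.8)
p_adv = sigmoid(w @ x_adv + b)   # < 0.5: confident misclassification
```

The perturbation is bounded in the L∞ norm by epsilon, which is the sense in which such attacks can remain "small" while flipping the prediction; for image classifiers the same construction uses the network's backpropagated input gradient instead of the closed-form one above.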