Article Information

  • Title: GENERATING SYNTHETIC TRAINING DATA FOR OBJECT DETECTION USING MULTI-TASK GENERATIVE ADVERSARIAL NETWORKS
  • Authors: Y. Lin; K. Suzuki; H. Takeda
  • Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
  • Print ISSN: 2194-9042
  • Electronic ISSN: 2194-9050
  • Year: 2020
  • Volume: V-2-2020
  • Pages: 443-449
  • DOI: 10.5194/isprs-annals-V-2-2020-443-2020
  • Language: English
  • Publisher: Copernicus Publications
  • Abstract: Digitizing roadside objects such as traffic signs is a necessary step for generating High Definition Maps (HD Maps) and remains an open challenge. The rapid development of deep learning based on Convolutional Neural Networks (CNNs) has achieved great success in computer vision in recent years. However, the performance of most deep learning algorithms depends heavily on the quality of the training data. Collecting the desired training dataset is difficult, especially for roadside objects, whose numbers along the roadside are imbalanced. Although training neural networks on synthetic data has been proposed, the distribution gap between synthetic and real data still exists and can degrade performance. We propose to transfer the style between synthetic and real data using Multi-Task Generative Adversarial Networks (SYN-MTGAN) before training the neural network that detects roadside objects. Experiments focusing on traffic signs show that our proposed method reaches a mAP of 0.77 and improves detection performance for objects whose training samples are difficult to collect.
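
The abstract describes a two-stage pipeline: a generative network first maps synthetic images into the style of real roadside imagery, and an object detector is then trained on the translated data. The sketch below illustrates that flow only in outline; the SYN-MTGAN architecture, its losses, and the detector used in the paper are not given in this record, so the generator, the torchvision Faster R-CNN stand-in, and the helper load_pretrained_syn_to_real_generator are assumptions made purely for illustration.

# Minimal sketch (assumption): style-transfer synthetic images with a pre-trained
# generator, then train an off-the-shelf detector on the translated data.
# The generator here is a hypothetical stand-in for SYN-MTGAN.
import torch
import torchvision  # used for the example Faster R-CNN detector below

def stylize_synthetic(generator: torch.nn.Module, synthetic_images: torch.Tensor) -> torch.Tensor:
    """Map a batch of synthetic images into the style of real roadside imagery."""
    generator.eval()
    with torch.no_grad():
        return generator(synthetic_images)

def train_detector(detector, stylized_images, targets, epochs=10, lr=1e-4):
    """Train an object detector (e.g. traffic-sign detection) on stylized synthetic data."""
    optimizer = torch.optim.Adam(detector.parameters(), lr=lr)
    detector.train()
    for _ in range(epochs):
        # torchvision detection models return a dict of losses in training mode
        losses = detector(list(stylized_images), targets)
        loss = sum(losses.values())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return detector

# Usage (all names and shapes are placeholders, not the paper's implementation):
# generator = load_pretrained_syn_to_real_generator()        # hypothetical helper
# detector  = torchvision.models.detection.fasterrcnn_resnet50_fpn(num_classes=2)
# stylized  = stylize_synthetic(generator, synthetic_batch)  # (N, 3, H, W) in [0, 1]
# detector  = train_detector(detector, stylized, targets)    # targets: list of {boxes, labels}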