
Article Information

  • Title: GEOMETRIC AND NON-LINEAR RADIOMETRIC DISTORTION ROBUST MULTIMODAL IMAGE MATCHING VIA EXPLOITING DEEP FEATURE MAPS
  • Authors: M. Chen; Y. Zhao; T. Fang
  • Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
  • Print ISSN: 2194-9042
  • Electronic ISSN: 2194-9050
  • Year: 2020
  • Volume: V-3-2020
  • Pages: 233-240
  • DOI: 10.5194/isprs-annals-V-3-2020-233-2020
  • Language: English
  • Publisher: Copernicus Publications
  • Abstract: Image matching is a fundamental issue in multimodal image fusion. Most recent research focuses only on the non-linear radiometric distortion between coarsely registered multimodal images. The global geometric distortion between the images must be eliminated using prior information (e.g., direct geo-referencing information and ground sample distance) before these methods can be used to find correspondences. However, such prior information is not always available or accurate enough; in that case, users have to select ground control points manually to register the images before these methods can work, and the methods fail otherwise. To overcome this problem, we propose a robust deep learning-based multimodal image matching method that deals with geometric and non-linear radiometric distortion simultaneously by exploiting deep feature maps. We observe that some deep feature maps have similar grayscale distributions, so correspondences can be found on these maps using traditional geometric-distortion-robust matching methods even when significant non-linear radiometric differences exist between the original images. Therefore, we can focus only on geometric distortion when dealing with deep feature maps, and only on non-linear radiometric distortion when measuring patch similarity. Experimental results demonstrate that the proposed method outperforms state-of-the-art matching methods on multimodal images with both geometric and non-linear radiometric distortion.
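The workflow suggested by the abstract can be prototyped roughly as follows. The sketch below is not the authors' implementation; it only illustrates the general idea of running a classical geometric-distortion-robust detector (SIFT here) on deep feature maps instead of on the raw images. The backbone (a shallow VGG-16 block), the layer cut, the fixed channel index, and the RANSAC homography step are all assumptions made for illustration.

```python
# Minimal sketch (assumed details, not the paper's code): extract a shallow CNN
# feature map for each modality, then match with SIFT + ratio test + RANSAC.
import cv2
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF

# Assumed backbone/cut: first VGG-16 block (conv1_1 .. pool1), whose early
# feature maps still retain fine spatial detail for keypoint detection.
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:5].eval()

def feature_map(img_bgr, channel=0):
    """Return one deep feature channel, rescaled to 8-bit for keypoint detection."""
    rgb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB)
    x = TF.to_tensor(rgb).unsqueeze(0)                    # 1 x 3 x H x W
    with torch.no_grad():
        fmap = vgg(x)[0, channel].numpy()                 # roughly H/2 x W/2 after pool1
    fmap = cv2.normalize(fmap, None, 0, 255, cv2.NORM_MINMAX)
    return fmap.astype(np.uint8)

def match_on_feature_maps(map_a, map_b, ratio=0.8):
    """SIFT + ratio test on the feature maps; RANSAC absorbs the geometric distortion."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(map_a, None)
    kp_b, des_b = sift.detectAndCompute(map_b, None)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_a, des_b, k=2)
    good = [m[0] for m in pairs if len(m) == 2 and m[0].distance < ratio * m[1].distance]
    src = np.float32([kp_a[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_b[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, (int(mask.sum()) if mask is not None else 0)
```

In this sketch the channel index is fixed by hand; the abstract's key observation, that only some feature channels share a similar grayscale distribution across modalities, corresponds to choosing that channel well, and the paper's patch-similarity step for radiometric distortion is not reproduced here.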