The Fusion of Unmatched Infrared and Visible Images Based on Generative Adversarial Networks
Yuqing Zhao, Guangyuan Fu, Hongqiao Wang and Shaolei Zhang
Mathematical Problems in Engineering, 2020, vol. 2020, 1-12
Abstract:
Visible images contain clear texture information and high spatial resolution but are unreliable at night or under ambient occlusion. Infrared images capture target thermal radiation in day, night, adverse weather, and ambient occlusion conditions, but they often lack good contour and texture information. An increasing number of researchers therefore fuse visible and infrared images to obtain more information, which normally requires two completely matched images; in practice, however, perfectly matched visible and infrared images are difficult to obtain. In view of these issues, we propose a new network model based on generative adversarial networks (GANs) to fuse unmatched infrared and visible images. Our method generates the corresponding infrared image from a visible image and fuses the two images to obtain more information. The effectiveness of the proposed method is verified qualitatively and quantitatively through experiments on public datasets. In addition, the fused images produced by the proposed method contain more abundant texture and thermal radiation information than those of other methods.
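The abstract's core idea, generating a pseudo-infrared image from a visible image with a GAN and then fusing the pair, can be sketched as below. This is a minimal illustrative sketch, not the authors' released code: the layer sizes, the adversarial loss, and the blending-based fusion rule are assumptions made for the example.

```python
# Hypothetical sketch (assumptions, not the paper's architecture): a small GAN
# that translates a 3-channel visible image into a 1-channel pseudo-infrared
# image, plus a toy fusion rule that blends the pair.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a visible image (N, 3, H, W) to a pseudo-infrared image (N, 1, H, W)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, visible):
        return self.net(visible)

class Discriminator(nn.Module):
    """Scores whether an infrared image looks real (patch-wise logits)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 1, 4, padding=1),
        )

    def forward(self, infrared):
        return self.net(infrared)

def fuse(visible, pseudo_infrared, alpha=0.5):
    """Toy fusion rule (assumption): blend the visible luminance with the
    generated infrared channel. The paper's actual fusion strategy may differ."""
    luminance = visible.mean(dim=1, keepdim=True)  # rough grayscale of the visible image
    return alpha * luminance + (1 - alpha) * pseudo_infrared

def train_step(gen, disc, opt_g, opt_d, visible, real_infrared):
    """One adversarial step on unmatched batches: the visible and infrared
    images need not depict the same scenes."""
    bce = nn.BCEWithLogitsLoss()

    # Discriminator update: real infrared vs. generated pseudo-infrared.
    fake_infrared = gen(visible).detach()
    d_real, d_fake = disc(real_infrared), disc(fake_infrared)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update: fool the discriminator.
    d_fake = disc(gen(visible))
    loss_g = bce(d_fake, torch.ones_like(d_fake))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()

# Example usage with random tensors as stand-ins for image batches:
# gen, disc = Generator(), Discriminator()
# opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
# opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
# vis = torch.rand(4, 3, 128, 128) * 2 - 1
# ir  = torch.rand(4, 1, 128, 128) * 2 - 1
# train_step(gen, disc, opt_g, opt_d, vis, ir)
# fused = fuse(vis, gen(vis))
```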
Date: 2020
Downloads:
http://downloads.hindawi.com/journals/MPE/2020/3739040.pdf (application/pdf)
http://downloads.hindawi.com/journals/MPE/2020/3739040.xml (text/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:jnlmpe:3739040
DOI: 10.1155/2020/3739040