Two-scale decomposition and deep learning fusion for visible and infrared images

International Journal of Electrical and Computer Engineering

Abstract

This paper focuses on the fusion of visible and infrared images to generate composite images that preserve both the thermal radiation information from the infrared spectrum and the detailed texture from the visible spectrum. The proposed approach combines traditional methods, such as two-scale decomposition, with deep learning techniques, specifically employing an autoencoder architecture. The source images are subjected to two-scale decomposition, which separates high-frequency detail information from low-frequency base information. Additionally, an algorithm unravelling technique establishes a logical connection between deep neural networks and traditional signal processing algorithms. The model consists of two encoders for decomposition and a decoder after the unravelling operation. During testing, a fusion layer merges the decomposed feature maps, and the decoder generates the fused image. Evaluation metrics including entropy, average gradient, spatial frequency, and standard deviation are employed to objectively assess fusion quality. The proposed approach demonstrates promise for effectively combining visible and infrared imagery in various applications.
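
The following is a minimal NumPy/SciPy sketch of the pipeline stages named in the abstract: a two-scale decomposition into base and detail layers, and the four evaluation metrics. The mean (box) filter, its window size, and the simple fusion rules used here are assumptions standing in for the paper's learned encoders, fusion layer, and decoder, and the metric formulas follow common definitions that may differ in detail from those used in the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def two_scale_decompose(img, size=31):
    """Split an image into a low-frequency base layer and a
    high-frequency detail layer using a mean (box) filter.
    The filter type and window size are assumptions; the abstract
    does not specify them."""
    base = uniform_filter(img.astype(np.float64), size=size)
    detail = img - base
    return base, detail


def entropy(img):
    """Shannon entropy of the grey-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=256, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))


def average_gradient(img):
    """Mean magnitude of the local intensity gradient."""
    gx, gy = np.gradient(img.astype(np.float64))
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))


def spatial_frequency(img):
    """Spatial frequency: SF = sqrt(RF^2 + CF^2), where RF and CF are
    the RMS of horizontal and vertical first differences."""
    img = img.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)


if __name__ == "__main__":
    # Random stand-ins for registered visible and infrared frames.
    vis = np.random.randint(0, 256, (256, 256)).astype(np.float64)
    ir = np.random.randint(0, 256, (256, 256)).astype(np.float64)

    b_v, d_v = two_scale_decompose(vis)
    b_i, d_i = two_scale_decompose(ir)

    # Placeholder fusion: average the base layers and keep the
    # larger-magnitude detail; in the paper these rules are replaced
    # by the learned encoders, fusion layer, and decoder.
    fused = (b_v + b_i) / 2 + np.where(np.abs(d_v) >= np.abs(d_i), d_v, d_i)

    print("entropy:", entropy(np.clip(fused, 0, 255)))
    print("average gradient:", average_gradient(fused))
    print("spatial frequency:", spatial_frequency(fused))
    print("standard deviation:", fused.std())
```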
