Illumination-aware divide-and-conquer network for improperly-exposed image enhancement.

Journal: Neural Networks: the official journal of the International Neural Network Society
PMID:

Abstract

Improperly-exposed images often exhibit unsatisfactory visual characteristics such as inadequate illumination, low contrast, and the loss of small structures and details. The mapping from an improperly-exposed condition to a well-exposed one may vary significantly because multiple exposure conditions exist. Consequently, enhancement methods that do not pay specific attention to this issue tend to yield inconsistent results when applied to the same scene under different exposure conditions. To obtain consistent enhancement results across various exposures while restoring rich details, we propose an illumination-aware divide-and-conquer network (IDNet). Specifically, to avoid directly learning a sophisticated nonlinear mapping from an improperly-exposed condition to a well-exposed one, we utilize the discrete wavelet transform (DWT) to decompose the image into a low-frequency (LF) component, which primarily captures brightness and contrast, and high-frequency (HF) components that depict fine-scale structures. To mitigate the inconsistency of correction across various exposures, we extract a conditional feature from the input that represents illumination-related global information. This feature is then used to modulate dynamic convolution weights, enabling precise correction of the LF component. Furthermore, as the co-located positions of the LF and HF components are highly correlated, we create a mask to distill useful knowledge from the corrected LF component and integrate it into the HF components to support the restoration of fine-scale details. Extensive experimental results demonstrate that the proposed IDNet outperforms several state-of-the-art enhancement methods on two multi-exposure datasets.
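The DWT decomposition that the abstract builds on can be illustrated with a single-level 2D Haar transform: the LF subband averages 2x2 neighborhoods (capturing brightness and contrast), while the three HF subbands hold horizontal, vertical, and diagonal detail. This is a minimal NumPy sketch for intuition only; the function names and the choice of the Haar wavelet are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2D Haar DWT of an even-sized grayscale image.

    Returns the low-frequency (LF) approximation and the three
    high-frequency (HF) detail subbands, each half the input size.
    """
    a = img[0::2, 0::2]  # top-left pixel of each 2x2 block
    b = img[0::2, 1::2]  # top-right
    c = img[1::2, 0::2]  # bottom-left
    d = img[1::2, 1::2]  # bottom-right
    lf = (a + b + c + d) / 2.0    # local average: brightness/contrast
    hf_h = (a - b + c - d) / 2.0  # horizontal detail
    hf_v = (a + b - c - d) / 2.0  # vertical detail
    hf_d = (a - b - c + d) / 2.0  # diagonal detail
    return lf, (hf_h, hf_v, hf_d)

def haar_idwt2(lf, hfs):
    """Inverse transform: perfectly reconstruct the image from the subbands."""
    hf_h, hf_v, hf_d = hfs
    a = (lf + hf_h + hf_v + hf_d) / 2.0
    b = (lf - hf_h + hf_v - hf_d) / 2.0
    c = (lf + hf_h - hf_v - hf_d) / 2.0
    d = (lf - hf_h - hf_v + hf_d) / 2.0
    out = np.empty((lf.shape[0] * 2, lf.shape[1] * 2))
    out[0::2, 0::2] = a
    out[0::2, 1::2] = b
    out[1::2, 0::2] = c
    out[1::2, 1::2] = d
    return out
```

Because the transform is invertible, a network can correct the LF subband (exposure/contrast) and the HF subbands (fine detail) separately and then reassemble the enhanced image with the inverse DWT, which is the divide-and-conquer structure the abstract describes.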

Authors

  • Fenggang Han
    School of Computer and Electronic Information, Guangxi University, Nanning 530004, China. Electronic address: hfg@st.gxu.edu.cn.
  • Kan Chang
    School of Computer and Electronic Information, Guangxi University, Nanning, Guangxi 530004, China; Guangxi Key Laboratory of Multimedia Communications and Network Technology, Guangxi University, Nanning, Guangxi 530004, China. Electronic address: changkan0@gmail.com.
  • Guiqing Li
    School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006, China. Electronic address: ligq@scut.edu.cn.
  • Mingyang Ling
    School of Computer and Electronic Information, Guangxi University, Nanning 530004, China. Electronic address: lingmy@st.gxu.edu.cn.
  • Mengyuan Huang
    Academy of Artificial Intelligence, Beijing Institute of Petrochemical Technology, Beijing, 102617, China.
  • Zan Gao