CN115619670A - Method, system and related equipment for enhancing low-light image - Google Patents

Method, system and related equipment for enhancing low-light image

Info

Publication number
CN115619670A
CN115619670A (application CN202211274300.XA)
Authority
CN
China
Prior art keywords
low
light image
image enhancement
model
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211274300.XA
Other languages
Chinese (zh)
Inventor
王华龙
杨标
吴均城
李泽辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Foshan Nanhai Guangdong Technology University CNC Equipment Cooperative Innovation Institute
Original Assignee
Foshan Nanhai Guangdong Technology University CNC Equipment Cooperative Innovation Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foshan Nanhai Guangdong Technology University CNC Equipment Cooperative Innovation Institute
Priority to CN202211274300.XA
Publication of CN115619670A
Legal status: Pending

Classifications

    • G06T5/92
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Abstract

A method, system and related device for low-light image enhancement, the method comprising the steps of: S1, constructing a low-light image enhancement model for enhancing low-light images; S2, setting self-calibration parameters of the low-light image enhancement model; S3, setting an unsupervised training loss for the low-light image enhancement model; and S4, training the low-light image enhancement model under preset training parameters according to the self-calibration parameters and the unsupervised training loss, and applying the trained low-light image enhancement model to the low-light image enhancement process. In the method, the low-light image enhancement model adopts a weight-sharing cascade and a self-calibration module is constructed at each stage, so that the results of the stages converge toward one another; by defining an unsupervised training loss, the output of each stage is constrained under the influence of the self-calibration module, which improves the adaptability of the low-light image enhancement model to general scenes.

Description

Method, system and related equipment for enhancing low-light image
Technical Field
The invention relates to the technical field of image processing, and in particular to a method, a system and related equipment for low-light image enhancement.
Background
Low-light image enhancement aims to reveal the information hidden in the dark regions of an image and thereby improve image quality; in recent years this technology has attracted wide attention in a number of emerging computer vision fields. Image enhancement for low-light scenes receives particular attention mainly because the details in such scenes are complex and clear images are difficult to obtain. Existing low-light image enhancement techniques struggle to balance visual quality with computational efficiency and are generally ineffective in unknown complex scenes, so a better low-light image enhancement method is urgently needed.
For technical reasons, most researchers adopt model-based or network-based methods for low-light image enhancement, and these enhancement methods have the following problems: 1. parameters need to be adjusted manually to adapt to real scenes, which easily produces unsatisfactory results in most conditions; 2. the enhancement performance is unstable and may produce unnatural enhancement, poorly processed details and poor exposure results.
Disclosure of Invention
The invention aims to solve the problem that the prior art struggles to balance visual quality and computational efficiency when enhancing low-light images.
To solve the above problem, in a first aspect, an embodiment of the present invention provides a method for enhancing a low-light image, where the method includes the following steps:
s1, constructing a low-light image enhancement model for enhancing a low-light image;
s2, setting self-calibration parameters of the low-light image enhancement model;
s3, setting unsupervised training loss of the low-light image enhancement model;
and S4, training the low-light image enhancement model under preset training parameters according to the self-calibration parameter and the unsupervised training loss, and applying the trained low-light image enhancement model to a low-light image enhancement process.
Further, the low-light image enhancement model in step S1 satisfies the following relational expression (1):
(Formula image: relational expression (1))
In relational expression (1), u_t and X_t respectively represent the residual term and the illumination component term of the low-light image enhancement model at stage t, y represents the low-light observation, H represents the model architecture, and θ represents the weight parameters.
Further, the self-calibration parameter in step S2 satisfies the following relation (2):
(Formula image: relation (2))
In relation (2), z represents the clear image corresponding to the low-light image, v_t is the converted input of the low-light image enhancement model at stage t, and k_θ is an operator carrying the weight parameters.
Further, the unsupervised training loss in step S3 includes a fidelity loss and a smoothness loss, and the fidelity loss and the smoothness loss satisfy the following relations (3), (4), respectively:
(Formula images: relations (3) and (4))
In relations (3) and (4), T represents the total number of stages of the low-light image enhancement model, N represents the total number of pixels of the low-light image, N(i) represents the neighboring pixels of the i-th pixel within its preset window size, and w_(i,j) represents the loss weight value.
Further, the unsupervised training loss is defined as L_t, and the unsupervised training loss L_t satisfies the following relation (5):
L_t = αL_f + βL_s (5);
In relation (5), α and β represent balance parameters.
Further, the preset window size is 5 × 5.
Further, the preset training parameters in step S4 are specifically:
the number of training rounds of the low-light image enhancement model is 1000 rounds, an ADAM is adopted as an optimizer in the training process, and beta of the optimizer 1 Parameter 0.9, beta 2 Parameter 0.999, e-parameter 10 -8 The number of mini-batches is 8, the initial learning rate is 10 -4 Parameter H θ The self-calibration parameter sets 4 convolutional layers using 3 convolutions and 3 channels of the RELU function as the activation function.
In a second aspect, an embodiment of the present invention further provides a system for enhancing a low-light image, including:
the model building module is used for building a low-light image enhancement model for enhancing a low-light image;
the self-calibration module is used for setting self-calibration parameters of the low-light image enhancement model;
the unsupervised training module is used for setting unsupervised training loss of the low-light image enhancement model;
and the model optimization module is used for training the low-light image enhancement model under preset training parameters according to the self-calibration parameter and the unsupervised training loss, and using the trained low-light image enhancement model in a low-light image enhancement process.
In a third aspect, an embodiment of the present invention further provides a computer device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the computer program implements the steps of the method for low-light image enhancement as described in any of the above embodiments.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps in the method for low-light image enhancement as described in any one of the above embodiments.
The method has the advantage that a weight-sharing cascade is adopted in the low-light image enhancement model and a self-calibration module is constructed at each stage, so that the results of the stages converge toward one another; by defining an unsupervised training loss, the output of each stage is constrained under the influence of the self-calibration module, which improves the model's ability to adapt to general scenes.
Drawings
FIG. 1 is a flow chart illustrating steps of a method for low-light image enhancement according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a system 200 for low-light image enhancement according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating steps of a method for enhancing a low-light image according to an embodiment of the present invention, which specifically includes the following steps:
s1, constructing a low-light image enhancement model for enhancing a low-light image.
Further, the low-light image enhancement model in step S1 satisfies the following relational expression (1):
(Formula image: relational expression (1))
In relational expression (1), u_t and X_t respectively represent the residual term and the illumination component term of the low-light image enhancement model at stage t, y represents the low-light observation, H represents the model architecture, and θ represents the weight parameters.
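By way of illustration only, the following PyTorch sketch shows one possible reading of relational expression (1): a weight-sharing cascade in which the same block H_θ is reused at every stage and updates the illumination as u_t = H_θ(X_t), X_(t+1) = X_t + u_t with X_0 = y. The update rule, the class name IlluminationBlock and the stage count are assumptions made for this example and are not taken from the patent text.

```python
import torch
import torch.nn as nn

class IlluminationBlock(nn.Module):
    """Shared-weight architecture H_theta mapping the current illumination X_t to a residual u_t.

    The layout (four 3x3 convolutions, 3 channels, ReLU activations) follows the preset
    training parameters quoted later in the description; applying it to H_theta is an
    assumption of this sketch.
    """
    def __init__(self, channels: int = 3):
        super().__init__()
        layers = []
        for i in range(4):
            layers.append(nn.Conv2d(channels, channels, kernel_size=3, padding=1))
            if i < 3:
                layers.append(nn.ReLU(inplace=True))
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def cascade_illumination(y: torch.Tensor, block: IlluminationBlock, num_stages: int = 3):
    """Weight-sharing cascade: the same block is reused at every stage, starting from X_0 = y."""
    x = y
    stage_outputs = []
    for _ in range(num_stages):
        u = block(x)   # residual term u_t = H_theta(X_t)
        x = x + u      # illumination update X_(t+1) = X_t + u_t (assumed form of relation (1))
        stage_outputs.append(x)
    return stage_outputs
```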
S2, setting self-calibration parameters of the low-light image enhancement model.
Further, the self-calibration parameter in step S2 satisfies the following relation (2):
(Formula image: relation (2))
In relation (2), z represents the clear image corresponding to the low-light image, v_t is the converted input of the low-light image enhancement model at stage t, and k_θ is an operator carrying the weight parameters.
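Continuing the sketch, relation (2) can be read as a self-calibration operator k_θ that converts each stage's output into the input of the next stage. The implementation below assumes a Retinex-style recovery z = y / X_t and a converted input v_t = y + k_θ(z); both the division and the additive correction are assumptions suggested by the variable definitions, not equations quoted from the patent.

```python
import torch
import torch.nn as nn

class SelfCalibration(nn.Module):
    """Hypothetical self-calibration operator k_theta producing the converted input v_t."""
    def __init__(self, channels: int = 3):
        super().__init__()
        # Four 3x3 convolutional layers with 3 channels and ReLU activations,
        # matching the preset training parameters listed below.
        layers = []
        for i in range(4):
            layers.append(nn.Conv2d(channels, channels, kernel_size=3, padding=1))
            if i < 3:
                layers.append(nn.ReLU(inplace=True))
        self.net = nn.Sequential(*layers)

    def forward(self, y: torch.Tensor, x_t: torch.Tensor) -> torch.Tensor:
        z = y / (x_t + 1e-6)   # clear-image estimate z (assumed Retinex-style division)
        v = y + self.net(z)    # converted input v_t for the next stage (assumed additive form)
        return v
```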
And S3, setting the unsupervised training loss of the low-light image enhancement model.
Further, the unsupervised training loss in step S3 includes a fidelity loss and a smoothness loss, and the fidelity loss and the smoothness loss satisfy the following relations (3), (4), respectively:
(Formula images: relations (3) and (4))
In relations (3) and (4), T represents the total number of stages of the low-light image enhancement model, N represents the total number of pixels of the low-light image, N(i) represents the neighboring pixels of the i-th pixel within its preset window size, and w_(i,j) represents the loss weight value.
Further, the unsupervised training loss is defined as L_t, and the unsupervised training loss L_t satisfies the following relation (5):
L_t = αL_f + βL_s (5);
In relation (5), α and β represent balance parameters.
Further, the preset window size is 5 × 5.
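The sketch below shows one way the fidelity loss, the smoothness loss and their combination in relation (5) could be computed. It assumes the fidelity loss compares each stage's illumination with the corresponding converted input and that the smoothness weights w_(i,j) are Gaussian in the intensity difference over the 5 × 5 window; the patent only states that w_(i,j) are loss weight values, so the weighting scheme used here is a placeholder.

```python
import torch
import torch.nn.functional as F

def fidelity_loss(illuminations, converted_inputs):
    """L_f: squared difference between each stage's illumination X_t and its converted input (assumed pairing)."""
    loss = 0.0
    for x_t, v_t in zip(illuminations, converted_inputs):
        loss = loss + F.mse_loss(x_t, v_t)
    return loss

def smoothness_loss(x, window: int = 5, sigma: float = 0.1):
    """L_s: weighted difference between each pixel and its neighbors in the preset 5x5 window.

    The weights w_(i,j) are assumed to be Gaussian in the intensity difference; the patent
    only calls them loss weight values.
    """
    pad = window // 2
    padded = F.pad(x, [pad, pad, pad, pad], mode="replicate")
    neighbors = F.unfold(padded, kernel_size=window)          # (B, C*window*window, H*W)
    b, c, h, w = x.shape
    neighbors = neighbors.view(b, c, window * window, h * w)
    center = x.view(b, c, 1, h * w)
    diff = center - neighbors
    weights = torch.exp(-(diff ** 2) / (2 * sigma ** 2)).detach()
    return (weights * diff.abs()).mean()

def total_loss(illuminations, converted_inputs, alpha: float = 1.0, beta: float = 1.0):
    """Relation (5): L_t = alpha * L_f + beta * L_s, with alpha and beta as balance parameters."""
    l_f = fidelity_loss(illuminations, converted_inputs)
    l_s = sum(smoothness_loss(x) for x in illuminations)
    return alpha * l_f + beta * l_s
```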
And S4, training the low-light image enhancement model under preset training parameters according to the self-calibration parameter and the unsupervised training loss, and applying the trained low-light image enhancement model to a low-light image enhancement process.
Further, the preset training parameters in step S4 specifically include:
the number of training rounds of the low-light image enhancement model is 1000 rounds, an ADAM is adopted as an optimizer in the training process, and beta of the optimizer 1 Parameter 0.9, beta 2 Parameter is 0.999, and the element parameter is 10 -8 Number of mini-batches of 8, initial learning rate of 10 -4 Parameter H θ The self-calibration parameters set 4 convolutional layers using 3 convolutions and 3 channels of the RELU function as the activation function.
The method has the advantage that a weight-sharing cascade is adopted in the low-light image enhancement model and a self-calibration module is constructed at each stage, so that the results of the stages converge toward one another; by defining an unsupervised training loss, the output of each stage is constrained under the influence of the self-calibration module, which improves the model's ability to adapt to general scenes.
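Finally, a rough sketch of how a trained model might be applied to a low-light image at test time. It assumes the clear image is recovered as z = y / X from the estimated illumination X, in line with the reading of relation (2) used above; the patent does not spell out this recovery step, so it is an assumption of the example.

```python
import torch

@torch.no_grad()
def enhance(block: "IlluminationBlock", y: torch.Tensor, num_stages: int = 1) -> torch.Tensor:
    """Enhance a low-light image y with the trained weight-shared block (reuses the cascade sketch above)."""
    x = cascade_illumination(y, block, num_stages)[-1]  # estimated illumination X
    z = torch.clamp(y / (x + 1e-6), 0.0, 1.0)           # clear image z = y / X (assumed recovery step)
    return z
```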
Referring to fig. 2, fig. 2 is a schematic structural diagram of a system 200 for enhancing a low-light image according to an embodiment of the present invention, where the system 200 for enhancing a low-light image includes:
a model establishing module 201, configured to establish a low-light image enhancement model for low-light image enhancement;
a self-calibration module 202, configured to set self-calibration parameters of the low-light image enhancement model;
an unsupervised training module 203, configured to set an unsupervised training loss of the low-light image enhancement model;
and the model optimization module 204 is configured to train the low-light image enhancement model according to the self-calibration parameter and the unsupervised training loss under preset training parameters, and use the trained low-light image enhancement model in a low-light image enhancement process.
The system 200 for enhancing a low-light image can implement the steps in the method for enhancing a low-light image in the above embodiment and can achieve the same technical effects, which are described in the above embodiment and are not repeated herein.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a computer device provided in an embodiment of the present invention, where the computer device 300 includes: a memory 302, a processor 301, and a computer program stored on the memory 302 and executable on the processor 301.
The processor 301 calls the computer program stored in the memory 302 to execute the steps of the method for enhancing a low-light image according to the embodiment of the present invention, and with reference to fig. 1, the method specifically includes:
s1, constructing a low-light image enhancement model for enhancing a low-light image.
Further, the low-light image enhancement model in step S1 satisfies the following relational expression (1):
(Formula image: relational expression (1))
In relational expression (1), u_t and X_t respectively represent the residual term and the illumination component term of the low-light image enhancement model at stage t, y represents the low-light observation, H represents the model architecture, and θ represents the weight parameters.
S2, setting self-calibration parameters of the low-light image enhancement model.
Further, the self-calibration parameter in step S2 satisfies the following relation (2):
(Formula image: relation (2))
In relation (2), z represents the clear image corresponding to the low-light image, v_t is the converted input of the low-light image enhancement model at stage t, and k_θ is an operator carrying the weight parameters.
And S3, setting unsupervised training loss of the low-light image enhancement model.
Further, the unsupervised training loss in step S3 includes a fidelity loss and a smoothness loss, and the fidelity loss and the smoothness loss satisfy the following relations (3), (4), respectively:
(Formula images: relations (3) and (4))
In relations (3) and (4), T represents the total number of stages of the low-light image enhancement model, N represents the total number of pixels of the low-light image, N(i) represents the neighboring pixels of the i-th pixel within its preset window size, and w_(i,j) represents the loss weight value.
Further, the unsupervised training loss is defined as L_t, and the unsupervised training loss L_t satisfies the following relation (5):
L_t = αL_f + βL_s (5);
In relation (5), α and β represent balance parameters.
Further, the preset window size is 5 × 5.
And S4, training the low-light image enhancement model under preset training parameters according to the self-calibration parameter and the unsupervised training loss, and applying the trained low-light image enhancement model to the enhancement process of the low-light image.
Further, the preset training parameters in step S4 specifically include:
the number of training rounds of the low-light image enhancement model is 1000 rounds, an ADAM is adopted as an optimizer in the training process, and beta of the optimizer 1 Parameter 0.9, beta 2 Parameter 0.999, e-parameter 10 -8 The number of mini-batches is 8, the initial learning rate is 10 -4 Parameter H θ The self-calibration parameter sets 4 convolutional layers using 3 convolutions and 3 channels of the RELU function as the activation function.
The computer device 300 provided in the embodiment of the present invention can implement the steps in the method for enhancing a low-light image in the foregoing embodiment and can achieve the same technical effects; reference is made to the description in the foregoing embodiment, which is not repeated herein.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of another like element in a process, method, article, or apparatus that comprises the element.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, which are illustrative, but not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A method of low-light image enhancement, the method comprising the steps of:
s1, constructing a low-light image enhancement model for enhancing a low-light image;
s2, setting self-calibration parameters of the low-light image enhancement model;
s3, setting unsupervised training loss of the low-light image enhancement model;
and S4, training the low-light image enhancement model under preset training parameters according to the self-calibration parameter and the unsupervised training loss, and applying the trained low-light image enhancement model to the enhancement process of the low-light image.
2. The method of low-light image enhancement according to claim 1, wherein the low-light image enhancement model in step S1 satisfies the following relation (1):
(Formula image: relation (1))
In relation (1), u_t and X_t respectively represent the residual term and the illumination component term of the low-light image enhancement model at stage t, y represents the low-light observation, H represents the model architecture, and θ represents the weight parameters.
3. The method of low-light image enhancement according to claim 2, wherein the self-calibration parameter in step S2 satisfies the following relation (2):
(Formula image: relation (2))
In relation (2), z represents the clear image corresponding to the low-light image, v_t is the converted input of the low-light image enhancement model at stage t, and k_θ is an operator carrying the weight parameters.
4. The method of low-light image enhancement according to claim 3, wherein the unsupervised training loss in step S3 comprises a fidelity loss and a smoothness loss, and the fidelity loss and the smoothness loss satisfy the following relations (3), (4), respectively:
(Formula images: relations (3) and (4))
In relations (3) and (4), T represents the total number of stages of the low-light image enhancement model, N represents the total number of pixels of the low-light image, N(i) represents the neighboring pixels of the i-th pixel within its preset window size, and w_(i,j) represents the loss weight value.
5. The method of low-light image enhancement of claim 4, wherein the unsupervised training loss is defined as L_t, and said unsupervised training loss L_t satisfies the following relation (5):
L_t = αL_f + βL_s (5);
In relation (5), α and β represent balance parameters.
6. The method for low-light image enhancement according to claim 4, wherein the preset window size is 5 × 5.
7. The method for enhancing low-light images according to claim 4, wherein the preset training parameters in step S4 are specifically:
the low-light image enhancement model is trained for 1000 rounds, ADAM is adopted as the optimizer during training, the β_1 parameter of the optimizer is 0.9, the β_2 parameter is 0.999, the ε parameter is 10^(-8), the mini-batch number is 8, the initial learning rate is 10^(-4), and the parameter H_θ and the self-calibration parameter are set with 4 convolutional layers using 3 × 3 convolutions and 3 channels, with the ReLU function as the activation function.
8. A system for low-light image enhancement, comprising:
the model building module is used for building a low-light image enhancement model for enhancing a low-light image;
the self-calibration module is used for setting self-calibration parameters of the low-light image enhancement model;
the unsupervised training module is used for setting unsupervised training loss of the low-light image enhancement model;
and the model optimization module is used for training the low-light image enhancement model under preset training parameters according to the self-calibration parameter and the unsupervised training loss, and using the trained low-light image enhancement model in a low-light image enhancement process.
9. A computer device, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of low-light image enhancement according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, having stored thereon a computer program which, when being executed by a processor, carries out the steps of the method of low-light image enhancement according to any one of claims 1 to 7.
CN202211274300.XA 2022-10-18 2022-10-18 Method, system and related equipment for enhancing low-light image Pending CN115619670A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211274300.XA CN115619670A (en) 2022-10-18 2022-10-18 Method, system and related equipment for enhancing low-light image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211274300.XA CN115619670A (en) 2022-10-18 2022-10-18 Method, system and related equipment for enhancing low-light image

Publications (1)

Publication Number Publication Date
CN115619670A true CN115619670A (en) 2023-01-17

Family

ID=84862080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211274300.XA Pending CN115619670A (en) 2022-10-18 2022-10-18 Method, system and related equipment for enhancing low-light image

Country Status (1)

Country Link
CN (1) CN115619670A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116894791A (en) * 2023-08-01 2023-10-17 中国人民解放军战略支援部队航天工程大学 Visual SLAM method and system for enhancing image under low illumination condition
CN116894791B (en) * 2023-08-01 2024-02-09 中国人民解放军战略支援部队航天工程大学 Visual SLAM method and system for enhancing image under low illumination condition


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination