CN117333401A - Image illumination enhancement method, system, medium and device with uneven brightness distribution - Google Patents

Image illumination enhancement method, system, medium and device with uneven brightness distribution

Info

Publication number
CN117333401A
Authority
CN
China
Prior art keywords
illumination
image
image data
enhancement
network
Prior art date
Legal status
Pending
Application number
CN202311621045.6A
Other languages
Chinese (zh)
Inventor
胡允霄
张洁
吴王辉
杨志
饶宁
Current Assignee
Light Cosmos Jinye Wuhan Intelligent Technology Co ltd
Original Assignee
Light Cosmos Jinye Wuhan Intelligent Technology Co ltd
Priority date
2023-11-30
Filing date
2023-11-30
Publication date
2024-01-02
Application filed by Light Cosmos Jinye Wuhan Intelligent Technology Co ltd
Priority to CN202311621045.6A
Publication of CN117333401A


Classifications

    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06N3/045 Combinations of networks
    • G06N3/0455 Auto-encoder networks; Encoder-decoder networks
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G06N3/0475 Generative networks
    • G06N3/08 Learning methods
    • G06T7/90 Determination of colour characteristics
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image illumination enhancement method, system, medium and device for images with uneven brightness distribution. The method comprises the following steps: iteratively training a generator network on low-illumination image data, and in each training iteration distinguishing the overexposed regions from the underexposed regions of the low-illumination image data so as to perform image illumination enhancement; comparing, with a discriminator network, the illumination-enhanced image data generated in each training iteration of the generator network against normal-illumination image data; and iteratively optimizing the generator network according to each comparison result until the comparison result satisfies a condition, taking the generator network obtained at that point as the final generator network, and using the final generator network to enhance the illumination of a low-illumination scene image to be processed. By distinguishing the overexposed regions from the underexposed regions of the low-illumination image before enhancing its illumination, the invention improves the image enhancement effect and makes the color and illumination of the image appear more natural.

Description

Image illumination enhancement method, system, medium and device with uneven brightness distribution
Technical Field
The invention relates to the technical field of image processing, and in particular to an image illumination enhancement method, system, medium and device for images with uneven brightness distribution.
Background
Depending on the illumination, an image captured by a camera or mobile phone may be under-exposed or over-exposed. Under complex illumination conditions, overexposed and underexposed regions may even coexist in the same image, in which case part of the image content is buried in those regions. For example, street lamps, billboards and car headlights are visible everywhere on night-time roads and in residential areas. Because of such uneven illumination, the brightness of different regions of an image may vary widely, which makes it more difficult to enhance the illumination of the image as a whole. Although a photographer can mitigate this problem with a high-end DSLR camera and careful adjustment of its settings (e.g. aperture, ISO and special filters), doing so requires photographic expertise and expensive equipment.
Many methods have been proposed to improve the quality of images captured under poor lighting conditions, including Retinex-based methods, bilateral learning, generative adversarial learning, deep parametric filtering, and self-supervised or semi-supervised learning. However, most existing night-time illumination enhancement methods concentrate on brightening the dark regions; when they are applied to night images containing uneven illumination, the already bright regions are inevitably over-enhanced and saturated. Because the exposed regions are amplified again, the visibility of the image is further reduced and artifacts such as exaggerated light and shadow appear, so the image quality degrades while the low-light regions are still not enhanced properly and reasonably. In summary, these approaches generally assume that the scene illumination is uniform, and therefore produce locally overexposed or underexposed results when processing unevenly illuminated images.
Disclosure of Invention
The invention provides an image illumination enhancement method, system, medium and device for images with uneven brightness distribution, which improve the image enhancement effect by distinguishing the overexposed regions from the underexposed regions of a low-illumination image, so that dark regions are enhanced while the influence of strong light is weakened at the same time, making the color and illumination of the image more natural.
In a first aspect, an image illumination enhancement method with non-uniform brightness distribution is provided, which is characterized by comprising the following steps:
acquiring low-light image data and normal-light image data;
performing iterative training on a generator network according to the low-illumination image data, distinguishing an overexposed region and an underexposed region in the low-illumination image data in each iterative training, so as to perform image illumination enhancement, and generating illumination enhancement image data;
comparing the illumination enhancement image data generated by each iterative training of the generator network with the normal illumination image data by utilizing a discriminator network;
and iteratively optimizing the generator network according to each comparison result until the comparison result meets the condition, taking the generator network obtained at that point as the final generator network, and using the final generator network to enhance the illumination of the low-illumination scene image to be processed.
According to a first aspect, in a first possible implementation manner of the first aspect, the step of distinguishing an overexposed area and an underexposed area in the low-light image data in each iterative training to perform image light enhancement and generate light enhancement image data specifically includes the following steps:
the generator network comprises a U-Net generator and a color distribution pyramid network;
normalizing one training image in the low-illumination image data;
performing feature extraction on the normalized training image by using an encoder in the U-Net generator, and progressively reducing the image size of the extracted features through downsampling to obtain multi-scale feature maps;
generating a multi-scale color distribution map according to one training image which is not subjected to normalization processing by utilizing a color distribution pyramid network so as to distinguish an overexposed area and an underexposed area in the image;
and respectively selecting the feature images with the same size from the multi-scale feature images and the multi-scale color distribution map by using a decoder in the U-Net generator to perform image fusion.
In a second possible implementation manner of the first aspect according to the first aspect, the calculation formula of the color distribution pyramid network is as follows:
wherein i, j and c are the horizontal, vertical and channel indices of the image, respectively; b is the index of the color histogram bin; and the block-assignment function returns, for a pixel (i, j), the indices of the pixels in the K×K block to which that pixel belongs.
In a third possible implementation manner of the first aspect, the comparing, by using the discriminator network, the illumination enhanced image data generated by each iteration of training of the generator network with the normal illumination image data specifically includes the following steps:
the discriminator network comprises a global discriminator and a local discriminator;
evaluating the similarity between the illumination enhancement image data and the normal illumination image data generated by each iterative training by using the global discriminator;
and evaluating, by using the local discriminator, the degree of quality improvement of the illumination enhancement image data generated by each training iteration relative to the normal illumination image data.
In a fourth possible implementation manner of the first aspect according to the third possible implementation manner of the first aspect, a calculation formula of the global discriminator is as follows:
wherein D(x, y) is the probability that the normal illumination image y is judged to be true by the global discriminator; D(x, G(x, z)) is the probability that the illumination enhanced image G(x, z) is discriminated as true; the first expectation term is the expected value of the global discriminator output for the normal illumination image; and the second expectation term is the expected value of the global discriminator output for the illumination enhanced image.
In a fifth possible implementation manner of the first aspect according to the first aspect, a calculation formula of the local discriminator is as follows:
wherein C is the discriminator network; σ denotes the sigmoid activation function; the two input distributions are the distribution of the normal illumination images and the distribution of the illumination enhanced images; and the two loss terms compute, respectively, the local discriminator loss of the normal illumination image relative to the illumination enhanced image and the local discriminator loss of the illumination enhanced image relative to the normal illumination image.
In a second aspect, an image illumination enhancement system with non-uniform brightness distribution is provided, comprising:
the image acquisition module is used for acquiring low-illumination image data and normal-illumination image data;
the iterative training module is in communication connection with the image acquisition module and is used for carrying out iterative training on the generator network according to the low-illumination image data, distinguishing an overexposed region and an underexposed region in the low-illumination image data in each iterative training so as to carry out image illumination enhancement and generate illumination enhancement image data;
the comparison module is in communication connection with the image acquisition module and the iterative training module and is used for comparing the illumination enhancement image data generated by each iterative training of the generator network with the normal illumination image data by utilizing the discriminator network; and,
and the final enhancement module is in communication connection with the comparison module and is used for iteratively optimizing the generator network according to each comparison result until the comparison result meets the condition, taking the generator network obtained at that point as the final generator network, and using the final generator network to enhance the illumination of the low-illumination scene image to be processed.
Compared with the prior art, the invention has the following advantages: the generator network is used to distinguish the overexposed regions from the underexposed regions in the low-illumination image data and to perform image illumination enhancement; the discriminator network is used to compare the illumination enhanced image data with the normal illumination images; the generator network is then iteratively optimized according to each comparison result until the comparison result meets the condition, at which point the corresponding final generator network is obtained; and finally the final generator network is used to enhance the illumination of the low-illumination scene image to be processed. Therefore, by distinguishing the overexposed regions from the underexposed regions in the low-light image, the invention enhances the illumination while weakening the influence of strong light, accurately improves the image enhancement effect, and makes the color and illumination of the image more natural.
Drawings
FIG. 1 is a flowchart of an embodiment of a method for enhancing illumination of an image with non-uniform brightness distribution according to the present invention;
FIG. 2 is a schematic diagram of a network architecture of a U-Net generator and a color distribution pyramid network of the present invention for image illumination enhancement of low-light images;
FIG. 3 is a schematic diagram of the image illumination enhancement processing of the same scene to obtain corresponding enhancement results by selecting a plurality of different methods;
fig. 4 is a schematic structural diagram of an image illumination enhancement system with non-uniform brightness distribution according to the present invention.
Detailed Description
Reference will now be made in detail to the present embodiments of the invention, examples of which are illustrated in the accompanying drawings. While the invention will be described in conjunction with the specific embodiments, it will be understood that they are not intended to limit the invention to the described embodiments. On the contrary, it is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims. It should be noted that the method steps described herein may be implemented by any functional block or arrangement of functions, and any functional block or arrangement of functions may be implemented as a physical entity or a logical entity, or a combination of both.
The present invention will be described in further detail below with reference to the drawings and detailed description for the purpose of enabling those skilled in the art to understand the invention better.
Note that: the examples to be described below are only one specific example, and not as limiting the embodiments of the present invention necessarily to the following specific steps, values, conditions, data, sequences, etc. Those skilled in the art can, upon reading the present specification, make and use the concepts of the invention to construct further embodiments not mentioned in the specification.
Referring to fig. 1, an embodiment of the present invention provides an image illumination enhancement method with non-uniform brightness distribution, including the following steps:
s100, acquiring low-illumination image data and normal-illumination image data;
the image data may be obtained directly by photographing.
Existing datasets may also be collected. Specifically, the training set contains 1200 low-light images from the ExDark dataset, 1159 normal-light images from the MIT-Adobe FiveK dataset and 41 normal-light images from the EnlightenGAN dataset. The ExDark dataset is a collection of 7363 low-light images captured under conditions ranging from extremely low-light environments to dusk (i.e. 10 different conditions), annotated with 12 object categories according to image content; 1200 night images, containing both overexposed and underexposed regions, are randomly selected from the different categories. The test set contains 32 images from the MIT-Adobe FiveK dataset: 32 typical non-uniformly illuminated N1.5 pictures are selected from MIT-Adobe FiveK and scaled proportionally so that the shorter edge does not exceed 640 pixels.
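As a reference for the last preprocessing step, the proportional scaling can be done as in the following sketch (Pillow is assumed; the file path and function name are illustrative):

```python
from PIL import Image

def resize_min_edge(img: Image.Image, max_min_edge: int = 640) -> Image.Image:
    """Scale an image proportionally so that its shorter edge does not exceed max_min_edge."""
    w, h = img.size
    short = min(w, h)
    if short <= max_min_edge:
        return img                      # already small enough, keep the original size
    scale = max_min_edge / short
    return img.resize((round(w * scale), round(h * scale)), Image.BICUBIC)

# Example: prepare one test image (the path is illustrative)
# img = resize_min_edge(Image.open("fivek/a0001.jpg"))
```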
S200, carrying out iterative training on a generator network according to the low-illumination image data, and distinguishing an overexposed region and an underexposed region in the low-illumination image data in each iterative training so as to carry out image illumination enhancement and generate illumination enhancement image data;
s300, comparing the illumination enhancement image data generated by each iteration training of the generator network with the normal illumination image data by utilizing a discriminator network;
and S400, iteratively optimizing the generator network according to each comparison result until the comparison result meets the condition (for example, the final comparison result value is smaller than a certain threshold value), taking the generator network obtained at that point as the final generator network, and using the final generator network to enhance the illumination of the low-illumination scene image to be processed.
Specifically, in this embodiment, the invention uses the generator network to distinguish the overexposed regions from the underexposed regions in the low-illumination image data and to perform image illumination enhancement, then uses the discriminator network to compare the illumination enhancement image data with the normal illumination images, then iteratively optimizes the generator network according to each comparison result until the comparison result meets the condition, and finally uses the final generator network to enhance the illumination of the low-illumination scene image to be processed. By distinguishing the overexposed regions from the underexposed regions in the low-light image, the invention enhances the illumination while weakening the influence of strong light, accurately improves the image enhancement effect, and makes the color and illumination of the image more natural.
That is, the generator network and the discriminator network are subjected to multiple rounds of iterative optimization with the goal of enabling the generated illumination-enhanced image to fool the discriminator, making it difficult for the discriminator to distinguish between the generated illumination-enhanced image and the normal illumination image. During training, the discriminator provides information about the differences between the generated illumination enhanced image and the normal illumination image, and the generator improves the quality of the generated image by minimizing these differences.
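The alternating optimization described here follows the usual GAN training recipe; a minimal sketch in PyTorch is given below. The stub networks, learning rates and binary cross-entropy losses are illustrative assumptions, since the patent does not fix them (the actual generator and discriminator are described in the following embodiments):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stub networks; the patent's generator is a U-Net with a color distribution pyramid and the
# discriminator has global and local branches -- these stubs only illustrate the alternation.
generator = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.Conv2d(16, 3, 3, padding=1))
discriminator = nn.Sequential(nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                              nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1))
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

def training_step(low_light: torch.Tensor, normal_light: torch.Tensor) -> None:
    # Discriminator step: normal-light images are "real", enhanced images are "fake".
    with torch.no_grad():
        fake = generator(low_light)
    d_real, d_fake = discriminator(normal_light), discriminator(fake)
    d_loss = (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
              + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: push the enhanced image towards being classified as "real".
    fake = generator(low_light)
    g_out = discriminator(fake)
    g_loss = F.binary_cross_entropy_with_logits(g_out, torch.ones_like(g_out))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# training_step(torch.rand(4, 3, 256, 256), torch.rand(4, 3, 256, 256))
```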
Preferably, in another embodiment of the present application, the step S200 of differentiating the overexposed area and the underexposed area in the low-light image data for image light enhancement in each iterative training, and generating the light enhanced image data specifically includes the following steps:
the generator network comprises a U-Net generator and a color distribution pyramid network;
s210, normalizing one training image in the low-light image data;
one of the training images needs to be normalized by the illumination channel, I normalized to 0,1, before the image is input to the encoder.
S220, performing feature extraction on the normalized training image by using an encoder in the U-Net generator, and progressively reducing the image size of the extracted features through downsampling to obtain multi-scale feature maps;
s230, generating a multi-scale color distribution map by utilizing a color distribution pyramid network according to one training image which is not subjected to normalization processing so as to distinguish an overexposed area and an underexposed area in the image;
s240, respectively selecting feature images with the same size from the multi-scale feature images and the multi-scale color distribution map by using a decoder in the U-Net generator to perform image fusion.
The calculation formula of the color distribution pyramid network is as follows:
wherein i, j and c are the horizontal, vertical and channel indices of the image, respectively; b is the index of the color histogram bin; and the block-assignment function returns, for a pixel (i, j), the indices of the pixels in the K×K block to which that pixel belongs.
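The formula itself appears only as a figure in the published text. A plausible reconstruction consistent with the definitions above (and with the local color distribution prior cited in the search report) is the normalized per-block histogram sketched below, in which $\Omega_K(i,j)$ denotes the $K\times K$ block containing pixel $(i,j)$ and $B$ the number of histogram bins; both symbols are introduced here for illustration only:

```latex
\[
\mathrm{LCD}_K(i, j, b, c) \;=\; \frac{1}{K^{2}}
\sum_{(i', j') \,\in\, \Omega_K(i, j)}
\mathbb{1}\!\left[\, \big\lfloor I(i', j', c)\cdot B \big\rfloor = b \,\right]
\]
```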
Specifically, in this embodiment, referring also to fig. 2, the normalized image is first input into the encoder: features are extracted by convolution layers, the size of the feature map is gradually reduced by downsampling, a residual connection is added after each downsampling step, and the feature map obtained at each level is stored for later use by the decoder.
Meanwhile, the other copy of the image (without the illumination-channel normalization) is passed through the color distribution pyramid. The image is divided into N blocks of size K×K, and the local color distribution (LCD) of each block is defined as the color histogram within that local area; that is, a bilateral grid for computing color histograms is constructed. By assigning different values to K, LCD maps (color distribution maps) of different scales can be obtained, so the color distribution pyramid yields a multi-scale regional illumination distribution.
Given an input image I of size H×W whose pixel values lie in the range [0, 1], the color distribution pyramid first divides I into N blocks of size K×K. The local color distribution is defined as the color histogram within each K×K local area, and an LCD map is used to represent the distribution at that scale. In other words, a bilateral grid is first constructed, the histogram is computed along its range dimension, and the multi-scale color distribution map is then obtained with the formula given above.
By assigning different values to K, LCD maps of different scales can be obtained. When K = 1 the LCD map is a pixel-level color distribution in which each entry is a sparse one-hot vector; as K increases, the local area represented by each histogram grows correspondingly. Taking K from a set of increasing values yields a multi-scale LCD pyramid with color distributions at different levels. The LCD pyramid contains the multi-scale regional illumination distribution, which helps distinguish the overexposed areas from the underexposed areas.
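A compact sketch of this multi-scale local color histogram computation is given below in NumPy; the block sizes and the number of bins are illustrative assumptions:

```python
import numpy as np

def lcd_map(img: np.ndarray, k: int, bins: int = 8) -> np.ndarray:
    """Local color distribution: a per-channel histogram over every K x K block.

    img  : float array of shape (H, W, 3) with values in [0, 1]
    k    : block size K
    bins : number of histogram bins B
    Returns an array of shape (H // k, W // k, bins, 3).
    """
    h, w, c = img.shape
    h, w = (h // k) * k, (w // k) * k                        # crop so the blocks tile exactly
    blocks = img[:h, :w].reshape(h // k, k, w // k, k, c)
    idx = np.minimum((blocks * bins).astype(int), bins - 1)  # histogram bin of every pixel
    out = np.zeros((h // k, w // k, bins, c), dtype=np.float32)
    for b in range(bins):
        out[:, :, b, :] = (idx == b).mean(axis=(1, 3))       # fraction of block pixels in bin b
    return out

# A multi-scale pyramid: a larger K gives a coarser regional illumination distribution.
# The K values below are assumptions, not values fixed by the patent.
# pyramid = {k: lcd_map(image, k) for k in (1, 4, 16, 64)}
```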
In the decoder, feature maps of the same size are selected from the multi-scale feature maps and the multi-scale color distribution maps respectively. Before each up-sampling operation, the feature map stored by the corresponding encoder layer is retrieved, and the feature map of the current layer is fused, through a residual connection, with the color distribution map of the same size output by the pyramid network. The fusion is implemented as an element-wise addition, which preserves the positional information between the encoder and the decoder, lets the regional illumination distribution help distinguish the overexposed areas from the underexposed areas, and recovers detail and spatial information. The final layer of the decoder outputs an image of the same size as the one input into the U-Net generator, representing the generated illumination enhancement result.
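One decoder stage with this skip-and-LCD fusion might look as follows in PyTorch; the channel counts and the 1×1 projection of the LCD map onto the feature dimension are assumptions, since the patent does not specify them:

```python
import torch
import torch.nn as nn

class DecoderStage(nn.Module):
    """One U-Net decoder stage: upsample, then fuse encoder features and the LCD map by addition."""
    def __init__(self, channels: int, lcd_channels: int):
        super().__init__()
        self.up = nn.ConvTranspose2d(channels, channels, kernel_size=2, stride=2)
        self.lcd_proj = nn.Conv2d(lcd_channels, channels, kernel_size=1)  # assumed 1x1 projection
        self.conv = nn.Sequential(nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU())

    def forward(self, x: torch.Tensor, skip: torch.Tensor, lcd: torch.Tensor) -> torch.Tensor:
        x = self.up(x)                        # restore the spatial size of this level
        x = x + skip + self.lcd_proj(lcd)     # element-wise fusion keeps positional information
        return self.conv(x)

# stage = DecoderStage(channels=64, lcd_channels=24)    # 24 = 8 bins x 3 channels, an assumption
# out = stage(torch.rand(1, 64, 32, 32), torch.rand(1, 64, 64, 64), torch.rand(1, 24, 64, 64))
```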
Preferably, in a further embodiment of the present application, the step of comparing, with the discriminator network, the illumination enhanced image data generated by each iterative training of the generator network with the normal illumination image data specifically includes the following steps:
the discriminator network comprises a global discriminator and a local discriminator;
evaluating the similarity between the illumination enhancement image data and the normal illumination image data generated by each iterative training by using the global discriminator;
and evaluating, by using the local discriminator, the degree of quality improvement of the illumination enhancement image data generated by each training iteration relative to the normal illumination image data.
Preferably, in a further embodiment of the present application, the calculation formula of the global discriminator is as follows:
wherein D(x, y) is the probability that the normal illumination image y is judged to be true by the global discriminator; D(x, G(x, z)) is the probability that the illumination enhanced image G(x, z) is discriminated as true; the first expectation term is the expected value of the global discriminator output for the normal illumination image; and the second expectation term is the expected value of the global discriminator output for the illumination enhanced image.
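The formula is reproduced only as a figure in the published text; the definitions above correspond to the standard conditional-GAN objective, a reconstruction of which (offered here as an assumption rather than as the patent's exact expression) is:

```latex
\[
\mathcal{L}_{\mathrm{global}}(G, D) \;=\;
\mathbb{E}_{x, y}\!\big[\log D(x, y)\big]
\;+\;
\mathbb{E}_{x, z}\!\big[\log\big(1 - D(x, G(x, z))\big)\big]
\]
```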
The calculation formula of the local discriminator is as follows:
wherein C is the discriminator network; σ denotes the sigmoid activation function; the two input distributions are the distribution of the normal illumination images and the distribution of the illumination enhanced images; and the two loss terms compute, respectively, the local discriminator loss of the normal illumination image relative to the illumination enhanced image and the local discriminator loss of the illumination enhanced image relative to the normal illumination image.
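As with the global discriminator, the local-discriminator formulas appear only as figures. The quantities listed above (a discriminator C, a sigmoid, and a loss of each distribution measured relative to the other) match a relativistic discriminator of the kind used in the cited EnlightenGAN work; a hedged reconstruction is sketched below, where $x_{r}$ and $x_{f}$ denote normal-illumination and illumination-enhanced image patches and are notation introduced here:

```latex
\begin{aligned}
D_{\mathrm{Ra}}(x_{r}, x_{f}) &= \sigma\!\big(C(x_{r}) - \mathbb{E}_{x_{f}}\!\left[C(x_{f})\right]\big),\\[2pt]
\mathcal{L}^{\mathrm{local}} &=
\mathbb{E}_{x_{r}}\!\Big[\big(D_{\mathrm{Ra}}(x_{r}, x_{f}) - 1\big)^{2}\Big]
+ \mathbb{E}_{x_{f}}\!\Big[D_{\mathrm{Ra}}(x_{f}, x_{r})^{2}\Big]
\end{aligned}
```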
Specifically, in this embodiment, the global discriminator is used to evaluate the authenticity of the entire image, i.e. to judge whether the illumination-enhanced image is realistically similar to a normal illumination image. It accepts an input image and outputs a scalar value representing the probability that the input (the illumination-enhanced image) is discriminated as a normal illumination image. The global discriminator is typically a convolutional neural network that extracts the overall characteristics of the image through operations such as convolution layers, pooling layers and fully connected layers, and thereby distinguishes normal illumination images from illumination-enhanced images.
The role of the local discriminator is to evaluate how much the quality of the illumination-enhanced image improves relative to the normal illumination image. The local discriminator is also typically a convolutional neural network, but unlike the global discriminator it focuses on subtle differences and improvements between the normal illumination image and the illumination-enhanced image.
To verify the illumination enhancement and restoration effect of the proposed method against the traditional algorithm LIME and the unsupervised methods EnlightenGAN, Zero-DCE, SCL-LLE and SCI, the image restoration capability of each method is compared using the image quality evaluation indices peak signal-to-noise ratio (PSNR) and structural similarity (SSIM); the comparison results are shown in Table 1.
TABLE 1
Method                        PSNR     SSIM
LIME                          17.51    0.73
EnlightenGAN                  18.03    0.80
Zero-DCE                      14.20    0.67
SCL-LLE                       18.70    0.80
SCI                           16.49    0.75
The method of the invention   19.67    0.82
The proposed method achieves the best PSNR and SSIM values, which shows that it preserves the detail information of the image better and can, to a certain extent, control the influence caused by uneven illumination of the image.
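For reference, the two evaluation indices can be computed with scikit-image as in the following sketch (the file names are illustrative):

```python
import numpy as np
from skimage.io import imread
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Illustrative file names; any enhanced-result / reference pair will do.
enhanced = imread("enhanced.png").astype(np.float64) / 255.0
reference = imread("reference.png").astype(np.float64) / 255.0

psnr = peak_signal_noise_ratio(reference, enhanced, data_range=1.0)
ssim = structural_similarity(reference, enhanced, channel_axis=-1, data_range=1.0)
print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.3f}")
```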
Fig. 3 shows the illumination enhancement and restoration results obtained on the same scene by the input image, LIME, EnlightenGAN, Zero-DCE, SCL-LLE, SCI, the ground truth (GT) and the method of the invention.
Referring to fig. 4, the embodiment of the present invention further provides an image illumination enhancement system with non-uniform brightness distribution, including:
the image acquisition module is used for acquiring low-illumination image data and normal-illumination image data;
the iterative training module is in communication connection with the image acquisition module and is used for carrying out iterative training on the generator network according to the low-illumination image data, distinguishing an overexposed region and an underexposed region in the low-illumination image data in each iterative training so as to carry out image illumination enhancement and generate illumination enhancement image data;
the comparison module is in communication connection with the image acquisition module and the iterative training module and is used for comparing the illumination enhancement image data generated by each iterative training of the generator network with the normal illumination image data by utilizing the discriminator network; and,
and the final enhancement module is in communication connection with the image acquisition module and the comparison module and is used for iteratively optimizing the generator network according to each comparison result until the comparison result meets the condition, taking the generator network obtained at that point as the final generator network, and using the final generator network to enhance the illumination of the low-illumination scene image to be processed.
Therefore, the invention uses the generator network to distinguish the overexposed regions from the underexposed regions in the low-illumination image data and to perform image illumination enhancement, then uses the discriminator network to compare the illumination enhanced image data with the normal illumination images, then iteratively optimizes the generator network according to each comparison result until the comparison result meets the condition, and finally uses the final generator network to enhance the illumination of the low-illumination scene image to be processed. By distinguishing the overexposed regions from the underexposed regions in the low-light image, the invention enhances the illumination while weakening the influence of strong light, accurately improves the image enhancement effect, and makes the color and illumination of the image more natural.
Specifically, the present embodiment corresponds to the foregoing method embodiments one by one, and the functions of each module are described in detail in the corresponding method embodiments, so that a detailed description is not given.
Based on the same inventive concept, the embodiments of the present application also provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements all or part of the method steps of the above method.
The present invention may be implemented by implementing all or part of the above-described method flow, or by instructing the relevant hardware by a computer program, which may be stored in a computer readable storage medium, and which when executed by a processor, may implement the steps of the above-described method embodiments. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, executable files or in some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable medium can be appropriately increased or decreased according to the requirements of the jurisdiction's jurisdiction and the patent practice, for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals according to the jurisdiction and the patent practice.
Based on the same inventive concept, the embodiments of the present application further provide an electronic device, including a memory and a processor, where the memory stores a computer program running on the processor, and when the processor executes the computer program, the processor implements all or part of the method steps in the above method.
The processor may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like, the processor being a control center of the computer device, and the various interfaces and lines connecting the various parts of the overall computer device.
The memory may be used to store computer programs and/or modules, and the processor implements various functions of the computer device by running or executing the computer programs and/or modules stored in the memory, and invoking data stored in the memory. The memory may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (e.g., a sound playing function, an image playing function, etc.); the storage data area may store data (e.g., audio data, video data, etc.) created according to the use of the handset. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, memory, plug-in hard disk, smart Media Card (SMC), secure Digital (SD) Card, flash Card (Flash Card), at least one disk storage device, flash memory device, or other volatile solid-state storage device.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, server, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, magnetic disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), servers and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (9)

1. An image illumination enhancement method with uneven brightness distribution is characterized by comprising the following steps:
acquiring low-light image data and normal-light image data;
performing iterative training on a generator network according to the low-illumination image data, distinguishing an overexposed region and an underexposed region in the low-illumination image data in each iterative training, so as to perform image illumination enhancement, and generating illumination enhancement image data;
comparing the illumination enhancement image data generated by each iterative training of the generator network with the normal illumination image data by utilizing a discriminator network;
and iteratively optimizing the generator network according to each comparison result until the comparison result meets the condition, taking the generator network obtained at that point as the final generator network, and using the final generator network to enhance the illumination of the low-illumination scene image to be processed.
2. The method for enhancing image illumination of uneven brightness distribution according to claim 1, wherein said step of differentiating between overexposed and underexposed areas in the low-light image data for image illumination enhancement in each iterative training, generating the illumination enhanced image data comprises the steps of:
the generator network comprises a U-Net generator and a color distribution pyramid network;
normalizing one training image in the low-illumination image data;
performing feature extraction on the normalized training image by using an encoder in the U-Net generator, and progressively reducing the image size of the extracted features through downsampling to obtain multi-scale feature maps;
generating a multi-scale color distribution map according to one training image which is not subjected to normalization processing by utilizing a color distribution pyramid network so as to distinguish an overexposed area and an underexposed area in the image;
and respectively selecting the feature images with the same size from the multi-scale feature images and the multi-scale color distribution map by using a decoder in the U-Net generator to perform image fusion.
3. The method for enhancing image illumination with uneven brightness distribution according to claim 2, wherein the calculation formula of the color distribution pyramid network is as follows:
wherein i, j and c are the horizontal, vertical and channel indices of the image, respectively; b is the index of the color histogram bin; and the block-assignment function returns, for a pixel (i, j), the indices of the pixels in the K×K block to which that pixel belongs.
4. The method for enhancing image illumination with uneven brightness distribution according to claim 1, wherein said step of comparing said illumination enhanced image data generated by each iterative training of a generator network with said normal illumination image data by using a discriminator network comprises the steps of:
the discriminator network comprises a global discriminator and a local discriminator;
evaluating the similarity between the illumination enhancement image data and the normal illumination image data generated by each iterative training by using the global discriminator;
and evaluating, by using the local discriminator, the degree of quality improvement of the illumination enhancement image data generated by each training iteration relative to the normal illumination image data.
5. The method for enhancing illumination of an image with uneven brightness distribution according to claim 4, wherein the calculation formula of the global discriminator is as follows:
wherein D(x, y) is the probability that the normal illumination image y is judged to be true by the global discriminator; D(x, G(x, z)) is the probability that the illumination enhanced image G(x, z) is discriminated as true; the first expectation term is the expected value of the global discriminator output for the normal illumination image; and the second expectation term is the expected value of the global discriminator output for the illumination enhanced image.
6. The method for enhancing image illumination with uneven brightness distribution according to claim 1, wherein the calculation formula of said local discriminator is as follows:
wherein C is the discriminator network; σ denotes the sigmoid activation function; the two input distributions are the distribution of the normal illumination images and the distribution of the illumination enhanced images; and the two loss terms compute, respectively, the local discriminator loss of the normal illumination image relative to the illumination enhanced image and the local discriminator loss of the illumination enhanced image relative to the normal illumination image.
7. An image illumination enhancement system having a non-uniform luminance distribution, comprising:
the image acquisition module is used for acquiring low-illumination image data and normal-illumination image data;
the iterative training module is in communication connection with the image acquisition module and is used for carrying out iterative training on the generator network according to the low-illumination image data, distinguishing an overexposed region and an underexposed region in the low-illumination image data in each iterative training so as to carry out image illumination enhancement and generate illumination enhancement image data;
the comparison module is in communication connection with the image acquisition module and the iterative training module and is used for comparing the illumination enhancement image data generated by each iterative training of the generator network with the normal illumination image data by utilizing the discriminator network; and,
and the final enhancement module is in communication connection with the comparison module and is used for iteratively optimizing the generator network according to each comparison result until the comparison result meets the condition, taking the generator network obtained at that point as the final generator network, and using the final generator network to enhance the illumination of the low-illumination scene image to be processed.
8. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the image illumination enhancement method of uneven brightness distribution according to any of claims 1 to 6.
9. An electronic device comprising a storage medium, a processor and a computer program stored in the storage medium and executable on the processor, characterized in that the processor implements the method of image illumination enhancement with non-uniform brightness distribution according to any of claims 1 to 6 when the computer program is run by the processor.
CN202311621045.6A 2023-11-30 2023-11-30 Image illumination enhancement method, system, medium and device with uneven brightness distribution Pending CN117333401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311621045.6A CN117333401A (en) 2023-11-30 2023-11-30 Image illumination enhancement method, system, medium and device with uneven brightness distribution

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311621045.6A CN117333401A (en) 2023-11-30 2023-11-30 Image illumination enhancement method, system, medium and device with uneven brightness distribution

Publications (1)

Publication Number Publication Date
CN117333401A    2024-01-02

Family

ID=89293768

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311621045.6A Pending CN117333401A (en) 2023-11-30 2023-11-30 Image illumination enhancement method, system, medium and device with uneven brightness distribution

Country Status (1)

Country Link
CN (1) CN117333401A (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051428A (en) * 2023-03-31 2023-05-02 南京大学 Deep learning-based combined denoising and superdivision low-illumination image enhancement method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051428A (en) * 2023-03-31 2023-05-02 南京大学 Deep learning-based combined denoising and superdivision low-illumination image enhancement method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HAOYUAN WANG: "Local Color Distributions Prior for Image Enhancement", ECCV 2022: Computer Vision, pages 343-359 *
JIANG Y: "EnlightenGAN: Deep Light Enhancement without Paired Supervision", IEEE Transactions on Image Processing, pages 2340-2349 *
JIARUN FU: "Low-light image enhancement base on brightness attention mechanism generative adversarial networks", Multimedia Tools and Applications, pages 1-25 *
小小将: "Cool image translation: from pix2pix to CycleGAN" (炫酷的图像转换：从pix2pix到CycleGAN), pages 1-6, retrieved from the Internet: https://zhuanlan.zhihu.com/p/93219297 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination