CN116823677B - Image enhancement method and device, storage medium and electronic equipment - Google Patents

Image enhancement method and device, storage medium and electronic equipment

Info

Publication number
CN116823677B
CN116823677B (Application No. CN202311089413.7A)
Authority
CN
China
Prior art keywords
pixel
value
image
parameter
enhanced
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311089413.7A
Other languages
Chinese (zh)
Other versions
CN116823677A (en)
Inventor
贲圣兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ainnovation Nanjing Technology Co ltd
Original Assignee
Ainnovation Nanjing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ainnovation Nanjing Technology Co ltd filed Critical Ainnovation Nanjing Technology Co ltd
Priority to CN202311089413.7A priority Critical patent/CN116823677B/en
Publication of CN116823677A publication Critical patent/CN116823677A/en
Application granted granted Critical
Publication of CN116823677B publication Critical patent/CN116823677B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration using histogram techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

Some embodiments of the present application provide an image enhancement method, apparatus, storage medium and electronic device. The method includes: determining a correction parameter corresponding to each peak in a histogram of the illumination component of an image to be enhanced; obtaining a brightness transformation value for each pixel from the correction parameters and the pixel distances in the illumination component, where a pixel distance is the distance from the pixel to the center pixel of each peak; and converting the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain an enhanced illumination component, which is fused with the reflection component of the image to be enhanced to obtain a target enhanced image. Some embodiments of the present application can improve the illumination enhancement effect on images with uneven illumination.

Description

Image enhancement method and device, storage medium and electronic equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and apparatus for image enhancement, a storage medium, and an electronic device.
Background
During image capture, factors such as weather, time of day and shooting angle can lead to insufficient ambient illumination, so the captured image is too dark and its visual quality is reduced.
Existing low-illumination image enhancement methods directly process the gray values of the image and raise the brightness of dark regions through related algorithms. However, because the prior art maps the whole image, noise is amplified while the brightness is raised, so color distortion, detail artifacts and similar phenomena easily occur, and the enhancement effect is poor.
Therefore, how to provide an image enhancement method with a better enhancement effect is a technical problem to be solved.
Disclosure of Invention
The application aims to provide an image enhancement method, an image enhancement device, a storage medium and electronic equipment.
In a first aspect, some embodiments of the present application provide an image enhancement method, comprising: determining a correction parameter corresponding to each peak in a histogram of the illumination component of an image to be enhanced; obtaining a brightness transformation value for each pixel from the correction parameters and the pixel distances in the illumination component, where a pixel distance is the distance from the pixel to the center pixel of each peak; and converting the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain an enhanced illumination component, which is fused with the reflection component of the image to be enhanced to obtain a target enhanced image.
According to some embodiments of the application, the brightness transformation value of each pixel is obtained from the correction parameters of the illumination component of the image to be enhanced and the pixel distances in the illumination component, and the initial brightness value of each pixel in the illumination component is then converted into its brightness transformation value to obtain the target enhanced image. Embodiments of the application can thus enhance the image with a better enhancement effect.
In some embodiments, determining the correction parameter corresponding to each peak in the histogram of the illumination component of the image to be enhanced includes: acquiring the histogram of the illumination component and finding the start point and end point of each peak in the histogram; calculating the central pixel brightness average between the start point and end point of each peak, where the central pixel brightness average defines the center pixel; and determining each correction parameter based on the central pixel brightness average.
Some embodiments of the application obtain the correction parameters by computing the central pixel brightness average between the start point and end point of each peak, providing data support for subsequent image enhancement.
In some embodiments, determining the correction parameters based on the central pixel brightness average includes: acquiring the gray probability offset of each peak; and evaluating an exponential function whose base is the gray probability offset of the peak and whose exponent is a parameter related to the central pixel brightness average, to obtain each correction parameter.
According to some embodiments of the application, each correction parameter is obtained from the gray probability offset and the central pixel brightness average, which is simple to compute and accurate.
In some embodiments, acquiring the gray probability offset of each peak includes: calculating the gray probability density of each peak; and taking the ratio of the gray probability density to a reference density to obtain the gray probability offset.
Some embodiments of the application obtain the gray probability offset from the gray probability density and the reference density, which keeps the computation simple.
In some embodiments, obtaining the brightness transformation value of each pixel from the correction parameters and the pixel distances in the illumination component includes: obtaining the distance from each pixel in the illumination component to the peak center of each peak to obtain the pixel distances; and weighting the correction parameters by the pixel distances to obtain the brightness transformation value of each pixel.
According to some embodiments of the application, the pixel brightness transformation values are obtained from the pixel distances and the correction parameters, providing effective data support for subsequent image enhancement.
In some embodiments, weighting the correction parameters by the pixel distances to obtain the brightness transformation value of each pixel includes: evaluating an exponential function whose base is the ratio of the pixel distance to the maximum gray value and whose exponent is the correction parameter, to obtain a first parameter; multiplying the first parameter by the transformation weight of each peak to obtain a second parameter; and determining the ratio of the sum of the second parameters over all peaks to the sum of the transformation weights of all peaks, then multiplying this ratio by the maximum gray value to obtain the brightness transformation value of each pixel.
Some embodiments of the present application achieve effective enhancement of the image by obtaining the brightness transformation value of each pixel.
In some embodiments, the transformation weight of each peak is obtained as follows: determining the difference between the pixel distance and the central pixel brightness average of the peak; taking the square of the ratio of this difference to the intermediate gray value as a third parameter; and evaluating an exponential function with base e and exponent equal to the negative of the third parameter, to obtain the transformation weight of the peak.
Some embodiments of the present application subsequently achieve effective enhancement of the image by obtaining the transformation weight of each peak.
In a second aspect, some embodiments of the present application provide an image enhancement apparatus, comprising: a determining module configured to determine the correction parameter corresponding to each peak in a histogram of the illumination component of an image to be enhanced; a processing module configured to obtain a brightness transformation value for each pixel from the correction parameters and the pixel distances in the illumination component, where a pixel distance is the distance from the pixel to the center pixel of each peak; and a conversion module configured to convert the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain an enhanced illumination component, which is fused with the reflection component of the image to be enhanced to obtain a target enhanced image.
In a third aspect, some embodiments of the application provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs a method according to any of the embodiments of the first aspect.
In a fourth aspect, some embodiments of the application provide an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor is operable to implement a method according to any of the embodiments of the first aspect when executing the program.
In a fifth aspect, some embodiments of the application provide a computer program product comprising a computer program, wherein the computer program, when executed by a processor, is adapted to carry out the method according to any of the embodiments of the first aspect.
Drawings
To illustrate the technical solutions of some embodiments of the present application more clearly, the drawings needed for some embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should not be construed as limiting the scope; other related drawings can be derived from them by those of ordinary skill in the art without inventive effort.
FIG. 1 is a system diagram of image enhancement provided by some embodiments of the application;
FIG. 2 is one of the flow charts of the method of image enhancement provided by some embodiments of the present application;
FIG. 3 is a second flowchart of a method for image enhancement according to some embodiments of the present application;
FIG. 4 is a block diagram of an apparatus for image enhancement according to some embodiments of the present application;
fig. 5 is a schematic diagram of an electronic device according to some embodiments of the present application.
Detailed Description
The technical solutions of some embodiments of the present application will be described below with reference to the drawings in some embodiments of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
In the related art, factors such as weather, time of day and shooting angle during image capture can result in insufficient ambient illumination, so the captured image is too dark, its visual quality is reduced, and the performance of subsequent visual analysis such as object detection and semantic segmentation is degraded.
Low-illumination image enhancement algorithms enhance low-brightness regions by image transformation or learning-based methods; they can effectively improve the visual quality of the image and can also serve as a preprocessing step that improves the performance and stability of subsequent visual analysis algorithms. Research in this field is of great importance for the application of computer vision technology. Existing low-illumination image enhancement algorithms can be divided into distribution-mapping methods and model-based methods.
Distribution-mapping methods process the gray values of the image directly, mapping small pixel values to a larger dynamic range by techniques such as histogram equalization and curve transformation, thereby raising the brightness of dark regions. These methods map the whole image without distinguishing semantic factors from illumination factors, and amplify noise while raising brightness, so color distortion, detail artifacts and similar phenomena often occur. The core of model-based methods is Retinex theory (retinal cortex theory), which represents the image as the product of an illumination component and a reflection component. The reflection component is determined by the reflective properties of the object and is independent of illumination intensity, so by enhancing the illumination component while keeping the reflection component unchanged, the illumination of the image can be enhanced without changing its semantic content. The ways of enhancing the illumination component can further be divided into traditional methods and learning-based methods. Traditional gamma transformation uses a fixed transformation coefficient and cannot adapt to the enhancement requirements of different images. Researchers have proposed adaptive gamma transformation, which computes the gamma coefficient adaptively from the pixel values and average brightness of the image; such algorithms use global information of the image, but perform poorly on unbalanced images in which bright and dark regions coexist. In recent years, researchers have also used deep learning networks to automatically learn illumination transformation functions from paired low-illumination and normal-illumination datasets. The illumination transformation learned by a deep learning method is tied to its training set, and for illumination scenes that do not appear in the training set it easily produces unrealistic results.
As can be seen from the related art, traditional low-illumination enhancement algorithms are prone to artifacts and unrealistic visual effects; for images with unbalanced illumination, the enhancement effect is poor and over-bright or over-dark regions easily appear. For deep-learning-based low-illumination enhancement algorithms, the learned illumination transformation is tied to the training set, and unrealistic enhancement results are easily produced for low-illumination scenes not present in the training set.
In view of this, some embodiments of the present application provide an image enhancement method that determines the correction parameter corresponding to each peak in the histogram of the illumination component of the image to be enhanced, and obtains the brightness transformation value of each pixel from the correction parameters and the pixel distances in the illumination component. Finally, the initial brightness value of each pixel in the illumination component is converted into its brightness transformation value to obtain the target enhanced image. Some embodiments of the application can thus improve the image enhancement effect and make it more realistic. In addition, the application enhances the illumination information of the image without changing its semantic content.
The overall composition of the image enhancement system provided by some embodiments of the present application is described below by way of example with reference to fig. 1.
As shown in fig. 1, some embodiments of the present application provide an image enhancement system including a decomposition module 110, an enhancement module 120 and a fusion module 130. The decomposition module 110 decomposes an input image to be enhanced I into an illumination component L and a reflection component R. The enhancement module 120 then enhances the illumination component to obtain an enhanced illumination component L_r. Finally, the fusion module 130 fuses the enhanced illumination component with the reflection component and outputs an illumination-enhanced image I_r (as a specific example of a target enhanced image).
In some embodiments of the present application, the image to be enhanced may be any type of image captured under insufficient ambient light, which is not limited here. The decomposition module 110 may decompose the image to be enhanced based on the Retinex model. It is understood that the decomposition module 110 and the fusion module 130 may be any image-processing models capable of image decomposition or fusion, respectively, which is not specifically limited here.
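To make the pipeline in fig. 1 concrete, a minimal Python/NumPy sketch of the decomposition and fusion steps is given below. The patent does not prescribe how the illumination component is estimated, so the Gaussian-blur estimate (including the scipy call and the sigma value) is an assumption; any Retinex-style split of the image I into an illumination component L and a reflection component R fits the description above.
```python
import numpy as np
from scipy.ndimage import gaussian_filter

def retinex_decompose(image, sigma=15.0, eps=1e-6):
    """Split a grayscale image into illumination L and reflectance R (I = L * R).

    The Gaussian-blur estimate of L is an assumption; the embodiments only
    require some Retinex-style decomposition (decomposition module 110).
    """
    image = image.astype(np.float64)
    L = gaussian_filter(image, sigma=sigma)  # smooth estimate of the illumination component
    R = image / (L + eps)                    # reflection component R = I / L
    return L, R

def fuse(L_enhanced, R):
    """Recombine the enhanced illumination component with the reflection
    component (fusion module 130)."""
    return np.clip(L_enhanced * R, 0, 255).astype(np.uint8)
```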
An implementation of image enhancement performed by enhancement module 120 provided by some embodiments of the present application is described below by way of example in conjunction with fig. 2.
Referring to fig. 2, fig. 2 is a flowchart of a method for enhancing an image according to some embodiments of the present application, where the method for enhancing an image includes:
s210, determining correction parameters corresponding to wave peaks in a histogram of the illumination component of the image to be enhanced.
For example, in some embodiments of the present application, after the image to be enhanced is decomposed into an illumination component and a reflection component by the decomposition module 110, the illumination component is subjected to enhancement processing.
The implementation of S210 is exemplarily set forth below.
S211, acquiring the histogram of the illumination component and finding the start point and end point of each peak in the histogram.
For example, in some embodiments of the application, the histogram H[s] of the illumination component, s = 0, ..., 255, is first computed. The peaks of the histogram are then found: peak_i = [h_i1, h_i2], where h_i1 and h_i2 are the start point and end point, respectively, of the i-th peak peak_i. Here i is a positive integer and the i-th peak denotes any one of the peaks.
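A minimal sketch of S211 is shown below. The text does not specify how the start point h_i1 and end point h_i2 of a peak are located, so splitting a lightly smoothed histogram at its local minima is an assumption, and the function name is illustrative only.
```python
import numpy as np

def histogram_peaks(L, smooth=5):
    """Return the histogram H[s], s = 0..255, of the illumination component and
    a list of peaks (h_i1, h_i2).

    Peak boundaries are taken at local minima of a smoothed histogram; the
    patent does not fix this rule, so it is an assumption.
    """
    H, _ = np.histogram(np.clip(L, 0, 255).astype(np.uint8), bins=256, range=(0, 256))
    Hs = np.convolve(H, np.ones(smooth) / smooth, mode="same")  # light smoothing
    valleys = [0] + [s for s in range(1, 255)
                     if Hs[s] <= Hs[s - 1] and Hs[s] <= Hs[s + 1]] + [255]
    peaks = []
    for a, b in zip(valleys[:-1], valleys[1:]):
        if H[a:b + 1].sum() > 0:  # keep non-empty segments as peaks
            peaks.append((a, b))
    return H, peaks
```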
S212, calculating the central pixel brightness average between the start point and end point of each peak, where the central pixel brightness average defines the center pixel.
For example, in some embodiments of the present application, the luminance average of the pixels within the ith peak (as one specific example of the center pixel luminance average) is taken as the center of the peak.
Specifically, the luminance average m_i is obtained by the following formula, where j is the gray value of the illumination component and H[j] is its histogram:
m_i = \frac{\sum_{j=h_{i1}}^{h_{i2}} j \, H[j]}{\sum_{j=h_{i1}}^{h_{i2}} H[j]}
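A small helper implementing the m_i formula above. The summation form is reconstructed from the textual description ("the luminance average of the pixels within the i-th peak") and the variables named in claim 2, so it should be read as the natural interpretation rather than a verbatim quotation.
```python
import numpy as np

def peak_center(H, h_i1, h_i2):
    """Central pixel brightness average m_i of peak i: the histogram-weighted
    mean gray value over [h_i1, h_i2] (reconstructed interpretation)."""
    j = np.arange(h_i1, h_i2 + 1)
    counts = H[h_i1:h_i2 + 1]
    return float((j * counts).sum() / max(counts.sum(), 1))
```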
S213, determining each correction parameter based on the central pixel brightness average value.
For example, in some embodiments of the present application, the gamma correction parameter of the i-th peak (as a specific example of each correction parameter) may be obtained from the luminance average of the i-th peak.
In some embodiments of the present application, S213 may include:
s2131, obtaining gray probability offsets of the peaks.
In some embodiments of the application, S2131 may include: calculating the gray probability density of each peak; and taking the ratio of the gray probability density to a reference density to obtain the gray probability offset.
For example, in some embodiments of the application, the average probability density mp_i of the i-th peak (as a specific example of the gray probability density of each peak) can be calculated from h_i1 and h_i2.
Specifically, the average probability density mp_i of the i-th peak is computed from the histogram values between h_i1 and h_i2, where W and T are the width and height, respectively, of the image to be enhanced.
From the average probability density of the i-th peak, the degree α_i to which the gray probability density of the i-th peak deviates from the average density can then be obtained; α_i is a specific example of the gray probability offset.
Specifically, α_i is obtained by taking the ratio of mp_i to the reference density.
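A sketch of S2131 under assumptions the text leaves open: the per-bin probability density is taken as H[j] / (W * T), mp_i as the mean of those densities over the peak, and the reference density as the uniform density 1/256.
```python
import numpy as np

def gray_probability_offset(H, h_i1, h_i2, W, T, reference_density=1.0 / 256):
    """Average probability density mp_i of peak i and its offset alpha_i.

    Taking mp_i as the mean of H[j] / (W * T) over the peak and the reference
    density as 1/256 are assumptions; the exact formulas are not reproduced
    in the text.
    """
    p = H[h_i1:h_i2 + 1].astype(np.float64) / (W * T)  # per-bin gray probability
    mp_i = p.mean()                                    # average probability density of the peak
    alpha_i = mp_i / reference_density                 # deviation from the reference density
    return mp_i, alpha_i
```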
S2132, evaluating an exponential function whose base is the gray probability offset of each peak and whose exponent is a parameter related to the central pixel brightness average, to obtain each correction parameter.
For example, in some embodiments of the present application, the gamma correction parameter γ_i of the i-th peak may be obtained from the average probability density of the i-th peak and the degree α_i to which the gray probability density of the i-th peak deviates from the average density.
Specifically, γ_i is an exponential function with α_i as the base and an exponent related to the central pixel brightness average m_i.
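A placeholder for S2132. The base of the exponential is α_i, as stated above, but the exact exponent ("a parameter related to the central pixel brightness average") is not reproduced in the text, so m_i / 255 is used here purely as an illustrative stand-in.
```python
def gamma_parameter(alpha_i, m_i, max_gray=255.0):
    """Gamma correction parameter of peak i: an exponential with base alpha_i.

    The exponent m_i / max_gray is a placeholder assumption standing in for
    the 'parameter related to the central pixel brightness average' of S2132.
    """
    return alpha_i ** (m_i / max_gray)
```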
S220, obtaining the brightness transformation value of each pixel from the correction parameters and the pixel distances in the illumination component, where a pixel distance is the distance from the pixel to the center pixel of each peak.
For example, in some embodiments of the present application, the brightness transformation value of the s-th pixel may be obtained from the gamma correction parameter of the i-th peak and the distance of the s-th pixel, where the s-th pixel denotes any pixel in the illumination component.
In some embodiments of the present application, S220 may include: evaluating an exponential function whose base is the ratio of the pixel distance to the maximum gray value and whose exponent is the correction parameter, to obtain a first parameter; multiplying the first parameter by the transformation weight of each peak to obtain a second parameter; and determining the ratio of the sum of the second parameters over all peaks to the sum of the transformation weights of all peaks, then multiplying this ratio by the maximum gray value to obtain the brightness transformation value of each pixel.
For example, in some embodiments of the present application, an exponential function whose base is the ratio of the distance of the s-th pixel to the maximum gray value and whose exponent is the gamma correction parameter of the i-th peak is evaluated to obtain the first parameter. The first parameter is multiplied by the transformation weight of the i-th peak to obtain the second parameter. The ratio of the sum of the second parameters over all peaks to the sum of the transformation weights is then determined and multiplied by the maximum gray value to obtain the brightness transformation value of the s-th pixel.
Specifically, the brightness transformation value f(s) of the s-th pixel is obtained by the following formula:
f(s) = 255 \cdot \frac{\sum_{i=1}^{M} w_i \, (d_s / 255)^{\gamma_i}}{\sum_{i=1}^{M} w_i}
where the first parameter is (d_s / 255)^{\gamma_i}, with d_s the distance of the s-th pixel and 255 the maximum gray value; the second parameter is w_i \, (d_s / 255)^{\gamma_i}; w_i is the weight of the gamma transformation corresponding to the i-th peak (i.e., the transformation weight of the i-th peak); and M is the number of peaks.
In some embodiments of the present application, the transformation weight of each peak is obtained as follows: determining the difference between the pixel distance and the central pixel brightness average of the peak; taking the square of the ratio of this difference to the intermediate gray value as a third parameter; and evaluating an exponential function with base e and exponent equal to the negative of the third parameter, to obtain the transformation weight of the peak.
Specifically, the weight w_i of the gamma transformation corresponding to the i-th peak is obtained by the following formula:
w_i = e^{-\left(\frac{d_s - m_i}{128}\right)^{2}}
where the third parameter is \left(\frac{d_s - m_i}{128}\right)^{2} and 128 is the intermediate gray value. As the formula shows, the farther the pixel lies from m_i, the smaller the influence of that gray-concentration region on the pixel and the smaller the corresponding weight. Here e is the natural constant, the base of the natural logarithm.
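The transformation of S220 can be sketched as follows, reading the per-pixel "distance" d_s as the gray value s of the pixel in the illumination component, which is how the weight formula above behaves; this reading, along with the function and parameter names, is an assumption.
```python
import numpy as np

def luminance_transform(s, gammas, centers, max_gray=255.0, mid_gray=128.0):
    """Brightness transformation value f(s) for gray value s.

    gammas[i]  : gamma correction parameter of peak i
    centers[i] : central pixel brightness average m_i of peak i

    first parameter : (s / max_gray) ** gamma_i
    weight          : w_i = exp(-((s - m_i) / mid_gray) ** 2)
    f(s)            : max_gray * sum(w_i * first_i) / sum(w_i)

    Treating the 'distance' of the pixel as its gray value s is an
    interpretation of the translated text, not a quoted formula.
    """
    gammas = np.asarray(gammas, dtype=np.float64)
    centers = np.asarray(centers, dtype=np.float64)
    w = np.exp(-(((s - centers) / mid_gray) ** 2))  # transformation weight of each peak
    first = (s / max_gray) ** gammas                # first parameter for each peak
    second = w * first                              # second parameter for each peak
    return max_gray * second.sum() / max(w.sum(), 1e-12)
```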
S230, converting the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain an enhanced illumination component, where the enhanced illumination component is fused with the reflection component of the image to be enhanced to obtain the target enhanced image.
For example, in some embodiments of the present application, every pixel in the illumination component is enhanced in the way described above for an arbitrary pixel, yielding the enhanced illumination component. Finally, the fusion module 130 may fuse the enhanced illumination component with the reflection component to obtain the illumination-enhanced image.
The specific process of image enhancement provided by some embodiments of the present application is described below by way of example in conjunction with fig. 3.
Referring to fig. 3, fig. 3 is a flowchart illustrating a method for enhancing an image according to some embodiments of the present application.
The above-described process is exemplarily set forth below.
S310, the decomposition module 110 decomposes the image to be enhanced to obtain the illumination component and the reflection component.
S320, the enhancement module 120 acquires a histogram of the illumination component.
S330, the enhancement module 120 determines correction parameters corresponding to the peaks in the histogram of the illumination component.
S340, the enhancement module 120 obtains the distance from each pixel in the illumination component to the peak center of each peak, yielding the pixel distances.
S350, the enhancement module 120 weights the correction parameters by the pixel distances to obtain the brightness transformation value of each pixel.
S360, the enhancement module 120 converts the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain the enhanced illumination component.
And S370, the fusion module 130 fuses the enhanced illumination component and the reflection component to obtain an illumination enhanced image.
It should be noted that, the specific implementation process of S310 to S370 may refer to the method embodiments provided above, and detailed descriptions are omitted here appropriately to avoid repetition.
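Putting the pieces together, an end-to-end sketch of S310 to S370, built from the helper sketches above and therefore inheriting all of their assumptions, might look like this:
```python
import numpy as np

def enhance_image(image):
    """Illustrative pipeline corresponding to fig. 3 (all helper functions are
    the hedged sketches defined earlier)."""
    L, R = retinex_decompose(image)            # S310: decompose into L and R
    H, peaks = histogram_peaks(L)              # S320: histogram of the illumination component
    T, W = image.shape[:2]                     # T: height, W: width of the image to be enhanced
    gammas, centers = [], []
    for h_i1, h_i2 in peaks:                   # S330: correction parameter per peak
        m_i = peak_center(H, h_i1, h_i2)
        _, alpha_i = gray_probability_offset(H, h_i1, h_i2, W, T)
        gammas.append(gamma_parameter(alpha_i, m_i))
        centers.append(m_i)
    # S340-S360: brightness transformation values, applied as a 256-entry look-up table
    lut = np.array([luminance_transform(s, gammas, centers) for s in range(256)])
    L_enhanced = lut[np.clip(L, 0, 255).astype(np.uint8)]
    return fuse(L_enhanced, R)                 # S370: fuse with the reflection component
```
Because f(s) depends only on the gray value in this sketch, it is evaluated once as a 256-entry look-up table and then applied to the whole illumination component, which keeps the per-image cost dominated by the decomposition step.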
Referring to fig. 4, fig. 4 is a block diagram illustrating an apparatus for image enhancement according to some embodiments of the present application. It should be understood that the image enhancement apparatus corresponds to the above method embodiments, and is capable of performing the steps involved in the above method embodiments, and specific functions of the image enhancement apparatus may be referred to the above description, and detailed descriptions thereof are omitted herein as appropriate to avoid redundancy.
The image enhancement apparatus of fig. 4 includes at least one software functional module that can be stored in memory in the form of software or firmware or built into the apparatus. The apparatus comprises: a determining module 410 configured to determine the correction parameter corresponding to each peak in the histogram of the illumination component of an image to be enhanced; a processing module 420 configured to obtain the brightness transformation value of each pixel from the correction parameters and the pixel distances in the illumination component, where a pixel distance is the distance from the pixel to the center pixel of each peak; and a conversion module 430 configured to convert the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain an enhanced illumination component, which is fused with the reflection component of the image to be enhanced to obtain a target enhanced image.
In some embodiments of the application, the determining module 410 acquires the histogram of the illumination component and finds the start point and end point of each peak in the histogram; calculates the central pixel brightness average between the start point and end point of each peak, where the central pixel brightness average defines the center pixel; and determines each correction parameter based on the central pixel brightness average.
In some embodiments of the present application, the determining module 410 acquires the gray probability offset of each peak, and evaluates an exponential function whose base is the gray probability offset of the peak and whose exponent is a parameter related to the central pixel brightness average, to obtain each correction parameter.
In some embodiments of the application, the determining module 410 calculates the gray probability density of each peak and takes the ratio of the gray probability density to the reference density to obtain the gray probability offset.
In some embodiments of the present application, the processing module 420 obtains the distance from each pixel in the illumination component to the peak center of each peak to obtain the pixel distances, and weights the correction parameters by the pixel distances to obtain the brightness transformation value of each pixel.
In some embodiments of the present application, the processing module 420 evaluates an exponential function whose base is the ratio of the pixel distance to the maximum gray value and whose exponent is the correction parameter, to obtain a first parameter; multiplies the first parameter by the transformation weight of each peak to obtain a second parameter; and determines the ratio of the sum of the second parameters over all peaks to the sum of the transformation weights of all peaks, then multiplies this ratio by the maximum gray value to obtain the brightness transformation value of each pixel.
In some embodiments of the present application, the processing module 420 determines the difference between the pixel distance and the central pixel brightness average of each peak; takes the square of the ratio of this difference to the intermediate gray value as a third parameter; and evaluates an exponential function with base e and exponent equal to the negative of the third parameter, to obtain the transformation weight of each peak.
It will be clear to those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding procedure in the foregoing method for the specific working procedure of the apparatus described above, and this will not be repeated here.
Some embodiments of the present application also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the operations of the method according to any of the embodiments described above.
Some embodiments of the present application also provide a computer program product comprising a computer program which, when executed by a processor, can implement the operations of the method according to any of the embodiments described above.
As shown in fig. 5, some embodiments of the present application provide an electronic device 500, the electronic device 500 comprising: memory 510, processor 520, and a computer program stored on memory 510 and executable on processor 520, wherein processor 520 may implement a method as in any of the embodiments described above when reading the program from memory 510 and executing the program via bus 530.
Processor 520 may process digital signals and may include various computing structures, such as a complex instruction set computer architecture, a reduced instruction set computer architecture, or an architecture implementing a combination of instruction sets. In some examples, processor 520 may be a microprocessor.
Memory 510 may be used for storing instructions to be executed by processor 520 or data related to execution of the instructions. Such instructions and/or data may include code to implement some or all of the functions of one or more of the modules described in embodiments of the present application. The processor 520 of the disclosed embodiments may be configured to execute instructions in the memory 510 to implement the methods shown above. Memory 510 includes dynamic random access memory, static random access memory, flash memory, optical memory, or other memory known to those skilled in the art.
The above description is only an example of the present application and is not intended to limit its scope; various modifications and variations will be apparent to those skilled in the art. Any modification, equivalent replacement or improvement made within the spirit and principle of the present application shall be included in its protection scope.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (7)

1. A method of image enhancement, comprising:
determining a correction parameter corresponding to each peak in a histogram of the illumination component of an image to be enhanced;
obtaining a brightness transformation value of each pixel from the correction parameters and the pixel distances in the illumination component, wherein a pixel distance is the distance from the pixel to the center pixel of each peak;
converting the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain an enhanced illumination component, wherein the enhanced illumination component is fused with the reflection component of the image to be enhanced to obtain a target enhanced image;
wherein obtaining the brightness transformation value of each pixel from the correction parameters and the pixel distances in the illumination component comprises:
obtaining the distance from each pixel in the illumination component to the peak center of each peak to obtain the pixel distances; and weighting the correction parameters by the pixel distances to obtain the brightness transformation value of each pixel;
wherein weighting the correction parameters by the pixel distances to obtain the brightness transformation value of each pixel comprises: evaluating an exponential function whose base is the ratio of the pixel distance to the maximum gray value and whose exponent is the correction parameter, to obtain a first parameter; multiplying the first parameter by the transformation weight of each peak to obtain a second parameter; and determining the ratio of the sum of the second parameters over all peaks to the sum of the transformation weights of all peaks, then multiplying this ratio by the maximum gray value to obtain the brightness transformation value of each pixel;
wherein the transformation weight of each peak is obtained as follows: determining the difference between the pixel distance and the central pixel brightness average of the peak; taking the square of the ratio of this difference to the intermediate gray value as a third parameter; and evaluating an exponential function with base e and exponent equal to the negative of the third parameter, to obtain the transformation weight of the peak.
2. The method of claim 1, wherein determining the correction parameter corresponding to each peak in the histogram of the illumination component of the image to be enhanced comprises:
acquiring the histogram of the illumination component and finding the start point and end point of each peak in the histogram;
calculating the central pixel brightness average between the start point and end point of each peak;
determining the correction parameters based on the central pixel brightness average;
wherein the central pixel brightness average m_i is calculated from the histogram H[j] of the illumination component over the gray values j between the start point h_i1 and the end point h_i2 of the i-th peak peak_i.
3. The method of claim 2, wherein determining the correction parameters based on the central pixel brightness average comprises:
acquiring the gray probability offset of each peak;
evaluating an exponential function whose base is the gray probability offset of the peak and whose exponent is a parameter related to the central pixel brightness average, to obtain each correction parameter;
wherein the parameter related to the central pixel brightness average is determined from the central pixel brightness average m_i.
4. The method of claim 3, wherein acquiring the gray probability offset of each peak comprises:
calculating the gray probability density of each peak;
and taking the ratio of the gray probability density to a reference density to obtain the gray probability offset.
5. An apparatus for image enhancement, comprising:
a determining module configured to determine the correction parameter corresponding to each peak in a histogram of the illumination component of an image to be enhanced;
a processing module configured to obtain a brightness transformation value of each pixel from the correction parameters and the pixel distances in the illumination component, wherein a pixel distance is the distance from the pixel to the center pixel of each peak;
a conversion module configured to convert the initial brightness value of each pixel in the illumination component into its brightness transformation value to obtain an enhanced illumination component, wherein the enhanced illumination component is fused with the reflection component of the image to be enhanced to obtain a target enhanced image;
wherein the processing module is configured to obtain the distance from each pixel in the illumination component to the peak center of each peak to obtain the pixel distances, and to weight the correction parameters by the pixel distances to obtain the brightness transformation value of each pixel;
the processing module is configured to evaluate an exponential function whose base is the ratio of the pixel distance to the maximum gray value and whose exponent is the correction parameter, to obtain a first parameter; multiply the first parameter by the transformation weight of each peak to obtain a second parameter; determine the ratio of the sum of the second parameters over all peaks to the sum of the transformation weights of all peaks, and multiply this ratio by the maximum gray value to obtain the brightness transformation value of each pixel; determine the difference between the pixel distance and the central pixel brightness average of each peak; take the square of the ratio of this difference to the intermediate gray value as a third parameter; and evaluate an exponential function with base e and exponent equal to the negative of the third parameter, to obtain the transformation weight of each peak.
6. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program, wherein the computer program when run by a processor performs the method according to any of claims 1-4.
7. An electronic device comprising a memory, a processor, and a computer program stored on the memory and running on the processor, wherein the computer program when run by the processor performs the method of any one of claims 1-4.
CN202311089413.7A 2023-08-28 2023-08-28 Image enhancement method and device, storage medium and electronic equipment Active CN116823677B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311089413.7A CN116823677B (en) 2023-08-28 2023-08-28 Image enhancement method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311089413.7A CN116823677B (en) 2023-08-28 2023-08-28 Image enhancement method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN116823677A CN116823677A (en) 2023-09-29
CN116823677B true CN116823677B (en) 2023-11-10

Family

ID=88114799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311089413.7A Active CN116823677B (en) 2023-08-28 2023-08-28 Image enhancement method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116823677B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117036204B (en) * 2023-10-09 2024-02-02 东莞市华复实业有限公司 Image quality enhancement method for visual interphone

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530847A (en) * 2013-09-24 2014-01-22 电子科技大学 Infrared image enhancing method
CN106780375A (en) * 2016-12-02 2017-05-31 南京邮电大学 A kind of image enchancing method under low-light (level) environment
CN107451969A (en) * 2017-07-27 2017-12-08 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN110580690A (en) * 2019-09-02 2019-12-17 杭州雄迈集成电路技术有限公司 image enhancement method for identifying peak value transformation nonlinear curve
CN110782400A (en) * 2019-09-12 2020-02-11 南宁师范大学 Self-adaptive uniform illumination realization method and device
CN112507930A (en) * 2020-12-16 2021-03-16 华南理工大学 Method for improving human face video heart rate detection by using illumination balancing method
CN112508809A (en) * 2020-11-27 2021-03-16 湖南傲英创视信息科技有限公司 Low-illumination image/video enhancement method and system
CN114693739A (en) * 2022-04-11 2022-07-01 中国矿业大学 Downhole drill rod counting method and device based on visual tracking algorithm
CN114820417A (en) * 2021-01-29 2022-07-29 深圳市万普拉斯科技有限公司 Image anomaly detection method and device, terminal device and readable storage medium
CN115409745A (en) * 2022-10-31 2022-11-29 深圳市亿康医疗技术有限公司 CT image enhancement method applied to radiotherapy preparation
CN116433525A (en) * 2023-04-18 2023-07-14 南京邮电大学 Underwater image defogging method based on edge detection function variation model

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040208385A1 (en) * 2003-04-18 2004-10-21 Medispectra, Inc. Methods and apparatus for visually enhancing images
EP1738198A4 (en) * 2004-02-13 2011-05-25 Technion Res & Dev Foundation Enhanced underwater imaging
CN111524071B (en) * 2020-04-24 2022-09-16 安翰科技(武汉)股份有限公司 Capsule endoscope image splicing method, electronic device and readable storage medium

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530847A (en) * 2013-09-24 2014-01-22 电子科技大学 Infrared image enhancing method
CN106780375A (en) * 2016-12-02 2017-05-31 南京邮电大学 A kind of image enchancing method under low-light (level) environment
CN107451969A (en) * 2017-07-27 2017-12-08 广东欧珀移动通信有限公司 Image processing method, device, mobile terminal and computer-readable recording medium
CN110580690A (en) * 2019-09-02 2019-12-17 杭州雄迈集成电路技术有限公司 image enhancement method for identifying peak value transformation nonlinear curve
CN110782400A (en) * 2019-09-12 2020-02-11 南宁师范大学 Self-adaptive uniform illumination realization method and device
CN112508809A (en) * 2020-11-27 2021-03-16 湖南傲英创视信息科技有限公司 Low-illumination image/video enhancement method and system
CN112507930A (en) * 2020-12-16 2021-03-16 华南理工大学 Method for improving human face video heart rate detection by using illumination balancing method
CN114820417A (en) * 2021-01-29 2022-07-29 深圳市万普拉斯科技有限公司 Image anomaly detection method and device, terminal device and readable storage medium
CN114693739A (en) * 2022-04-11 2022-07-01 中国矿业大学 Downhole drill rod counting method and device based on visual tracking algorithm
CN115409745A (en) * 2022-10-31 2022-11-29 深圳市亿康医疗技术有限公司 CT image enhancement method applied to radiotherapy preparation
CN116433525A (en) * 2023-04-18 2023-07-14 南京邮电大学 Underwater image defogging method based on edge detection function variation model

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Improving apple fruit firmness predictions by effective correction of multispectral scattering images; Yankun Peng et al.; Postharvest Biology and Technology; Vol. 41, No. 3; pp. 266-274 *
Weak-Light Image Enhancement Method Based on Adaptive Local Gamma Transform and Color Compensation; Wencheng Wang et al.; Journal of Sensors; Vol. 2021; pp. 1-18 *
Research on low-illumination image enhancement algorithms; Liu Qing; China Master's Theses Full-text Database, Information Science and Technology; No. 8; pp. I138-746 *
Low-illumination image enhancement based on a light scattering attenuation model; Shi Jihao et al.; Optics and Precision Engineering; Vol. 31, No. 8; pp. 1244-1255 *
Research and application of machine-vision-based steel plate position tracking in the cooling bed area of hot rolling; Wang Zi; China Master's Theses Full-text Database, Engineering Science and Technology I; No. 5; pp. B022-1073 *
A segmentation method for double-row license plates based on adaptive threshold projection; Ma Yongjie et al.; Computer Engineering & Science; Vol. 42, No. 9; pp. 1616-1624 *

Also Published As

Publication number Publication date
CN116823677A (en) 2023-09-29

Similar Documents

Publication Publication Date Title
CN112837303A (en) Defect detection method, device, equipment and medium for mold monitoring
CN116823677B (en) Image enhancement method and device, storage medium and electronic equipment
CN111079764B (en) Low-illumination license plate image recognition method and device based on deep learning
CN112614136A (en) Infrared small target real-time instance segmentation method and device
CN113327206B (en) Image fuzzy processing method of intelligent power transmission line inspection system based on artificial intelligence
Wang et al. Low-light image enhancement based on deep learning: a survey
CN117218026B (en) Infrared image enhancement method and device
CN112036253B (en) Face key point positioning method based on deep learning
He et al. A night low‐illumination image enhancement model based on small probability area filtering and lossless mapping enhancement
CN115423723B (en) Self-adaptive railway wagon anomaly detection training set image enhancement method
US8311358B2 (en) Method and system for image extraction and identification
CN113989127A (en) Image contrast adjusting method, system, equipment and computer storage medium
CN116129417A (en) Digital instrument reading detection method based on low-quality image
Zhou et al. An improved algorithm using weighted guided coefficient and union self‐adaptive image enhancement for single image haze removal
CN113393394B (en) Low-illumination gray level image enhancement method and device based on gamma conversion and storage medium
CN115761241A (en) Image enhancement method and application thereof
CN109658357A (en) A kind of denoising method towards remote sensing satellite image
CN111815658B (en) Image recognition method and device
CN114998186A (en) Image processing-based method and system for detecting surface scab defect of copper starting sheet
CN111275642A (en) Low-illumination image enhancement method based on significant foreground content
CN115880300B (en) Image blurring detection method, device, electronic equipment and storage medium
JP6060638B2 (en) Object identification device, learning sample creation device, and program
CN116740231A (en) Image processing method, device, storage medium, electronic equipment and product
CN118037610B (en) Water gauge image distortion correction method and system for complex environment
CN116188506A (en) Method and system for acquiring infrared image edge

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP02 Change in the address of a patent holder
CP02 Change in the address of a patent holder

Address after: 19 / F, building B, Xingzhi science and Technology Park, 6 Xingzhi Road, Nanjing Economic and Technological Development Zone, Jiangsu Province, 210000

Patentee after: AINNOVATION (NANJING) TECHNOLOGY Co.,Ltd.

Address before: Floor 19, building B, Xingzhi science and Technology Park, 6 Xingzhi Road, Jiangning Economic and Technological Development Zone, Nanjing, Jiangsu Province

Patentee before: AINNOVATION (NANJING) TECHNOLOGY Co.,Ltd.