CN115578284A - Multi-scene image enhancement method and system - Google Patents

Info

Publication number
CN115578284A
Authority
CN
China
Prior art keywords
illumination
component
area
mask
over
Prior art date
Legal status
Pending
Application number
CN202211320708.6A
Other languages
Chinese (zh)
Inventor
徐亮
冯雨
华凤
汤汉兵
朱文康
Current Assignee
Core Microelectronics Technology Zhuhai Co ltd
Original Assignee
Core Microelectronics Technology Zhuhai Co ltd
Priority date
Filing date
Publication date
Application filed by Core Microelectronics Technology Zhuhai Co ltd filed Critical Core Microelectronics Technology Zhuhai Co ltd
Publication of CN115578284A (legal status: pending)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a multi-scene image enhancement method and system. The method comprises the following steps: acquiring an image to be enhanced and converting it into HSV space to obtain a hue component map, a saturation component map and a brightness component map; performing multi-scale filtering and weighted fusion on the brightness component map to obtain a weighted illumination component; performing multi-scale illumination correction and overexposure correction based on the brightness component map and the weighted illumination component to obtain three illumination correction components; performing multi-level binarization to obtain areas of different illumination; obtaining a mask for each illumination area, performing weighted fusion of the correction components under these masks to obtain a fused brightness component, and converting back into RGB space to obtain the enhanced image. The invention achieves accurate division and effective adjustment of areas of different brightness, makes the pixels at the edges of different illumination areas smoother and more realistic, and improves image enhancement quality.

Description

Multi-scene image enhancement method and system
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a multi-scene image enhancement method and system.
Background
Digital image processing systems are widely used across industries. However, because illumination conditions differ between scenes and hardware quality is limited, acquired images often suffer from overexposure and underexposure and cannot meet the requirements of downstream image processing; an image enhancement method is therefore needed to improve image quality.
Image enhancement improves the quality of the original picture by suppressing interference factors in the image. It not only provides a better visual experience but also improves the reliability and robustness of a vision system, so that the enhanced image better meets the needs of an image processing system. Although existing image enhancement algorithms can enhance images captured under poor illumination, their enhancement of multi-scene images still needs improvement.
Disclosure of Invention
In view of the above drawbacks or needs for improvement in the prior art, the present invention provides a method and system for enhancing multi-scene images, which can effectively solve the above problems.
To achieve the above object, according to an aspect of the present invention, there is provided a multi-scene image enhancement method, including:
acquiring an image to be enhanced, and converting it into HSV space to obtain a hue component map, a saturation component map and a brightness component map;
performing multi-scale filtering based on the brightness component map to obtain a plurality of illumination components of different scales, and performing weighted fusion of these illumination components to obtain a weighted illumination component;
performing multi-scale illumination correction based on the brightness component map and the weighted illumination component to obtain a first illumination correction component and a second illumination correction component with different correction amplitudes, and performing overexposure correction based on the brightness component map to obtain a third illumination correction component;
performing multi-level binarization based on the brightness information of the illumination components to obtain an over-illumination dark area, a normal illumination area and an over-illumination bright area;
filtering the over-illumination dark area, the normal illumination area and the over-illumination bright area to obtain an over-illumination dark area mask, a normal illumination area mask and an over-illumination bright area mask;
performing weighted fusion based on the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused brightness component;
and converting the fused brightness component, the hue component map and the saturation component map back into RGB space to obtain an enhanced image.
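The filtering step that turns the three binary illumination areas into masks can be sketched as follows. The patent does not name the filter, so a separable Gaussian blur is assumed here purely for illustration; `smooth_mask` and its parameters are hypothetical names.

```python
import numpy as np

def gaussian_1d(sigma, radius):
    """1-D Gaussian weights, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

def smooth_mask(binary_mask, sigma=2.0):
    """Blur a 0/1 area map into a soft [0, 1] mask with a separable
    Gaussian (the patent does not specify the filter; Gaussian is assumed)."""
    radius = int(3.0 * sigma)
    k = gaussian_1d(sigma, radius)
    blur = lambda a: np.convolve(np.pad(a, radius, mode="edge"), k, mode="valid")
    m = np.apply_along_axis(blur, 1, binary_mask.astype(np.float64))  # rows
    m = np.apply_along_axis(blur, 0, m)                               # columns
    return np.clip(m, 0.0, 1.0)
```

The soft values near area boundaries are what allow the later weighted fusion to blend corrections smoothly instead of producing hard seams.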
In some embodiments, performing weighted fusion based on the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused brightness component includes:
performing weighted fusion of the first illumination correction component, the third illumination correction component and the brightness component map with the over-illumination dark area mask, the normal illumination area mask, the over-illumination bright area mask and their respective first adjusting coefficients, to obtain a first weighted brightness component;
performing weighted fusion of the second illumination correction component, the third illumination correction component and the brightness component map with the over-illumination dark area mask, the normal illumination area mask, the over-illumination bright area mask and their respective second adjusting coefficients, to obtain a second weighted brightness component;
performing weighted fusion of the first weighted brightness component and the second weighted brightness component to obtain the fused brightness component;
wherein the first adjusting coefficient of the over-illumination dark area mask is smaller than its second adjusting coefficient, the first adjusting coefficient of the normal illumination area mask is smaller than its second adjusting coefficient, and the correction amplitude of the first illumination correction component is larger than that of the second illumination correction component.
In some embodiments, the first weighted brightness component is derived according to the following formula:
I_w1 = c_11 · M_1 · I_v1' + c_21 · M_2 · I_v + c_13 · M_3 · I_v3'
and the second weighted brightness component is derived according to the following formula:
I_w2 = c_12 · M_1 · I_v2' + c_22 · M_2 · I_v + c_23 · M_3 · I_v3'
wherein M_1, M_2 and M_3 denote the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask respectively; I_v1', I_v2' and I_v3' denote the first, second and third illumination correction components; I_v denotes the brightness component map; c_11 denotes the first adjusting coefficient of the over-illumination dark area mask and c_12 its second adjusting coefficient; c_21 denotes the first adjusting coefficient of the normal illumination area mask and c_22 its second adjusting coefficient; c_13 denotes the first adjusting coefficient of the over-illumination bright area mask and c_23 its second adjusting coefficient.
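A minimal NumPy sketch of this mask-weighted fusion. Because the patent's own equations are reproduced only as images, the form below is an assumption based on the textual description: each area mask gates its corresponding correction component, scaled by that area's adjusting coefficient, and the three products are summed.

```python
import numpy as np

def fuse_regions(corr_dark, base, corr_bright,
                 m_dark, m_normal, m_bright,
                 c_dark, c_normal, c_bright):
    """Assumed form of the weighted fusion: each region mask gates its
    correction component, scaled by that region's adjusting coefficient."""
    return (c_dark * m_dark * np.asarray(corr_dark)
            + c_normal * m_normal * np.asarray(base)
            + c_bright * m_bright * np.asarray(corr_bright))
```

With soft (blurred) masks, pixels near area boundaries receive a blend of two corrections, which is what keeps the edges between illumination areas smooth.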
In some embodiments, the performing multi-level binarization on the luminance information based on the illumination component to obtain an over-illuminated dark region, a normally-illuminated region, and an over-illuminated bright region includes:
determining a smallest scale illumination component of a plurality of the differently-scaled illumination components;
obtaining a plurality of inter-class variances based on the brightness information of the minimum-scale illumination component and a plurality of groups of thresholds, wherein each group of thresholds comprises a first threshold and a second threshold, and the first threshold is smaller than the second threshold;
determining a first threshold value and a second threshold value corresponding to the maximum value in the multiple inter-class variances as a final first threshold value and a final second threshold value;
and segmenting the minimum scale illumination component based on the final first threshold and the final second threshold to obtain the over-illumination dark area, the normal illumination area and the over-illumination bright area.
In some embodiments, said deriving a plurality of inter-class variances based on the luminance information of the smallest scale illumination component and a plurality of sets of thresholds comprises:
obtaining a plurality of said inter-class variances according to the following formula:
g(i, j) = w_1·(μ_1 − μ)² + w_2·(μ_2 − μ)² + w_3·(μ_3 − μ)²
wherein i denotes the first threshold, j denotes the second threshold, i < j, and g(i, j) denotes the between-class variance of the minimum-scale illumination component at thresholds i and j; w_1 and μ_1 denote the proportion of pixels and the average gray level of the minimum-scale illumination component in the range 0 to i, w_2 and μ_2 denote those in the range i to j, and w_3 and μ_3 denote those in the range j to 255; μ denotes the average gray level of the whole minimum-scale illumination component.
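The exhaustive search over threshold pairs implied by this formula can be sketched as follows (the boundary conventions for the three gray ranges are an assumption, since the original equations are only shown as images):

```python
import numpy as np

def best_two_thresholds(gray):
    """Exhaustively search threshold pairs (i, j), i < j, maximizing the
    three-class between-class variance g(i, j); gray is an integer array
    with values in [0, 255]."""
    hist = np.bincount(np.asarray(gray).ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()                     # probability of each gray level
    levels = np.arange(256, dtype=np.float64)
    cum_w = np.cumsum(p)                      # cumulative class weight
    cum_m = np.cumsum(p * levels)             # cumulative first moment
    mu = cum_m[-1]                            # global mean gray level
    best_g, best_ij = -1.0, (0, 0)
    for i in range(255):
        for j in range(i + 1, 255):
            w1, w2, w3 = cum_w[i], cum_w[j] - cum_w[i], 1.0 - cum_w[j]
            if w1 == 0.0 or w2 == 0.0 or w3 == 0.0:
                continue                      # skip empty classes
            m1 = cum_m[i] / w1
            m2 = (cum_m[j] - cum_m[i]) / w2
            m3 = (cum_m[-1] - cum_m[j]) / w3
            g = w1 * (m1 - mu) ** 2 + w2 * (m2 - mu) ** 2 + w3 * (m3 - mu) ** 2
            if g > best_g:
                best_g, best_ij = g, (i, j)
    return best_ij
```

This is the two-threshold generalization of Otsu's method; the cumulative sums keep the double loop cheap because each candidate pair is evaluated in constant time.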
In some embodiments, segmenting the minimum-scale illumination component based on the final first threshold and the final second threshold to obtain the over-illumination dark area, the normal illumination area and the over-illumination bright area includes:
obtaining the over-illumination dark area, the normal illumination area and the over-illumination bright area according to the following formulas:
I_v_b1(x, y) = 1 if I_v_g1(x, y) ≤ i', and 0 otherwise;
I_v_b2(x, y) = 1 if i' < I_v_g1(x, y) ≤ j', and 0 otherwise;
I_v_b3(x, y) = 1 if I_v_g1(x, y) > j', and 0 otherwise;
wherein i' denotes the final first threshold, j' denotes the final second threshold, I_v_g1 denotes the minimum-scale illumination component, I_v_b1 denotes the over-illumination dark area, I_v_b2 denotes the normal illumination area, and I_v_b3 denotes the over-illumination bright area.
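Given the final thresholds, the three-way split can be sketched as follows (the exact ≤/< boundary placement is an assumption, since the original formulas are only shown as images):

```python
import numpy as np

def split_regions(illum, t1, t2):
    """Partition the minimum-scale illumination component into binary maps
    for the over-dark, normal and over-bright areas."""
    illum = np.asarray(illum)
    dark = (illum <= t1).astype(np.uint8)
    normal = ((illum > t1) & (illum <= t2)).astype(np.uint8)
    bright = (illum > t2).astype(np.uint8)
    return dark, normal, bright
```

The three maps partition the image: every pixel belongs to exactly one area.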
In some embodiments, performing the multi-scale illumination correction based on the brightness component map and the weighted illumination component to obtain the first illumination correction component and the second illumination correction component with different correction amplitudes includes:
obtaining the first illumination correction component and the second illumination correction component according to the following formulas:
[The expressions for I_v1' and I_v2' are shown as images in the original publication.]
wherein i = 1, 2; I_v1' denotes the first illumination correction component, I_v2' denotes the second illumination correction component, I_v denotes the brightness component map, I_v_gm denotes the weighted illumination component, k_i denotes a correction coefficient, α_1 denotes a first weight coefficient, α_2 denotes a second weight coefficient, and the first weight coefficient is smaller than the second weight coefficient; Ī_s denotes the average of the saturation component map, Ī_s = (1/W)·Σ I_s(x, y), and W denotes the total number of pixels of the saturation component map.
The overexposure correction based on the brightness component map to obtain the third illumination correction component further includes:
obtaining the third illumination correction component according to the following formula:
[The expression for I_v3' is shown as an image in the original publication.]
wherein I_v3' denotes the third illumination correction component, and ξ denotes an adjustment coefficient.
According to another aspect of the present invention, there is provided a multi-scene image enhancement system, comprising:
the conversion module is used for acquiring an image to be enhanced and converting the image to be enhanced into an HSV space to obtain a hue component diagram, a saturation component diagram and a brightness component diagram;
the filtering module is used for performing multi-scale filtering on the brightness component map to obtain a plurality of illumination components of different scales, and for performing weighted fusion of these illumination components to obtain a weighted illumination component;
the correction module is used for performing multi-scale illumination correction based on the brightness component map and the weighted illumination component to obtain a first illumination correction component and a second illumination correction component with different correction amplitudes, and for performing overexposure correction based on the brightness component map to obtain a third illumination correction component;
the partition module is used for performing multi-level binarization based on the brightness information of the illumination components to obtain an over-illumination dark area, a normal illumination area and an over-illumination bright area;
the mask module is used for filtering based on the over-illumination dark area, the normal illumination area and the over-illumination bright area to obtain a mask of the over-illumination dark area, a mask of the normal illumination area and a mask of the over-illumination bright area;
the weighting module is used for weighting and fusing the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused brightness component;
and the enhancement module is used for converting the fusion brightness component, the hue component map and the saturation component map into an RGB space to obtain an enhanced image.
According to still another aspect of the present invention, there is provided an electronic apparatus including: a processor; a memory communicatively coupled to the processor; the memory stores instructions executable by the processor to enable the processor to perform the multi-scene image enhancement method described above.
According to still another aspect of the present invention, there is provided a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the multi-scene image enhancement method described above.
Generally, compared with the prior art, the technical solution conceived by the present invention has the following beneficial effects. The image to be enhanced is converted into HSV space to obtain the original brightness component, and multi-scale filtering and weighted fusion based on this component yield a weighted illumination component, ensuring accurate extraction of the illumination component. Multi-scale illumination correction based on the original brightness component and the weighted illumination component then yields two illumination correction components with different correction amplitudes, and overexposure correction based on the original brightness component yields a third illumination correction component; this effectively avoids over-correction of exposure and preserves the detail information of the illumination components. Next, multi-level binarization based on the brightness information of the illumination components divides the image into three effective areas: an over-illumination dark area, a normal illumination area and an over-illumination bright area. This efficient, brightness-driven region segmentation is the key preprocessing step for enhancing different areas separately, and it avoids the over-enhancement of some areas and under-enhancement of others that uniform enhancement of all areas would cause. Filtering the three areas produces region masks whose edges better match the illumination of real scenes; the masks of the different illumination areas are weighted and fused with the different illumination correction components, with different adjustment coefficients set for different areas. Accurate division and effective enhancement of areas of different brightness are thereby realized, the pixels at the edges of different illumination areas are smoother and more realistic, and image enhancement quality is improved.
Drawings
FIG. 1 is a flow chart of a multi-scene image enhancement method according to an embodiment of the present invention;
FIG. 2 is a flow chart of multi-level binarization according to an embodiment of the invention;
FIG. 3 is a flow chart illustrating a process of determining a fused luma component according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an image to be enhanced according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the result of image enhancement of FIG. 4 based on a prior art algorithm;
FIG. 6 is a schematic diagram illustrating the result of image enhancement performed on FIG. 4 according to an embodiment of the present invention;
FIG. 7 is a block diagram of the electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present application. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
At present, commonly used image enhancement algorithms fall mainly into five classes: 1) histogram equalization algorithms; 2) wavelet-transform image enhancement algorithms; 3) partial differential equation image enhancement algorithms; 4) Retinex image enhancement algorithms; 5) deep-learning image enhancement algorithms.
The histogram equalization algorithm counts and normalizes each gray level of the image histogram, then remaps pixel gray levels to achieve enhancement. A series of improved algorithms build on it, such as AHE, CLAHE, POSHE, WAHE, EPMPCE, BBHE and CVC. The wavelet-transform image enhancement algorithm decomposes an image into one low-pass sub-image and three directional high-pass sub-images to achieve noise reduction; improvements include a wavelet-transform enhancement algorithm based on contrast entropy, an algorithm combining wavelet and curvelet transforms, and a wavelet-transform enhancement algorithm based on a knee function and gamma correction (KGWT). The partial differential equation approach describes the enhanced image through a standard partial differential equation (PDE) on the input image and solves it by a variational method; improvements include the total-variation PDE enhancement algorithm (TVPDE), defect-image enhancement based on quantum-mechanics partial differential equations, and fractional-order differential equation enhancement algorithms. The Retinex image enhancement algorithm decomposes the original image into an illumination image and a reflection image, then removes the influence of the illumination component to recover the reflectance of the scene.
Image enhancement algorithms based on Retinex theory mainly include the single-scale Retinex algorithm (SSR), the multi-scale Retinex algorithm (MSR), and the multi-scale Retinex algorithm with color restoration (MSRCR). Deep-learning image enhancement trains on large data sets with a designed network and achieves enhancement by network fitting; representative algorithms include SSDA and MSR-net.
The above algorithms can all enhance images captured under poor illumination, but with varying results. In particular, an image shot against the light contains normally exposed, underexposed and overexposed areas at the same time, and existing algorithms cannot enhance such multi-scene images well.
According to the multi-scene image enhancement method and system of the embodiments, the image to be enhanced is converted into HSV space to obtain the original brightness component, and weighted fusion after multi-scale filtering of this component yields a weighted illumination component, ensuring accurate extraction of the illumination component. Multi-scale illumination correction based on the original brightness component and the weighted illumination component then yields two illumination correction components with different correction amplitudes, and overexposure correction based on the original brightness component yields a third illumination correction component; this effectively avoids over-correction of exposure and preserves the detail information of the illumination components. Multi-level binarization based on the brightness information of the illumination components divides the image into three effective areas: an over-illumination dark area, a normal illumination area and an over-illumination bright area. This efficient region segmentation by brightness is the key preprocessing step for enhancing different areas separately, and it avoids the over-enhancement of some areas and under-enhancement of others caused by uniform enhancement of all areas. Filtering the three areas then produces region masks whose edges better match the illumination of real scenes; the three masks are weighted and fused with the different illumination correction components, with different adjustment coefficients set for different areas. Effective adjustment of areas of different brightness is thereby realized, the pixels at the edges of different illumination areas are smoother and more realistic, and image enhancement quality is improved.
As shown in fig. 1, the multi-scene image enhancement method according to the embodiment of the present invention includes the following steps S1 to S7:
s1: and acquiring an image to be enhanced, and converting the image to be enhanced into an HSV space to obtain a hue component diagram, a saturation component diagram and a brightness component diagram.
In some embodiments, an image I to be enhanced is acquired by an image acquisition device, and the input RGB image I is converted into HSV space to obtain a hue component map I_h, a saturation component map I_s and a brightness component map I_v. Specifically, I_h, I_s and I_v can be calculated by the following equations (1) to (3):
I_v = max    (1)
I_s = (max − min) / max if max ≠ 0, and I_s = 0 if max = 0    (2)
I_h = 60·(G − B)/(max − min) if max = R (adding 360 when the result is negative); I_h = 120 + 60·(B − R)/(max − min) if max = G; I_h = 240 + 60·(R − G)/(max − min) if max = B; I_h = 0 if max = min    (3)
wherein R, G and B denote the red, green and blue channels of the image I in RGB color space, max denotes the maximum of R, G and B, and min denotes the minimum of R, G and B.
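The conversion can be sketched with the standard max/min HSV formulas (hue in degrees, V kept on the 0-255 scale of the input; `rgb_to_hsv_components` is an illustrative name):

```python
import numpy as np

def rgb_to_hsv_components(img):
    """Per-pixel hue (degrees), saturation and value from an RGB image
    whose channels are floats in [0, 255]; V stays on the 0-255 scale."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = np.maximum(np.maximum(r, g), b)      # max(R, G, B)
    mn = np.minimum(np.minimum(r, g), b)      # min(R, G, B)
    delta = mx - mn
    v = mx                                    # I_v = max
    s = np.where(mx > 0, delta / np.where(mx > 0, mx, 1.0), 0.0)
    d = np.where(delta > 0, delta, 1.0)       # avoid division by zero
    h = np.zeros_like(mx, dtype=np.float64)
    h = np.where(mx == r, (60.0 * (g - b) / d) % 360.0, h)
    h = np.where(mx == g, 120.0 + 60.0 * (b - r) / d, h)
    h = np.where(mx == b, 240.0 + 60.0 * (r - g) / d, h)
    h = np.where(delta == 0, 0.0, h)          # hue undefined for gray pixels
    return h, s, v
```

Only I_v is modified by the enhancement pipeline; I_h and I_s are held fixed so that colors are preserved when converting back to RGB.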
S2: and performing multi-scale filtering based on the brightness component diagram to obtain a plurality of illumination components with different scales, and performing weighted fusion based on the plurality of illumination components to obtain weighted illumination components. In some embodiments, the size of the gaussian filter is adjusted by setting a plurality of different scale parameters to obtain processed images with different dynamic ranges. Specifically, the gaussian-filtered illumination components I of three different scales can be calculated by the following equations (4) to (5) v_g1 、I v_g2 And I v_g3
I_v_g(x, y) = I_v(x, y) ⊗ G(x, y)    (4)
wherein I_v(x, y) denotes the brightness value of the brightness component map I_v at coordinate (x, y), I_v_g(x, y) denotes the result of Gaussian filtering of I_v(x, y), ⊗ denotes convolution, and G(x, y) denotes a two-dimensional Gaussian function with window size m × n, where m and n are calculated from the Gaussian scale parameter c. G(x, y) is defined by equation (5):
G(x, y) = q · exp(−(x² + y²) / c²)    (5)
wherein c denotes the scale parameter, which determines the size of the Gaussian filtering neighborhood: the smaller its value, the larger the dynamic-range compression and the clearer the image; q denotes the normalization parameter of the Gaussian function, chosen so that G(x, y) integrates to 1.
In the embodiment of the present invention, the scale parameter c is set to 4, 20 and 63 respectively, yielding three Gaussian filtering results of different sizes, I_v_g1, I_v_g2 and I_v_g3.
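A sketch of one scale of the filtering, assuming the surround form G(x, y) = q·exp(−(x² + y²)/c²) common in Retinex methods, normalized to sum to 1, and a window radius of 3c (the patent computes m and n from c but does not state the rule, so the radius is an assumption):

```python
import numpy as np

def gauss_kernel(c):
    """G(x, y) = q * exp(-(x^2 + y^2) / c^2), normalized to sum to 1;
    the window radius of 3c is an assumption."""
    r = int(np.ceil(3.0 * c))
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    g = np.exp(-(x ** 2 + y ** 2) / float(c) ** 2)
    return g / g.sum()

def illumination_component(v, c):
    """Estimate the illumination at scale c by convolving the luminance map
    with G (edge padding keeps the output the same size as the input)."""
    k = gauss_kernel(c)
    r = k.shape[0] // 2
    padded = np.pad(np.asarray(v, dtype=np.float64), r, mode="edge")
    out = np.empty_like(np.asarray(v, dtype=np.float64))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (padded[i:i + k.shape[0], j:j + k.shape[1]] * k).sum()
    return out
```

The three scales of the embodiment would correspond to `illumination_component(v, 4)`, `(v, 20)` and `(v, 63)`; the naive double loop is for clarity only, and a real implementation would use a separable or FFT-based convolution.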
In some embodiments, the Gaussian filtering results I_v_g1, I_v_g2 and I_v_g3 are weighted and fused to obtain the weighted illumination component I_v_gm. Specifically, I_v_gm can be calculated by the following formula (6):
I_v_gm(x, y) = Σ_{i=1..H} θ_i · I_v_gi(x, y)    (6)
wherein I_v_gi denotes the illumination component extracted by the i-th scale Gaussian function, θ_i denotes the corresponding weight coefficient, H denotes the number of scales, and I_v_gm(x, y) denotes the weighted illumination component after fusion.
In the embodiment of the present invention, to balance the accuracy and the computational cost of illumination-component extraction, H = 3 and each weight coefficient θ_i is set to 1/3. It is understood that the specific parameter values given throughout the text are exemplary only and can be adjusted according to actual needs; the present invention is not limited thereto.
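The weighted fusion of equation (6) with H = 3 and θ_i = 1/3 reduces to a plain average of the three filtered results, which can be sketched as:

```python
import numpy as np

def fuse_illumination(components, weights=None):
    """Equation (6): I_v_gm = sum_i theta_i * I_v_gi; equal weights 1/H by
    default, as in the described embodiment."""
    comps = [np.asarray(c, dtype=np.float64) for c in components]
    if weights is None:
        weights = [1.0 / len(comps)] * len(comps)
    fused = np.zeros_like(comps[0])
    for w, c in zip(weights, comps):
        fused += w * c
    return fused
```

Unequal weights simply bias the estimate toward finer or coarser scales.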
S3: performing multi-scale illumination correction based on the brightness component diagram and the weighted illumination components to obtain a first illumination correction component and a second illumination correction component with different correction amplitudes; and performing overexposure correction based on the brightness component map to obtain third illuminationAnd correcting the component. In some embodiments, the luminance component map I is combined v And weighted illumination component I v_gm Performing multi-scale illumination correction to obtain a corrected first illumination correction component I v1 ' and a second illumination correction component I v2 ', first illumination correction component I v1 ' is larger than the second illumination correction component I v2 '; and according to the designed overexposure correction function, the brightness component diagram I v Carrying out overexposure correction to obtain a corrected third illumination correction component I v3 '. The image is corrected by adopting the illumination correction with different intensities, so that different illumination intensities in the image to be enhanced can be effectively adjusted, and overexposure correction is effectively avoided.
Specifically, the multi-scale illumination correction is performed by setting different correction coefficients, and two under-exposure correction results of different scales, namely the first illumination correction component I_v1′ and the second illumination correction component I_v2′, are obtained according to the following formulas (7) to (8), wherein the adjustment amplitude of the first illumination correction component I_v1′ is larger than that of the second illumination correction component I_v2′, thereby achieving multi-scale correction and improving the overall image enhancement effect.
[Formulas (7) and (8) appear only as images in the original publication and cannot be recovered from the extracted text; they define the first illumination correction component I_v1′ and the second illumination correction component I_v2′ in terms of I_v, I_v_gm, k_i and α_i.]
wherein i = 1, 2; I_v1′ denotes the first illumination correction component; I_v2′ denotes the second illumination correction component; I_v denotes the brightness component map obtained in step S1; I_v_gm denotes the weighted illumination component obtained in step S2; I_s denotes the saturation component map; W denotes the total number of pixels of the saturation component map I_s; k_i denotes a correction coefficient; and α_i denotes a weight coefficient, wherein the smaller the weight coefficient, the larger the image correction amplitude. In an illustrative embodiment of the present invention, the first weight coefficient α_1 and the second weight coefficient α_2 are set to 0.1 and 1 respectively, so as to obtain two illumination correction components with different correction degrees, the first illumination correction component I_v1′ being adjusted to a larger extent than the second illumination correction component I_v2′.
Specifically, the brightness component map I_v is further subjected to overexposure correction by the designed overexposure correction function to obtain the third illumination correction component I_v3′, as shown in the following formula (9):

[Formula (9) appears only as an image in the original publication and cannot be recovered from the extracted text; it defines I_v3′ as a function of I_v and the adjustment coefficient ξ.]
Where ξ represents the adjustment coefficient.
The overexposure correction function designed by the invention can effectively avoid over-correction of exposure, preserves detail information well, and achieves an enhancement effect superior to that of the existing gamma correction function; the correction degree can be controlled through the adjustment coefficient ξ, making the function suitable for overexposure correction in different scenes.
S4: performing multilevel binarization on the brightness information based on the illumination component to obtain an over-illumination dark area, a normal illumination area and an over-illumination bright area. In some embodiments, the illumination component is subjected to multilevel binarization, and regions are divided according to the brightness information to obtain the over-illumination dark area, the normal illumination area and the over-illumination bright area, which serve as a key preprocessing step for enhancing different regions separately. Dividing the regions in this way effectively avoids the over-enhancement and under-enhancement of partial regions caused by applying uniform enhancement to all regions.
Fig. 2 is a schematic flow chart of multilevel binarization according to an embodiment of the present invention, which includes steps S41 to S44:
S41: determining a smallest scale illumination component among the plurality of illumination components of different scales. In the embodiment of the invention, among the Gaussian filtering results I_v_g1, I_v_g2 and I_v_g3 of different scales obtained in step S2, the illumination component I_v_g1 with the smallest Gaussian blur scale is determined as the minimum scale illumination component, so that more image detail may be preserved.
S42: obtaining a plurality of inter-class variances based on the brightness information of the minimum scale illumination component and a plurality of groups of thresholds, wherein each group of thresholds comprises a first threshold i and a second threshold j, and the first threshold i is smaller than the second threshold j. In the embodiment of the invention, the minimum scale illumination component I_v_g1 is subjected to multilevel binarization through a multi-threshold OTSU segmentation algorithm, and a plurality of values g(i, j) are calculated by the following formula (10):
g(i, j) = w_1·(μ_1 − μ)² + w_2·(μ_2 − μ)² + w_3·(μ_3 − μ)²  (10)

wherein g(i, j) represents the inter-class variance of the image when the thresholds are i and j, with i < j; μ represents the average gray level of the whole image; w_1 and μ_1 respectively represent the proportion of the pixels in the gray range 0 to i in the whole image and their average gray level; w_2 and μ_2 represent the proportion and average gray level of the pixels in the range i to j; and w_3 and μ_3 represent the proportion and average gray level of the pixels in the range j to 255.
S43: determining the first threshold and the second threshold corresponding to the maximum value among the plurality of inter-class variances as the final first threshold and the final second threshold. In the embodiment of the present invention, the first threshold i′ and the second threshold j′ corresponding to the maximum value of the obtained inter-class variances g(i, j) are taken as the final thresholds, and the minimum scale illumination component I_v_g1 is segmented accordingly.
S44: segmenting the minimum scale illumination component based on the final first threshold and the final second threshold to obtain the over-illumination dark area, the normal illumination area and the over-illumination bright area. In the embodiment of the present invention, based on the final first threshold i′ and the final second threshold j′ determined above, the over-illumination dark area I_v_b1, the normal illumination area I_v_b2 and the over-illumination bright area I_v_b3 are obtained through the following formulas (11) to (13):
I_v_b1(x, y) = 1 if I_v_g1(x, y) ≤ i′, and 0 otherwise  (11)

I_v_b2(x, y) = 1 if i′ < I_v_g1(x, y) ≤ j′, and 0 otherwise  (12)

I_v_b3(x, y) = 1 if I_v_g1(x, y) > j′, and 0 otherwise  (13)
Therefore, through the multi-threshold OTSU segmentation algorithm, the image can be efficiently subdivided according to its brightness into three effective areas, namely the over-illumination dark area, the normal illumination area and the over-illumination bright area, which serve as a key preprocessing step for enhancing different regions separately. Dividing the illumination areas in this way effectively avoids the over-enhancement and under-enhancement of partial regions caused by applying uniform enhancement to all regions, and improves the overall image processing quality.
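A self-contained NumPy sketch of the two-threshold OTSU search of steps S42 to S44 follows; it exhaustively maximizes g(i, j) from formula (10) and returns the three binary region maps. The cumulative-histogram shortcut and the boundary conventions (pixel ranges [0, i], (i, j], (j, 255]) are implementation assumptions.

```python
import numpy as np

def multilevel_otsu(gray):
    """Two-threshold OTSU: maximize g(i,j) = w1(mu1-mu)^2 + w2(mu2-mu)^2 + w3(mu3-mu)^2.

    `gray` is a uint8 image; returns the thresholds (i', j') and the three
    binary region maps (over-dark, normal, over-bright).
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()                  # gray-level probabilities
    levels = np.arange(256, dtype=float)
    mu = (p * levels).sum()                # global mean gray level
    cp = np.cumsum(p)                      # cumulative probability
    cm = np.cumsum(p * levels)             # cumulative mean mass
    best, best_ij = -1.0, (0, 0)
    for i in range(1, 255):
        for j in range(i + 1, 256):
            w1, w2, w3 = cp[i], cp[j] - cp[i], 1.0 - cp[j]
            if w1 <= 0 or w2 <= 0 or w3 <= 0:
                continue
            m1 = cm[i] / w1
            m2 = (cm[j] - cm[i]) / w2
            m3 = (cm[255] - cm[j]) / w3
            g = w1 * (m1 - mu) ** 2 + w2 * (m2 - mu) ** 2 + w3 * (m3 - mu) ** 2
            if g > best:
                best, best_ij = g, (i, j)
    i_t, j_t = best_ij
    dark = gray <= i_t
    normal = (gray > i_t) & (gray <= j_t)
    bright = gray > j_t
    return (i_t, j_t), dark, normal, bright
```

The exhaustive search over all (i, j) pairs is at most 256²/2 evaluations, which is cheap once the histogram cumulants are precomputed.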
S5: filtering based on the over-illumination dark area, the normal illumination area and the over-illumination bright area to obtain an over-illumination dark area mask, a normal illumination area mask and an over-illumination bright area mask. In some embodiments, in order to reduce the edge pixel abnormality of the illumination areas caused by abrupt changes at the mask boundaries, a Gaussian filtering method is used to smooth the over-illumination dark area I_v_b1, the normal illumination area I_v_b2 and the over-illumination bright area I_v_b3, so that the edges of the illumination areas better conform to the illumination law of a real scene, which facilitates the subsequent image masking and fusion operations and makes the pixels at the edges of the illumination areas smoother and more realistic. Specifically, the over-illumination dark area I_v_b1, the normal illumination area I_v_b2 and the over-illumination bright area I_v_b3 may be Gaussian filtered through the above formulas (4) to (5) to obtain three masks I_v_mask1, I_v_mask2 and I_v_mask3, corresponding respectively to the over-illumination dark area, the normal illumination area and the over-illumination bright area, wherein the scale parameter c of the Gaussian filter is set to 30.
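The mask-smoothing step of S5 can be illustrated as follows; the separable Gaussian filter and the toy two-region masks are demonstration assumptions (the patent's own filter is given by its formulas (4) to (5), with scale parameter c = 30).

```python
import numpy as np

def smooth_mask(mask, sigma):
    """Separable Gaussian smoothing of a {0,1} region map (replicated borders)."""
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1, dtype=float)
    k = np.exp(-x * x / (2.0 * sigma * sigma))
    k /= k.sum()
    m = np.pad(mask.astype(float), r, mode="edge")
    m = np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 1, m)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 0, m)

# Two complementary hard regions: left half "over-dark", right half "over-bright".
dark_region = np.zeros((40, 40))
dark_region[:, :20] = 1.0
bright_region = 1.0 - dark_region
I_v_mask1 = smooth_mask(dark_region, 3.0)
I_v_mask3 = smooth_mask(bright_region, 3.0)
```

Because the hard region maps partition the image and the filter is linear and normalized, the smoothed masks still sum to one at every pixel, which keeps the later mask-weighted fusion balanced across region boundaries.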
S6: performing weighted fusion based on the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused luminance component. In some embodiments, the over-illumination dark area mask I_v_mask1, the normal illumination area mask I_v_mask2 and the over-illumination bright area mask I_v_mask3 are mask- and weight-fused with the first illumination correction component I_v1′, the second illumination correction component I_v2′ and the third illumination correction component I_v3′ obtained after the illumination correction of step S3, so as to obtain a first weighted luminance component I_v_m1 and a second weighted luminance component I_v_m2. Specifically, the first illumination correction component I_v1′, the third illumination correction component I_v3′ and the brightness component map I_v are mask- and weight-fused to obtain the first weighted luminance component I_v_m1, and the second illumination correction component I_v2′, the third illumination correction component I_v3′ and the brightness component map I_v are mask- and weight-fused to obtain the second weighted luminance component I_v_m2. Then, the weights of the first weighted luminance component I_v_m1 and the second weighted luminance component I_v_m2 are calculated by a PCA algorithm, and weighted fusion is performed to obtain the fused luminance component I_v_c.
Fig. 3 is a schematic flow chart of determining a fused luminance component according to an embodiment of the present invention, which includes steps S61 to S63:
S61: performing weighted fusion based on the first illumination correction component, the third illumination correction component, the brightness component map, the over-illumination dark area mask, the normal illumination area mask, the over-illumination bright area mask, the first adjustment coefficient of the over-illumination dark area mask, the first adjustment coefficient of the normal illumination area mask and the first adjustment coefficient of the over-illumination bright area mask to obtain the first weighted luminance component. Specifically, the first weighted luminance component I_v_m1 is obtained by weighted fusion as shown in the following formula (14):
I_v_m1 = c_11 · I_v_mask1 · I_v1′ + c_12 · I_v_mask2 · I_v + c_13 · I_v_mask3 · I_v3′  (14)
wherein c_11 represents the first adjustment coefficient of the over-illumination dark area mask I_v_mask1; c_12 represents the first adjustment coefficient of the normal illumination area mask I_v_mask2; and c_13 represents the first adjustment coefficient of the over-illumination bright area mask I_v_mask3.
S62: performing weighted fusion based on the second illumination correction component, the third illumination correction component, the brightness component map, the over-illumination dark area mask, the normal illumination area mask, the over-illumination bright area mask, the second adjustment coefficient of the over-illumination dark area mask, the second adjustment coefficient of the normal illumination area mask and the second adjustment coefficient of the over-illumination bright area mask to obtain the second weighted luminance component. Specifically, the second weighted luminance component I_v_m2 is obtained by weighted fusion according to the following formula (15):
I_v_m2 = c_21 · I_v_mask1 · I_v2′ + c_22 · I_v_mask2 · I_v + c_23 · I_v_mask3 · I_v3′  (15)
wherein c_21 represents the second adjustment coefficient of the over-illumination dark area mask I_v_mask1; c_22 represents the second adjustment coefficient of the normal illumination area mask I_v_mask2; and c_23 represents the second adjustment coefficient of the over-illumination bright area mask I_v_mask3.
In some embodiments, the larger an adjustment coefficient is, the larger the adjustment amplitude of the corresponding term. Since the first illumination correction component I_v1′ is adjusted to a larger extent than the second illumination correction component I_v2′, in order to improve the fusion effect and avoid over-correction, c_11 is set smaller than c_21, and c_12 is set smaller than c_22.
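The formula images for (14) and (15) are not recoverable from this text, so the sketch below assumes the mask-weighted form described in S61 and S62: each weighted component combines one under-exposure correction (through the dark-area mask), the original luminance (through the normal-area mask) and the over-exposure correction (through the bright-area mask), each scaled by its adjustment coefficient. The stand-in power-law corrections and coefficient values are demonstration assumptions.

```python
import numpy as np

def weighted_component(corr_under, corr_over, I_v, m1, m2, m3, c_dark, c_norm, c_bright):
    """Assumed mask-weighted fusion: dark regions take the under-exposure
    correction, normal regions the original luminance, and bright regions the
    over-exposure correction, each scaled by its adjustment coefficient."""
    return c_dark * m1 * corr_under + c_norm * m2 * I_v + c_bright * m3 * corr_over

# Stand-in corrections and hard, complementary region masks for a toy image.
I_v = np.linspace(0.1, 0.9, 16).reshape(4, 4)
I_v1p, I_v2p, I_v3p = I_v ** 0.4, I_v ** 0.7, I_v ** 1.5
m1 = (I_v <= 0.3).astype(float)            # over-dark pixels
m3 = (I_v >= 0.7).astype(float)            # over-bright pixels
m2 = 1.0 - m1 - m3                         # normally lit pixels
# c_11 < c_21 restrains the strongly corrected component, per the text.
I_v_m1 = weighted_component(I_v1p, I_v3p, I_v, m1, m2, m3, 0.8, 1.0, 1.0)
I_v_m2 = weighted_component(I_v2p, I_v3p, I_v, m1, m2, m3, 1.0, 1.0, 1.0)
```

With hard masks and unit coefficients, each region of the weighted component reduces exactly to its designated source image, which makes the role of the adjustment coefficients easy to inspect.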
S63: performing weighted fusion based on the first weighted luminance component and the second weighted luminance component to obtain the fused luminance component. In some embodiments, the weights of the first weighted luminance component I_v_m1 and the second weighted luminance component I_v_m2 are calculated by a PCA algorithm, and weighted fusion is performed to obtain the fused luminance component I_v_c.
Specifically, the first weighted luminance component I_v_m1 and the second weighted luminance component I_v_m2 are each expanded into an n-dimensional column vector X_p, where p = 1, 2, forming the sample matrix shown in formula (16):

X = [X_1, X_2], X_p = [x_p1, x_p2, …, x_pn]^T  (16)
The covariance matrix C of the matrix X is calculated by the following formulas (17) to (18):

C = [[C_11, C_12], [C_21, C_22]]  (17)

C_pq = (1/n) · Σ_{t=1}^{n} (x_pt − x̄_p)(x_qt − x̄_q)  (18)

wherein C_pq represents the covariance between the weighted luminance components, and x̄_p represents the average gray value of the p-th weighted luminance component.
The characteristic equation |λI − C| = 0 is solved to obtain the eigenvalues (λ_1, λ_2) of the covariance matrix C and the corresponding eigenvectors (ξ_1, ξ_2), wherein each ξ_i is a 2-dimensional column vector, i.e. ξ_i = [ξ_i1, ξ_i2]^T.
The index of the larger eigenvalue is selected by the following formula (19):

r = argmax_l(λ_l)  (19)

wherein l = 1 or 2.
The weight coefficient of each weighted luminance component is then calculated from the eigenvector ξ_r corresponding to the largest eigenvalue λ_r by the following formula (20):

ω_l = ξ_rl / (ξ_r1 + ξ_r2), l = 1, 2  (20)
The fused luminance component I_v_c is then calculated by the following formula (21):

I_v_c = ω_1 · I_v_m1 + ω_2 · I_v_m2  (21)
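A compact NumPy sketch of the PCA-based fusion of S63 follows; the use of `np.cov` and `np.linalg.eigh` and the absolute-value normalization of the principal eigenvector (which guards against the sign ambiguity of eigenvectors) are implementation assumptions.

```python
import numpy as np

def pca_fusion(I_m1, I_m2):
    """PCA fusion per S63: take the eigenvector of the larger eigenvalue of the
    2x2 covariance matrix of the two flattened components and normalize its
    entries into the fusion weights omega_1, omega_2."""
    X = np.stack([I_m1.ravel(), I_m2.ravel()])   # 2 x n sample matrix
    C = np.cov(X)                                # 2 x 2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)         # eigenvalues in ascending order
    xi_r = eigvecs[:, np.argmax(eigvals)]        # principal eigenvector
    w = np.abs(xi_r) / np.abs(xi_r).sum()        # normalized weights
    return w[0] * I_m1 + w[1] * I_m2, w

# Two correlated toy luminance components.
rng = np.random.default_rng(0)
a = rng.uniform(0.0, 1.0, (16, 16))
b = np.clip(0.6 * a + rng.uniform(0.0, 0.2, (16, 16)), 0.0, 1.0)
fused, w = pca_fusion(a, b)
```

Since the weights are non-negative and sum to one, the fused result is a per-pixel convex combination of the two inputs.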
In the embodiment of the invention, the correction amplitude of different regions can be effectively controlled by adjusting the coefficients of the mask-weighted fusion, so that different regions are enhanced to different degrees. For the first illumination correction component I_v1′, which has the larger under-exposure correction amplitude, the adjustment coefficient of the over-illumination dark area mask I_v_mask1 can be turned down to suppress over-enhancement of the dark area to a certain extent; for the second illumination correction component I_v2′, which has the smaller under-exposure correction amplitude, the adjustment coefficient corresponding to the over-illumination dark area mask I_v_mask1 can be increased to raise the enhancement amplitude of the under-illuminated area. For the over-bright area, the adjustment coefficient corresponding to the over-illumination bright area mask I_v_mask3 can be tuned to control the correction amplitude of the over-illuminated area effectively and thus avoid over-exposure; meanwhile, for the normal illumination area, the adjustment coefficient of the normal illumination area mask I_v_mask2 can be fine-tuned or left unchanged. In this way, the normal illumination area is finely adjusted, the under-illuminated area is enhanced, and the illumination intensity of the over-illuminated area is reduced; the three illumination areas are adjusted separately, and the overall image enhancement effect is improved.
S7: converting the image into the RGB space based on the fused luminance component, the hue component map and the saturation component map to obtain the enhanced image. In some embodiments, the weighted and fused luminance component I_v_c is taken as the luminance component and, combined with the hue component map I_h and the saturation component map I_s, converted from the HSV space to the RGB space to obtain the enhanced image I_e. Specifically, the hue component map I_h, the saturation component map I_s and the fused luminance component I_v_c are converted from the HSV space to the RGB space through the following formulas (22) to (26):
C=V×S (22)
X=C×(1-|(H/60°)mod2-1|) (23)
m=V-C (24)
(R′, G′, B′) = (C, X, 0), 0° ≤ H < 60°; (X, C, 0), 60° ≤ H < 120°; (0, C, X), 120° ≤ H < 180°; (0, X, C), 180° ≤ H < 240°; (X, 0, C), 240° ≤ H < 300°; (C, 0, X), 300° ≤ H < 360°  (25)
(R,G,B)=((R′+m)×255,(G′+m)×255,(B′+m)×255) (26)
wherein the H, S and V components respectively denote the input hue component map I_h, saturation component map I_s and fused luminance component I_v_c of the HSV space, and R, G and B respectively denote the R, G and B components of the enhanced image I_e in the RGB space.
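As a check, formulas (22) to (26) can be executed directly; the vectorized sector lookup below is an implementation convenience, not part of the patent text.

```python
import numpy as np

def hsv_to_rgb(H, S, V):
    """HSV -> RGB following formulas (22)-(26): H in degrees [0, 360),
    S and V in [0, 1]; returns 8-bit R, G, B arrays."""
    C = V * S                                        # (22) chroma
    X = C * (1.0 - np.abs((H / 60.0) % 2.0 - 1.0))   # (23)
    m = V - C                                        # (24)
    Z = np.zeros_like(C)
    sector = (H // 60).astype(int) % 6               # 60-degree hue sector
    table = [(C, X, Z), (X, C, Z), (Z, C, X),        # (25): (R', G', B')
             (Z, X, C), (X, Z, C), (C, Z, X)]        #       per hue sector
    conds = [sector == k for k in range(6)]
    Rp = np.select(conds, [t[0] for t in table])
    Gp = np.select(conds, [t[1] for t in table])
    Bp = np.select(conds, [t[2] for t in table])
    to8 = lambda ch: np.round((ch + m) * 255.0).astype(np.uint8)  # (26)
    return to8(Rp), to8(Gp), to8(Bp)
```

For example, pure red, green and blue hues map to the corresponding saturated RGB primaries, and a zero-saturation pixel maps to a neutral gray.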
It will be appreciated that some embodiments further comprise outputting the enhanced image I_e.
Fig. 4 shows an example of an image to be enhanced according to an embodiment of the present invention, in which a relatively large area of low illumination intensity and an area of high light intensity exist at the same time, so that the overall display effect of the image is relatively poor. Fig. 5 shows the result of enhancing Fig. 4 with an existing algorithm: although the display brightness of the darker area is improved, the over-bright area is over-exposed, detail loss is severe, and the light-dark boundary is not natural enough. Fig. 6 shows the result of enhancing Fig. 4 with the multi-scene image enhancement method according to the embodiment of the present invention; the comparison shows that the method can effectively improve the brightness of the under-illuminated area while reducing the brightness of the over-illuminated area, that after enhancement there is neither over-exposure nor under-exposure, and that the edges of areas with different brightness are fused naturally, so that the overall enhancement effect is better.
By adopting the multi-scene image enhancement method provided by the embodiment of the invention, the original luminance component is obtained by converting the image to be enhanced into the HSV space, and weighted fusion is performed after multi-scale filtering based on the original luminance component to obtain the weighted illumination component, which ensures the accuracy of illumination component extraction. Multi-scale illumination correction is then performed based on the original luminance component and the weighted illumination component to obtain two illumination correction components with different correction amplitudes, and overexposure correction is performed based on the original luminance component to obtain a third illumination correction component, which effectively avoids over-correction of exposure and preserves the detail information of the illumination components. Multilevel binarization is then performed on the brightness information of the illumination component to obtain the over-illumination dark area, the normal illumination area and the over-illumination bright area; this region segmentation, performed efficiently according to the brightness information, subdivides the image into three effective areas that serve as a key preprocessing step for enhancing different regions separately, effectively avoiding the over-enhancement and under-enhancement of partial regions caused by enhancing all regions with a uniform amplitude. Filtering is then performed on the three effective areas to obtain the different region masks, so that the edges of the illumination areas better conform to the illumination law of a real scene; the masks of the different illumination areas are fused with the different illumination correction components in a weighted manner, and different adjustment coefficients are set for different areas, which realizes accurate division and separate, effective adjustment of areas with different brightness, ensures that the pixels at the edges of the different illumination areas are smoother and more realistic, and improves the image enhancement quality.
The embodiment of the present application further provides an uneven illumination image enhancement system, including:
the conversion module is used for acquiring an image to be enhanced and converting the image to be enhanced into an HSV space to obtain a hue component diagram, a saturation component diagram and a brightness component diagram;
the filtering module is used for carrying out multi-scale filtering based on the brightness component diagram to obtain a plurality of illumination components with different scales and obtaining weighted illumination components based on the weighted fusion of the illumination components;
the correction module is used for carrying out multi-scale illumination correction based on the brightness component diagram and the weighted illumination components to obtain first illumination correction components and second illumination correction components with different correction amplitudes; the illumination correction module is also used for carrying out overexposure correction based on the brightness component diagram to obtain a third illumination correction component;
the partition module is used for carrying out multilevel binarization on the basis of the brightness information of the illumination component to obtain an over-illumination dark area, an illumination normal area and an over-illumination bright area;
the mask module is used for filtering based on the over-illumination dark area, the normal illumination area and the over-illumination bright area to obtain a mask of the over-illumination dark area, a mask of the normal illumination area and a mask of the over-illumination bright area;
the weighting module is used for performing weighted fusion based on the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused luminance component;
and the enhancement module is used for converting the fusion brightness component, the hue component diagram and the saturation component diagram into an RGB space to obtain an enhanced image.
In some embodiments, the weighting module is further configured to perform weighted fusion based on the first illumination correction component, the third illumination correction component, the brightness component map, the over-illumination dark area mask, the normal illumination area mask, the over-illumination bright area mask and the first adjustment coefficients of the three masks, to obtain the first weighted luminance component; to perform weighted fusion based on the second illumination correction component, the third illumination correction component, the brightness component map, the three masks and their second adjustment coefficients, to obtain the second weighted luminance component; and to perform weighted fusion of the first weighted luminance component and the second weighted luminance component to obtain the fused luminance component; wherein the first adjustment coefficient of the over-illumination dark area mask is smaller than its second adjustment coefficient, the first adjustment coefficient of the normal illumination area mask is smaller than its second adjustment coefficient, and the correction amplitude of the first illumination correction component is larger than that of the second illumination correction component.
In some embodiments, the weighting module is further configured to obtain the first weighted luminance component according to the following formula:

I_v_m1 = c_11 · I_v_mask1 · I_v1′ + c_12 · I_v_mask2 · I_v + c_13 · I_v_mask3 · I_v3′

and to obtain the second weighted luminance component according to the following formula:

I_v_m2 = c_21 · I_v_mask1 · I_v2′ + c_22 · I_v_mask2 · I_v + c_23 · I_v_mask3 · I_v3′

wherein c_11, c_12 and c_13 respectively represent the first adjustment coefficients of the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask, and c_21, c_22 and c_23 respectively represent the corresponding second adjustment coefficients.
In some embodiments, the partitioning module is further to determine a smallest scale illumination component of the plurality of differently sized illumination components; obtaining a plurality of inter-class variances based on the brightness information of the minimum-scale illumination component and a plurality of groups of thresholds, wherein each group of thresholds comprises a first threshold and a second threshold, and the first threshold is smaller than the second threshold; determining a first threshold value and a second threshold value corresponding to the maximum value in the multiple inter-class variances as a final first threshold value and a final second threshold value; and segmenting the minimum scale illumination component based on the final first threshold and the final second threshold to obtain an over-illumination dark area, a normal illumination area and an over-illumination bright area.
In some embodiments, the correction module is further configured to obtain the first illumination correction component I_v1′ and the second illumination correction component I_v2′ according to formulas (7) to (8). [Formulas (7) and (8) appear only as images in the original publication and cannot be recovered from the extracted text.] Therein, i = 1, 2; I_v1′ denotes the first illumination correction component; I_v2′ denotes the second illumination correction component; I_v denotes the brightness component map; I_v_gm denotes the weighted illumination component; k_i denotes a correction coefficient; α_1 denotes the first weight coefficient and α_2 denotes the second weight coefficient, the first weight coefficient being smaller than the second weight coefficient; Ī_s denotes the average of the saturation component map; and W denotes the total number of pixels of the saturation component map.

The correction module is also configured to obtain the third illumination correction component I_v3′ according to formula (9). [Formula (9) appears only as an image in the original publication and cannot be recovered from the extracted text.] Therein, I_v3′ denotes the third illumination correction component and ξ denotes the adjustment coefficient.
The more specific implementation of each module of the uneven-illumination image enhancement system of the present invention can be referred to the description of the multi-scene image enhancement method of the present invention, and has similar beneficial effects, and is not repeated herein.
Fig. 7 is a block diagram of an electronic device according to an embodiment of the application. An embodiment of the present application further provides an electronic device, as shown in fig. 7, the electronic device includes: at least one processor 701, and a memory 703 communicatively coupled to the at least one processor 701. The memory 703 has stored therein instructions executable by the at least one processor 701. The instructions are executed by at least one processor 701. The processor 701, when executing the instructions, implements the multi-scene image enhancement method in the above-described embodiments. The number of the memory 703 and the processor 701 may be one or more. The electronic device is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
The electronic device may further include a communication interface 705 for data interaction with external devices. The various components are interconnected by different buses and may be mounted on a common motherboard or in other manners as desired. The processor 701 may process instructions for execution within the electronic device, including instructions stored in or on the memory for displaying graphical information for a graphical user interface (GUI) on an external input/output device such as a display device coupled to the interface. In an alternative implementation, if the memory 703, the processor 701 and the communication interface 705 are integrated on a single chip, they may communicate with each other via an internal interface.
It should be understood that the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and so on. A general-purpose processor may be a microprocessor or any conventional processor. It is noted that the processor may be a processor supporting the Advanced RISC Machines (ARM) architecture.
Embodiments of the present application provide a computer-readable storage medium (such as the above-mentioned memory 703) storing computer instructions which, when executed by a processor, implement the methods provided in the embodiments of the present application.
Optionally, the memory 703 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the electronic device for the multi-scene image enhancement method, and the like. Further, the memory 703 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 703 may optionally include memory located remotely from the processor 701, which may be connected to the electronic device for the multi-scene image enhancement method over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In the description of the present specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive various changes or substitutions within the technical scope of the present application, and these should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for enhancing a multi-scene image, comprising:
acquiring an image to be enhanced, and converting the image to be enhanced into an HSV (hue, saturation and value) space to obtain a hue component diagram, a saturation component diagram and a brightness component diagram;
performing multi-scale filtering based on the brightness component diagram to obtain a plurality of illumination components of different scales, and performing weighted fusion based on the illumination components to obtain weighted illumination components;
performing multi-scale illumination correction based on the brightness component map and the weighted illumination components to obtain a first illumination correction component and a second illumination correction component with different correction amplitudes; overexposure correction is carried out based on the brightness component map to obtain a third illumination correction component;
performing multilevel binarization based on the brightness information of the illumination component to obtain an over-illumination dark area, a normal illumination area and an over-illumination bright area;
filtering based on the over-illumination dark area, the normal illumination area and the over-illumination bright area to obtain a mask of the over-illumination dark area, a mask of the normal illumination area and a mask of the over-illumination bright area;
performing weighted fusion on the basis of the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused brightness component;
and converting back into RGB space based on the fused brightness component, the hue component diagram and the saturation component diagram to obtain an enhanced image.
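The multi-scale filtering and weighted-fusion steps of claim 1 can be sketched as follows. This is a minimal illustration, not the patented implementation: the patent names neither the smoothing kernel nor the fusion weights, so a box filter, the radii, and an equal-weight split are all the author's assumptions.

```python
import numpy as np

def box_blur(v, radius):
    """Mean filter used as a stand-in smoothing kernel (the patent does not
    name the filter); a larger radius corresponds to a larger scale."""
    k = 2 * radius + 1
    pad = np.pad(v, radius, mode="edge")
    out = np.zeros_like(v, dtype=float)
    h, w = v.shape
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + h, dx:dx + w]
    return out / (k * k)

def weighted_illumination(v, radii=(1, 4, 16), weights=None):
    """Estimate the illumination at several scales and fuse the estimates.

    v       -- brightness (V) component as a float array in [0, 255]
    radii   -- hypothetical scales; the patent leaves them unspecified
    weights -- fusion weights, defaulting to an equal split
    """
    if weights is None:
        weights = [1.0 / len(radii)] * len(radii)
    components = [box_blur(v, r) for r in radii]             # one illumination component per scale
    fused = sum(w * c for w, c in zip(weights, components))  # weighted illumination component
    return components, fused
```

The smallest-radius component here plays the role of the "minimum-scale illumination component" that claim 4 later binarizes.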
2. The multi-scene image enhancement method according to claim 1, wherein the performing weighted fusion based on the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused brightness component comprises:
performing weighted fusion based on the first illumination correction component, the third illumination correction component, the brightness component map, the over-illumination dark area mask, the normal illumination area mask, the over-illumination bright area mask, and the respective first adjustment coefficients of the three masks to obtain a first weighted brightness component;
performing weighted fusion based on the second illumination correction component, the third illumination correction component, the brightness component map, the over-illumination dark area mask, the normal illumination area mask, the over-illumination bright area mask, and the respective second adjustment coefficients of the three masks to obtain a second weighted brightness component;
performing weighted fusion on the basis of the first weighted brightness component and the second weighted brightness component to obtain a fused brightness component;
wherein the first adjustment coefficient of the over-illumination dark area mask is smaller than its second adjustment coefficient, the first adjustment coefficient of the normal illumination area mask is smaller than its second adjustment coefficient, and the correction amplitude of the first illumination correction component is larger than that of the second illumination correction component.
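The exact fusion formulas of claim 2 exist only as images in the source, so the following is one plausible reading, with all coefficient values hypothetical: each soft mask blends a correction component with the original brightness, and the two weighted brightness components are then averaged.

```python
import numpy as np

def weighted_fusion(i1, i2, i3, iv, m_dark, m_norm, m_bright,
                    c11=0.8, c21=0.6, c13=0.5,   # hypothetical first adjustment coefficients
                    c12=1.0, c22=0.8, c23=0.7):  # hypothetical second adjustment coefficients
    """Speculative sketch of claim 2 (the claim's formula images are not
    reproduced). Dark and normal regions draw on the illumination-corrected
    components, bright regions on the overexposure-corrected one; the
    coefficients satisfy c11 < c12 and c21 < c22 as the claim requires."""
    v1 = m_dark * (c11 * i1 + (1 - c11) * iv) \
       + m_norm * (c21 * i1 + (1 - c21) * iv) \
       + m_bright * (c13 * i3 + (1 - c13) * iv)    # first weighted brightness component
    v2 = m_dark * (c12 * i2 + (1 - c12) * iv) \
       + m_norm * (c22 * i2 + (1 - c22) * iv) \
       + m_bright * (c23 * i3 + (1 - c23) * iv)    # second weighted brightness component
    return 0.5 * (v1 + v2)                         # fused brightness component
```

Because the masks are soft (low-pass filtered in claim 1), this blend transitions smoothly across region borders instead of producing seams.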
3. The multi-scene image enhancement method according to claim 2, wherein the first weighted brightness component is obtained according to the following formula:
(formula image not reproduced in the text)
and the second weighted brightness component is obtained according to the following formula:
(formula image not reproduced in the text)
wherein c_11 denotes the first adjustment coefficient of the over-illumination dark area mask and c_12 denotes its second adjustment coefficient; c_21 denotes the first adjustment coefficient of the normal illumination area mask and c_22 denotes its second adjustment coefficient; c_13 denotes the first adjustment coefficient of the over-illumination bright area mask and c_23 denotes its second adjustment coefficient.
4. The multi-scene image enhancement method according to claim 1, wherein the performing multilevel binarization based on the brightness information of the illumination component to obtain an over-illumination dark area, a normal illumination area and an over-illumination bright area comprises:
determining the smallest-scale illumination component among the plurality of illumination components of different scales;
obtaining a plurality of inter-class variances based on the brightness information of the minimum-scale illumination component and a plurality of groups of thresholds, wherein each group of thresholds comprises a first threshold and a second threshold, and the first threshold is smaller than the second threshold;
determining a first threshold and a second threshold corresponding to the maximum value in the multiple inter-class variances as a final first threshold and a final second threshold;
and segmenting the minimum scale illumination component based on the final first threshold and the final second threshold to obtain the over-illumination dark area, the normal illumination area and the over-illumination bright area.
5. The multi-scene image enhancement method of claim 4, wherein the deriving a plurality of inter-class variances based on the luminance information of the minimum-scale illumination component and a plurality of sets of thresholds comprises:
obtaining a plurality of said inter-class variances according to the following formula:
g(i, j) = w_1·(μ_1 − μ)^2 + w_2·(μ_2 − μ)^2 + w_3·(μ_3 − μ)^2
where i denotes a first threshold, j denotes a second threshold with i < j, and g(i, j) denotes the inter-class variance of the minimum-scale illumination component at thresholds i and j; w_1 and μ_1 denote, respectively, the proportion of the pixels of the minimum-scale illumination component lying in the range 0 to i and their average gray level; w_2 and μ_2 denote the proportion and average gray level of the pixels in the range i to j; w_3 and μ_3 denote the proportion and average gray level of the pixels in the range j to 255; and μ denotes the average gray level of the whole minimum-scale illumination component.
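The exhaustive two-threshold search of claims 4 and 5 can be sketched directly from the variance formula above; the boundary inclusivity of the 0-to-i, i-to-j, and j-to-255 ranges is the author's assumption, since the claim leaves it open.

```python
import numpy as np

def two_threshold_otsu(gray):
    """Find the thresholds (i, j), i < j, maximizing the three-class
    between-class variance g(i,j) = w1(mu1-mu)^2 + w2(mu2-mu)^2 + w3(mu3-mu)^2
    of an 8-bit illumination component."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    levels = np.arange(256, dtype=float)
    mu = float((p * levels).sum())      # global mean gray level
    cw = np.cumsum(p)                   # cumulative class probability
    cm = np.cumsum(p * levels)          # cumulative first moment
    best_g, best = -1.0, (0, 0)
    for i in range(1, 254):
        for j in range(i + 1, 255):
            w1, w2, w3 = cw[i], cw[j] - cw[i], 1.0 - cw[j]
            if w1 <= 0 or w2 <= 0 or w3 <= 0:
                continue                 # empty class: variance undefined
            m1 = cm[i] / w1
            m2 = (cm[j] - cm[i]) / w2
            m3 = (cm[255] - cm[j]) / w3
            g = w1 * (m1 - mu) ** 2 + w2 * (m2 - mu) ** 2 + w3 * (m3 - mu) ** 2
            if g > best_g:
                best_g, best = g, (i, j)
    return best
```

The cumulative sums make each (i, j) candidate an O(1) evaluation, so the full 256×256 search stays cheap.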
6. The multi-scene image enhancement method of claim 5, wherein the segmenting the minimum-scale illumination component based on the final first threshold and the final second threshold to obtain the over-illumination dark area, the normal illumination area and the over-illumination bright area comprises:
obtaining the over-illumination dark area, the normal illumination area and the over-illumination bright area according to the following formula:
(three formula images, defining the three areas in terms of the final thresholds, are not reproduced in the text)
wherein i' denotes the final first threshold, j' denotes the final second threshold, I_v_b1 denotes the over-illumination dark area, I_v_b2 denotes the normal illumination area, and I_v_b3 denotes the over-illumination bright area.
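Since claim 6's segmentation formulas are present only as images, the following is a hedged reconstruction from the variable descriptions: each area is a binary map selecting the pixels of the minimum-scale illumination component that fall in the corresponding threshold range (the boundary inclusivity is assumed, not given by the claim).

```python
import numpy as np

def segment_regions(illum, i_final, j_final):
    """Split the minimum-scale illumination component into binary maps for
    the over-illumination dark (I_v_b1), normal illumination (I_v_b2), and
    over-illumination bright (I_v_b3) areas."""
    dark = (illum <= i_final).astype(np.uint8)                           # I_v_b1
    normal = ((illum > i_final) & (illum <= j_final)).astype(np.uint8)   # I_v_b2
    bright = (illum > j_final).astype(np.uint8)                          # I_v_b3
    return dark, normal, bright
```

Claim 1 then low-pass filters these hard maps into soft masks, which is what lets the later weighted fusion blend smoothly at region borders.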
7. The multi-scene image enhancement method of claim 1, wherein the performing multi-scale illumination correction based on the brightness component map and the weighted illumination components to obtain a first illumination correction component and a second illumination correction component with different correction amplitudes comprises:
obtaining the first illumination correction component and the second illumination correction component according to the following formulas:
(two formula images, not reproduced in the text)
wherein i = 1, 2; I_v1' denotes the first illumination correction component; I_v2' denotes the second illumination correction component; I_v denotes the brightness component map; I_v_gm denotes the weighted illumination component; k_i denotes a correction coefficient; α_1 denotes a first weight coefficient and α_2 denotes a second weight coefficient, the first weight coefficient being smaller than the second weight coefficient; the remaining symbol (an image in the original) denotes the average of the saturation component map; and W denotes the total number of pixels of the saturation component map;
and the performing overexposure correction based on the brightness component map to obtain a third illumination correction component comprises:
obtaining the third illumination correction component according to the following formula:
(formula image not reproduced in the text)
wherein I_v3' denotes the third illumination correction component and ξ denotes an adjustment coefficient.
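Claim 7's formulas are likewise only images, so the sketch below substitutes well-known stand-ins rather than the patented equations: a single-scale-Retinex-style division for the illumination correction and a gamma-greater-than-one curve for the overexposure correction. Only the variable roles (I_v, I_v_gm, ξ) follow the claim; the functional forms and default constants are the author's.

```python
import numpy as np

def illumination_correct(iv, iv_gm, k=0.8, eps=1e-6):
    """Stand-in illumination correction: divide the brightness by the
    estimated illumination (Retinex-style) and softly renormalize to
    [0, 255). Not the patent's actual formula, which is not reproduced."""
    ratio = iv / (iv_gm + eps)          # reflectance-like ratio
    return 255.0 * (ratio / (ratio + k))

def overexposure_correct(iv, xi=0.6):
    """Stand-in overexposure correction: a gamma > 1 curve that darkens
    highlights while leaving black and white points fixed; xi plays the
    role of the claim's adjustment coefficient."""
    return 255.0 * (iv / 255.0) ** (1.0 + xi)
```

Two differently parameterized calls to `illumination_correct` (e.g. different k) would yield the claim's first and second correction components with different correction amplitudes.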
8. A multi-scene image enhancement system, comprising:
the conversion module is used for acquiring an image to be enhanced and converting the image to be enhanced into an HSV space to obtain a hue component diagram, a saturation component diagram and a brightness component diagram;
the filtering module is used for carrying out multi-scale filtering on the basis of the brightness component diagram to obtain a plurality of illumination components with different scales, and weighting and fusing the illumination components to obtain weighted illumination components;
the correction module is used for performing multi-scale illumination correction based on the brightness component map and the weighted illumination components to obtain a first illumination correction component and a second illumination correction component with different correction amplitudes, and for performing overexposure correction based on the brightness component map to obtain a third illumination correction component;
the partition module is used for performing multilevel binarization based on the brightness information of the illumination component to obtain an over-illumination dark area, a normal illumination area and an over-illumination bright area;
the mask module is used for filtering based on the over-illumination dark area, the normal illumination area and the over-illumination bright area to obtain a mask of the over-illumination dark area, a mask of the normal illumination area and a mask of the over-illumination bright area;
the weighting module is used for weighting and fusing the first illumination correction component, the second illumination correction component, the third illumination correction component, the over-illumination dark area mask, the normal illumination area mask and the over-illumination bright area mask to obtain a fused brightness component;
and the enhancement module is used for converting the fused brightness component, the hue component map and the saturation component map into RGB space to obtain an enhanced image.
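The conversion/enhancement bracket of the claim-8 system can be mirrored in a minimal, dependency-free skeleton: convert to HSV, let one supplied callable stand in for the filtering, correction, partition, mask, and weighting modules acting on the V channel, and convert back to RGB. The per-pixel `colorsys` conversion is for illustration only and far too slow for real images.

```python
import colorsys
import numpy as np

def enhance_pixelwise(rgb, correct_v):
    """Skeleton of the claim-8 pipeline. `correct_v` maps a V value in
    [0, 1] to a corrected V value in [0, 1] and plays the role of all the
    intermediate modules producing the fused brightness component."""
    h, w, _ = rgb.shape
    out = np.empty_like(rgb, dtype=float)
    for y in range(h):
        for x in range(w):
            r, g, b = rgb[y, x] / 255.0
            hh, ss, vv = colorsys.rgb_to_hsv(r, g, b)   # hue/saturation kept as-is
            vv = correct_v(vv)                          # fused brightness component
            out[y, x] = colorsys.hsv_to_rgb(hh, ss, vv)
    return (out * 255.0).round().astype(np.uint8)
```

Working only on V while passing hue and saturation through unchanged is what lets the method correct exposure without shifting colors, which is the point of the HSV detour in claim 1.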
9. An electronic device, comprising:
a processor;
a memory communicatively coupled to the processor;
the memory stores instructions executable by the processor to enable the processor to perform the multi-scene image enhancement method of any one of claims 1 to 7.
10. A computer readable storage medium storing computer instructions which, when executed by a processor, implement a multi-scene image enhancement method as claimed in any one of claims 1 to 7.
CN202211320708.6A 2022-07-18 2022-10-26 Multi-scene image enhancement method and system Pending CN115578284A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210844129 2022-07-18
CN2022108441295 2022-07-18

Publications (1)

Publication Number Publication Date
CN115578284A true CN115578284A (en) 2023-01-06

Family

ID=84587253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211320708.6A Pending CN115578284A (en) 2022-07-18 2022-10-26 Multi-scene image enhancement method and system

Country Status (1)

Country Link
CN (1) CN115578284A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI840207B (en) * 2023-04-28 2024-04-21 信驊科技股份有限公司 Image enhancement method and electronic device
CN116580210A (en) * 2023-07-05 2023-08-11 四川弘和数智集团有限公司 Linear target detection method, device, equipment and medium
CN116580210B (en) * 2023-07-05 2023-09-15 四川弘和数智集团有限公司 Linear target detection method, device, equipment and medium
CN116843581A (en) * 2023-08-30 2023-10-03 山东捷瑞数字科技股份有限公司 Image enhancement method, system, device and storage medium for multi-scene graph
CN116843581B (en) * 2023-08-30 2023-12-01 山东捷瑞数字科技股份有限公司 Image enhancement method, system, device and storage medium for multi-scene graph
CN117094912A (en) * 2023-10-16 2023-11-21 南洋电气集团有限公司 Welding image enhancement method and system for low-voltage power distribution cabinet
CN117094912B (en) * 2023-10-16 2024-01-16 南洋电气集团有限公司 Welding image enhancement method and system for low-voltage power distribution cabinet
CN117690142A (en) * 2024-02-01 2024-03-12 深圳中科精工科技有限公司 Wafer character preprocessing method, device and storage medium
CN117690142B (en) * 2024-02-01 2024-05-28 深圳中科精工科技有限公司 Wafer character preprocessing method, device and storage medium

Similar Documents

Publication Publication Date Title
Fu et al. Retinex-based perceptual contrast enhancement in images using luminance adaptation
CN115578284A (en) Multi-scene image enhancement method and system
Rivera et al. Content-aware dark image enhancement through channel division
Wang et al. Simple low-light image enhancement based on Weber–Fechner law in logarithmic space
Celik Spatial entropy-based global and local image contrast enhancement
Lai et al. Improved local histogram equalization with gradient-based weighting process for edge preservation
CN107358586A (en) A kind of image enchancing method, device and equipment
CN109919859B (en) Outdoor scene image defogging enhancement method, computing device and storage medium thereof
Tian et al. Global and local contrast adaptive enhancement for non-uniform illumination color images
US20240193739A1 (en) Image processing method and apparatus, computer device, and storage medium
Parihar et al. A comprehensive analysis of fusion-based image enhancement techniques
CN112541868A (en) Image processing method, image processing device, computer equipment and storage medium
Li et al. Content adaptive guided image filtering
Lei et al. Low-light image enhancement using the cell vibration model
CN114359083B (en) High-dynamic thermal infrared image self-adaptive preprocessing method for interference environment
Singh et al. Image enhancement by adaptive power-law transformations
Lee et al. Ramp distribution-based contrast enhancement techniques and over-contrast measure
CN117830134A (en) Infrared image enhancement method and system based on mixed filtering decomposition and image fusion
Yuan et al. Adaptive histogram equalization with visual perception consistency
Kim Low-light image enhancement by diffusion pyramid with residuals
CN110807748A (en) New tone mapping image enhancement method based on high dynamic range
Kaur et al. An improved adaptive bilateral filter to remove gaussian noise from color images
CN115564682A (en) Uneven-illumination image enhancement method and system
WO2020241337A1 (en) Image processing device
Simon et al. Contrast enhancement of color images using improved Retinex method

Legal Events

Date Code Title Description
PB01 Publication