CN112541868A - Image processing method, image processing device, computer equipment and storage medium


Info

Publication number
CN112541868A
Authority
CN
China
Prior art keywords: image, color, pixel, channel, value
Prior art date
Legal status
Pending
Application number
CN202011418526.3A
Other languages
Chinese (zh)
Inventor
谢朝毅
谢亮
Current Assignee
Insta360 Innovation Technology Co Ltd
Original Assignee
Insta360 Innovation Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Insta360 Innovation Technology Co Ltd
Priority to CN202011418526.3A
Publication of CN112541868A
Priority to PCT/CN2021/136086 (WO2022121893A1)

Classifications

    • G06T 5/90
    • G06T 5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T 7/90: Determination of colour characteristics
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/20221: Image fusion; Image merging

Abstract

The application relates to an image processing method, an image processing device, a computer device and a storage medium. The method comprises the following steps: carrying out image processing on an image to be processed to obtain a processed image; acquiring color channel values corresponding to all pixel points in a processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image; acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values and the channel statistical values of the saturation lifting pixel points; selecting a minimum value from the channel color ratios as a color suppression ratio corresponding to each color channel of the saturation enhancement pixel point; and carrying out color suppression processing on each color channel of the saturation enhancement pixel point based on the color suppression ratio corresponding to each color channel of the saturation enhancement pixel point to obtain a target image. The method can improve the image processing effect.

Description

Image processing method, image processing device, computer equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, a computer device, and a storage medium.
Background
With the development of image processing technology, users' requirements on image processing effects are becoming higher and higher. For example, there is a demand for good handling of saturation in image processing, and saturation is an important factor in evaluating the image processing effect. Saturation refers to the vividness of a color, also called color purity: the higher the purity, the more vivid the color; the lower the purity, the duller the color appears. However, when the saturation is too high, the image may look distorted.
For images or videos taken straight out of a camera, insufficient dynamic range and insufficient overall contrast make the whole image or video look flat, and insufficient saturation makes the overall image quality look dull. For example, the classical histogram equalization algorithm can improve the overall contrast of an image, but it tends to oversaturate colors and reduces the overall image quality; corresponding deep learning algorithms (such as DPED) can likewise cause problems such as oversaturation of the overall image and overflow of pure colors.
In summary, there is currently no image processing method that can automatically process the video or picture content captured by a user so as to improve the dynamic range of the picture while improving contrast and saturation.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an image processing method, an image processing apparatus, a computer device, and a storage medium that can improve the dynamic range of the picture, improve contrast and saturation, and improve the quality of the output image as a whole.
A method of image processing, the method comprising:
carrying out image processing on an image to be processed to obtain a processed image;
acquiring color channel values corresponding to all pixel points in the processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image;
acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values of the saturation lifting pixel points and the channel statistical values;
selecting a minimum value from the channel color ratios corresponding to the color channels of the saturation enhancement pixel points, and taking the minimum value as a color suppression ratio corresponding to each color channel of the saturation enhancement pixel points;
and carrying out color suppression processing on each color channel of the saturation enhancement pixel point based on the color suppression ratio corresponding to each color channel of the saturation enhancement pixel point to obtain a target image.
In one embodiment, the obtaining the saturation level lifting pixel point in the processed image includes:
acquiring a gray image corresponding to the image to be processed;
determining the pixel lifting ratio of each pixel point in the processed image relative to the gray image according to the pixel value of each pixel point in the processed image and the pixel value of the pixel point at the corresponding position in the gray image;
and taking the pixel points with the pixel lifting ratio value larger than a preset threshold value in the processed image as saturation lifting pixel points.
In one embodiment, the calculating, based on the color channel values of the saturation boost pixel and the channel statistic, a channel color ratio corresponding to each color channel of the saturation boost pixel includes:
calculating the change value of each color channel value of the saturation lifting pixel point relative to the channel statistical value;
determining an adjusting weight corresponding to each color channel value of the saturation lifting pixel point according to the change value, wherein the color channel value adjusting weight and the change value form a negative correlation relationship;
and calculating to obtain a channel color ratio corresponding to each color channel of the saturation promotion pixel point according to the adjustment weight corresponding to each color channel value of the saturation promotion pixel point.
In one embodiment, the calculating, according to the adjustment weight corresponding to each color channel value of the saturation lifting pixel, a channel color ratio corresponding to each color channel of the saturation lifting pixel includes:
calculating to obtain a first ratio according to the product of the adjusting weight corresponding to each color channel value of the saturation lifting pixel point and the pixel lifting ratio of the saturation lifting pixel point;
subtracting the adjustment weight corresponding to each color channel value of the saturation lifting pixel point by using a preset value to obtain a second ratio;
and adding the first ratio and the second ratio to obtain a channel color ratio corresponding to each color channel of the saturation enhancement pixel point.
In one embodiment, the method further comprises the following steps: acquiring pixel points of which the pixel values are larger than a preset threshold value in the gray level image as highlight pixel points;
multiplying the pixel lifting ratio corresponding to the highlight pixel point by a first coefficient to obtain a third ratio; the first coefficient and the pixel value of the pixel point at the corresponding position of the highlight pixel point in the processed image form a negative correlation relationship;
subtracting the first coefficient by using a preset value to obtain a fourth ratio;
and adding the third ratio and the fourth ratio to obtain a channel color ratio corresponding to each color channel of the highlight pixel points.
In one embodiment, the image processing the image to be processed to obtain a processed image includes:
carrying out image processing on an image to be processed to obtain an intermediate image;
acquiring a first pixel point of which the pixel value in the gray image corresponding to the image to be processed is smaller than a preset threshold value, and taking the pixel point corresponding to the first pixel point in the intermediate image as a second pixel point;
obtaining a noise suppression weight of the corresponding second pixel point according to the pixel value of the first pixel point in the gray level image, wherein the pixel value and the noise suppression weight form a correlation relationship;
and carrying out noise suppression on a second pixel point in the intermediate image according to the noise suppression weight to obtain a processed image.
In one embodiment, the image processing the image to be processed to obtain a processed image includes:
acquiring a gray image corresponding to an image to be processed;
brightening the gray level image to obtain a brightened image;
carrying out contrast enhancement processing on the gray level image to obtain a contrast enhanced image;
and carrying out fusion processing on the gray level image, the brightening image and the contrast enhancement image to obtain a processed image.
An image processing apparatus, the apparatus comprising:
the processing image acquisition module is used for carrying out image processing on the image to be processed to obtain a processing image;
a channel statistic value obtaining module, configured to obtain color channel values corresponding to each pixel point in the processed image, and perform statistics on the color channel values to obtain channel statistic values corresponding to the processed image;
a channel color ratio obtaining module, configured to obtain a saturation boost pixel point in the processed image, and calculate a channel color ratio corresponding to each color channel of the saturation boost pixel point based on each color channel value of the saturation boost pixel point and the channel statistical value;
a color suppression ratio obtaining module, configured to select a minimum value from channel color ratios corresponding to the color channels of the saturation boost pixel, where the minimum value is used as the color suppression ratio corresponding to each color channel of the saturation boost pixel;
and the target image acquisition module is used for carrying out color suppression processing on each color channel of the saturation promotion pixel point based on the color suppression ratio corresponding to each color channel of the saturation promotion pixel point to obtain a target image.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
carrying out image processing on an image to be processed to obtain a processed image;
acquiring color channel values corresponding to all pixel points in the processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image;
acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values of the saturation lifting pixel points and the channel statistical values;
selecting a minimum value from the channel color ratios corresponding to the color channels of the saturation enhancement pixel points, and taking the minimum value as a color suppression ratio corresponding to each color channel of the saturation enhancement pixel points;
and carrying out color suppression processing on each color channel of the saturation enhancement pixel point based on the corresponding color suppression ratio of each color channel of the saturation enhancement pixel point to obtain a target image.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
carrying out image processing on an image to be processed to obtain a processed image;
acquiring color channel values corresponding to all pixel points in the processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image;
acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values of the saturation lifting pixel points and the channel statistical values;
selecting a minimum value from the channel color ratios corresponding to the color channels of the saturation enhancement pixel points, and taking the minimum value as a color suppression ratio corresponding to each color channel of the saturation enhancement pixel points;
and carrying out color suppression processing on each color channel of the saturation enhancement pixel point based on the corresponding color suppression ratio of each color channel of the saturation enhancement pixel point to obtain a target image.
The image processing method, the image processing device, the computer equipment and the storage medium obtain a processed image after image processing is carried out on the image to be processed, obtain color channel values corresponding to all pixel points in the processed image, and obtain channel statistical values corresponding to the processed image by counting the color channel values; and then, acquiring saturation lifting pixel points in the processed image, calculating to obtain a channel color ratio corresponding to the color channel based on each color channel value and the channel statistic value of the saturation lifting pixel points, selecting a minimum value from the channel color ratio, and using the minimum value as a color suppression ratio corresponding to the saturation lifting pixel points, so that color suppression processing can be performed on the pixel points with high saturation, and the image processing effect is improved.
Drawings
FIG. 1 is a diagram of an exemplary embodiment of an image processing method;
FIG. 2 is a flow chart illustrating an image processing method;
FIG. 3 is a flowchart illustrating a method for obtaining saturation lifting pixels in a processed image according to an embodiment;
FIG. 4 is a schematic flow chart illustrating a method for calculating channel color ratios corresponding to the color channels of saturation lifting pixels according to an embodiment;
FIG. 5 is a schematic flow chart illustrating a method for calculating channel color ratios according to the adjustment weights of saturation lifting pixels according to another embodiment;
FIG. 6 is a flowchart illustrating an image processing method according to another embodiment;
FIG. 7 is a flowchart illustrating a method for processing an image to obtain a processed image according to an embodiment of the present invention;
FIG. 8 is a flowchart illustrating a method for processing an image to be processed according to another embodiment;
FIG. 9 is a block diagram showing the configuration of an image processing apparatus according to an embodiment;
FIG. 10 is a diagram showing an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The image processing method provided by the present application can be applied to the application environment shown in fig. 1. The application environment includes an image acquisition device 102 and a terminal 104, wherein the image acquisition device 102 is communicatively coupled to the terminal 104. After acquiring an image to be processed, the image acquisition device 102 transmits it to the terminal 104. The terminal 104 acquires the image to be processed, performs image processing on it, and can count the color channel values corresponding to the pixel points of the processed image to obtain a channel statistic corresponding to the processed image; acquire the saturation lifting pixel points in the processed image, and calculate the channel color ratio corresponding to each color channel based on each color channel value and the channel statistic of the saturation lifting pixel points; select the minimum value from the channel color ratios corresponding to the color channels of the saturation lifting pixel points as the color suppression ratio corresponding to the saturation lifting pixel points; and perform color suppression processing on the saturation lifting pixel points based on the color suppression ratio to obtain a target image. The image acquisition device 102 may be, but is not limited to, any of various devices with an image acquisition function, and may be arranged outside the terminal 104 or inside the terminal 104, for example various cameras, video cameras, scanners or image acquisition cards arranged outside the terminal 104. The terminal 104 may be, but is not limited to, various cameras, personal computers, laptops, smartphones, tablets, and portable wearable devices.
It is understood that the method provided by the embodiment of the present application may also be executed by a server.
In one embodiment, as shown in fig. 2, an image processing method is provided, which is described by taking the method as an example applied to the terminal in fig. 1, and includes the following steps:
step 202, performing image processing on the image to be processed to obtain a processed image.
The processed image is an image subjected to image processing.
Specifically, the terminal may use an image acquired in real time as the image to be processed; because the display effect of the image to be processed is not ideal, it needs to be processed to obtain the processed image.
In an embodiment, the terminal may also obtain the processed image by obtaining the image to be processed from the memory in which the image is stored, and performing image processing on the image to be processed.
In an embodiment, a gray-scale processing may be performed on an image to be processed to obtain a gray-scale image, the gray-scale image may be subjected to a brightening process to obtain a brightening image, the gray-scale image may be subjected to a contrast enhancement process to obtain a contrast-enhanced image, the gray-scale image, the brightening image, and the contrast-enhanced image may be subjected to a fusion process to obtain an intermediate processed image, and the intermediate processed image obtained at this time may be used as the processed image.
Further, in an embodiment, the intermediate processed image obtained by fusing the grayscale image, the brightened image and the contrast-enhanced image in the above embodiment may additionally be subjected to noise suppression processing, and the resulting image is used as the processed image.
Further, in the above embodiment, the noise suppression processing performed on the intermediate processed image (i.e. the intermediate image) may specifically be carried out as follows:
acquiring a first pixel point of which the pixel value in a gray image corresponding to an image to be processed is smaller than a preset threshold value, and taking the pixel point corresponding to the first pixel point in an intermediate image as a second pixel point;
obtaining a noise suppression weight of a corresponding second pixel point according to a pixel value of a first pixel point in the gray level image, wherein the pixel value and the noise suppression weight form a correlation;
and carrying out noise suppression on a second pixel point in the intermediate image according to the noise suppression weight to obtain a processed image.
And 204, acquiring color channel values corresponding to all pixel points in the processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image.
The color channel value is the value of each channel after a pixel point is split into the three channels R (red), G (green) and B (blue). The value of each channel may be represented using the pixel value corresponding to that channel; for example, for a given pixel, the color channel values of the three RGB channels may be 199, 237 and 204 respectively. The channel statistic is a value that reflects the overall data characteristics of the color channel values; it may be the average of the three color channel values, or their median.
Specifically, the terminal can detect the color channel value corresponding to each pixel point by acquiring the processed image and using the pixel point analysis software or the analysis tool provided by the terminal, and obtain the channel statistic value corresponding to the processed image through calculation.
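The following minimal Python sketch (added for illustration; the function name and array layout are the editor's assumptions) computes the channel statistic as the per-pixel average, or optionally the median, of the three color channel values described above:

import numpy as np

def channel_statistic(processed, use_median=False):
    # processed: H x W x 3 array with channels in R, G, B order (values 0-255).
    processed = processed.astype(np.float32)
    if use_median:
        return np.median(processed, axis=-1)   # median of the three channel values
    return processed.mean(axis=-1)             # average of the three channel values

For the example pixel above with channel values (199, 237, 204), the statistic (mean) is about 213.3.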
And step 206, acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values and the channel statistical values of the saturation lifting pixel points.
Saturation lifting pixel points refer to pixel points whose saturation is improved relative to the corresponding pixel points in the image to be processed. The channel color ratio refers to the adjustment ratio corresponding to each of the RGB color channels of a pixel point.
Specifically, after the color channel value corresponding to each pixel point is obtained, a channel statistical value corresponding to the processed image is obtained through calculation, the saturation promoting pixel point in the processed image is obtained, and the channel color ratio is obtained based on the saturation promoting pixel point in the processed image.
In one embodiment, the color saturation of a certain channel of a pixel is positively correlated with the absolute value of the difference between that channel's color channel value and the corresponding channel statistic. The greater this absolute difference, the higher the color saturation of that channel. For example, the larger the absolute value of the difference between the R (red) channel value and the channel statistic, the higher the saturation of the R channel, and the redder that pixel appears in the image.
And 208, selecting the minimum value from the channel color ratios corresponding to the color channels of the saturation enhancement pixel point, and using the minimum value as the color suppression ratio corresponding to the color channels of the saturation enhancement pixel point.
The color suppression ratio is a ratio used to suppress excessive color saturation; by establishing a functional relationship between this ratio and the corresponding color channels, the image processing effect can be improved.
Specifically, the channel color ratios are values greater than 1, and the larger the value, the stronger the saturation boost. The minimum value selected from the channel color ratios is therefore the value closest to 1; this value is used as the color suppression ratio, and color suppression processing is performed on the saturation lifting pixel points based on this ratio to obtain the target image.
And step 210, performing color suppression processing on each color channel of the saturation enhancement pixel point based on the color suppression ratio corresponding to each color channel of the saturation enhancement pixel point to obtain a target image.
In one embodiment, after the color suppression ratio is obtained, a functional relationship is established between the pixel value of the saturation lifting pixel point and the color suppression ratio; the pixel value computed through this functional relationship achieves the effect of the color suppression processing, and the target image is obtained.
In an embodiment, the functional relationship between the saturation boost pixel point and the color suppression ratio may be established by taking the color suppression ratio as a coefficient, and multiplying the pixel value of the saturation boost pixel point by the color suppression ratio to obtain a pixel point subjected to suppression processing, thereby obtaining the target image.
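As an illustrative sketch only (the clipping to 0-255 and the function name are the editor's assumptions), the selection of the minimum channel color ratio and the multiplication described above can be written as:

import numpy as np

def suppress_saturation(pixel_rgb, channel_ratios):
    # channel_ratios: channel color ratios computed for the R, G and B channels of
    # one saturation lifting pixel point (values greater than 1).
    suppression_ratio = min(channel_ratios)                  # minimum value as color suppression ratio
    vals = np.asarray(pixel_rgb, dtype=np.float32) * suppression_ratio
    return np.clip(vals, 0, 255).astype(np.uint8)            # pixel after color suppression

print(suppress_saturation((120, 80, 60), [1.25, 1.40, 1.60]))  # hypothetical ratios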
In the image processing method, after image processing is carried out on an image to be processed, a processed image is obtained, color channel values corresponding to all pixel points in the processed image are obtained, and channel statistical values corresponding to the processed image are obtained by counting the color channel values; and then, acquiring saturation lifting pixel points in the processed image, calculating to obtain a channel color ratio corresponding to the color channel based on each color channel value and the channel statistic value of the saturation lifting pixel points, selecting a minimum value from the channel color ratio, and using the minimum value as a color suppression ratio corresponding to the saturation lifting pixel points, so that color suppression processing can be performed on the pixel points with high saturation, and the image processing effect is improved.
In one embodiment, as shown in fig. 3, the step of obtaining the saturation boost pixel point in the processed image includes:
step 302, obtaining a gray image corresponding to the image to be processed.
The grayscale image is an image obtained by dividing an image from black to white into several levels. The gray level image can make the transition of the image smoother and finer.
In one embodiment, the image to be processed may be converted correspondingly to obtain the converted grayscale image; the grayscale image corresponding to the image to be processed can be obtained by performing corresponding mathematical operations on the image to be processed. For example, representing the grayscale image as Gray and the channel color values of the three channels of a pixel of the image to be processed as R, G and B respectively, the grayscale image can be calculated by the floating-point algorithm: Gray = R*0.3 + G*0.59 + B*0.11.
Step 304, determining a pixel lifting ratio of each pixel point in the processed image relative to the gray image according to the pixel value of each pixel point in the processed image and the pixel value of the pixel point at the corresponding position in the gray image.
The pixel lifting ratio refers to the enhancement degree of the pixel value of each pixel point in the processed image relative to the pixel value of the pixel point at the corresponding position in the gray level image.
Specifically, the pixel value of the pixel point is a numerical value, the numerical value and the saturation are in a positive correlation relationship, and the pixel lifting ratio of the processed image relative to the gray image can be determined through the numerical value.
In one embodiment, the pixel lifting ratio of the processed image relative to the grayscale image may be expressed as a multiple relationship between the pixel values of the pixel points in the processed image and the pixel values of the pixel points at the corresponding positions in the grayscale image. For example, suppose the pixel value of each pixel point of the processed image is represented as I, the pixel value of the pixel point at the corresponding position in the grayscale image is represented as I1, and the pixel lifting ratio of the processed image relative to the grayscale image is represented as ratio_src; the relationship among the three can then be expressed as:
ratio_src = (I + 1) / (I1 + 1)
step 306, taking the pixel points with the pixel lifting ratio value larger than the preset threshold value in the processed image as saturation lifting pixel points.
The preset threshold is a set critical value: when the pixel lifting ratio of a pixel point is larger than this critical value, the pixel point is used as a saturation lifting pixel point; when it is smaller than the critical value, the pixel point does not need to be used as a saturation lifting pixel point.
Specifically, after the pixel lifting ratio is obtained, the pixels needing saturation lifting can be screened out in a form of a preset threshold, and the saturation of the screened pixels is adjusted.
In one embodiment, the preset threshold may be set to a fixed value, and pixel points whose pixel lifting ratio is greater than this fixed value are screened out as pixel points whose saturation is to be boosted; it can be understood that pixel points whose ratio is smaller than the fixed value are not used as saturation lifting pixel points. For example, with the fixed value set to 1 and the pixel lifting ratio denoted ratio_src: when ratio_src is greater than 1, the corresponding pixel points in the processed image that satisfy the condition are screened out for saturation boosting; when ratio_src is less than 1, saturation boosting is not performed on those pixel points.
In this embodiment, the pixel lifting ratio of the processed image relative to the gray image is determined by determining the processed image, a preset threshold is set for the pixel lifting ratio, and the saturation lifting pixel points are screened out through the preset threshold, so that the purpose of accurately determining the saturation lifting pixel points can be achieved.
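A minimal sketch of this embodiment (function and variable names are the editor's; the images are assumed to be single-channel intensity arrays) computes the pixel lifting ratio ratio_src and screens the saturation lifting pixel points with the preset threshold:

import numpy as np

def saturation_boost_mask(processed, gray, threshold=1.0):
    # processed: pixel values I of the processed image; gray: pixel values I1 of the
    # grayscale image at the corresponding positions.
    processed = processed.astype(np.float32)
    gray = gray.astype(np.float32)
    ratio_src = (processed + 1.0) / (gray + 1.0)   # pixel lifting ratio
    mask = ratio_src > threshold                   # saturation lifting pixel points
    return ratio_src, mask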
In an embodiment, as shown in fig. 4, the step of calculating, based on the color channel values and the channel statistics of the saturation enhancement pixel, a channel color ratio corresponding to each color channel of the saturation enhancement pixel includes:
step 402, calculating the change value of each color channel value of the saturation lifting pixel point relative to the channel statistical value.
In an embodiment, the change value of each color channel value of the saturation lifting pixel point relative to the channel statistic may be the difference between that color channel value and the channel statistic. For example, if the color channel value of any saturation lifting pixel point is c, the channel statistic is avg, and the change value of that color channel value relative to the channel statistic is d, then d is expressed as: d = c - avg.
And step 404, determining an adjusting weight corresponding to each color channel value of the saturation lifting pixel point according to the change value, wherein the color channel value adjusting weight and the change value form a negative correlation relationship.
The adjustment weight refers to the degree to which each color channel value of a saturation lifting pixel point is to be adjusted. The adjustment weight and the change value are negatively correlated: the larger the change value, the smaller the adjustment weight, and the smaller the change value, the larger the adjustment weight.
In one embodiment, the adjustment weight may be represented by a functional relationship between the change value and the adjustment weight. For example, if any channel adjustment weight is denoted as w (c, avg), the functional relationship between the variation value d and the color channel value adjustment weight w (c, avg) can be expressed as:
(The formula for w(c, avg) is given as a formula image in the original publication.)
and 406, calculating to obtain a channel color ratio corresponding to each color channel of the saturation lifting pixel point according to the adjusting weight corresponding to each color channel of the saturation lifting pixel point.
Specifically, from the adjustment weight w(c, avg) corresponding to the saturation lifting pixel point and its pixel lifting ratio ratio_src, the channel color ratio ratio_c corresponding to each color channel of the saturation lifting pixel point can be calculated.
In one embodiment, the adjustment weight corresponding to each color channel value of the saturation lifting pixel point is positively correlated with the channel color ratio corresponding to that color channel: the larger the adjustment weight, the higher the channel color ratio. For example, the channel color ratio ratio_c corresponding to each color channel of the saturation lifting pixel point can be expressed as:
ratio_c = ratio_src * w(c, avg) + (1 - w(c, avg)) * 1
in this embodiment, the adjustment weight corresponding to each color channel value of each saturation promotion pixel point is obtained through the change value of each color channel value of the saturation promotion pixel point relative to the channel statistical value, and the purpose of obtaining the channel color ratio can be achieved through adjusting the weight.
In an embodiment, as shown in fig. 5, the step of calculating, according to the adjustment weight corresponding to each color channel value of the saturation lifting pixel, a channel color ratio corresponding to each color channel of the saturation lifting pixel includes:
step 502, calculating to obtain a first ratio according to a product of the adjusting weight corresponding to each color channel value of the saturation lifting pixel point and the pixel lifting ratio of the saturation lifting pixel point.
Specifically, expressing the first ratio as ratio_c1, it can be calculated by the following formula:
ratio_c1 = ratio_src * w(c, avg)
step 504, subtracting the adjustment weight corresponding to each color channel value of the saturation lifting pixel point from the preset value to obtain a second ratio.
Specifically, expressing the second ratio as ratio_c2 and the preset value as e, ratio_c2 can be expressed as:
ratio_c2 = e - w(c, avg)
In one embodiment, the preset value e may be 1, and then the second ratio ratio_c2 can be expressed as:
ratio_c2 = 1 - w(c, avg)
step 506, the first ratio and the second ratio are added to obtain a channel color ratio corresponding to each color channel of the saturation enhancement pixel point.
Specifically, expressing the channel color ratio as ratio_c, the first ratio as ratio_c1 and the second ratio as ratio_c2, the channel color ratio ratio_c can be expressed as:
ratio_c = ratio_c1 + ratio_c2
in this embodiment, the product of the adjustment weight corresponding to each color channel value of the saturation promotion pixel point and the pixel promotion ratio of the saturation promotion pixel point is obtained as a first ratio, the adjustment weight corresponding to the saturation promotion pixel point is subtracted by the preset value to obtain a second ratio, the purpose of obtaining the channel color ratio corresponding to the color channel can be achieved through the first ratio and the second ratio, so that the minimum value in the channel color ratio can be selected as the color suppression ratio, the color suppression processing is performed on the image through the color suppression ratio, and the target image is obtained.
In one embodiment, as shown in fig. 6, the step image processing method further includes:
step 602, obtaining a pixel point in the gray image whose pixel value is greater than a preset threshold value as a highlight pixel point.
The highlight pixel points refer to pixel points with higher pixel values.
Specifically, a preset threshold value is set for a pixel value, and a pixel point with the pixel value larger than the preset threshold value is used as a highlight pixel point; and not listing the pixel points with the pixel values less than or equal to the preset threshold value into the range of the highlight pixel points. It can be understood that highlight pixels have an influence on the quality of an image, and the more highlight pixels exist in the image, the worse the quality of the image is.
Step 604, multiplying the pixel lifting ratio corresponding to the highlight pixel point by the first coefficient to obtain a third ratio;
specifically, the pixel lifting ratio corresponding to the highlight pixel point is expressed as ratiosrcThe first coefficient is expressed as f and the third ratio is expressed as ratioc3Then the third ratio is expressed as ratioc3Can be expressed as:
ratioc3=ratiosrc*f
the first coefficient f is in negative correlation with the pixel value I of the pixel point at the corresponding position of the highlight pixel point in the processed image. The first coefficient f may be expressed as:
(The formula for the first coefficient f is given as a formula image in the original publication, where the parameter α = 255 - preset threshold.)
In one embodiment, the preset threshold may be 230, and the pixels with pixel values greater than 230 in the grayscale image are taken as highlight pixels. The first coefficient f may be obtained by a preset threshold.
Step 606, the first coefficient is subtracted from the preset value to obtain a fourth ratio.
Specifically, denoting the preset value as g and the fourth ratio as ratio_c4, the fourth ratio ratio_c4 is expressed as:
ratio_c4 = g - f
In one embodiment, the preset value is 1, and then the fourth ratio ratio_c4 is expressed as:
ratio_c4 = 1 - f
step 608, adding the third ratio to the fourth ratio to obtain a channel color ratio corresponding to each color channel of the highlight pixel.
Specifically, representing the channel color ratio corresponding to the color channels of the highlight pixel point as ratio_c', ratio_c' is expressed as:
ratio_c' = ratio_c3 + ratio_c4
in one embodiment, three channels are each multiplied by the ratioc'And the channel color ratio corresponding to the color channel of the highlight pixel point can be obtained.
In one embodiment, three channels are respectively multiplied by the ratio, if the maximum value in the obtained results is larger than 255, the pixel value on the channel is forced to be set to 255, the ratio of 255 to the maximum value is obtained through calculation, and the results of the other two channels are multiplied by the ratio to obtain the channel color ratio corresponding to the color channel of the highlight pixel.
In this embodiment, highlight pixel points are screened out through setting a preset threshold, and the purpose of obtaining the channel color ratio corresponding to the color channel of the highlight pixel points can be achieved through the functional relationship between the pixel promotion ratio corresponding to the highlight pixel points and the corresponding coefficient.
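The overflow handling for highlight pixel points described above can be sketched as follows (illustrative only; the uint8 range and names are the editor's assumptions):

import numpy as np

def apply_highlight_ratio(pixel_rgb, ratio_hl):
    # ratio_hl: channel color ratio ratio_c' computed for the highlight pixel point.
    vals = np.asarray(pixel_rgb, dtype=np.float32) * ratio_hl   # multiply all three channels
    m = vals.max()
    if m > 255.0:
        # Force the overflowing channel to 255 and rescale the other channels by 255 / max.
        vals = vals * (255.0 / m)
    return np.clip(vals, 0, 255).astype(np.uint8)

print(apply_highlight_ratio((240, 200, 180), ratio_hl=1.15))  # hypothetical values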
In one embodiment, as shown in fig. 7, the step of performing image processing on the image to be processed to obtain a processed image includes:
step 702, performing image processing on the image to be processed to obtain an intermediate image.
The intermediate image is an image that has undergone part of the processing of the image to be processed but has not yet become the processed image required by the user.
Specifically, the intermediate image may be a fusion image, and the fusion image is an image obtained by performing fusion processing on an image to be processed.
Step 704, obtaining a first pixel point of which the pixel value in the gray image corresponding to the image to be processed is smaller than a preset threshold, and taking the pixel point corresponding to the first pixel point in the intermediate image as a second pixel point.
Specifically, the terminal can acquire a pixel value in a gray-scale image corresponding to the image to be processed through a pixel value acquisition tool, acquire a first pixel point smaller than a preset threshold value by comparing the acquired pixel value with the preset threshold value stored at the terminal, and take a pixel point corresponding to the first pixel point in the intermediate image as a second pixel point.
In an embodiment, the preset threshold of the pixel point may be 64, and the terminal acquires a pixel point with a pixel value smaller than 64 in the grayscale image as a first pixel point and acquires a second pixel point in the intermediate image corresponding to the first pixel point.
Step 706, obtaining a noise suppression weight of a corresponding second pixel point according to a pixel value of a first pixel point in the grayscale image, where the pixel value and the noise suppression weight form a correlation.
Specifically, the noise suppression weight of the second pixel point may be represented as w, the pixel value of the first pixel point in the grayscale image as I1, and the preset threshold as β; the noise suppression weight w of the second pixel point can then be expressed as:
w = (I1/β)² * 2 + 1.0 - (I1/β)², for I1 < β
in one embodiment, the preset threshold β is 64, and a pixel value smaller than 64 can be regarded as a dark image, and the noise suppression weight w of the second pixel point is obtained by selecting the preset threshold.
And 708, performing noise suppression on the second pixel point in the intermediate image according to the noise suppression weight to obtain a processed image.
Specifically, after the noise suppression weight is obtained through calculation, the gray level image is adjusted through the noise suppression weight, and a processed image is obtained. It will be appreciated that the processed image may be taken as an intermediate image prior to acquisition of the target image.
In one embodiment, the intermediate image may be processed with the noise suppression weight, and the intermediate image may be controlled to lie within twice the pixel value range of the image to be processed. For example, representing the intermediate image as I_enhance and the processed image as I, I can be expressed as:
(The formula is given as a formula image in the original publication.)
in this embodiment, by obtaining the intermediate image, and obtaining the first pixel point within the preset threshold in the grayscale image, the corresponding second pixel point in the intermediate image, and the noise suppression weight, the pixel value of the processed image can be controlled within twice of the pixel value of the image to be processed, so that the image with the pixel value smaller than the preset threshold can improve the brightness, so that the details are more prominent, and the occurrence of color faults due to too large difference of the pixel values is prevented.
In one embodiment, as shown in fig. 8, the step of performing image processing on the image to be processed to obtain a processed image includes:
step 802, obtaining a gray image corresponding to the image to be processed.
In one embodiment, after the terminal acquires the image to be processed, the image to be processed is converted into a grayscale image, and the conversion method includes a component method, a maximum value method, a weighted average value method, or the like.
In one embodiment, a weighted average method may be used to convert the image to be processed into a grayscale image: the RGB three-channel pixel values of the image to be processed are weighted and averaged with different weights to obtain the grayscale image. For example, representing the pixel value of the grayscale image as I1(i, j) and the RGB three-channel pixel values of the image to be processed as R(i, j), G(i, j) and B(i, j) respectively, the pixel value I1(i, j) of the grayscale image can be expressed as:
I1(i, j) = 0.299·R(i, j) + 0.578·G(i, j) + 0.114·B(i, j)
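For illustration (not part of the original disclosure), the weighted-average conversion can be written as:

import numpy as np

def to_gray_weighted(rgb):
    # rgb: H x W x 3 array with channels in R, G, B order (values 0-255).
    # 0.578 is the green weight as written in the text; the common BT.601 weight is 0.587,
    # so the value in the source may simply be a typo.
    rgb = rgb.astype(np.float32)
    gray = 0.299 * rgb[..., 0] + 0.578 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return np.clip(gray, 0, 255)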
and step 804, performing brightening processing on the gray level image to obtain a brightening image.
Specifically, the brightened image corresponding to the grayscale image can be obtained through a functional relationship between the grayscale image pixel values and the brightened image pixel values. For example, representing the brightened image as I2, the grayscale image as I1, and a and b as parameters, the following functional relationship exists between the two images:
(The functional relationship is given as a formula image in the original publication.)
in an embodiment, the argument a may be 2, the argument b may be 20, and a brightening image with more details can be obtained after the brightening process is performed on the gray-scale image through the above functional relationship by selecting the arguments.
Step 806, performing contrast enhancement processing on the grayscale image to obtain a contrast enhanced image.
Specifically, a low-pass filtered image I_guide can be obtained by applying guided filtering to the grayscale image, and from the low-pass filtered image I_guide a contrast-enhanced intermediate image v1 is obtained:
v1 = (I_guide - 127.5) * c + 127.5
Wherein the guided filtering is a low pass filter.
In one embodiment, the parameter c may be chosen as 1.2, enabling the contrast-enhanced intermediate image v1 to retain more image detail.
In one embodiment, the difference between the grayscale image and the intermediate image is calculated so as to recover more image detail, and a detail enhancement coefficient k_detail is set, so that a detail-enhanced intermediate processed image v2 can be obtained, expressed by the formula:
v2 = v1 + (I1 - v1) * k_detail
in one embodiment, the detail enhancement factor k is used to obtain a high-quality gray scale imagedetailAnd 4 can be set, on the basis of the detail enhancement coefficient, the processed image details are more prominent, and the image processing effect is improved.
In one embodiment, preset upper and lower limits are set for the intermediate processed image v2, with the upper limit expressed as val_max and the lower limit as val_min, formulated as:
val_max = I1 * k_detail
val_min = I1 * k_detail - (k_detail - 1)
The contrast-enhanced image I3 is then obtained:
(The formula is given as a formula image in the original publication.)
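A sketch of the contrast enhancement chain above (illustrative only): a plain box filter stands in for the guided filter, and the final clamp of v2 to [val_min, val_max] is an assumption, since that step appears only as a formula image in the original:

import numpy as np
from scipy.ndimage import uniform_filter

def contrast_enhance(gray, c=1.2, k_detail=4.0, radius=8):
    # gray: grayscale image I1 (values 0-255).
    gray = gray.astype(np.float32)
    i_guide = uniform_filter(gray, size=2 * radius + 1)   # low-pass image (stand-in for guided filtering)
    v1 = (i_guide - 127.5) * c + 127.5                    # contrast-enhanced intermediate image
    v2 = v1 + (gray - v1) * k_detail                      # detail-enhanced intermediate processed image
    val_max = gray * k_detail
    val_min = gray * k_detail - (k_detail - 1.0)
    i3 = np.clip(v2, val_min, val_max)                    # assumed clamping to obtain I3
    return np.clip(i3, 0, 255)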
and 808, fusing the gray level image, the brightening image and the contrast enhancement image to obtain a processed image.
In one embodiment, the grayscale image, the brightened image and the contrast-enhanced image may be weighted to obtain the processed image. Expressing the grayscale image as I1, the brightened image as I2 and the contrast-enhanced image as I3, the weight maps corresponding to I1, I2 and I3 can be represented using the following formula:
(The formula is given as a formula image in the original publication.)
In one embodiment, the weight maps corresponding to the grayscale image I1, the brightened image I2 and the contrast-enhanced image I3 are set to w1, w2 and w3 respectively:
(The formula is given as a formula image in the original publication.)
Through the setting of the parameter in w1, w2 and w3 (given as a formula image in the original publication) and of the parameter δ, the value of each pixel point in the weight maps corresponding to I1, I2 and I3 can be made closer to the pixel mean of the image.
The three weight maps are then normalized at each position, converting the weights into decimals between 0 and 1. Specifically, the three weights at a position are summed, and each weight is replaced by the current weight value divided by the weight sum; for example, if the weights w1, w2 and w3 at a position are {3.5, 2.5, 4}, they are normalized and converted into {0.35, 0.25, 0.4}. The converted weights are set as the new weights at the corresponding positions to form the weight maps. Multi-scale fusion or multi-resolution fusion is then performed on the weight maps, the grayscale image, the brightened image and the contrast-enhanced image to obtain the processed image. It is to be understood that the processed image at this point may be a fused image serving as the intermediate image.
In this embodiment, the purpose of obtaining a fused image can be achieved by converting the image into a grayscale image, a brightened image, and a contrast-enhanced image, respectively, for fusion processing, and performing weighted fusion on the images.
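Since the weight-map formula is given only as a formula image, the sketch below (illustrative only) takes the three weight maps as inputs, normalizes them per position as described above, and replaces the multi-scale fusion with a plain weighted sum:

import numpy as np

def fuse_images(i1, i2, i3, w1, w2, w3):
    # i1, i2, i3: grayscale, brightened and contrast-enhanced images; w1, w2, w3: weight maps.
    w = np.stack([w1, w2, w3]).astype(np.float32)
    w /= w.sum(axis=0, keepdims=True)    # per-position normalization, e.g. {3.5, 2.5, 4} -> {0.35, 0.25, 0.4}
    fused = w[0] * i1 + w[1] * i2 + w[2] * i3
    return np.clip(fused, 0, 255)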
In one embodiment, the fused image obtained in the above embodiment may be used as the processed image in step 202; alternatively, after the fused image is subjected to the noise suppression processing of steps 702 to 708 in fig. 7, the result may also be used as the processed image in step 202.
It should be understood that although the various steps in the flow charts of fig. 1-8 are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least some of the steps in fig. 1-8 may include multiple steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, which are not necessarily performed in sequence, but may be performed in turn or alternately with other steps or at least some of the other steps.
In one embodiment, as shown in fig. 9, there is provided an image processing apparatus 900 including: a processed image acquisition module 902, a channel statistics acquisition module 904, a channel color ratio acquisition module 906, a color suppression ratio acquisition module 908, and a target image acquisition module 910, wherein:
a processed image obtaining module 902, configured to perform image processing on an image to be processed to obtain a processed image;
a channel statistic value obtaining module 904, configured to obtain a color channel value corresponding to each pixel point in the processed image, and perform statistics on the color channel value to obtain channel statistics corresponding to the processed image;
a channel color ratio obtaining module 906, configured to obtain a saturation boost pixel in the processed image, and calculate, based on each color channel value of the saturation boost pixel and a channel statistic value, a channel color ratio corresponding to each color channel of the saturation boost pixel;
a color suppression ratio obtaining module 908, configured to select a minimum value from the channel color ratios corresponding to the color channels of the saturation boost pixel, as a color suppression ratio corresponding to each color channel of the saturation boost pixel;
the target image obtaining module 910 is configured to perform color suppression processing on each color channel of the saturation boost pixel based on a color suppression ratio corresponding to each color channel of the saturation boost pixel, so as to obtain a target image.
In one embodiment, the channel color ratio obtaining module 906 is further configured to obtain a grayscale image corresponding to the image to be processed;
determining the pixel lifting ratio of each pixel point in the processed image relative to the gray image according to the pixel value of each pixel point in the processed image and the pixel value of the pixel point at the corresponding position in the gray image;
and taking the pixel points with the pixel lifting ratio value larger than the preset threshold value in the processed image as saturation lifting pixel points.
In one embodiment, the channel color ratio obtaining module 906 is further configured to calculate a change value of each color channel value of the saturation boost pixel point relative to the channel statistical value;
determining an adjusting weight corresponding to each color channel value of the saturation lifting pixel point according to the change value, wherein the color channel value adjusting weight and the change value are in a negative correlation relationship;
according to the adjustment weight corresponding to each color channel value of the saturation lifting pixel point, calculating to obtain the channel color ratio corresponding to each color channel of the saturation lifting pixel point
In one embodiment, the channel color ratio obtaining module 906 is further configured to calculate a first ratio according to a product of an adjustment weight corresponding to each color channel value of the saturation boost pixel and a pixel boost ratio of the saturation boost pixel;
subtracting the adjustment weight corresponding to each color channel value of the saturation boost pixel from the preset value to obtain a second ratio;
and adding the first ratio and the second ratio to obtain a channel color ratio corresponding to each color channel of the saturation enhancement pixel point.
In one embodiment, the image processing apparatus further includes a highlight pixel point acquisition module, a third ratio acquisition module, a fourth ratio acquisition module and a channel color ratio acquisition module, wherein:
the highlight pixel point acquisition module is used for acquiring pixel points with pixel values larger than a preset threshold value in the gray level image as highlight pixel points;
the third ratio acquisition module is used for multiplying the pixel lifting ratio corresponding to the highlight pixel point by the first coefficient to obtain a third ratio; the first coefficient and the pixel value of the pixel point at the corresponding position of the highlight pixel point in the processed image form a negative correlation relationship;
the fourth ratio obtaining module is used for subtracting the first coefficient from the preset value to obtain a fourth ratio;
and the channel color ratio acquisition module is used for adding the third ratio and the fourth ratio to obtain a channel color ratio corresponding to each color channel of the highlight pixel point.
In one embodiment, the process image acquisition module is further to:
carrying out image processing on an image to be processed to obtain an intermediate image;
acquiring a first pixel point of which the pixel value in a gray image corresponding to an image to be processed is smaller than a preset threshold value, and taking the pixel point corresponding to the first pixel point in the intermediate image as a second pixel point;
obtaining a noise suppression weight of a corresponding second pixel point according to a pixel value of a first pixel point in the gray level image, wherein the pixel value and the noise suppression weight form a correlation;
and carrying out noise suppression on a second pixel point in the intermediate image according to the noise suppression weight to obtain a processed image.
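One possible reading of this noise-suppression step is the blend below: pixels that are dark in the grayscale image (the first pixels) have their counterparts in the intermediate image (the second pixels) pulled toward the grayscale value. The linear weight, the 40/0.6 constants, and the blend target are all assumptions; the text only ties the noise suppression weight to the grayscale pixel value.

```python
import numpy as np

def suppress_dark_noise(intermediate, gray, dark_threshold=40.0, max_strength=0.6):
    """Blend dark pixels of the intermediate image toward the grayscale input."""
    gray = gray.astype(np.float32)
    out = intermediate.astype(np.float32).copy()
    dark = gray < dark_threshold                                   # first / second pixels
    # Noise suppression weight, chosen here to grow as the pixel gets darker.
    weight = np.clip((dark_threshold - gray) / dark_threshold, 0.0, 1.0) * max_strength
    for c in range(out.shape[2]):                                  # blend each color channel
        channel = out[..., c]
        channel[dark] = (1.0 - weight[dark]) * channel[dark] + weight[dark] * gray[dark]
    return out
```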
In one embodiment, the processed image acquisition module is further configured to:
acquiring a gray image corresponding to an image to be processed;
brightening the gray level image to obtain a brightened image;
carrying out contrast enhancement processing on the gray level image to obtain a contrast enhanced image;
and carrying out fusion processing on the gray level image, the brightened image and the contrast enhanced image to obtain a processed image.
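As a sketch of this fusion path, the code below derives a brightened and a contrast-enhanced variant from the grayscale image and blends the three with fixed weights. The gamma brightening, the linear contrast stretch, and the 0.2/0.4/0.4 weights are assumptions, and the result is only the fused single-channel image; how the fusion is carried back to a full-color processed image is left open here.

```python
import numpy as np

def fuse_gray_variants(gray, weights=(0.2, 0.4, 0.4), gamma=0.6, contrast_gain=1.5):
    """Fuse the grayscale image with brightened and contrast-enhanced versions of it."""
    g = gray.astype(np.float32) / 255.0
    brightened = np.power(g, gamma)                                  # lifts dark tones
    contrast = np.clip((g - 0.5) * contrast_gain + 0.5, 0.0, 1.0)    # stretches contrast
    fused = weights[0] * g + weights[1] * brightened + weights[2] * contrast
    return np.clip(fused * 255.0, 0.0, 255.0).astype(np.uint8)
```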
For specific limitations of the image processing apparatus, reference may be made to the limitations of the image processing method above, which are not repeated here. The modules in the image processing apparatus described above may be implemented wholly or partially in software, hardware, or a combination of the two. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 10. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless communication can be implemented through Wi-Fi, a carrier network, NFC (near field communication), or other technologies. The computer program is executed by the processor to implement an image processing method. The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, a key, a trackball, or a touch pad provided on the housing of the computer device, or an external keyboard, touch pad, or mouse.
Those skilled in the art will appreciate that the architecture shown in fig. 10 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or use a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
carrying out image processing on an image to be processed to obtain a processed image;
acquiring color channel values corresponding to all pixel points in a processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image;
acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values and the channel statistical values of the saturation lifting pixel points;
selecting a minimum value from the channel color ratios corresponding to the color channels of the saturation enhancement pixel points as a color suppression ratio corresponding to each color channel of the saturation enhancement pixel points;
and carrying out color suppression processing on each color channel of the saturation enhancement pixel point based on the color suppression ratio corresponding to each color channel of the saturation enhancement pixel point to obtain a target image.
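Tying the listed steps together for the saturation boost pixels, one plausible sketch is shown below. The per-channel mean used as the channel statistic (for example `processed.reshape(-1, 3).mean(axis=0)`) and the use of division as the color suppression operation are assumptions; the steps above only say that the suppression is based on the minimum channel color ratio.

```python
import numpy as np

def color_suppress(processed, boost_mask, channel_ratios):
    """Apply the color suppression ratio to each saturation boost pixel.

    processed:      H x W x 3 array, the processed image
    boost_mask:     H x W boolean mask of saturation boost pixels
    channel_ratios: H x W x 3 array of per-channel color ratios
    """
    suppression = channel_ratios.min(axis=2, keepdims=True)   # color suppression ratio
    out = processed.astype(np.float32).copy()
    # Dividing by a ratio >= 1 pulls an over-boosted color back toward neutral;
    # the exact suppression operator is not fixed here and division is one reading.
    out[boost_mask] = out[boost_mask] / np.maximum(suppression[boost_mask], 1e-6)
    return np.clip(out, 0.0, 255.0)
```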
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring a gray image corresponding to an image to be processed;
determining the pixel lifting ratio of each pixel point in the processed image relative to the gray image according to the pixel value of each pixel point in the processed image and the pixel value of the pixel point at the corresponding position in the gray image;
and taking the pixel points with the pixel lifting ratio value larger than the preset threshold value in the processed image as saturation lifting pixel points.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
calculating the change value of each color channel value of the saturation lifting pixel point relative to the channel statistical value;
determining an adjusting weight corresponding to each color channel value of the saturation lifting pixel point according to the change value, wherein the color channel value adjusting weight and the change value are in a negative correlation relationship;
and calculating to obtain the channel color ratio corresponding to each color channel of the saturation lifting pixel point according to the adjusting weight corresponding to each color channel value of the saturation lifting pixel point.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
calculating to obtain a first ratio according to the product of the adjusting weight corresponding to each color channel value of the saturation lifting pixel point and the pixel lifting ratio of the saturation lifting pixel point;
subtracting the adjustment weight corresponding to each color channel value of the saturation lifting pixel point from a preset value to obtain a second ratio;
and adding the first ratio and the second ratio to obtain a channel color ratio corresponding to each color channel of the saturation enhancement pixel point.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring pixel points of which the pixel values are larger than a preset threshold value in the gray level image as highlight pixel points;
multiplying the pixel lifting ratio corresponding to the highlight pixel point by the first coefficient to obtain a third ratio; the first coefficient and the pixel value of the pixel point at the corresponding position of the highlight pixel point in the processed image form a negative correlation relationship;
subtracting the first coefficient from a preset value to obtain a fourth ratio;
and adding the third ratio and the fourth ratio to obtain a channel color ratio corresponding to each color channel of the highlight pixel points.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
carrying out image processing on an image to be processed to obtain an intermediate image;
acquiring a first pixel point of which the pixel value in a gray image corresponding to an image to be processed is smaller than a preset threshold value, and taking the pixel point corresponding to the first pixel point in the intermediate image as a second pixel point;
obtaining a noise suppression weight of a corresponding second pixel point according to a pixel value of a first pixel point in the gray level image, wherein the pixel value and the noise suppression weight form a correlation;
and carrying out noise suppression on a second pixel point in the intermediate image according to the noise suppression weight to obtain a processed image.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
acquiring a gray image corresponding to an image to be processed;
brightening the gray level image to obtain a brightened image;
carrying out contrast enhancement processing on the gray level image to obtain a contrast enhanced image;
and carrying out fusion processing on the gray level image, the brightened image and the contrast enhanced image to obtain a processed image.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
carrying out image processing on an image to be processed to obtain a processed image;
acquiring color channel values corresponding to all pixel points in a processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image;
acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values and the channel statistical values of the saturation lifting pixel points;
selecting a minimum value from the channel color ratios corresponding to the color channels of the saturation enhancement pixel points as a color suppression ratio corresponding to each color channel of the saturation enhancement pixel points;
and carrying out color suppression processing on each color channel of the saturation enhancement pixel point based on the color suppression ratio corresponding to each color channel of the saturation enhancement pixel point to obtain a target image.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring a gray image corresponding to an image to be processed;
determining the pixel lifting ratio of each pixel point in the processed image relative to the gray image according to the pixel value of each pixel point in the processed image and the pixel value of the pixel point at the corresponding position in the gray image;
and taking the pixel points with the pixel lifting ratio value larger than the preset threshold value in the processed image as saturation lifting pixel points.
In one embodiment, the computer program when executed by the processor further performs the steps of:
calculating the change value of each color channel value of the saturation lifting pixel point relative to the channel statistical value;
determining an adjusting weight corresponding to each color channel value of the saturation lifting pixel point according to the change value, wherein the color channel value adjusting weight and the change value are in a negative correlation relationship;
and calculating to obtain the channel color ratio corresponding to each color channel of the saturation lifting pixel point according to the adjusting weight corresponding to each color channel value of the saturation lifting pixel point.
In one embodiment, the computer program when executed by the processor further performs the steps of:
calculating to obtain a first ratio according to the product of the adjusting weight corresponding to each color channel value of the saturation lifting pixel point and the pixel lifting ratio of the saturation lifting pixel point;
subtracting the adjustment weight corresponding to each color channel value of the saturation lifting pixel point from a preset value to obtain a second ratio;
and adding the first ratio and the second ratio to obtain a channel color ratio corresponding to each color channel of the saturation enhancement pixel point.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring pixel points of which the pixel values are larger than a preset threshold value in the gray level image as highlight pixel points;
multiplying the pixel lifting ratio corresponding to the highlight pixel point by the first coefficient to obtain a third ratio; the first coefficient and the pixel value of the pixel point at the corresponding position of the highlight pixel point in the processed image form a negative correlation relationship;
subtracting the first coefficient from a preset value to obtain a fourth ratio;
and adding the third ratio and the fourth ratio to obtain a channel color ratio corresponding to each color channel of the highlight pixel points.
In one embodiment, the computer program when executed by the processor further performs the steps of:
carrying out image processing on an image to be processed to obtain an intermediate image;
acquiring a first pixel point of which the pixel value in a gray image corresponding to an image to be processed is smaller than a preset threshold value, and taking the pixel point corresponding to the first pixel point in the intermediate image as a second pixel point;
obtaining a noise suppression weight of a corresponding second pixel point according to a pixel value of a first pixel point in the gray level image, wherein the pixel value and the noise suppression weight form a correlation;
and carrying out noise suppression on a second pixel point in the intermediate image according to the noise suppression weight to obtain a processed image.
In one embodiment, the computer program when executed by the processor further performs the steps of:
acquiring a gray image corresponding to an image to be processed;
brightening the gray level image to obtain a brightened image;
carrying out contrast enhancement processing on the gray level image to obtain a contrast enhanced image;
and carrying out fusion processing on the gray level image, the brightened image and the contrast enhanced image to obtain a processed image.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM), among others.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination contains no contradiction, it should be considered to fall within the scope of this specification.
The above examples express only several embodiments of the present application, and their description is relatively specific and detailed, but this should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
carrying out image processing on an image to be processed to obtain a processed image;
acquiring color channel values corresponding to all pixel points in the processed image, and counting the color channel values to obtain channel statistical values corresponding to the processed image;
acquiring saturation lifting pixel points in the processed image, and calculating to obtain channel color ratios corresponding to the color channels of the saturation lifting pixel points based on the color channel values of the saturation lifting pixel points and the channel statistical values;
selecting a minimum value from the channel color ratios corresponding to the color channels of the saturation enhancement pixel points, and taking the minimum value as a color suppression ratio corresponding to each color channel of the saturation enhancement pixel points;
and carrying out color suppression processing on each color channel of the saturation enhancement pixel point based on the corresponding color suppression ratio of each color channel of the saturation enhancement pixel point to obtain a target image.
2. The method of claim 1, wherein the obtaining saturation boost pixel points in the processed image comprises:
acquiring a gray image corresponding to the image to be processed;
determining the pixel lifting ratio of each pixel point in the processed image relative to the gray image according to the pixel value of each pixel point in the processed image and the pixel value of the pixel point at the corresponding position in the gray image;
and taking the pixel points with the pixel lifting ratio value larger than a preset threshold value in the processed image as saturation lifting pixel points.
3. The method of claim 2, wherein the calculating the channel color ratio corresponding to each color channel of the saturation boost pixel based on each color channel value of the saturation boost pixel and the channel statistic comprises:
calculating the change value of each color channel value of the saturation lifting pixel point relative to the channel statistical value;
determining an adjusting weight corresponding to each color channel value of the saturation lifting pixel point according to the change value, wherein the color channel value adjusting weight and the change value form a negative correlation relationship;
and calculating to obtain a channel color ratio corresponding to each color channel of the saturation promotion pixel point according to the adjustment weight corresponding to each color channel value of the saturation promotion pixel point.
4. The method according to claim 3, wherein the calculating, according to the adjustment weight corresponding to each color channel value of the saturation enhancement pixel, a channel color ratio corresponding to each color channel of the saturation enhancement pixel comprises:
calculating to obtain a first ratio according to the product of the adjusting weight corresponding to each color channel value of the saturation lifting pixel point and the pixel lifting ratio of the saturation lifting pixel point;
subtracting the adjustment weight corresponding to each color channel value of the saturation lifting pixel point from a preset value to obtain a second ratio;
and adding the first ratio and the second ratio to obtain a channel color ratio corresponding to each color channel of the saturation enhancement pixel point.
5. The method of claim 2, further comprising:
acquiring pixel points of which the pixel values are larger than a preset threshold value in the gray level image as highlight pixel points;
multiplying the pixel lifting ratio corresponding to the highlight pixel point by a first coefficient to obtain a third ratio; the first coefficient and the pixel value of the pixel point at the corresponding position of the highlight pixel point in the processed image form a negative correlation relationship;
subtracting the first coefficient from a preset value to obtain a fourth ratio;
and adding the third ratio and the fourth ratio to obtain a channel color ratio corresponding to each color channel of the highlight pixel points.
6. The method of claim 1, wherein the image processing the image to be processed to obtain a processed image comprises:
carrying out image processing on an image to be processed to obtain an intermediate image;
acquiring a first pixel point of which the pixel value in the gray image corresponding to the image to be processed is smaller than a preset threshold value, and taking the pixel point corresponding to the first pixel point in the intermediate image as a second pixel point;
obtaining a noise suppression weight of the corresponding second pixel point according to the pixel value of the first pixel point in the gray level image, wherein the pixel value and the noise suppression weight form a correlation relationship;
and carrying out noise suppression on a second pixel point in the intermediate image according to the noise suppression weight to obtain a processed image.
7. The method according to claim 1 or 6, wherein the image processing of the image to be processed to obtain the processed image comprises:
acquiring a gray image corresponding to an image to be processed;
brightening the gray level image to obtain a brightened image;
carrying out contrast enhancement processing on the gray level image to obtain a contrast enhanced image;
and carrying out fusion processing on the gray level image, the brightened image and the contrast enhanced image to obtain a processed image.
8. An image processing apparatus, characterized in that the apparatus comprises:
the processed image acquisition module is used for carrying out image processing on the image to be processed to obtain a processed image;
a channel statistic value obtaining module, configured to obtain a color channel value corresponding to each pixel point in the processed image, and perform statistics on the color channel value to obtain a channel statistic value corresponding to the processed image;
a channel color ratio obtaining module, configured to obtain a saturation boost pixel point in the processed image, and calculate a channel color ratio corresponding to each color channel of the saturation boost pixel point based on each color channel value of the saturation boost pixel point and the channel statistical value;
a color suppression ratio obtaining module, configured to select a minimum value from channel color ratios corresponding to the color channels of the saturation boost pixel, where the minimum value is used as the color suppression ratio corresponding to each color channel of the saturation boost pixel;
and the target image acquisition module is used for carrying out color suppression processing on each color channel of the saturation promotion pixel point based on the color suppression ratio corresponding to each color channel of the saturation promotion pixel point to obtain a target image.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011418526.3A 2020-12-07 2020-12-07 Image processing method, image processing device, computer equipment and storage medium Pending CN112541868A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011418526.3A CN112541868A (en) 2020-12-07 2020-12-07 Image processing method, image processing device, computer equipment and storage medium
PCT/CN2021/136086 WO2022121893A1 (en) 2020-12-07 2021-12-07 Image processing method and apparatus, and computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011418526.3A CN112541868A (en) 2020-12-07 2020-12-07 Image processing method, image processing device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112541868A true CN112541868A (en) 2021-03-23

Family

ID=75016303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011418526.3A Pending CN112541868A (en) 2020-12-07 2020-12-07 Image processing method, image processing device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112541868A (en)
WO (1) WO2022121893A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115731861A (en) * 2022-11-18 2023-03-03 武汉路特斯汽车有限公司 Screen color adjusting method and device and terminal

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120294527A1 (en) * 2011-05-19 2012-11-22 Rastislav Lukac Method for processing highlights and saturated regions in a digital image
CN103618887A (en) * 2013-11-29 2014-03-05 深圳Tcl新技术有限公司 Method and device for processing image
CN105631826A (en) * 2015-12-25 2016-06-01 武汉鸿瑞达信息技术有限公司 Color image saturation enhancing method and system
CN108876742A (en) * 2018-06-25 2018-11-23 深圳市华星光电技术有限公司 image color enhancement method and device
CN109741279A (en) * 2019-01-04 2019-05-10 Oppo广东移动通信有限公司 Image saturation method of adjustment, device, storage medium and terminal
CN111163268A (en) * 2020-01-09 2020-05-15 腾讯科技(深圳)有限公司 Image processing method and device and computer storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112541868A (en) * 2020-12-07 2021-03-23 影石创新科技股份有限公司 Image processing method, image processing device, computer equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022121893A1 (en) * 2020-12-07 2022-06-16 影石创新科技股份有限公司 Image processing method and apparatus, and computer device and storage medium
CN117237258A (en) * 2023-11-14 2023-12-15 山东捷瑞数字科技股份有限公司 Night vision image processing method, system, equipment and medium based on three-dimensional engine
CN117237258B (en) * 2023-11-14 2024-02-09 山东捷瑞数字科技股份有限公司 Night vision image processing method, system, equipment and medium based on three-dimensional engine

Also Published As

Publication number Publication date
WO2022121893A1 (en) 2022-06-16

Similar Documents

Publication Publication Date Title
US10038855B2 (en) Operating a device to capture high dynamic range images
US9826149B2 (en) Machine learning of real-time image capture parameters
US11113795B2 (en) Image edge processing method, electronic device, and computer readable storage medium
CN112541868A (en) Image processing method, image processing device, computer equipment and storage medium
US9251573B2 (en) Device, method, and storage medium for high dynamic range imaging of a combined image having a moving object
US10565742B1 (en) Image processing method and apparatus
KR101998531B1 (en) Real-time video enhancement methods, terminals, and non-transitory computer readable storage media
CN112118388B (en) Image processing method, image processing device, computer equipment and storage medium
US8860806B2 (en) Method, device, and system for performing color enhancement on whiteboard color image
CN115115554B (en) Image processing method and device based on enhanced image and computer equipment
CN113132695B (en) Lens shading correction method and device and electronic equipment
CN115578284A (en) Multi-scene image enhancement method and system
CN110175967B (en) Image defogging processing method, system, computer device and storage medium
WO2023137956A1 (en) Image processing method and apparatus, electronic device, and storage medium
WO2018165023A1 (en) Method of decaying chrominance in images
CN112150368A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN115660997B (en) Image data processing method and device and electronic equipment
WO2023011280A1 (en) Image noise degree estimation method and apparatus, and electronic device and storage medium
US11640654B2 (en) Image processing method and apparatus
WO2022111269A1 (en) Method and device for enhancing video details, mobile terminal, and storage medium
CN113781330A (en) Image processing method, device and electronic system
CN114240767A (en) Image wide dynamic range processing method and device based on exposure fusion
CN111915529A (en) Video dim light enhancement method and device, mobile terminal and storage medium
JP2020191030A (en) Image processing device
US20230085693A1 (en) Image capturing device and image calculation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination