CN116797510A - Image processing method, device, computer equipment and storage medium - Google Patents
- Publication number
- CN116797510A CN116797510A CN202210240302.0A CN202210240302A CN116797510A CN 116797510 A CN116797510 A CN 116797510A CN 202210240302 A CN202210240302 A CN 202210240302A CN 116797510 A CN116797510 A CN 116797510A
- Authority
- CN
- China
- Prior art keywords
- image
- evaluated
- pixel
- similarity
- gradient
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000003672 processing method Methods 0.000 title claims abstract description 26
- 238000011156 evaluation Methods 0.000 claims abstract description 153
- 238000013441 quality evaluation Methods 0.000 claims abstract description 34
- 238000004590 computer program Methods 0.000 claims abstract description 21
- 238000012545 processing Methods 0.000 claims abstract description 19
- 238000000034 method Methods 0.000 claims abstract description 16
- 230000000875 corresponding effect Effects 0.000 claims description 134
- 230000002596 correlated effect Effects 0.000 claims description 18
- 239000006185 dispersion Substances 0.000 claims description 17
- 238000004364 calculation method Methods 0.000 claims description 14
- 238000000605 extraction Methods 0.000 claims description 3
- 238000010586 diagram Methods 0.000 description 6
- 239000000284 extract Substances 0.000 description 6
- 238000001303 quality assessment method Methods 0.000 description 6
- 238000004891 communication Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 238000006243 chemical reaction Methods 0.000 description 2
- 230000003068 static effect Effects 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 230000006835 compression Effects 0.000 description 1
- 238000007906 compression Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000013178 mathematical model Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
Abstract
The present application relates to an image processing method, an image processing apparatus, a computer device, a storage medium, and a computer program product. The method comprises the following steps: acquiring an image to be evaluated and a reference image corresponding to the image to be evaluated; respectively extracting pixel gradient information and pixel chromaticity information of the image to be evaluated and the reference image; determining the gradient similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel gradient information, and determining the chromaticity similarity of the corresponding pixel points based on the extracted pixel chromaticity information; determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chromaticity similarity; acquiring structure evaluation information of the image to be evaluated, the structure evaluation information being used to characterize the degree of structural similarity; and determining a quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information. By adopting the method, the accuracy of the image quality evaluation result can be improved.
Description
Technical Field
The present application relates to the field of computer technology, and in particular, to an image processing method, an image processing apparatus, a computer device, a computer readable storage medium, and a computer program product.
Background
With the development of computer technology, computer devices can now evaluate image quality automatically. Image quality assessment is widely used in many fields, so the demand for efficient and reliable evaluation results keeps growing. Image quality evaluation uses a mathematical model to produce a quantized value of image quality. In the conventional technique, the image to be evaluated and the reference image are compared directly by summing their squared pixel differences; this has low computational complexity and is easy to implement.
However, this conventional image quality evaluation method suffers from low accuracy in its evaluation results.
Disclosure of Invention
Embodiments of the present application provide an image processing method, apparatus, computer device, computer readable storage medium, and computer program product, which can improve accuracy of an image quality evaluation result.
In one aspect, the present application provides an image processing method. The method comprises the following steps: acquiring an image to be evaluated and a reference image corresponding to the image to be evaluated; respectively extracting pixel gradient information and pixel chromaticity information of the image to be evaluated and the reference image; determining the gradient similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel gradient information, and determining the chromaticity similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel chromaticity information; determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chromaticity similarity; acquiring structure evaluation information of the image to be evaluated, the structure evaluation information being used for representing the degree of structural similarity between the image to be evaluated and the reference image; and determining a quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information.
In another aspect, the application also provides an image processing device. The device comprises: an image acquisition module, used for acquiring an image to be evaluated and a reference image corresponding to the image to be evaluated; an information extraction module, used for respectively extracting pixel gradient information and pixel chromaticity information of the image to be evaluated and the reference image; a similarity calculation module, used for determining the gradient similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel gradient information, and determining the chromaticity similarity of the corresponding pixel points based on the extracted pixel chromaticity information; a contrast evaluation module, used for determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chromaticity similarity; a structure evaluation module, used for acquiring structure evaluation information of the image to be evaluated, the structure evaluation information being used for representing the degree of structural similarity between the image to be evaluated and the reference image; and an evaluation result obtaining module, used for determining a quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information.
On the other hand, the application also provides computer equipment. The computer device comprises a memory storing a computer program and a processor implementing the steps of the image processing method described above when the processor executes the computer program.
In another aspect, the present application also provides a computer-readable storage medium. The computer readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method described above.
In another aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when being executed by a processor, implements the steps of the above-mentioned image processing method.
According to the image processing method, apparatus, computer device, computer-readable storage medium, and computer program product described above, pixel gradient information and pixel chromaticity information are extracted from the image to be evaluated and the reference image respectively. The gradient similarity of corresponding pixel points between the two images is determined from the extracted pixel gradient information, and the chromaticity similarity of the corresponding pixel points is determined from the extracted pixel chromaticity information. Contrast evaluation information of the image to be evaluated is then determined from the gradient similarity and the chromaticity similarity, structure evaluation information of the image to be evaluated is obtained, and the quality evaluation result corresponding to the image to be evaluated is determined from the contrast evaluation information and the structure evaluation information. Because the contrast evaluation information combines gradient similarity with chromaticity similarity, the resulting evaluation is closer to the subjective evaluation of the human eye, which improves the accuracy of the image evaluation result.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a diagram of an application environment for an image processing method in one embodiment;
FIG. 2 is a flow chart of an image processing method in one embodiment;
FIG. 3 is a flowchart illustrating steps for obtaining structural evaluation information of the image to be evaluated according to one embodiment;
FIG. 4 is a schematic diagram of image position correspondence in one embodiment;
FIG. 5 is a general flow diagram of an image processing method in one embodiment;
FIG. 6 is a block diagram showing the structure of an image processing apparatus in one embodiment;
FIG. 7 is a block diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The image processing method provided by the embodiment of the application can be applied to an application environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. The terminal 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, Internet of Things devices, and portable wearable devices, where the Internet of Things devices may be smart speakers, smart televisions, smart air conditioners, smart vehicle devices, and the like, and the portable wearable devices may be smart watches, smart bracelets, headsets, and the like. The server 104 may be implemented as a stand-alone server or as a server cluster of multiple servers.
Specifically, the terminal may capture an image and send it to the server, requesting the server to perform quality assessment on the captured image. The server takes the image sent by the terminal as the image to be assessed and acquires a reference image corresponding to it. The server then extracts pixel gradient information and pixel chromaticity information of the image to be assessed and the reference image respectively, determines the gradient similarity of corresponding pixels between the two images based on the extracted pixel gradient information, and determines the chromaticity similarity of corresponding pixels based on the extracted pixel chromaticity information. Based on the gradient similarity and the chromaticity similarity, the server determines contrast assessment information of the image to be assessed, and also acquires structure assessment information, which is used to represent the structural similarity between the image to be assessed and the reference image. Finally, the server determines the quality assessment result corresponding to the image to be assessed based on the contrast assessment information and the structure assessment information, and returns the obtained quality assessment result to the terminal.
In one embodiment, as shown in fig. 2, an image processing method is provided, which may be performed by a terminal and a server in cooperation, or may be performed by the terminal or the server alone. In this embodiment, the server in fig. 1 is taken as an example for illustrating the application of the method, which includes the following steps:
step 202, obtaining an image to be evaluated and a reference image corresponding to the image to be evaluated.
The image to be evaluated refers to an image which needs quality evaluation. In one embodiment, the image to be evaluated may be a night scene image taken by the terminal. The reference image corresponding to the image to be evaluated refers to a standard image which is used as an evaluation reference when the quality of the image to be evaluated is evaluated. It will be appreciated that the higher the similarity between the image to be evaluated and the reference image, the better the quality of the image to be evaluated.
In one embodiment, the terminal may collect a night scene image, send the night scene image to the server, request the server to evaluate the quality of the night scene image, and after receiving the night scene image, the server evaluates the quality of the night scene image photographed by the terminal by using the night scene image as an image to be evaluated and an image obtained by performing long exposure on the night scene image as a reference image.
In one embodiment, the image to be evaluated may also be a distorted image obtained by subjecting the reference image to distortion processing. The distortion processing here may be gaussian blur processing, white noise addition, compression processing, or the like. And the server can evaluate the quality of the image obtained by the distortion processing.
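The white-noise distortion mentioned above can be sketched as follows. This is a minimal pure-Python illustration; the function name and the grayscale list-of-rows image representation are assumptions of this sketch, not the patent's data layout:

```python
import random

def add_white_noise(image, sigma=10.0, seed=0):
    """Return a copy of `image` (a list of rows of 0-255 gray values)
    with Gaussian white noise added and clipped to the valid range."""
    rng = random.Random(seed)  # fixed seed for a reproducible distortion
    return [[min(255.0, max(0.0, v + rng.gauss(0.0, sigma))) for v in row]
            for row in image]

reference = [[128, 130, 127], [129, 131, 128]]
distorted = add_white_noise(reference, sigma=5.0)  # image to be evaluated
```

The distorted copy would then be evaluated against `reference` by the steps that follow.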
In step 204, pixel gradient information and pixel chromaticity information of the image to be evaluated and the reference image are extracted, respectively.
The pixel gradient information of the image to be evaluated refers to gradient information of each pixel in the image to be evaluated, and the pixel chromaticity information of the image to be evaluated refers to chromaticity information of each pixel in the image to be evaluated. The pixel gradient information of the reference image refers to gradient information of each pixel in the reference image, and the pixel chromaticity information of the reference image refers to chromaticity information of each pixel in the reference image. The gradient information of the pixel may be, for example, at least one of a vertical gradient and a horizontal gradient of the pixel.
Specifically, the server may extract gradient information from each pixel in the image to be evaluated to obtain pixel gradient information of the image to be evaluated, and extract chromaticity information from each pixel in the image to be evaluated to obtain pixel chromaticity information of the image to be evaluated. The server may extract gradient information for each pixel in the reference image to obtain pixel gradient information of the reference image, and extract chromaticity information for each pixel in the reference image to obtain pixel chromaticity information of the reference image.
In one embodiment, the image to be evaluated and the reference image are both RGB images, and the server may perform chromaticity conversion based on the pixel values of the image to be evaluated in each color channel to obtain pixel chromaticity information of the image to be evaluated, and perform chromaticity conversion based on the pixel values of the reference image in each color channel to obtain pixel chromaticity information of the reference image.
And 206, determining the gradient similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel gradient information, and determining the chromaticity similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel chromaticity information.
The corresponding pixel points between the image to be evaluated and the reference image refer to the pixel points with the same pixel positions between the image to be evaluated and the reference image. For example, the pixels in the first row and the second column in the image to be evaluated, and the pixels corresponding to the pixels in the reference image refer to the pixels in the first row and the second column in the reference image. Gradient similarity is used to characterize the degree of similarity of gradient information between two pixels. The chroma similarity is used to characterize the degree of similarity of chroma information between two pixel points.
Specifically, the reference image and the image to be evaluated contain the same pixel position, and for each pixel position, the server may calculate the gradient similarity between the image to be evaluated and the reference image at two pixels at the pixel position, and calculate the chromaticity similarity between the image to be evaluated and the reference image at two pixels at the pixel position.
Step 208, determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chromaticity similarity.
The contrast evaluation information of the image to be evaluated is used for representing the contrast similarity between the image to be evaluated and the reference image. The contrast evaluation information is positively correlated with the gradient similarity and positively correlated with the chromaticity similarity.
Specifically, for each pixel position, the server may obtain the contrast similarity corresponding to the pixel position based on the gradient similarity and the chromaticity similarity corresponding to the pixel position, and calculate a similarity average value based on the contrast similarities of all the pixel positions, to obtain the contrast evaluation information of the image to be evaluated.
In one embodiment, considering that the image content of different regions of an image differs in importance, the image to be evaluated may be divided into a plurality of image blocks, with a different weight set for each block. When calculating the contrast evaluation information of the image to be evaluated, the server multiplies the contrast similarity at each pixel position by the weight of the image block that the position belongs to, thereby updating the contrast similarity of each pixel position. Finally, the updated contrast similarities are summed and divided by the number of pixel positions to obtain the contrast evaluation information of the image to be evaluated.
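The block-weighted pooling described above can be sketched as follows; the dict-based pixel representation and the helper names are illustrative assumptions:

```python
def weighted_contrast_score(contrast_sim, block_weight):
    """Pool per-pixel contrast similarities into a single score, weighting
    each pixel position by the weight of the image block containing it."""
    total = 0.0
    for (row, col), sim in contrast_sim.items():
        total += sim * block_weight(row, col)   # updated contrast similarity
    return total / len(contrast_sim)            # divide by pixel-position count

# Toy 2x2 image: the top row belongs to a block deemed twice as important.
sims = {(0, 0): 0.9, (0, 1): 0.8, (1, 0): 0.6, (1, 1): 0.7}
score = weighted_contrast_score(sims, lambda r, c: 2.0 if r == 0 else 1.0)
```

Note that with non-normalized weights the pooled score is not bounded by 1; normalizing by the sum of weights instead of the pixel count would restore that bound.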
Step 210, obtaining structure evaluation information of an image to be evaluated; the structural evaluation information is used to characterize the degree of structural similarity between the image to be evaluated and the reference image.
Natural images are highly structured: there are strong correlations between the pixels of the image, and these correlations carry important information about the structure of objects in the visual scene. The human visual system (HVS) is assumed to acquire structural information mainly from the visible region, so the similarity of two images can be measured by detecting whether their structural information has changed, which approximates the perceived distortion of the images. Based on this, in the embodiment of the present application, structural evaluation information may additionally be acquired, where the structural evaluation information is used to characterize the structural similarity between the image to be evaluated and the reference image.
Specifically, in one embodiment, the server may calculate the pixel value dispersion of the image to be evaluated and that of the reference image, and calculate the pixel value variation trend correlation between the two images; the structural evaluation information of the image to be evaluated may then be determined from these dispersions and the correlation. The structural evaluation information is positively correlated with the pixel value variation trend correlation and negatively correlated with the pixel value dispersion.
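This dispersion/correlation combination resembles the classic SSIM-style structure term. The sketch below uses that form under the assumption that "pixel value dispersion" is the standard deviation and "variation trend correlation" is the covariance; the patent's exact formula is not given in this text:

```python
import math

def structure_score(x_vals, y_vals, c=1e-3):
    """SSIM-style structure term: covariance of corresponding pixel values
    (variation-trend correlation) over the product of their standard
    deviations (dispersions), stabilized by a small constant c.
    This classic form is an assumption, not the patent's exact formula."""
    n = len(x_vals)
    mx = sum(x_vals) / n
    my = sum(y_vals) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x_vals, y_vals)) / (n - 1)
    sx = math.sqrt(sum((a - mx) ** 2 for a in x_vals) / (n - 1))
    sy = math.sqrt(sum((b - my) ** 2 for b in y_vals) / (n - 1))
    return (cov + c) / (sx * sy + c)

# Same variation trend (one patch is the other plus a constant offset):
same_trend = structure_score([1, 2, 3, 4], [2, 3, 4, 5])
```

Patches that vary in the same way score near 1, while opposing trends yield a negative score, matching the stated positive/negative correlations.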
Step 212, determining a quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information.
Wherein the quality evaluation result is used for evaluating the quality loss degree of the image. The quality assessment result may be a specific value or a quality assessment level; the levels may be, for example: poor, average, good, very good.
Specifically, the server may multiply the contrast evaluation information and the structure evaluation information to obtain a quality evaluation result corresponding to the image to be evaluated.
In one embodiment, the server may further obtain luminance evaluation information of the image to be evaluated, where the luminance evaluation information is used to characterize a luminance similarity between the image to be evaluated and the reference image, and further the server may determine a quality evaluation result corresponding to the image to be evaluated based on the luminance evaluation information, the contrast evaluation information, and the structure evaluation information.
Further, the server may return the quality evaluation result to the terminal.
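Putting the terms together, the multiplicative combination described in the two paragraphs above might look like this; the absence of weighting exponents is an assumption of this sketch:

```python
def quality_score(contrast, structure, luminance=None):
    """Multiply the evaluation terms, as described above. The luminance
    term is optional; any weighting exponents the patent may apply
    are omitted here."""
    score = contrast * structure
    if luminance is not None:
        score *= luminance
    return score

result = quality_score(0.9, 0.8)            # contrast x structure
result_lum = quality_score(0.9, 0.8, 0.5)   # with a luminance term
```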
According to the above image processing method, pixel gradient information and pixel chromaticity information are extracted from the image to be evaluated and the reference image respectively. The gradient similarity of corresponding pixel points between the two images is determined from the extracted pixel gradient information, and the chromaticity similarity of the corresponding pixel points is determined from the extracted pixel chromaticity information. Contrast evaluation information of the image to be evaluated is determined from the gradient similarity and the chromaticity similarity, structure evaluation information of the image to be evaluated is obtained, and the quality evaluation result corresponding to the image to be evaluated is determined from the contrast evaluation information and the structure evaluation information. Because the contrast evaluation information combines gradient similarity with chromaticity similarity, the evaluation result obtained is closer to the subjective evaluation of the human eye, improving the accuracy of the image evaluation result.
In one embodiment, determining the gradient similarity of corresponding pixels between the image to be evaluated and the reference image based on the extracted pixel gradient information includes: for each pixel point to be evaluated of the image to be evaluated, calculating the horizontal gradient and the vertical gradient of the pixel point to be evaluated, and determining a first target gradient of the pixel point to be evaluated based on the horizontal gradient and the vertical gradient; for each reference pixel point of the reference image, calculating the horizontal gradient and the vertical gradient of the reference pixel point, and determining a second target gradient of the reference pixel point based on the horizontal gradient and the vertical gradient; and determining the gradient similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the first target gradient and the second target gradient of the corresponding pixel points.
Specifically, for each pixel to be evaluated of the image to be evaluated, the server may calculate the horizontal gradient of the pixel to be evaluated by the following formula (1):
where f(x) is the pixel value and Cg_h(x) is the calculated horizontal gradient.
For each pixel to be evaluated of the image to be evaluated, the server may calculate the vertical gradient of the pixel to be evaluated by the following formula (2):
where f(x) is the pixel value and Cg_v(x) is the calculated vertical gradient.
After calculating the horizontal gradient and the vertical gradient of the pixel to be evaluated, the server may determine a first target gradient of the pixel to be evaluated by the following formula (3):
where Cg(x) is the first target gradient.
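The drawings containing formulas (1)–(3) are not reproduced in this text. A common central-difference form consistent with the symbols above, offered only as a plausible reconstruction and not the patent's exact expressions, would be:

```latex
% Hypothetical reconstruction of formulas (1)-(3); the difference stencil is an assumption.
Cg_h(x) = f(x_{i,\,j+1}) - f(x_{i,\,j-1}) \tag{1}
Cg_v(x) = f(x_{i+1,\,j}) - f(x_{i-1,\,j}) \tag{2}
Cg(x)   = \sqrt{Cg_h(x)^2 + Cg_v(x)^2}   \tag{3}
```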
For each reference pixel point of the reference image, the server may calculate a horizontal gradient and a vertical gradient of the reference pixel point in the same calculation manner as the pixel point to be evaluated, and determine a second target gradient of the reference pixel point based on the horizontal gradient and the vertical gradient.
After calculating the first target gradient of each pixel point to be evaluated and the second target gradient of each reference pixel point, the server calculates the gradient similarity of corresponding pixel points between the image to be evaluated and the reference image, based on those target gradients, through the following formula (4):
where Cg_1(x) is the first target gradient of the corresponding pixel point, Cg_2(x) is the second target gradient of the corresponding pixel point, M_1 is a constant, and S_cg(x) is the gradient similarity of the corresponding pixel points.
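The drawing for formula (4) is likewise not reproduced. Given two target gradients and a stabilizing constant M_1, the standard similarity-ratio form used by gradient-similarity metrics would read, again as an assumed reconstruction:

```latex
% Assumed ratio form; the patent's actual expression may differ.
S_{cg}(x) = \frac{2\,Cg_1(x)\,Cg_2(x) + M_1}{Cg_1(x)^2 + Cg_2(x)^2 + M_1} \tag{4}
```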
In the above embodiment, the horizontal and vertical gradients are calculated in the same manner for the pixel points to be evaluated and the reference pixel points, and a target gradient is derived from them; the gradient similarity of corresponding pixel points between the image to be evaluated and the reference image is then computed from the target gradients. The resulting gradient similarity therefore accurately reflects how similar the gradient information of corresponding pixel points is between the image to be evaluated and the reference image.
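The gradient steps above can be sketched end to end as follows. The central-difference stencil and the value of M1 are assumptions (the patent's formulas are drawings), and the toy 3x3 patches are illustrative:

```python
import math

def target_gradient(img, r, c):
    """Gradient magnitude at pixel (r, c). The patent's formulas (1)-(3)
    are drawings; this central-difference stencil is an assumption."""
    gh = img[r][c + 1] - img[r][c - 1]   # horizontal gradient, cf. formula (1)
    gv = img[r + 1][c] - img[r - 1][c]   # vertical gradient, cf. formula (2)
    return math.sqrt(gh * gh + gv * gv)  # target gradient, cf. formula (3)

def gradient_similarity(g1, g2, m1=0.01):
    """Ratio-form similarity of two target gradients, cf. formula (4);
    the constant M1 stabilizes the ratio and its value here is illustrative."""
    return (2 * g1 * g2 + m1) / (g1 * g1 + g2 * g2 + m1)

# Toy 3x3 grayscale patches: the evaluated patch has twice the edge strength.
eval_img = [[0, 0, 0], [0, 0, 100], [0, 100, 0]]
ref_img  = [[0, 0, 0], [0, 0,  50], [0,  50, 0]]
s = gradient_similarity(target_gradient(eval_img, 1, 1),
                        target_gradient(ref_img, 1, 1))
```

Identical gradients yield a similarity of 1; the mismatched toy patches score below 1.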
In one embodiment, determining the chroma similarity of the corresponding pixel point between the image to be evaluated and the reference image based on the extracted pixel chroma information includes: for each pixel point to be evaluated of an image to be evaluated, acquiring pixel values of the pixel point to be evaluated in a red channel, a green channel and a blue channel respectively, and calculating a first chromaticity value to be evaluated of the pixel point to be evaluated in a first chromaticity channel and a second chromaticity value to be evaluated in a second chromaticity channel based on the acquired pixel values; for each reference pixel point of the reference image, acquiring pixel values of the reference pixel point in a red channel, a green channel and a blue channel respectively, and calculating a first reference chromaticity value of the reference pixel point in a first chromaticity channel and a second reference chromaticity value of the reference pixel point in a second chromaticity channel based on the acquired pixel values; determining a first chroma similarity component of a corresponding pixel point between the image to be evaluated and the reference image based on a first chroma value to be evaluated and a first reference chroma value of the corresponding pixel point between the image to be evaluated and the reference image; determining a second chroma similarity component of the corresponding pixel point between the image to be evaluated and the reference image based on the second chroma value to be evaluated and the second reference chroma value of the corresponding pixel point between the image to be evaluated and the reference image; and determining the chromaticity similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the first chromaticity similarity component and the second chromaticity similarity component.
Specifically, in this embodiment, the image to be evaluated and the reference image are both RGB images. Considering that the chromaticity channels carry chromaticity information and can serve as a characteristic measure of color distortion, for each pixel point to be evaluated of the image to be evaluated, the server acquires the pixel values of that pixel in the red, green, and blue channels respectively, and calculates the first chromaticity value to be evaluated in the first chromaticity channel and the second chromaticity value to be evaluated in the second chromaticity channel according to the following formula (5):
wherein, R represents the pixel value of the red channel, G represents the pixel value of the green channel, B represents the pixel value of the blue channel, P represents the first chromaticity value to be evaluated of the first chromaticity channel, and Q represents the second chromaticity value to be evaluated of the second chromaticity channel.
Similarly, for each reference pixel point of the reference image, after obtaining the pixel values of the reference pixel point in the red channel, the green channel, and the blue channel, the server may calculate the first reference chromaticity value of the reference pixel point in the first chromaticity channel and the second reference chromaticity value in the second chromaticity channel based on the obtained pixel values through the above formula (5).
The server may calculate a first chroma similarity component of the corresponding pixel points between the image to be evaluated and the reference image based on the first chroma value to be evaluated and the first reference chroma value of each corresponding pixel point in the first chroma channel, calculate a second chroma similarity component based on the second chroma value to be evaluated and the second reference chroma value of each corresponding pixel point in the second chroma channel, and determine the chroma similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the first and second chroma similarity components, with reference to the following formula (6):
Scc(x) = [(2·P1(x)·P2(x) + M2) / (P1(x)² + P2(x)² + M2)] · [(2·Q1(x)·Q2(x) + M2) / (Q1(x)² + Q2(x)² + M2)] (6)

In the above formula, Scc(x) is the chroma similarity and M2 is a constant. The former term of the product calculates the first chroma similarity component, where P1(x) and P2(x) are respectively the first chromaticity value to be evaluated and the first reference chromaticity value of the corresponding pixel point in the first chromaticity channel; the latter term calculates the second chroma similarity component, where Q1(x) and Q2(x) are respectively the second chromaticity value to be evaluated and the second reference chromaticity value of the corresponding pixel point in the second chromaticity channel.
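As a minimal sketch of this chromaticity-similarity step: formula (5) is not reproduced in the text, so the YIQ-style opponent-colour coefficients below are placeholder assumptions, as is the value of the stabilising constant `m2`; the two-component product follows the description of formula (6).

```python
import numpy as np

def rgb_to_chroma(img):
    """Map an HxWx3 RGB image to two chroma channels P and Q.
    The coefficients are assumed (YIQ I/Q style); formula (5) is not shown."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    p = 0.596 * r - 0.274 * g - 0.322 * b   # assumed first chromaticity channel
    q = 0.211 * r - 0.523 * g + 0.312 * b   # assumed second chromaticity channel
    return p, q

def chroma_similarity(img1, img2, m2=0.03):
    """Per-pixel chroma similarity Scc(x): product of the two SSIM-style
    components described for formula (6); m2 is an illustrative constant."""
    p1, q1 = rgb_to_chroma(img1)
    p2, q2 = rgb_to_chroma(img2)
    s_p = (2 * p1 * p2 + m2) / (p1 ** 2 + p2 ** 2 + m2)  # first component
    s_q = (2 * q1 * q2 + m2) / (q1 ** 2 + q2 ** 2 + m2)  # second component
    return s_p * s_q
```

When the image to be evaluated equals the reference image, both components are 1 at every pixel, so the similarity map is identically 1.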
In the above embodiment, the chroma similarity of the corresponding pixel points between the image to be evaluated and the reference image is calculated from the chromaticity values of the respective pixel points of the two images in two different chroma channels, so that the calculation result is more accurate.
In one embodiment, determining contrast evaluation information of an image to be evaluated based on gradient similarity and chromaticity similarity includes: forming pixel point pairs from corresponding pixel points between the image to be evaluated and the reference image; for each pixel point pair, determining a contrast component based on the gradient similarity and the chromaticity similarity of the pixel point pair, wherein the contrast component is positively correlated with the gradient similarity of the pixel point pair and positively correlated with the chromaticity similarity of the pixel point pair; and carrying out average value calculation on the contrast components of each pixel point pair to obtain the contrast evaluation information of the image to be evaluated.
Specifically, for each pixel point pair, the server determines a contrast component based on the gradient similarity and the chromaticity similarity of the pixel point pair, and finally sums all the contrast components and calculates an average value to obtain contrast evaluation information of the image to be evaluated.
In a specific embodiment, the server may calculate the contrast evaluation information of the image to be evaluated with reference to the following formula (7):
C = (1/N) · Σx [Scg(x)]^α · [Scc(x)]^β (7)

wherein C is the contrast evaluation information, N is the total number of pixels of the image to be evaluated, and α and β are adjustable parameters whose default value is 1.
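The averaging of contrast components can be sketched as follows; the inputs are assumed to be precomputed per-pixel gradient-similarity and chroma-similarity maps, and the exponent form is an assumption consistent with the description of formula (7).

```python
import numpy as np

def contrast_evaluation(s_cg, s_cc, alpha=1.0, beta=1.0):
    """Contrast evaluation information C: each pixel pair contributes a
    component that grows with both its gradient similarity and its chroma
    similarity (exponents alpha, beta default to 1), averaged over all pixels."""
    components = (s_cg ** alpha) * (s_cc ** beta)  # one component per pixel pair
    return float(components.mean())                # average over the N pixels
```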
In the above embodiment, the contrast components of all corresponding pixel point pairs are averaged to obtain the contrast evaluation information of the image to be evaluated, so that the contrast information of the image to be evaluated can be well taken into account in the quality evaluation.
In one embodiment, as shown in fig. 3, obtaining structural evaluation information of an image to be evaluated includes:
step 302, dividing the image to be evaluated and the reference image respectively to obtain a plurality of reference image blocks corresponding to the reference image and a plurality of image blocks to be evaluated corresponding to the image to be evaluated.
And step 304, forming image pairs by the image blocks to be evaluated and the reference image blocks with image position corresponding relation with the image blocks to be evaluated, and obtaining an image pair set.
Wherein, the existence of an image position correspondence between a reference image block and an image block to be evaluated means that the position of the reference image block in the reference image corresponds to the position of the image block to be evaluated in the image to be evaluated; accordingly, for each pixel in the reference image block, a pixel at the same position exists in the corresponding image block to be evaluated. For example, assume that the reference image A is divided into 4 reference image blocks A1, A2, A3 and A4, and the image B to be evaluated is divided in the same manner into 4 image blocks B1, B2, B3 and B4 of the same size, position and number. The image position correspondence is shown in fig. 4, where the dashed arrows indicate the correspondence: the reference image block A1 corresponds to the image block B1 to be evaluated, A2 to B2, A3 to B3, and A4 to B4. That is, the positions of the reference image block and the image block to be evaluated that constitute an image pair are identical in their respective images.
Specifically, the server may divide the reference image to obtain a plurality of reference image blocks, divide the image to be evaluated to obtain a plurality of image blocks to be evaluated, and form an image pair from each image block to be evaluated and the reference image block having an image position correspondence with it, to obtain an image pair set. Here, dividing refers to partitioning the pixels in the image into regions, and a plurality refers to at least two. In some embodiments, the server may apply the same image block division to the reference image and the image to be evaluated, so that the number, positions and sizes of the image blocks to be evaluated and the reference image blocks match. In some embodiments, the server may acquire a sliding window, slide it over the reference image in a preset sliding manner and take the image area in the window as a reference image block, then slide it over the image to be evaluated in the same manner and take the image area in the window as an image block to be evaluated; in this way, reference image blocks and image blocks to be evaluated of the same size and number, with corresponding image positions, are obtained.
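The sliding-window division can be sketched as follows; the block size and step are illustrative, and applying the same call to both images yields position-matched block pairs.

```python
import numpy as np

def divide_into_blocks(img, size, step):
    """Slide a size x size window over a 2-D image with the given step and
    return the blocks in scan order. Applying the same division to the
    reference image and the image to be evaluated gives blocks with an
    image position correspondence at equal indices."""
    h, w = img.shape
    return [img[y:y + size, x:x + size]
            for y in range(0, h - size + 1, step)
            for x in range(0, w - size + 1, step)]
```

For example, `zip(divide_into_blocks(ref, 8, 8), divide_into_blocks(eval_img, 8, 8))` enumerates the image pair set.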
Step 306, for the image pairs in the image pair set, calculating the correlation of the pixel value variation trend among the image blocks in the image pair, and calculating the pixel value dispersion corresponding to each image block in the image pair.
The pixel value variation trend correlation refers to the degree of correlation between the variation trends of the pixel values of two image blocks, and may specifically be the covariance between the pixel values of the two image blocks. The pixel value dispersion refers to the degree of dispersion of the pixel values in an image block, and may specifically be the variance of the image block.
Step 308, obtaining the intermediate similarity corresponding to the image pair based on the correlation of the pixel value variation trend and the pixel value dispersion corresponding to each image block in the image pair.
The intermediate similarity corresponding to the image pair is positively correlated with the pixel value variation trend correlation and negatively correlated with the pixel value dispersion.
Specifically, for an image pair in the image pair set, the server may calculate a correlation degree of a change trend of pixel values between image blocks in the image pair, and a dispersion degree of pixel values corresponding to each image block in the image pair, and obtain an intermediate similarity corresponding to the image pair based on the correlation degree of the change trend of pixel values and the dispersion degree of pixel values corresponding to each image block in the image pair, where the intermediate similarity is used to characterize a similarity obtained by comparing image blocks in the image pair in a structure (structure).
In one embodiment, the pixel value variation trend correlation is the covariance and the pixel value dispersion is the variance, and the server may calculate the intermediate similarity between the image blocks in the image pair with reference to the following formula (8):

S = (σ12 + M4) / (σ1 · σ2 + M4) (8)

wherein S is the similarity between the image blocks, σ12 is the covariance of the pixel values between the image blocks, σ1 and σ2 are respectively the pixel value dispersions of the two image blocks in the image pair, and M4 is a constant.
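A minimal sketch of this intermediate-similarity calculation, under the assumption that the product σ1·σ2 in formula (8) is taken over the standard deviations of the two blocks (the SSIM-style reading of the structure term); the constant `m4` is illustrative.

```python
import numpy as np

def block_similarity(block1, block2, m4=1e-4):
    """Intermediate similarity of an image pair: the covariance of the two
    blocks' pixel values over the product of their standard deviations,
    both stabilised by the constant m4."""
    cov = np.mean((block1 - block1.mean()) * (block2 - block2.mean()))
    return float((cov + m4) / (block1.std() * block2.std() + m4))
```

For identical blocks the covariance equals the variance, so the similarity is 1.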
step 310, obtaining a preset attention degree of each image block to be evaluated, and carrying out attention treatment on the structural similarity corresponding to the image block to be evaluated based on the preset attention degree to obtain the target similarity corresponding to the image pair.
The preset attention degree refers to a preset attention degree of the image block to be evaluated. The preset attention may be, for example, a weight.
Specifically, for each image block to be evaluated, the server may multiply the preset attention degree of the image block to be evaluated by the intermediate similarity corresponding to the image pair in which the image block to be evaluated is located, to obtain the target similarity corresponding to that image pair.
Step 312, statistics is performed on the target similarity corresponding to each image pair in the image pair set, so as to obtain the structural evaluation information of the image to be evaluated.
Specifically, after obtaining the target similarity corresponding to each image pair in the image pair set, the server may perform statistics on these target similarities to obtain a statistical similarity, which may be at least one of the average value or the median of the target similarities, and use it as the structural evaluation information of the image to be evaluated.
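Steps 310 and 312 can be sketched together as follows; uniform attention (all ones) reduces the aggregation to a plain mean, and the choice of the mean over the median as the statistic is an assumption.

```python
import numpy as np

def structure_evaluation(similarities, attention=None):
    """Weight each image pair's intermediate similarity by the preset
    attention degree of its image block to be evaluated (the target
    similarity), then average the target similarities into the structural
    evaluation information."""
    similarities = np.asarray(similarities, dtype=float)
    if attention is None:
        attention = np.ones_like(similarities)     # uniform attention
    target = similarities * np.asarray(attention, dtype=float)
    return float(target.mean())
```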
In the above embodiment, by dividing the image to be evaluated and the reference image, the structural similarity between them can be calculated in units of image blocks, using the pixel value variation trend correlation between image blocks and the pixel value dispersion of each image block. By acquiring the preset attention degree of each image block to be evaluated, the similarities can be attention-processed, and the similarities obtained by the attention processing can then be aggregated into the structural evaluation information, so that the obtained structural evaluation information is more accurate.
In one embodiment, the method further comprises: calculating a first pixel value average value of the pixel points of the image to be evaluated, and calculating a second pixel value average value of the pixel points of the reference image; and determining brightness evaluation information of the image to be evaluated based on the first pixel value average value and the second pixel value average value, wherein the brightness evaluation information is positively correlated with the product of the first pixel value average value and the second pixel value average value, and negatively correlated with the sum of the squares of the first pixel value average value and the second pixel value average value. Determining the quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information then includes: determining the quality evaluation result corresponding to the image to be evaluated based on the brightness evaluation information, the contrast evaluation information and the structure evaluation information.
Specifically, the server may add the pixel values of all the pixels of the image to be evaluated and divide the pixel values by the total number of the pixels in the image to be evaluated to obtain a first pixel value average value, add the pixel values of all the pixels of the reference image and divide the pixel values by the total number of the pixels in the reference image to obtain a second pixel value average value, and further determine the brightness evaluation information of the image to be evaluated based on the first pixel value average value and the second pixel value average value. The luminance evaluation information is used to characterize the luminance similarity between the image to be evaluated and the reference image.
In a specific embodiment, the server may calculate the brightness evaluation information with reference to the following formula (9):

L = (2 · μ1 · μ2 + M3) / (μ1² + μ2² + M3) (9)

wherein L is the brightness evaluation information, μ1 is the first pixel value average value, μ2 is the second pixel value average value, and M3 is a constant.
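A sketch of the luminance term as described (positively correlated with the product of the two means, negatively with the sum of their squares); the value of the constant `m3` is an assumption.

```python
def luminance_evaluation(mu1, mu2, m3=6.5):
    """Brightness evaluation information L from the pixel value averages of
    the image to be evaluated (mu1) and the reference image (mu2);
    m3 is an illustrative stabilising constant."""
    return (2 * mu1 * mu2 + m3) / (mu1 ** 2 + mu2 ** 2 + m3)
```

Equal means give L = 1, and L decreases as the means diverge.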
after obtaining the brightness evaluation information, the server may perform quality evaluation on the image to be evaluated based on the brightness evaluation information, the contrast evaluation information and the structure evaluation information to obtain a quality evaluation result corresponding to the image to be evaluated, with reference to the following formula (10):
SITD=C*L*S (10)
in the above embodiment, the average value of the pixel values in the image to be evaluated and the reference image is calculated to obtain the brightness evaluation information, and the quality evaluation result corresponding to the image to be evaluated is determined by combining the brightness evaluation information, the contrast evaluation information and the structure evaluation information, so that the obtained quality evaluation result is more accurate.
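The combination in formula (10) is a plain product of the three evaluation factors; as a sketch:

```python
def sitd_score(contrast_c, luminance_l, structure_s):
    """Formula (10): the final quality evaluation result is the product of
    the contrast, luminance and structure evaluation factors."""
    return contrast_c * luminance_l * structure_s
```

Since each factor is close to 1 for a high-quality image, a score near 1 indicates the image to be evaluated closely matches the reference.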
In a specific embodiment, an image processing method is provided, and a specific flow of the image processing method may be referred to in fig. 5. The following description will be given with reference to fig. 5, taking an example that the method is applied to the server in fig. 1, and includes the following steps:
1. and acquiring the image to be evaluated and a reference image corresponding to the image to be evaluated.
Wherein the image to be evaluated is a night scene image. The reference image is an image obtained by performing long exposure on the night scene image. The image to be evaluated and the reference image are both RGB images.
2. Respectively extract the pixel gradient Cg1 of each pixel point to be evaluated of the image to be evaluated and the pixel gradient Cg2 of each reference pixel point of the reference image, and calculate the gradient similarity Scg of the corresponding pixel points between the image to be evaluated and the reference image based on Cg1 and Cg2.
Specifically, the server may calculate the pixel gradient with reference to the above formulas (1) to (3), and calculate the gradient similarity with reference to the above formula (4).
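Formulas (1)–(4) are not reproduced in this section, so the sketch below uses assumed central-difference gradients and an assumed SSIM-style similarity form; the constant `m1` is likewise illustrative.

```python
import numpy as np

def pixel_gradient(img):
    """Gradient magnitude from horizontal and vertical central differences
    (an assumed stand-in for formulas (1)-(3))."""
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0  # horizontal gradient
    gy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0  # vertical gradient
    return np.sqrt(gx ** 2 + gy ** 2)               # target gradient

def gradient_similarity(img1, img2, m1=0.01):
    """Per-pixel gradient similarity Scg of corresponding pixel points
    (an assumed stand-in for formula (4))."""
    g1, g2 = pixel_gradient(img1), pixel_gradient(img2)
    return (2 * g1 * g2 + m1) / (g1 ** 2 + g2 ** 2 + m1)
```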
3. Respectively extract the chromaticity P1 of each pixel point to be evaluated of the image to be evaluated in the first chromaticity channel and its chromaticity Q1 in the second chromaticity channel.
4. Respectively extract the chromaticity P2 of each reference pixel point of the reference image in the first chromaticity channel and its chromaticity Q2 in the second chromaticity channel.
5. The server calculates the chromaticity similarity Scc based on the chromaticities P1 and P2 of the corresponding pixel points between the image to be evaluated and the reference image in the first chromaticity channel and their chromaticities Q1 and Q2 in the second chromaticity channel.
Specifically, the server may acquire pixel values of the image to be evaluated in the R channel, the G channel, and the B channel, respectively, and acquire pixel values of the reference image in the R channel, the G channel, and the B channel, respectively, extract the chromaticity value of the first chromaticity channel and the chromaticity value of the second chromaticity channel with reference to the above formula (5), and calculate the chromaticity similarity Scc with reference to the above formula (6).
6. Form pixel point pairs from corresponding pixel points between the image to be evaluated and the reference image, determine a contrast component for each pixel point pair based on the gradient similarity and the chromaticity similarity of the pixel point pair, and perform average value calculation on the contrast components of the pixel point pairs to obtain the contrast evaluation factor C of the image to be evaluated.
The contrast component is positively correlated with the gradient similarity of the pixel point pair and positively correlated with the chroma similarity of the pixel point pair. The server may calculate the contrast evaluation factor C with reference to the above formula (7).
7. Calculate the pixel value average μ1 of all pixel points to be evaluated of the image to be evaluated and the pixel value average μ2 of all reference pixel points of the reference image, and calculate the brightness evaluation factor L based on μ1 and μ2.
Specifically, the server may calculate the luminance evaluation factor L with reference to the above formula (9).
8. Calculate the variance σ1 of the pixel values of all pixel points to be evaluated of the image to be evaluated, and calculate the variance σ2 of the pixel values of all reference pixel points of the reference image.
9. Calculate the covariance σ12 of the pixel values between the image to be evaluated and the reference image.
10. Calculate the structure evaluation factor S between the image to be evaluated and the reference image based on the variance σ1, the variance σ2 and the covariance σ12.
Specifically, the server may calculate the structure evaluation factor S with reference to the above formula (8).
11. And determining a quality evaluation result of the image to be evaluated based on the brightness evaluation factor L, the contrast evaluation factor C and the structure evaluation factor S.
Specifically, the server may calculate the quality evaluation result with reference to the above formula (10).
In the above embodiment, considering that a night scene image tends to have large contrast and obvious chromaticity changes, the contrast evaluation factor is obtained by calculating the gradient similarity and the chromaticity similarity between the night scene image and the reference image, and is combined with the brightness evaluation factor and the structure evaluation factor of the night scene image to obtain the quality evaluation result of the night scene image, so that a stricter image quality evaluation result, closer to the subjective evaluation of human eyes, can be obtained.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of the steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages that are not necessarily completed at the same moment but may be performed at different moments, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with at least some of the other steps, sub-steps or stages.
Based on the same inventive concept, the embodiment of the application also provides an image processing device for realizing the above-mentioned related method. The implementation of the solution provided by the apparatus is similar to the implementation described in the above method, so the specific limitation of one or more embodiments of the image processing apparatus provided below may refer to the limitation of the image processing method hereinabove, and will not be repeated herein.
In one embodiment, as shown in fig. 6, there is provided an image processing apparatus 600 including:
the image acquisition module 602 is configured to acquire an image to be evaluated and a reference image corresponding to the image to be evaluated;
an information extraction module 604, configured to extract pixel gradient information and pixel chromaticity information of the image to be evaluated and the reference image, respectively;
a similarity calculation module 606, configured to determine, based on the extracted pixel gradient information, a gradient similarity of a corresponding pixel point between the image to be evaluated and the reference image, and determine, based on the extracted pixel chromaticity information, a chromaticity similarity of the corresponding pixel point between the image to be evaluated and the reference image;
a contrast evaluation module 608 for determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chromaticity similarity;
the structure evaluation module 610 is configured to obtain structure evaluation information of an image to be evaluated; the structure evaluation information is used for representing the structure similarity degree between the image to be evaluated and the reference image;
and an evaluation result obtaining module 612, configured to determine a quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information.
According to the above image processing device, the pixel gradient information and the pixel chromaticity information of the image to be evaluated and the reference image are respectively extracted. The gradient similarity of the corresponding pixel points between the image to be evaluated and the reference image is determined based on the extracted pixel gradient information, and the chromaticity similarity of the corresponding pixel points is determined based on the extracted pixel chromaticity information. The contrast evaluation information of the image to be evaluated is then determined based on the gradient similarity and the chromaticity similarity, the structure evaluation information of the image to be evaluated is obtained, and the quality evaluation result corresponding to the image to be evaluated is determined based on the contrast evaluation information and the structure evaluation information. Since the contrast evaluation information combines the gradient similarity and the chromaticity similarity, the obtained evaluation result is closer to the subjective evaluation result of human eyes, and the accuracy of the image evaluation result is improved.
In one embodiment, the similarity calculation module is further configured to calculate, for each pixel to be evaluated of the image to be evaluated, a horizontal gradient and a vertical gradient of the pixel to be evaluated, and determine a first target gradient of the pixel to be evaluated based on the horizontal gradient and the vertical gradient; for each reference pixel point of the reference image, calculating a horizontal gradient and a vertical gradient of the reference pixel point, and determining a second target gradient of the reference pixel point based on the horizontal gradient and the vertical gradient; and determining the gradient similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the first target gradient and the second target gradient of the corresponding pixel points between the image to be evaluated and the reference image.
In one embodiment, the similarity calculation module is further configured to obtain, for each pixel to be evaluated of the image to be evaluated, pixel values of the pixel to be evaluated in a red channel, a green channel, and a blue channel, respectively, and calculate, based on the obtained pixel values, a first chromaticity value to be evaluated of the pixel to be evaluated in the first chromaticity channel and a second chromaticity value to be evaluated of the pixel to be evaluated in the second chromaticity channel; for each reference pixel point of the reference image, acquiring pixel values of the reference pixel point in a red channel, a green channel and a blue channel respectively, and calculating a first reference chromaticity value of the reference pixel point in a first chromaticity channel and a second reference chromaticity value of the reference pixel point in a second chromaticity channel based on the acquired pixel values; determining a first chroma similarity component of a corresponding pixel point between the image to be evaluated and the reference image based on a first chroma value to be evaluated and a first reference chroma value of the corresponding pixel point between the image to be evaluated and the reference image; determining a second chroma similarity component of the corresponding pixel point between the image to be evaluated and the reference image based on the second chroma value to be evaluated and the second reference chroma value of the corresponding pixel point between the image to be evaluated and the reference image; and determining the chromaticity similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the first chromaticity similarity component and the second chromaticity similarity component.
In one embodiment, the contrast evaluation module is further configured to form pixel point pairs from corresponding pixel points between the image to be evaluated and the reference image; for each pixel point pair, determine a contrast component based on the gradient similarity and the chromaticity similarity of the pixel point pair, the contrast component being positively correlated with the gradient similarity of the pixel point pair and positively correlated with the chromaticity similarity of the pixel point pair; and perform average value calculation on the contrast components of each pixel point pair to obtain the contrast evaluation information of the image to be evaluated.
In one embodiment, the structure evaluation module is further configured to divide the image to be evaluated and the reference image respectively to obtain a plurality of reference image blocks corresponding to the reference image and a plurality of image blocks to be evaluated corresponding to the image to be evaluated; form an image pair from each image block to be evaluated and the reference image block having an image position correspondence with it, to obtain an image pair set; for each image pair in the image pair set, calculate the pixel value variation trend correlation between the image blocks in the image pair and the pixel value dispersion corresponding to each image block in the image pair; obtain the intermediate similarity corresponding to the image pair based on the pixel value variation trend correlation and the pixel value dispersion corresponding to each image block in the image pair; acquire the preset attention degree of each image block to be evaluated, and perform attention processing on the intermediate similarity corresponding to the image pair in which the image block to be evaluated is located based on the preset attention degree, to obtain the target similarity corresponding to the image pair; and perform statistics on the target similarity corresponding to each image pair in the image pair set to obtain the structure evaluation information of the image to be evaluated.
In one embodiment, the apparatus further includes a brightness evaluation module configured to calculate a first pixel value average value of the pixel points of the image to be evaluated and a second pixel value average value of the pixel points of the reference image, and to determine the brightness evaluation information of the image to be evaluated based on the first pixel value average value and the second pixel value average value, wherein the brightness evaluation information is positively correlated with the product of the first pixel value average value and the second pixel value average value, and negatively correlated with the sum of the squares of the first pixel value average value and the second pixel value average value; the evaluation result obtaining module is further configured to determine the quality evaluation result corresponding to the image to be evaluated based on the brightness evaluation information, the contrast evaluation information and the structure evaluation information.
The respective modules in the above-described image processing apparatus may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is for storing image data. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an image processing method.
It will be appreciated by those skilled in the art that the structure shown in FIG. 7 is merely a block diagram of part of the structure related to the present application and does not limit the computer device to which the present application is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
Embodiments of the present application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media contain computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the image processing method.
Embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform an image processing method.
It should be noted that the user information (including but not limited to user equipment information and user personal information) and data (including but not limited to data used for analysis, stored data, and displayed data) involved in the present application are information and data authorized by the user or fully authorized by all parties, and the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
Those skilled in the art will appreciate that all or part of the methods described above may be implemented by a computer program stored on a non-transitory computer-readable storage medium; when executed, the program may include the steps of the method embodiments described above. Any reference to memory, database, or other media used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases involved in the embodiments provided herein may include at least one of a relational database and a non-relational database. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors involved in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, or data processing logic units based on quantum computing.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered within the scope of this specification.
The foregoing examples represent only a few embodiments of the application and are described in detail, but they should not therefore be construed as limiting the scope of the application. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the concept of the application, all of which fall within the protection scope of the application. Accordingly, the protection scope of the application shall be determined by the appended claims.
Claims (10)
1. An image processing method, comprising:
acquiring an image to be evaluated and a reference image corresponding to the image to be evaluated;
respectively extracting pixel gradient information and pixel chromaticity information of the image to be evaluated and the reference image;
determining gradient similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel gradient information, and determining chromaticity similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel chromaticity information;
determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chromaticity similarity;
acquiring structure evaluation information of the image to be evaluated; the structure evaluation information is used for representing the structure similarity degree between the image to be evaluated and the reference image;
and determining a quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information.
2. The method of claim 1, wherein the determining gradient similarity of corresponding pixels between the image to be evaluated and the reference image based on the extracted pixel gradient information comprises:
for each pixel point to be evaluated of the image to be evaluated, calculating a horizontal gradient and a vertical gradient of the pixel point to be evaluated, and determining a first target gradient of the pixel point to be evaluated based on the horizontal gradient and the vertical gradient;
for each reference pixel point of the reference image, calculating a horizontal gradient and a vertical gradient of the reference pixel point, and determining a second target gradient of the reference pixel point based on the horizontal gradient and the vertical gradient;
and determining the gradient similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the first target gradient and the second target gradient of the corresponding pixel points between the image to be evaluated and the reference image.
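By way of non-limiting illustration, the gradient-similarity step of claim 2 may be sketched as follows. The claim does not fix the gradient operator or the similarity form, so both are assumptions here: central differences supply the horizontal and vertical gradients, their magnitude serves as the target gradient, and the common form (2·g1·g2 + c)/(g1² + g2² + c) serves as the per-pixel similarity.

```python
import numpy as np

def gradient_similarity(img, ref, c=1e-4):
    """Per-pixel gradient similarity between an image to be evaluated
    and its reference (a sketch of claim 2; operator and similarity
    form are assumptions, not fixed by the claim)."""
    def target_gradient(a):
        a = a.astype(float)
        gx = np.gradient(a, axis=1)   # horizontal gradient
        gy = np.gradient(a, axis=0)   # vertical gradient
        return np.hypot(gx, gy)       # combine into one target gradient

    g1 = target_gradient(img)         # first target gradient
    g2 = target_gradient(ref)         # second target gradient
    # similarity is 1 where gradients agree, below 1 where they differ
    return (2 * g1 * g2 + c) / (g1 ** 2 + g2 ** 2 + c)
```

For identical inputs the similarity map is 1 everywhere, since 2·g² equals g² + g² at every pixel.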
3. The method of claim 1, wherein the determining the chroma similarity of the corresponding pixel point between the image to be evaluated and the reference image based on the extracted pixel chroma information comprises:
for each pixel point to be evaluated of the image to be evaluated, acquiring pixel values of the pixel point to be evaluated in a red channel, a green channel and a blue channel respectively, and calculating a first chromaticity value to be evaluated of the pixel point to be evaluated in a first chromaticity channel and a second chromaticity value to be evaluated of the pixel point to be evaluated in a second chromaticity channel based on the acquired pixel values;
for each reference pixel point of the reference image, acquiring pixel values of the reference pixel point in a red channel, a green channel and a blue channel respectively, and calculating a first reference chromaticity value of the reference pixel point in a first chromaticity channel and a second reference chromaticity value of the reference pixel point in a second chromaticity channel based on the acquired pixel values;
determining a first chroma similarity component of a corresponding pixel point between the image to be evaluated and the reference image based on a first chroma value to be evaluated and a first reference chroma value of the corresponding pixel point between the image to be evaluated and the reference image;
determining a second chroma similarity component of a corresponding pixel point between the image to be evaluated and the reference image based on a second chroma value to be evaluated and a second reference chroma value of the corresponding pixel point between the image to be evaluated and the reference image;
and determining the chromaticity similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the first chromaticity similarity component and the second chromaticity similarity component.
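By way of non-limiting illustration, the chromaticity-similarity step of claim 3 may be sketched as follows. The claim does not specify the RGB-to-chromaticity transform, so the I and Q channels of the YIQ colour space are used here as assumed first and second chromaticity channels, and the two per-channel similarity components are combined by a product (also an assumption).

```python
import numpy as np

def chroma_similarity(img_rgb, ref_rgb, c=1e-4):
    """Per-pixel chromaticity similarity (sketch of claim 3).
    The YIQ transform and the product combination are assumptions."""
    def iq(rgb):
        rgb = rgb.astype(float)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        i = 0.596 * r - 0.274 * g - 0.322 * b   # first chromaticity channel
        q = 0.211 * r - 0.523 * g + 0.312 * b   # second chromaticity channel
        return i, q

    i1, q1 = iq(img_rgb)   # chromaticity values to be evaluated
    i2, q2 = iq(ref_rgb)   # reference chromaticity values
    # first and second chroma similarity components
    sim_i = (2 * i1 * i2 + c) / (i1 ** 2 + i2 ** 2 + c)
    sim_q = (2 * q1 * q2 + c) / (q1 ** 2 + q2 ** 2 + c)
    return sim_i * sim_q   # combined chromaticity similarity
```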
4. The method of claim 1, wherein the determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chroma similarity comprises:
forming pixel point pairs from corresponding pixel points between the image to be evaluated and the reference image;
for each pixel point pair, determining a contrast component based on gradient similarity and chromaticity similarity of the pixel point pair; the contrast component and the gradient similarity of the pixel point pair are positively correlated, and the contrast component and the chromaticity similarity of the pixel point pair are positively correlated;
and carrying out average value calculation on the contrast components of each pixel point pair to obtain contrast evaluation information of the image to be evaluated.
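By way of non-limiting illustration, the contrast-evaluation step of claim 4 may be sketched as follows. The claim only requires that the per-pair contrast component be positively correlated with both similarities; a simple product (with an assumed exponent on the chromaticity term) satisfies this, followed by averaging over all pixel point pairs.

```python
import numpy as np

def contrast_evaluation(grad_sim, chroma_sim, alpha=1.0):
    """Contrast evaluation information (sketch of claim 4).
    The product form and the exponent alpha are assumptions; any
    combination increasing in both similarities would satisfy the claim."""
    # per-pixel-pair contrast component, positively correlated with both
    component = grad_sim * np.power(chroma_sim, alpha)
    # average over all pixel point pairs
    return float(component.mean())
```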
5. The method according to claim 1, wherein the acquiring structural evaluation information of the image to be evaluated includes:
dividing the image to be evaluated and the reference image respectively to obtain a plurality of reference image blocks corresponding to the reference image and a plurality of image blocks to be evaluated corresponding to the image to be evaluated;
forming an image pair by the image block to be evaluated and a reference image block with an image position corresponding relation with the image block to be evaluated, and obtaining an image pair set;
for each image pair in the image pair set, calculating the correlation of the pixel value variation trends between the image blocks in the image pair, and calculating the pixel value dispersion corresponding to each image block in the image pair;
obtaining the intermediate similarity corresponding to the image pair based on the pixel value variation trend correlation and the pixel value dispersion corresponding to each image block in the image pair;
acquiring a preset attention degree of each image block to be evaluated, and performing attention weighting on the intermediate similarity corresponding to the image pair in which the image block to be evaluated is located based on the preset attention degree, to obtain a target similarity corresponding to the image pair;
and aggregating the target similarities corresponding to the image pairs in the image pair set to obtain the structure evaluation information of the image to be evaluated.
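By way of non-limiting illustration, the structure-evaluation step of claim 5 may be sketched as follows. The block size, the use of covariance as the "pixel value variation trend correlation", the standard deviation as the "dispersion", uniform preset attention degrees, and weighted-mean aggregation are all assumptions; the claim fixes none of them.

```python
import numpy as np

def structure_evaluation(img, ref, block=8, c=1e-4, attention=None):
    """Structure evaluation information (sketch of claim 5)."""
    h = (img.shape[0] // block) * block
    w = (img.shape[1] // block) * block
    sims = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img[y:y+block, x:x+block].astype(float).ravel()
            b = ref[y:y+block, x:x+block].astype(float).ravel()
            cov = np.mean((a - a.mean()) * (b - b.mean()))  # trend correlation
            sa, sb = a.std(), b.std()                        # dispersion per block
            sims.append((cov + c) / (sa * sb + c))           # intermediate similarity
    sims = np.array(sims)
    if attention is None:
        attention = np.ones_like(sims)    # assumed uniform preset attention
    weighted = sims * attention           # target similarity per image pair
    return float(weighted.sum() / attention.sum())
```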
6. The method according to any one of claims 1 to 5, further comprising:
calculating a first pixel value average over the pixel points of the image to be evaluated, and calculating a second pixel value average over the pixel points of the reference image;
determining brightness evaluation information of the image to be evaluated based on the first pixel value average and the second pixel value average, wherein the brightness evaluation information is positively correlated with the product of the first pixel value average and the second pixel value average, and negatively correlated with the sum of the squares of the first pixel value average and the second pixel value average;
the determining, based on the contrast evaluation information and the structure evaluation information, a quality evaluation result corresponding to the image to be evaluated includes:
and determining a quality evaluation result corresponding to the image to be evaluated based on the brightness evaluation information, the contrast evaluation information and the structure evaluation information.
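By way of non-limiting illustration, the brightness-evaluation step of claim 6 may be sketched as follows. The SSIM-style luminance form (2·μ1·μ2 + c)/(μ1² + μ2² + c) is an assumption that satisfies the stated correlations, and the final product combination of the three terms is likewise only one possible way of basing the quality result on all three.

```python
import numpy as np

def luminance_evaluation(img, ref, c=1e-4):
    """Brightness evaluation information (sketch of claim 6)."""
    mu1 = img.astype(float).mean()   # first pixel value average
    mu2 = ref.astype(float).mean()   # second pixel value average
    # positively correlated with mu1*mu2, negatively with mu1**2 + mu2**2
    return (2 * mu1 * mu2 + c) / (mu1 ** 2 + mu2 ** 2 + c)

def quality_score(luminance, contrast, structure):
    """One assumed combination of the three evaluation terms;
    the claim only requires the result to be based on all three."""
    return luminance * contrast * structure
```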
7. An image processing apparatus, comprising:
the image acquisition module is used for acquiring an image to be evaluated and a reference image corresponding to the image to be evaluated;
the information extraction module is used for respectively extracting pixel gradient information and pixel chromaticity information of the image to be evaluated and the reference image;
the similarity calculation module is used for determining and obtaining gradient similarity of corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel gradient information, and determining chromaticity similarity of the corresponding pixel points between the image to be evaluated and the reference image based on the extracted pixel chromaticity information;
the contrast evaluation module is used for determining contrast evaluation information of the image to be evaluated based on the gradient similarity and the chromaticity similarity;
the structure evaluation module is used for acquiring structure evaluation information of the image to be evaluated; the structure evaluation information is used for representing the structure similarity degree between the image to be evaluated and the reference image;
and the evaluation result obtaining module is used for determining a quality evaluation result corresponding to the image to be evaluated based on the contrast evaluation information and the structure evaluation information.
8. A computer device comprising a memory and a processor, the memory having stored therein a computer program which, when executed by the processor, causes the processor to perform the steps of the image processing method according to any of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the image processing method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210240302.0A CN116797510A (en) | 2022-03-10 | 2022-03-10 | Image processing method, device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116797510A true CN116797510A (en) | 2023-09-22 |
Family
ID=88044251
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117611578A (en) * | 2024-01-17 | 2024-02-27 | 深圳市新良田科技股份有限公司 | Image processing method and image processing system |
CN117611578B (en) * | 2024-01-17 | 2024-06-11 | 深圳市新良田科技股份有限公司 | Image processing method and image processing system |
CN118015033A (en) * | 2024-02-02 | 2024-05-10 | 嘉洋智慧安全科技(北京)股份有限公司 | Image processing method, device, equipment and medium |
CN118015033B (en) * | 2024-02-02 | 2024-09-17 | 嘉洋智慧安全科技(北京)股份有限公司 | Image processing method, device, equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||