CN114764771A - Image quality evaluation method, device, equipment, chip and storage medium

Info

Publication number: CN114764771A
Application number: CN202110029714.5A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: value, image, evaluated, calculation, factor
Inventor: 段勤
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G06T7/10: Segmentation; Edge detection
    • G06T7/13: Edge detection
    • G06T7/90: Determination of colour characteristics
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30168: Image quality inspection

Abstract

An embodiment of the present application discloses an image quality evaluation method, apparatus, device, chip, and storage medium. The method includes: acquiring an image to be evaluated and a corresponding true value image; determining a color information difference value and an edge information difference value between the image to be evaluated and the true value image; and performing a weighted calculation on the color information difference value and the edge information difference value using preset weight values to obtain an image quality evaluation result of the image to be evaluated. In this way, the technical solution of the present application both integrates the human visual system into the image quality evaluation system and uses the differences in color information and edge information as evaluation indexes, thereby improving the accuracy of the evaluation result and, in turn, the degree to which it matches subjective human perception.

Description

Image quality evaluation method, device, equipment, chip and storage medium
Technical Field
The present disclosure relates to the field of image evaluation technologies, and in particular, to an image quality evaluation method, apparatus, device, chip, and storage medium.
Background
Images are people's main source of information. With the popularization of electronic devices such as smartphones, video cameras, and digital cameras, a large number of images are captured at every moment, and rapidly developing internet technology makes image transmission and sharing fast and convenient. During acquisition, transmission, processing, and display, unavoidable interference factors, such as electronic noise, motion blur, and data loss caused by compression, often degrade image quality. Reliable quality evaluation of images is therefore a precondition for accurately recognizing and better utilizing them.
In current image quality evaluation systems, Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity (SSIM) are generally used as objective quality indexes for describing images. Although PSNR and SSIM can quantify differences in image quality to some extent, in many scenes they cannot reflect differences in subjective human perception, so the final evaluation result matches subjective human perception poorly.
Disclosure of Invention
The present application provides an image quality evaluation method, apparatus, device, chip, and storage medium, which can solve the problem that current image quality evaluation indexes are inconsistent with subjective human perception, thereby improving the accuracy of the evaluation result and, in turn, the degree to which it matches subjective human perception.
To this end, the technical solutions of the present application are realized as follows:
in a first aspect, an embodiment of the present application provides an image quality evaluation method, including:
acquiring an image to be evaluated and a corresponding true value image;
determining a color information difference value and an edge information difference value between the image to be evaluated and the true value image; and
performing a weighted calculation on the color information difference value and the edge information difference value using preset weight values to obtain an image quality evaluation result of the image to be evaluated.
In a second aspect, an embodiment of the present application provides an image quality evaluation apparatus including an acquisition unit, a determination unit, and a calculation unit; wherein:
the acquisition unit is configured to acquire an image to be evaluated and a corresponding true value image;
the determining unit is configured to determine a color information difference value and an edge information difference value between the image to be evaluated and the truth value image;
the calculating unit is configured to perform weighted calculation on the color information difference value and the edge information difference value by using a preset weight value to obtain an image quality evaluation result of the image to be evaluated.
In a third aspect, an embodiment of the present application provides an electronic device including a memory and a processor; wherein:
the memory is configured to store a computer program operable on the processor; and
the processor is configured to perform the method according to the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a chip including a processor; wherein:
the processor is configured to perform the method according to the first aspect when running a computer program called from a memory.
In a fifth aspect, the present application provides a computer storage medium storing a computer program, which when executed by at least one processor implements the method according to the first aspect.
According to the image quality evaluation method, apparatus, device, chip, and storage medium provided above, an image to be evaluated and a corresponding true value image are acquired; a color information difference value and an edge information difference value between the image to be evaluated and the true value image are determined; and a weighted calculation is performed on the color information difference value and the edge information difference value using preset weight values to obtain an image quality evaluation result of the image to be evaluated. In this way, the technical solution of the present application both integrates the human visual system into the image quality evaluation system and uses the differences in color information and edge information as evaluation indexes, thereby solving the problem that current image quality evaluation indexes are inconsistent with subjective human perception, improving the accuracy of the evaluation result, and improving the degree to which it matches subjective human perception.
Drawings
Fig. 1 is a schematic flowchart of an image quality evaluation method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of another image quality evaluation method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an image quality evaluation framework provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image quality evaluation apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a specific hardware structure of an electronic device according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a specific hardware structure of a chip according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant application and are not limiting of the application. It should be further noted that, for the convenience of description, only the portions relevant to the related applications are shown in the drawings.
In recent years, Image Quality Assessment (IQA) has become an important research direction and focus in the field of image processing. Image quality evaluation methods can generally be divided into subjective and objective methods. Subjective evaluation mainly has multiple observers score image quality; commonly used scoring mechanisms are the Mean Opinion Score (MOS) and the Differential Mean Opinion Score (DMOS). Although subjective evaluation is the most accurate description of image quality, it involves a heavy workload, is time-consuming, and is difficult to apply in practice. Objective evaluation obtains image quality indexes consistent with subjective results through mathematical models computed by a computer; it is convenient and efficient to apply and is the main means of image quality evaluation.
Today, with the rapid development of smartphone Systems on Chip (SoC), the capability of the Image Signal Processor (ISP) is regarded by more and more manufacturers as part of their core competitiveness. With the continuing development of Artificial Intelligence (AI) technology, more and more traditional ISP algorithms are being replaced by AI algorithms, such as denoising (Denoise) and Super Resolution (SR). Since AI algorithms based on supervised learning need to measure the difference between a ground-truth picture and the actually obtained picture, such algorithms often face the problem of how to quantify that difference. In the actual training (Train) and inference (Inference) of denoising and super-resolution models, Peak Signal-to-Noise Ratio (PSNR) and Structural Similarity (SSIM) can be used as objective quality indexes for describing images.
PSNR is an engineering term for the ratio between the maximum possible power of a signal and the power of the corrupting noise that affects the fidelity of its representation. Since many signals have a very wide dynamic range, PSNR is usually expressed in logarithmic decibel units. Assume there are two single-channel images I and K of size m × n, where I is the true value image and K is a noisy approximation of I. The PSNR of the two is obtained by equation (1), where the Mean Square Error (MSE) is the expected value of the square of the difference between the estimated and true parameter values, as given by equation (2):

$$\mathrm{PSNR} = 10 \cdot \log_{10}\!\left(\frac{MAX_I^2}{\mathrm{MSE}}\right) \tag{1}$$

$$\mathrm{MSE} = \frac{1}{mn}\sum_{i=0}^{m-1}\sum_{j=0}^{n-1}\bigl[I(i,j) - K(i,j)\bigr]^2 \tag{2}$$

Here, $MAX_I^2$ is the square of $MAX_I$, the maximum possible pixel value of image I; if each pixel is represented by 8 bits, $MAX_I$ is 255.
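By way of illustration, a minimal NumPy sketch of equations (1) and (2) could look as follows; the function name and the 8-bit default for MAX_I are illustrative choices, not taken from the patent:

    import numpy as np

    def psnr(truth: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
        # PSNR per equations (1)-(2), for two same-sized single-channel images.
        mse = np.mean((truth.astype(np.float64) - test.astype(np.float64)) ** 2)
        if mse == 0.0:
            return float("inf")  # identical images: no noise power
        return 10.0 * np.log10(max_val ** 2 / mse)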
SSIM is an index used to measure the similarity between two images. When one of the two images is a true value image and the other is a distorted version of it, the structural similarity between them can be regarded as an image quality measure of the distorted image. Compared with traditional image quality indexes, SSIM better matches human judgments of images. SSIM is calculated as shown in equation (3), where x and y are the two images to be compared, $C_1$ and $C_2$ are constants, $\mu$ denotes the mean, $\sigma$ denotes the standard deviation, and $\sigma_{xy}$ is the covariance of the two images, calculated as shown in equation (4):

$$\mathrm{SSIM}(x,y) = \frac{(2\mu_x\mu_y + C_1)(2\sigma_{xy} + C_2)}{(\mu_x^2 + \mu_y^2 + C_1)(\sigma_x^2 + \sigma_y^2 + C_2)} \tag{3}$$

$$\sigma_{xy} = \frac{1}{N-1}\sum_{i=1}^{N}(x_i - \mu_x)(y_i - \mu_y) \tag{4}$$
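Similarly, a minimal sketch of equations (3) and (4), computed globally over the whole image (production SSIM implementations are usually windowed, and the constants below follow the common k1 = 0.01, k2 = 0.03 convention, which is an assumption rather than something the patent specifies):

    import numpy as np

    def ssim_global(x: np.ndarray, y: np.ndarray, max_val: float = 255.0) -> float:
        # Global SSIM per equation (3); covariance per equation (4).
        x = x.astype(np.float64).ravel()
        y = y.astype(np.float64).ravel()
        c1 = (0.01 * max_val) ** 2  # assumed constant C1
        c2 = (0.03 * max_val) ** 2  # assumed constant C2
        mu_x, mu_y = x.mean(), y.mean()
        var_x, var_y = x.var(ddof=1), y.var(ddof=1)
        cov_xy = np.sum((x - mu_x) * (y - mu_y)) / (x.size - 1)  # equation (4)
        return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
            (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))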
At present, although PSNR and SSIM can quantify differences in image quality to a certain extent, in many scenes they cannot reflect differences in subjective human perception. For example, consider two images of the same scene, one a true value picture and the other a picture obtained after denoising, whose SSIM value is as high as 0.9 or more. Judging from the SSIM value, the two pictures should be very close, but in practice this is often not the case: if a detail is locally enlarged, a large difference in subjective human perception may be found at that detail.
Based on this, an embodiment of the present application provides an image quality evaluation method whose basic idea is: acquire an image to be evaluated and a corresponding true value image; determine a color information difference value and an edge information difference value between the image to be evaluated and the true value image; and perform a weighted calculation on the color information difference value and the edge information difference value using preset weight values to obtain an image quality evaluation result of the image to be evaluated. In this way, the technical solution of the present application both integrates the human visual system into the image quality evaluation system and uses the differences in color information and edge information as evaluation indexes, which can solve the problem that current image quality evaluation indexes are inconsistent with subjective human perception, improve the accuracy of the evaluation result, and improve the degree to which it matches subjective human perception.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
In an embodiment of the present application, referring to fig. 1, a flowchart of an image quality evaluation method provided in an embodiment of the present application is shown. As shown in fig. 1, the method may include:
s101: and acquiring an image to be evaluated and a corresponding true value image.
It should be noted that the image quality evaluation method of the embodiments of the present application may be applied to an image quality evaluation apparatus, or to an electronic device integrating such an apparatus. The electronic device may be a smartphone, tablet computer, notebook computer, palmtop computer, Personal Digital Assistant (PDA), Portable Media Player (PMP), navigation device, wearable device, video camera, digital camera, or the like; the embodiments of the present application are not particularly limited.
It should be noted that, in the image quality evaluation system, a corresponding true value image is usually used as the evaluation reference for the image to be evaluated. The true value image differs across application scenes but is the same within the same application scene.
S102: determine a color information difference value and an edge information difference value between the image to be evaluated and the true value image.
In the embodiment of the present application, the Human Visual System (HVS) may be integrated into the image quality evaluation system, with the color information difference value and the edge information difference value used as the evaluation indexes. That is, after the image to be evaluated and the corresponding true value image are obtained, the color information difference value and the edge information difference value between them can be calculated based on the human visual system.
The human visual system acts as a visual weight factor on the calculation of the color information difference value and the edge information difference value. The visual weight factor is mainly based on the following three points: (1) human eyes are sensitive to noise in smooth regions and insensitive to noise in texture regions; (2) human eyes are particularly sensitive to edge noise, so image edge areas must not change greatly; (3) human eyes have different sensitivities to different gray levels, being more sensitive to middle gray and less sensitive to high/low gray. For points (1) and (2), an edge factor of an image (such as the image to be evaluated or the true value image) can be determined, denoted hvs_grad_factor; for point (3), a gray factor of an image can be determined, denoted hvs_gray_factor.
In some embodiments, an edge factor of the image to be evaluated is calculated using a first calculation model according to the edge information of the image to be evaluated; and an edge factor of the true value image is calculated using the first calculation model according to the edge information of the true value image.

Specifically, for hvs_grad_factor, the first calculation model is given by equations (5) and (6). Here, norm denotes a normalization function whose purpose is to limit the processed data to a preset range; normalization serves to summarize the statistical distribution of the samples uniformly. The norm function is given by equation (7).

hvs_grad_factor = norm(edge calculation value)  (5)  [the formula image defining the edge calculation value from grad is not reproduced in this text]

$$\mathrm{mean}(x) = \frac{\max(x) + \min(x)}{2} \tag{6}$$

$$\mathrm{norm}(x) = \frac{x - \min(x)}{\max(x) - \min(x)} \tag{7}$$

Here, grad denotes the edge information (or "gradient") of an image, and the mean function returns the median of the given values. The median may be the value in the middle of a group of values, or the average of the maximum value (max) and the minimum value (min). In the embodiment of the present application it is the latter, i.e., the mean function is as shown in equation (6).
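For concreteness, the two helper functions could be sketched as follows, assuming equation (7) is the standard min-max normalization that the description suggests; the epsilon guard against a constant input is an added safeguard, not part of the patent:

    import numpy as np

    def norm(x: np.ndarray) -> np.ndarray:
        # Min-max normalization per equation (7): maps data into [0, 1].
        lo, hi = float(x.min()), float(x.max())
        return (x - lo) / (hi - lo + 1e-12)

    def midrange_mean(x: np.ndarray) -> float:
        # The patent's mean function per equation (6): the average of the
        # maximum and minimum values, not the arithmetic mean.
        return (float(x.max()) + float(x.min())) / 2.0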
In some embodiments, a gray factor of the image to be evaluated is calculated using a second calculation model according to the gray information of the image to be evaluated; and a gray factor of the true value image is calculated using the second calculation model according to the gray information of the true value image.

Specifically, for hvs_gray_factor, the second calculation model is given by equation (8), where light denotes the gray information of an image:

hvs_gray_factor = norm(gray calculation value)  (8)  [the formula image defining the gray calculation value from light is not reproduced in this text]
Thus, after hvs_grad_factor and hvs_gray_factor are obtained, the two are multiplied to obtain the visual weight factor, denoted hvs_factor, whose specific calculation is shown in equation (9):

hvs_factor = hvs_grad_factor * hvs_gray_factor  (9)

It should be further noted that when the image is the image to be evaluated, the resulting visual weight factor is the first visual weight factor of the image to be evaluated, denoted hvs_factor1; when the image is the true value image, the resulting visual weight factor is the second visual weight factor of the true value image, denoted hvs_factorgt.
Thus, after hvs_factor1 and hvs_factorgt are obtained, the differences between the image to be evaluated and the true value image in the two indexes, color information and edge information, can be calculated to obtain the color information difference value (denoted hsv_diff) and the edge information difference value (denoted G_diff).
S103: perform a weighted calculation on the color information difference value and the edge information difference value using preset weight values to obtain an image quality evaluation result of the image to be evaluated.
Note that the preset weight values may include a first weight value (denoted hsv_weight) and a second weight value (denoted G_weight). The first weight value indicates the proportion of the color information difference in the image quality evaluation; the second weight value indicates the proportion of the edge information difference.
In some embodiments, for a preset weight value, the method may further include:
if color information is of relatively greater concern for the image to be evaluated, setting the first weight value to be greater than the second weight value;
and if edge information is of relatively greater concern for the image to be evaluated, setting the first weight value to be smaller than the second weight value.
Here, for an image to be evaluated, if the user pays more attention to color information in the image quality evaluation system, the first weight value may be set greater than the second weight value; if the user pays more attention to edge information, the first weight value may be set smaller than the second weight value. That is, the specific values of the first and second weight values are set according to the actual scene and relate to the user's focus. For example, if the user focuses more on color information, the first weight value is set greater than the second weight value; if the user focuses more on edge information (such as smoothness or waviness), the second weight value is set greater than the first weight value.
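For instance, the two cases could be captured as configuration presets along the following lines, where the numeric values are purely illustrative assumptions:

    # Illustrative presets only; the patent prescribes no numeric weights.
    COLOR_FOCUSED = {"hsv_weight": 0.7, "G_weight": 0.3}  # color information matters more
    EDGE_FOCUSED = {"hsv_weight": 0.3, "G_weight": 0.7}   # edge information matters more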
It should be further noted that after the color information difference value and the edge information difference value are obtained, they need to be normalized so that both fall within the same preset range. Therefore, in some embodiments, performing the weighted calculation on the color information difference value and the edge information difference value using the preset weight values to obtain the image quality evaluation result of the image to be evaluated may include:
normalizing the color information difference value to obtain a color information normalized value; carrying out weighted calculation on the color information normalization value according to the first weight value to obtain a color information evaluation result;
normalizing the edge information difference value to obtain an edge information normalized value; carrying out weighted calculation on the edge information normalization value according to the second weight value to obtain an edge information evaluation result;
and summing the color information evaluation result and the edge information evaluation result to obtain an image quality evaluation result of the image to be evaluated.
That is, after the color information difference value (hsv_diff) is obtained, it is normalized as norm(hsv_diff), and the color information evaluation result is norm(hsv_diff) * hsv_weight. Likewise, after the edge information difference value (G_diff) is obtained, it is normalized as norm(G_diff), and the edge information evaluation result is norm(G_diff) * G_weight. Further, the image quality evaluation result of the image to be evaluated is calculated as shown in equation (10):

output = norm(hsv_diff) * hsv_weight + norm(G_diff) * G_weight  (10)

Here, output denotes the image quality evaluation result. The higher the output value, the higher the similarity between the image to be evaluated and the true value image, so the evaluation result is closer to subjective human perception, greatly improving the degree to which the two match.
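A minimal sketch of equation (10), assuming both difference values have already been normalized into a common [0, 1] range (how norm is scaled for a single scalar pair is left open by the text):

    def quality_score(hsv_diff_norm: float, g_diff_norm: float,
                      hsv_weight: float, g_weight: float) -> float:
        # Equation (10): output = norm(hsv_diff)*hsv_weight + norm(G_diff)*G_weight.
        # The patent reads a higher output as higher similarity to the true value image.
        return hsv_diff_norm * hsv_weight + g_diff_norm * g_weight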
This embodiment provides an image quality evaluation method: an image to be evaluated and a corresponding true value image are acquired; a color information difference value and an edge information difference value between the image to be evaluated and the true value image are determined; and a weighted calculation is performed on the two difference values using preset weight values to obtain an image quality evaluation result of the image to be evaluated. In this way, the technical solution of the present application both integrates the human visual system into the image quality evaluation system and uses the differences in color information and edge information as evaluation indexes, which can solve the problem that current image quality evaluation indexes are inconsistent with subjective human perception, improve the accuracy of the evaluation result, and improve the degree to which it matches subjective human perception.
In another embodiment of the present application, refer to fig. 2, which shows a schematic flowchart of another image quality evaluation method provided in this embodiment of the present application. As shown in fig. 2, the method may include:
s201: and acquiring an image to be evaluated and a corresponding true value image.
S202: and determining a first visual weight factor of the image to be evaluated and a second visual weight factor of the truth-value image.
It should be noted that, in the image quality evaluation system, the input data include the image to be evaluated and the corresponding true value image. To solve the problem that current image quality evaluation indexes are inconsistent with subjective human perception, the embodiment of the present application integrates the human visual system into the image quality evaluation system, so that it acts as a visual weight factor on the calculation of the color information difference value and the edge information difference value between the image to be evaluated and the true value image.
It should be further noted that, for the image to be evaluated, the first visual weight factor (denoted hvs_factor1) can be determined according to the human visual system. Specifically, in some embodiments, determining the first visual weight factor of the image to be evaluated may include:
determining edge information of the image to be evaluated and gray information of the image to be evaluated;
determining an edge calculation value of the image to be evaluated based on the edge information of the image to be evaluated;
normalizing the edge calculation value of the image to be evaluated to obtain an edge factor of the image to be evaluated;
determining a gray scale calculation value of the image to be evaluated based on the gray scale information of the image to be evaluated;
normalizing the gray scale calculation value of the image to be evaluated to obtain a gray scale factor of the image to be evaluated;
and multiplying the edge factor of the image to be evaluated and the gray factor of the image to be evaluated to obtain a first visual weight factor of the image to be evaluated.
That is, based on the human visual system, the edge information of the image to be evaluated is denoted grad1 and its gray information is denoted light1; an edge calculation value of the image to be evaluated is determined from grad1, and a gray calculation value is determined from light1. [The formula images for the two calculation values are not reproduced in this text.]

In this way, the correlation coefficients in the human visual system, namely the edge factor of the image to be evaluated (hvs_grad_factor1) and the gray factor of the image to be evaluated (hvs_gray_factor1), are calculated by equations (11) and (13) respectively, where the norm function is given by equation (7) and the mean function by equation (12):

hvs_grad_factor1 = norm(edge calculation value of the image to be evaluated)  (11)

$$\mathrm{mean}(x) = \frac{\max(x) + \min(x)}{2} \tag{12}$$

hvs_gray_factor1 = norm(gray calculation value of the image to be evaluated)  (13)
Thus, after hvs_grad_factor1 and hvs_gray_factor1 are obtained, the first visual weight factor (denoted hvs_factor1) is obtained by multiplying the two, as shown in equation (14):

hvs_factor1 = hvs_grad_factor1 * hvs_gray_factor1  (14)
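Since the formula images for the edge and gray calculation values are not reproduced here, the following sketch only illustrates the structure of equations (11), (13), and (14). The inner mappings (an edge term that up-weights smooth and strong-edge regions, and a gray term that peaks at middle gray) are assumptions chosen to match the three sensitivity points stated above, reusing the norm and midrange_mean helpers sketched earlier:

    import numpy as np

    def hvs_factor(grad: np.ndarray, light: np.ndarray) -> np.ndarray:
        # Equation (14) structure: hvs_factor = hvs_grad_factor * hvs_gray_factor.
        grad_factor = norm(np.abs(grad - midrange_mean(grad)))          # assumed edge term
        gray_factor = 1.0 - norm(np.abs(light - midrange_mean(light)))  # assumed gray term
        return grad_factor * gray_factor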
It should be noted that, for the true value image, the second visual weight factor (denoted hvs_factorgt) can be determined according to the human visual system. Specifically, in some embodiments, determining the second visual weight factor of the true value image may include:
determining edge information of the true value image and gray scale information of the true value image;
determining an edge calculation value of the truth image based on edge information of the truth image;
normalizing the edge calculation value of the true value image to obtain an edge factor of the true value image;
determining a gray scale calculation value of the true value image based on gray scale information of the true value image;
normalizing the gray scale calculation value of the true value image to obtain a gray scale factor of the true value image;
and performing multiplication operation on the edge factor of the true value image and the gray factor of the true value image to obtain a second visual weight factor of the true value image.
That is, based on the human visual system, the edge information of the true value image is denoted gradgt and its gray information is denoted lightgt; an edge calculation value of the true value image is determined from gradgt, and a gray calculation value is determined from lightgt. [The formula images for the two calculation values are not reproduced in this text.]

In this way, the correlation coefficients in the human visual system, namely the edge factor of the true value image (hvs_grad_factorgt) and the gray factor of the true value image (hvs_gray_factorgt), are calculated by equations (15) and (17) respectively, where the norm function is given by equation (7) and the mean function by equation (16):

hvs_grad_factorgt = norm(edge calculation value of the true value image)  (15)

$$\mathrm{mean}(x) = \frac{\max(x) + \min(x)}{2} \tag{16}$$

hvs_gray_factorgt = norm(gray calculation value of the true value image)  (17)

Thus, after hvs_grad_factorgt and hvs_gray_factorgt are obtained, the second visual weight factor (denoted hvs_factorgt) is obtained by multiplying the two, as shown in equation (18):

hvs_factorgt = hvs_grad_factorgt * hvs_gray_factorgt  (18)
Thus, after hvs_factor1 and hvs_factorgt are obtained, the differences between the image to be evaluated and the true value image in the two indexes, color information and edge information, can be calculated to obtain the color information difference value (denoted hsv_diff) and the edge information difference value (denoted G_diff).
S203: acquire the color information value of the image to be evaluated and the color information value of the true value image, and perform a difference calculation on the two according to the first visual weight factor and the second visual weight factor to obtain the color information difference value.
It should be noted that the color information values may include a first color component value, a second color component value, and a third color component value. In one case, the first color component value is a hue (H) value, the second a saturation (S) value, and the third a value (V) value; alternatively, the first color component value is a hue (H) value, the second a saturation (S) value, and the third a lightness (L) value. In other words, the embodiments of the present application are applicable to the HSV color space, the HSL color space, and even other color spaces, which are not specifically limited herein.
It should be noted that, for the calculation of the color information difference value, the color component difference values of the three color components need to be calculated first, and then the color information difference value is further calculated according to the three color component difference values. Specifically, in some embodiments, the performing, according to the first visual weighting factor and the second visual weighting factor, a difference calculation on the color information value of the image to be evaluated and the color information value of the true-value image to obtain the color information difference value may include:
performing difference calculation on the first color component value of the image to be evaluated and the first color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a first color component difference value;
performing difference calculation on a second color component value of the image to be evaluated and a second color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a second color component difference value;
performing difference calculation on a third color component value of the image to be evaluated and a third color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a third color component difference value;
and performing weighted average calculation on the first color component difference value, the second color component difference value and the third color component difference value by using a preset color weight factor to obtain the color information difference value.
In the embodiment of the present application, the color information may consist of three color components: hue, saturation, and value. When calculating the color information difference value, the differences in the three color components may be calculated separately. Here, H1, S1, V1 denote the hue, saturation, and value components of the image to be evaluated, and Hgt, Sgt, Vgt denote those of the true value image.
Assuming the first color component is hue, the first color component difference value is denoted h_diff and calculated by equation (19). Assuming the second color component is saturation, the second color component difference value is denoted s_diff and calculated by equation (20). Assuming the third color component is value, the third color component difference value is denoted v_diff and calculated by equation (21). [The formula images for equations (19) to (21) are not reproduced in this text; as described below, each combines a mean-difference term weighted by α and a standard-deviation-difference term weighted by β.]
Here, the mean function computes the mean and the std function computes the standard deviation; the standard deviation is introduced to describe differences in distribution. α and β denote the weight coefficients of the mean term and the standard deviation term respectively, and their specific values are set according to the actual scene. Typically α is greater than β, with α + β = 1; for example, α = 0.75 and β = 0.25, though the embodiment of the present application is not limited thereto.
In addition, the std function is calculated as shown in equation (22). The std function does not compute one standard deviation over the whole matrix; it computes standard deviations row by row or column by column, by default column by column:

$$\mathrm{std}(x) = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2} \tag{22}$$
It should be further noted that, in the embodiment of the present application, the distance may be computed as the L1 distance, with the specific calculation shown in equation (23). However, the distance is not limited to the L1 distance; the L2 distance, the Euclidean distance, or even other distance calculation methods may also be used, which is not limited herein.
l1_distance(x,y)=|xi-yi| (23)
Further, after h_diff, s_diff, and v_diff are calculated, the color information difference value (hsv_diff) can be calculated as shown in equation (24):

hsv_diff = mean(h_diff * hsv_factor[h] + s_diff * hsv_factor[s] + v_diff * hsv_factor[v])  (24)

Here, hsv_factor[h], hsv_factor[s], and hsv_factor[v] denote the weighting coefficients (or "weighting factors") of the three color components hue, saturation, and value respectively; their specific values are set according to the actual scene and relate to the user's focus. For example, if the user is more concerned about saturation, the value of hsv_factor[s] is set larger; if the user is more concerned about brightness, the value of hsv_factor[v] is set larger, but this is not limiting.
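As a structural sketch of equations (19) to (21) and (24): the per-channel formula images are not reproduced, so the exact placement of the visual weight factors within each channel difference is unknown and is omitted here, and the α/β defaults follow the example values above:

    import numpy as np

    def channel_diff(c1: np.ndarray, cgt: np.ndarray,
                     alpha: float = 0.75, beta: float = 0.25) -> float:
        # Assumed form: L1 distance of the channel means, weighted by alpha,
        # plus L1 distance of the channel standard deviations, weighted by beta.
        mean_term = abs(float(c1.mean()) - float(cgt.mean()))
        std_term = abs(float(c1.std(ddof=1)) - float(cgt.std(ddof=1)))  # equation (22)
        return alpha * mean_term + beta * std_term

    def hsv_diff(h1, s1, v1, hgt, sgt, vgt, w=(1/3, 1/3, 1/3)) -> float:
        # Equation (24), with w standing in for hsv_factor[h]/[s]/[v];
        # equal channel weights are an illustrative assumption.
        return (channel_diff(h1, hgt) * w[0]
                + channel_diff(s1, sgt) * w[1]
                + channel_diff(v1, vgt) * w[2])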
S204: acquire the edge information of the image to be evaluated and the edge information of the true value image, and perform a difference calculation on the two according to the first visual weight factor and the second visual weight factor to obtain the edge information difference value.
It should be noted that the edge information may be obtained using a preset detection model, such as a Sobel filter, or other edge detection methods such as the Canny edge detector. In a specific example, the edge information is calculated using a Sobel filter, a discrete differentiation algorithm used to approximate the gradient of the image gray levels. Applying the edge detection algorithm at any pixel of the image yields the corresponding gradient vector or its norm, giving a horizontal gradient approximation and a vertical gradient approximation for each pixel, from which the edge information of the image to be evaluated or the true value image is obtained. Specifically, in some embodiments, acquiring the edge information of the image to be evaluated and the edge information of the true value image may include:
performing edge detection on the image to be evaluated by using a preset edge detection model to obtain a transverse gradient approximate value and a longitudinal gradient approximate value of each pixel in the image to be evaluated; performing edge detection on the truth-value image by using a preset edge detection model to obtain a transverse gradient approximate value and a longitudinal gradient approximate value of each pixel in the truth-value image;
performing gradient calculation on a transverse gradient approximate value and a longitudinal gradient approximate value of each pixel in the image to be evaluated to obtain a gradient value of each pixel in the image to be evaluated, and determining the gradient value of each pixel in the image to be evaluated as edge information of the image to be evaluated;
and performing gradient calculation on a transverse gradient approximation and a longitudinal gradient approximation of each pixel in the truth-value image to obtain a gradient value of each pixel in the truth-value image, and determining the gradient value of each pixel in the truth-value image as edge information of the truth-value image.
That is, for the image to be evaluated, the preset edge detection model can be used to obtain the horizontal gradient approximation (denoted Gx1) and the vertical gradient approximation (denoted Gy1) of each pixel. The gradient value of each pixel in the image to be evaluated is then calculated from the combination of Gx1 and Gy1, as shown in equation (25); these gradient values constitute the edge information of the image to be evaluated, denoted G1:

$$G_1 = \sqrt{G_{x1}^2 + G_{y1}^2} \tag{25}$$

For the true value image, the preset edge detection model can likewise be used to obtain the horizontal gradient approximation (denoted Gxgt) and the vertical gradient approximation (denoted Gygt) of each pixel. The gradient value of each pixel in the true value image is then calculated from the combination of Gxgt and Gygt, as shown in equation (26); these gradient values constitute the edge information of the true value image, denoted Ggt:

$$G_{gt} = \sqrt{G_{xgt}^2 + G_{ygt}^2} \tag{26}$$
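As one concrete realization of the preset edge detection model, a Sobel-based sketch using OpenCV (the 3x3 kernel size is an assumed choice, not specified by the patent):

    import cv2
    import numpy as np

    def gradient_magnitude(gray: np.ndarray) -> np.ndarray:
        # Equations (25)/(26): G = sqrt(Gx^2 + Gy^2) per pixel.
        gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)  # horizontal approximation Gx
        gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)  # vertical approximation Gy
        return np.sqrt(gx ** 2 + gy ** 2)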
Thus, after the edge information of the image to be evaluated (G1) and the edge information of the true value image (Ggt) are obtained, the difference calculation of the edge information can be performed in combination with the first visual weight factor (hvs_factor1) and the second visual weight factor (hvs_factorgt). Specifically, in some embodiments, performing the difference calculation on the edge information of the image to be evaluated and the edge information of the true value image according to the first visual weight factor and the second visual weight factor to obtain the edge information difference value may include:
performing weighted calculation on the gradient value of each pixel in the image to be evaluated according to the first visual weight factor to obtain a gradient correction value of each pixel in the image to be evaluated;
performing a weighted calculation on the gradient value of each pixel in the true value image according to the second visual weight factor to obtain a gradient correction value of each pixel in the true value image;
and performing distance calculation on the gradient correction value of each pixel in the image to be evaluated and the gradient correction value of each pixel in the true value image to obtain the edge information difference value.
It should be noted that the gradient value of each pixel in the image to be evaluated is first weighted by the first visual weight factor to obtain hvs_factor1 * G1, and the gradient value of each pixel in the true value image is weighted by the second visual weight factor to obtain hvs_factorgt * Ggt. The distance between hvs_factor1 * G1 and hvs_factorgt * Ggt can then be calculated, for example as the L1 distance, as shown in equation (27):

$$G\_diff = \sum_{i}\left|\,hvs\_factor_1 \cdot G_{1i} - hvs\_factor_{gt} \cdot G_{gti}\,\right| \tag{27}$$

Here, G_diff denotes the edge information difference value between the image to be evaluated and the true value image; G1i denotes the gradient value of the i-th pixel in the image to be evaluated, and Ggti denotes the gradient value of the i-th pixel in the true value image.
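A minimal sketch of equation (27); averaging over pixels rather than summing is an added assumption so the result does not grow with image size:

    import numpy as np

    def edge_diff(g1: np.ndarray, ggt: np.ndarray,
                  hvs1: np.ndarray, hvsgt: np.ndarray) -> float:
        # L1 distance between the visually weighted gradient maps.
        return float(np.mean(np.abs(hvs1 * g1 - hvsgt * ggt)))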
S205: perform a weighted calculation on the color information difference value and the edge information difference value using preset weight values to obtain an image quality evaluation result of the image to be evaluated.
It should be noted that after the color information difference value (hsv_diff) and the edge information difference value (G_diff) are obtained, the two parts may each be normalized and then combined by weighted averaging, as shown in equation (10), in which the specific preset weight values (hsv_weight and G_weight) are set adaptively for different scenes. According to equation (10), the final image quality evaluation result (output) is obtained. The higher the final result, the higher the similarity of the two images, so the evaluation result is closer to subjective human perception, greatly improving the degree to which the two match.
In short, because the related art evaluates image quality using PSNR, SSIM, and the like, it suffers from inconsistency with subjective human perception. The embodiment of the present application therefore provides a picture evaluation scheme closer to subjective human perception: the human visual system (HVS) is integrated into the image quality evaluation system, and the differences in the HSV (hue, saturation, value) color space and in edge information are combined as evaluation indexes, improving the degree to which the final image quality evaluation result matches subjective human perception.
Referring to fig. 3, a schematic diagram of the composition of an image quality evaluation framework provided in an embodiment of the present application is shown. As shown in fig. 3, the image quality evaluation framework may include a data module 301, an index difference module 302, a human visual system (HVS) 303, a weighted average calculation module 304, and an image quality evaluation result 305. The data module 301 includes the true value image and the image to be evaluated; in the index difference module 302, the indexes include color information (HSV) and edge information (Edges); the human visual system 303 acts on the index difference calculation in the index difference module 302; and the weighted average calculation module 304 produces the final image quality evaluation result 305.
In the embodiment of the present application, the human visual system acts as a weighting factor on the calculated differences in HSV and edge information, which can solve the problem that current image evaluation indexes are inconsistent with subjective human perception. In addition, various parameters (for example, weight values such as hsv_factor[h], hsv_factor[s], hsv_factor[v], hsv_weight, and G_weight) may be adjusted manually to suit different scenes and different evaluation systems (for example, one in which the user pays more attention to color information versus one in which the user pays more attention to edge information). Moreover, once parameter configuration is complete, one image or even a whole data set can be evaluated automatically; and the evaluation result of the whole system can also be used as part of a loss function in model training, helping the convergence of denoising or super-resolution networks better match human expectations.
This embodiment provides an image quality evaluation method whose specific implementation is elaborated above. It can be seen that the technical solution of this embodiment both integrates the human visual system into the image quality evaluation system and uses the differences in color information and edge information as evaluation indexes, which can solve the problem that current image quality evaluation indexes are inconsistent with subjective human perception, improve the accuracy of the evaluation result, and improve the degree to which it matches subjective human perception.
In another embodiment of the present application, based on the same inventive concept as the previous embodiments, refer to fig. 4, which shows a schematic structural diagram of an image quality evaluation apparatus 40 provided in an embodiment of the present application. As shown in fig. 4, the image quality evaluation apparatus 40 may include an acquisition unit 401, a determination unit 402, and a calculation unit 403; wherein:
an obtaining unit 401 configured to obtain an image to be evaluated and a corresponding true value image;
a determining unit 402 configured to determine a color information difference value and an edge information difference value between the image to be evaluated and the true value image;
the calculating unit 403 is configured to perform weighted calculation on the color information difference value and the edge information difference value by using a preset weight value, so as to obtain an image quality evaluation result of the image to be evaluated.
In some embodiments, the determining unit 402 is further configured to determine a first visual weighting factor of the image to be evaluated and a second visual weighting factor of the truth image;
an obtaining unit 401, configured to obtain a color information value of the image to be evaluated and a color information value of the true value image;
a calculating unit 403, further configured to perform difference calculation on the color information value of the image to be evaluated and the color information value of the true value image according to the first visual weighting factor and the second visual weighting factor, so as to obtain the color information difference value;
an obtaining unit 401, configured to obtain edge information of the image to be evaluated and edge information of the true value image;
the calculating unit 403 is further configured to perform difference calculation on the edge information of the image to be evaluated and the edge information of the true value image according to the first visual weighting factor and the second visual weighting factor, so as to obtain the edge information difference value.
In some embodiments, referring to fig. 4, the image quality evaluation apparatus 40 may further include a normalization unit 404; wherein:
a determining unit 402, further configured to determine edge information of the image to be evaluated and gray information of the image to be evaluated; determining an edge calculation value of the image to be evaluated based on the edge information of the image to be evaluated;
a normalization unit 404 configured to normalize the edge calculation value of the image to be evaluated to obtain an edge factor of the image to be evaluated;
a determining unit 402, further configured to determine a gray scale calculation value of the image to be evaluated based on gray scale information of the image to be evaluated;
the normalization unit 404 is further configured to perform normalization processing on the gray scale calculation value of the image to be evaluated to obtain a gray scale factor of the image to be evaluated;
the calculating unit 403 is further configured to perform multiplication operation on the edge factor of the image to be evaluated and the gray scale factor of the image to be evaluated to obtain a first visual weight factor of the image to be evaluated.
In some embodiments, the determining unit 402 is further configured to determine edge information of the truth image and gray scale information of the truth image; and determining an edge calculation value of the truth image based on the edge information of the truth image;
a normalization unit 404, configured to perform normalization processing on the edge calculation value of the true value image to obtain an edge factor of the true value image;
a determining unit 402 further configured to determine a gray scale calculation value of the true value image based on gray scale information of the true value image;
a normalization unit 404, configured to perform normalization processing on the gray scale calculation value of the true value image to obtain a gray scale factor of the true value image;
the calculating unit 403 is further configured to multiply an edge factor of the true value image and a gray scale factor of the true value image to obtain a second visual weight factor of the true value image.
In some embodiments, the color information value comprises a first color component value, a second color component value, and a third color component value;
a calculating unit 403, configured to perform difference calculation on the first color component value of the image to be evaluated and the first color component value of the true-value image according to the first visual weighting factor and the second visual weighting factor, so as to obtain a first color component difference value; performing difference calculation on a second color component value of the image to be evaluated and a second color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a second color component difference value; performing difference calculation on a third color component value of the image to be evaluated and a third color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a third color component difference value; and performing weighted average calculation on the first color component difference value, the second color component difference value and the third color component difference value by using a preset color weight factor to obtain the color information difference value.
In some embodiments, the first color component value is a hue value, the second color component value is a saturation value, the third color component value is a lightness value; alternatively, the first color component value is a hue value, the second color component value is a saturation value, and the third color component value is a luminance value.
In some embodiments, referring to fig. 4, the image quality evaluation apparatus 40 may further include a detection unit 405 configured to perform edge detection on the image to be evaluated by using a preset edge detection model to obtain a horizontal gradient approximation and a vertical gradient approximation of each pixel in the image to be evaluated, and to perform edge detection on the true value image by using the preset edge detection model to obtain a horizontal gradient approximation and a vertical gradient approximation of each pixel in the true value image;
the calculating unit 403 is further configured to perform gradient calculation on the horizontal gradient approximation and the vertical gradient approximation of each pixel in the image to be evaluated to obtain a gradient value of each pixel in the image to be evaluated, and determine the gradient value of each pixel in the image to be evaluated as the edge information of the image to be evaluated; and to perform gradient calculation on the horizontal gradient approximation and the vertical gradient approximation of each pixel in the true value image to obtain a gradient value of each pixel in the true value image, and determine the gradient value of each pixel in the true value image as the edge information of the true value image.
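The "preset edge detection model" is not named in the text, but the Sobel operator is a common choice that yields exactly the horizontal and vertical gradient approximations described; the sketch below uses it, with the usual Euclidean magnitude as the per-pixel gradient value. Both the operator and the magnitude formula are assumptions.

```python
# Hedged sketch: Sobel-style edge detection producing per-pixel gradient
# values. The Sobel kernels and the sqrt(gx^2 + gy^2) magnitude are assumed
# instances of the "preset edge detection model" and "gradient calculation".
import numpy as np
from scipy.ndimage import convolve

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def gradient_values(gray: np.ndarray) -> np.ndarray:
    gx = convolve(gray.astype(float), SOBEL_X, mode="nearest")  # horizontal gradient approximation
    gy = convolve(gray.astype(float), SOBEL_Y, mode="nearest")  # vertical gradient approximation
    return np.hypot(gx, gy)  # gradient value of each pixel
```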
In some embodiments, the calculating unit 403 is specifically configured to perform weighted calculation on the gradient value of each pixel in the image to be evaluated according to the first visual weight factor, so as to obtain a gradient correction value of each pixel in the image to be evaluated; perform weighted calculation on the gradient value of each pixel in the true value image according to the second visual weight factor to obtain a gradient correction value of each pixel in the true value image; and perform distance calculation on the gradient correction value of each pixel in the image to be evaluated and the gradient correction value of each pixel in the true value image to obtain the edge information difference value.
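Continuing the sketch, the gradient correction and distance steps might look as follows. The text does not pin down the distance metric, so the Euclidean (L2) distance over all pixels is an assumption.

```python
# Hedged sketch of the edge information difference. grad_eval and grad_truth
# are the per-pixel gradient values from the previous sketch; w1 and w2 are
# the visual weight factors. The L2 distance is an assumed choice for the
# "distance calculation".
import numpy as np

def edge_info_difference(grad_eval: np.ndarray, grad_truth: np.ndarray,
                         w1: np.ndarray, w2: np.ndarray) -> float:
    corrected_eval = w1 * grad_eval     # gradient correction values, image to be evaluated
    corrected_truth = w2 * grad_truth   # gradient correction values, true value image
    return float(np.linalg.norm(corrected_eval - corrected_truth))
```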
In some embodiments, the preset weight value comprises a first weight value and a second weight value;
the normalization unit 404 is further configured to perform normalization processing on the color information difference value to obtain a color information normalization value, and perform weighted calculation on the color information normalization value according to the first weight value to obtain a color information evaluation result;
the normalization unit 404 is further configured to perform normalization processing on the edge information difference value to obtain an edge information normalization value, and perform weighted calculation on the edge information normalization value according to the second weight value to obtain an edge information evaluation result;
the calculating unit 403 is further configured to sum the color information evaluation result and the edge information evaluation result to obtain the image quality evaluation result of the image to be evaluated.
In some embodiments, referring to fig. 4, the image quality evaluation apparatus 40 may further include a setting unit 406 configured to set the first weight value to be greater than the second weight value if color information is of greater interest in the image to be evaluated, and to set the first weight value to be smaller than the second weight value if edge information is of greater interest in the image to be evaluated.
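Putting the pieces together, a minimal sketch of the final weighted combination follows. The normalization of the two scalar difference values is not specified above, so mapping each through 1/(1+x) (smaller differences give higher scores) and the example weights are assumptions; per the setting unit's rule, w_color exceeds w_edge when color information is of greater interest, and vice versa.

```python
# Hedged sketch of the final image quality evaluation result. The 1/(1+x)
# normalization and the example weight values 0.6/0.4 are assumptions.
def quality_score(color_diff: float, edge_diff: float,
                  w_color: float = 0.6, w_edge: float = 0.4) -> float:
    color_norm = 1.0 / (1.0 + color_diff)  # color information normalization value
    edge_norm = 1.0 / (1.0 + edge_diff)    # edge information normalization value
    # Weighted sum of the color and edge information evaluation results.
    return w_color * color_norm + w_edge * edge_norm
```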
It is understood that in this embodiment, a "unit" may be a part of a circuit, a part of a processor, a part of a program or software, and so on; it may also be a module, or it may be non-modular. Moreover, the units in this embodiment may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional module.
Based on such understanding, the technical solution of this embodiment, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the method of this embodiment. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Accordingly, the present embodiments provide a computer storage medium storing a computer program which, when executed by at least one processor, performs the steps of the method of any of the preceding embodiments.
In yet another embodiment of the present application, based on the composition of the image quality evaluation apparatus 40 and the computer storage medium described above, fig. 5 shows a specific hardware structure diagram of an electronic device 50 provided in an embodiment of the present application. As shown in fig. 5, the electronic device 50 may include a processor 501, and the processor 501 may call and run a computer program from a memory to perform the method described in any of the foregoing embodiments.
Optionally, as shown in fig. 5, the electronic device 50 may further include a memory 502. From the memory 502, the processor 501 may call and run a computer program to perform the method of any of the previous embodiments.
The memory 502 may be a separate device from the processor 501, or may be integrated into the processor 501.
Optionally, as shown in fig. 5, the electronic device 50 may further include a transceiver 503, and the processor 501 may control the transceiver 503 to communicate with other devices, and specifically, may transmit information or data to the other devices or receive information or data transmitted by the other devices.
The transceiver 503 may include a transmitter and a receiver. The transceiver 503 may further include one or more antennas.
Alternatively, the electronic device 50 may specifically be the electronic device described in the foregoing embodiment, or a device integrated with the image quality evaluation apparatus 40 described in any one of the foregoing embodiments. Here, the electronic device 50 may implement the corresponding processes of the methods in the embodiments of the present application, and for brevity, details are not described here again.
In yet another embodiment of the present application, based on the composition of the image quality evaluation apparatus 40 and the computer storage medium described above, fig. 6 shows a specific hardware structure diagram of a chip 60 provided in an embodiment of the present application. As shown in fig. 6, the chip 60 may include a processor 601, and the processor 601 may call and execute a computer program from a memory to perform the method described in any of the foregoing embodiments.
The memory may be a memory provided inside the chip 60 or may be a memory provided outside the chip 60. Whether integrated within the chip 60 or external to the chip 60, the processor 601 may retrieve and execute a computer program from the memory to perform the method of any of the preceding embodiments.
In the embodiment of the present application, when the memory is integrated inside the chip 60, optionally, as shown in fig. 6, the chip 60 may further include a memory 602. From the memory 602, the processor 601 may call and run a computer program to perform the method of any of the previous embodiments.
In the chip 60, the memory 602 may be a separate device independent from the processor 601, or may be integrated in the processor 601.
Optionally, the chip 60 may further comprise an input interface 603. The processor 601 may control the input interface 603 to communicate with other devices or chips, and specifically, may obtain information or data transmitted by other devices or chips.
Optionally, the chip 60 may further include an output interface 604. The processor 601 may control the output interface 604 to communicate with other devices or chips, and may specifically output information or data to the other devices or chips.
Optionally, the chip 60 may be applied to the terminal described in the foregoing embodiment, and the chip may implement corresponding processes of the methods in the embodiments of the present application, which are not described herein again for brevity.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-a-chip, etc.
It should be noted that the processor of the embodiments of the present application may be an integrated circuit chip having signal processing capability. In implementation, the steps of the above method embodiments may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or execute the methods, steps, and logic block diagrams disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or EEPROM, or a register. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
It should also be noted that the memory in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of example but not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). It should be noted that the memories of the systems and methods described herein are intended to include, but not be limited to, these and any other suitable types of memory.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof. For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Those of ordinary skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
It should be noted that, in the present application, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
The features disclosed in the several product embodiments presented in this application can be combined arbitrarily, without conflict, to arrive at new product embodiments.
The features disclosed in the several method or apparatus embodiments provided herein may be combined in any combination to arrive at a new method or apparatus embodiment without conflict.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. An image quality evaluation method, characterized by comprising:
acquiring an image to be evaluated and a corresponding true value image;
determining a color information difference value and an edge information difference value between the image to be evaluated and the true value image; and
performing weighted calculation on the color information difference value and the edge information difference value by using a preset weight value to obtain an image quality evaluation result of the image to be evaluated.
2. The method according to claim 1, wherein the determining the color information difference value and the edge information difference value between the image to be evaluated and the true value image comprises:
determining a first visual weight factor of the image to be evaluated and a second visual weight factor of the truth-value image;
acquiring a color information value of the image to be evaluated and a color information value of the true value image, and performing difference calculation on the color information value of the image to be evaluated and the color information value of the true value image according to the first visual weight factor and the second visual weight factor to obtain a color information difference value; and
acquiring edge information of the image to be evaluated and edge information of the true value image, and performing difference calculation on the edge information of the image to be evaluated and the edge information of the true value image according to the first visual weight factor and the second visual weight factor to obtain the edge information difference value.
3. The method of claim 2, wherein determining the first visual weighting factor for the image to be evaluated comprises:
determining edge information of the image to be evaluated and gray information of the image to be evaluated;
determining an edge calculation value of the image to be evaluated based on the edge information of the image to be evaluated;
normalizing the edge calculation value of the image to be evaluated to obtain an edge factor of the image to be evaluated;
determining a gray scale calculation value of the image to be evaluated based on the gray scale information of the image to be evaluated;
normalizing the gray scale calculation value of the image to be evaluated to obtain a gray scale factor of the image to be evaluated;
and multiplying the edge factor of the image to be evaluated and the gray factor of the image to be evaluated to obtain a first visual weight factor of the image to be evaluated.
4. The method of claim 2, wherein determining the second visual weighting factor for the truth image comprises:
determining edge information of the true value image and gray scale information of the true value image;
determining an edge calculation value of the truth-value image based on edge information of the truth-value image;
normalizing the edge calculation value of the true value image to obtain an edge factor of the true value image;
determining a gray scale calculation value of the true value image based on gray scale information of the true value image;
normalizing the gray scale calculation value of the true value image to obtain a gray scale factor of the true value image;
and multiplying the edge factor of the true value image and the gray factor of the true value image to obtain a second visual weight factor of the true value image.
5. The method according to claim 2, wherein the color information value comprises a first color component value, a second color component value and a third color component value;
the performing difference calculation on the color information value of the image to be evaluated and the color information value of the true value image according to the first visual weighting factor and the second visual weighting factor to obtain the color information difference value includes:
performing difference calculation on the first color component value of the image to be evaluated and the first color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a first color component difference value;
performing difference calculation on a second color component value of the image to be evaluated and a second color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a second color component difference value;
performing difference calculation on a third color component value of the image to be evaluated and a third color component value of the truth-value image according to the first visual weight factor and the second visual weight factor to obtain a third color component difference value;
and performing weighted average calculation on the first color component difference value, the second color component difference value and the third color component difference value by using a preset color weight factor to obtain the color information difference value.
6. The method according to claim 5, wherein the first color component value is a hue value, the second color component value is a saturation value, and the third color component value is a lightness value; alternatively, the first color component value is a hue value, the second color component value is a saturation value, and the third color component value is a luminance value.
7. The method according to claim 2, wherein the obtaining edge information of the image to be evaluated and edge information of the true value image comprises:
performing edge detection on the image to be evaluated by using a preset edge detection model to obtain a transverse gradient approximate value and a longitudinal gradient approximate value of each pixel in the image to be evaluated; performing edge detection on the truth-value image by using a preset edge detection model to obtain a transverse gradient approximate value and a longitudinal gradient approximate value of each pixel in the truth-value image;
performing gradient calculation on a transverse gradient approximation and a longitudinal gradient approximation of each pixel in the image to be evaluated to obtain a gradient value of each pixel in the image to be evaluated, and determining the gradient value of each pixel in the image to be evaluated as edge information of the image to be evaluated;
and performing gradient calculation on a transverse gradient approximation and a longitudinal gradient approximation of each pixel in the truth-value image to obtain a gradient value of each pixel in the truth-value image, and determining the gradient value of each pixel in the truth-value image as edge information of the truth-value image.
8. The method according to claim 7, wherein the performing difference calculation on the edge information of the image to be evaluated and the edge information of the true-value image according to the first visual weighting factor and the second visual weighting factor to obtain the difference value of the edge information comprises:
performing weighted calculation on the gradient value of each pixel in the image to be evaluated according to the first visual weight factor to obtain a gradient correction value of each pixel in the image to be evaluated;
performing weighted calculation on the gradient value of each pixel in the truth-value image according to the second visual weight factor to obtain a gradient correction value of each pixel in the truth-value image;
and performing distance calculation on the gradient correction value of each pixel in the image to be evaluated and the gradient correction value of each pixel in the true value image to obtain the edge information difference value.
9. The method according to any one of claims 1 to 8, wherein the preset weight value comprises a first weight value and a second weight value;
the weighting calculation of the color information difference value and the edge information difference value by using a preset weight value to obtain an image quality evaluation result of the image to be evaluated comprises the following steps:
normalizing the color information difference value to obtain a color information normalized value; carrying out weighted calculation on the color information normalization value according to the first weight value to obtain a color information evaluation result;
normalizing the edge information difference value to obtain an edge information normalized value; carrying out weighted calculation on the edge information normalization value according to the second weight value to obtain an edge information evaluation result;
and summing the color information evaluation result and the edge information evaluation result to obtain an image quality evaluation result of the image to be evaluated.
10. The method of claim 9, further comprising:
if color information is of greater interest in the image to be evaluated, setting the first weight value to be greater than the second weight value;
and if edge information is of greater interest in the image to be evaluated, setting the first weight value to be smaller than the second weight value.
11. An image quality evaluation apparatus, characterized by comprising an acquisition unit, a determination unit, and a calculation unit; wherein:
the acquisition unit is configured to acquire an image to be evaluated and a corresponding true value image;
the determining unit is configured to determine a color information difference value and an edge information difference value between the image to be evaluated and the true value image;
the calculating unit is configured to perform weighted calculation on the color information difference value and the edge information difference value by using a preset weight value to obtain an image quality evaluation result of the image to be evaluated.
12. An electronic device, characterized in that the electronic device comprises a memory and a processor; wherein:
the memory for storing a computer program operable on the processor;
the processor, when running the computer program, is configured to perform the method of any of claims 1 to 10.
13. A chip, characterized in that the chip comprises a processor; wherein:
the processor is configured to call and run a computer program from a memory to perform the method of any one of claims 1 to 10.
14. A computer storage medium, characterized in that the computer storage medium stores a computer program which, when executed by at least one processor, implements the method of any one of claims 1 to 10.
CN202110029714.5A 2021-01-11 2021-01-11 Image quality evaluation method, device, equipment, chip and storage medium Pending CN114764771A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110029714.5A CN114764771A (en) 2021-01-11 2021-01-11 Image quality evaluation method, device, equipment, chip and storage medium

Publications (1)

Publication Number Publication Date
CN114764771A true CN114764771A (en) 2022-07-19

Family

ID=82364009

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110029714.5A Pending CN114764771A (en) 2021-01-11 2021-01-11 Image quality evaluation method, device, equipment, chip and storage medium

Country Status (1)

Country Link
CN (1) CN114764771A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116843582A (en) * 2023-08-31 2023-10-03 南京诺源医疗器械有限公司 Denoising enhancement system and method of 2CMOS camera based on deep learning
CN116843582B (en) * 2023-08-31 2023-11-03 南京诺源医疗器械有限公司 Denoising enhancement system and method of 2CMOS camera based on deep learning

Similar Documents

Publication Publication Date Title
Li et al. Content-partitioned structural similarity index for image quality assessment
Ma et al. Objective quality assessment for color-to-gray image conversion
Ma et al. Perceptual quality assessment for multi-exposure image fusion
CN111741211B (en) Image display method and apparatus
US9020243B2 (en) Image adjustment
US8908989B2 (en) Recursive conditional means image denoising
CN107451969A (en) Image processing method, device, mobile terminal and computer-readable recording medium
US20140079319A1 (en) Methods for enhancing images and apparatuses using the same
US9424632B2 (en) System and method for generating high dynamic range images
CN112384946A (en) Image dead pixel detection method and device
WO2023005818A1 (en) Noise image generation method and apparatus, electronic device, and storage medium
CN113962859B (en) Panorama generation method, device, equipment and medium
Lecca et al. An image contrast measure based on Retinex principles
Saleh et al. Adaptive uncertainty distribution in deep learning for unsupervised underwater image enhancement
CN114764771A (en) Image quality evaluation method, device, equipment, chip and storage medium
CN112651945A (en) Multi-feature-based multi-exposure image perception quality evaluation method
Tade et al. Tone mapped high dynamic range image quality assessment techniques: survey and analysis
CN113422893B (en) Image acquisition method and device, storage medium and mobile terminal
CN115439384A (en) Ghost-free multi-exposure image fusion method and device
CN111179245B (en) Image quality detection method, device, electronic equipment and storage medium
Nair et al. Benchmarking single image dehazing methods
Anitha et al. Quality assessment of resultant images after processing
Benzi et al. A bio-inspired synergistic virtual retina model for tone mapping
Wu et al. Contrast enhancement based on discriminative co-occurrence statistics
CN111800626B (en) Photographing consistency evaluation method and device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination