CN115880537A - Method and system for evaluating image quality of confrontation sample - Google Patents

Method and system for evaluating image quality of confrontation sample

Info

Publication number
CN115880537A
CN115880537A (application CN202310121665.7A)
Authority
CN
China
Prior art keywords
image
sample
feature
countermeasure
original image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310121665.7A
Other languages
Chinese (zh)
Other versions
CN115880537B (en)
Inventor
温文媖
黄明辉
方玉明
张玉书
左一帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangxi University of Finance and Economics
Original Assignee
Jiangxi University of Finance and Economics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangxi University of Finance and Economics filed Critical Jiangxi University of Finance and Economics
Priority to CN202310121665.7A priority Critical patent/CN115880537B/en
Publication of CN115880537A publication Critical patent/CN115880537A/en
Application granted granted Critical
Publication of CN115880537B publication Critical patent/CN115880537B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a method and a system for evaluating the image quality of adversarial samples. The method comprises the following steps: screening out original images whose confidence is greater than a confidence threshold with classifiers to construct an adversarial sample dataset, attacking the dataset with multiple adversarial attack methods, and obtaining adversarial samples by adjusting the parameters of the different attack methods; calculating a residual image of the adversarial sample, performing feature pre-extraction on the original image, the adversarial sample and the residual image with a feature coding network, concatenating the feature maps extracted from the original image, the adversarial sample and the residual image to obtain a new feature map, and processing the new feature map and the feature map of the adversarial sample with a multi-scale feature extraction network to obtain feature maps of different scales; and measuring the feature maps of different scales with a structural similarity measurement method to obtain corresponding scores, and finally averaging the scores of all scales to obtain the final quality score. The method can effectively evaluate the image quality of adversarial samples.

Description

Method and system for evaluating image quality of confrontation sample
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a system for evaluating the image quality of adversarial samples.
Background
At present, research on deep neural networks (DNNs) has made breakthrough progress; DNNs are widely applied to image classification, semantic recognition and other tasks, and have been successfully used in real life, for example in automatic driving and face recognition. To improve DNNs, work on finding the limitations of deep learning is also ongoing. In particular, methods that study how to attack deep learning models are increasing; such methods make a deep learning model classify or recognize incorrectly. Related studies indicate that adversarial samples that interfere with deep learning models are prevalent in machine learning. Specifically, given a classifier D, a small perturbation r is defined and added to an original image x; when the perturbed image is put into the classifier, the prediction result differs from the originally predicted label, that is, the image class predicted by the classifier is an error class. In real life this is very dangerous: if adversarial samples are fed to the recognition system of an automatic driving car, the car may take unexpected and inappropriate actions. Since the strength of an adversarial sample indirectly reflects the robustness of a DNN, it is necessary to improve the robustness of machine learning from the perspective of adversarial samples.
The study of adversarial samples helps to understand the image features used by a classifier, but the existence of these adversarial samples contradicts the generalization capability of DNNs. Although DNNs show advanced performance in some areas, their robustness in the face of subtle perturbations is not high. For perturbations that cannot be recognized by the human eye, how to judge whether a perturbation has been added to an image is worth studying. At present, researchers use the L_p norm to judge the similarity between an adversarial sample and the original image in order to evaluate the strength of the adversarial sample. In addition, the L_p norm is also used as a constraint on the attack model: the attack model is optimized by continually decreasing the L_p norm of the adversarial perturbation. However, studies have found that the L_p norm ignores human visual perception in the measurement process and is therefore not suitable for evaluating the strength of adversarial samples. Therefore, how to effectively evaluate the image quality of adversarial samples is a technical problem to be solved by those skilled in the art.
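For illustration only, the following minimal sketch (not part of the original patent text; NumPy is assumed and all names are illustrative) shows how such L_p-norm measurements of an adversarial perturbation are typically computed:

```python
import numpy as np

def perturbation_norms(original: np.ndarray, adversarial: np.ndarray) -> dict:
    """Common L_p norms of the perturbation r = adversarial - original."""
    r = (adversarial.astype(np.float64) - original.astype(np.float64)).ravel()
    return {
        "L0": int(np.count_nonzero(r)),          # number of changed pixels
        "L2": float(np.linalg.norm(r, ord=2)),   # Euclidean size of the perturbation
        "Linf": float(np.abs(r).max()),          # largest single-pixel change
    }
```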
Disclosure of Invention
To this end, an embodiment of the present invention proposes an adversarial sample image quality evaluation method to effectively evaluate the image quality of adversarial samples.
The method for evaluating the image quality of adversarial samples according to one embodiment of the invention comprises the following steps:
step 1, screening out original images whose confidence is greater than a confidence threshold with classifiers to construct an adversarial sample dataset, attacking the dataset with multiple adversarial attack methods, and obtaining adversarial samples by adjusting the parameters of different attack methods;
step 2, calculating a residual image of the adversarial sample, performing feature pre-extraction on the original image, the adversarial sample and the residual image with a feature coding network, concatenating the feature maps extracted from the original image, the adversarial sample and the residual image to obtain a new feature map, and further processing the new feature map and the feature map of the adversarial sample with a multi-scale feature extraction network to obtain feature maps of different scales;
and step 3, measuring the feature maps of different scales with a structural similarity measurement method to obtain corresponding scores, and finally averaging the scores of all scales to obtain the final quality score.
According to the method for evaluating the image quality of adversarial samples of the embodiment of the invention, original images whose confidence is greater than the confidence threshold are screened out with classifiers to construct the adversarial sample dataset; the image features are pre-extracted with the feature coding network, which discards redundant information in the image and enlarges the receptive field; the hierarchical characteristics of the human visual perception system are simulated with multi-scale features, which enriches the semantic information of the image; and the structural similarity scores of the image are obtained with the structural similarity measurement method and averaged to obtain the final quality score of the adversarial sample. The method can accurately extract the structural information of the adversarial sample under perturbation, can calculate the quality of the adversarial sample more accurately through the structural similarity measurement method, is highly effective for calculating the image quality of adversarial samples, and has good extensibility for the constructed adversarial samples.
In addition, the method for evaluating the image quality of adversarial samples according to the embodiment of the invention may further have the following additional technical features:
further, step 1 specifically includes:
Step 1.1, selecting the four classifiers VGG, ResNet, AlexNet and GoogLeNet to obtain the confidences of an input image I, denoted P1, P2, P3 and P4 respectively; when the confidences P1, P2, P3 and P4 of the input image I are greater than 90%, taking the input image I as an original image whose confidence is greater than the confidence threshold;
Step 1.2, attacking the adversarial sample dataset with multiple adversarial attack methods, wherein the multiple adversarial attack methods are FGSM, MI-FGSM, PI-FGSM, Gaussian Noise, PGD and ODS-PGD, and the initial Epsilon values of the attack methods are (0.01, 0.1, 0.2, 0.3, 0.5), (0.5, 5, 15, 35, 55), (0.01, 0.1, 0.2, 0.3, 0.5), (0.03, 0.10, 0.20, 0.30, 0.50).
Further, step 2 specifically includes:
Step 2.1, based on the original image and the adversarial sample, generating a residual image with a normalized logarithmic difference function, in which R(x, y) denotes a pixel of the residual image, I(x, y) denotes a pixel of the original image, A(x, y) denotes a pixel of the adversarial sample, and ε denotes an image normalization influence parameter;
Step 2.2, constructing a feature coding network, wherein the 1st convolutional layer of the feature coding network has 3 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and the 2nd, 3rd and 4th convolutional layers each have 32 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and performing feature pre-extraction on the original image, the adversarial sample and the residual image through the feature coding network to obtain the feature maps of the original image, the adversarial sample and the residual image;
Step 2.3, concatenating the feature maps of the original image, the adversarial sample and the residual image with the concatenation function torch.cat() to obtain a new feature map, wherein the dim argument of torch.cat() denotes the dimension along which the concatenation is performed, and dim = 1 indicates concatenation along dimension 1 (here the channel dimension);
Step 2.4, performing dimensionality reduction on the new feature map several times;
Step 2.5, inputting the reduced feature map obtained in step 2.4 and the feature map of the adversarial sample obtained in step 2.2 into a multi-scale feature extraction network to obtain features of different scales, namely the feature maps f_i and the adversarial sample feature maps a_i at each scale i.
Further, in step 2.1, ε is 1.
Further, step 3 specifically includes:
Step 3.1, based on the feature maps f_i and the adversarial sample feature maps a_i, measuring the feature maps of different scales with a structural similarity measurement method to obtain the corresponding scores, the calculation formula being:
Score_i = (2*cov(f_i, a_i) + C) / (var(f_i) + var(a_i) + C)
wherein Score_i denotes the score corresponding to scale i, var(f_i) and var(a_i) respectively denote the variances of the feature map f_i and of the adversarial sample feature map a_i at scale i, cov(f_i, a_i) denotes the covariance of the feature map f_i and the adversarial sample feature map a_i at scale i, and C is a small constant that prevents the denominator from being 0;
Step 3.2, averaging the scores of all scales to obtain the final quality score, the calculation formula being:
Q = (Score_1 + Score_2 + ... + Score_n) / n
wherein Q denotes the final quality score and n denotes the total number of scales.
Another embodiment of the present invention provides an adversarial sample image quality evaluation system to effectively evaluate the image quality of adversarial samples.
The adversarial sample image quality evaluation system according to an embodiment of the invention includes:
an image preprocessing module, used for screening out original images whose confidence is greater than a confidence threshold with classifiers to construct an adversarial sample dataset, attacking the dataset with multiple adversarial attack methods, and obtaining adversarial samples by adjusting the parameters of different attack methods;
a calculation and concatenation module, used for calculating a residual image of the adversarial sample, performing feature pre-extraction on the original image, the adversarial sample and the residual image with a feature coding network, concatenating the feature maps extracted from the original image, the adversarial sample and the residual image to obtain a new feature map, and further processing the new feature map and the feature map of the adversarial sample with a multi-scale feature extraction network to obtain feature maps of different scales;
and a score calculating module, used for measuring the feature maps of different scales with a structural similarity measurement method to obtain corresponding scores, and finally averaging the scores of all scales to obtain the final quality score.
According to the adversarial sample image quality evaluation system of the embodiment of the invention, original images whose confidence is greater than the confidence threshold are screened out with classifiers to construct the adversarial sample dataset; the image features are pre-extracted with the feature coding network, which discards redundant information in the image and enlarges the receptive field; the hierarchical characteristics of the human visual perception system are simulated with multi-scale features, which enriches the semantic information of the image; and the structural similarity scores of the image are obtained with the structural similarity measurement method and averaged to obtain the final quality score of the adversarial sample. The system can accurately extract the structural information of the adversarial sample under perturbation, can calculate the quality of the adversarial sample more accurately through the structural similarity measurement method, is highly effective for calculating the image quality of adversarial samples, and has good extensibility for the constructed adversarial samples.
In addition, the adversarial sample image quality evaluation system according to the above embodiment of the present invention may further have the following additional technical features:
further, the image preprocessing module is specifically configured to perform the following steps:
Step 1.1, selecting the four classifiers VGG, ResNet, AlexNet and GoogLeNet to obtain the confidences of an input image I, denoted P1, P2, P3 and P4 respectively; when the confidences P1, P2, P3 and P4 of the input image I are greater than 90%, taking the input image I as an original image whose confidence is greater than the confidence threshold;
Step 1.2, attacking the adversarial sample dataset with multiple adversarial attack methods, wherein the multiple adversarial attack methods are FGSM, MI-FGSM, PI-FGSM, Gaussian Noise, PGD and ODS-PGD, and the initial Epsilon values of the attack methods are (0.01, 0.1, 0.2, 0.3, 0.5), (0.5, 5, 15, 35, 55), (0.01, 0.1, 0.2, 0.3, 0.5), (0.03, 0.10, 0.20, 0.30, 0.50).
Further, the calculation and concatenation module is specifically configured to perform the following steps:
Step 2.1, based on the original image and the adversarial sample, generating a residual image with a normalized logarithmic difference function, in which R(x, y) denotes a pixel of the residual image, I(x, y) denotes a pixel of the original image, A(x, y) denotes a pixel of the adversarial sample, and ε denotes an image normalization influence parameter;
Step 2.2, constructing a feature coding network, wherein the 1st convolutional layer of the feature coding network has 3 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and the 2nd, 3rd and 4th convolutional layers each have 32 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and performing feature pre-extraction on the original image, the adversarial sample and the residual image through the feature coding network to obtain the feature maps of the original image, the adversarial sample and the residual image;
Step 2.3, concatenating the feature maps of the original image, the adversarial sample and the residual image with the concatenation function torch.cat() to obtain a new feature map, wherein the dim argument of torch.cat() denotes the dimension along which the concatenation is performed, and dim = 1 indicates concatenation along dimension 1 (here the channel dimension);
Step 2.4, performing dimensionality reduction on the new feature map several times;
Step 2.5, inputting the reduced feature map obtained in step 2.4 and the feature map of the adversarial sample obtained in step 2.2 into a multi-scale feature extraction network to obtain features of different scales, namely the feature maps f_i and the adversarial sample feature maps a_i at each scale i.
Further, ε is 1.
Further, the score calculating module is specifically configured to perform the following steps:
Step 3.1, based on the feature maps f_i and the adversarial sample feature maps a_i, measuring the feature maps of different scales with a structural similarity measurement method to obtain the corresponding scores, the calculation formula being:
Score_i = (2*cov(f_i, a_i) + C) / (var(f_i) + var(a_i) + C)
wherein Score_i denotes the score corresponding to scale i, var(f_i) and var(a_i) respectively denote the variances of the feature map f_i and of the adversarial sample feature map a_i at scale i, cov(f_i, a_i) denotes the covariance of the feature map f_i and the adversarial sample feature map a_i at scale i, and C is a small constant that prevents the denominator from being 0;
Step 3.2, averaging the scores of all scales to obtain the final quality score, the calculation formula being:
Q = (Score_1 + Score_2 + ... + Score_n) / n
wherein Q denotes the final quality score and n denotes the total number of scales.
Drawings
The above and/or additional aspects and advantages of embodiments of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow chart of a method for evaluating the image quality of adversarial samples according to an embodiment of the present invention;
FIG. 2 is a graph of the confidences of 50 images on the four classifiers VGG, ResNet, AlexNet and GoogLeNet;
FIG. 3 is a block diagram of an adversarial sample image quality evaluation system according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to FIG. 1, an embodiment of the invention provides a method for evaluating the image quality of adversarial samples, including the following steps:
Step 1, screening out original images whose confidence is greater than a confidence threshold with classifiers to construct an adversarial sample dataset, attacking the dataset with multiple adversarial attack methods, and obtaining adversarial samples by adjusting the parameters of different attack methods.
In this embodiment, widely used classifiers with strong performance are used to screen out images with high confidence, so that the attack effect of the adversarial attack methods can be examined and the robustness of the classification models can be verified.
Not all images in the selected image dataset meet the requirements: some images contain little feature information, and some images are classified incorrectly or with low confidence even though no perturbation has been added. A standard therefore needs to be formulated to screen the images, namely, several classifiers are selected to obtain the confidence of each original image. The four classifiers chosen are currently popular and widely applied.
Specifically, the step 1 comprises:
Step 1.1, selecting the four classifiers VGG, ResNet, AlexNet and GoogLeNet to obtain the confidences of an input image I, denoted P1, P2, P3 and P4 respectively; when the confidences P1, P2, P3 and P4 of the input image I are greater than 90%, taking the input image I as an original image whose confidence is greater than the confidence threshold.
According to the final data, 35 images meeting the requirements are selected; these images can better verify the robustness of the classification models and the effect of the adversarial attack methods. Referring to FIG. 2, FIG. 2 shows the confidence of the 50 images on the classifiers, wherein 0-50 on the abscissa represents the 50 images, and classifier (1), classifier (2), classifier (3) and classifier (4) represent the different classifiers, corresponding to GoogLeNet, ResNet, VGG and AlexNet, respectively. The ordinate represents the confidence of the different images on each classifier, for example: the confidence of image 1 on classifier (1) is 100%, on classifier (2) is 100%, on classifier (3) is 100%, and on classifier (4) is 86%.
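As an illustration of this screening step, the sketch below (not from the original text) obtains top-1 softmax confidences from four pretrained torchvision classifiers and keeps an image only when every confidence exceeds the 90% threshold; the specific model variants (resnet50, vgg16) and all names are assumptions, since the original does not state them:

```python
import torch
import torchvision.models as models

# Four pretrained ImageNet classifiers used for screening (variants assumed).
classifiers = {
    "GoogLeNet": models.googlenet(weights="DEFAULT").eval(),
    "ResNet":    models.resnet50(weights="DEFAULT").eval(),
    "VGG":       models.vgg16(weights="DEFAULT").eval(),
    "AlexNet":   models.alexnet(weights="DEFAULT").eval(),
}

@torch.no_grad()
def keep_image(image: torch.Tensor, threshold: float = 0.9) -> bool:
    """image: 1x3xHxW tensor, already resized and normalized for ImageNet models."""
    for net in classifiers.values():
        confidence = torch.softmax(net(image), dim=1).max().item()   # top-1 confidence
        if confidence <= threshold:
            return False   # reject if this classifier is not confident enough
    return True
```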
Step 1.2, attacking the adversarial sample dataset with multiple adversarial attack methods, wherein the multiple adversarial attack methods are FGSM, MI-FGSM, PI-FGSM, Gaussian Noise, PGD and ODS-PGD, and the initial Epsilon values of the attack methods are (0.01, 0.1, 0.2, 0.3, 0.5), (0.5, 5, 15, 35, 55), (0.01, 0.1, 0.2, 0.3, 0.5), (0.03, 0.10, 0.20, 0.30, 0.50).
The multiple adversarial attack methods FGSM, MI-FGSM, PI-FGSM, Gaussian Noise, PGD and ODS-PGD include representative and state-of-the-art attack methods. After the attack methods are selected, in order to better verify the influence of the parameters of the adversarial samples on the model, adversarial samples of different levels are generated by adjusting parameters such as Epsilon in each method.
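For illustration, the following is a minimal generic FGSM sketch (one of the attack methods listed above, not the patent's own code; the helper name is an assumption) showing how adversarial samples of different levels are obtained by sweeping Epsilon:

```python
import torch
import torch.nn.functional as F

def fgsm(model: torch.nn.Module, image: torch.Tensor, label: torch.Tensor, epsilon: float) -> torch.Tensor:
    """Fast Gradient Sign Method: a single perturbation step of size epsilon."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    adversarial = image + epsilon * image.grad.sign()    # move along the sign of the gradient
    return adversarial.clamp(0.0, 1.0).detach()          # keep pixel values in a valid range

# Adversarial samples of different levels, e.g. the FGSM Epsilon list given above:
# adversarial_levels = {eps: fgsm(model, image, label, eps) for eps in (0.01, 0.1, 0.2, 0.3, 0.5)}
```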
Through step 1, images that show high confidence on the classification models are screened out, which reduces the influence of classification errors and of low-confidence images on the final image quality score and on the verification of model robustness, and the influence of the attack methods at different levels is verified by adjusting the parameters of the adversarial attack methods.
Step 2, calculating a residual image of the adversarial sample, performing feature pre-extraction on the original image, the adversarial sample and the residual image with a feature coding network, concatenating the feature maps extracted from the original image, the adversarial sample and the residual image to obtain a new feature map, and further processing the new feature map and the feature map of the adversarial sample with a multi-scale feature extraction network to obtain feature maps of different scales.
The final result of a quality evaluation method should match the human visual system, so the constructed quality evaluation method should be as consistent as possible with human vision. In this embodiment, the evaluation method is further enhanced with the residual image. The residual image is not simply an average objective error map: each pixel value in the image is weighted so as to reflect the human visual perception system. In addition, the normalized logarithmic difference function is used to optimize the residual image, reducing zero values in the residual image and thereby reducing their influence on training convergence.
In this embodiment, step 2 specifically includes:
Step 2.1, based on the original image and the adversarial sample, a residual image is generated with a normalized logarithmic difference function, in which R(x, y) denotes a pixel of the residual image, I(x, y) denotes a pixel of the original image, A(x, y) denotes a pixel of the adversarial sample, and ε denotes an image normalization influence parameter; the smaller ε is, the smaller the influence of other factors (such as pixel values and distortion) on the image, and ε is set to 1 here.
Step 2.2, constructing a feature coding network, wherein the 1st convolutional layer of the feature coding network has 3 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and the 2nd, 3rd and 4th convolutional layers each have 32 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and performing feature pre-extraction on the original image, the adversarial sample and the residual image through the feature coding network to obtain the feature maps of the original image, the adversarial sample and the residual image.
To extract the image features, the feature coding network is used. A single 3 × 3 convolutional layer extracts features from the image but gives a small receptive field, which is detrimental to image quality evaluation. In order to enlarge the receptive field, obtain more image features and better match human perception of image quality, multiple convolutional layers are used in the present invention to obtain more image distortion information and more high-level and low-level semantic information. The constructed feature coding network contains 4 convolutional layers: the 1st convolutional layer has 3 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and the 2nd, 3rd and 4th convolutional layers have 32 input channels and 32 output channels, with the rest the same as the first layer. The feature maps of the original image, the adversarial sample and the residual image are thus obtained.
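A minimal PyTorch sketch of such a four-layer feature coding network, following the channel, kernel and stride settings described above; the class name, the ReLU activations and the padding of 1 (used so that the 224 × 224 spatial size in the later example is preserved) are assumptions not stated in the original:

```python
import torch.nn as nn

class FeatureEncoder(nn.Module):
    """Four 3x3 convolutions with stride 1: 3 -> 32 channels, then 32 -> 32 three times."""
    def __init__(self):
        super().__init__()
        layers, in_channels = [], 3
        for _ in range(4):
            layers += [nn.Conv2d(in_channels, 32, kernel_size=3, stride=1, padding=1),
                       nn.ReLU(inplace=True)]
            in_channels = 32
        self.encoder = nn.Sequential(*layers)

    def forward(self, x):
        # Applied separately to the original image, the adversarial sample and the residual image.
        return self.encoder(x)
```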
Step 2.3, concatenating the feature maps of the original image, the adversarial sample and the residual image with the concatenation function torch.cat() to obtain a new feature map, wherein the dim argument of torch.cat() denotes the dimension along which the concatenation is performed, and dim = 1 indicates concatenation along dimension 1 (here the channel dimension).
Step 2.4, performing dimensionality reduction on the new feature map several times.
Specifically, the new feature map Feature_map is obtained by concatenation, which makes it convenient to predict the quality score of the image by comparing image features. Dimensionality reduction is then performed: Feature_map is reduced twice with 1 × 1 convolutions. Assuming the concatenated feature map is 1 × 96 × 224 × 224, a first reduction is performed with a convolutional layer whose input channel is 96, output channel is 64, convolution kernel size is 1 and stride is 1, so that the feature map becomes 1 × 64 × 224 × 224. With the same operation, after the second reduction the obtained feature map is 1 × 32 × 224 × 224.
Step 2.5, inputting the reduced feature map obtained in step 2.4 and the feature map of the adversarial sample obtained in step 2.2 into a multi-scale feature extraction network to obtain features of different scales, namely the feature maps f_i and the adversarial sample feature maps a_i.
Using the feature coding network alone to predict image quality is not sufficient: although the receptive field is enlarged in that process, much redundant information still remains. In addition, in image quality evaluation, global image information and semantic information must be considered besides the receptive field. Therefore, in order to obtain more global information and semantic information, the advantages of the multi-scale feature extraction network are combined. The multi-scale feature extraction network simulates the hierarchical nature of the human visual system well and obtains more structural information. The reduced feature map and the adversarial sample feature map from the feature coding network are therefore put into the multi-scale feature extraction network to obtain features of different scales, namely the feature maps f_i and the adversarial sample feature maps a_i, where i denotes the scale; in this embodiment, i takes values in [1, 2, 3, 4].
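The patent does not detail the internal structure of the multi-scale feature extraction network. As one plausible minimal sketch (an assumption, not the patent's architecture), features at the four scales can be produced by progressively downsampling both feature maps, which is all that the per-scale comparison in step 3 requires:

```python
import torch.nn.functional as F

def multi_scale(feature_map, n_scales: int = 4):
    """Return [x_1, ..., x_n]: the feature map at progressively coarser scales."""
    scales = [feature_map]
    for _ in range(n_scales - 1):
        scales.append(F.avg_pool2d(scales[-1], kernel_size=2))   # halve the spatial resolution
    return scales

# f_scales = multi_scale(reduced)            # feature maps f_i of the fused feature map
# a_scales = multi_scale(adversarial_feats)  # feature maps a_i of the adversarial sample
```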
Step 3, measuring the feature maps of different scales with a structural similarity measurement method to obtain corresponding scores, and finally averaging the scores of all scales to obtain the final quality score.
The structural change of the image during adversarial sample generation is small, and the feature coding network and the multi-scale feature extraction network acquire increasingly detailed structural information. The image quality is therefore predicted with a structural similarity metric.
Step 3 specifically includes:
Step 3.1, based on the feature maps f_i and the adversarial sample feature maps a_i, measuring the feature maps of different scales with a structural similarity measurement method to obtain the corresponding scores, the calculation formula being:
Score_i = (2*cov(f_i, a_i) + C) / (var(f_i) + var(a_i) + C)
wherein Score_i denotes the score corresponding to scale i, var(f_i) and var(a_i) respectively denote the variances of the feature map f_i and of the adversarial sample feature map a_i at scale i, cov(f_i, a_i) denotes the covariance of the feature map f_i and the adversarial sample feature map a_i at scale i, and C is a small constant that prevents the denominator from being 0; in this embodiment, C is taken as 0.01;
Step 3.2, averaging the scores of all scales to obtain the final quality score, the calculation formula being:
Q = (Score_1 + Score_2 + ... + Score_n) / n
wherein Q denotes the final quality score and n denotes the total number of scales.
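A short sketch of the per-scale structural similarity score and the final average (assuming the variance/covariance form of the formula given above; the function names are illustrative):

```python
import torch

def scale_score(f_i: torch.Tensor, a_i: torch.Tensor, c: float = 0.01) -> torch.Tensor:
    """Structural similarity between the fused feature map and the adversarial feature map at one scale."""
    f, a = f_i.flatten(), a_i.flatten()
    var_f, var_a = f.var(unbiased=False), a.var(unbiased=False)
    cov_fa = ((f - f.mean()) * (a - a.mean())).mean()
    return (2 * cov_fa + c) / (var_f + var_a + c)

def quality_score(f_scales, a_scales) -> float:
    """Average the per-scale scores to obtain the final quality score Q."""
    scores = [scale_score(f, a) for f, a in zip(f_scales, a_scales)]
    return float(sum(scores) / len(scores))
```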
It is important to apply various evaluation indexes to measure the effectiveness of the algorithm. The Pearson linear correlation coefficient (PLCC), Spearman rank correlation coefficient (SRCC), Kendall rank correlation coefficient (KRCC) and root mean square error (RMSE) of different evaluation methods effectively verify the correlation between the subjective score and the objective score. The closer PLCC, SRCC and KRCC are to 1 and the closer RMSE is to 0, the better the linear relationship between the subjective score and the objective score. Table 1 compares the performance of the method of this embodiment with other existing image quality evaluation algorithms.
Table 1 (reproduced as an image in the original publication) lists the PLCC, SRCC, KRCC and RMSE of the method of this embodiment and of the other image quality evaluation algorithms compared against it.
As can be seen from Table 1, compared with other existing image quality evaluation algorithms, the PLCC, SRCC and KRCC of this embodiment are closer to 1 and its RMSE is closer to 0, which shows that the method of this embodiment clearly outperforms the other existing image quality evaluation algorithms.
In conclusion, according to the method for evaluating the image quality of adversarial samples provided by the invention, original images whose confidence is greater than the confidence threshold are screened out with classifiers to construct the adversarial sample dataset; the image features are pre-extracted with the feature coding network, which discards redundant information in the image and enlarges the receptive field; the hierarchical characteristics of the human visual perception system are simulated with multi-scale features, which enriches the semantic information of the image; and the structural similarity scores of the image are obtained with the structural similarity measurement method and averaged to obtain the final quality score of the adversarial sample. The method can accurately extract the structural information of the adversarial sample under perturbation, can calculate the quality of the adversarial sample more accurately through the structural similarity measurement method, is highly effective for calculating the image quality of adversarial samples, and has good extensibility for the constructed adversarial samples.
Referring to FIG. 3, another embodiment of the present invention provides an adversarial sample image quality evaluation system, including:
an image preprocessing module, used for screening out original images whose confidence is greater than a confidence threshold with classifiers to construct an adversarial sample dataset, attacking the dataset with multiple adversarial attack methods, and obtaining adversarial samples by adjusting the parameters of different attack methods;
a calculation and concatenation module, used for calculating a residual image of the adversarial sample, performing feature pre-extraction on the original image, the adversarial sample and the residual image with a feature coding network, concatenating the feature maps extracted from the original image, the adversarial sample and the residual image to obtain a new feature map, and further processing the new feature map and the feature map of the adversarial sample with a multi-scale feature extraction network to obtain feature maps of different scales;
and a score calculating module, used for measuring the feature maps of different scales with a structural similarity measurement method to obtain corresponding scores, and finally averaging the scores of all scales to obtain the final quality score.
In this embodiment, the image preprocessing module is specifically configured to perform the following steps:
Step 1.1, selecting the four classifiers VGG, ResNet, AlexNet and GoogLeNet to obtain the confidences of an input image I, denoted P1, P2, P3 and P4 respectively; when the confidences P1, P2, P3 and P4 of the input image I are greater than 90%, taking the input image I as an original image whose confidence is greater than the confidence threshold;
Step 1.2, attacking the adversarial sample dataset with multiple adversarial attack methods, wherein the multiple adversarial attack methods are FGSM, MI-FGSM, PI-FGSM, Gaussian Noise, PGD and ODS-PGD, and the initial Epsilon values of the attack methods are (0.01, 0.1, 0.2, 0.3, 0.5), (0.5, 5, 15, 35, 55), (0.01, 0.1, 0.2, 0.3, 0.5), (0.03, 0.10, 0.20, 0.30, 0.50).
In this embodiment, the calculation and concatenation module is specifically configured to execute the following steps:
Step 2.1, based on the original image and the adversarial sample, generating a residual image with a normalized logarithmic difference function, in which R(x, y) denotes a pixel of the residual image, I(x, y) denotes a pixel of the original image, A(x, y) denotes a pixel of the adversarial sample, and ε denotes an image normalization influence parameter;
Step 2.2, constructing a feature coding network, wherein the 1st convolutional layer of the feature coding network has 3 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and the 2nd, 3rd and 4th convolutional layers each have 32 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and performing feature pre-extraction on the original image, the adversarial sample and the residual image through the feature coding network to obtain the feature maps of the original image, the adversarial sample and the residual image;
Step 2.3, concatenating the feature maps of the original image, the adversarial sample and the residual image with the concatenation function torch.cat() to obtain a new feature map, wherein the dim argument of torch.cat() denotes the dimension along which the concatenation is performed, and dim = 1 indicates concatenation along dimension 1 (here the channel dimension);
Step 2.4, performing dimensionality reduction on the new feature map several times;
Step 2.5, inputting the reduced feature map obtained in step 2.4 and the feature map of the adversarial sample obtained in step 2.2 into a multi-scale feature extraction network to obtain features of different scales, namely the feature maps f_i and the adversarial sample feature maps a_i at each scale i.
In the present embodiment, ε is 1.
In this embodiment, the score calculating module is specifically configured to execute the following steps:
Step 3.1, based on the feature maps f_i and the adversarial sample feature maps a_i, measuring the feature maps of different scales with a structural similarity measurement method to obtain the corresponding scores, the calculation formula being:
Score_i = (2*cov(f_i, a_i) + C) / (var(f_i) + var(a_i) + C)
wherein Score_i denotes the score corresponding to scale i, var(f_i) and var(a_i) respectively denote the variances of the feature map f_i and of the adversarial sample feature map a_i at scale i, cov(f_i, a_i) denotes the covariance of the feature map f_i and the adversarial sample feature map a_i at scale i, and C is a small constant that prevents the denominator from being 0;
Step 3.2, averaging the scores of all scales to obtain the final quality score, the calculation formula being:
Q = (Score_1 + Score_2 + ... + Score_n) / n
wherein Q denotes the final quality score and n denotes the total number of scales.
In conclusion, according to the adversarial sample image quality evaluation system provided by the invention, original images whose confidence is greater than the confidence threshold are screened out with classifiers to construct the adversarial sample dataset; the image features are pre-extracted with the feature coding network, which discards redundant information in the image and enlarges the receptive field; the hierarchical characteristics of the human visual perception system are simulated with multi-scale features, which enriches the semantic information of the image; and the structural similarity scores of the image are obtained with the structural similarity measurement method and averaged to obtain the final quality score of the adversarial sample. The system can accurately extract the structural information of the adversarial sample under perturbation, can calculate the quality of the adversarial sample more accurately through the structural similarity measurement method, is highly effective for calculating the image quality of adversarial samples, and has good extensibility for the constructed adversarial samples.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (10)

1. A method for evaluating the image quality of an adversarial sample, characterized by comprising the following steps:
step 1, screening out original images whose confidence is greater than a confidence threshold with classifiers to construct an adversarial sample dataset, attacking the dataset with multiple adversarial attack methods, and obtaining adversarial samples by adjusting the parameters of different attack methods;
step 2, calculating a residual image of the adversarial sample, performing feature pre-extraction on the original image, the adversarial sample and the residual image with a feature coding network, concatenating the feature maps extracted from the original image, the adversarial sample and the residual image to obtain a new feature map, and further processing the new feature map and the feature map of the adversarial sample with a multi-scale feature extraction network to obtain feature maps of different scales;
and step 3, measuring the feature maps of different scales with a structural similarity measurement method to obtain corresponding scores, and finally averaging the scores of all scales to obtain the final quality score.
2. The method for evaluating the image quality of an adversarial sample according to claim 1, wherein step 1 specifically comprises:
step 1.1, selecting the four classifiers VGG, ResNet, AlexNet and GoogLeNet to obtain the confidences of an input image I, denoted P1, P2, P3 and P4 respectively; if at least three of P1, P2, P3 and P4 are greater than 90%, taking the input image I as an original image whose confidence is greater than the confidence threshold;
step 1.2, attacking the adversarial sample dataset with multiple adversarial attack methods, wherein the multiple adversarial attack methods are FGSM, MI-FGSM, PI-FGSM, Gaussian Noise, PGD and ODS-PGD, and the initial Epsilon values of the attack methods are (0.01, 0.1, 0.2, 0.3, 0.5), (0.5, 5, 15, 35, 55), (0.01, 0.1, 0.2, 0.3, 0.5), (0.03, 0.10, 0.20, 0.30, 0.50).
3. The method for evaluating the image quality of an adversarial sample according to claim 2, wherein step 2 specifically comprises:
step 2.1, based on the original image and the adversarial sample, generating a residual image with a normalized logarithmic difference function, in which R(x, y) denotes a pixel of the residual image, I(x, y) denotes a pixel of the original image, A(x, y) denotes a pixel of the adversarial sample, and ε denotes an image normalization influence parameter;
step 2.2, constructing a feature coding network, wherein the 1st convolutional layer of the feature coding network has 3 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and the 2nd, 3rd and 4th convolutional layers each have 32 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and performing feature pre-extraction on the original image, the adversarial sample and the residual image through the feature coding network to obtain the feature maps of the original image, the adversarial sample and the residual image;
step 2.3, concatenating the feature maps of the original image, the adversarial sample and the residual image with the concatenation function torch.cat() to obtain a new feature map, wherein the dim argument of torch.cat() denotes the dimension along which the concatenation is performed, and dim = 1 indicates concatenation along dimension 1 (here the channel dimension);
step 2.4, performing dimensionality reduction on the new feature map several times;
step 2.5, inputting the reduced feature map obtained in step 2.4 and the feature map of the adversarial sample obtained in step 2.2 into a multi-scale feature extraction network to obtain features of different scales, namely the feature maps f_i and the adversarial sample feature maps a_i at each scale i.
4. The adversarial sample image quality evaluation method of claim 3, wherein, in step 2.1, ε is 1.
5. The method for evaluating the image quality of an adversarial sample according to claim 3, wherein step 3 specifically comprises:
step 3.1, based on the feature maps f_i and the adversarial sample feature maps a_i, measuring the feature maps of different scales with a structural similarity measurement method to obtain the corresponding scores, the calculation formula being:
Score_i = (2*cov(f_i, a_i) + C) / (var(f_i) + var(a_i) + C)
wherein Score_i denotes the score corresponding to scale i, var(f_i) and var(a_i) respectively denote the variances of the feature map f_i and of the adversarial sample feature map a_i at scale i, cov(f_i, a_i) denotes the covariance of the feature map f_i and the adversarial sample feature map a_i at scale i, and C is a small constant that prevents the denominator from being 0;
step 3.2, averaging the scores of all scales to obtain the final quality score, the calculation formula being:
Q = (Score_1 + Score_2 + ... + Score_n) / n
wherein Q denotes the final quality score and n denotes the total number of scales.
6. An adversarial sample image quality evaluation system, comprising:
an image preprocessing module, used for screening out original images whose confidence is greater than a confidence threshold with classifiers to construct an adversarial sample dataset, attacking the dataset with multiple adversarial attack methods, and obtaining adversarial samples by adjusting the parameters of different attack methods;
a calculation and concatenation module, used for calculating a residual image of the adversarial sample, performing feature pre-extraction on the original image, the adversarial sample and the residual image with a feature coding network, concatenating the feature maps extracted from the original image, the adversarial sample and the residual image to obtain a new feature map, and further processing the new feature map and the feature map of the adversarial sample with a multi-scale feature extraction network to obtain feature maps of different scales;
and a score calculating module, used for measuring the feature maps of different scales with a structural similarity measurement method to obtain corresponding scores, and finally averaging the scores of all scales to obtain the final quality score.
7. The adversarial sample image quality evaluation system according to claim 6, wherein the image preprocessing module is specifically configured to perform the following steps:
step 1.1, selecting the four classifiers VGG, ResNet, AlexNet and GoogLeNet to obtain the confidences of an input image I, denoted P1, P2, P3 and P4 respectively; when the confidences P1, P2, P3 and P4 of the input image I are greater than 90%, taking the input image I as an original image whose confidence is greater than the confidence threshold;
step 1.2, attacking the adversarial sample dataset with multiple adversarial attack methods, wherein the multiple adversarial attack methods are FGSM, MI-FGSM, PI-FGSM, Gaussian Noise, PGD and ODS-PGD, and the initial Epsilon values of the attack methods are (0.01, 0.1, 0.2, 0.3, 0.5), (0.5, 5, 15, 35, 55), (0.01, 0.1, 0.2, 0.3, 0.5), (0.03, 0.10, 0.20, 0.30, 0.50).
8. The adversarial sample image quality evaluation system according to claim 7, wherein the calculation and concatenation module is specifically configured to perform the following steps:
step 2.1, based on the original image and the adversarial sample, generating a residual image with a normalized logarithmic difference function, in which R(x, y) denotes a pixel of the residual image, I(x, y) denotes a pixel of the original image, A(x, y) denotes a pixel of the adversarial sample, and ε denotes an image normalization influence parameter;
step 2.2, constructing a feature coding network, wherein the 1st convolutional layer of the feature coding network has 3 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and the 2nd, 3rd and 4th convolutional layers each have 32 input channels, 32 output channels, a convolution kernel size of 3 and a stride of 1, and performing feature pre-extraction on the original image, the adversarial sample and the residual image through the feature coding network to obtain the feature maps of the original image, the adversarial sample and the residual image;
step 2.3, concatenating the feature maps of the original image, the adversarial sample and the residual image with the concatenation function torch.cat() to obtain a new feature map, wherein the dim argument of torch.cat() denotes the dimension along which the concatenation is performed, and dim = 1 indicates concatenation along dimension 1 (here the channel dimension);
step 2.4, performing dimensionality reduction on the new feature map several times;
step 2.5, inputting the reduced feature map obtained in step 2.4 and the feature map of the adversarial sample obtained in step 2.2 into a multi-scale feature extraction network to obtain features of different scales, namely the feature maps f_i and the adversarial sample feature maps a_i at each scale i.
9. The adversarial sample image quality evaluation system of claim 8, wherein ε is 1.
10. The adversarial sample image quality evaluation system of claim 8, wherein the score calculating module is configured to perform the following steps:
step 3.1, based on the feature maps f_i and the adversarial sample feature maps a_i, measuring the feature maps of different scales with a structural similarity measurement method to obtain the corresponding scores, the calculation formula being:
Score_i = (2*cov(f_i, a_i) + C) / (var(f_i) + var(a_i) + C)
wherein Score_i denotes the score corresponding to scale i, var(f_i) and var(a_i) respectively denote the variances of the feature map f_i and of the adversarial sample feature map a_i at scale i, cov(f_i, a_i) denotes the covariance of the feature map f_i and the adversarial sample feature map a_i at scale i, and C is a small constant that prevents the denominator from being 0;
step 3.2, averaging the scores of all scales to obtain the final quality score, the calculation formula being:
Q = (Score_1 + Score_2 + ... + Score_n) / n
wherein Q denotes the final quality score and n denotes the total number of scales.
CN202310121665.7A 2023-02-16 2023-02-16 Method and system for evaluating image quality of countermeasure sample Active CN115880537B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310121665.7A CN115880537B (en) 2023-02-16 2023-02-16 Method and system for evaluating image quality of countermeasure sample

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310121665.7A CN115880537B (en) 2023-02-16 2023-02-16 Method and system for evaluating image quality of countermeasure sample

Publications (2)

Publication Number Publication Date
CN115880537A true CN115880537A (en) 2023-03-31
CN115880537B CN115880537B (en) 2023-05-09

Family

ID=85761236

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310121665.7A Active CN115880537B (en) 2023-02-16 2023-02-16 Method and system for evaluating image quality of countermeasure sample

Country Status (1)

Country Link
CN (1) CN115880537B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180293712A1 (en) * 2017-04-06 2018-10-11 Pixar Denoising monte carlo renderings using generative adversarial neural networks
AU2020100274A4 (en) * 2020-02-25 2020-03-26 Huang, Shuying DR A Multi-Scale Feature Fusion Network based on GANs for Haze Removal
CN112001847A (en) * 2020-08-28 2020-11-27 徐州工程学院 Method for generating high-quality image by relatively generating antagonistic super-resolution reconstruction model
CN113178255A (en) * 2021-05-18 2021-07-27 西安邮电大学 Anti-attack method of medical diagnosis model based on GAN
CN115205196A (en) * 2022-04-29 2022-10-18 天津大学 No-reference image quality evaluation method based on twin network and feature fusion
CN115019097A (en) * 2022-06-09 2022-09-06 浙江工商大学 Confrontation sample defense method based on image preprocessing

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
SHUCHAO DUAN 等: ""Multi-Scale Gradients Self-Attention Residual Learning for Face Photo-Sketch Transformation"", 《IEEE》 *
LIU Heng; WU Dexin; XU Jian: "Universal adversarial perturbation generation method based on generative adversarial networks", Netinfo Security
LIU Zunxiong; JIANG Zhonghui; REN Xingle: "Image super-resolution algorithm based on multi-scale generative adversarial networks", Science Technology and Engineering
HUANG ?; TAO Haijun; WANG Haifeng: "Low-illumination image enhancement method based on conditional generative adversarial networks", Journal of Image and Graphics

Also Published As

Publication number Publication date
CN115880537B (en) 2023-05-09


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant