CN113034463A - Quality evaluation method of multi-exposure X-ray fusion image - Google Patents
Quality evaluation method of multi-exposure X-ray fusion image
- Publication number
- CN113034463A (application CN202110304551.7A)
- Authority
- CN
- China
- Prior art keywords
- image
- gradient
- fusion
- exposure
- reference image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Abstract
The invention belongs to the technical field of image processing and specifically relates to a quality evaluation method for multi-exposure X-ray fusion images based on the contrast sensitivity function (CSF) and gradient magnitude similarity. The technical scheme is as follows: following the idea of image fusion, the multiple reference images are first fused into a new reference image; the gradient magnitude similarity between the new reference image and the test image is then calculated; finally, the similarity is weighted with a contrast sensitivity function to obtain the final quality evaluation index.
Description
Technical Field
The invention belongs to the technical field of image processing and specifically relates to a quality evaluation method for multi-exposure X-ray fusion images based on the CSF and gradient magnitude similarity.
Background
During X-ray imaging of structurally complex parts, large differences in effective thickness along the transillumination direction and the limited dynamic range of the imaging system make over-exposure and under-exposure likely under the conventional fixed-energy imaging mode, so the internal structure of the inspected object cannot be detected effectively. The object therefore needs to be imaged with multiple exposure voltages, and the images fused into a single image with richer detail and more moderate brightness, so that subsequent work such as flaw detection and quality judgment can proceed smoothly. Multi-exposure X-ray image fusion effectively extends the application range of the imaging system and fully exploits the hardware potential of the system.
For fusion of multi-exposure X-ray images:
first, in the literature [Wei Zhong, Chen Ping, Pan Jinxiao. Fusion of variable-energy X-ray images based on principal component analysis [J]. China Stereology and Image Analysis, 2013, 18(2): 103-108], variable-energy X-ray sequence images are fused by principal component analysis; the method expands the dynamic range of the image without requiring any prior knowledge;
secondly, in the document [Jiaotong WEI, Ping CHEN, Jinxiao PAN. Multi-voltage Digital Radiography Fusion Based on Gray Consistency [C]. Nondestructive Evaluation/Testing: New Technology & Application (FENDT), 2013 Far East Forum on, 2013: 209-211], a fusion method with gray-level consistency is used to fuse variable-energy X-ray image sequences, and the fusion coefficients have a clear physical meaning;
thirdly, in the document [Qi Yanjie, Wang Liming. Fusion of Multi-voltage Digital Radiography Based on NSCT [J]. Journal of X-ray Science and Technology, 2016], multi-scale decomposition is performed with the non-subsampled contourlet transform (NSCT), and false edges are detected and handled during the fusion process, avoiding artifacts in the fused image;
fourthly, in the document [Yanjie Qi, Zehui Yang, Lin Kang. Fusion of Multi-voltage Digital Radiography Based on Support Value Transform [C]. 2019 6th International Conference on Information Science and Control Engineering (ICISCE), 2019], the quality of the multi-exposure X-ray fusion image is improved by using the support value transform combined with principal component analysis.
These algorithms show different fusion performance, and each gives good fusion results under the conditions for which it is suited.
Quality evaluation of multi-exposure X-ray fusion images helps to improve and apply the multi-exposure X-ray image fusion technology. The purpose of a quality evaluation algorithm for multi-exposure X-ray fusion images is to describe the merit of a fusion algorithm indirectly by evaluating the quality of its output, and it belongs to the class of fused-image quality evaluation. The difficulty of fused-image quality evaluation is that no ideal fusion algorithm exists that could supply reference data for the various evaluation methods, so it is hard to assess a fused image with a full-reference method, and the fused image is difficult to relate back to the source images. Research on the fusion of multi-exposure X-ray images in particular is scarce, and no quality evaluation algorithm dedicated to multi-exposure X-ray fusion images exists at present.
For quality evaluation of visible-light multi-exposure fusion images:
firstly, in the document [Xydeas C S, Petrovic V S. Objective image fusion measure [C]. In: Proceedings of the International Society for Optical Engineering, 2000, 4051: 89-98], image edge information is extracted with a Sobel operator, and image quality is represented by how much edge information is preserved;
secondly, in the document [Wang P, Liu B. A novel image fusion metric based on multi-scale analysis [C]. In: Proceedings of IEEE 9th International Conference on Signal Processing, 2008: 965-968], edge-information retention at multiple scales is computed with the Haar wavelet transform, and the high- and low-frequency components are combined with assigned weights to obtain the final score. These two evaluation algorithms do not consider the visual characteristics of the human eye, so their results are not accurate enough. Because the human visual system is highly adapted to extracting structural information from images, and inspired by SSIM, a large number of SSIM-based image quality assessment algorithms have been proposed.
thirdly, in the document [Ma K, Zeng K, Wang Z. Perceptual quality assessment for multi-exposure image fusion [J]. IEEE Transactions on Image Processing, 2015, 24(11): 3345-3356], an MEF image quality evaluation algorithm based on structural similarity is proposed: a fused feature set is obtained from the contrast and structure components of SSIM and compared with the fused image to obtain a quality score;
fourthly, in the document [RAHMAN H, SOUNDARARAJAN R, BABU R V. Evaluating Multiexposure Fusion Using Image Information [J]. IEEE Signal Processing Letters, 2017, 24(11): 1671-1675], it is assumed that a true distortion-free reference is contained in the over-exposed/under-exposed source images; a quality map of the test image is generated for each source image at multiple scales and orientations, the true quality map at each spatial position in each scale and orientation is then identified on the source images with an information-theoretic method to obtain a fused quality map, and the final quality index is obtained by pooling over the sub-bands.
fifthly, in the document [XING L, CAI L, ZENG H, et al. A Multi-scale Contrast-based Image Quality Assessment Model for Multi-Exposure Image Fusion [J]. Signal Processing, 2018, 145: 233-240], a multi-scale image quality assessment model based on contrast similarity is proposed; the model measures the contrast distortion of the fused image by computing the contrast similarity and contrast saturation between the fused image and each source image, and the final quality score is obtained by similarity-based weighted fusion across multiple scales.
Disclosure of Invention
To solve the above technical problems in the prior art, the invention builds on quality evaluation methods for visible-light multi-exposure fusion images: the source reference images are first fused into a new single reference image, the gradient magnitude similarity is then compared, and the contrast sensitivity characteristic of the human eye is incorporated, yielding a quality evaluation method for multi-exposure X-ray fusion images based on the CSF and gradient magnitude similarity.
Preserving the structural integrity of the source images is a challenging task in multi-exposure image fusion. The image gradient carries much important information about the image and can capture local structural changes such as edge distribution and local contrast variation, so extracting the gradient magnitude as the image feature for quality evaluation works well.
Common mask operators for computing the gradient magnitude include the Sobel, Prewitt and Scharr operators. The Prewitt operator is an averaging filter and has some noise-suppression capability; the Sobel operator is a weighted filter whose weight assignment emphasises the distance between neighbouring pixels, giving stronger edge detection than Prewitt and less image blurring. The Scharr operator is similar to Sobel but performs better in the presence of small amounts of noise.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows: the quality evaluation method of the multi-exposure X-ray fusion image comprises the following specific steps:
The Sobel operator is selected as the convolution template, and the local gradient is obtained by convolving the image with the Sobel operator. The horizontal and vertical Sobel operators are denoted H_x and H_y respectively:

H_x = [-1 0 1; -2 0 2; -1 0 1],  H_y = [-1 -2 -1; 0 0 0; 1 2 1]   (1)

The gradient magnitude of an image I is then:

G(x, y) = √[ ((I⊗H_x)(x, y))² + ((I⊗H_y)(x, y))² ]   (2)
A characteristic of multi-exposure fusion is that there are several reference images, so the quality evaluation must first address how to handle the multiple reference images; following the idea of fusion, they are combined into a new reference image before the evaluation.

For reference images in the same sequence, let I_k (k = 1, 2, …, M) denote the k-th reference image, F the test image, and M the number of multi-exposure reference images in the sequence. The gradient magnitudes of the reference images and of the test image are then calculated as:

G_k(x, y) = √[ ((I_k⊗H_x)(x, y))² + ((I_k⊗H_y)(x, y))² ]   (3)

G_F(x, y) = √[ ((F⊗H_x)(x, y))² + ((F⊗H_y)(x, y))² ]   (4)

where ⊗ denotes the convolution operation, (x, y) is the pixel position, H_x is the horizontal template of the Sobel gradient operator, and H_y is the vertical template of the Sobel gradient operator.
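For illustration, a minimal Python/NumPy sketch of formulas (1)-(4) is given below; the function and variable names are illustrative only, and scipy.ndimage.convolve is assumed as a stand-in for the convolution operator.

```python
import numpy as np
from scipy.ndimage import convolve

# Sobel templates from formula (1)
H_x = np.array([[-1, 0, 1],
                [-2, 0, 2],
                [-1, 0, 1]], dtype=float)   # horizontal template
H_y = H_x.T                                  # vertical template

def gradient_magnitude(img):
    """Gradient magnitude via Sobel convolution, as in formulas (2)-(4)."""
    gx = convolve(np.asarray(img, dtype=float), H_x, mode='nearest')
    gy = convolve(np.asarray(img, dtype=float), H_y, mode='nearest')
    return np.sqrt(gx ** 2 + gy ** 2)

# G_k for each reference image I_k and G_F for the test image F:
# G_refs = [gradient_magnitude(I_k) for I_k in reference_images]
# G_F = gradient_magnitude(F)
```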
The image gradient characterizes changes in the edge structure of the image: the larger the value of G_k(x, y), the sharper the edge structure at that pixel and the stronger the contrast. A new reference image is formed by comparing the G_k(x, y) and taking the maximum value at each pixel, so that the edge structure of the new reference image is most prominent; the gradient of the new reference image r is:

G_r(x, y) = arg max{G_k(x, y)}   (5)

where k = 1, 2, …, M, and M is the number of multi-exposure reference images in the same sequence.
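Under the same assumptions as the sketch above, formula (5) amounts to a pixel-wise maximum over the reference gradient maps:

```python
import numpy as np

def new_reference_gradient(G_refs):
    """G_r(x, y): pixel-wise maximum of the reference gradient maps, formula (5)."""
    return np.maximum.reduce([np.asarray(G, dtype=float) for G in G_refs])
```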
Since the structural similarity (SSIM) index was proposed, most subsequent IQA algorithms have computed the similarity between features in a particular form, which can be expressed as (2ab + c)/(a² + b² + c), where a and b are the two physical quantities being compared and c is a constant.

This calculation has a strong masking effect: as the physical quantities grow larger, the perceived difference between them becomes smaller, i.e. their similarity increases, which agrees with the visual characteristics of the HVS. On this basis, the gradient magnitude similarity between the new reference image and the test image F is calculated in this form:

GM(x, y) = (2·G_r(x, y)·G_F(x, y) + T) / (G_r(x, y)² + G_F(x, y)² + T)   (6)

where T is a positive constant that maintains the stability of the algorithm;
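A one-line sketch of formula (6), again with illustrative names; the value of T shown in the comment is an assumption drawn from the experimental settings described later (T = (0.03L)²):

```python
import numpy as np

def gradient_similarity(G_r, G_F, T):
    """GM(x, y) = (2*G_r*G_F + T) / (G_r**2 + G_F**2 + T), formula (6)."""
    return (2.0 * G_r * G_F + T) / (G_r ** 2 + G_F ** 2 + T)

# e.g. T = (0.03 * 255) ** 2 for 8-bit data, (0.03 * 4095) ** 2 for 12-bit data
```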
Sixthly, the contrast sensitivity function reflects the sensitivity characteristics of the human eye in the frequency domain; its function model is established as:

A(f) = 2.6(0.192 + 0.114f)exp[-(0.114f)^1.1]   (7)

For an M × N image, the spatial frequency is calculated as:

f_x = √[ (1/(MN)) Σ_{i=1..M} Σ_{j=2..N} (I(i, j) - I(i, j-1))² ]   (8)

f_y = √[ (1/(MN)) Σ_{i=2..M} Σ_{j=1..N} (I(i, j) - I(i-1, j))² ]   (9)

f = √(f_x² + f_y²)   (10)

where the unit of spatial frequency is cycles/degree, f_x is the frequency in the horizontal direction, f_y is the frequency in the vertical direction, and I(i, j) is the pixel value of the selected image block.
Considering that the HVS attaches different importance to different regions of an image improves the accuracy of image quality evaluation, and the contrast sensitivity function (CSF) reflects the ability of the HVS to distinguish small differences.

With f the spatial frequency and H(f) the contrast sensitivity function (CSF), the modified contrast sensitivity function is:

H(f) = 2.6(0.0192 + 0.114kf)exp(-(0.114kf)^1.1)   (11)

where k is used to adjust the function.

The spatial frequency of an image is usually obtained by first computing the row frequency, then the column frequency, and finally combining them. Since spatial frequency and gradient are related, the invention uses the gradient as an approximation of the spatial frequency. Writing G_F(x, y) for the gradient of the test image, the CSF can be expressed as H(G_F(x, y)), which represents the sensitivity of the human eye to different positions in the image.
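A small sketch of formula (11) used as the pixel-wise weight map; the value k = 0.005 is taken from the experimental settings, and the gradient map G_F stands in for the spatial frequency as described above:

```python
import numpy as np

def csf_weight(G_F, k=0.005):
    """omega(x, y) = H(G_F(x, y)) with the modified CSF of formula (11)."""
    kf = k * np.asarray(G_F, dtype=float)
    return 2.6 * (0.0192 + 0.114 * kf) * np.exp(-(0.114 * kf) ** 1.1)
```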
Establishing an evaluation model:
1) all images are pre-filtered, using a circular filtering window of radius 7 as the template;

2) the gradient magnitude G_k(x, y) of each reference image is calculated with formula (3), the gradient magnitude G_F(x, y) of the test image with formula (4), and the gradient magnitude G_r(x, y) of the new reference image r with formula (5);

3) the gradient magnitude similarity GM(x, y) between the new reference image r and the test image F is calculated with formula (6);

4) the gradient G_F(x, y) of the test image is substituted into the CSF of formula (11) to obtain the weight map ω(x, y) = H(G_F(x, y)); the image quality evaluation index Q combining the CSF and the gradient magnitude similarity is then:

Q = Σ_{(x,y)∈Ω} GM(x, y)·ω(x, y) / Σ_{(x,y)∈Ω} ω(x, y)   (12)

where Ω is the whole image domain shared by the new reference image and the test image. The evaluation index satisfies symmetry and its value lies between 0 and 1: the larger the value, the better the quality of the test image; the smaller the value, the worse the quality.
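The following sketch ties steps 1)-4) together, reusing the helper functions from the sketches above; the uniform mean filter is only a stand-in for the circular radius-7 window, and the default T assumes 12-bit data as in the X-ray experiment:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def quality_index(reference_images, F, T=(0.03 * 4095) ** 2, k=0.005):
    """Q = sum(GM * omega) / sum(omega) over the image domain, formula (12)."""
    def pre(img):  # stand-in for the circular radius-7 mean filter
        return uniform_filter(np.asarray(img, dtype=float), size=15)
    G_refs = [gradient_magnitude(pre(I)) for I in reference_images]  # formula (3)
    G_F = gradient_magnitude(pre(F))                                 # formula (4)
    G_r = new_reference_gradient(G_refs)                             # formula (5)
    GM = gradient_similarity(G_r, G_F, T)                            # formula (6)
    omega = csf_weight(G_F, k)                                       # formula (11)
    return float(np.sum(GM * omega) / np.sum(omega))                 # formula (12)
```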
The evaluation method of the invention evaluates fusion results more reasonably and agrees better with the subjective perception of the human eye; it is a more reasonable and effective quality evaluation method for multi-exposure X-ray fusion images and can serve as a reference when choosing a multi-exposure X-ray image fusion algorithm and the corresponding evaluation index.
Drawings
FIG. 1 is an X-ray transillumination image of an electronic combination lock.
FIG. 2 shows the fused images of the four multi-exposure X-ray fusion methods, in which FIG. 2(a) is the PCA fused image, FIG. 2(b) the gray-level consistency fused image, FIG. 2(c) the NSCT fused image, and FIG. 2(d) the SVT fused image.

FIG. 3 shows partial enlargements of the images in FIG. 2: FIG. 3(a) is a partial enlargement of the upper portion of the PCA fused image, FIG. 3(b) of the gray-level consistency fused image, FIG. 3(c) of the NSCT fused image, and FIG. 3(d) of the SVT fused image.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects to be solved by the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, the quality evaluation method of the multi-exposure X-ray fusion image specifically includes the following steps:
The Sobel operator is selected as the convolution template, and the local gradient is obtained by convolving the image with the Sobel operator. The horizontal and vertical Sobel operators are denoted H_x and H_y respectively:

H_x = [-1 0 1; -2 0 2; -1 0 1],  H_y = [-1 -2 -1; 0 0 0; 1 2 1]   (1)

The gradient magnitude of an image I is then:

G(x, y) = √[ ((I⊗H_x)(x, y))² + ((I⊗H_y)(x, y))² ]   (2)
A characteristic of multi-exposure fusion is that there are several reference images, so the quality evaluation must first address how to handle the multiple reference images; following the idea of fusion, they are combined into a new reference image before the evaluation.

For reference images in the same sequence, let I_k (k = 1, 2, …, M) denote the k-th reference image, F the test image, and M the number of multi-exposure reference images in the sequence. The gradient magnitudes of the reference images and of the test image are then calculated as:

G_k(x, y) = √[ ((I_k⊗H_x)(x, y))² + ((I_k⊗H_y)(x, y))² ]   (3)

G_F(x, y) = √[ ((F⊗H_x)(x, y))² + ((F⊗H_y)(x, y))² ]   (4)

where ⊗ denotes the convolution operation, (x, y) is the pixel position, H_x is the horizontal template of the Sobel gradient operator, and H_y is the vertical template of the Sobel gradient operator.
The image gradient characterizes changes in the edge structure of the image: the larger the value of G_k(x, y), the sharper the edge structure at that pixel and the stronger the contrast. A new reference image is formed by comparing the G_k(x, y) and taking the maximum value at each pixel, so that the edge structure of the new reference image is most prominent; the gradient of the new reference image r is:

G_r(x, y) = arg max{G_k(x, y)}   (5)

where k = 1, 2, …, M, and M is the number of multi-exposure reference images in the same sequence.
Since the structural similarity (SSIM) index was proposed, most subsequent IQA algorithms have computed the similarity between features in a particular form, which can be expressed as (2ab + c)/(a² + b² + c), where a and b are the two physical quantities being compared and c is a constant.

This calculation has a strong masking effect: as the physical quantities grow larger, the perceived difference between them becomes smaller, i.e. their similarity increases, which agrees with the visual characteristics of the HVS. On this basis, the gradient magnitude similarity between the new reference image and the test image F is calculated in this form:

GM(x, y) = (2·G_r(x, y)·G_F(x, y) + T) / (G_r(x, y)² + G_F(x, y)² + T)   (6)

where T is a positive constant that maintains the stability of the algorithm;
Sixthly, the contrast sensitivity function reflects the sensitivity characteristics of the human eye in the frequency domain; its function model is established as:

A(f) = 2.6(0.192 + 0.114f)exp[-(0.114f)^1.1]   (7)

For an M × N image, the spatial frequency is calculated as:

f_x = √[ (1/(MN)) Σ_{i=1..M} Σ_{j=2..N} (I(i, j) - I(i, j-1))² ]   (8)

f_y = √[ (1/(MN)) Σ_{i=2..M} Σ_{j=1..N} (I(i, j) - I(i-1, j))² ]   (9)

f = √(f_x² + f_y²)   (10)

where the unit of spatial frequency is cycles/degree, f_x is the frequency in the horizontal direction, f_y is the frequency in the vertical direction, and I(i, j) is the pixel value of the selected image block.
Considering that the HVS attaches different importance to different regions of an image improves the accuracy of image quality evaluation, and the contrast sensitivity function (CSF) reflects the ability of the HVS to distinguish small differences.

With f the spatial frequency and H(f) the contrast sensitivity function (CSF), the modified contrast sensitivity function is:

H(f) = 2.6(0.0192 + 0.114kf)exp(-(0.114kf)^1.1)   (11)

where k is used to adjust the function.

The spatial frequency of an image is usually obtained by first computing the row frequency, then the column frequency, and finally combining them. Since spatial frequency and gradient are related, the invention uses the gradient as an approximation of the spatial frequency. Writing G_F(x, y) for the gradient of the test image, the CSF can be expressed as H(G_F(x, y)), which represents the sensitivity of the human eye to different positions in the image.
Establishing an evaluation model:
1) all images are pre-filtered, using a circular filtering window of radius 7 as the template;

2) the gradient magnitude G_k(x, y) of each reference image is calculated with formula (3), the gradient magnitude G_F(x, y) of the test image with formula (4), and the gradient magnitude G_r(x, y) of the new reference image r with formula (5);

3) the gradient magnitude similarity GM(x, y) between the new reference image r and the test image F is calculated with formula (6);

4) the gradient G_F(x, y) of the test image is substituted into the CSF of formula (11) to obtain the weight map ω(x, y) = H(G_F(x, y)); the image quality evaluation index Q combining the CSF and the gradient magnitude similarity is then:

Q = Σ_{(x,y)∈Ω} GM(x, y)·ω(x, y) / Σ_{(x,y)∈Ω} ω(x, y)   (12)

where Ω is the whole image domain shared by the new reference image and the test image. The evaluation index satisfies symmetry and its value lies between 0 and 1: the larger the value, the better the quality of the test image; the smaller the value, the worse the quality.
The evaluation method of the invention evaluates fusion results more reasonably and agrees better with the subjective perception of the human eye; it is a more reasonable and effective quality evaluation method for multi-exposure X-ray fusion images and can serve as a reference when choosing a multi-exposure X-ray image fusion algorithm and the corresponding evaluation index.
However an image is processed or transmitted, it is ultimately viewed by people, and judging image quality by direct human observation is subjective quality evaluation. Subjective evaluation is the most direct and accurate, so comparing the performance of different objective evaluation algorithms amounts to comparing how closely their results agree with the subjective results.
Because no quality evaluation database dedicated to multi-exposure X-ray fusion images exists, the experiments are divided into two parts: the first uses the visible-light multi-exposure fusion database MEF Database, the best-known publicly available database dedicated to multi-exposure fusion images; the second is an evaluation experiment on multi-exposure X-ray fusion images.
MEF Database experiment
The MEF Database comprises 17 scenes: for each scene there is a multi-exposure source sequence and 8 fused images generated by MEF algorithms. Each of the 17 source reference sequences contains at least 3 reference images, and the exposure levels of the reference images include under-exposure, normal exposure and over-exposure.
First, the objective evaluation values of all images in the database are calculated with the different methods; nonlinear fitting is then carried out between the objective values and the corresponding DMOS provided by the database, using the most common regression function (a five-parameter logistic function) as the fitting function:

Q_i = β_1·(1/2 - 1/(1 + exp(β_2·(x_i - β_3)))) + β_4·x_i + β_5

where x_i is the i-th original objective prediction score, β_j (j = 1, 2, 3, 4, 5) are the fitting parameters, and Q_i is the objective score after the nonlinear mapping.
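A hedged sketch of this fitting step with SciPy; the function names and the initial guess are illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic5(x, b1, b2, b3, b4, b5):
    """Five-parameter logistic regression function."""
    return b1 * (0.5 - 1.0 / (1.0 + np.exp(b2 * (x - b3)))) + b4 * x + b5

def map_objective_scores(objective_scores, dmos):
    x = np.asarray(objective_scores, dtype=float)
    y = np.asarray(dmos, dtype=float)
    p0 = [np.max(y), 1.0, np.mean(x), 1.0, 0.0]            # rough initial guess
    params, _ = curve_fit(logistic5, x, y, p0=p0, maxfev=20000)
    return logistic5(x, *params)                            # mapped scores Q_i
```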
To verify the effectiveness of the proposed algorithm as a multi-exposure fusion image quality evaluation method, it is compared with 5 algorithms, and the Pearson linear correlation coefficient (PLCC) and the Spearman rank-order correlation coefficient (SROCC) are selected to measure the performance of each evaluation method. The PLCC reflects the correlation between a quality evaluation method and the subjective evaluation, and the SROCC reflects the prediction monotonicity of a quality evaluation method; the larger these two parameters, the better the prediction model.
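The two criteria can be sketched directly with scipy.stats (illustrative names again):

```python
from scipy.stats import pearsonr, spearmanr

def plcc_srocc(mapped_scores, dmos):
    plcc, _ = pearsonr(mapped_scores, dmos)    # prediction accuracy
    srocc, _ = spearmanr(mapped_scores, dmos)  # prediction monotonicity
    return plcc, srocc
```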
Table 1 compares the SROCC and PLCC of the other algorithms and of the present algorithm; the parameters are T = (0.03L)², where L = 255, and k = 0.005.
TABLE 1 comparison of SROCC and PLCC Performance
In Table 1, the test results for each sequence are shown in bold. It can be seen that, on the MEF image database, the SROCC of the algorithm is about 0.8914 and the PLCC about 0.9287; the PLCC is the highest among the existing advanced models, which indicates that the algorithm is more consistent with subjective perception. Furthermore, only two sequences, "Candle" and "Landscape", have SROCC below 0.8, and only one sequence, "Landscape", has PLCC below 0.8. In terms of computational complexity, taking "Tower" as an example, the method implemented in MATLAB 2017b on an Intel Core i7-8550U CPU @ 1.8 GHz with 8 GB RAM processes an image in 0.39 s, so the complexity is low. The proposed algorithm therefore shows good evaluation performance on a multi-exposure fusion image database.
Evaluation experiment for multi-exposure X-ray fusion images:
To verify the effectiveness of the proposed quality evaluation method on multi-exposure X-ray fusion images, a group of multi-exposure X-ray fusion images is analysed; in this verification the parameters of the algorithm are T = (0.03L)², where L = 4095, and k = 0.005.
A 450 kV GE X-ray machine and a 12-bit PaxScan 2520 detector are selected. An electronic combination lock is transilluminated with a fixed tube current of 1.5 mA while the tube voltage is increased from low to high; the transillumination voltages are 50 kV, 70 kV, 80 kV, 90 kV, 100 kV, 110 kV, 120 kV, 130 kV and 140 kV. The X-ray transillumination sequence of the electronic combination lock is shown in FIG. 1. As the figure shows, because the materials of the lock are complex, no single X-ray transillumination image can show its complete structure. At a transillumination voltage of 50 kV, the external wires, plastic shell and socket are clearly visible, but the voltage is too low to penetrate the steel parts effectively, so the internal structure is hardly visible. As the transillumination voltage rises, the internal structure gradually appears, but the external wires, plastic shell and similar structures saturate as their gray values reach the upper limit of the detector's dynamic range; in addition, during the gradual rise of the voltage, false edges that do not exist in the original lock appear in the X-ray transillumination images. If not handled properly, a large number of false edges will appear in the fusion result.
Four methods are selected to fuse the multi-exposure X-ray images, namely the PCA method, the gray-level consistency method, the NSCT method and the SVT method; the fusion results are shown in FIG. 2.
The final fusion results vary with the fusion method. FIG. 3 shows partial enlargements of the fused images in FIG. 2 for a clearer view of the internal structure of the electronic combination lock. As can be seen from FIG. 2 and FIG. 3, the NSCT and SVT methods give the best fusion results: the external wires, plastic shell and socket are completely visible and the internal structure is clearer; because of its multi-scale, multi-resolution property, the NSCT method extracts the detail features of the image better and fuses the details of the internal structure best. In the PCA fusion result, the external wires and socket of the lock are blurred, and the internal structure is also blurrier than with the NSCT and SVT methods; the gray-level consistency method shows the external wires and socket clearly, but the internal steel structure remains blurred when enlarged, so its fusion effect is not ideal.
A good multi-exposure X-ray image fusion algorithm should follow two principles: 1) the detail information in the original images should be preserved to the greatest extent; 2) no spurious structures or artifacts absent from the original images should be introduced. When analysing the image fusion results, the invention judges the quality of the fused images according to these two principles.
For comparison with the multi-exposure X-ray fusion image evaluation method provided by the invention, three common evaluation indexes are selected: entropy (ENTRO), spatial frequency (SF) and standard deviation (STD). The evaluation results obtained with these three comparison indexes and with the proposed quality evaluation algorithm are shown in Table 2; a computational sketch of the three comparison indexes is given after the table.
| | PCA method | Gray-level consistency method | NSCT method | SVT method |
|---|---|---|---|---|
| Standard deviation | 88.4618 | 92.1832 | 69.477 | 59.447 |
| Entropy | 3.5576 | 5.2160 | 4.6918 | 4.6429 |
| Spatial frequency | 9.2136 | 10.0409 | 8.4816 | 7.9631 |
| Proposed model | 0.8612 | 0.8419 | 0.8959 | 0.8841 |

TABLE 2 Evaluation results of the fused images obtained by the different fusion methods
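The three comparison indexes can be sketched as follows, assuming a grayscale fused image and 256 histogram bins for the entropy:

```python
import numpy as np

def standard_deviation(img):
    return float(np.std(np.asarray(img, dtype=float)))

def entropy(img, levels=256):
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def spatial_frequency(img):
    img = np.asarray(img, dtype=float)
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))
```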
When the fusion quality of an image improves, its standard deviation, entropy and spatial frequency should increase. As Table 2 shows, standard deviation, entropy and spatial frequency cannot effectively reflect the fusion quality of the images, whereas the proposed evaluation method gives an assessment that agrees better with subjective perception: the NSCT method gives the best fusion result and the gray-level consistency method the worst.
The above experiments show that different fusion algorithms have different effects on multi-exposure X-ray images, and the NSCT method, with its multi-scale, multi-directional decomposition, obtains a better fusion result. Common image evaluation indexes such as standard deviation, entropy and spatial frequency each consider only one property of the fused image and cannot evaluate its quality comprehensively. The multi-exposure X-ray fusion image quality evaluation method provided by the invention takes the human visual system into account, so the evaluation result it gives agrees better with subjective human perception.
The invention provides a multi-exposure fusion image quality evaluation method combining the contrast sensitivity function and gradient magnitude similarity. In experiments on the MEF database the SROCC is about 0.8914 and the PLCC about 0.9287, outperforming existing MEF image quality evaluation methods. In the multi-exposure X-ray fusion image evaluation experiment, compared with common image evaluation indexes such as standard deviation, entropy and spatial frequency, the proposed method evaluates the fusion results more reasonably and agrees better with subjective human perception; it is a more reasonable and effective quality evaluation method for multi-exposure X-ray fusion images and can serve as a reference for choosing a multi-exposure X-ray image fusion algorithm and the corresponding evaluation index.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principles of the present invention are intended to be included therein.
Claims (2)
1. The quality evaluation method of the multi-exposure X-ray fusion image is characterized by comprising the following specific steps of:
selecting the Sobel operator as the convolution template, obtaining the local gradient by convolving the image with the Sobel operator, and denoting the horizontal and vertical Sobel operators H_x and H_y respectively:

H_x = [-1 0 1; -2 0 2; -1 0 1],  H_y = [-1 -2 -1; 0 0 0; 1 2 1]   (1)

the gradient magnitude of an image I then being:

G(x, y) = √[ ((I⊗H_x)(x, y))² + ((I⊗H_y)(x, y))² ]   (2)
fusing the reference images, following the idea of image fusion, into a new reference image, and then performing the quality evaluation;
thirdly, for reference images in the same sequence, letting I_k (k = 1, 2, …, M) denote the k-th reference image, F the test image, and M the number of multi-exposure reference images in the sequence, the gradient magnitudes of the reference images and of the test image are calculated as:

G_k(x, y) = √[ ((I_k⊗H_x)(x, y))² + ((I_k⊗H_y)(x, y))² ]   (3)

G_F(x, y) = √[ ((F⊗H_x)(x, y))² + ((F⊗H_y)(x, y))² ]   (4)

where ⊗ denotes the convolution operation, (x, y) is the pixel position, H_x is the horizontal template of the Sobel gradient operator, and H_y is the vertical template of the Sobel gradient operator;
fourthly, comparing the G_k(x, y) and taking the maximum value at each pixel to form a new reference image, so that the edge structure of the new reference image is most prominent; the gradient of the new reference image r is:

G_r(x, y) = arg max{G_k(x, y)}   (5)

where k = 1, 2, …, M, and M is the number of multi-exposure reference images in the same sequence;
fifthly, calculating the gradient magnitude similarity between the new reference image and the test image F:

GM(x, y) = (2·G_r(x, y)·G_F(x, y) + T) / (G_r(x, y)² + G_F(x, y)² + T)   (6)

where T is a positive constant that maintains the stability of the algorithm;
sixthly, establishing the function model of the contrast sensitivity function as:

A(f) = 2.6(0.192 + 0.114f)exp[-(0.114f)^1.1]   (7)

for an M × N image, the spatial frequency being calculated as:

f_x = √[ (1/(MN)) Σ_{i=1..M} Σ_{j=2..N} (I(i, j) - I(i, j-1))² ]   (8)

f_y = √[ (1/(MN)) Σ_{i=2..M} Σ_{j=1..N} (I(i, j) - I(i-1, j))² ]   (9)

f = √(f_x² + f_y²)   (10)

where f_x is the frequency in the horizontal direction, f_y is the frequency in the vertical direction, and I(i, j) is the pixel value of the selected image block;
with f the spatial frequency and H(f) the contrast sensitivity function, the corrected contrast sensitivity function is:

H(f) = 2.6(0.0192 + 0.114kf)exp(-(0.114kf)^1.1)   (11)
wherein k is used to adjust the function;
seventhly, establishing an evaluation model:
1) all images are pre-filtered, using a circular filtering window of radius 7 as the template;
2) the gradient magnitude G_k(x, y) of each reference image is calculated with formula (3), the gradient magnitude G_F(x, y) of the test image with formula (4), and the gradient magnitude G_r(x, y) of the new reference image r with formula (5);

3) the gradient magnitude similarity GM(x, y) between the new reference image r and the test image F is calculated with formula (6);

4) the gradient G_F(x, y) of the test image is substituted into the CSF function of formula (11) to obtain the weight map ω(x, y) = H(G_F(x, y)); the image quality evaluation index Q combining the CSF and the gradient magnitude similarity is:

Q = Σ_{(x,y)∈Ω} GM(x, y)·ω(x, y) / Σ_{(x,y)∈Ω} ω(x, y)   (12)
where Ω represents the entire image domain for the new reference image and the test image.
2. The method of claim 1, wherein the image quality evaluation index satisfies symmetry, and has a value range of 0 to 1, wherein the larger the value, the better the quality of the test image, and the smaller the value, the worse the quality of the test image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110304551.7A CN113034463A (en) | 2021-03-22 | 2021-03-22 | Quality evaluation method of multi-exposure X-ray fusion image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110304551.7A CN113034463A (en) | 2021-03-22 | 2021-03-22 | Quality evaluation method of multi-exposure X-ray fusion image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113034463A true CN113034463A (en) | 2021-06-25 |
Family
ID=76472545
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110304551.7A Pending CN113034463A (en) | 2021-03-22 | 2021-03-22 | Quality evaluation method of multi-exposure X-ray fusion image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113034463A (en) |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109345502A (en) * | 2018-08-06 | 2019-02-15 | 浙江大学 | A kind of stereo image quality evaluation method based on disparity map stereochemical structure information extraction |
Non-Patent Citations (4)
Title |
---|
- QI, YANJIE et al.: "Multi-exposure X-ray image fusion quality evaluation based on CSF and gradient amplitude similarity", Journal of X-Ray Science and Technology *
- LIU, YIMING et al.: "Infrared image quality evaluation based on CSF and gradient similarity", Laser & Infrared *
- LU, YANFEI: "Research on objective image quality evaluation methods based on local visual features", China Doctoral Dissertations Full-text Database, Information Science & Technology *
- XING, LU: "Research on quality evaluation methods for multi-exposure fusion images", China Master's Theses Full-text Database, Information Science & Technology *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114463318A (en) * | 2022-02-14 | 2022-05-10 | 宁波大学科学技术学院 | Visual quality evaluation method for multi-exposure fusion image |
CN114463318B (en) * | 2022-02-14 | 2022-10-14 | 宁波大学科学技术学院 | Visual quality evaluation method for multi-exposure fusion image |
CN114332089A (en) * | 2022-03-15 | 2022-04-12 | 武汉市鑫山河塑业有限公司 | Method, device and system for controlling production quality of plastic sheath based on image processing |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20210625 |