CN104408707B - Rapid digital imaging fuzzy identification and restored image quality assessment method


Info

Publication number
CN104408707B
CN104408707B
Authority
CN
China
Legal status: Active
Application number
CN201410592425.6A
Other languages
Chinese (zh)
Other versions
CN104408707A (en)
Inventor
遆晓光
尹磊
Current Assignee
Hit Robot Group Co., Ltd.
Original Assignee
Harbin Institute of Technology
Priority date
Filing date
Publication date
Application filed by Harbin Institute of Technology
Priority to CN201410592425.6A
Publication of CN104408707A
Application granted
Publication of CN104408707B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/73: Deblurring; Sharpening
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30168: Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a rapid digital imaging blur identification and restored image quality assessment method, and relates to blur identification and restored-image quality assessment. It addresses the poor real-time performance of existing methods, their inability to distinguish blurred images from sharp images or to assess an image restoration result effectively, and the unstable output they produce after blur restoration. The method comprises the following steps: 1, obtaining a gradient image G(i, j); 2, calculating the average gray gradient value; 3, obtaining the statistical distribution of the gradient image; 4, applying a threshold decision to BIM; 5, obtaining a restored image; 6, generating reference images F1 and D1; 7, calculating the brightness, contrast and structural similarities between the reference images and the images to be assessed; 8, calculating g(F(i, j), F1) and g(D(i, j), D1); 9, obtaining the no-reference image assessment indices; 10, deciding on the no-reference image assessment indices; and so on. The method is applied in the field of blur identification and restored image quality assessment.

Description

Rapid digital imaging fuzzy identification and restored image quality evaluation method
Technical Field
The invention relates to the field of image quality evaluation, in particular to the field of fuzzy identification and restored image quality evaluation.
Background
When a digital camera captures an image, relative motion between the lens and the scene, or lens defocus, blurs the acquired image or video: contrast drops, edge and internal detail information is weakened, and image quality suffers. Beyond the degraded visual impression, a digital image processing system then cannot accurately detect regions of interest in the image, which seriously hampers analysis and understanding of the acquired image and video information. A digital blurred-image restoration system can improve image quality to some extent and recover part of the edge and internal detail information. In many cases, however, because prior information about the image is lacking and the restoration algorithm is imperfect, restoration may introduce artifacts such as ringing, so that the digitally restored image is of even lower quality than the unprocessed one. A practical digital image restoration system therefore needs two key processing links around the restoration step: first, identifying whether the acquired image is actually blurred, and second, evaluating image quality after the blurred image has been restored, so that the restoration stage outputs an image or video of better quality. Whether these two processing links work effectively largely determines whether a digital imaging blur restoration method can be applied in a real system, and they have important application value in civil fields such as intelligent video surveillance and intelligent traffic management, and in military applications such as imaging reconnaissance, space optical remote sensing and missile imaging guidance.
In the existing digital image blur restoration system and the image blur restoration method under study, there is almost no blur identification step, and the image is usually restored directly, which has the following two problems:
1) In real-time video processing, the per-frame processing time is limited by the frame rate. A typical civil frame rate is 20-30 fps, i.e. 30-50 ms per frame, while military systems run at 50-60 fps, i.e. 15-20 ms per frame. Most existing restoration algorithms, however, are iterative and far from real time; even a well-optimized restoration algorithm needs roughly 40 ms per frame of video at these sizes. In practice, image blur restoration is usually only a preprocessing step, and tasks such as target detection, recognition and tracking must also fit into each frame's time budget. With blur identification, the time that would have been spent restoring sharp frames becomes available for these subsequent tasks, which helps real-time video processing (without having to speed up single-frame restoration itself); without it, indiscriminate restoration wastes a great deal of computing resources and the real-time requirement is hard to meet.
2) In actual video processing, not every frame is blurred. Applying blur restoration to an already sharp frame introduces artifacts such as ringing, so the processed frame ends up with lower quality than the unprocessed one, which seriously undermines the effectiveness and reliability of the blur restoration system.
After blur restoration, the restored image must be evaluated to verify that the restoration algorithm actually improved quality. Since no sharp image is available as a reference, only no-reference image quality assessment is possible. Existing no-reference methods fall into two categories. The first simply computes gradient information of the image, such as gradient magnitude, edge width or edge sharpness; because factors such as brightness and contrast are not taken into account, the evaluation is limited, and in particular the ringing effect in restored images is hard to recognize. The second relies on learning, training on a large number of samples to produce an evaluation; its performance depends on sample size, with too few samples the model under-fits, and with demanding classification accuracy during training it over-fits. In either case the image evaluation is inaccurate, the output after blur restoration is unstable, and the method cannot be used in a real system.
Chinese invention CN101996406A proposes a structural-sharpness index for no-reference quality assessment. It is an adaptation of the structural similarity index, a full-reference quality metric that measures the similarity of two images in terms of brightness, contrast and structure; if one of the two images is sharp, the quality of the other can be inferred. However, that index does not consider gradient information and cannot be used to assess blurred images. The method characterizes edge effects by computing the structural similarity of the gradient images, but since all three of its terms are computed on the gradient images, they reflect the similarity of the gradient images rather than of the original images, so the similarity of the original images is not well evaluated and an objective assessment cannot be obtained.
A typical no-reference image quality index is the average gray gradient GMG (Gray Mean Gradient), which takes the mean gradient value of a gray image as the measure of image quality. The index is one-dimensional: it considers neither the brightness and contrast of the image nor the correlation between pixels. Moreover, when artifacts such as ringing appear, the average gradient of the image increases even though the image quality has actually degraded, so no objective evaluation is obtained; GMG cannot handle this situation, which arises frequently in blur restoration.
Disclosure of Invention
The invention aims to solve the problems that existing methods have poor real-time performance, cannot effectively distinguish blurred images from sharp images, cannot effectively evaluate image restoration results, and produce unstable output after blur restoration, and provides a rapid digital imaging blur identification and restored image quality assessment method.
The above-mentioned invention purpose is realized through the following technical scheme:
Step one: input a grayscale blurred image F(i, j), obtain its size M × N, and calculate the horizontal gradient image G_x(i, j) and the vertical gradient image G_y(i, j) of F(i, j), thereby obtaining the gradient image G(i, j); where M is the width of the grayscale blurred image F(i, j) and N is its height;
step two: calculating an average gray gradient value GMG of the gray fuzzy image F (i, j);
step three: calculating a histogram h of the gradient image G (i, j), and traversing the histogram h to obtain a non-zero gray level number NGN of the gradient image;
Step four: multiply the GMG and NGN obtained in steps two and three to obtain the comprehensive blur identification index BIM; apply a threshold decision to the resulting BIM: if BIM is greater than a preset threshold T, the image is judged to be sharp and is output directly; if BIM is less than the preset threshold T, F(i, j) is passed to step five; the threshold T is obtained by statistical analysis of sharp and blurred images;
step five: carrying out restoration operation on the gray fuzzy image F (i, j) by adopting a typical image blind restoration algorithm to obtain a restored image D (i, j);
Step six: blur the grayscale blurred image F(i, j) and the restored image D(i, j) a second time with a Gaussian blur kernel of size m × m and variance σ², generating the reference image F1 of F(i, j) and the reference image D1 of D(i, j); the secondary blurring uses Gaussian blurring, and m is the blur kernel size;
step seven: calculating the brightness similarity, the contrast similarity and the structure similarity of the reference image and the image to be evaluated according to F (i, j), F1, D (i, j) and D1; wherein, the images to be evaluated are gray fuzzy images F (i, j) and restored images D (i, j); reference pictures are F1 and D1; the brightness similarity of the reference image and the image to be evaluated comprises l (F (i, j), F1) and l (D (i, j), D1); the contrast similarity c (F (i, j), F1) and c (D (i, j), D1) of the reference image and the image to be evaluated; structural similarity s (F (i, j), F1) and s (D (i, j), D1) of the reference image and the image to be evaluated; l (F (i, j), F1) is the luminance similarity of the two images F (i, j) and F1, l (D (i, j), D1) is the luminance similarity of the two images D (i, j) and D1, and c (F (i, j), F1) is the contrast similarity of the two images F (i, j) and F1; c (D (i, j), D1) is the contrast similarity of the two images D (i, j) and D1; s (F (i, j), F1) is the structural similarity of the two images F (i, j) and F1; s (D (i, j), D1) is the structural similarity of the two images D (i, j) and D1;
Step eight: calculate the gradient images g_F, g_F1, g_D, g_D1 of the reference images and the images to be evaluated, and use them to calculate the gradient similarity g(F(i, j), F1) of the two images F(i, j) and F1 and the gradient similarity g(D(i, j), D1) of the two images D(i, j) and D1; where g_F is the gradient image of F(i, j), g_F1 the gradient image of F1, g_D the gradient image of D(i, j), and g_D1 the gradient image of D1; g(F(i, j), F1) is the gradient similarity of the two images F(i, j) and F1; g(D(i, j), D1) is the gradient similarity of the two images D(i, j) and D1;
Step nine: calculate the improved structural similarity index ISSIM from the brightness, contrast, structural and gradient similarities of the reference images and the images to be evaluated; partition the images into blocks so that M1 block-wise ISSIM indices are obtained, take the mean of the M1 ISSIM indices to obtain MISSIM, and calculate the no-reference image evaluation indices INRSS1 and INRSS2 from MISSIM; the obtained MISSIM comprises MISSIM(F(i, j), F1) and MISSIM(D(i, j), D1), and the improved structural similarity index ISSIM comprises ISSIM(F(i, j), F1) for the two images F(i, j) and F1 and ISSIM(D(i, j), D1) for the two images D(i, j) and D1;
The improved structural similarity index ISSIM is calculated as:
ISSIM(F(i, j), F1) = [l(F(i, j), F1)]^α · [c(F(i, j), F1)]^β · [s(F(i, j), F1)]^γ · [g(F(i, j), F1)]^λ
ISSIM(D(i, j), D1) = [l(D(i, j), D1)]^α · [c(D(i, j), D1)]^β · [s(D(i, j), D1)]^γ · [g(D(i, j), D1)]^λ
where α, β, γ and λ are the weights of the respective terms;
Step ten: decide according to the no-reference image evaluation indices INRSS1 and INRSS2 obtained in step nine (a sketch of this decision flow is given after this step list). INRSS1 > INRSS2 indicates that the quality has not improved, and the blur restoration system outputs the grayscale blurred image F(i, j); otherwise the image quality has improved and the restored image D(i, j) is output; where INRSS1 = 1 − MISSIM(F(i, j), F1) and INRSS2 = 1 − MISSIM(D(i, j), D1). This completes the rapid digital imaging blur identification and restored image quality assessment method.
Effects of the invention
The invention is an improvement aiming at the prior blurred image restoration technology, can be used for the blurred identification of a single image and the quality evaluation work of the restored image, and can also be used for real-time video processing to carry out a more effective video restoration process.
The method is described for grayscale blurred images; for a color image, each channel is simply extracted and processed separately, and the indices are finally averaged. To address the shortcomings of existing digital blurred video and image restoration systems, a blur identification module is introduced to test the image to be processed; the blur identification step takes less than 1 ms on a typical PC platform with 4 GB of memory and a 2.5 GHz processor. If the image is judged to be sharp, the 15-20 ms per-frame budget of military applications remains available for subsequent image processing, which makes real-time video processing feasible. At the same time, an image quality assessment method that jointly evaluates image structure, detail and edge information is provided to assess the quality of the image after blur restoration. This prevents an image whose quality was degraded by artifacts introduced by the restoration algorithm from being passed on to the user or to the next stage of a video and image processing system; in particular, the proposed quality assessment method can effectively recognize adverse factors such as ringing and edge blurring, which is of great value for the reliable operation of image-based target recognition and tracking systems.
Aiming at the defects of the existing fuzzy recovery system, the invention mainly solves the following technical problems:
1) aiming at the problems that the quality of a restored image is degraded and information processing resources are wasted due to the fact that the conventional digital imaging fuzzy restoration system does not have fuzzy discrimination capability and the clear image is also subjected to deblurring restoration processing, a fuzzy discrimination module is added to the fuzzy image restoration system, an effective fuzzy discrimination algorithm is provided, an image frame needing fuzzy restoration can be discriminated, the addition of the fuzzy discrimination module is beneficial to the realization of real-time video processing, and the method has strong practicability; as shown in fig. 2(a), fig. 2(b), fig. 3(a) and fig. 3 (b);
2) the fuzzy discrimination algorithm is designed by adopting the statistical rule of the prior distribution of the natural clear images, and meanwhile, the fuzzy discrimination algorithm has scale invariance and wide application range, and in addition, the algorithm has simple calculation process and strong implementability;
3) To address the degradation of restored image quality caused by artificial traces introduced during restoration of a blurred image, a novel improved no-reference image quality assessment method is provided, so that the quality of the restored image can be assessed accurately and effectively and the situation where the restored output is worse than the initially input blurred image (compare fig. 2(a) and fig. 2(b)) is avoided; this ensures the reliability of the image restoration system. If a sharp image is mistakenly restored, the effect shown in fig. 3(a) and fig. 3(b) can occur: the processed image is clearly worse than before processing, yet evaluation indices based on gradient information alone, such as GMG, give it a higher score because of the pronounced ringing in the processed image, leading to an erroneous output. With the index proposed by the invention, which takes the correlation between image pixels into account, the index for such an image tends toward a lower (blurred) value: at positions with large gradients the correlation between neighboring pixels is smaller, the resulting structural similarity is smaller, and the ringing effect is recognized through this difference.
4) The image quality assessment method improves the accuracy of restored-image quality evaluation by incorporating the structure, edge and internal detail information of the image; it is particularly good at recognizing the ringing effect in restored images and keeps the evaluation objective: the change in INRSS (the improved no-reference quality index based on structural similarity) is −37.8%, a larger change rate than the −15.0% of GMG. The invention thus ensures the reliability of the image restoration system;
5) aiming at the situation that the full reference image quality evaluation method only can be used for reference of clear images, the full reference image quality evaluation method is improved and used for quality evaluation of non-reference images, and the full reference image quality evaluation method has wider practical applicability.
Unlike the GMG index, the method can recognize the ringing effect because its evaluation index takes the structural information of the image into account: gradient information is used only as one component, to characterize edges, rather than as the sole index. The structural information of the image is characterized mainly by the correlation between pixels, and since this correlation is smaller at positions with large gradients, part of the ringing's influence on the evaluation result is cancelled out, giving a certain discriminating effect.
The invention provides a fuzzy restoration system based on fuzzy identification and restoration quality evaluation, and belongs to the field of image and video processing. Specifically, the present invention can be used to improve the working efficiency of the digital imaging blur restoration system, improve the information processing speed of the digital imaging blur restoration system, and effectively evaluate the digital imaging blur restoration effect. The method can be applied to the fields of civil digital photography and videography, intelligent video monitoring, optical imaging remote sensing, military imaging reconnaissance, missile imaging guidance and the like.
The invention introduces the fuzzy identification step into the digital image fuzzy restoration system for the first time, greatly saves the processing time compared with the traditional digital fuzzy image restoration system, and provides more computing resources for real-time video processing tasks. Meanwhile, in order to ensure the robustness of the restoration result, the invention introduces the quality evaluation step of the restored image, so that the quality of the output image is not lower than that of the input image, and the reliability of a fuzzy restoration system is ensured. In the image fuzzy identification, the invention introduces two indexes of image gradient amplitude and gradient distribution, more accurately describes the characteristics of clear images and has more accurate identification. In the quality evaluation of the restored image, the existing image quality evaluation method based on the structural similarity is improved, and the gradient similarity index of the edge and detail information of the depicted image is added, so that the adverse influence factors such as ringing effect, unclear edge and the like in the restored image can be better evaluated, and the quality of the restored image can be more objectively reflected. The invention can obviously improve the real-time performance and the reliability of the digital blurred video and the image restoration system, so that the development of a corresponding system can better meet the actual application requirements of related military and civil systems.
Drawings
FIG. 1 is a flow chart of a fast digital imaging fuzzy discrimination and restored image quality assessment method;
FIG. 2(a) is a gray-scale blurred image;
FIG. 2(b) is an improved image after the quality evaluation of the blur restoration;
FIG. 3(a) is identified as a clear image through a fuzzy identification module, and the clear image does not need to be subjected to fuzzy processing;
FIG. 3(b) is a degraded image obtained by restoring a sharp image by a conventional method without discrimination by a blur discrimination module;
fig. 4 is a flow chart of the improved non-reference image quality evaluation index INRSS calculation.
Detailed Description
The first embodiment is as follows: the method for fast fuzzy identification of digital imaging and quality evaluation of restored images in the embodiment is specifically prepared according to the following steps:
Step one: input a grayscale blurred image F(i, j), obtain its size M × N, and calculate the horizontal gradient image G_x(i, j) and the vertical gradient image G_y(i, j) of F(i, j), thereby obtaining the gradient image G(i, j); where M is the width of the grayscale blurred image F(i, j) and N is its height;
step two: calculating the average gray gradient value GMG (Gray means gradients) of the gray fuzzy image F (i, j), wherein the index reflects the gradient amplitude characteristic of the fuzzy identification index of the image;
step three: performing histogram statistics on the gradient image, namely calculating a histogram h of the gradient image G (i, j), and traversing the histogram h to obtain a Non-zero Gray scale number NGN (Non-zero Gray scale number) corresponding to the gradient histogram h, namely obtaining the Non-zero Gray scale number NGN of the gradient image G (i, j);
Step four: multiply the GMG and NGN obtained in steps two and three (BIM = NGN × GMG) to obtain the comprehensive blur identification index BIM (Blur Identification Metric); apply a threshold decision to the resulting BIM: if BIM is greater than a preset threshold T, the image is judged to be sharp and is output directly; if BIM is less than the preset threshold T, F(i, j) is passed to step five. The threshold T is obtained by statistical analysis of sharp and blurred images, i.e. the blur threshold T is obtained by analyzing and synthesizing images of different scenes; in practical applications the threshold can be changed according to the tolerable degree of blur, which gives the index good adaptability, and blur identification can be carried out once the threshold is determined;
step five: performing restoration operation on the gray fuzzy image F (i, j) by adopting a typical image blind restoration algorithm such as Tikhonov regularization, total variation regularization and other methods to obtain a restored image D (i, j);
Step six: blur the grayscale blurred image F(i, j) and the restored image D(i, j) a second time with a Gaussian blur kernel of size m × m and variance σ², generating the reference image F1 of F(i, j) and the reference image D1 of D(i, j); the secondary blurring uses Gaussian blurring, and m is the blur kernel size;
step seven: calculating the brightness similarity, the contrast similarity and the structure similarity of the reference image and the image to be evaluated according to F (i, j), F1, D (i, j) and D1; wherein, the images to be evaluated are gray fuzzy images F (i, j) and restored images D (i, j); reference pictures are F1 and D1; the brightness similarity of the reference image and the image to be evaluated comprises l (F (i, j), F1) and l (D (i, j), D1); the contrast similarity c (F (i, j), F1) and c (D (i, j), D1) of the reference image and the image to be evaluated; structural similarity s (F (i, j), F1) and s (D (i, j), D1) of the reference image and the image to be evaluated; l (F (i, j), F1) is the luminance similarity of the two images F (i, j) and F1, l (D (i, j), D1) is the luminance similarity of the two images D (i, j) and D1, and c (F (i, j), F1) is the contrast similarity of the two images F (i, j) and F1; c (D (i, j), D1) is the contrast similarity of the two images D (i, j) and D1; s (F (i, j), F1) is the structural similarity of the two images F (i, j) and F1; s (D (i, j), D1) is the structural similarity of the two images D (i, j) and D1;
Step eight: calculate the gradient images g_F, g_F1, g_D, g_D1 of the reference images and the images to be evaluated, and use them to calculate the gradient similarity g(F(i, j), F1) of the two images F(i, j) and F1 and the gradient similarity g(D(i, j), D1) of the two images D(i, j) and D1; where g_F is the gradient image of F(i, j), g_F1 the gradient image of F1, g_D the gradient image of D(i, j), and g_D1 the gradient image of D1; g(F(i, j), F1) is the gradient similarity of the two images F(i, j) and F1; g(D(i, j), D1) is the gradient similarity of the two images D(i, j) and D1;
Step nine: calculate the improved structural similarity index ISSIM (Improved Structural Similarity) from the brightness, contrast, structural and gradient similarities of the reference images and the images to be evaluated; partition the images into blocks so that M1 block-wise ISSIM indices are obtained, average the M1 ISSIM indices to obtain MISSIM (Mean of Improved Structural Similarity), and calculate the no-reference image evaluation indices INRSS1 and INRSS2 from MISSIM, giving a comprehensive evaluation of the quality of the blurred image and the restored image that covers brightness, contrast, structure and edge information; the flow chart of steps six to nine is shown in figure 4. The obtained MISSIM comprises MISSIM(F(i, j), F1) and MISSIM(D(i, j), D1); the improved structural similarity index ISSIM comprises ISSIM(F(i, j), F1) for the two images F(i, j) and F1 and ISSIM(D(i, j), D1) for the two images D(i, j) and D1;
The improved structural similarity index ISSIM is calculated as:
ISSIM(F(i, j), F1) = [l(F(i, j), F1)]^α · [c(F(i, j), F1)]^β · [s(F(i, j), F1)]^γ · [g(F(i, j), F1)]^λ
ISSIM(D(i, j), D1) = [l(D(i, j), D1)]^α · [c(D(i, j), D1)]^β · [s(D(i, j), D1)]^γ · [g(D(i, j), D1)]^λ
where α, β, γ and λ are the weights of the respective terms;
Step ten: decide according to the no-reference image evaluation indices INRSS1 and INRSS2 obtained in step nine. INRSS1 > INRSS2 indicates that the quality has not improved, and, to ensure the stability of the restoration result, the blur restoration system outputs the grayscale blurred image F(i, j); otherwise the image quality has improved and the restored image D(i, j) is output; where INRSS1 = 1 − MISSIM(F(i, j), F1) and INRSS2 = 1 − MISSIM(D(i, j), D1). As shown in fig. 1, this completes the rapid digital imaging blur identification and restored image quality assessment method.
The second embodiment is as follows: the first difference between the present embodiment and the specific embodiment is: obtaining a gradient image G (i, j) in the first step, wherein the calculation formula is as follows:
wherein, i and j are respectively the horizontal coordinate value and the vertical coordinate value corresponding to the image pixel point. Other steps and parameters are the same as those in the first embodiment.
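As a concrete illustration of step one, here is a small Python/NumPy sketch using forward differences. The patent's exact formula for G(i, j) is not reproduced in this text, so the magnitude sqrt((G_x² + G_y²)/2) is an assumption chosen to match the GMG formula of claim 3.

```python
import numpy as np

def gradient_image(F):
    """Step-one sketch: gradient image of a grayscale image F indexed as F[i, j].

    G_x(i, j) = F(i+1, j) - F(i, j) and G_y(i, j) = F(i, j+1) - F(i, j) are forward
    differences; the combined magnitude sqrt((G_x^2 + G_y^2) / 2) is an assumption.
    """
    F = F.astype(np.float64)
    Gx = np.zeros_like(F)
    Gy = np.zeros_like(F)
    Gx[:-1, :] = F[1:, :] - F[:-1, :]   # forward difference along the first index i
    Gy[:, :-1] = F[:, 1:] - F[:, :-1]   # forward difference along the second index j
    G = np.sqrt((Gx ** 2 + Gy ** 2) / 2.0)
    return G, Gx, Gy
```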
The third concrete implementation mode: the present embodiment differs from the first or second embodiment in that: calculating the average gray gradient value GMG of the image F (i, j) in the second step according to the following formula:
GMG = (1 / ((M−1)(N−1))) · Σ_{i=1..M−1} Σ_{j=1..N−1} √( ([F(i+1, j) − F(i, j)]² + [F(i, j+1) − F(i, j)]²) / 2 ). Other steps and parameters are the same as those in the first or second embodiment.
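A matching Python sketch of step two, following the GMG formula of claim 3 (the placement of the square root over the averaged squared differences is reconstructed from that formula):

```python
import numpy as np

def gmg(F):
    """Step-two sketch: average gray gradient GMG over the (M-1) x (N-1) interior."""
    F = F.astype(np.float64)
    dx = F[1:, :-1] - F[:-1, :-1]    # F(i+1, j) - F(i, j)
    dy = F[:-1, 1:] - F[:-1, :-1]    # F(i, j+1) - F(i, j)
    return float(np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0)))
```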
The fourth concrete implementation mode: the difference between this embodiment mode and one of the first to third embodiment modes is: and step three, obtaining the nonzero gray scale number NGN in h, wherein the calculation formula is as follows:
h=hist(G(i,j))
NGN = num(h_i ≠ 0)
where hist denotes the histogram operation and h_i is the number of pixels at the ith gray level of h; by computing the NGN index, the tail of the image's gradient distribution is characterized, which serves to identify the degree of image blur. Other steps and parameters are the same as those in one of the first to third embodiments.
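Steps three and four can be sketched as follows. Quantizing the gradient image into 256 histogram bins and the default threshold T = 800 (inside the 700-900 range cited in the example below) are assumptions of this sketch; gradient_image() and gmg() are the step-one and step-two sketches above.

```python
import numpy as np

def ngn(G, bins=256):
    """Step-three sketch: number of non-zero gray levels in the gradient histogram h."""
    h, _ = np.histogram(G, bins=bins, range=(0.0, float(G.max()) + 1e-9))
    return int(np.count_nonzero(h))

def is_sharp(F, T=800.0):
    """Step-four sketch: comprehensive blur identification index BIM = NGN x GMG
    compared against a preset threshold T."""
    G, _, _ = gradient_image(F)      # step-one sketch
    bim = ngn(G) * gmg(F)            # step-two sketch
    return bim > T                   # True: image judged sharp, output directly
```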
The fifth embodiment: the difference between this embodiment and one of the first to fourth embodiments is: in step five, blind restoration is performed on the grayscale blurred image F(i, j) using a typical blind image restoration algorithm such as the total variation regularization method (other blind restoration methods can of course be used), with the restoration objective function:
argmin_f { (λ/2) · ‖k ⊗ f − F(i, j)‖² + ‖G_x(i, j)‖₁ + ‖G_y(i, j)‖₁ }
where k is the unknown blur kernel, f is the unknown sharp image, ⊗ denotes convolution, λ is an adjustable parameter, G_x(i, j) is the horizontal gradient image and G_y(i, j) is the vertical gradient image; the final restored image D(i, j) is obtained by solving the restoration objective function alternately. A blurred image and a good restoration result are shown in figs. 2(a) and 2(b), while a poor restoration with noticeable artifacts is shown in figs. 3(a) and 3(b); figs. 3(a) and 3(b) highlight the importance of blur identification, and the subsequent evaluation of the two images is a further safeguard: even if a sharp image is mistakenly put through blur restoration, the quality assessment still identifies the sharp image, and the better-quality sharp image is output. Other steps and parameters are the same as in one of the first to fourth embodiments.
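For completeness, here is a hedged stand-in for step five. The patent calls for blind restoration with total-variation regularization solved by alternating minimization; the sketch below instead runs non-blind Richardson-Lucy deconvolution from scikit-image with an assumed Gaussian PSF, purely as a placeholder so the rest of the pipeline can be exercised. Any deblurring routine that produces D(i, j) can be substituted here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import restoration

def restore(F, psf_sigma=2.0, psf_size=15):
    """Step-five placeholder: NOT the patent's TV-regularized blind restoration.
    Richardson-Lucy deconvolution with an assumed Gaussian PSF is used instead."""
    # Build an assumed Gaussian PSF (not an estimated blur kernel k).
    impulse = np.zeros((psf_size, psf_size))
    impulse[psf_size // 2, psf_size // 2] = 1.0
    psf = gaussian_filter(impulse, psf_sigma)
    psf /= psf.sum()
    F01 = F.astype(np.float64) / 255.0            # scale gray values to [0, 1]
    D01 = restoration.richardson_lucy(F01, psf)   # non-blind deconvolution
    return D01 * 255.0                            # restored image D(i, j)
```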
The sixth specific implementation mode: the difference between this embodiment and one of the first to fifth embodiments is: the specific process of calculating the brightness similarity, the contrast similarity and the structure similarity of the reference image and the image to be evaluated in the step seven comprises the following steps:
The first three terms of the improved structural similarity, i.e. the luminance similarities l(F(i, j), F1), l(D(i, j), D1), the contrast similarities c(F(i, j), F1), c(D(i, j), D1) and the structural similarities s(F(i, j), F1), s(D(i, j), D1), are calculated for F(i, j), F1, D(i, j) and D1 as follows:
l(F(i, j), F1) = (2·μ_F·μ_F1 + C1) / (μ_F² + μ_F1² + C1),  l(D(i, j), D1) = (2·μ_D·μ_D1 + C1) / (μ_D² + μ_D1² + C1)
c(F(i, j), F1) = (2·σ_F·σ_F1 + C2) / (σ_F² + σ_F1² + C2),  c(D(i, j), D1) = (2·σ_D·σ_D1 + C2) / (σ_D² + σ_D1² + C2)
s(F(i, j), F1) = (2·σ_FF1 + C2) / (σ_F·σ_F1 + C2),  s(D(i, j), D1) = (2·σ_DD1 + C2) / (σ_D·σ_D1 + C2)
where μ_F is the mean gray value of the F(i, j) image, μ_F1 that of the F1 image, μ_D that of the D(i, j) image and μ_D1 that of the D1 image; σ_F, σ_F1, σ_D and σ_D1 are the variances of the gray values of the F(i, j), F1, D(i, j) and D1 images, respectively; σ_FF1 is the covariance of the gray values of the F(i, j) and F1 images and σ_DD1 that of the D(i, j) and D1 images; C1 = (k1 × L)², C2 = (k2 × L)², k1 = 0.01, k2 = 0.03, and L is the pixel gray scale. Other steps and parameters are the same as those in one of the first to fifth embodiments.
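A Python sketch of the step-seven terms, using the formulas above with whole-image (global) statistics; taking σ as the standard deviation of the gray values (as in the standard structural similarity definition) and L = 255 are assumptions of this sketch.

```python
import numpy as np

def lcs_similarity(X, Y, L=255.0, k1=0.01, k2=0.03):
    """Step-seven sketch: luminance l, contrast c and structure s similarity of X and Y."""
    X = X.astype(np.float64)
    Y = Y.astype(np.float64)
    C1, C2 = (k1 * L) ** 2, (k2 * L) ** 2
    mu_x, mu_y = X.mean(), Y.mean()
    sig_x, sig_y = X.std(), Y.std()
    cov_xy = float(np.mean((X - mu_x) * (Y - mu_y)))
    l = (2 * mu_x * mu_y + C1) / (mu_x ** 2 + mu_y ** 2 + C1)
    c = (2 * sig_x * sig_y + C2) / (sig_x ** 2 + sig_y ** 2 + C2)
    s = (2 * cov_xy + C2) / (sig_x * sig_y + C2)
    return l, c, s
```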
The seventh embodiment: the difference between this embodiment and one of the first to sixth embodiments is: in step eight, the gradient images g_F, g_F1, g_D, g_D1 of the reference images and the images to be evaluated are calculated, and the gradient similarity g(F(i, j), F1) of the two images F(i, j) and F1 and the gradient similarity g(D(i, j), D1) of the two images D(i, j) and D1 are then calculated from them as follows:
the gradient images g_F, g_F1, g_D, g_D1 are computed for F(i, j), F1 and D(i, j), D1, and the gradient similarities are then calculated as
g(F(i, j), F1) = (2·Σ_{i=1..M} Σ_{j=1..N} g_F(i, j)·g_F1(i, j) + C4) / (Σ_{i=1..M} Σ_{j=1..N} [(g_F(i, j))² + (g_F1(i, j))²] + C4)
g(D(i, j), D1) = (2·Σ_{i=1..M} Σ_{j=1..N} g_D(i, j)·g_D1(i, j) + C4) / (Σ_{i=1..M} Σ_{j=1..N} [(g_D(i, j))² + (g_D1(i, j))²] + C4)
where C4 = (k4 × L)², k4 = 0.03, L is the pixel gray scale, and C4 is a constant that keeps the denominator from being zero. Through this step, the edge information of the image is also taken into account in the similarity evaluation, so the quality of blurred images can be assessed better; at the same time, compared with other sharpness indices, the ringing effect introduced by a poor blur restoration can be recognized better, which ensures the stability of the restoration system. Other steps and parameters are the same as those in one of the first to sixth embodiments.
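A corresponding sketch of the step-eight gradient similarity; using the step-one sketch gradient_image() to obtain the gradient-magnitude images is an assumption of this example.

```python
import numpy as np

def gradient_similarity(X, Y, L=255.0, k4=0.03):
    """Step-eight sketch: gradient similarity g(X, Y) per the formula above."""
    C4 = (k4 * L) ** 2
    gX, _, _ = gradient_image(X)    # gradient-magnitude image of X (step-one sketch)
    gY, _, _ = gradient_image(Y)    # gradient-magnitude image of Y (step-one sketch)
    num = 2.0 * float(np.sum(gX * gY)) + C4
    den = float(np.sum(gX ** 2 + gY ** 2)) + C4
    return num / den
```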
The eighth embodiment: this embodiment differs from one of the first to seventh embodiments in that: in step nine, the no-reference image evaluation indices INRSS1 and INRSS2 are calculated from MISSIM, with the related formulas as follows:
(1) the weights are adjusted according to actual needs; in the actual calculation, the image is divided into M1 blocks and the ISSIM index of each block is obtained; the mean value MISSIM is then calculated as
MISSIM(F(i, j), F1) = (1/M1) · Σ_{i=1..M1} ISSIM(F_i, F1_i),  MISSIM(D(i, j), D1) = (1/M1) · Σ_{i=1..M1} ISSIM(D_i, D1_i)
where F_i is the ith sub-image of the grayscale blurred image F(i, j), D_i the ith sub-image of the restored image D(i, j), F1_i the ith sub-image of the reference image F1, and D1_i the ith sub-image of the reference image D1;
(2) the improved no-reference evaluation indices INRSS1 and INRSS2 are then obtained as
INRSS1 = 1 − MISSIM(F(i, j), F1)
INRSS2 = 1 − MISSIM(D(i, j), D1)
Other steps and parameters are the same as those in one of the first to seventh embodiments.
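Finally, a sketch tying steps six to nine together: the reference image is the input re-blurred with a Gaussian kernel, ISSIM is computed block-wise and averaged to MISSIM, and INRSS = 1 − MISSIM. The 3 × 3 tiling (M1 = 9 blocks), equal weights α = β = γ = λ = 1, and approximating the m × m kernel support by SciPy's default truncation are assumptions of this sketch; lcs_similarity() and gradient_similarity() are the step-seven and step-eight sketches above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def inrss(img, sigma2=6.0, splits=3, alpha=1.0, beta=1.0, gamma=1.0, lam=1.0):
    """Steps six to nine sketch: no-reference index INRSS = 1 - MISSIM(img, re-blurred img)."""
    img = img.astype(np.float64)
    # Step six: secondary Gaussian blurring builds the reference image
    # (the exact m x m kernel support is approximated by SciPy's default truncation).
    ref = gaussian_filter(img, sigma=np.sqrt(sigma2))
    issim_blocks = []
    for rows in np.array_split(np.arange(img.shape[0]), splits):
        for cols in np.array_split(np.arange(img.shape[1]), splits):
            X = img[np.ix_(rows, cols)]
            Y = ref[np.ix_(rows, cols)]
            l, c, s = lcs_similarity(X, Y)        # step-seven sketch
            g = gradient_similarity(X, Y)         # step-eight sketch
            issim_blocks.append((l ** alpha) * (c ** beta) * (s ** gamma) * (g ** lam))
    missim = float(np.mean(issim_blocks))         # step nine: mean block-wise ISSIM
    return 1.0 - missim                           # INRSS of the evaluated image
```

With this, the step-ten decision reduces to comparing inrss(F) and inrss(D) and keeping whichever image scores higher, as in the control-flow sketch given after the step list above.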
The following examples were used to demonstrate the beneficial effects of the present invention:
the first embodiment is as follows:
the embodiment of the invention provides a method for rapidly identifying digital imaging blur and evaluating restored image quality, which is specifically prepared according to the following steps:
Step one: input a grayscale blurred image F(i, j), obtain its size M × N, and calculate the horizontal gradient image G_x(i, j) and the vertical gradient image G_y(i, j) of the grayscale blurred image F(i, j), thereby obtaining the gradient image G(i, j); where M is the width of the grayscale blurred image and N is its height;
a gradient image G (i, j) is obtained, the calculation formula being as follows:
wherein, i, j are respectively the horizontal coordinate value and the vertical coordinate value corresponding to the image pixel point;
step two: calculating the average gray gradient value GMG (Gray means gradients) of the gray fuzzy image F (i, j), wherein the index reflects the gradient amplitude characteristic of the fuzzy identification index of the image;
the formula for calculating the mean gray gradient value GMG of the image F (i, j) is as follows:
step three: performing histogram statistics on the gradient image, namely calculating a histogram h of the gradient image G (i, j), and traversing the histogram h to obtain a Non-zero Gray scale number NGN (Non-zero Gray scale number) corresponding to the gradient histogram h, namely obtaining the Non-zero Gray scale number NGN of the gradient image;
obtaining the nonzero gray scale number NGN in h, wherein the calculation formula is as follows:
h=hist(G(i,j))
NGN = num(h_i ≠ 0)
where hist denotes the histogram operation and h_i is the number of pixels at the ith gray level of h; by computing the NGN index, the tail of the image's gradient distribution is characterized, which serves to identify the degree of image blur.
Step four: multiply the GMG and NGN obtained in steps two and three (BIM = NGN × GMG) to obtain the comprehensive blur identification index BIM (Blur Identification Metric); apply a threshold decision to the resulting BIM: if BIM is greater than a preset threshold T, the image is judged to be sharp and is output directly; if BIM is less than the preset threshold T, F(i, j) is passed to step five. The threshold T is obtained by statistical analysis of sharp and blurred images, i.e. the blur threshold T is obtained by analyzing and synthesizing images of different scenes; in practical applications the threshold can be changed according to the tolerable degree of blur, which gives the index good adaptability, and blur identification can be carried out once the threshold is determined, the value generally being 700-900;
step five: the gray fuzzy image F (i, j) is blindly restored by using a typical blind image restoration algorithm such as a total variation regularization method, and of course, other blind restoration methods may be used, and the restoration objective function is as follows:
argmin_f { (λ/2) · ‖k ⊗ f − F(i, j)‖² + ‖G_x(i, j)‖₁ + ‖G_y(i, j)‖₁ }
where k is the unknown blur kernel, f is the unknown sharp image, ⊗ denotes convolution, λ is an adjustable parameter, G_x(i, j) is the horizontal gradient image and G_y(i, j) is the vertical gradient image; the final restored image D(i, j) is obtained by solving the restoration objective function alternately. A blurred image and a good restoration result are shown in figs. 2(a) and 2(b), and a poor restoration with noticeable artifacts is shown in figs. 3(a) and 3(b). Figs. 3(a) and 3(b) highlight the importance of blur identification, and the subsequent evaluation of the two images serves as a further safeguard: even if a sharp image is mistakenly put through blur restoration, the quality assessment still identifies the sharp image, and the better-quality sharp image is output.
Step six: blur the grayscale blurred image F(i, j) and the restored image D(i, j) a second time with a Gaussian blur kernel of size m × m and variance σ², generating the reference image F1 of F(i, j) and the reference image D1 of D(i, j); here the secondary blurring is performed with a Gaussian blur kernel of size m × m = 7 × 7 and variance 6. To ensure the effect of the secondary blurring, m and σ² generally take values of 5-8, respectively; the secondary blurring uses Gaussian blurring, and m is the blur kernel size;
step seven: the specific process of calculating the brightness similarity, the contrast similarity and the structure similarity of the reference image and the image to be evaluated according to F (i, j), F1, D (i, j) and D1 comprises the following steps:
the first three terms of improved structural similarity, i.e., luminance similarity l (F (i, j), F1), l (D (i, j), D1), contrast similarity c (F (i, j), F1), c (D (i, j), D1) and structural similarity s (F (i, j), F1), s (D (i, j), D1), are calculated for F (i, j), F1, D (i, j) and D1, respectively, as follows:
where μ_F is the mean gray value of the F(i, j) image, μ_F1 that of the F1 image, μ_D that of the D(i, j) image and μ_D1 that of the D1 image; σ_F, σ_F1, σ_D and σ_D1 are the variances of the gray values of the F(i, j), F1, D(i, j) and D1 images, respectively; σ_FF1 is the covariance of the gray values of the F(i, j) and F1 images and σ_DD1 that of the D(i, j) and D1 images; C1 = (k1 × L)², C2 = (k2 × L)², k1 = 0.01, k2 = 0.03, and L is the pixel gray scale, typically 255. The images to be evaluated are the grayscale blurred image F(i, j) and the restored image D(i, j); the reference images are F1 and D1; the brightness similarities of the reference images and the images to be evaluated are l(F(i, j), F1) and l(D(i, j), D1), the contrast similarities are c(F(i, j), F1) and c(D(i, j), D1), and the structural similarities are s(F(i, j), F1) and s(D(i, j), D1); l(F(i, j), F1) is the luminance similarity of the two images F(i, j) and F1, and l(D(i, j), D1) that of D(i, j) and D1; c(F(i, j), F1) is the contrast similarity of F(i, j) and F1, and c(D(i, j), D1) that of D(i, j) and D1; s(F(i, j), F1) is the structural similarity of F(i, j) and F1, and s(D(i, j), D1) that of D(i, j) and D1;
step eight: respectively calculating gradient images g of a reference image and an image to be evaluatedF,gF1,gD,gD1Using gradient images gF,gF1,gD,gD1Calculating gradient similarity g (F (i, j), F1) of the two images F (i, j) and F1 and gradient similarity g (D (i, j), D1) of the two images D (i, j) and D1 respectively;
The gradient images g_F, g_F1, g_D, g_D1 are computed for F(i, j), F1 and D(i, j), D1, and the gradient similarities are then calculated as
g(F(i, j), F1) = (2·Σ_{i=1..M} Σ_{j=1..N} g_F(i, j)·g_F1(i, j) + C4) / (Σ_{i=1..M} Σ_{j=1..N} [(g_F(i, j))² + (g_F1(i, j))²] + C4)
g(D(i, j), D1) = (2·Σ_{i=1..M} Σ_{j=1..N} g_D(i, j)·g_D1(i, j) + C4) / (Σ_{i=1..M} Σ_{j=1..N} [(g_D(i, j))² + (g_D1(i, j))²] + C4)
where C4 = (k4 × L)², k4 = 0.03, L is the pixel gray scale (typically 255), and C4 is a constant that keeps the denominator from being zero. Through this step, the edge information of the image is taken into account in the similarity evaluation, so the quality of blurred images can be assessed better; at the same time, compared with other sharpness indices, the ringing effect introduced by a poor blur restoration can be recognized better, which ensures the stability of the restoration system. g_F is the gradient image of F(i, j), g_F1 that of F1, g_D that of D(i, j), and g_D1 that of D1; g(F(i, j), F1) is the gradient similarity of the two images F(i, j) and F1, and g(D(i, j), D1) that of D(i, j) and D1;
Step nine: calculate the improved structural similarity index ISSIM (Improved Structural Similarity) from the brightness, contrast, structural and gradient similarities of the reference images and the images to be evaluated; partition the images into blocks so that M1 block-wise ISSIM indices are obtained, average the M1 ISSIM indices to obtain MISSIM (Mean of Improved Structural Similarity), and calculate the no-reference image evaluation indices INRSS1 and INRSS2 from MISSIM, giving a comprehensive evaluation of the quality of the blurred image and the restored image that covers brightness, contrast, structure and edge information; M1 ∈ (1, 12); the flow chart of steps six to nine is shown in figure 4. The obtained MISSIM comprises MISSIM(F(i, j), F1) and MISSIM(D(i, j), D1); the improved structural similarity index ISSIM comprises ISSIM(F(i, j), F1) for the two images F(i, j) and F1 and ISSIM(D(i, j), D1) for the two images D(i, j) and D1;
the improved structural similarity index ISSIM has a calculation formula as follows:
wherein, alpha, beta, gamma and lambda are each item weight;
(1) The weights are adjusted according to actual needs; in the actual calculation, the image is divided into M1 blocks and the ISSIM index of each block is obtained; the mean value MISSIM is then calculated as
MISSIM(F(i, j), F1) = (1/M1) · Σ_{i=1..M1} ISSIM(F_i, F1_i),  MISSIM(D(i, j), D1) = (1/M1) · Σ_{i=1..M1} ISSIM(D_i, D1_i)
where F_i is the ith sub-image of the grayscale blurred image F(i, j), D_i the ith sub-image of the restored image D(i, j), F1_i the ith sub-image of the reference image F1, and D1_i the ith sub-image of the reference image D1;
(2) improved no-reference evaluation indexes INRSS1 and INRSS2 were obtained, and the calculation formula was as follows:
INRSS1=1-MISSIM(F(i,j),F1)
INRSS2=1-MISSIM(D(i,j),D1)
the results of INRSS evaluations obtained for both images of fig. 2(a) and 2(b) are shown in the following table:
index (I) Before recovery After recovery Variations in
INRSS 0.3172 0.5769 29.05%
The results of comparing the INRSS index and the GMG index obtained for the two images of fig. 3(a) and 3(b) are shown in the following table:
index (I) Before recovery After recovery Variations in
INRSS 0.4063 0.2527 -37.8%
GMG 7.0012 5.9559 -15.0%
The result shows that the INRSS has a good evaluation effect on the image, and the influence of the artifact on the recovery result can be more effectively distinguished.
Step ten: judging according to the no-reference image evaluation indexes INRSS1 and INRSS2 obtained in the step nine, wherein the INRSS1>INRSS2 shows no quality changePreferably, in order to ensure the stability of the restoration result, the blur restoration system outputs the gray-scale blurred image F (i, j), otherwise, the output restoration image D (i, j) is output with the improved image quality, wherein,as shown in fig. 1, a fast digital imaging blur identification and restored image quality evaluation method is completed.
The present invention is capable of other embodiments and its several details are capable of modifications in various obvious respects, all without departing from the spirit and scope of the present invention.

Claims (8)

1. A fast digital imaging fuzzy identification and restored image quality assessment method is characterized in that: a quick digital imaging fuzzy identification and restored image quality assessment method is specifically carried out according to the following steps:
Step one: input a grayscale blurred image F(i, j), obtain its size M × N, and calculate the horizontal gradient image G_x(i, j) and the vertical gradient image G_y(i, j) of the grayscale blurred image F(i, j), thereby obtaining the gradient image G(i, j); where M is the width of the grayscale blurred image F(i, j) and N is the height of the grayscale blurred image F(i, j);
step two: calculating an average gray gradient value GMG of the gray fuzzy image F (i, j);
step three: calculating a histogram h of the gradient image G (i, j), and traversing the histogram h to obtain a non-zero gray level number NGN of the gradient image G (i, j);
step four: performing product operation according to the GMG and the NGN obtained in the second step and the third step to obtain a comprehensive fuzzy identification index BIM; carrying out threshold judgment on the obtained BIM, and if the BIM is greater than a preset threshold T, judging that a clear image is directly output; if BIM is less than the predetermined threshold value T, F (i, j) is transferred to step five; wherein, the threshold T is obtained by carrying out statistical analysis on the clear image and the blurred image;
step five: carrying out restoration operation on the gray fuzzy image F (i, j) by adopting a typical image blind restoration algorithm to obtain a restored image D (i, j);
Step six: blur the grayscale blurred image F(i, j) and the restored image D(i, j) a second time with a Gaussian blur kernel of size m × m and variance σ², generating the reference image F1 of F(i, j) and the reference image D1 of D(i, j); the secondary blurring uses Gaussian blurring, and m is the blur kernel size;
step seven: calculating the brightness similarity, the contrast similarity and the structure similarity of the reference image and the image to be evaluated according to F (i, j), F1, D (i, j) and D1; wherein, the images to be evaluated are gray fuzzy images F (i, j) and restored images D (i, j); reference pictures are F1 and D1; the brightness similarity of the reference image and the image to be evaluated comprises l (F (i, j), F1) and l (D (i, j), D1); the contrast similarity c (F (i, j), F1) and c (D (i, j), D1) of the reference image and the image to be evaluated; structural similarity s (F (i, j), F1) and s (D (i, j), D1) of the reference image and the image to be evaluated; l (F (i, j), F1) is the luminance similarity of the two images F (i, j) and F1, l (D (i, j), D1) is the luminance similarity of the two images D (i, j) and D1, and c (F (i, j), F1) is the contrast similarity of the two images F (i, j) and F1; c (D (i, j), D1) is the contrast similarity of the two images D (i, j) and D1; s (F (i, j), F1) is the structural similarity of the two images F (i, j) and F1; s (D (i, j), D1) is the structural similarity of the two images D (i, j) and D1;
step (ii) ofEighthly: respectively calculating gradient images g of a reference image and an image to be evaluatedF,gF1,gD,gD1Using gradient images gF,gF1,gD,gD1Calculating gradient similarity g (F (i, j), F1) of the two images F (i, j) and F1 and gradient similarity g (D (i, j), D1) of the two images D (i, j) and D1 respectively; wherein, gFGradient image of F (i, j), gF1Gradient image of F1, gDA gradient image of D (i, j); gD1A gradient image of D1; g (F (i, j), F1) is the gradient similarity of the two images F (i, j) and F1; g (D (i, j), D1) is the gradient similarity of the two images D (i, j) and D1;
step nine: calculating an improved structure similarity index ISSIM according to the brightness similarity, the contrast similarity, the structure similarity and the gradient similarity of the reference image and the image to be evaluated, partitioning the improved structure similarity index ISSIM to obtain M1 ISSIM indexes, then taking the average value of the M1 ISSIM indexes to obtain MISSIM, and calculating non-reference image evaluation indexes INRSS1 and INRSS2 according to the MISSIM; wherein the obtained MISSIM specifically comprises MISSIM (F (i, j), F1) and MISSIM (D (i, j), D1); the improved structure similarity index ISSIM comprises gradient similarity indexes ISSIM (F (i, j), F1) of two images F (i, j) and F1 and gradient similarity indexes ISSIM (D (i, j), D1) of two images D (i, j) and D1; the improved structural similarity index ISSIM has a calculation formula as follows:
ISSIM(F(i, j), F1) = [l(F(i, j), F1)]^α · [c(F(i, j), F1)]^β · [s(F(i, j), F1)]^γ · [g(F(i, j), F1)]^λ
ISSIM(D(i, j), D1) = [l(D(i, j), D1)]^α · [c(D(i, j), D1)]^β · [s(D(i, j), D1)]^γ · [g(D(i, j), D1)]^λ;
wherein, alpha, beta, gamma and lambda are each item weight;
step ten: judging according to the no-reference image evaluation indexes INRSS1 and INRSS2 obtained in the step nine, wherein the INRSS1>The INRSS2 indicates that the quality is not improved, the blur restoration system outputs a gray blurred image F (i, j), otherwise, indicates that the image quality is improved, and outputs a restored image D (i, j); wherein,thus, the method for rapidly identifying the digital imaging blur and evaluating the restored image quality is completed.
2. The method of claim 1, wherein the method comprises the steps of: obtaining a gradient image G (i, j) in the first step, wherein the calculation formula is as follows:
wherein, i and j are respectively the horizontal coordinate value and the vertical coordinate value corresponding to the image pixel point.
3. The method of claim 1, wherein the method comprises the steps of: calculating the average gray gradient value GMG of the image F (i, j) in the second step according to the following formula:
GMG = (1 / ((M−1)(N−1))) · Σ_{i=1..M−1} Σ_{j=1..N−1} √( ([F(i+1, j) − F(i, j)]² + [F(i, j+1) − F(i, j)]²) / 2 ).
4. the method of claim 1, wherein the method comprises the steps of: and step three, obtaining the nonzero gray scale number NGN in h, wherein the calculation formula is as follows:
h=hist(G(i,j))
NGN=num(hi≠0)
in the formula, hist represents a histogram operation; h isiThe number of pixels corresponding to the ith gray level in h.
5. The method of claim 1, wherein the method comprises the steps of: in the fifth step, blind restoration is carried out on the gray fuzzy image F (i, j) by adopting a total variation regularization method, and the restoration objective function is as follows:
\arg\min_{f} \left\{ \frac{\lambda}{2} \left\| k \otimes f - F(i,j) \right\|_{2}^{2} + \left\| G_{x}(i,j) \right\|_{1} + \left\| G_{y}(i,j) \right\|_{1} \right\}
where k is the unknown blur kernel, f is the unknown sharp image, ⊗ denotes the convolution operation, λ is an adjustable parameter, G_x(i,j) is the horizontal gradient image and G_y(i,j) is the vertical gradient image; the final restored image D(i,j) is obtained by solving the restoration objective function alternately.
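For illustration, a sketch (assuming NumPy and SciPy) that only evaluates the total-variation objective for a candidate sharp image f and blur kernel k; the alternating minimisation over f and k that actually produces D(i,j) is not shown, and taking G_x and G_y as the forward-difference gradients of f is an assumption:

```python
import numpy as np
from scipy.signal import fftconvolve

def tv_restoration_objective(f, k, F, lam):
    """Value of (lambda/2)*||k (*) f - F||^2 + ||G_x||_1 + ||G_y||_1."""
    residual = fftconvolve(f, k, mode="same") - F   # k convolved with f, minus F
    data_term = 0.5 * lam * np.sum(residual ** 2)
    g_x = np.diff(f, axis=1)   # assumed horizontal gradient image G_x(i, j)
    g_y = np.diff(f, axis=0)   # assumed vertical gradient image G_y(i, j)
    return data_term + np.abs(g_x).sum() + np.abs(g_y).sum()
```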
6. The method of claim 1, wherein the specific process of calculating the luminance similarity, the contrast similarity and the structural similarity of the reference images and the images to be evaluated in the seventh step is as follows:
the luminance similarity l (F (i, j), F1), l (D (i, j), D1), the contrast similarity c (F (i, j), F1), c (D (i, j), D1) and the structural similarity s (F (i, j), F1), s (D (i, j), D1) are calculated for F (i, j), F1, D (i, j) and D1, respectively, as follows:
l(F(i,j),F1) = \frac{2\mu_{F}\mu_{F1} + C_{1}}{\mu_{F}^{2} + \mu_{F1}^{2} + C_{1}}, \qquad l(D(i,j),D1) = \frac{2\mu_{D}\mu_{D1} + C_{1}}{\mu_{D}^{2} + \mu_{D1}^{2} + C_{1}}
c(F(i,j),F1) = \frac{2\sigma_{F}\sigma_{F1} + C_{2}}{\sigma_{F}^{2} + \sigma_{F1}^{2} + C_{2}}, \qquad c(D(i,j),D1) = \frac{2\sigma_{D}\sigma_{D1} + C_{2}}{\sigma_{D}^{2} + \sigma_{D1}^{2} + C_{2}}
s(F(i,j),F1) = \frac{2\sigma_{FF1} + C_{2}}{\sigma_{F}\sigma_{F1} + C_{2}}, \qquad s(D(i,j),D1) = \frac{2\sigma_{DD1} + C_{2}}{\sigma_{D}\sigma_{D1} + C_{2}}
where μ_F is the mean gray value of the F(i,j) image, μ_F1 is the mean gray value of the F1 image, μ_D is the mean gray value of the D(i,j) image, μ_D1 is the mean gray value of the D1 image, σ_F is the variance of the gray values of the F(i,j) image, σ_F1 is the variance of the gray values of the F1 image, σ_D is the variance of the gray values of the D(i,j) image, σ_D1 is the variance of the gray values of the D1 image, σ_FF1 is the covariance of the gray values of the F(i,j) and F1 images, σ_DD1 is the covariance of the gray values of the D(i,j) and D1 images, C_1 = (k_1 × L)², C_2 = (k_2 × L)², k_1 = 0.01, k_2 = 0.03, and L is the pixel gray level.
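A minimal NumPy sketch of the three similarity terms; it treats σ as the standard deviation (so that σ² is the variance appearing in the formulas), assumes 8-bit images (L = 255), and computes the statistics over whole images rather than blocks:

```python
import numpy as np

def luminance_contrast_structure(X, Y, k1=0.01, k2=0.03, L=255):
    """Luminance, contrast and structure similarities of two gray images,
    with C1 = (k1*L)**2 and C2 = (k2*L)**2 as defined above."""
    X = np.asarray(X, dtype=np.float64)
    Y = np.asarray(Y, dtype=np.float64)
    C1, C2 = (k1 * L) ** 2, (k2 * L) ** 2
    mu_x, mu_y = X.mean(), Y.mean()
    sig_x, sig_y = X.std(), Y.std()
    cov_xy = ((X - mu_x) * (Y - mu_y)).mean()
    l = (2 * mu_x * mu_y + C1) / (mu_x ** 2 + mu_y ** 2 + C1)
    c = (2 * sig_x * sig_y + C2) / (sig_x ** 2 + sig_y ** 2 + C2)
    s = (2 * cov_xy + C2) / (sig_x * sig_y + C2)
    return l, c, s
```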
7. The method of claim 1, wherein the specific process in the eighth step of calculating the gradient images g_F, g_F1, g_D and g_D1 of the reference images and the images to be evaluated, and of using them to calculate the gradient similarity g(F(i,j), F1) of the two images F(i,j) and F1 and the gradient similarity g(D(i,j), D1) of the two images D(i,j) and D1, is as follows:
the gradient images g_F, g_F1, g_D and g_D1 are calculated for F(i,j), F1, D(i,j) and D1 respectively, and the gradient similarities are then calculated from the gradient images by the following formulas:
g(F(i,j),F1) = \frac{2\sum_{i=1}^{M}\sum_{j=1}^{N} g_{F}(i,j)\, g_{F1}(i,j) + C_{4}}{\sum_{i=1}^{M}\sum_{j=1}^{N} \left[ (g_{F}(i,j))^{2} + (g_{F1}(i,j))^{2} \right] + C_{4}}
g(D(i,j),D1) = \frac{2\sum_{i=1}^{M}\sum_{j=1}^{N} g_{D}(i,j)\, g_{D1}(i,j) + C_{4}}{\sum_{i=1}^{M}\sum_{j=1}^{N} \left[ (g_{D}(i,j))^{2} + (g_{D1}(i,j))^{2} \right] + C_{4}}
where C_4 = (k_4 × L)² is a constant, k_4 = 0.03 and L is the pixel gray level.
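A minimal NumPy sketch of the gradient-similarity formula, assuming the two gradient images have already been computed and assuming 8-bit images (L = 255):

```python
import numpy as np

def gradient_similarity(gA, gB, k4=0.03, L=255):
    """Gradient similarity of two images from their gradient images gA, gB,
    with stabilising constant C4 = (k4*L)**2."""
    gA = np.asarray(gA, dtype=np.float64)
    gB = np.asarray(gB, dtype=np.float64)
    C4 = (k4 * L) ** 2
    return (2.0 * np.sum(gA * gB) + C4) / (np.sum(gA ** 2 + gB ** 2) + C4)
```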
8. The method of claim 1, wherein in the ninth step the no-reference image evaluation indices INRSS1 and INRSS2 are calculated from MISSIM by the following formulas:
(1) the images are divided into M1 blocks and the ISSIM index is obtained for each block; the mean value MISSIM is then calculated as:
MISSIM(F(i,j),F1) = \frac{1}{M_{1}} \sum_{i=1}^{M_{1}} ISSIM(F_{i}, F1_{i})
MISSIM(D(i,j),D1) = \frac{1}{M_{1}} \sum_{i=1}^{M_{1}} ISSIM(D_{i}, D1_{i})
where F_i is the i-th sub-image of the gray blurred image F(i,j), D_i is the i-th sub-image of the restored image D(i,j), F1_i is the i-th sub-image of the reference image F1, and D1_i is the i-th sub-image of the reference image D1;
(2) the improved no-reference evaluation indices INRSS1 and INRSS2 are obtained by the following formulas:
INRSS1 = 1 - MISSIM(F(i,j), F1), \qquad INRSS2 = 1 - MISSIM(D(i,j), D1).
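Putting the pieces together, a sketch of the block-wise MISSIM and of the INRSS-based decision of step ten; the 64-pixel block size, the equal ISSIM weights and the assumption that the reference images F1 and D1 have already been generated (step six) are illustrative choices, not values fixed by the claims:

```python
import numpy as np

def _issim_block(X, Y, k1=0.01, k2=0.03, k4=0.03, L=255):
    # luminance * contrast * structure * gradient similarity, equal weights
    C1, C2, C4 = (k1 * L) ** 2, (k2 * L) ** 2, (k4 * L) ** 2
    mx, my = X.mean(), Y.mean()
    sx, sy = X.std(), Y.std()
    cov = ((X - mx) * (Y - my)).mean()
    l = (2 * mx * my + C1) / (mx ** 2 + my ** 2 + C1)
    c = (2 * sx * sy + C2) / (sx ** 2 + sy ** 2 + C2)
    s = (2 * cov + C2) / (sx * sy + C2)
    # gradient magnitude images of X and Y (forward differences)
    gX = np.sqrt(((X[1:, :-1] - X[:-1, :-1]) ** 2 +
                  (X[:-1, 1:] - X[:-1, :-1]) ** 2) / 2.0)
    gY = np.sqrt(((Y[1:, :-1] - Y[:-1, :-1]) ** 2 +
                  (Y[:-1, 1:] - Y[:-1, :-1]) ** 2) / 2.0)
    g = (2 * np.sum(gX * gY) + C4) / (np.sum(gX ** 2 + gY ** 2) + C4)
    return l * c * s * g

def missim(A, B, block=64):
    """Block-wise mean ISSIM between two equally sized gray images."""
    A = np.asarray(A, dtype=np.float64)
    B = np.asarray(B, dtype=np.float64)
    M, N = A.shape
    vals = [_issim_block(A[i:i + block, j:j + block],
                         B[i:i + block, j:j + block])
            for i in range(0, M - block + 1, block)
            for j in range(0, N - block + 1, block)]
    return float(np.mean(vals))

def inrss_decision(F, F1, D, D1, block=64):
    """INRSS1 = 1 - MISSIM(F, F1), INRSS2 = 1 - MISSIM(D, D1); the restored
    image D is output only when INRSS1 <= INRSS2 (step ten)."""
    inrss1 = 1.0 - missim(F, F1, block)
    inrss2 = 1.0 - missim(D, D1, block)
    return (F if inrss1 > inrss2 else D), inrss1, inrss2
```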
CN201410592425.6A 2014-10-28 2014-10-28 Rapid digital imaging fuzzy identification and restored image quality assessment method Active CN104408707B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410592425.6A CN104408707B (en) 2014-10-28 2014-10-28 Rapid digital imaging fuzzy identification and restored image quality assessment method

Publications (2)

Publication Number Publication Date
CN104408707A CN104408707A (en) 2015-03-11
CN104408707B (en) 2017-04-19

Family

ID=52646336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410592425.6A Active CN104408707B (en) 2014-10-28 2014-10-28 Rapid digital imaging fuzzy identification and restored image quality assessment method

Country Status (1)

Country Link
CN (1) CN104408707B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105469413B (en) * 2015-12-10 2017-12-26 哈尔滨工业大学 It is a kind of based on normalization ring weighting without refer to smear restoration image synthesis method for evaluating quality
CN107545251A (en) * 2017-08-31 2018-01-05 北京图铭视界科技有限公司 Face quality discrimination and the method and device of picture enhancing
CN107767350B (en) * 2017-10-17 2020-10-23 青岛海信医疗设备股份有限公司 Video image restoration method and device
CN108109147B (en) * 2018-02-10 2022-02-18 北京航空航天大学 No-reference quality evaluation method for blurred image
CN108428232B (en) * 2018-03-20 2019-07-19 合肥工业大学 A kind of blind appraisal procedure of cartoon image quality
CN108681684B (en) * 2018-04-16 2021-02-02 佛山市顺德区中山大学研究院 Non-reference quality evaluation method, device and system for two-dimensional bar code image
CN109003296B (en) * 2018-05-31 2020-08-14 南京航空航天大学 Feature extraction method for representing ringing effect of restored image
CN108985350B (en) * 2018-06-27 2021-12-10 北京中安未来科技有限公司 Method and device for recognizing blurred image based on gradient amplitude sparse characteristic information, computing equipment and storage medium
CN109360157B (en) * 2018-08-22 2020-08-25 浙江工业大学 TV and wavelet regularization-based spatial variation blurred image restoration method
CN109242790A (en) * 2018-08-22 2019-01-18 大连理工大学 A kind of image deblurring appraisal procedure towards fields of measurement
CN109712134B (en) * 2018-12-28 2020-11-06 武汉虹识技术有限公司 Iris image quality evaluation method and device and electronic equipment
CN110211105B (en) * 2019-05-24 2021-09-07 厦门美柚股份有限公司 Blurred image detection method, device and system
CN111914898B (en) * 2020-06-30 2022-06-28 电子科技大学 Adaptive threshold-based airborne SAR task adaptability evaluation method
CN112767327B (en) * 2021-01-08 2021-09-07 上海大学 Image quality management system and method based on neural network
CN117593192B (en) * 2024-01-18 2024-03-26 大连云间来客科技有限公司 Gynecological cervical image enhancement analysis method
CN118015033A (en) * 2024-02-02 2024-05-10 嘉洋智慧安全科技(北京)股份有限公司 Image processing method, device, equipment and medium
CN118075449A (en) * 2024-04-18 2024-05-24 智广海联(天津)大数据技术有限公司 Method and device for evaluating camera monitoring video quality


Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR101810876B1 (en) * 2012-03-13 2018-01-26 삼성전자주식회사 A method and an apparatus for debluring non-uniform motion blur of a large scale input image based on a tile unit

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
US8214793B1 (en) * 2007-06-28 2012-07-03 Adobe Systems Incorporated Automatic restoration of tool configuration while navigating layers of a composition
CN102339460A (en) * 2011-08-19 2012-02-01 清华大学 Adaptive satellite image restoration method
CN103559693A (en) * 2013-11-18 2014-02-05 东南大学 Image local structure self-adaption recovery method based on non-continuity indicator

Non-Patent Citations (1)

Title
No-reference image sharpness assessment based on contrast sensitivity; Fan Yuanyuan et al.; Optics and Precision Engineering; 31 October 2011; Vol. 19, No. 10; pp. 2485-2493 *


Similar Documents

Publication Publication Date Title
CN104408707B (en) Rapid digital imaging fuzzy identification and restored image quality assessment method
Shen et al. Hybrid no-reference natural image quality assessment of noisy, blurry, JPEG2000, and JPEG images
CN105279772B (en) A kind of trackability method of discrimination of infrared sequence image
CN111612741B (en) Accurate reference-free image quality evaluation method based on distortion recognition
CN111340749B (en) Image quality detection method, device, equipment and storage medium
CN111062895B (en) Microscopic image restoration method based on multi-view-field segmentation
US9508018B2 (en) Systems and methods for object detection
Zhou et al. Improved estimation of motion blur parameters for restoration from a single image
US20110085026A1 (en) Detection method and detection system of moving object
CN116129195A (en) Image quality evaluation device, image quality evaluation method, electronic device, and storage medium
Ljubičić et al. SSIMS-Flow: Image velocimetry workbench for open-channel flow rate estimation
CN111275687B (en) Fine-grained image stitching detection method based on connected region marks
Kanchev et al. Blurred image regions detection using wavelet-based histograms and SVM
CN104463910B (en) High-speed motion target extraction method based on motion vector
Zou et al. Statistical analysis of signal-dependent noise: application in blind localization of image splicing forgery
CN105825480A (en) Image deblurring method based on sapphire surface defect automatic detection
Sonawane et al. Image quality assessment techniques: An overview
CN114677670A (en) Automatic identification and positioning method for identity card tampering
Chong et al. Image extrema analysis and blur detection with identification
Frosio et al. Adaptive segmentation based on a learned quality metric.
CN109359513B (en) Anomaly detection method based on edge detection and color matching
CN108389219B (en) Weak and small target tracking loss re-detection method based on multi-peak judgment
Zhu et al. Efficient perceptual-based spatially varying out-of-focus blur detection
Zhou et al. A self-adaptive learning method for motion blur kernel estimation of the single image
RU2774058C1 (en) Method for definition (recognition) of the fact of presentation of digital copy of the document in screen reshoot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190614

Address after: 150000 Heilongjiang Harbin Dalian economic and Trade Zone, the North Road and Xingkai Road intersection

Patentee after: HIT ROBOT GROUP Co.,Ltd.

Address before: 150001 No. 92 West straight street, Nangang District, Heilongjiang, Harbin

Patentee before: Harbin Institute of Technology

PP01 Preservation of patent right

Effective date of registration: 20240626

Granted publication date: 20170419