CN113628192A - Image blur detection method, device, apparatus, storage medium, and program product - Google Patents

Image blur detection method, device, apparatus, storage medium, and program product

Info

Publication number
CN113628192A
CN113628192A
Authority
CN
China
Prior art keywords
image
gradient
blurred
processing
quotient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110923111.XA
Other languages
Chinese (zh)
Other versions
CN113628192B (en)
Inventor
王向阳
邢怀飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110923111.XA priority Critical patent/CN113628192B/en
Publication of CN113628192A publication Critical patent/CN113628192A/en
Application granted granted Critical
Publication of CN113628192B publication Critical patent/CN113628192B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The disclosure provides an image blur detection method and apparatus, an electronic device, a computer-readable storage medium, and a computer program product, and relates to the field of artificial intelligence technologies such as image recognition and cloud services. One embodiment of the method comprises: acquiring a first image to be detected; performing first blurring processing on the first image to obtain a second image; performing second blurring processing on the second image to obtain a third image; and determining whether the first image is a blurred image according to the gradient information quotient of the third image and the second image. In this embodiment, whether the first image is a blurred image is determined from the gradient quotient between the third image (blurred twice) and the second image (blurred once), so that regardless of whether noise that interferes with discrimination exists in the first image, the calculated gradient quotient retains an obvious correlation with the discrimination conclusion, which improves the accuracy of the discrimination result.

Description

Image blur detection method, device, apparatus, storage medium, and program product
Technical Field
The present disclosure relates to the field of image processing technologies, in particular to artificial intelligence fields such as image recognition and cloud services, and more particularly to an image blur detection method and apparatus, an electronic device, a computer-readable storage medium, and a computer program product.
Background
In scenarios such as video playback, conferencing, live streaming, and surveillance, blurred pictures may appear, reducing the quality of videos or images and degrading the viewing experience.
Therefore, detecting blurred images quickly and accurately is the basis for improving the viewing experience.
Disclosure of Invention
The embodiment of the disclosure provides an image blur detection method and device, electronic equipment, a computer readable storage medium and a computer program product.
In a first aspect, an embodiment of the present disclosure provides an image blur detection method, including: acquiring a first image to be detected; performing first blurring processing on the first image to obtain a second image; performing second blurring processing on the second image to obtain a third image; and determining whether the first image is a blurred image according to the gradient information quotient of the third image and the second image.
In a second aspect, an embodiment of the present disclosure provides an image blur detection apparatus, including: a first image acquisition unit configured to acquire a first image to be detected; the primary blurring processing unit is configured to perform first blurring processing on the first image to obtain a second image; the secondary blurring processing unit is configured to perform secondary blurring processing on the second image to obtain a third image; and the blurred image judging unit is configured to determine whether the first image is a blurred image according to the gradient information quotient of the third image and the second image.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of image blur detection as described in any one of the implementations of the first aspect when executed.
In a fourth aspect, the disclosed embodiments provide a non-transitory computer-readable storage medium storing computer instructions for enabling a computer to implement the image blur detection method as described in any implementation manner of the first aspect when executed.
In a fifth aspect, the present disclosure provides a computer program product comprising a computer program, which when executed by a processor is capable of implementing the image blur detection method as described in any implementation manner of the first aspect.
The image blur detection method provided by the embodiment of the disclosure includes: first, acquiring a first image to be detected; then, performing first blurring processing on the first image to obtain a second image; then, performing second blurring processing on the second image to obtain a third image; and finally, determining whether the first image is a blurred image according to the gradient information quotient of the third image and the second image.
Conventionally, whether the first image is a blurred image is determined from the gradient quotient between the second image (obtained by one blurring pass) and the original first image: if the first image is a sharp image, the gradient quotient is relatively small, because the blurring processing changes the gradient information of the first image greatly; otherwise, the gradient quotient does not change much. However, if noise exists in the first image, its presence may mean that the gradient quotient between the second image and the first image no longer has a significant correlation with whether the first image is blurred, i.e., this quotient is no longer suitable for detecting whether a noisy first image is a blurred image.
Aiming at the problem that a noisy first image poses for image blur detection, the present disclosure additionally performs one more blurring pass on the second image (which has already been blurred once), and changes the gradient quotient used for judging whether the first image is blurred to the quotient of the gradient information of the twice-blurred image and the once-blurred image. For a noisy first image, the original gradient quotient loses its correlation with the blur judgment because the noisy regions interfere with the judgment after blurring; after one blurring pass this interference factor is largely eliminated, so the gradient quotient of the twice-blurred and once-blurred images recovers its correlation with whether the image is blurred. The technical scheme provided by the disclosure therefore improves the accuracy of blurred image discrimination.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture to which the present disclosure may be applied;
fig. 2 is a flowchart of an image blur detection method provided in an embodiment of the present disclosure;
fig. 3 is a flowchart of another image blur detection method provided by the embodiment of the present disclosure;
fig. 4 is a schematic flowchart of a method for deblurring a blurred image according to an embodiment of the present disclosure;
FIG. 5 is a flowchart illustrating a method for training an image deblurring model according to an embodiment of the present disclosure;
Fig. 6 is a block diagram of an image blur detection apparatus according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device suitable for executing an image blur detection method according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness. It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict.
In the technical scheme of the disclosure, the acquisition, storage, and application of the personal information of the users involved all comply with the provisions of relevant laws and regulations, necessary security measures are taken, and public order and good customs are not violated.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the image blur detection method, apparatus, electronic device, and computer-readable storage medium of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 and the server 105 may be installed with various applications for implementing information communication between the two, such as an image processing application, a blurred image recognition application, an instant messaging application, and the like.
The terminal apparatuses 101, 102, 103 and the server 105 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices with display screens, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like; when the terminal devices 101, 102, and 103 are software, they may be installed in the electronic devices listed above, and they may be implemented as multiple software or software modules, or may be implemented as a single software or software module, and are not limited in this respect. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or may be implemented as a single server; when the server is software, the server may be implemented as a plurality of software or software modules, or may be implemented as a single software or software module, which is not limited herein.
The server 105 may provide various services through various built-in applications. Taking a blurred image recognition application that provides a blurred image recognition service as an example, the server 105 may achieve the following effects when running the blurred image recognition application: first, acquiring a first image to be detected from the terminal devices 101, 102, and 103 through the network 104; then, performing first blurring processing on the first image to obtain a second image; then, performing second blurring processing on the second image to obtain a third image; and finally, determining whether the first image is a blurred image according to the gradient information quotient of the third image and the second image.
It should be noted that the first image to be detected may be acquired from the terminal devices 101, 102, and 103 through the network 104, or may be stored locally in the server 105 in advance in various ways. Thus, when the server 105 detects that such data is already stored locally (e.g., a pending blurred image recognition task left over before processing starts), it may choose to retrieve the data directly from local storage, in which case the exemplary system architecture 100 may not include the terminal devices 101, 102, 103 and the network 104.
Since blurred image recognition requires considerable computing resources and computing capability, the image blur detection method provided in the following embodiments of the present disclosure is generally executed by the server 105, which has stronger computing capability and more computing resources, and accordingly the image blur detection apparatus is generally disposed in the server 105. However, when the terminal devices 101, 102, and 103 also have computing capabilities and resources that meet the requirements, they may complete, through the blurred image recognition application installed on them, the operations otherwise delegated to the server 105 and output the same result as the server 105. In particular, when multiple types of terminal devices with different computing capabilities exist at the same time, and the blurred image recognition application determines that its terminal device has strong computing capability and ample idle computing resources, the terminal device may perform the above computation to relieve the computing pressure on the server 105; accordingly, the image blur detection apparatus may also be provided in the terminal devices 101, 102, and 103. In such a case, the exemplary system architecture 100 may not include the server 105 and the network 104.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring to fig. 2, fig. 2 is a flowchart of an image blur detection method according to an embodiment of the present disclosure, where the process 200 includes the following steps:
step 201: acquiring a first image to be detected;
this step is intended to acquire, by an execution subject of the image blur detection method (for example, the server 105 shown in fig. 1), a first image that needs to be detected as to whether it is a blurred image.
The first image may be any frame in a video stream, a specially selected image, each image in a set of more than two images, each image uploaded through the image-upload interface of a website or application, or each image captured by a camera application; that is, any image that may need blur detection can serve as the first image.
Step 202: carrying out first fuzzy processing on the first image to obtain a second image;
on the basis of step 201, this step is intended to have the execution subject blur the first image. Owing to the mechanism of blurring processing, the second image obtained after the blurring processing is always a blurred image, regardless of whether the first image itself is a blurred image.
In contrast, if the first image itself is a noise-free sharp or blurred image, whether the first image is sharp or blurred can be deduced in reverse by comparing the degree of blur between the second image and the first image. This is because, if the first image is a noise-free sharp image, the change in blur between the second image and the first image after the blurring processing of this step is large (i.e., the second image is significantly more blurred than the first image), whereas if the first image is a noise-free blurred image, the blurring processing of this step still increases the blur, but only slightly; that is, for an already blurred image, blurring it again does not produce a particularly significant blurring effect.
The above is the conventional implementation for identifying whether the first image is blurred based on the secondary blurring principle, i.e., it uses the correlation between the degree of blur change from the first image to the second image and the blur judgment. However, the applicant has found that if noise exists in the first image, this correlation is lost: the presence of noise blurs the boundary of the judgment, so some first images containing noise yield a correct judgment result while others yield an incorrect one.
It should be noted that in some scenarios, before the first image is blurred, it needs to be resized and converted into a grayscale image in order to better calculate the gradient information.
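Where such preprocessing is performed, it might look like the following minimal sketch; OpenCV and the 640-pixel target width are assumptions made here for illustration and are not prescribed by the disclosure.

```python
import cv2

def preprocess(image_bgr, target_width=640):
    # Resize to a fixed width (keeping aspect ratio) and convert to grayscale,
    # so that gradient statistics are comparable across differently sized inputs.
    h, w = image_bgr.shape[:2]
    scale = target_width / float(w)
    resized = cv2.resize(image_bgr, (target_width, max(1, int(round(h * scale)))))
    return cv2.cvtColor(resized, cv2.COLOR_BGR2GRAY)
```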
Step 203: performing second fuzzy processing on the second image to obtain a third image;
on the basis of step 202, this step is intended to have the above executing body blur the second image (already blurred once) again, so as to obtain a third image that has been blurred twice.
Different blurring modes may be used to blur the image, for example the common mean blurring and Gaussian blurring; other processing modes capable of achieving the same or similar effects may also be used and are not listed here one by one. The blurring method can be selected flexibly according to the requirements on the blurring result in the actual application scenario, which may include: time consumption, degree of blur, performance overhead, memory occupancy, and the like. Since the specific blurring method is chosen freely, the first blurring processing and the second blurring processing may be the same or different.
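As a sketch of the two blurring passes under these assumptions (OpenCV mean and Gaussian blur, with illustrative 5x5 kernels that the disclosure does not prescribe):

```python
import cv2

def blur_once(gray, mode="mean", ksize=5):
    # One blurring pass; the mode and kernel size are free choices.
    if mode == "mean":
        return cv2.blur(gray, (ksize, ksize))             # box / mean blur
    if mode == "gaussian":
        return cv2.GaussianBlur(gray, (ksize, ksize), 0)  # sigma derived from ksize
    raise ValueError(f"unsupported blur mode: {mode}")

# second_image = blur_once(first_gray, "mean")
# third_image  = blur_once(second_image, "gaussian")
```

The two passes need not use the same mode; the pairing is a free design choice, as noted above.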
Step 204: and determining whether the first image is a blurred image or not according to the gradient information quotient of the third image and the second image.
On the basis of step 203, this step is intended to have the above execution subject determine whether the first image is a blurred image according to the gradient information quotient of the third image and the second image. The gradient information of an image represents the difference in color values between different pixels; for example, if two adjacent pixels are black and white respectively (i.e., one pixel has a gray value of 0 and the other 255), an obvious, step-like change in color values occurs, which is what the term gradient vividly refers to. It can therefore be understood that the sharper an image is, the higher its quantized gradient measure, and the more blurred an image is, the lower that measure, because blurring makes the color difference between two originally contrasting pixels less obvious.
The parameter used in this step to determine whether the first image is a blurred image is the gradient information quotient of the third image and the second image, because the quotient represents the change in blur better than the difference of the gradient information. Moreover, compared with the gradient information quotient of the second image and the first image, both the numerator and the denominator of the quotient of the third and second images come from images that have been blurred at least once; unlike the first image, which has not been blurred at all, this can eliminate the influence of noise on the gradient information, since the presence of noise obviously changes the gradient information of the original image.
When judging whether the first image is a blurred image according to the magnitude of the gradient information quotient, the judgment can be made by comparison with a reasonably preset threshold that has discriminating power: different judgment results are obtained depending on whether the quotient is greater than the threshold. In particular, the threshold may also be determined by experienced evaluators annotating whether each of a plurality of sample first images is a blurred or sharp image.
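A minimal sketch of how the decision in process 200 could be composed, assuming the blurring operations and the gradient measure are supplied as callables and the threshold has been calibrated on labeled samples; all names here are illustrative and not taken from the disclosure:

```python
def detect_blur(first_image, blur1, blur2, gradient, threshold):
    """Return True if the first image is judged to be blurred.

    blur1/blur2: blurring callables (e.g. mean and Gaussian blur).
    gradient:    callable returning a scalar gradient measure of an image.
    threshold:   preset blur discrimination threshold with discriminating power.
    """
    second_image = blur1(first_image)        # step 202: first blurring
    third_image = blur2(second_image)        # step 203: second blurring
    quotient = gradient(third_image) / max(gradient(second_image), 1e-12)
    return quotient > threshold              # step 204: quotient-based decision
```

The same structure accommodates any pairing of blurring methods and any gradient operator, as discussed in the following embodiment.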
In the image blur detection method provided by this embodiment of the disclosure, on top of the conventional single blurring of the first image, the once-blurred second image is additionally blurred one more time, and the gradient quotient used for judging whether the image is blurred is computed from the gradient information of the twice-blurred and once-blurred images. For a noisy first image, the original gradient quotient loses its correlation with the blur judgment because the noisy regions interfere with the judgment after blurring; after one additional blurring pass this interference factor is largely eliminated, so the gradient quotient of the twice-blurred and once-blurred images recovers its correlation with whether the image is blurred, and the technical scheme provided by the disclosure improves the accuracy of blurred image discrimination.
Referring to fig. 3, fig. 3 is a flowchart of another image blur detection method according to an embodiment of the present disclosure, where the process 300 includes the following steps:
step 301: acquiring a first image to be detected;
step 302: carrying out mean value fuzzy processing on the first image to obtain a second image;
step 303: performing Gaussian blur processing on the second image to obtain a third image;
in this embodiment, an implementation manner is specifically provided in which a mean fuzzy manner is used when primary fuzzy processing is performed and a gaussian fuzzy manner is used when secondary fuzzy processing is performed, the matching is a fuzzy processing manner matching that has better final determination accuracy on a sample image provided in a test scene through an actual test, and how to select the matching of the fuzzy processing manners (for example, whether to select the same or different fuzzy processing manners, how to arrange the sequence of different fuzzy processing manners, and the like) can be performed according to a common image type in an actual application scene.
Step 304: performing Laplace transform on the second image to obtain first gradient information;
step 305: performing Laplace transformation on the third image to obtain second gradient information;
in this embodiment, an implementation manner of determining gradient information of an image by performing laplacian transform is specifically selected, and besides performing laplacian transform, a horizontal-vertical difference method, a Robert gradient operator, a Sobel operator, a Prewitt operator, and the like may also be used to perform the same or similar functions, which is not listed here.
Step 306: obtaining the mean value of the first gradient information to obtain a first gradient mean value;
step 307: obtaining the mean value of the second gradient information to obtain a second gradient mean value;
on the basis of steps 304-305, steps 306-307 are intended to have the above executing body compute the mean of the first gradient information and of the second gradient information, obtaining the first gradient mean and the second gradient mean respectively, so that working with means simplifies the subsequent calculation.
Step 308: and determining whether the first image is a blurred image according to the quotient of the second gradient mean value and the first gradient mean value.
Specifically, the step can be performed by adopting the following discrimination method:
determining that the first image is a blurred image in response to the quotient of the second gradient mean and the first gradient mean being greater than a preset blur discrimination threshold;
and determining that the first image is a sharp image in response to the quotient of the second gradient mean and the first gradient mean not being greater than the blur discrimination threshold.
The blur discrimination threshold serves as the critical value for determining, from the magnitude of the gradient mean quotient, whether the first image is a blurred image, and is determined according to labeling results of whether sample first images are blurred images.
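Putting steps 301-308 together, the following is a sketch of the complete discrimination: mean blur, then Gaussian blur, Laplacian gradient means, and a quotient compared against a threshold. The kernel sizes are assumptions, and the 0.85 threshold merely mirrors the worked example given later in this disclosure.

```python
import cv2
import numpy as np

def is_blurred(first_gray, threshold=0.85, ksize=5):
    second = cv2.blur(first_gray, (ksize, ksize))              # step 302: mean blur
    third = cv2.GaussianBlur(second, (ksize, ksize), 0)        # step 303: Gaussian blur
    g1 = np.mean(np.abs(cv2.Laplacian(second, cv2.CV_64F)))    # steps 304/306: first gradient mean
    g2 = np.mean(np.abs(cv2.Laplacian(third, cv2.CV_64F)))     # steps 305/307: second gradient mean
    quotient = g2 / (g1 + 1e-12)                               # step 308: gradient mean quotient
    return quotient > threshold, quotient
```

Intuitively, a sharp original loses relatively more gradient energy in the second blurring pass, so its quotient tends to fall below the threshold, while an already blurred original yields a quotient close to 1.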
On the basis of the embodiment of process 200, this embodiment provides, through steps 302-303, a pairing of blurring methods adapted to the actual application scenario; through steps 304-307, a way of computing image gradient information with the Laplacian operator and averaging it as the basis for the subsequent quotient; and, for step 308, a specific implementation for determining whether the first image is a blurred image based on a preset blur discrimination threshold.
It should be noted that there is no causal or dependency relationship between the above preferred implementation parts, and each of them can be combined with process 200 to form a separate independent embodiment.
With any of the above embodiments, it can finally be determined whether the first image is a blurred image. However, considering that merely recognizing a blurred image is often not enough, and that in most scenarios it is necessary to try to correct the interference caused by the blur, this embodiment further provides two deblurring approaches, as shown in fig. 4 and fig. 5, where the process 400 in fig. 4 includes the following steps:
step 401: searching for a desired sharp image having the same color distribution as the first image;
since the present embodiment presupposes that the first image is a blurred image, in that case it can only be attempted to find a corresponding sharp image with the same color distribution, which is referred to in this step as the desired sharp image.
Step 402: deblurring the first image to obtain a deblurred image;
the step aims to have the execution subject perform deblurring processing on the first image in a conventional or general-purpose deblurring manner to obtain a deblurred image. Typically, the deblurring effect of such conventional or general-purpose deblurring methods is limited.
Step 403: and adjusting the processing parameters of the deblurring processing based on the difference between the expected sharp image and the deblurred image until the difference meets the preset requirement.
On the basis of steps 401 and 402, this step is intended to have the above execution subject adjust the processing parameters of the deblurring process based on the difference between the desired sharp image and the deblurred image, until the difference meets a preset requirement. That is, given that the desired sharp image definitely exists, the difference between the desired sharp image and the deblurred image guides the adjustment of the deblurring parameters, so that with the adjusted parameters the blurred first image can be deblurred toward the desired sharp image.
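One way such parameter adjustment could be realized is a simple search over the parameters of a generic deblurring step. The unsharp-mask stand-in, the parameter grid, the mean-absolute-difference metric, and the stopping requirement below are all assumptions for illustration, not the disclosure's own deblurring method.

```python
import cv2
import numpy as np

def unsharp_deblur(blurred, sigma, amount):
    # Generic sharpening used as a stand-in deblurring step.
    smoothed = cv2.GaussianBlur(blurred, (0, 0), sigma)
    return cv2.addWeighted(blurred, 1.0 + amount, smoothed, -amount, 0)

def tune_deblur_params(first_image, desired_sharp, max_diff=5.0):
    # Adjust (sigma, amount) until the difference to the desired sharp image is small enough.
    best = None
    for sigma in (1.0, 2.0, 3.0):
        for amount in (0.5, 1.0, 1.5, 2.0):
            deblurred = unsharp_deblur(first_image, sigma, amount)
            diff = float(np.mean(np.abs(deblurred.astype(np.float64) -
                                        desired_sharp.astype(np.float64))))
            if best is None or diff < best[0]:
                best = (diff, sigma, amount)
            if diff <= max_diff:            # difference meets the preset requirement
                return sigma, amount
    return best[1], best[2]                 # otherwise fall back to the best pair found
```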
Unlike the implementation shown in fig. 4, which directly adjusts the parameters of the deblurring process, fig. 5 attempts to train an effective and usable image deblurring model; the process 500 includes the following steps:
step 501: searching for a desired sharp image having the same color distribution as the first image;
step 502: and training to obtain an image deblurring model by taking the first image as an input sample and the expected clear image as an output sample.
The idea is to search for a large number of input-output sample pairs to use as training samples for a deblurring model, so that the model gradually learns, during training, how to correct a blurred image into a sharp one, and then generalizes the learned correction to new blurred images.
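A minimal training sketch under these assumptions: a tiny residual CNN with an L1 loss in PyTorch. The architecture, loss, and data handling are illustrative choices, not the deblurring model of this disclosure.

```python
import torch
import torch.nn as nn

class TinyDeblurNet(nn.Module):
    # Very small residual CNN: predicts a correction added to the blurred input.
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)

def train_deblur_model(pairs, epochs=10, lr=1e-3):
    # pairs: list of (blurred, desired_sharp) tensors shaped (1, H, W), values in [0, 1].
    model = TinyDeblurNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()
    for _ in range(epochs):
        for blurred, sharp in pairs:
            optimizer.zero_grad()
            loss = loss_fn(model(blurred.unsqueeze(0)), sharp.unsqueeze(0))
            loss.backward()
            optimizer.step()
    return model
```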
In order to deepen understanding, the disclosure also provides a specific implementation scheme by combining a specific application scenario:
assume that the following scenario requirements exist: the user A acquires a group of pictures on the network, the group of pictures record the change process of a role demonstration group of actions, and the user A needs to model a motion model of the actions according to the group of pictures, so that the definition of the pictures needs to be ensured to meet the modeling requirement. To meet the above requirements, the following may be performed:
1) user A uploads the group of pictures to the blurred image recognition application to perform blur detection on each image;
2) the blurred image recognition application blurs each original image twice in sequence, and computes the gradient means X1 and X2 of the once-blurred and twice-blurred images with the Laplacian operator;
3) the blurred image recognition application computes X2/X1 for each original image, with the blur discrimination threshold set to 0.85;
4) the blurred image recognition application finds by comparison that the gradient mean quotients of the 1st and 7th of the 10 pictures are 0.9 and 0.89 respectively; since 0.9 > 0.85 and 0.89 > 0.85, these two pictures are judged to be blurred images and the remaining 8 pictures are sharp images;
5) the blurred image recognition application calls its own deblurring model to deblur pictures 1 and 7 until the gradient mean quotients of the deblurred images fall below 0.85, and then returns the pictures to user A.
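As a usage sketch of the scenario above, reusing the is_blurred helper from the earlier sketch; the file names are hypothetical and the files are assumed to exist.

```python
import cv2

paths = [f"action_{i:02d}.png" for i in range(1, 11)]   # hypothetical file names for the 10 pictures
for path in paths:
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    blurred, quotient = is_blurred(gray, threshold=0.85)
    print(path, f"quotient={quotient:.2f}", "blurred" if blurred else "sharp")
```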
With further reference to fig. 6, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an image blur detection apparatus, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 6, the image blur detection apparatus 600 of the present embodiment may include: a first image acquisition unit 601, a primary blur processing unit 602, a secondary blur processing unit 603, and a blurred image discrimination unit 604. The first image acquiring unit 601 is configured to acquire a first image to be detected; a primary blurring processing unit 602 configured to perform a first blurring process on the first image to obtain a second image; a secondary blurring unit 603 configured to perform a second blurring process on the second image to obtain a third image; and a blurred image judging unit 604 configured to determine whether the first image is a blurred image according to a gradient information quotient of the third image and the second image.
In the present embodiment, in the image blur detection apparatus 600: for the detailed processing of the first image acquisition unit 601, the primary blur processing unit 602, the secondary blur processing unit 603, and the blurred image determination unit 604, and for the technical effects thereof, reference may be made to the related descriptions of steps 201-204 in the embodiment corresponding to fig. 2, which are not repeated here.
In some optional implementations of the present embodiment, the blurred image determination unit 604 may include:
the first transformation subunit is configured to perform Laplace transformation on the second image to obtain first gradient information;
the second transformation subunit is configured to perform laplace transformation on the third image to obtain second gradient information;
the first mean value obtaining subunit is configured to obtain a mean value of the first gradient information to obtain a first gradient mean value;
the second mean value obtaining subunit is configured to obtain a mean value of the second gradient information to obtain a second gradient mean value;
and the blurred image judging subunit is configured to determine whether the first image is a blurred image according to the quotient of the second gradient mean value and the first gradient mean value.
In some optional implementations of the present embodiment, the blurred image discrimination subunit may be further configured to:
determining that the first image is a blurred image in response to the quotient of the second gradient mean and the first gradient mean being greater than a preset blur discrimination threshold; wherein the blur discrimination threshold serves as a critical value for determining, from the magnitude of the gradient mean quotient, whether the first image is a blurred image, and is determined based on labeling results of whether sample first images are blurred images;
and determining that the first image is a sharp image in response to the quotient of the second gradient mean and the first gradient mean not being greater than the blur discrimination threshold.
In some optional implementations of the present embodiment, the primary blur processing unit 602 may be further configured to:
perform mean blurring processing on the first image to obtain a second image;
correspondingly, the secondary blurring processing unit is further configured to:
perform Gaussian blur processing on the second image to obtain a third image.
In some optional implementations of this embodiment, the image blur detection apparatus 600 may further include:
a first desired sharp image searching unit configured to search for a desired sharp image having the same color distribution as the first image in response to the first image being determined as a blurred image;
a deblurring processing unit configured to deblur the first image to obtain a deblurred image;
and a parameter adjusting unit configured to adjust the processing parameters of the deblurring processing, based on the difference between the desired sharp image and the deblurred image, until the difference meets a preset requirement.
In some optional implementations of this embodiment, the image blur detection apparatus 600 may further include:
a second desired sharp image searching unit configured to search for a desired sharp image having the same color distribution as the first image in response to the first image being determined as a blurred image;
and a deblurring model training unit configured to obtain an image deblurring model by training with the first image as an input sample and the desired sharp image as an output sample.
The present embodiment is the apparatus counterpart of the above method embodiment. The image blur detection apparatus provided by this embodiment additionally blurs the once-blurred second image one more time, and computes the gradient quotient used for judging whether the image is blurred from the gradient information of the twice-blurred and once-blurred images. For a noisy first image, the original gradient quotient loses its correlation with the blur judgment because the noisy regions interfere with the judgment after blurring; after one additional blurring pass this interference factor is largely eliminated, so the gradient quotient of the twice-blurred and once-blurred images recovers its correlation with whether the image is blurred, and the technical scheme provided by the disclosure improves the accuracy of blurred image discrimination.
According to an embodiment of the present disclosure, the present disclosure also provides an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of image blur detection described in any of the above embodiments when executed by the at least one processor.
According to an embodiment of the present disclosure, there is also provided a readable storage medium storing computer instructions for enabling a computer to implement the image blur detection method described in any of the above embodiments when executed.
The disclosed embodiments also provide a computer program product comprising a computer program that, when executed by a processor, implements the image blur detection method described in any of the above embodiments.
FIG. 7 illustrates a schematic block diagram of an example electronic device 700 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 7, the device 700 includes a computing unit 701, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 702 or a computer program loaded from a storage unit 708 into a random access memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the device 700 can also be stored. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
A number of components in the device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard or a mouse; an output unit 707 such as various types of displays and speakers; a storage unit 708 such as a magnetic disk or an optical disk; and a communication unit 709 such as a network card, a modem, or a wireless communication transceiver. The communication unit 709 allows the device 700 to exchange information/data with other devices via a computer network such as the Internet and/or various telecommunication networks.
The computing unit 701 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 701 executes the respective methods and processes described above, such as the image blur detection method. For example, in some embodiments, the image blur detection method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the image blur detection method described above may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured to perform the image blur detection method in any other suitable way (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server (also called a cloud computing server or cloud host), a host product in the cloud computing service system that addresses the defects of difficult management and weak service scalability in traditional physical hosts and virtual private server (VPS) services.
According to the technical scheme of the embodiments of the disclosure, on top of the conventional single blurring of the first image, the once-blurred second image is additionally blurred one more time, and the gradient quotient used for judging whether the image is blurred is computed from the gradient information of the twice-blurred and once-blurred images. For a noisy first image, the original gradient quotient loses its correlation with the blur judgment because the noisy regions interfere with the judgment after blurring; after one additional blurring pass this interference factor is largely eliminated, so the gradient quotient of the twice-blurred and once-blurred images recovers its correlation with whether the image is blurred, and the technical scheme provided by the disclosure improves the accuracy of blurred image discrimination.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (15)

1. An image blur detection method, comprising:
acquiring a first image to be detected;
performing first blurring processing on the first image to obtain a second image;
performing second blurring processing on the second image to obtain a third image;
and determining whether the first image is a blurred image or not according to the gradient information quotient of the third image and the second image.
2. The method of claim 1, wherein the determining whether the first image is a blurred image according to a gradient information quotient of the third image and the second image comprises:
performing Laplace transform on the second image to obtain first gradient information;
performing Laplace transform on the third image to obtain second gradient information;
obtaining a mean value of the first gradient information to obtain a first gradient mean value;
obtaining the average value of the second gradient information to obtain a second gradient average value;
and determining whether the first image is a blurred image or not according to the quotient of the second gradient mean value and the first gradient mean value.
3. The method of claim 2, wherein the determining whether the first image is a blurred image according to the quotient of the second gradient mean and the first gradient mean comprises:
determining that the first image is a blurred image in response to the quotient of the second gradient mean and the first gradient mean being greater than a preset blur discrimination threshold; wherein the blur discrimination threshold serves as a critical value for determining, from the magnitude of the gradient mean quotient, whether the first image is a blurred image, and is determined based on a labeling result of whether the first image is a blurred image;
and determining that the first image is a sharp image in response to the quotient of the second gradient mean and the first gradient mean not being greater than the blur discrimination threshold.
4. The method of claim 1, wherein the first blurring the first image to obtain a second image comprises:
performing mean blurring processing on the first image to obtain a second image;
correspondingly, the performing a second blurring process on the second image to obtain a third image includes:
and carrying out Gaussian blur processing on the second image to obtain the third image.
5. The method of any of claims 1-4, further comprising:
searching for a desired sharp image having the same color distribution as the first image in response to the first image being determined to be a blurred image;
deblurring processing is carried out on the first image to obtain a deblurred image;
and adjusting the processing parameters of the deblurring processing, based on the difference between the desired sharp image and the deblurred image, until the difference meets a preset requirement.
6. The method of any of claims 1-4, further comprising:
searching for a desired sharp image having the same color distribution as the first image in response to the first image being determined to be a blurred image;
and training to obtain an image deblurring model by taking the first image as an input sample and the desired sharp image as an output sample.
7. An image blur detection apparatus comprising:
a first image acquisition unit configured to acquire a first image to be detected;
the primary blurring processing unit is configured to perform first blurring processing on the first image to obtain a second image;
the secondary blurring processing unit is configured to perform secondary blurring processing on the second image to obtain a third image;
a blurred image determination unit configured to determine whether the first image is a blurred image according to a gradient information quotient of the third image and the second image.
8. The apparatus according to claim 7, wherein the blurred image discrimination unit includes:
the first transformation subunit is configured to perform laplace transformation on the second image to obtain first gradient information;
the second transformation subunit is configured to perform laplace transformation on the third image to obtain second gradient information;
a first average value obtaining subunit configured to obtain an average value of the first gradient information, so as to obtain a first gradient average value;
a second average value obtaining subunit configured to obtain an average value of the second gradient information to obtain a second gradient average value;
a blurred image determination subunit configured to determine whether the first image is a blurred image according to a quotient of the second gradient mean and the first gradient mean.
9. The apparatus of claim 8, wherein the blurred image discrimination subunit is further configured to:
determining that the first image is a blurred image in response to the quotient of the second gradient mean and the first gradient mean being greater than a preset blur discrimination threshold; wherein the blur discrimination threshold serves as a critical value for determining, from the magnitude of the gradient mean quotient, whether the first image is a blurred image, and is determined based on a labeling result of whether the first image is a blurred image;
and determining that the first image is a sharp image in response to the quotient of the second gradient mean and the first gradient mean not being greater than the blur discrimination threshold.
10. The apparatus of claim 7, wherein the first blurring processing unit is further configured to:
perform mean blur processing on the first image to obtain the second image;
correspondingly, the second blurring processing unit is further configured to:
perform Gaussian blur processing on the second image to obtain the third image.
11. The apparatus of any of claims 7-10, further comprising:
a first desired sharp image searching unit configured to search for a desired sharp image having the same color distribution as the first image in response to the first image being determined as a blurred image;
a deblurring processing unit configured to deblur the first image to obtain a deblurred image;
a parameter adjusting unit configured to adjust a processing parameter of the deblurring process based on a difference between the desired sharp image and the deblurred image until the difference satisfies a preset requirement.
12. The apparatus of any of claims 7-10, further comprising:
a second desired sharp image searching unit configured to search for a desired sharp image having the same color distribution as the first image in response to the first image being determined as a blurred image;
and a deblurring model training unit configured to train an image deblurring model by taking the first image as an input sample and the desired sharp image as an output sample.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image blur detection method of any one of claims 1-6.
14. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the image blur detection method of any one of claims 1-6.
15. A computer program product comprising a computer program which, when executed by a processor, implements the image blur detection method according to any of claims 1-6.
CN202110923111.XA 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product Active CN113628192B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110923111.XA CN113628192B (en) 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110923111.XA CN113628192B (en) 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product

Publications (2)

Publication Number Publication Date
CN113628192A true CN113628192A (en) 2021-11-09
CN113628192B CN113628192B (en) 2023-07-11

Family

ID=78384708

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110923111.XA Active CN113628192B (en) 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN113628192B (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007057808A2 (en) * 2005-11-16 2007-05-24 Koninklijke Philips Electronics N.V. Blur estimation
CN107240078A (en) * 2017-06-06 2017-10-10 广州优创电子有限公司 Lens articulation Method for Checking, device and electronic equipment
CN108109147A (en) * 2018-02-10 2018-06-01 北京航空航天大学 A kind of reference-free quality evaluation method of blurred picture
CN110852997A (en) * 2019-10-24 2020-02-28 普联技术有限公司 Dynamic image definition detection method and device, electronic equipment and storage medium
CN110807745A (en) * 2019-10-25 2020-02-18 北京小米智能科技有限公司 Image processing method and device and electronic equipment
CN112950723A (en) * 2021-03-05 2021-06-11 湖南大学 Robot camera calibration method based on edge scale self-adaptive defocus fuzzy estimation

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051390A (en) * 2022-08-15 2023-05-02 荣耀终端有限公司 Motion blur degree detection method and device
CN116051390B (en) * 2022-08-15 2024-04-09 荣耀终端有限公司 Motion blur degree detection method and device

Also Published As

Publication number Publication date
CN113628192B (en) 2023-07-11

Similar Documents

Publication Publication Date Title
US11222211B2 (en) Method and apparatus for segmenting video object, electronic device, and storage medium
CN108337505B (en) Information acquisition method and device
CN109191512B (en) Binocular image depth estimation method, binocular image depth estimation device, binocular image depth estimation apparatus, program, and medium
CN112949767B (en) Sample image increment, image detection model training and image detection method
CN109284673B (en) Object tracking method and device, electronic equipment and storage medium
EP2660753B1 (en) Image processing method and apparatus
CN113436100B (en) Method, apparatus, device, medium, and article for repairing video
CN113691733B (en) Video jitter detection method and device, electronic equipment and storage medium
CN111340749B (en) Image quality detection method, device, equipment and storage medium
CN112714309A (en) Video quality evaluation method, device, apparatus, medium, and program product
CN112995535B (en) Method, apparatus, device and storage medium for processing video
CN114511041B (en) Model training method, image processing method, device, equipment and storage medium
CN113643260A (en) Method, apparatus, device, medium and product for detecting image quality
CN112732553A (en) Image testing method and device, electronic equipment and storage medium
CN108509876B (en) Object detection method, device, apparatus, storage medium, and program for video
CN113628192B (en) Image blur detection method, apparatus, device, storage medium, and program product
CN116703925B (en) Bearing defect detection method and device, electronic equipment and storage medium
CN114724144B (en) Text recognition method, training device, training equipment and training medium for model
CN115861077A (en) Panoramic image determination method, device, equipment and storage medium
CN113643257B (en) Image noise detection method, device, equipment, storage medium and program product
CN114821596A (en) Text recognition method and device, electronic equipment and medium
CN115205163A (en) Method, device and equipment for processing identification image and storage medium
CN110211085B (en) Image fusion quality evaluation method and system
CN108447107B (en) Method and apparatus for generating video
CN113409199A (en) Image processing method, image processing device, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant