CN113628192B - Image blur detection method, apparatus, device, storage medium, and program product - Google Patents

Image blur detection method, apparatus, device, storage medium, and program product Download PDF

Info

Publication number
CN113628192B
CN113628192B (application CN202110923111.XA)
Authority
CN
China
Prior art keywords
image
gradient
blurring
blurred
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110923111.XA
Other languages
Chinese (zh)
Other versions
CN113628192A (en)
Inventor
王向阳
邢怀飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110923111.XA priority Critical patent/CN113628192B/en
Publication of CN113628192A publication Critical patent/CN113628192A/en
Application granted granted Critical
Publication of CN113628192B publication Critical patent/CN113628192B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06T5/73
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning

Abstract

The disclosure provides an image blur detection method, an image blur detection apparatus, an electronic device, a computer-readable storage medium, and a computer program product, relating to artificial intelligence fields such as image recognition and cloud services. One embodiment of the method comprises: acquiring a first image to be detected; performing a first blurring process on the first image to obtain a second image; performing a second blurring process on the second image to obtain a third image; and determining whether the first image is a blurred image according to the gradient information quotient of the third image and the second image. By deciding on the gradient quotient between the twice-blurred third image and the once-blurred second image, this embodiment keeps the computed gradient clearly correlated with the detection conclusion regardless of whether the first image contains noise points (noise) that would otherwise interfere with the decision, thereby improving the accuracy of the detection result.

Description

Image blur detection method, apparatus, device, storage medium, and program product
Technical Field
The disclosure relates to the technical field of image processing, in particular to artificial intelligence fields such as image recognition and cloud services, and more particularly to an image blur detection method, an image blur detection apparatus, an electronic device, a computer-readable storage medium, and a computer program product.
Background
In scenarios such as video playback, conferencing, live streaming, and surveillance, blurred pictures can occur, reducing the quality of the video or images and degrading the viewing experience.
Therefore, quickly and accurately detecting blurred images is the basis for improving the viewing experience.
Disclosure of Invention
The embodiment of the disclosure provides an image blurring detection method, an image blurring detection device, electronic equipment, a computer readable storage medium and a computer program product.
In a first aspect, an embodiment of the present disclosure provides an image blur detection method, including: acquiring a first image to be detected; performing first blurring processing on the first image to obtain a second image; performing second blurring processing on the second image to obtain a third image; and determining whether the first image is a blurred image according to the gradient information quotient of the third image and the second image.
In a second aspect, an embodiment of the present disclosure proposes an image blur detection apparatus including: a first image acquisition unit configured to acquire a first image to be detected; the primary blurring processing unit is configured to perform first blurring processing on the first image to obtain a second image; the secondary blurring processing unit is configured to perform second blurring processing on the second image to obtain a third image; and a blurred image discrimination unit configured to determine whether the first image is a blurred image or not, based on a gradient information quotient of the third image and the second image.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to implement the image blur detection method as described in any one of the implementations of the first aspect when executed.
In a fourth aspect, embodiments of the present disclosure provide a non-transitory computer-readable storage medium storing computer instructions for enabling a computer to implement an image blur detection method as described in any one of the implementations of the first aspect when executed.
In a fifth aspect, embodiments of the present disclosure provide a computer program product comprising a computer program which, when executed by a processor, is capable of implementing an image blur detection method as described in any one of the implementations of the first aspect.
The image blur detection method provided by the embodiments of the disclosure first acquires a first image to be detected; then performs a first blurring process on the first image to obtain a second image; next performs a second blurring process on the second image to obtain a third image; and finally determines whether the first image is a blurred image according to the gradient information quotient of the third image and the second image.
Conventionally, whether the first image is a blurred image is determined from the gradient quotient between the once-blurred second image and the original first image: if the first image is a sharp image, the blurring process greatly changes its gradient information and the quotient is small; otherwise the gradient information changes little. However, if noise points (also called noise) exist in the first image, the gradient quotient between the second image and the first image is no longer clearly correlated with whether the first image is blurred; that is, this approach is no longer suitable for detecting whether a first image containing noise is a blurred image.
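For reference, a minimal sketch of this conventional single-blur check (assuming OpenCV in Python; the function name and the 0.8 threshold are illustrative assumptions, not values from the disclosure):

import cv2
import numpy as np

def conventional_blur_check(image_bgr, threshold=0.8):
    # Single blurring pass of the original image (the conventional scheme).
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Gradient strength of the original image and of the once-blurred image.
    grad_original = np.abs(cv2.Laplacian(gray, cv2.CV_64F)).mean()
    grad_blurred = np.abs(cv2.Laplacian(blurred, cv2.CV_64F)).mean()

    # A sharp original loses much gradient energy (small quotient); an already
    # blurred original changes little (quotient close to 1).
    quotient = grad_blurred / (grad_original + 1e-9)
    return quotient > threshold  # True -> judged as blurred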
To address this problem of blur detection for a first image containing noise, the method of the disclosure, on top of the conventional single blurring of the first image, additionally blurs the once-blurred second image a second time, and changes the gradient quotient used for the decision to the quotient of the gradient information of the twice-blurred and once-blurred images. With a noisy first image, the original gradient quotient loses its correlation with the blur decision because the noisy regions, once blurred, interfere with the decision; after the first blurring pass this interference is largely removed, so the gradient quotient of the twice-blurred and once-blurred images regains, as far as possible, its correlation with whether the first image is blurred. The technical solution provided by the disclosure thereby improves the accuracy of blurred-image detection.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
Other features, objects and advantages of the present disclosure will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings:
FIG. 1 is an exemplary system architecture in which the present disclosure may be applied;
fig. 2 is a flowchart of an image blur detection method according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of another image blur detection method according to an embodiment of the present disclosure;
FIG. 4 is a flow chart of a method for deblurring a blurred image according to an embodiment of the present disclosure;
FIG. 5 is a flowchart of a method for training an image deblurring model according to an embodiment of the present disclosure;
Fig. 6 is a block diagram of an image blur detection device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device adapted to perform an image blur detection method according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness. It should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other.
In the technical solution of the disclosure, the collection, storage, and use of the personal information of users involved comply with relevant laws and regulations, necessary security measures are taken, and public order and good morals are not violated.
FIG. 1 illustrates an exemplary system architecture 100 to which embodiments of the image blur detection methods, apparatus, electronic devices, and computer readable storage media of the present disclosure may be applied.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various applications for implementing information communication between the terminal devices 101, 102, 103 and the server 105, such as an image processing type application, a blurred image recognition type application, an instant messaging type application, and the like, may be installed on the terminal devices.
The terminal devices 101, 102, 103 and the server 105 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices with display screens, including but not limited to smartphones, tablets, laptop and desktop computers, etc.; when the terminal devices 101, 102, 103 are software, they may be installed in the above-listed electronic devices, which may be implemented as a plurality of software or software modules, or may be implemented as a single software or software module, which is not particularly limited herein. When the server 105 is hardware, it may be implemented as a distributed server cluster formed by a plurality of servers, or may be implemented as a single server; when the server is software, the server may be implemented as a plurality of software or software modules, or may be implemented as a single software or software module, which is not particularly limited herein.
The server 105 can provide various services through various built-in applications. Taking a blurred-image recognition application that provides a blurred-image detection service as an example, the server 105 can achieve the following effects when running that application: first, a first image to be detected is acquired from the terminal devices 101, 102, 103 through the network 104; then, a first blurring process is performed on the first image to obtain a second image; next, a second blurring process is performed on the second image to obtain a third image; and finally, whether the first image is a blurred image is determined according to the gradient information quotient of the third image and the second image.
It should be noted that, in addition to being acquired from the terminal devices 101, 102, 103 through the network 104, the first image to be detected may also be stored in the server 105 in advance in various ways. Thus, when the server 105 detects that such data is already stored locally (for example, a pending blurred-image detection task left over until processing starts), it may choose to obtain the data directly from local storage, in which case the exemplary system architecture 100 may omit the terminal devices 101, 102, 103 and the network 104.
Since identifying blurred images requires considerable computing resources and computing power, the image blur detection method provided in the following embodiments of the present disclosure is generally executed by the server 105, which has stronger computing power and more computing resources, and accordingly the image blur detection apparatus is also generally disposed in the server 105. However, when the terminal devices 101, 102, 103 also have the required computing capability and resources, they may perform the operations otherwise assigned to the server 105 through the blurred-image recognition application installed on them, and output the same results as the server 105. In particular, when multiple terminal devices with different computing capabilities exist at the same time, the blurred-image recognition application may allow a terminal device judged to have stronger computing capability and more idle computing resources to perform the above computation, which appropriately relieves the computing pressure on the server 105; accordingly, the image blur detection apparatus may also be disposed in the terminal devices 101, 102, 103. In this case, the exemplary system architecture 100 may omit the server 105 and the network 104.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring to fig. 2, fig. 2 is a flowchart of an image blur detection method according to an embodiment of the disclosure, wherein the flowchart 200 includes the following steps:
step 201: acquiring a first image to be detected;
This step aims at having the execution subject of the image blur detection method (for example, the server 105 shown in fig. 1) acquire a first image that needs to be detected as to whether it is a blurred image.
The first image may be any frame in a video stream, a specially selected image, each image in a set of more than two images, each image uploaded through the image upload interface of a website or application, or each image captured by a camera application; in other words, any image that may need blur detection can serve as the first image.
Step 202: performing first blurring processing on the first image to obtain a second image;
On the basis of step 201, this step aims at having the execution subject blur the acquired first image. From the mechanism of the blurring process it follows that the second image obtained after blurring is a blurred image, regardless of whether the first image itself is blurred.
Provided the first image contains no noise, whether it is a sharp image or a blurred image can be deduced in reverse by comparing the change in blurriness between the second image and the first image. If the first image is a sharp image without noise, the blurring in this step changes the blurriness considerably (the second image is noticeably more blurred than the first image); if the first image is a blurred image without noise, the blurriness still increases after this step, but only slightly, because blurring an already blurred image again produces no particularly obvious additional blurring effect.
The above is the conventional re-blurring way of identifying whether the first image is blurred, relying on the correlation between the change in blurriness from the first image to the second image and whether the first image is blurred. The applicant found, however, that this correlation is lost when noise exists in the first image: the presence of noise makes the decision boundary unclear, so some first images containing noise lead to a correct conclusion while others lead to a wrong one.
It should be noted that, in some scenarios, before the blurring process is performed, the first image also needs to be resized and converted into a grayscale image so that gradient information can be computed more reliably.
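A possible preprocessing sketch, assuming OpenCV; the 256x256 target size is an illustrative assumption:

import cv2

def preprocess(image_bgr, size=(256, 256)):
    # Resize first, then convert to a single-channel grayscale image.
    resized = cv2.resize(image_bgr, size, interpolation=cv2.INTER_AREA)
    return cv2.cvtColor(resized, cv2.COLOR_BGR2GRAY)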
Step 203: performing second blurring processing on the second image to obtain a third image;
On the basis of step 202, this step aims at having the execution subject blur the once-blurred second image again, obtaining a third image that has been blurred twice.
There are many ways to blur an image, such as the common mean blur and Gaussian blur, and other processing methods with the same or similar effect may also be used; they are not listed exhaustively here. The blurring method can be chosen flexibly according to the requirements of the actual application scenario, which may include time consumption, degree of blurring, performance cost, memory usage, and so on. The first blurring process and the second blurring process may each use any of these methods, and the two may be the same or different.
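For illustration, the two commonly mentioned blurring options might look as follows in OpenCV; the 5x5 kernel size is an assumption, and either function can serve as the first or the second blurring process:

import cv2

def mean_blur(gray, ksize=5):
    # Box (mean) filter: each pixel becomes the average of its neighborhood.
    return cv2.blur(gray, (ksize, ksize))

def gaussian_blur(gray, ksize=5, sigma=0):
    # Gaussian filter; sigma=0 lets OpenCV derive sigma from the kernel size.
    return cv2.GaussianBlur(gray, (ksize, ksize), sigma)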
Step 204: and determining whether the first image is a blurred image according to the gradient information quotient of the third image and the second image.
On the basis of step 203, this step aims at having the execution subject determine whether the first image is a blurred image according to the gradient information quotient of the third image and the second image. The gradient information of an image characterizes the differences in pixel values between neighboring pixels; for example, if two adjacent pixels are black and white respectively (one gray value is 0 and the other is 255), the pixel value changes abruptly, like a step, which is why this information is referred to as gradient information. It follows that a sharper image generally has larger quantized gradient values, and a more blurred image generally has smaller ones, since blurring makes the pixel-value differences that were pronounced in the sharp image far less pronounced.
This step uses the gradient information quotient of the third image and the second image to decide whether the first image is blurred; compared with a gradient information difference, the quotient better reflects the change in blurriness. Moreover, compared with the quotient between the second image and the first image, both the numerator and the denominator of the quotient between the third image and the second image come from images that have been blurred at least once. Unlike the first image, which has not been blurred at all, these images are far less affected by noise, whose presence would otherwise noticeably distort the gradient information of the original image.
When judging whether the first image is a blurred image from the magnitude of the gradient information quotient, the quotient can be compared against a threshold that is set in advance and has sufficient discriminative power: different conclusions are drawn depending on whether the quotient is greater than the threshold or not. In particular, the threshold may be determined from annotations, made by experienced raters, of whether a number of sample first images are blurred or sharp.
The image blur detection method provided by this embodiment of the disclosure, on top of the conventional single blurring of the first image, additionally blurs the once-blurred second image a second time, and changes the gradient quotient used for the blur decision to the quotient of the gradient information of the twice-blurred and once-blurred images. With a noisy first image, the original gradient quotient loses its correlation with the blur decision because the noisy regions, once blurred, interfere with the decision; after the first blurring pass this interference is largely removed, so the gradient quotient of the twice-blurred and once-blurred images regains its correlation with whether the first image is blurred, thereby improving the accuracy of blurred-image detection.
Referring to fig. 3, fig. 3 is a flowchart of another image blur detection method according to an embodiment of the disclosure, wherein the flowchart 300 includes the following steps:
step 301: acquiring a first image to be detected;
step 302: performing mean value blurring processing on the first image to obtain a second image;
step 303: carrying out Gaussian blur processing on the second image to obtain a third image;
This embodiment specifically uses mean blurring for the first blurring process and Gaussian blurring for the second blurring process. In actual tests on the sample images of a test scenario, this combination showed good final detection accuracy. How to combine the blurring methods (for example, whether to use the same or different methods, and in which order to apply different ones) can be chosen according to the image types common in the actual application scenario.
Step 304: carrying out Laplace transformation on the second image to obtain first gradient information;
step 305: carrying out Laplacian transformation on the third image to obtain second gradient information;
This embodiment specifically uses the Laplace transformation (the Laplacian operator) to compute the gradient information of the images. Besides the Laplacian, a horizontal-vertical difference method, the Roberts gradient operator, the Sobel operator, the Prewitt operator, and the like can serve the same or a similar purpose; they are not listed exhaustively here.
Step 306: obtaining the average value of the first gradient information to obtain a first gradient average value;
step 307: obtaining the average value of the second gradient information to obtain a second gradient average value;
On the basis of steps 304-305, steps 306-307 aim at having the execution subject average the first gradient information and the second gradient information respectively, obtaining a first gradient mean and a second gradient mean; taking the mean simplifies the subsequent calculation.
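A minimal sketch of steps 304-307 using OpenCV's Laplacian; the Sobel-based variant is included only to illustrate the alternative operators mentioned above:

import cv2
import numpy as np

def gradient_mean_laplacian(gray):
    # Steps 304-307: Laplacian response, then its mean absolute value.
    return np.abs(cv2.Laplacian(gray, cv2.CV_64F)).mean()

def gradient_mean_sobel(gray):
    # Alternative gradient measure built from horizontal and vertical Sobel responses.
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    return np.abs(gx).mean() + np.abs(gy).mean()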
Step 308: and determining whether the first image is a blurred image according to the quotient of the second gradient mean value and the first gradient mean value.
Specifically, the step can be performed by the following determination method:
determining that the first image is a blurred image in response to the quotient of the second gradient mean value and the first gradient mean value being greater than a preset blur discrimination threshold;
and determining that the first image is a clear image in response to the quotient of the second gradient mean and the first gradient mean being not greater than the blur discrimination threshold.
The blur discrimination threshold serves as the critical value for judging, from the magnitude of the gradient mean quotient, whether the first image is a blurred image; it is determined based on annotation results indicating whether sample first images are blurred.
Building on the embodiment of flow 200, this embodiment provides, through steps 302-303, a combination of blurring methods suited to the actual application scenario; provides, through steps 304-307, a way of computing the gradient information of the images with the Laplacian operator and taking its mean as the basis of the subsequent quotient; and, for step 308, provides a concrete way of determining whether the first image is a blurred image based on a preset blur discrimination threshold.
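Putting flow 300 together, a minimal end-to-end sketch might look as follows; the 5x5 kernels and the default 0.85 threshold (taken from the worked example later in this description) are illustrative assumptions rather than values fixed by the disclosure:

import cv2
import numpy as np

def is_blurred(image_bgr, blur_threshold=0.85, ksize=5):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)        # step 301 (plus grayscale conversion)

    second = cv2.blur(gray, (ksize, ksize))                   # step 302: mean blur
    third = cv2.GaussianBlur(second, (ksize, ksize), 0)       # step 303: Gaussian blur

    first_grad_mean = np.abs(cv2.Laplacian(second, cv2.CV_64F)).mean()   # steps 304, 306
    second_grad_mean = np.abs(cv2.Laplacian(third, cv2.CV_64F)).mean()   # steps 305, 307

    quotient = second_grad_mean / (first_grad_mean + 1e-9)    # step 308
    return quotient > blur_threshold                          # True -> blurred image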
It should be noted that the preferred implementations above have no causal or dependency relationship with one another; each can be combined with flow 200 to form a separate independent embodiment, and the present embodiment is merely one preferred embodiment that contains all of them at the same time.
Any of the above embodiments can finally determine whether the first image is a blurred image. However, merely identifying a blurred image is usually not enough: in most scenarios the interference caused by its presence should also be corrected as far as possible. This embodiment therefore further provides two deblurring approaches, shown in fig. 4 and fig. 5, where the flow 400 in fig. 4 includes the following steps:
Step 401: searching for a desired sharp image having the same color distribution as the first image;
Since this embodiment assumes that the first image has been determined to be a blurred image, a corresponding sharp image can only be sought according to the same color distribution; this step refers to that image as the desired sharp image.
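One possible way to search by color distribution is to compare color histograms of the blurred image against a pool of candidate sharp images; the histogram binning and the correlation measure below are illustrative assumptions:

import cv2

def find_desired_sharp_image(blurred_bgr, candidate_sharp_images):
    def color_hist(img):
        # 8x8x8-bin BGR histogram, normalized so images of different sizes compare fairly.
        hist = cv2.calcHist([img], [0, 1, 2], None, [8, 8, 8],
                            [0, 256, 0, 256, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    target = color_hist(blurred_bgr)
    # Keep the candidate whose color distribution correlates best with the blurred image.
    scored = [(cv2.compareHist(target, color_hist(c), cv2.HISTCMP_CORREL), c)
              for c in candidate_sharp_images]
    return max(scored, key=lambda pair: pair[0])[1]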
Step 402: deblurring the first image to obtain a deblurred image;
This step aims at having the execution subject deblur the first image with a conventional or general-purpose deblurring method to obtain a deblurred image. Typically, such conventional or common deblurring methods have only a limited deblurring effect.
Step 403: and adjusting processing parameters of the deblurring processing based on the difference between the expected clear image and the deblurring image until the difference meets the preset requirement.
On the basis of steps 401 and 402, this step aims at having the execution subject adjust the processing parameters of the deblurring process based on the difference between the desired sharp image and the deblurred image until the difference meets a preset requirement. In other words, with the desired sharp image available as a reference, the difference guides the adjustment of the deblurring parameters, so that after adjustment the parameters can be expected to deblur the blurred first image into the desired sharp image.
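The disclosure does not fix which deblurring operation or difference measure is used; the sketch below assumes unsharp masking as the deblurring step, its strength as the tunable processing parameter, and the mean absolute pixel difference to the desired sharp image as the difference criterion:

import cv2
import numpy as np

def tune_deblur_strength(blurred_gray, desired_sharp_gray,
                         strengths=np.linspace(0.5, 3.0, 11),
                         max_diff=5.0):
    best_strength, best_diff = None, float("inf")
    for amount in strengths:
        # Unsharp masking: subtract a smoothed copy to boost edges.
        smoothed = cv2.GaussianBlur(blurred_gray, (5, 5), 0)
        deblurred = cv2.addWeighted(blurred_gray, 1.0 + amount,
                                    smoothed, -amount, 0)
        diff = np.abs(deblurred.astype(np.float64)
                      - desired_sharp_gray.astype(np.float64)).mean()
        if diff < best_diff:
            best_strength, best_diff = amount, diff
        if diff <= max_diff:            # preset requirement on the difference is met
            break
    return best_strength, best_diff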
Unlike the implementation in fig. 4, which directly adjusts the parameters of the deblurring process, fig. 5 illustrates a model-based approach; the flow 500 includes the following steps:
step 501: searching for a desired sharp image having the same color distribution as the first image;
step 502: and training to obtain an image deblurring model by taking the first image as an input sample and the expected clear image as an output sample.
By collecting a large number of such input-output sample pairs as training samples, the image deblurring model gradually learns during training how to correct a blurred image into a sharp image, and the learned correction generalizes to new blurred images.
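The disclosure does not specify a model architecture or training framework; as one hedged illustration, a small convolutional network trained with an MSE loss on (blurred input, desired sharp output) pairs could look like this in PyTorch, with all names and sizes being assumptions:

import torch
import torch.nn as nn

class TinyDeblurNet(nn.Module):
    # Illustrative stand-in for the image deblurring model (single-channel images).
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)

def train_step(model, optimizer, blurred_batch, sharp_batch):
    # One update on a batch of (blurred input, desired sharp output) sample pairs.
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(blurred_batch), sharp_batch)
    loss.backward()
    optimizer.step()
    return loss.item()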
To deepen understanding, the disclosure further provides a specific implementation scheme in combination with a concrete application scenario:
Assume the following scenario: user A obtains a group of pictures from the network that record how a person moves through a set of actions, and user A needs to build a motion model of those actions from the pictures, so the sharpness of the pictures must meet the modeling requirements. These requirements can be met as follows:
1) User A uploads the group of pictures to a blurred-image recognition application, which performs blur detection on each picture;
2) The blurred-image recognition application applies two successive blurring processes to each original picture, and computes the gradient means X1 and X2 of the once-blurred and twice-blurred pictures with the Laplacian operator;
3) The blurred-image recognition application calculates X2/X1 for each original picture, with the blur discrimination threshold set to 0.85;
4) The blurred-image recognition application finds that the gradient mean quotients of the 1st and 7th pictures in the group of 10 are 0.9 and 0.89 respectively; since 0.9 > 0.85 and 0.89 > 0.85, these two pictures are judged to be blurred images and the remaining 8 are sharp images;
5) The blurred-image recognition application calls a pre-built deblurring model to deblur the 1st and 7th pictures until the new gradient mean quotients of the deblurred pictures are no longer greater than 0.85, and then returns the pictures to user A.
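A hypothetical driver for steps 1)-5), reusing the is_blurred sketch given earlier for flow 300; pictures is assumed to be the list of the 10 uploaded images and deblur_model stands in for the pre-built deblurring model (all of these names are assumptions, not from the disclosure):

BLUR_THRESHOLD = 0.85
MAX_ROUNDS = 5  # illustrative cap so the loop always terminates

for index, picture in enumerate(pictures):
    if not is_blurred(picture, blur_threshold=BLUR_THRESHOLD):
        continue                      # e.g. 8 of the 10 pictures pass directly
    restored = picture                # e.g. the 1st and 7th pictures in the example
    for _ in range(MAX_ROUNDS):
        restored = deblur_model(restored)
        if not is_blurred(restored, blur_threshold=BLUR_THRESHOLD):
            break                     # quotient no longer greater than 0.85
    pictures[index] = restored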
With further reference to fig. 6, as an implementation of the method shown in the foregoing figures, the present disclosure provides an embodiment of an image blur detection apparatus, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable to various electronic devices.
As shown in fig. 6, the image blur detection apparatus 600 of the present embodiment may include: a first image acquisition unit 601, a primary blurring processing unit 602, a secondary blurring processing unit 603, and a blurred image discrimination unit 604. Wherein, the first image acquisition unit 601 is configured to acquire a first image to be detected; a primary blurring processing unit 602 configured to perform a first blurring process on the first image to obtain a second image; a secondary blurring processing unit 603 configured to perform a second blurring process on the second image to obtain a third image; the blurred image discrimination unit 604 is configured to determine whether the first image is a blurred image according to a gradient information quotient of the third image and the second image.
In the present embodiment, in the image blur detection apparatus 600: the specific processing of the first image obtaining unit 601, the primary blurring processing unit 602, the secondary blurring processing unit 603, and the blurred image determining unit 604 and the technical effects thereof may refer to the relevant descriptions of steps 201 to 204 in the corresponding embodiment of fig. 2, and are not repeated herein.
In some optional implementations of the present embodiment, the blurred image discrimination unit 604 may include:
The first transformation subunit is configured to perform Laplace transformation on the second image to obtain first gradient information;
the second transformation subunit is configured to perform laplace transformation on the third image to obtain second gradient information;
the first average value obtaining subunit is configured to obtain an average value of the first gradient information and obtain a first gradient average value;
the second average value obtaining subunit is configured to obtain an average value of the second gradient information and obtain a second gradient average value;
and a blurred image discrimination subunit configured to determine whether the first image is a blurred image according to a quotient of the second gradient average value and the first gradient average value.
In some optional implementations of the present embodiment, the blurred image discrimination subunit may be further configured to:
determining that the first image is a blurred image in response to the quotient of the second gradient mean value and the first gradient mean value being greater than a preset blur discrimination threshold; the fuzzy judgment threshold is used as a critical value for measuring whether the first image is a fuzzy image or not through the magnitude of the gradient mean quotient, and the fuzzy judgment threshold is determined and obtained based on a labeling result of whether the first image is the fuzzy image or not;
and determining that the first image is a clear image in response to the quotient of the second gradient mean and the first gradient mean being not greater than the blur discrimination threshold.
In some optional implementations of the present embodiment, the primary blurring processing unit 602 may be further configured to:
performing mean value blurring processing on the first image to obtain a second image;
correspondingly, the secondary blur processing unit is further configured to:
and carrying out Gaussian blur processing on the second image to obtain a third image.
In some optional implementations of the present embodiment, the image blur detection apparatus 600 may further include:
a first desired clear image search unit configured to search for a desired clear image having the same color distribution as the first image in response to the first image being determined as a blurred image;
the deblurring processing unit is configured to deblur the first image to obtain a deblurred image;
and a parameter adjustment unit configured to adjust processing parameters of the deblurring process based on a difference between the desired clear image and the deblurring image until the difference satisfies a preset requirement.
In some optional implementations of the present embodiment, the image blur detection apparatus 600 may further include:
a second desired clear image search unit configured to search for a desired clear image having the same color distribution as the first image in response to the first image being determined as a blurred image;
And the deblurring image training unit is configured to train to obtain an image deblurring model by taking the first image as an input sample and the expected clear image as an output sample.
This embodiment exists as the apparatus embodiment corresponding to the method embodiment above. The image blur detection apparatus provided by this embodiment, on top of the conventional single blurring of the first image, additionally blurs the once-blurred second image a second time, and changes the gradient quotient used for the blur decision to the quotient of the gradient information of the twice-blurred and once-blurred images. With a noisy first image, the original gradient quotient loses its correlation with the blur decision because the noisy regions, once blurred, interfere with the decision; after the first blurring pass this interference is largely removed, so the gradient quotient of the twice-blurred and once-blurred images regains its correlation with whether the first image is blurred, thereby improving the accuracy of blurred-image detection.
According to an embodiment of the present disclosure, the present disclosure further provides an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to implement the image blur detection method described in any of the above embodiments when executed.
According to an embodiment of the present disclosure, there is also provided a readable storage medium storing computer instructions for enabling a computer to implement the image blur detection method described in any of the above embodiments when executed.
The disclosed embodiments provide a computer program product which, when executed by a processor, enables the image blur detection method described in any of the above embodiments.
Fig. 7 illustrates a schematic block diagram of an example electronic device 700 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 7, the device 700 includes a computing unit 701 that can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 702 or loaded from a storage unit 708 into a random access memory (RAM) 703. The RAM 703 may also store various programs and data required for the operation of the device 700. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Various components in the device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard or mouse; an output unit 707 such as various types of displays and speakers; a storage unit 708 such as a magnetic disk or optical disk; and a communication unit 709 such as a network card, modem, or wireless communication transceiver. The communication unit 709 allows the device 700 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 701 may be any of various general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any appropriate processor, controller, microcontroller, and so on. The computing unit 701 performs the respective methods and processes described above, such as the image blur detection method. For example, in some embodiments, the image blur detection method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the image blur detection method described above may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured to perform the image blur detection method by any other suitable means (for example, by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, a host product in the cloud computing service system that overcomes the drawbacks of difficult management and weak service scalability found in traditional physical host and virtual private server (VPS) services.
According to the technical solution of the embodiments of the disclosure, on top of the conventional single blurring of the first image, the once-blurred second image is additionally blurred a second time, and the gradient quotient used for the blur decision is changed to the quotient of the gradient information of the twice-blurred and once-blurred images. With a noisy first image, the original gradient quotient loses its correlation with the blur decision because the noisy regions, once blurred, interfere with the decision; after the first blurring pass this interference is largely removed, so the gradient quotient of the twice-blurred and once-blurred images regains its correlation with whether the first image is blurred, thereby improving the accuracy of blurred-image detection by means of the technical solution provided by the disclosure.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (12)

1. An image blur detection method, comprising:
acquiring a first image to be detected;
performing first blurring processing on the first image to obtain a second image;
performing second blurring processing on the second image to obtain a third image;
performing Laplace transformation on the second image to obtain first gradient information, and performing Laplace transformation on the third image to obtain second gradient information;
calculating the average value of the first gradient information to obtain a first gradient average value, and calculating the average value of the second gradient information to obtain a second gradient average value;
and determining whether the first image is a blurred image according to the quotient of the second gradient mean value and the first gradient mean value.
2. The method of claim 1, wherein the determining whether the first image is a blurred image based on a quotient of the second gradient mean and the first gradient mean comprises:
Determining that the first image is a blurred image in response to the quotient of the second gradient mean value and the first gradient mean value being greater than a preset blur discrimination threshold; the blur discrimination threshold serves as a critical value for judging, from the magnitude of the gradient mean quotient, whether the first image is a blurred image, and is determined based on an annotation result of whether the first image is a blurred image;
and determining that the first image is a clear image in response to the quotient of the second gradient mean and the first gradient mean being not greater than the blur discrimination threshold.
3. The method of claim 1, wherein the performing the first blurring process on the first image to obtain a second image comprises:
performing mean blurring processing on the first image to obtain the second image;
correspondingly, the performing the second blurring process on the second image to obtain a third image includes:
and carrying out Gaussian blur processing on the second image to obtain the third image.
4. A method according to any one of claims 1-3, further comprising:
searching for a desired sharp image having the same color distribution as the first image in response to the first image being determined to be a blurred image;
Deblurring the first image to obtain a deblurred image;
and adjusting processing parameters of the deblurring processing based on the difference between the expected clear image and the deblurring image until the difference meets a preset requirement.
5. A method according to any one of claims 1-3, further comprising:
searching for a desired sharp image having the same color distribution as the first image in response to the first image being determined to be a blurred image;
and training to obtain an image deblurring model by taking the first image as an input sample and the expected clear image as an output sample.
6. An image blur detection apparatus comprising:
a first image acquisition unit configured to acquire a first image to be detected;
a primary blurring processing unit configured to perform a first blurring process on the first image to obtain a second image;
the secondary blurring processing unit is configured to perform second blurring processing on the second image to obtain a third image;
the fuzzy image judging unit is configured to perform Laplace transformation on the second image to obtain first gradient information, and perform Laplace transformation on the third image to obtain second gradient information; calculating the average value of the first gradient information to obtain a first gradient average value, and calculating the average value of the second gradient information to obtain a second gradient average value; and determining whether the first image is a blurred image according to the quotient of the second gradient mean value and the first gradient mean value.
7. The apparatus of claim 6, wherein the blurred image discrimination unit is further configured to:
determining that the first image is a blurred image in response to the quotient of the second gradient mean value and the first gradient mean value being greater than a preset blur discrimination threshold; the blur discrimination threshold serves as a critical value for judging, from the magnitude of the gradient mean quotient, whether the first image is a blurred image, and is determined based on an annotation result of whether the first image is a blurred image;
and determining that the first image is a clear image in response to the quotient of the second gradient mean and the first gradient mean being not greater than the blur discrimination threshold.
8. The apparatus of claim 6, wherein the primary blurring processing unit is further configured to:
performing mean blurring processing on the first image to obtain the second image;
correspondingly, the secondary blur processing unit is further configured to:
and carrying out Gaussian blur processing on the second image to obtain the third image.
9. The apparatus of any of claims 6-8, further comprising:
a first desired clear image search unit configured to search for a desired clear image having the same color distribution as the first image in response to the first image being determined as a blurred image;
A deblurring processing unit configured to deblur the first image to obtain a deblurred image;
and the parameter adjustment unit is configured to adjust the processing parameters of the deblurring processing based on the difference between the expected clear image and the deblurring image until the difference meets the preset requirement.
10. The apparatus of any of claims 6-8, further comprising:
a second desired clear image search unit configured to search for a desired clear image having the same color distribution as the first image in response to the first image being determined as a blurred image;
and the deblurring image training unit is configured to train to obtain an image deblurring model by taking the first image as an input sample and the expected clear image as an output sample.
11. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image blur detection method according to any one of claims 1-5.
12. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the image blur detection method of any one of claims 1-5.
CN202110923111.XA 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product Active CN113628192B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110923111.XA CN113628192B (en) 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110923111.XA CN113628192B (en) 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product

Publications (2)

Publication Number Publication Date
CN113628192A CN113628192A (en) 2021-11-09
CN113628192B true CN113628192B (en) 2023-07-11

Family

ID=78384708

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110923111.XA Active CN113628192B (en) 2021-08-12 2021-08-12 Image blur detection method, apparatus, device, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN113628192B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116051390B (en) * 2022-08-15 2024-04-09 荣耀终端有限公司 Motion blur degree detection method and device


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007057808A2 (en) * 2005-11-16 2007-05-24 Koninklijke Philips Electronics N.V. Blur estimation
CN107240078A (en) * 2017-06-06 2017-10-10 广州优创电子有限公司 Lens articulation Method for Checking, device and electronic equipment
CN108109147A (en) * 2018-02-10 2018-06-01 北京航空航天大学 A kind of reference-free quality evaluation method of blurred picture
CN110852997A (en) * 2019-10-24 2020-02-28 普联技术有限公司 Dynamic image definition detection method and device, electronic equipment and storage medium
CN110807745A (en) * 2019-10-25 2020-02-18 北京小米智能科技有限公司 Image processing method and device and electronic equipment
CN112950723A (en) * 2021-03-05 2021-06-11 湖南大学 Robot camera calibration method based on edge scale self-adaptive defocus fuzzy estimation

Also Published As

Publication number Publication date
CN113628192A (en) 2021-11-09

Similar Documents

Publication Publication Date Title
CN108337505B (en) Information acquisition method and device
CN108229591B (en) Neural network adaptive training method and apparatus, device, program, and storage medium
CN112949767B (en) Sample image increment, image detection model training and image detection method
US20190138816A1 (en) Method and apparatus for segmenting video object, electronic device, and storage medium
CN109284673B (en) Object tracking method and device, electronic equipment and storage medium
CN113436100B (en) Method, apparatus, device, medium, and article for repairing video
CN111753701B (en) Method, device, equipment and readable storage medium for detecting violation of application program
CN113691733B (en) Video jitter detection method and device, electronic equipment and storage medium
CN114511041B (en) Model training method, image processing method, device, equipment and storage medium
CN112995535B (en) Method, apparatus, device and storage medium for processing video
CN111031359B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN113628192B (en) Image blur detection method, apparatus, device, storage medium, and program product
CN108509876B (en) Object detection method, device, apparatus, storage medium, and program for video
CN113643260A (en) Method, apparatus, device, medium and product for detecting image quality
CN116703925B (en) Bearing defect detection method and device, electronic equipment and storage medium
CN114724144B (en) Text recognition method, training device, training equipment and training medium for model
CN113643257B (en) Image noise detection method, device, equipment, storage medium and program product
CN110889817A (en) Image fusion quality evaluation method and device
CN114821596A (en) Text recognition method and device, electronic equipment and medium
CN110211085B (en) Image fusion quality evaluation method and system
CN113313642A (en) Image denoising method and device, storage medium and electronic equipment
CN113409199A (en) Image processing method, image processing device, electronic equipment and computer readable medium
WO2020113563A1 (en) Facial image quality evaluation method, apparatus and device, and storage medium
CN113117341B (en) Picture processing method and device, computer readable storage medium and electronic equipment
CN113643266B (en) Image detection method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant