CN104867128B - Image blurring detection method and device - Google Patents

Image blurring detection method and device

Info

Publication number
CN104867128B
CN104867128B (application CN201510169849.6A)
Authority
CN
China
Prior art keywords
gradient
data
described image
image data
edge width
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510169849.6A
Other languages
Chinese (zh)
Other versions
CN104867128A (en)
Inventor
王明英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd
Priority to CN201510169849.6A
Publication of CN104867128A
Application granted
Publication of CN104867128B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention provides an image blur detection method and device. The method includes: obtaining the image data of one frame of a digital image; calculating multi-dimensional gradient information for each pixel in the image data; calculating joint gradient information of the image data from the multi-dimensional gradient information of each pixel; when the joint gradient information of the image data satisfies a first preset condition, calculating a global refined gradient of the image data; and detecting whether the image is blurred according to the global refined gradient of the image data. With the image blur detection method provided by the present invention, blur in digital images captured under various scenes can be judged accurately, the scene adaptability of image blur detection is enhanced, and the accuracy of blur detection is improved.

Description

Image blurring detection method and device
Technical field
The present invention relates to the field of digital image processing technology, and in particular to an image blur detection method and device.
Background art
A video quality diagnosis system is an intelligent monitoring-video fault analysis and alarm system. It performs quality analysis on the video images returned by front-end cameras, so that video faults can be judged accurately and alarm information can be issued. Detecting whether a video image is blurred by means of a preset algorithm is an important part of the work of a video quality diagnosis system.
In the prior art, video quality diagnosis systems usually perform blur detection on video images with a gradient-based blur detection method built on Canny edges, or with a blur detection method based on image edge width. Both methods start from the image gradient and judge whether the video image is blurred by computing the average gradient of the image. However, these blur detection methods can only detect blur caused by inaccurate camera focusing; they cannot accurately detect the blur of detail-rich images in some special scenes, the local blur of images in night scenes, or the blur along a specific direction in moving scenes.
Summary of the invention
In view of this, the present invention provides an image blur detection method and device, so as to accurately detect blurred images under various scenes.
In one aspect, an embodiment of the present invention provides an image blur detection method, including: obtaining the image data of one frame of a digital image; calculating multi-dimensional gradient information for each pixel in the image data; calculating joint gradient information of the image data from the multi-dimensional gradient information of each pixel; when the joint gradient information of the image data satisfies a first preset condition, calculating a global refined gradient of the image data; and detecting whether the image is blurred according to the global refined gradient of the image data.
Optionally, the joint gradient information of the image data includes a joint gradient average value and a joint gradient maximum value of the image data. The first preset condition is that the joint gradient average value of the image data lies between a first gradient threshold and a second gradient threshold; or that the joint gradient average value of the image data is less than the second gradient threshold and the joint gradient maximum value of the image data is not less than a third gradient threshold, where the third gradient threshold lies between the first gradient threshold and the second gradient threshold.
Calculating the global refined gradient of the image data includes: dividing the image data into a preset number of data blocks; calculating the color variance fColorVar of each data block according to the formula fColorVar = sqrt(fVarR + fVarG + fVarB), where fVarR denotes the variance of the R component of the current data block, fVarG denotes the variance of the G component of the current data block, fVarB denotes the variance of the B component of the current data block, and R, G, B denote the red, green and blue primary color components of the current data block; if the color variance fColorVar of a data block is not less than a preset color variance threshold, marking that data block as a valid data block; calculating the refined gradient value of each valid data block; and calculating the average of the refined gradient values of the valid data blocks, taking this average as the global refined gradient of the image data.
Optionally, calculating the refined gradient value of each valid data block specifically includes: dividing each valid data block into a preset number of data sub-blocks; calculating the gradient variance and the gradient maximum of each data sub-block from the joint gradients of the pixels in that sub-block; if the gradient variance of a data sub-block is not less than a preset gradient variance threshold, marking that sub-block as a valid data sub-block; and calculating the average of the gradient maxima of the valid data sub-blocks, taking this average as the refined gradient value of the valid data block.
Optionally, detecting whether the image is blurred according to the global refined gradient of the image data includes: comparing the global refined gradient of the image data with a fourth gradient threshold and a fifth gradient threshold; if the global refined gradient of the image data is greater than the fourth gradient threshold, judging the image to be an obviously clear image; if the global refined gradient of the image data is less than the fifth gradient threshold, judging the image to be an obviously blurred image; if the global refined gradient of the image data lies between the fourth gradient threshold and the fifth gradient threshold, calculating the horizontal edge width and the vertical edge width of the image data; comparing the horizontal edge width and the vertical edge width of the image data with an edge width threshold; if the horizontal edge width or the vertical edge width is less than the edge width threshold, judging the image to be a clear image; and if the horizontal edge width and the vertical edge width are both not less than the edge width threshold, judging the image to be a blurred image.
Optionally, calculating the horizontal edge width and the vertical edge width of the image data includes: traversing the vertical gradient of each pixel in the image data row by row, obtaining the positions of the edge pixels in the vertical direction, and counting the horizontal edge width of each row from the positions of these edge pixels; calculating the average of the horizontal edge widths of the rows and taking it as the horizontal edge width of the image data; traversing the horizontal gradient of each pixel in the image data column by column, obtaining the positions of the edge pixels in the horizontal direction, and counting the vertical edge width of each column from the positions of these edge pixels; and calculating the average of the vertical edge widths of the columns and taking it as the vertical edge width of the image data.
Correspondingly, an embodiment of the present invention provides an image blur detection device, including: a data acquisition module, configured to obtain the image data of one frame of a digital image; a gradient calculation module, configured to calculate multi-dimensional gradient information for each pixel in the image data; a joint gradient calculation module, configured to calculate joint gradient information of the image data from the multi-dimensional gradient information of each pixel; a global refined gradient calculation module, configured to calculate a global refined gradient of the image data when the joint gradient information of the image data satisfies a first preset condition; and a blur detection module, configured to detect whether the image is blurred according to the global refined gradient of the image data.
Optionally, the joint gradient information of the image data includes a joint gradient average value and a joint gradient maximum value of the image data. The first preset condition is that the joint gradient average value of the image data lies between a first gradient threshold and a second gradient threshold; or that the joint gradient average value of the image data is less than the second gradient threshold and the joint gradient maximum value of the image data is not less than a third gradient threshold, where the third gradient threshold lies between the first gradient threshold and the second gradient threshold.
The global refined gradient calculation module includes: a data block division unit, configured to divide the image data into a preset number of data blocks; a color variance calculation unit, configured to calculate the color variance fColorVar of each data block according to the formula fColorVar = sqrt(fVarR + fVarG + fVarB), where fVarR denotes the variance of the R component of the current data block, fVarG denotes the variance of the G component, fVarB denotes the variance of the B component, and R, G, B denote the red, green and blue primary color components of the current data block; a valid data block acquisition unit, configured to mark a data block as a valid data block when its color variance fColorVar is not less than a preset color variance threshold; a first calculation unit, configured to calculate the refined gradient value of each valid data block; and a second calculation unit, configured to calculate the average of the refined gradient values of the valid data blocks according to the refined gradient value of each valid data block, and take this average as the global refined gradient of the image data.
Optionally, the first calculation unit specifically includes: a data sub-block division sub-unit, configured to divide the valid data block into a preset number of data sub-blocks and calculate the gradient variance and gradient maximum of each data sub-block from the joint gradients of the pixels in that sub-block; a valid data sub-block acquisition sub-unit, configured to mark a data sub-block as a valid data sub-block when its gradient variance is not less than a preset gradient variance threshold; and a calculation sub-unit, configured to calculate the average of the gradient maxima of the valid data sub-blocks and take this average as the refined gradient value of the valid data block.
Optionally, the blur detection module includes: a first comparison unit, configured to compare the global refined gradient of the image data with a fourth gradient threshold and a fifth gradient threshold; a first judgment unit, configured to judge the image to be an obviously clear image when the global refined gradient of the image data is greater than the fourth gradient threshold; a second judgment unit, configured to judge the image to be an obviously blurred image when the global refined gradient of the image data is less than the fifth gradient threshold; an edge width calculation unit, configured to calculate the horizontal edge width and the vertical edge width of the image data when the global refined gradient of the image data lies between the fourth gradient threshold and the fifth gradient threshold; a second comparison unit, configured to compare the horizontal edge width and the vertical edge width of the image data with an edge width threshold; a third judgment unit, configured to judge the image to be a clear image when the horizontal edge width or the vertical edge width is less than the edge width threshold; and a fourth judgment unit, configured to judge the image to be a blurred image when the horizontal edge width and the vertical edge width are both not less than the edge width threshold.
Optionally, the edge width calculation unit includes: a vertical edge determination sub-unit, configured to traverse the vertical gradient of each pixel in the image row by row, obtain the positions of the edge pixels in the vertical direction, and count the horizontal edge width of each row from the positions of these edge pixels; a horizontal edge width calculation sub-unit, configured to calculate the average of the horizontal edge widths of the rows and take it as the horizontal edge width of the image data; a horizontal edge determination sub-unit, configured to traverse the horizontal gradient of each pixel in the image column by column, obtain the positions of the edge pixels in the horizontal direction, and count the vertical edge width of each column from the positions of these edge pixels; and a vertical edge width calculation sub-unit, configured to calculate the average of the vertical edge widths of the columns and take it as the vertical edge width of the image data.
With the image blur detection method provided by the present invention, the joint gradient information of a frame of image is first calculated, and images that are obviously clear or obviously blurred are judged by this global edge feature. For images that cannot be judged in this way, a block-based refined gradient discrimination method is used to examine the local features of the image. The method can thus discriminate both the overall blur of an image and the local blur within an image, achieving accurate blur judgment for images under various scenes, enhancing the scene adaptability of the image blur detection method, and improving the accuracy of blur detection.
Brief description of the drawings
Fig. 1 is a flow chart of an embodiment of the image blur detection method of the present invention;
Fig. 2 is a flow chart of an embodiment of step S2 in the embodiment of the image blur detection method of the present invention;
Fig. 3 is a flow chart of an embodiment of step 4 in the embodiment of the image blur detection method of the present invention;
Fig. 4 is a schematic diagram of data block division according to an embodiment of the present invention;
Fig. 5 is a flow chart of an embodiment of step 44 in the embodiment of the image blur detection method of the present invention;
Fig. 6 is a schematic diagram of data sub-block division according to an embodiment of the present invention;
Fig. 7 is a flow chart of an embodiment of step 5 in the embodiment of the image blur detection method of the present invention;
Fig. 8 is a flow chart of an embodiment of step 54 in the embodiment of the image blur detection method of the present invention;
Fig. 9 is a structural block diagram of an embodiment of the image blur detection device of the present invention;
Fig. 10 is a structural block diagram of an embodiment of the global refined gradient calculation module in the device embodiment of the present invention;
Fig. 11 is a structural block diagram of an embodiment of the first calculation unit in the device embodiment of the present invention;
Fig. 12 is a structural block diagram of an embodiment of the blur detection module in the device embodiment of the present invention;
Fig. 13 is a structural block diagram of an embodiment of the edge width calculation unit in the device embodiment of the present invention.
Detailed description of the embodiments
In order to make the above objects, features and advantages of the present invention easier to understand, the present invention is further described in detail below with reference to the accompanying drawings and specific embodiments.
The terms used in this application are for the purpose of describing specific embodiments only and are not intended to limit the application. The singular forms "a", "the" and "said" used in this application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and includes any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this application to describe various kinds of information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this application, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "while", or "in response to determining".
The present invention provides an image blur detection method that can detect blur in digital images captured under various scenes. The embodiments of the present invention are described in detail below, taking blur detection of video images in a practical video surveillance application as an example.
Referring to Fig. 1, which shows a flow chart of an embodiment of the image blur detection method of the present invention, the method includes:
Step 1: obtain the image data of one frame of a digital image.
Generally, the image data obtained in step 1 may be the raw image data of the digital image. In practical video surveillance applications, the raw image data output by the camera usually contains on-screen display (OSD) information loaded via an on-screen menu, such as the shooting time and the shooting location (for example, the name of a parking lot). For image blur detection, this OSD information is an interference region relative to the whole frame; therefore, a preprocessing algorithm may be used in advance to exclude the interference region from the image, and blur judgment is then performed on the preprocessed image.
In other words, as an optional embodiment, step 1 may further include: S1, obtaining the raw image data of one frame of image; S2, excluding the interference region in the raw image data by using a preprocessing algorithm, to obtain the preprocessed image data.
In step S2, it is first necessary to determine the boundary position of the OSD information, i.e., the OSD character region, and then exclude the OSD character region from the whole frame. Referring to Fig. 2, which shows a flow chart of an embodiment of determining the boundary position of the OSD character region in step S2 according to the embodiment of the present invention, the procedure includes:
Step 101: divide the original image into a preset number of image blocks. For example, a frame of image may be horizontally cut into a preset number of strip-shaped image blocks, i.e., the width of each image block equals the width of the original image, and the image blocks are numbered from top to bottom.
Step 102: choose one or several blocks from the image blocks as OSD detection candidate regions. For example, based on prior knowledge, the first and/or last image block may be used as the OSD detection candidate region.
Step 103: perform Canny edge detection on each candidate region block.
Step 104: preliminarily determine the positions of the OSD character pixels by a black-and-white threshold method. Since OSD characters are usually white or black, an adaptive binarization method can be used to roughly determine the positions of the OSD character pixels.
Step 105: determine the edge image of the OSD character region.
Step 106: project the edge image of the OSD character region horizontally and vertically, obtain the two transition critical positions respectively, and take these transition critical positions as the boundaries of the OSD character region.
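As an illustration of steps 101 to 106, the following is a minimal sketch in Python with OpenCV and NumPy. The function name, the choice of Otsu binarization for the black-and-white threshold step, the Canny thresholds and the fixed strip count are assumptions made for illustration and are not taken from the patent.

```python
import cv2
import numpy as np

def find_osd_bounds(gray, num_blocks=8):
    """Sketch of steps 101-106: locate the OSD character region in the
    top and bottom strip blocks of a grayscale frame (assumed layout)."""
    h, w = gray.shape
    block_h = h // num_blocks
    # Steps 101/102: horizontal strips; use the first and last strips as candidates.
    candidates = [(0, block_h), (h - block_h, h)]
    bounds = []
    for top, bottom in candidates:
        strip = gray[top:bottom, :]
        # Step 103: Canny edge detection on the candidate strip.
        edges = cv2.Canny(strip, 50, 150)
        # Step 104: rough OSD pixel positions via Otsu binarization,
        # since OSD characters are usually white or black.
        _, bw = cv2.threshold(strip, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        # Step 105: keep edges that coincide with character-like pixels.
        char_edges = cv2.bitwise_and(edges, bw)
        # Step 106: vertical projection; the first and last non-empty rows are
        # taken as the transition critical positions (OSD region boundary).
        rows = np.where(char_edges.sum(axis=1) > 0)[0]
        if rows.size:
            bounds.append((top + int(rows[0]), top + int(rows[-1])))
    return bounds
```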
Step 2: calculate the multi-dimensional gradient information of each pixel in the image data.
Specifically, the multi-dimensional gradient information of each pixel in the image data can be calculated with a preset gradient operator. The preset gradient operator may be a differential operator such as the Sobel operator or the Prewitt operator. The multi-dimensional gradient information of each pixel may include: the horizontal gradient, the vertical gradient, the 45° direction gradient, and the 135° direction gradient.
Taking the Sobel operator as an example, for any pixel in the image data, the gradient in each of the four directions can be calculated with formula (1):
(Sobel operator templates afMx, afMy, afMxy, afMyx)    ...... formula (1)
where afMx, afMy, afMxy and afMyx in formula (1) denote the Sobel operator templates for the horizontal, vertical, 45° and 135° directions, respectively. Each pixel in the image is convolved with the Sobel templates of formula (1) to obtain its horizontal gradient fSobelx, vertical gradient fSobely, 45° direction gradient fSobelxy and 135° direction gradient fSobelyx.
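A minimal sketch of step 2, assuming NumPy/OpenCV and assuming one common choice of 3×3 templates for the 45° and 135° directions (the patent's own templates in formula (1) are not reproduced here):

```python
import cv2
import numpy as np

# Assumed 3x3 templates: afMx/afMy are the standard Sobel kernels,
# the diagonal kernels are one common 45°/135° variant.
afMx  = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
afMy  = afMx.T
afMxy = np.array([[ 0,  1, 2], [-1, 0, 1], [-2, -1, 0]], dtype=np.float32)
afMyx = np.array([[-2, -1, 0], [-1, 0, 1], [ 0,  1, 2]], dtype=np.float32)

def directional_gradients(gray):
    """Step 2: per-pixel gradients in the four directions (formula (1))."""
    g = gray.astype(np.float32)
    fSobelx  = cv2.filter2D(g, cv2.CV_32F, afMx)   # horizontal direction
    fSobely  = cv2.filter2D(g, cv2.CV_32F, afMy)   # vertical direction
    fSobelxy = cv2.filter2D(g, cv2.CV_32F, afMxy)  # 45° direction
    fSobelyx = cv2.filter2D(g, cv2.CV_32F, afMyx)  # 135° direction
    return fSobelx, fSobely, fSobelxy, fSobelyx
```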
Step 3: calculate the joint gradient information of the image data from the multi-dimensional gradient information of each pixel. The joint gradient information includes the joint gradient average value of the image data and the joint gradient maximum value of the image data. Specifically, step 3 may include:
Step A: calculate the joint gradient of each pixel in the image data from the multi-dimensional gradient information of each pixel.
The joint gradient fUnionGrad of each pixel can be calculated with formula (2):
fUnionGrad = sqrt[(fMx)² + (fMy)² + (fMxy)² + (fMyx)²]    ...... formula (2)
where M in formula (2) denotes the gradient operator, which may be a differential operator such as the Sobel operator or the Prewitt operator. As above, if M denotes the Sobel operator, then fMx in formula (2) is fSobelx, and by analogy fMy is fSobely, fMxy is fSobelxy, and fMyx is fSobelyx.
Step B: obtain the joint gradient average value and the joint gradient maximum value of the image data from the joint gradients of the pixels.
Specifically, the joint gradient average value fUnionGradMean of the image data can be obtained as the arithmetic mean of the joint gradients of the pixels in the image data, or as a weighted mean of the joint gradients of the pixels.
The joint gradient maximum value fUnionGradMax of the image data is obtained by selecting the maximum among the joint gradients of the pixels in the image data as the joint gradient maximum of the current frame. The joint gradient of each pixel is calculated with formula (2) above.
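Continuing the NumPy sketch above, steps A and B reduce to formula (2) plus two frame-level statistics:

```python
def joint_gradient_stats(fSobelx, fSobely, fSobelxy, fSobelyx):
    """Steps A/B: per-pixel joint gradient (formula (2)) and its
    frame-level arithmetic mean and maximum."""
    fUnionGrad = np.sqrt(fSobelx**2 + fSobely**2 + fSobelxy**2 + fSobelyx**2)
    fUnionGradMean = float(fUnionGrad.mean())  # joint gradient average value
    fUnionGradMax = float(fUnionGrad.max())    # joint gradient maximum value
    return fUnionGrad, fUnionGradMean, fUnionGradMax
```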
Step 4: when the joint gradient information of the image data satisfies the first preset condition, calculate the global refined gradient of the image data.
Step 4 actually implies a judgment process: judge whether the joint gradient information of the image data satisfies the first preset condition, where the first preset condition is that the joint gradient average value of the image data lies between the first gradient threshold and the second gradient threshold; or that the joint gradient average value of the image data is less than the second gradient threshold and the joint gradient maximum value of the image data is not less than the third gradient threshold.
Specifically, let iGradThre1 denote the first gradient threshold, iGradThre2 the second gradient threshold, and iGradThre3 the third gradient threshold, where iGradThre1 > iGradThre3 > iGradThre2.
The judgment process is: compare the joint gradient average value fUnionGradMean and the joint gradient maximum value fUnionGradMax of the image data with the three gradient thresholds iGradThre1, iGradThre2 and iGradThre3.
If fUnionGradMean > iGradThre1, the image is judged to be an obviously clear image and the detection ends.
If fUnionGradMean < iGradThre2 and fUnionGradMax < iGradThre3, the image is judged to be an obviously blurred image and the detection ends.
These two judgments handle the case where the joint gradient information of the image data does not satisfy the first preset condition, i.e., the image can be judged clear or blurred directly by the joint gradient discrimination method.
If iGradThre2 ≤ fUnionGradMean ≤ iGradThre1, or fUnionGradMean < iGradThre2 and fUnionGradMax ≥ iGradThre3, the global refined gradient of the image data is calculated. This condition is the first preset condition: an image that satisfies it cannot be judged clear or blurred by the joint gradient discrimination method alone, so the subsequent discrimination method must be executed to continue blur detection on this frame.
In concise computer-language form, the three cases can be expressed as: if (fUnionGradMean > iGradThre1) { clear }; else if (fUnionGradMean < iGradThre2 and fUnionGradMax < iGradThre3) { blurred }; else { perform step 4 for the remaining cases }.
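This three-way judgment can be sketched as follows; the threshold values themselves are set empirically and are not given in the patent:

```python
def first_stage_decision(fUnionGradMean, fUnionGradMax,
                         iGradThre1, iGradThre2, iGradThre3):
    """Joint-gradient discrimination; returns 'clear', 'blurred', or
    'undecided' (the first preset condition, which triggers step 4)."""
    if fUnionGradMean > iGradThre1:
        return "clear"      # obviously clear image
    if fUnionGradMean < iGradThre2 and fUnionGradMax < iGradThre3:
        return "blurred"    # obviously blurred image
    return "undecided"      # compute the global refined gradient (step 4)
```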
Referring to the flow chart of the embodiment of calculating the global refined gradient of the image data in step 4 shown in Fig. 3, step 4 may include:
Step 41: divide the image data into a preset number of data blocks. As illustrated by the schematic diagram of data block division in Fig. 4, the image data of one frame may, for example, be divided into 4 × 4 data blocks.
Step 42: calculate the color variance fColorVar of each data block according to formula (3):
fColorVar = sqrt(fVarR + fVarG + fVarB)    ...... formula (3)
where fVarR denotes the variance of the R component of the current data block, fVarG denotes the variance of the G component of the current data block, fVarB denotes the variance of the B component of the current data block, and R, G, B denote the red, green and blue primary color components of the current data block.
Assuming the current data block contains n pixels, the variance fVarR of the R component of the current data block can be expressed by formula (4):
fVarR = (1/n) · Σ (Rᵢ − R̄)²,  i = 1, …, n    ...... formula (4)
where Rᵢ is the R component of the i-th pixel of the data block and R̄ is the mean of the R component over the data block. Similarly, fVarG and fVarB can be calculated with analogous formulas.
Step 43: if the color variance fColorVar of a data block is not less than the preset color variance threshold, mark that data block as a valid data block.
Specifically, the implementation of step 43 is: compare the color variance fColorVar of each data block with the preset color variance threshold fThre, where fThre can be set manually based on prior experience. If the color variance of a data block satisfies fColorVar < fThre, the data block is marked as an invalid data block; if fColorVar ≥ fThre, the data block is marked as a valid data block. After this discrimination, the valid data blocks in the image data of the whole frame are counted. Taking Fig. 4 as an example again, after the discrimination of step 43 the image data of the current frame contains 5 invalid data blocks, and the remaining 9 are valid data blocks. The data blocks marked as invalid in Fig. 4 represent invalid data blocks; the data blocks marked with 'y' represent valid data blocks.
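A sketch of steps 41 to 43, assuming the frame is stored as an H×W×3 RGB array and divided into a 4×4 grid as in Fig. 4; the threshold fThre here is an assumed, empirically chosen value:

```python
def valid_blocks(rgb, grid=4, fThre=100.0):
    """Steps 41-43: split the frame into grid x grid data blocks, compute
    fColorVar = sqrt(fVarR + fVarG + fVarB) per block (formulas (3)/(4)),
    and keep the blocks whose color variance reaches the threshold."""
    h, w, _ = rgb.shape
    bh, bw = h // grid, w // grid
    valid = []
    for by in range(grid):
        for bx in range(grid):
            block = rgb[by*bh:(by+1)*bh, bx*bw:(bx+1)*bw, :].astype(np.float32)
            fVarR, fVarG, fVarB = block.reshape(-1, 3).var(axis=0)
            fColorVar = np.sqrt(fVarR + fVarG + fVarB)
            if fColorVar >= fThre:          # mark as a valid data block
                valid.append((by, bx))
    return valid
```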
Step 44: calculate the refined gradient value of each valid data block.
Optionally, referring to the flow chart of the embodiment of step 44 shown in Fig. 5, step 44 may include:
Step 441: further divide each valid data block into a preset number of data sub-blocks, and calculate the gradient variance fWeeGradVar of each data sub-block and the gradient maximum fWeeGradMax of each data sub-block from the joint gradients of the pixels in that sub-block.
As an example, referring to the schematic diagram of data sub-block division shown in Fig. 6, each valid data block may be further divided into 2 × 2 data sub-blocks. Assuming each data sub-block after division contains m pixels, the gradient variance fWeeGradVar of each data sub-block can be expressed by formula (5):
fWeeGradVar = (1/m) · Σ (Xᵢ − X̄)²,  i = 1, …, m    ...... formula (5)
where Xᵢ in formula (5) denotes the joint gradient fUnionGrad of a pixel in the data sub-block, already calculated with formula (2) in step A above, and X̄ denotes the joint gradient average value of the data sub-block, computed in the same way as the joint gradient average value fUnionGradMean of the whole frame; details are not repeated here.
The gradient maximum fWeeGradMax of each data sub-block is obtained in the same way as the gradient maximum of the whole frame, i.e., the maximum among the joint gradients of the pixels of the data sub-block is selected as the gradient maximum fWeeGradMax of that sub-block.
Since the joint gradient of each pixel in a data sub-block has already been obtained in step A, the implementation of step 441 does not add much computation.
Step 442: if the gradient variance of a data sub-block is not less than the preset gradient variance threshold, mark the data sub-block as a valid data sub-block.
Specifically, let fThreVar denote the preset gradient variance threshold. The implementation of step 442 may be: compare the gradient variance fWeeGradVar of each data sub-block obtained in step 441 with the preset gradient variance threshold fThreVar. If the gradient variance of a data sub-block satisfies fWeeGradVar < fThreVar, the data sub-block is marked as an invalid data sub-block; if fWeeGradVar ≥ fThreVar, the data sub-block is marked as a valid data sub-block. The valid data sub-blocks in each valid data block are then counted. As shown in Fig. 6, after the discrimination of step 442, 2 valid data sub-blocks remain in the data block, namely the sub-blocks marked with 'y', and the other 2 data sub-blocks, marked with 'w', are invalid data sub-blocks.
Step 443: calculate the average of the gradient maxima fWeeGradMax of the valid data sub-blocks, and take this average as the refined gradient value of the valid data block.
Specifically, the gradient maximum fWeeGradMax of each valid data sub-block has been obtained in step 441; the gradient maxima of the valid data sub-blocks are averaged, and the resulting average is taken as the refined gradient value of the valid data block.
Step 45: calculate the average of the refined gradient values of the valid data blocks according to the refined gradient value of each valid data block, and take this average as the global refined gradient of the image data. Specifically, the refined gradient values of the valid data blocks obtained in step 443 are averaged, and the resulting average is taken as the global refined gradient of the whole image data.
Steps 41 to 45 above constitute an embodiment in which the global refined gradient of the image data is calculated with a block-based refined gradient algorithm, where the image data is the image data of one frame of a digital image.
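Continuing the sketch, steps 44 and 45 can be outlined as follows over the joint-gradient map fUnionGrad; the 2×2 sub-block split follows the example in the text, while the variance threshold fThreVar is an assumed placeholder value:

```python
def global_refined_gradient(fUnionGrad, valid, grid=4, sub=2, fThreVar=10.0):
    """Steps 44-45: refined gradient value of each valid data block
    (mean of the gradient maxima of its valid sub-blocks, formula (5)),
    averaged over all valid blocks to give the global refined gradient."""
    h, w = fUnionGrad.shape
    bh, bw = h // grid, w // grid
    block_values = []
    for by, bx in valid:
        block = fUnionGrad[by*bh:(by+1)*bh, bx*bw:(bx+1)*bw]
        sh, sw = bh // sub, bw // sub
        maxima = []
        for sy in range(sub):
            for sx in range(sub):
                patch = block[sy*sh:(sy+1)*sh, sx*sw:(sx+1)*sw]
                fWeeGradVar = patch.var()       # formula (5)
                fWeeGradMax = patch.max()
                if fWeeGradVar >= fThreVar:     # valid data sub-block
                    maxima.append(fWeeGradMax)
        if maxima:                              # refined gradient of the block
            block_values.append(np.mean(maxima))
    # global refined gradient of the image data (step 45)
    return float(np.mean(block_values)) if block_values else 0.0
```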
Step 5: detect whether the image is blurred according to the global refined gradient of the image data.
Specifically, one embodiment of step 5 may be: compare the global refined gradient of the image data obtained in step 45 with a preset gradient threshold; if the global refined gradient of the image data is greater than the preset gradient threshold, judge the current frame to be an obviously clear image; if the global refined gradient of the image data is not greater than (i.e., less than or equal to) the preset gradient threshold, judge the current frame to be an obviously blurred image. The image blur detection process then ends.
To further improve the accuracy of image blur detection, the present invention also provides another embodiment of step 5. Referring to the flow chart of the embodiment of step 5 shown in Fig. 7, step 5 may include:
Step 51: compare the global refined gradient of the image data with the fourth gradient threshold and the fifth gradient threshold.
Step 52: if the global refined gradient of the image data is greater than the fourth gradient threshold, judge the image to be an obviously clear image.
Step 53: if the global refined gradient of the image data is less than the fifth gradient threshold, judge the image to be an obviously blurred image.
Step 54: if the global refined gradient of the image data lies between the fourth gradient threshold and the fifth gradient threshold, calculate the horizontal edge width and the vertical edge width of the image data.
Specifically, let fGlobalWeeGradMean denote the global refined gradient of the image data of a frame, fGlobalH denote the fourth gradient threshold, and fGlobalL denote the fifth gradient threshold. Steps 51 to 54 are implemented as follows: compare the global refined gradient fGlobalWeeGradMean of the image data obtained in step 45 with the fourth gradient threshold fGlobalH and the fifth gradient threshold fGlobalL.
If fGlobalWeeGradMean > fGlobalH, the current frame is judged to be an obviously clear image.
If fGlobalWeeGradMean < fGlobalL, the current frame is judged to be an obviously blurred image.
These two judgments use the block-based refined gradient discrimination method to determine whether the frame is a clear image or a blurred image.
If fGlobalL ≤ fGlobalWeeGradMean ≤ fGlobalH, the horizontal edge width and the vertical edge width of the image data are calculated. An image that satisfies this condition still cannot be judged clear or blurred by the joint gradient discrimination method and the block-based refined gradient discrimination method used so far, so the subsequent discrimination method must be executed to continue blur detection on the current frame.
Referring to Fig. 8, which shows a flow chart of step 54 for calculating the edge widths of the image data, the procedure includes:
Step 541: traverse the vertical gradient of each pixel in the image data row by row, obtain the positions of the edge pixels in the vertical direction, and count the horizontal edge width of each row from the positions of these edge pixels.
Here, the vertical gradient fMy of each pixel in the image data has been calculated with formula (1) above; when traversing the vertical gradients of the pixels of each row, the pixels at the transition critical positions can be taken as the edge pixels in the vertical direction.
Step 542: calculate the average of the horizontal edge widths of the rows, and take this average as the horizontal edge width of the image data.
Step 543: traverse the horizontal gradient fMx of each pixel in the image column by column, obtain the positions of the edge pixels in the horizontal direction, and count the vertical edge width of each column from the positions of these edge pixels.
Here, the horizontal gradient fMx of each pixel in the image data has been calculated with formula (1) above; when traversing the horizontal gradients of the pixels of each column, the pixels at the transition critical positions can be taken as the edge pixels in the horizontal direction.
Step 544: calculate the average of the vertical edge widths of the columns, and take this average as the vertical edge width of the image data.
It should be noted here that steps 541 and 542 and steps 543 and 544 are not restricted to this order; in another embodiment of the present invention, steps 543 and 544 may be performed before steps 541 and 542.
Step 55: compare the horizontal edge width and the vertical edge width of the image data with the edge width threshold.
Step 56: if the horizontal edge width or the vertical edge width is less than the edge width threshold, judge the image to be a clear image.
Step 57: if the horizontal edge width and the vertical edge width are both not less than the edge width threshold, judge the image to be a blurred image.
Specifically, let fEdgeWidth denote the preset edge width threshold. The specific implementation of steps 55 to 57 is: compare the horizontal edge width of the image data obtained in step 542 and the vertical edge width of the image data obtained in step 544 with the preset edge width threshold fEdgeWidth. If either the horizontal edge width or the vertical edge width of the image data is less than fEdgeWidth, the current frame is judged to be a clear image; if the horizontal edge width and the vertical edge width of the image data are both greater than or equal to fEdgeWidth, the current frame is judged to be a blurred image.
Steps 55 to 57 implement the process of continuing to judge whether the current frame is a clear image or a blurred image by the edge width discrimination method, in the case where the joint gradient discrimination method and the block-based refined gradient discrimination method still cannot determine whether the current frame is clear or blurred.
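The edge-width stage can be sketched as follows. The patent does not spell out exactly how a width is counted from the edge pixel positions, so this sketch uses one possible interpretation (the length of each contiguous run of edge pixels along a scan line); the gradient threshold and fEdgeWidth are assumed placeholder values:

```python
def run_widths(mask_1d):
    """Lengths of contiguous runs of True values in a 1-D boolean mask."""
    widths, count = [], 0
    for v in mask_1d:
        if v:
            count += 1
        elif count:
            widths.append(count)
            count = 0
    if count:
        widths.append(count)
    return widths

def edge_width_decision(fSobelx, fSobely, grad_thresh=30.0, fEdgeWidth=4.0):
    """Steps 54-57 (one possible interpretation): rows are scanned with the
    vertical gradient fMy, columns with the horizontal gradient fMx; the
    average run length is compared against the edge width threshold."""
    row_mask = np.abs(fSobely) > grad_thresh
    col_mask = np.abs(fSobelx) > grad_thresh
    row_widths = [w for row in row_mask for w in run_widths(row)]
    col_widths = [w for col in col_mask.T for w in run_widths(col)]
    fHorizEdgeWidth = float(np.mean(row_widths)) if row_widths else 0.0
    fVertEdgeWidth = float(np.mean(col_widths)) if col_widths else 0.0
    if fHorizEdgeWidth < fEdgeWidth or fVertEdgeWidth < fEdgeWidth:
        return "clear"
    return "blurred"
```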
To sum up, the image blur detection method provided by the present invention first calculates the joint gradient of the image data and compares it with preset gradient thresholds, making a preliminary judgment based on the global edge information of the image data. For images that cannot be judged in this way, the block-based refined gradient discrimination method, which compares the global refined gradient of the image data with preset thresholds, is used to further discriminate blurred images of special scenes. Compared with the prior art, the image blur detection method provided by the embodiments of the present invention effectively improves the scene adaptability and accuracy of image blur detection. Further preferably, to further improve the accuracy of blur discrimination, after the image is discriminated according to its global refined gradient, the edge width discrimination method is additionally used for a more precise blur judgment, which further improves the scene adaptability and accuracy of image blur detection. Moreover, since each subsequent discrimination method builds on the calculation results of the previous one, it does not significantly increase the amount of computation or the computational complexity of the whole blur detection process. Therefore, the image blur detection method provided by the present invention can accurately detect image blur under various scenes, such as focus blur, local motion blur, local blur of images in detail-rich scenes, and local blur of images in night scenes.
For the sake of brevity, the foregoing method embodiments are all described as a series of combinations of actions. However, those skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously.
Secondly, those skilled in the art should also understand that the embodiments described in this specification are optional embodiments, and the actions and modules involved are not necessarily required by the present invention.
Corresponding to the image blur detection method embodiments provided above, the present invention also provides an image blur detection device. Referring to the structural block diagram of the embodiment of the image blur detection device of the present invention shown in Fig. 9, the device includes:
a data acquisition module 11, configured to obtain the image data of one frame of a digital image;
Optionally, the data acquisition module 11 may further include: a raw data acquisition unit, configured to obtain the raw image data of one frame of a digital image; and a preprocessing unit, configured to exclude the interference region in the raw image data by using a preprocessing algorithm, to obtain the preprocessed image data.
a gradient calculation module 12, configured to calculate multi-dimensional gradient information for each pixel in the image data; in the embodiment of the present invention, the multi-dimensional gradient of each pixel may include: the horizontal gradient fM(x), the vertical gradient fM(y), the 45° direction gradient fM(xy), and the 135° direction gradient fM(yx);
a joint gradient calculation module 13, configured to calculate the joint gradient information of the image data from the multi-dimensional gradient information of each pixel, where the joint gradient information of the image data includes the joint gradient average value of the image data and the joint gradient maximum value of the image data.
Specifically, the joint gradient calculation module 13 may include:
a calculation unit A, configured to calculate the joint gradient fUnionGrad of each pixel with formula (2) above from the multi-dimensional gradient information of each pixel:
fUnionGrad = sqrt[(fMx)² + (fMy)² + (fMxy)² + (fMyx)²]    ...... formula (2)
where fMx denotes the horizontal gradient of each pixel, fMy denotes the vertical gradient of each pixel, fMxy denotes the 45° direction gradient of each pixel, and fMyx denotes the 135° direction gradient of each pixel; and
a calculation unit B, configured to obtain the joint gradient average value and the joint gradient maximum value of the image data from the joint gradients of the pixels;
a global refined gradient calculation module 14, configured to calculate the global refined gradient of the image data when the joint gradient information of the image data satisfies the first preset condition, where the first preset condition is that the joint gradient average value of the image data lies between the first gradient threshold and the second gradient threshold, or that the joint gradient average value of the image data is less than the second gradient threshold and the joint gradient maximum value of the image data is not less than the third gradient threshold, the third gradient threshold lying between the first gradient threshold and the second gradient threshold.
Optionally, referring to the structural block diagram of the embodiment of the global refined gradient calculation module 14 shown in Fig. 10, the global refined gradient calculation module 14 may include:
a data block division unit 141, configured to divide the image data into a preset number of data blocks;
a color variance calculation unit 142, configured to calculate the color variance fColorVar of each data block according to formula (3) above:
fColorVar = sqrt(fVarR + fVarG + fVarB)    ...... formula (3)
where fVarR denotes the variance of the R component of the current data block, fVarG denotes the variance of the G component of the current data block, fVarB denotes the variance of the B component of the current data block, and R, G, B denote the red, green and blue primary color components of the current data block;
a valid data block acquisition unit 143, configured to mark a data block as a valid data block when its color variance fColorVar is not less than the preset color variance threshold;
a first calculation unit 144, configured to calculate the refined gradient value of each valid data block;
Specifically, referring to the structural block diagram of the embodiment of the first calculation unit 144 shown in Fig. 11, the first calculation unit 144 may include:
a data sub-block division sub-unit 1441, configured to divide the valid data block into a preset number of data sub-blocks and calculate the gradient variance and gradient maximum of each data sub-block from the joint gradients of the pixels in that sub-block;
a valid data sub-block acquisition sub-unit 1442, configured to mark a data sub-block as a valid data sub-block when its gradient variance is not less than the preset gradient variance threshold; and
a calculation sub-unit 1443, configured to calculate the average of the gradient maxima of the valid data sub-blocks and take this average as the refined gradient value of the valid data block.
a second calculation unit 145, configured to calculate the average of the refined gradient values of the valid data blocks according to the refined gradient value of each valid data block, and take this average as the global refined gradient of the image data; and
a blur detection module 15, configured to detect whether the image is blurred according to the global refined gradient of the image data. When performing image blur detection, the blur detection module 15 may compare the global refined gradient of the image data output by the second calculation unit 145 with the preset gradient threshold: if the global refined gradient of the image data of the current frame is greater than the preset gradient threshold, the current frame is judged to be a clear image; if the global refined gradient of the image data of the current frame is not greater than the preset gradient threshold, the current frame is judged to be a blurred image.
Optionally, referring to the structural block diagram of the embodiment of the blur detection module 15 shown in Fig. 12, the module includes:
a first comparison unit 151, configured to compare the global refined gradient of the image data with the fourth gradient threshold and the fifth gradient threshold;
a first judgment unit 152, configured to judge the image to be an obviously clear image when the global refined gradient of the image data is greater than the fourth gradient threshold;
a second judgment unit 153, configured to judge the image to be an obviously blurred image when the global refined gradient of the image data is less than the fifth gradient threshold;
an edge width calculation unit 154, configured to calculate the horizontal edge width and the vertical edge width of the image data when the global refined gradient of the image data lies between the fourth gradient threshold and the fifth gradient threshold;
Specifically, Fig. 13 shows the structural block diagram of the embodiment of the edge width calculation unit 154 of the present invention, which includes:
a vertical edge determination sub-unit 1541, configured to traverse the vertical gradient fMy of each pixel in the image row by row, obtain the positions of the edge pixels in the vertical direction, and count the horizontal edge width of each row from the positions of these edge pixels;
a horizontal edge width calculation sub-unit 1542, configured to calculate the average of the horizontal edge widths of the rows and take it as the horizontal edge width of the image data;
a horizontal edge determination sub-unit 1543, configured to, when the global refined gradient of the image data satisfies the second preset condition, traverse the horizontal gradient fMx of each pixel in the image column by column, obtain the positions of the edge pixels in the horizontal direction, and count the vertical edge width of each column from the positions of these edge pixels; and
a vertical edge width calculation sub-unit 1544, configured to calculate the average of the vertical edge widths of the columns and take it as the vertical edge width of the image data.
a second comparison unit 155, configured to compare the horizontal edge width and the vertical edge width of the image data with the edge width threshold;
a third judgment unit 156, configured to judge the image to be a clear image when the horizontal edge width or the vertical edge width is less than the edge width threshold; and
a fourth judgment unit 157, configured to judge the image to be a blurred image when the horizontal edge width and the vertical edge width are both not less than the edge width threshold.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts of the embodiments may be referred to one another. Since the device embodiments are substantially similar to the method embodiments, their description is relatively brief, and reference may be made to the corresponding parts of the method embodiments for relevant details.
The foregoing is merely a description of preferred embodiments of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. An image blur detection method, characterized by comprising:
obtaining the image data of one frame of a digital image;
calculating multi-dimensional gradient information for each pixel in the image data;
calculating joint gradient information of the image data from the multi-dimensional gradient information of each pixel, the joint gradient information of the image data including a joint gradient average value and a joint gradient maximum value of the image data;
when the joint gradient information of the image data satisfies a first preset condition, calculating a global refined gradient of the image data, the first preset condition being: the joint gradient average value of the image data lies between a first gradient threshold and a second gradient threshold; or, the joint gradient average value of the image data is less than the second gradient threshold and the joint gradient maximum value of the image data is not less than a third gradient threshold; wherein the third gradient threshold lies between the first gradient threshold and the second gradient threshold; and
detecting whether the image is blurred according to the global refined gradient of the image data;
wherein calculating the global refined gradient of the image data comprises:
dividing the image data into a preset number of data blocks;
if the color variance fColorVar of a data block is not less than a preset color variance threshold, marking the data block as a valid data block;
calculating a refined gradient value of each valid data block; and
calculating an average of the refined gradient values of the valid data blocks according to the refined gradient value of each valid data block, and taking the average of the refined gradient values of the valid data blocks as the global refined gradient of the image data.
2. The image blur detection method according to claim 1, characterized in that the method further comprises:
calculating the color variance fColorVar of each data block according to the formula fColorVar = sqrt(fVarR + fVarG + fVarB), wherein fVarR denotes the variance of the R component of the current data block, fVarG denotes the variance of the G component of the current data block, fVarB denotes the variance of the B component of the current data block, and R, G, B denote the red, green and blue primary color components of the current data block.
3. The image blur detection method according to claim 2, characterized in that calculating the refined gradient value of each valid data block specifically comprises:
dividing each valid data block into a preset number of data sub-blocks, and calculating the gradient variance of each data sub-block and the gradient maximum of each data sub-block from the joint gradients of the pixels in each data sub-block;
if the gradient variance of a data sub-block is not less than a preset gradient variance threshold, marking the data sub-block as a valid data sub-block; and
calculating an average of the gradient maxima of the valid data sub-blocks, and taking the average of the gradient maxima of the valid data sub-blocks as the refined gradient value of the valid data block.
4. The image blurring detection method according to claim 3, characterised in that the method of detecting whether the image is blurred according to the global refined gradient of the image data comprises:
comparing the global refined gradient of the image data with a fourth gradient threshold and a fifth gradient threshold;
if the global refined gradient of the image data is larger than the fourth gradient threshold, judging the image to be a clearly sharp image;
if the global refined gradient of the image data is smaller than the fifth gradient threshold, judging the image to be a clearly blurred image;
if the global refined gradient of the image data lies between the fourth gradient threshold and the fifth gradient threshold, calculating the horizontal edge width and the vertical edge width of the image data;
comparing the horizontal edge width and the vertical edge width of the image data with an edge width threshold;
if the horizontal edge width or the vertical edge width is smaller than the edge width threshold, judging the image to be a sharp image;
if the horizontal edge width and the vertical edge width are both not smaller than the edge width threshold, judging the image to be a blurred image.
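The branching of claim 4 can be summarised as follows; the threshold values t4, t5 and edge_width_thresh are arbitrary placeholders, and in practice the edge widths would only be computed in the borderline branch.

    def classify(global_refined_grad, h_edge_width, v_edge_width,
                 t4=12.0, t5=4.0, edge_width_thresh=5.0):
        # Decision branches of claim 4; all threshold values are placeholders.
        if global_refined_grad > t4:
            return "sharp"      # clearly sharp image
        if global_refined_grad < t5:
            return "blurred"    # clearly blurred image
        # Borderline case: fall back to the edge widths.
        if h_edge_width < edge_width_thresh or v_edge_width < edge_width_thresh:
            return "sharp"
        return "blurred"
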
5. The image blurring detection method according to claim 4, characterised in that calculating the horizontal edge width and the vertical edge width of the image data comprises:
traversing the vertical-direction gradient of each pixel in the image data row by row to obtain the positions of vertical-direction edge pixels, and counting the horizontal edge width of each row from the positions of the edge pixels;
calculating the mean of the horizontal edge widths of the rows, and taking this mean as the horizontal edge width of the image data;
traversing the horizontal-direction gradient of each pixel in the image data column by column to obtain the positions of horizontal-direction edge pixels, and counting the vertical edge width of each column from the positions of the edge pixels;
calculating the mean of the vertical edge widths of the columns, and taking this mean as the vertical edge width of the image data.
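Claim 5 leaves the exact counting of edge width open; the sketch below adopts one plausible reading in which a pixel is treated as an edge pixel when its vertical-direction gradient exceeds a placeholder threshold, and the widths of maximal runs of consecutive edge pixels in each row are averaged.

    import numpy as np

    def mean_edge_width_along_rows(gray, grad_thresh=20.0):
        # Row-wise horizontal edge width under the run-length reading described above.
        g = gray.astype(np.float64)
        grad_v = np.zeros_like(g)
        grad_v[:-1, :] = np.abs(np.diff(g, axis=0))  # vertical-direction gradient
        widths = []
        for row in (grad_v > grad_thresh):           # traverse row by row
            run = 0
            for is_edge in row:
                if is_edge:
                    run += 1
                elif run:
                    widths.append(run)
                    run = 0
            if run:
                widths.append(run)
        return float(np.mean(widths)) if widths else 0.0

The column-wise vertical edge width can be obtained symmetrically, for instance by applying the same routine to the transposed image so that the horizontal-direction gradient is traversed column by column.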
6. An image blurring detection device, characterised in that it comprises:
a data acquisition module, configured to obtain image data of one frame of a digital image;
a gradient calculation module, configured to calculate multi-dimensional gradient information for each pixel in the image data;
a joint gradient calculation module, configured to calculate joint gradient information of the image data from the multi-dimensional gradient information of each pixel, the joint gradient information of the image data comprising a joint gradient mean and a joint gradient maximum of the image data;
a global refined gradient calculation module, configured to calculate a global refined gradient of the image data when the joint gradient information of the image data satisfies a first precondition, the first precondition being: the joint gradient mean of the image data lies between a first gradient threshold and a second gradient threshold; or the joint gradient mean of the image data is smaller than the second gradient threshold and the joint gradient maximum of the image data is not smaller than a third gradient threshold; wherein the third gradient threshold lies between the first gradient threshold and the second gradient threshold;
a blur detection module, configured to detect whether the image is blurred according to the global refined gradient of the image data;
wherein the global refined gradient calculation module comprises:
a large data block division unit, configured to divide the image data into a predetermined number of large data blocks;
a valid large data block acquisition unit, configured to mark a large data block as a valid large data block when the color variance fColorVar of the large data block is not smaller than a preset color variance threshold;
a first calculation unit, configured to calculate a refined gradient value for each valid large data block;
a second calculation unit, configured to calculate the mean of the refined gradient values of the valid large data blocks and to take this mean as the global refined gradient of the image data.
7. The image blurring detection device according to claim 6, characterised in that the global refined gradient calculation module further comprises:
a color variance calculation unit, configured to calculate the color variance fColorVar of each large data block according to the formula fColorVar = sqrt(fVarR + fVarG + fVarB), wherein fVarR denotes the variance of the R component of the current large data block, fVarG denotes the variance of its G component and fVarB denotes the variance of its B component; R, G and B denote the red, green and blue primary color components of the current large data block.
8. The image blurring detection device according to claim 7, characterised in that the first calculation unit specifically comprises:
a small data block division sub-unit, configured to divide the valid large data block into a predetermined number of small data blocks and to calculate the gradient variance and the gradient maximum of each small data block from the joint gradients of the pixels in that small data block;
a valid small data block acquisition sub-unit, configured to mark a small data block as a valid small data block when its gradient variance is not smaller than a preset gradient variance threshold;
a calculation sub-unit, configured to calculate the mean of the gradient maxima of the valid small data blocks and to take this mean as the refined gradient value of the valid large data block.
9. The image blurring detection device according to claim 8, characterised in that the blur detection module comprises:
a first comparison unit, configured to compare the global refined gradient of the image data with a fourth gradient threshold and a fifth gradient threshold;
a first judgement unit, configured to judge the image to be a clearly sharp image when the global refined gradient of the image data is larger than the fourth gradient threshold;
a second judgement unit, configured to judge the image to be a clearly blurred image when the global refined gradient of the image data is smaller than the fifth gradient threshold;
an edge width calculation unit, configured to calculate the horizontal edge width and the vertical edge width of the image data when the global refined gradient of the image data lies between the fourth gradient threshold and the fifth gradient threshold;
a second comparison unit, configured to compare the horizontal edge width and the vertical edge width of the image data with an edge width threshold;
a third judgement unit, configured to judge the image to be a sharp image when the horizontal edge width or the vertical edge width is smaller than the edge width threshold;
a fourth judgement unit, configured to judge the image to be a blurred image when the horizontal edge width and the vertical edge width are both not smaller than the edge width threshold.
10. The image blurring detection device according to claim 9, characterised in that the edge width calculation unit comprises:
a vertical edge determination sub-unit, configured to traverse the vertical-direction gradient of each pixel in the image row by row to obtain the positions of vertical-direction edge pixels, and to count the horizontal edge width of each row from the positions of the edge pixels;
a horizontal edge width calculation sub-unit, configured to calculate the mean of the horizontal edge widths of the rows and to take this mean as the horizontal edge width of the image data;
a horizontal edge determination sub-unit, configured to traverse the horizontal-direction gradient of each pixel in the image column by column to obtain the positions of horizontal-direction edge pixels, and to count the vertical edge width of each column from the positions of the edge pixels;
a vertical edge width calculation sub-unit, configured to calculate the mean of the vertical edge widths of the columns and to take this mean as the vertical edge width of the image data.
CN201510169849.6A 2015-04-10 2015-04-10 Image blurring detection method and device Active CN104867128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510169849.6A CN104867128B (en) 2015-04-10 2015-04-10 Image blurring detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510169849.6A CN104867128B (en) 2015-04-10 2015-04-10 Image blurring detection method and device

Publications (2)

Publication Number Publication Date
CN104867128A CN104867128A (en) 2015-08-26
CN104867128B true CN104867128B (en) 2017-10-31

Family

ID=53912943

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510169849.6A Active CN104867128B (en) 2015-04-10 2015-04-10 Image blurring detection method and device

Country Status (1)

Country Link
CN (1) CN104867128B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106296665B (en) * 2016-07-29 2019-05-14 北京小米移动软件有限公司 Card image fuzzy detection method and apparatus
CN106599783B (en) * 2016-11-09 2020-01-14 浙江宇视科技有限公司 Video occlusion detection method and device
CN107945156A (en) * 2017-11-14 2018-04-20 宁波江丰生物信息技术有限公司 Method for automatically evaluating the image quality of digital pathology scan images
CN108629766A (en) * 2018-04-26 2018-10-09 北京大米科技有限公司 Image blur detection method and device, computer equipment and readable storage medium
CN110111261B (en) * 2019-03-28 2021-05-28 瑞芯微电子股份有限公司 Adaptive balance processing method for image, electronic device and computer readable storage medium
CN111161211B (en) * 2019-12-04 2023-11-03 成都华为技术有限公司 Image detection method and device
CN111047575A (en) * 2019-12-12 2020-04-21 青海奥珞威信息科技有限公司 Unmanned aerial vehicle power line patrol image quality blind evaluation method
CN112085701B (en) * 2020-08-05 2024-06-11 深圳市优必选科技股份有限公司 Face ambiguity detection method and device, terminal equipment and storage medium


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9065999B2 (en) * 2011-03-24 2015-06-23 Hiok Nam Tay Method and apparatus for evaluating sharpness of image

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101877127A (en) * 2009-11-12 2010-11-03 北京大学 Image reference-free quality evaluation method and system based on gradient profile
CN101996406A (en) * 2010-11-03 2011-03-30 中国科学院光电技术研究所 No-reference structure definition image quality evaluation method
CN103455994A (en) * 2012-05-28 2013-12-18 佳能株式会社 Method and equipment for determining image blurriness
CN102968800A (en) * 2012-12-14 2013-03-13 宁波江丰生物信息技术有限公司 Image definition evaluation method
CN104134204A (en) * 2014-07-09 2014-11-05 中国矿业大学 Image definition evaluation method and image definition evaluation device based on sparse representation

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Gradient-based Sharpness Function; Maria Rudnaya; Proceedings of the World Congress on Engineering; 2011-07-08; pp. 1-6 *
A new high-sensitivity focus evaluation function; Huang Yi et al.; Microcomputer Information; 2009-09-25; Vol. 25, No. 9-3; p. 163, Section 2.3, paragraph 1 *
Quality assessment of defocus-blurred images based on gradient and HVS characteristics; Huang Longhua; Application Research of Computers; 2010-02-15; pp. 781-783 *
No-reference quality assessment of blurred images based on gradient structural similarity; Sang Qingbing et al.; Journal of Optoelectronics·Laser; 2013-03-15; Vol. 24, No. 3; pp. 573-577 *
A sharpness evaluation algorithm based on gradient threshold counting; Zhang Hongfei et al.; Science Technology and Engineering; 2013-12-08; Vol. 13, No. 34; p. 10365, right column, paragraph 1, and p. 10366, left column, paragraph 1 *

Also Published As

Publication number Publication date
CN104867128A (en) 2015-08-26

Similar Documents

Publication Publication Date Title
CN104867128B (en) Image blurring detection method and device
US6141434A (en) Technique for processing images
US6556708B1 (en) Technique for classifying objects within an image
US10810438B2 (en) Setting apparatus, output method, and non-transitory computer-readable storage medium
US6400830B1 (en) Technique for tracking objects through a series of images
US6421462B1 (en) Technique for differencing an image
CN109284674A (en) Method and device for determining lane lines
CN102737370B (en) Method and device for detecting image foreground
CN105163110A (en) Camera cleanliness detection method and system and shooting terminal
US11534063B2 (en) Interpupillary distance measuring method, wearable ophthalmic device and storage medium
CN105678811A (en) Motion-detection-based human body abnormal behavior detection method
CN106462953A (en) Image processing system and computer-readable recording medium
CN106030653A (en) Image processing system and method for generating high dynamic range image
US6434271B1 (en) Technique for locating objects within an image
CN111340749B (en) Image quality detection method, device, equipment and storage medium
US9280209B2 (en) Method for generating 3D coordinates and mobile terminal for generating 3D coordinates
CN106447701A (en) Methods and devices for image similarity determination, object detection and object tracking
Le Meur et al. A spatio-temporal model of the selective human visual attention
CN104077776B (en) Visual background extraction method based on adaptive color space updates
CN105469427B (en) Target tracking method for video
CN108133488A (en) Infrared image foreground detection method and device
CN103096117B (en) Video noise detection method and device
CN109166137A (en) Moving object detection algorithm for jittery video sequences
US6240197B1 (en) Technique for disambiguating proximate objects within an image
CN106611165B (en) Vehicle window detection method and device based on correlation filtering and color matching

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant