CN100474337C - Restoration method for noisy motion-blurred images based on a radial basis function neural network


Info

Publication number
CN100474337C
CN100474337C (grant) · Application CNB2006100534659A / CN200610053465A
Authority
CN
China
Legal status
Expired - Fee Related
Application number
CNB2006100534659A
Other languages
Chinese (zh)
Other versions
CN101079149A (en)
Inventor
朱信忠
赵建民
徐慧英
章琳
Current Assignee
Zhejiang Normal University CJNU
Original Assignee
Zhejiang Normal University CJNU
Application filed by Zhejiang Normal University CJNU
Priority to CNB2006100534659A
Publication of CN101079149A
Application granted
Publication of CN100474337C

Abstract

The invention discloses a restoration method for noisy motion-blurred images based on a radial basis function neural network, comprising the following steps: (1) low-pass filter the noisy motion-blurred image y(m,n) with a two-dimensional median filter to generate the corresponding smoothed image s(m,n); (2) compute the error image e(m,n) = y(m,n) − s(m,n) between the noisy motion-blurred image and the smoothed image; (3) perform edge detection with the Canny operator to estimate the gradient f′(m,n) of y(m,n); (4) compute the regularization parameter λ(m,n) of each pixel according to the magnitude of f′(m,n), and generate the interpolation image f_λ(m,n) from the error image e(m,n) with a radial basis function neural network (RBFN); (5) superimpose the interpolation image f_λ(m,n) on the low-pass-filtered smoothed image s(m,n) to obtain the denoised motion-blurred image f(m,n); (6) automatically identify the motion-blur direction and motion-blur length of the motion-blurred image, obtain the two-dimensional blur function of width H_L and height V_L, and obtain the restored image with an image restoration method. The invention achieves automatic identification, low computational complexity and good restoration quality.

Description

Restoration method for noisy motion-blurred images based on a radial basis function neural network
Technical field
The present invention relates to a method for restoring noisy motion-blurred images.
Background art
Research on and applications of noisy blurred-image restoration are extensive: restoring license plates blurred by high-speed motion in traffic-violation or accident scenes, restoring defocused or motion-blurred snapshots of suspects, restoring blurred traces at crime scenes, restoring blurred frames from surveillance recordings, and so on. Image degradation during imaging generally arises from defocus blur, relative high-speed motion between the imaging device and the object, inherent shortcomings of the equipment and materials, camera shake, and external noise. Among these, the sharpening restoration of an isolated single noisy image with local non-uniform motion blur is the most difficult, because the causes of the blur are complex, the image damage is severe, and no matched sequence of related frames is available for reference.
At present, the restoration of motion-blurred images has become a research focus at home and abroad. The key problem of motion-blur restoration is determining the point spread function (PSF). For a single motion-blurred image, the unknown PSF parameters must be estimated from the blurred image itself. A common PSF estimation approach is to observe the spectrum of the blurred image and extract the blur direction and blur length from it, but such methods require manual intervention and cannot identify the parameters automatically. Scholars at home and abroad have proposed other automatic identification algorithms in recent years, but in the presence of noise these methods fail or their identification accuracy is severely degraded. Noise should therefore be removed before the PSF of a motion-blurred image is estimated and restoration is performed; however, many current denoising filters either have high computational complexity or lose important high-frequency information of the image, which greatly reduces the quality of the motion-blur restoration.
The shortcomings of existing restoration methods for noisy motion-blurred images are: (1) automatic identification of the blur parameters is not possible, and the identification accuracy is low; (2) the denoising filters have high computational complexity; (3) the denoising process loses important high-frequency information of the image, which reduces the quality of the motion-blur restoration.
Summary of the invention
To overcome the inability of existing restoration methods for noisy motion-blurred images to identify the blur parameters automatically, their high computational complexity and their poor restoration quality, the invention provides a restoration method for noisy motion-blurred images based on a radial basis function neural network that achieves automatic identification, low computational complexity and good restoration quality.
The technical solution adopted by the present invention to solve the technical problem is:
A restoration method for noisy motion-blurred images based on a radial basis function neural network, the restoration method comprising the following steps:
(1) Define the noisy motion-blurred image as y(m,n). First apply a two-dimensional mean filter to y(m,n) to generate the corresponding low-pass-filtered smoothed image:
$$s(m,n)=\frac{1}{(2K+1)^{2}}\sum_{i=-K}^{K}\sum_{j=-K}^{K}y(m+i,\,n+j)\qquad(1)$$
In formula (1), 2K+1 is the size of the filter window; the window size is kept odd in both directions so that the image is not shifted; i denotes the i-th row and j the j-th column.
(2) Compute the error image e(m,n) = y(m,n) − s(m,n) between the noisy motion-blurred image y(m,n) and the low-pass-filtered smoothed image s(m,n).
(3) Perform edge detection with the Canny operator (an optimal edge-detection operator with a low false-detection rate, high localization accuracy and suppression of spurious edges) to estimate the gradient f′(m,n) of y(m,n).
(4) Compute the regularization parameter λ(m,n) of each pixel according to the magnitude of f′(m,n):
$$\lambda(m,n)=A\,e^{-a\,|f'(m,n)|}\qquad(2)$$
In formula (2), A and a are constants, A = qσ², where q is a proportionality constant; denoting the minimum of λ(m,n) by λ_min and the maximum of |f′(m,n)| by f′_max, a = log(qσ²/λ_min)/f′_max.
Then, with reference to the error image e(m,n), generate the interpolation image f_λ(m,n) with the radial basis function neural network RBFN; for each pixel with parameter λ(m,n), f_λ(m,n) is computed as
$$f_{\lambda(m,n)}(m,n)=\big[(V\otimes V)(\Lambda\otimes\Lambda)\big]_{(m-1)N+n}\cdot\big(\Lambda\otimes\Lambda+\lambda(m,n)\,I\big)^{-1}\cdot\big(V^{T}\otimes V^{T}\big)\,y\qquad(3)$$
In formula (3), [(V⊗V)(Λ⊗Λ)]_{(m−1)N+n} denotes the ((m−1)N+n)-th row of the matrix (V⊗V)(Λ⊗Λ).
(5) Superimpose the interpolation image f_λ(m,n) on the low-pass-filtered smoothed image s(m,n) to obtain the denoised motion-blurred image f(m,n):
$$f(m,n)=f_{\lambda}(m,n)+s(m,n)\qquad(4)$$
(6) Automatically identify the motion-blur direction and motion-blur length of the motion-blurred image f(m,n), obtain the two-dimensional blur function of width H_L and height V_L, and obtain the restored image with an image restoration algorithm.
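For illustration only, a minimal Python sketch of the denoising front end of steps (1) to (4) is given below. The window half-width K, the proportionality constant q, the noise variance σ² and λ_min are illustrative values rather than values prescribed by the patent, and a Sobel gradient magnitude stands in for the Canny-based estimate of f′(m, n).

```python
import numpy as np
from scipy import ndimage

def variable_lambda(y, K=2, q=2.0, sigma2=400.0, lam_min=1e-3):
    """Steps (1)-(4): smoothed image, error image, gradient and per-pixel lambda."""
    y = y.astype(float)
    # (1) smoothed image s(m, n): (2K+1) x (2K+1) moving average, eq. (1)
    s = ndimage.uniform_filter(y, size=2 * K + 1, mode='reflect')
    # (2) error image e(m, n) = y(m, n) - s(m, n)
    e = y - s
    # (3) gradient magnitude |f'(m, n)| (Sobel stand-in for the Canny-based estimate)
    grad = np.hypot(ndimage.sobel(y, axis=0), ndimage.sobel(y, axis=1))
    # (4) per-pixel regularization parameter, eq. (2):
    #     lambda(m, n) = A * exp(-a * |f'(m, n)|), A = q * sigma^2,
    #     a = log(q * sigma^2 / lam_min) / f'_max
    A = q * sigma2
    a = np.log(A / lam_min) / max(grad.max(), 1e-12)
    lam = A * np.exp(-a * grad)
    return s, e, grad, lam
```

The per-pixel λ(m, n) produced here feeds the RBFN interpolation of formula (3); a sketch of that computation is given after the Kronecker-product discussion below.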
As one preferred scheme: in step (1), a local motion-blur object is first extracted from the noisy motion-blurred image, and the low-pass filtering is then applied to the local motion-blur object y(m,n). For a single image in which the gray levels of the target object and the background differ markedly, the extraction steps are:
(1.1) segment the noisy moving image with a suitable gray threshold;
(1.2) generate a suitable rectangular matching template in a preset proportion to the size of the image;
(1.3) apply a mathematical-morphology closing operation to the segmented image with the generated rectangular template to extract the larger rectangle-like bodies, then apply a morphological opening operation to delete the smaller bodies from the image.
Or: in step (1), a local motion-blur object is first extracted from the noisy motion-blurred image, and the low-pass filtering is then applied to the local motion-blur object y(m,n). For a single image in which the gray levels of the target object and the background are close, the extraction steps are:
(1.1) perform edge detection on the noisy motion-blurred image using the Prewitt operator and the Canny operator combined with a logical AND operation;
(1.2) detect, with the Radon transform, all line segments in the binary edge map whose length exceeds L_min, and save their start points, end points and angles with respect to the horizontal direction;
(1.3) classify the line segments by angle, and match in pairs those segments whose angle difference is smaller in absolute value than θ_min and whose distance apart exceeds L_min;
(1.4) for each pair of matched segments, check with the eight-neighborhood connectivity method whether their start points and end points are connected by line segments; if so, segment out the rectangular region enclosed by the four vertices of the two segments and save it as a new image containing the local motion-blur object; otherwise, continue the detection until no such matched segments remain.
As another preferred scheme: in step (4), a threshold R is set; when |f′(m,n)| is smaller than R, the regularization parameter of the pixel is fixed at the value of λ(m,n) for |f′(m,n)| = 0; when |f′(m,n)| is greater than R, λ(m,n) is still computed by formula (2).
As a further preferred scheme: in step (6), values of α are taken over α ∈ [−90°, 90°] with a set step size, and the motion-blur direction is identified automatically by the following steps:
(6.1) with the bicubic C-spline interpolation method, perform cubic C-spline interpolation along rows and columns at coordinates (i, j) to obtain the directional derivative image of the image:
$$\Delta f(i,j)_{\alpha}=f(i',j')-f(i,j)\qquad(5)$$
In formula (5), α is the direction angle of the directional differentiation, and the value of f(i′, j′) is obtained by interpolation of the blurred image f(i, j), where i′ = i + Δr·sin α, j′ = j + Δr·cos α, and Δr is the differentiation length of the directional differentiation;
(6.2) form the weighted sum of the absolute gray values of the directional derivative image Δf(i,j)_α:
$$I(\Delta f)_{\alpha}=\sum_{i=0}^{N-1}\sum_{j=0}^{M-1}\big|\Delta f(i,j)_{\alpha}\big|\cdot p(\Delta f)\qquad(6)$$
In formula (6), the frequency p(Δf) with which the gray level Δf occurs is used as the weighting coefficient;
(6.3) find the minimum min(I(Δf)_α); the corresponding α is the angle between the motion-blur direction and the horizontal axis, i.e.
$$\varphi=\arg\min_{\alpha} I(\Delta f)_{\alpha}.$$
Further, in step (6), after the automatic identification of the motion-blur direction is completed, the motion-blurred image is rotated back to the horizontal direction, and the motion-blur length is identified automatically by the following steps:
(6.4) let F(u,v) be the image after Fourier transformation of the motion-blurred image; compute log(|F(u,v)|) and shift the coordinates so that u = 0 lies at the center of the spectrogram;
(6.5) compute
$$S(u)=\sum_{v=0}^{mid}\log\big(|F(u,v)|\big),$$
then, starting from the center, search to the left for the k-th (k > 1) minimum point and record its u value u_Lk, and, starting from the center, search to the right for the k-th minimum point and record its u value u_Rk;
(6.6) compute the blur length of the motion-blurred image f(x,y) at any angle as
$$L\approx\mathrm{Round}\big(2kN/|u_{Lk}-u_{Rk}|\big)\qquad(7)$$
Further again, in step (6), after the automatic identification of the motion-blur direction and length is completed, the two-dimensional blur function of width H_L and height V_L is computed, and image restoration is performed with optimum-window Wiener filtering according to the correspondence between the optimum-window regions and the element values within each region.
The noisy motion-blurred image may be a black-and-white image or a color image.
The technical conception of the invention is as follows. Let the motion-blurred image f(m,n) be degraded by noise during acquisition and transmission into the noisy motion-blurred image y(m,n); the two are related by y(m,n) = f(m,n) + ε(m,n), where ε(m,n) is additive white Gaussian noise with variance σ². The universal approximation property of the radial basis function neural network (RBFN) is exploited, and a variable regularization parameter λ proportional to the noise is chosen for each pixel according to the local image features, so that noise is suppressed effectively without destroying important image information. The center x_{(m−1)N+n} of the radial basis function (RBF) is taken as the position of the current pixel (m,n), and the corresponding RBF output value is taken as the value of that pixel. With this correspondence, the RBFN method can be used to obtain, from the noisy motion-blurred image y(m,n), the motion-blurred image f(m,n) with the noise removed. The method involves two main aspects:
The RBFN minimizes the cost function
$$H[f]=\sum_{i=1}^{L}\big(y_{i}-f(x_{i})\big)^{2}+\lambda S[f]$$
and generates the interpolation image
$$f_{\lambda}\triangleq\big(f_{\lambda}(x_{1}),f_{\lambda}(x_{2}),\cdots,f_{\lambda}(x_{L})\big)^{T}=Gc=G(G+\lambda I)^{-1}y\qquad(8)$$
In the cost function, the first term represents the deviation between the raw data and the expected values, and the second term is a cost measure related to smoothness; the balance between the two is determined by the parameter λ. The algorithm adjusts the regularization parameter λ of each pixel (m,n) automatically according to the magnitude of |f′(m,n)|; the choice is given by formula (2),
$$\lambda(m,n)=A\,e^{-a\,|f'(m,n)|}\qquad(2)$$
where A and a are constants, A = qσ² (q is a proportionality constant); denoting the minimum of λ(m,n) by λ_min and the maximum of |f′(m,n)| by f′_max, a is obtained as a = log(qσ²/λ_min)/f′_max.
If the RBFN is computed directly from formula (8), the amount of computation is very large, so the Kronecker-product property is used to obtain the RBFN interpolation image quickly. According to the Gaussian distribution law of the noise, the matrix G can be expressed as
$$G=G_{s}\otimes G_{s}=\big(V\Lambda V^{T}\big)\otimes\big(V\Lambda V^{T}\big)\qquad(9)$$
where Λ = diag(μ_1, μ_2, …, μ_N) and V = (v_1, v_2, …, v_N) are known, and μ_i and v_i are the eigenvalues and eigenvectors of G_s. Substituting into formula (8) and simplifying gives
$$f_{\lambda}=G(G+\lambda I)^{-1}y=(V\otimes V)(\Lambda\otimes\Lambda)\big(\Lambda\otimes\Lambda+\lambda I\big)^{-1}\big(V^{T}\otimes V^{T}\big)\,y\qquad(10)$$
In this way the time complexity of computing the RBFN output vector f_λ from the noisy motion-blurred image y drops from O(N⁶) to O(N³). On the basis of this conclusion, the RBFN value of each pixel with its own λ(m,n) can be computed as
$$f_{\lambda(m,n)}(m,n)=\big[(V\otimes V)(\Lambda\otimes\Lambda)\big]_{(m-1)N+n}\cdot\big(\Lambda\otimes\Lambda+\lambda(m,n)\,I\big)^{-1}\cdot\big(V^{T}\otimes V^{T}\big)\,y\qquad(3)$$
where [(V⊗V)(Λ⊗Λ)]_{(m−1)N+n} denotes the ((m−1)N+n)-th row of the matrix (V⊗V)(Λ⊗Λ). It can be seen that computing the RBFN value with a variable regularization parameter λ(m,n) for every pixel has time complexity O(N⁴), higher than the RBFN computation with a fixed regularization parameter. The distribution of |f′(m,n)| is therefore used to reduce the amount of computation: since most values in the gray-level histogram of |f′(m,n)| are concentrated near 0, a threshold R is set; when |f′(m,n)| is smaller than R, the regularization parameter of those pixels is fixed at the value of λ(m,n) for |f′(m,n)| = 0, and when it is greater than R, λ(m,n) is still computed by formula (2). With this optimization, even though an RBFN output value with a variable regularization parameter is computed for every pixel, the time complexity remains of order O(N³).
Low-pass filtering effectively removes the noise-bearing part of the information in smooth regions, but as jagged edges are smoothed, important edge information is also lost (as shown in Fig. 1(a) and Fig. 1(b)). Therefore, the error produced by this process is first kept in e(m,n), and the RBFN extracts the lost edge information from the noisy error image e(m,n) (as shown in Fig. 1(c)); then a smaller regularization parameter is chosen near edges to produce an interpolation image f_λ(m,n) consistent with e(m,n), while a larger regularization parameter is chosen in smooth regions to make f_λ(m,n) smooth (as shown in Fig. 1(c) and Fig. 1(d)); finally the denoised image is obtained as f_λ(m,n) + s(m,n).
Situations of the following kind are common in daily life: a moving object moves at high speed relative to the imaging device and causes motion blur while the background is relatively static, or several imaging targets undergo scattered, non-uniform motion. If such locally motion-blurred scenes are processed globally as a single image, the originally sharp background may be artificially blurred; moreover, a locally and non-uniformly blurred image does not possess, or only weakly possesses, the global characteristics of an ordinary motion-blurred image, which causes most of the currently proposed automatic identification methods for motion-blur parameters to fail, so that a clear and effective restoration cannot be achieved.
When the local motion-blur object is extracted, it is found that the newly generated image containing only the range of the blurred object clearly exhibits the characteristics of an ordinary motion-blurred image, and it can be restored effectively by the automatic-identification image restoration method.
For the restoration of a single image with local motion blur, the key problem is to extract the motion-blur object when the single frame lacks reference information from a related frame sequence. Taking a single-frame video image from a multi-lane road-condition monitoring system as an example, as shown in Fig. 7, the problem can be simplified with prior knowledge: the objects to be extracted in this application are mainly the cars moving at high speed in each lane, whose shape is close to an oblique rectangle; on this basis, the motion-blur objects can be extracted in the spatial domain.
The automatic identification of the motion-blur direction is based on the following principle: the original image can be regarded as an isotropic first-order Markov process, i.e. the autocorrelation of the original image and its power spectrum are isotropic. Motion blur reduces the high-frequency components of the image along the direction of motion, affects the high-frequency components in other directions less (the larger the deviation from the motion direction, the smaller the effect), and has almost no effect on the high-frequency components perpendicular to the motion direction. Therefore, if directional high-pass filtering (i.e. directional differentiation) is applied to the motion-blurred image, then exactly when the filtering direction coincides with the motion-blur direction, the high-frequency components of the blurred image in that direction are smallest, the high-pass filtering causes the largest energy loss, and the sum of the absolute gray values of the resulting derivative image is smallest. The direction for which the sum of absolute gray values obtained by directional high-pass filtering is smallest is therefore the blur direction of the motion-blurred image.
After the automatic identification of the direction is completed, the motion-blurred image can be rotated back to the horizontal direction so that its motion-blur length can be identified automatically. The basic principle of automatic length identification is that the spectrum of a motion-blurred image exhibits dark bands at specific intervals in the frequency domain; in particular, for a uniformly moving image after denoising, the motion-blur length can be estimated accurately from the positions of the equally spaced dark bands in the frequency domain.
Restoring the two-dimensional blurred image (of width H_L and height V_L) with optimum-window Wiener filtering [25] effectively removes the restoration errors caused by differing mean intensities at the image boundary and thereby suppresses the ringing phenomenon.
The RBFN method with a variable regularization parameter is chosen to remove the influence of noise; the RBFN algorithm suppresses noise effectively without destroying important image information, and by transforming the computation with the Kronecker-product property, the RBFN interpolation output with a variable regularization parameter can be computed for every pixel while keeping the low time complexity of O(N³).
The beneficial effects of the invention are mainly: 1. the blur parameters of the image can be identified automatically; 2. the computational complexity is low; 3. the restoration quality is good; 4. the identification accuracy is high; 5. not only is the automatic identification accuracy of the PSF parameters of a single image very high, but realistic, complicated imaging conditions such as local non-uniform motion blur and noise are also taken into account.
Description of drawings
Fig. 1 is a schematic diagram of the basic principle of denoising with the variable-regularization-parameter RBFN method: (a) original image; (b) smoothed image after low-pass filtering; (c) error image; (d) interpolation image.
Fig. 2 is a schematic diagram of denoising the image cameraman with the variable-regularization-parameter RBFN method: (a) noisy motion-blurred image; (b) smoothed image after low-pass filtering; (c) error image; (d) interpolation image; (e) motion-blurred image after denoising; (f) spectrum of Fig. 2(e).
Fig. 3 is a schematic diagram of denoising the image lena with the variable-regularization-parameter RBFN method: (a) noisy motion-blurred image; (b) smoothed image after low-pass filtering; (c) error image; (d) interpolation image; (e) motion-blurred image after denoising; (f) spectrum of Fig. 3(e).
Fig. 4 is a schematic diagram of the correspondence between the optimum-window regions and the element values within each region.
Fig. 5 is a schematic diagram of the restoration process for noisy motion-blurred images and a comparison of identification accuracy and restoration quality before and after denoising.
Fig. 6 shows the restoration results of the denoised motion-blurred images: (a) restoration result for Fig. 2(e); (b) restoration result for Fig. 3(e).
Fig. 7 is a schematic diagram of the restoration of a locally motion-blurred image: (a) original blurred image; (b) spectrum of Fig. 7(a); (c) threshold-segmented image; (d) image of the extracted bodies; (e) image containing only the blurred object; (f) spectrum of Fig. 7(e); (g) restoration result for Fig. 7(e).
Fig. 8 compares the automatic identification results of the PSF parameters: (a) comparison of three different methods for automatic identification of the blur direction with the blur length fixed; (b) comparison of three different methods for automatic identification of the blur length with the blur direction fixed.
Embodiment
The invention is further described below with reference to the accompanying drawings.
Embodiment 1
With reference to Fig. 1 to Fig. 6 and Fig. 8, a restoration method for noisy motion-blurred images based on a radial basis function neural network comprises the following steps:
(1) Define the noisy motion-blurred image as y(m,n). First apply a two-dimensional median filter to y(m,n) to generate the corresponding low-pass-filtered smoothed image:
$$s(m,n)=\frac{1}{(2K+1)^{2}}\sum_{i=-K}^{K}\sum_{j=-K}^{K}y(m+i,\,n+j)\qquad(1)$$
In formula (1), 2K+1 is the size of the filter window; the window size is kept odd in both directions so that the image is not shifted; i denotes the i-th row and j the j-th column.
(2) Compute the error image e(m,n) = y(m,n) − s(m,n) between the noisy motion-blurred image y(m,n) and the low-pass-filtered smoothed image s(m,n).
(3) Perform edge detection with the Canny operator (an optimal edge-detection operator with a low false-detection rate, high localization accuracy and suppression of spurious edges) to estimate the gradient f′(m,n) of y(m,n).
(4) Compute the regularization parameter λ(m,n) of each pixel according to the magnitude of f′(m,n):
$$\lambda(m,n)=A\,e^{-a\,|f'(m,n)|}\qquad(2)$$
In formula (2), A and a are constants, A = qσ², where q is a proportionality constant; denoting the minimum of λ(m,n) by λ_min and the maximum of |f′(m,n)| by f′_max, a = log(qσ²/λ_min)/f′_max.
Then, with reference to the error image e(m,n), generate the interpolation image f_λ(m,n) with the radial basis function neural network RBFN; for each pixel with parameter λ(m,n), f_λ(m,n) is computed as
$$f_{\lambda(m,n)}(m,n)=\big[(V\otimes V)(\Lambda\otimes\Lambda)\big]_{(m-1)N+n}\cdot\big(\Lambda\otimes\Lambda+\lambda(m,n)\,I\big)^{-1}\cdot\big(V^{T}\otimes V^{T}\big)\,y\qquad(3)$$
In formula (3), [(V⊗V)(Λ⊗Λ)]_{(m−1)N+n} denotes the ((m−1)N+n)-th row of the matrix (V⊗V)(Λ⊗Λ).
(5) Superimpose the interpolation image f_λ(m,n) on the low-pass-filtered smoothed image s(m,n) to finally obtain the denoised motion-blurred image f(m,n):
$$f(m,n)=f_{\lambda}(m,n)+s(m,n)\qquad(4)$$
(6) Automatically identify the motion-blur direction and motion-blur length of the motion-blurred image f(m,n), obtain the two-dimensional blur function of width H_L and height V_L, and obtain the restored image with an image restoration algorithm.
Low-pass filtering effectively removes the noise-bearing part of the information in smooth regions, but as jagged edges are smoothed, important edge information is also lost (as shown in Fig. 1(a) and Fig. 1(b)). Therefore, the error produced by this process is first kept in e(m,n), and the RBFN extracts the lost edge information from the noisy error image e(m,n) (as shown in Fig. 1(c)); then a smaller regularization parameter is chosen near edges to produce an interpolation image f_λ(m,n) consistent with e(m,n), while a larger regularization parameter is chosen in smooth regions to make f_λ(m,n) smooth (as shown in Fig. 1(c) and Fig. 1(d)); finally the denoised image is obtained as f_λ(m,n) + s(m,n). The specific implementation is given by Algorithm 1:
Algorithm 1. RBFN denoising algorithm with a variable regularization parameter.
Step 1. Apply a two-dimensional mean filter to the noisy motion-blurred image y(m,n) to generate the corresponding low-pass-filtered smoothed image s(m,n) according to formula (1);
Step 2. Compute the error image e(m,n) = y(m,n) − s(m,n) between the noisy motion-blurred image y(m,n) and the low-pass-filtered smoothed image s(m,n);
Step 3. Perform edge detection with the Canny operator to estimate the gradient f′(m,n) of y(m,n);
Step 4. Compute the regularization parameter of each pixel according to the magnitude of f′(m,n), and then, with reference to the error image e(m,n), generate the interpolation image f_λ(m,n) with the RBFN;
Step 5. Superimpose the interpolation image f_λ(m,n) on the low-pass-filtered smoothed image s(m,n) to finally obtain the denoised motion-blurred image f(m,n).
The denoising process using Algorithm 1 (taking the image cameraman as an example) is shown in Fig. 2. Fig. 2(a) is the motion-blurred image y(m,n) corrupted by white Gaussian noise with variance σ² = 400; Fig. 2(b) is the smoothed image s(m,n) after 5×5 median low-pass filtering; Fig. 2(c) is the error image e(m,n) between the noisy blurred image y(m,n) and the low-pass-filtered smoothed image s(m,n); Fig. 2(d) is the interpolation image f_λ(m,n) generated by the variable-regularization-parameter RBFN method to approximate the error image e(m,n); Fig. 2(e) is the denoised motion-blurred image f(m,n) produced by superimposing the interpolation image f_λ(m,n) on the low-pass-filtered smoothed image s(m,n); Fig. 2(f) is the spectrum of f(m,n) (the spectrum after denoising clearly exhibits the direction and length characteristics of the motion-blurred image).
The denoising method proposed in this embodiment also applies to color images; the denoising process and principle are the same as in Fig. 2. Fig. 3 shows the example of Lena.
The automatic identification of the motion-blur direction uses bicubic C-spline interpolation, i.e. cubic C-spline interpolation is performed along rows and columns at the coordinates (i, j). Under the condition h_k = x_{k+1} − x_k = 1 and with the boundary conditions s″(x_0) = m_0 = 0 and s″(x_N) = m_N = 0, the interpolation coefficients obtained from the definition of the C-spline are
$$s_{k,0}=g_{k};\quad s_{k,1}=d_{k}-\tfrac{1}{3}m_{k}-\tfrac{1}{6}m_{k+1};\quad s_{k,2}=\tfrac{1}{2}m_{k};\quad s_{k,3}=\tfrac{1}{6}m_{k+1}-\tfrac{1}{6}m_{k}\qquad(11)$$
and the interpolating function is
$$s_{k}(x)=\big((s_{k,3}v+s_{k,2})v+s_{k,1}\big)v+s_{k,0}\qquad(12)$$
where v = x − x_k and x_k ≤ x ≤ x_{k+1}.
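For reference, the same natural boundary conditions s″(x_0) = s″(x_N) = 0 with unit knot spacing can be reproduced with an off-the-shelf cubic spline; the sample values below are illustrative only.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Natural cubic (C) spline with unit knot spacing, matching the boundary
# conditions s''(x_0) = s''(x_N) = 0 assumed in eqs. (11)-(12).
g = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])   # illustrative sample values
x = np.arange(len(g), dtype=float)                        # h_k = x_{k+1} - x_k = 1
spline = CubicSpline(x, g, bc_type='natural')
print(spline(2.4))        # value interpolated between the knots x_2 and x_3
print(spline.c[:, 2])     # cubic-to-constant coefficients of segment k = 2 (cf. eq. (12))
```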
During the interpolation, the gray values of the directional derivative image are adjusted by weighted averaging (described in Algorithm 2): because the great majority of the content of a motion-blurred image consists of low-frequency components, using the frequency p(x) with which a gray level x occurs as its coefficient makes the pixels of the low gray-level region the dominant factor in the interpolated derivative image. The specific implementation of the automatic direction identification is given by Algorithm 2:
Algorithm 2. Automatic identification algorithm for the motion-blur direction.
Take values of α over α ∈ [−90°, 90°] with a fixed step size (e.g. 1°), and run Step 1 and Step 2 for each α:
Step 1. Obtain the directional derivative image Δf(i,j)_α = f(i′,j′) − f(i,j) with the bicubic C-spline interpolation method, where α is the direction angle of the directional differentiation, the value of f(i′,j′) is obtained by interpolation of the blurred image f(i,j), i′ = i + Δr·sin α, j′ = j + Δr·cos α, and Δr is the differentiation length;
Step 2. Form the weighted sum of the absolute gray values of the directional derivative image Δf(i,j)_α,
$$I(\Delta f)_{\alpha}=\sum_{i=0}^{N-1}\sum_{j=0}^{M-1}\big|\Delta f(i,j)_{\alpha}\big|\cdot p(\Delta f),$$
where the frequency p(Δf) with which the gray level Δf occurs is used as the weighting coefficient;
Step 3. Find the minimum min(I(Δf)_α); the corresponding α is the angle between the motion-blur direction and the horizontal axis, i.e.
$$\varphi=\arg\min_{\alpha} I(\Delta f)_{\alpha}.$$
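A compact sketch of Algorithm 2 follows. Bilinear interpolation (order=1 in map_coordinates) is used as a stand-in for the bicubic C-spline interpolation, and the 256-bin weighting histogram and the default Δr = 1 are choices of the sketch.

```python
import numpy as np
from scipy import ndimage

def estimate_blur_direction(f, dr=1.0, step_deg=1.0):
    """Directional-derivative energy search over alpha in [-90, 90] degrees."""
    f = f.astype(float)
    rows, cols = np.mgrid[0:f.shape[0], 0:f.shape[1]]
    best_alpha, best_energy = 0.0, np.inf
    for alpha in np.arange(-90.0, 90.0 + step_deg, step_deg):
        a = np.deg2rad(alpha)
        # displaced coordinates (i', j') = (i + dr*sin(a), j + dr*cos(a))
        shifted = ndimage.map_coordinates(
            f, [rows + dr * np.sin(a), cols + dr * np.cos(a)], order=1, mode='nearest')
        diff = np.abs(shifted - f)                       # |Delta f(i, j)_alpha|
        hist, edges = np.histogram(diff, bins=256)       # p(Delta f): gray-level frequencies
        p = hist / diff.size
        weights = p[np.clip(np.digitize(diff, edges[:-1]) - 1, 0, 255)]
        energy = float(np.sum(diff * weights))           # I(Delta f)_alpha, eq. (6)
        if energy < best_energy:
            best_energy, best_alpha = energy, alpha
    return best_alpha     # angle between the blur direction and the horizontal axis
```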
After the automatic identification of the direction is completed, the motion-blurred image can be rotated back to the horizontal direction so that its motion-blur length can be identified automatically. The basic principle of automatic length identification is that the spectrum of a motion-blurred image exhibits dark bands at specific intervals in the frequency domain; in particular, for a uniformly moving image after denoising, the motion-blur length can be estimated accurately from the positions of the equally spaced dark bands. The specific implementation is given by Algorithm 3:
Algorithm 3. Automatic identification algorithm for the motion-blur length.
Step 1. Let F(u,v) be the image after Fourier transformation of the motion-blurred image; compute log(|F(u,v)|) and shift the coordinates so that u = 0 lies at the center of the spectrogram;
Step 2. Compute
$$S(u)=\sum_{v=0}^{mid}\log\big(|F(u,v)|\big),$$
then, starting from the center, search to the left for the k-th (k > 1) minimum point and record its u value u_Lk, and, starting from the center, search to the right for the k-th minimum point and record its u value u_Rk;
Step 3. Compute the blur length of the motion-blurred image f(x,y) at any angle as L ≈ Round(2kN/|u_Lk − u_Rk|).
This method sums the projection along u and uses the k-th (k > 1) minimum points, so it exploits the statistics of whole columns, reduces the interference of randomness in the original image, and largely avoids computation errors. The minimum points correspond to the dark-band positions, i.e. to u = Round(kN/a), where a denotes the blur extent; for comparison, the blur length can also be computed automatically from minima located by searching the two adjacent pixels on either side.
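A minimal sketch of Algorithm 3 is given below for a square N×N image that has already been rotated so the blur is horizontal; tie handling and boundary cases are simplified, and the small constant added inside the logarithm is a numerical safeguard of the sketch.

```python
import numpy as np

def estimate_blur_length(f_horizontal, k=2):
    """Blur length from the equally spaced dark bands of the spectrum, eq. (7)."""
    F = np.fft.fftshift(np.fft.fft2(f_horizontal))
    S = np.log(np.abs(F) + 1e-12).sum(axis=0)     # S(u): column sums of the log-spectrum
    N = S.size
    mid = N // 2                                  # u = 0 shifted to the center
    interior_min = (S[1:-1] < S[:-2]) & (S[1:-1] < S[2:])
    minima = np.flatnonzero(np.concatenate(([False], interior_min, [False])))
    u_Lk = minima[minima < mid][-k]               # k-th minimum left of the center
    u_Rk = minima[minima > mid][k - 1]            # k-th minimum right of the center
    return int(round(2 * k * N / abs(int(u_Lk) - int(u_Rk))))
```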
Restoring the two-dimensional blurred image (of width H_L and height V_L) with optimum-window Wiener filtering [25] effectively removes the restoration errors caused by differing mean intensities at the image boundary and thereby suppresses the ringing phenomenon. The image plane is divided into 9 regions; the correspondence between the optimum-window regions and the element values within each region is shown in Fig. 4.
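As a simplified illustration of the restoration step, a plain frequency-domain Wiener deconvolution with a uniform linear-motion PSF is sketched below. The 9-region optimum-window boundary treatment of Fig. 4 is omitted, and the constant noise-to-signal ratio nsr and the nearest-pixel rasterization of the PSF are assumptions of the sketch.

```python
import numpy as np

def motion_psf(length, angle_deg, shape):
    """Uniform linear-motion PSF of the identified length and direction."""
    psf = np.zeros(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    a = np.deg2rad(angle_deg)
    for t in np.linspace(-(length - 1) / 2.0, (length - 1) / 2.0, max(length, 1)):
        y, x = int(round(cy + t * np.sin(a))), int(round(cx + t * np.cos(a)))
        if 0 <= y < shape[0] and 0 <= x < shape[1]:
            psf[y, x] = 1.0
    return psf / psf.sum()

def wiener_restore(g, psf, nsr=0.01):
    """Frequency-domain Wiener deconvolution: W = H* / (|H|^2 + nsr)."""
    G = np.fft.fft2(g)
    H = np.fft.fft2(np.fft.ifftshift(psf))
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))
```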
To verify the automatic identification algorithms for the motion-blur direction and length described above and the ringing-suppressing image restoration with optimum-window Wiener filtering, this embodiment artificially blurs an originally sharp image and adds noise to it, and compares the identification accuracy and restoration quality of the conventional restoration method without denoising with those of the RBFN denoising restoration method proposed in this embodiment, as shown in Fig. 5.
Using the image restoration algorithm based on the automatic identification of the motion-blur direction and length proposed in this embodiment, the denoised motion-blurred images of Fig. 2(e) and Fig. 3(e) are restored; the experimental results are shown in Fig. 6.
Under a PC Windows development platform and MATLAB 7.0, the restoration model for noisy motion-blurred images based on the radial basis function neural network proposed in this embodiment was tested on the cameraman image; the results are shown in Fig. 8.
Fig. 8(a) compares the automatic identification of the direction with the blur length fixed at 30 pixels, and Fig. 8(b) compares the automatic identification of the length with the blur direction fixed at 45°. It can be seen from the figure that the identification accuracy after denoising with the variable-regularization-parameter RBFN interpolation method proposed in this embodiment is clearly higher than the PSF-parameter identification accuracy of the conventional restoration method without denoising and of the method with ordinary filtering.
A comparative analysis of Fig. 8 also shows that automatic identification without denoising can make the parameter identification very unstable. The direction estimates exhibit a step-like behavior: within certain intervals the estimates all tend to the same direction value, and as the noise level increases this interval gradually extends over the whole range, with almost all identified directions tending to 45°, so that the identification results become completely unreliable. In the length-identification comparison, the estimates obtained without denoising deviate from the reference value by a large amount and fluctuate strongly. For identification after ordinary filtering, part of the high-frequency information is lost and important edge information is missing; although the error oscillation is reduced compared with no denoising, the error remains large and the identification accuracy is low, far less accurate than the estimates obtained with the method of this embodiment.
Embodiment 2
With reference to Fig. 1 to Fig. 8: in this embodiment it is found that, when the local motion-blur object is extracted, the newly generated image containing only the range of the blurred object clearly exhibits the characteristics of an ordinary motion-blurred image and can be restored effectively by the automatic-identification image restoration method proposed in Embodiment 1.
For the restoration of a single image with local motion blur, the key problem is to extract the motion-blur object when the single frame lacks reference information from a related frame sequence. Taking a single-frame video image from a multi-lane road-condition monitoring system as an example, as shown in Fig. 7, the problem can be simplified with prior knowledge: the objects to be extracted in this application are mainly the cars moving at high speed in each lane, whose shape is close to an oblique rectangle; on this basis, the motion-blur objects can be extracted in the spatial domain. The detailed procedure is given by Algorithm 4:
Algorithm 4. Local motion-blur object extraction algorithm.
Step 1. Segment the image with a suitable gray threshold;
Step 2. Generate a suitable rectangular matching template in proportion to the size of the image;
Step 3. Apply a mathematical-morphology closing operation to the segmented image with the generated rectangular template to extract the larger rectangle-like cars, then apply a morphological opening operation to delete the smaller bodies from the image.
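A minimal sketch of Algorithm 4 follows. The gray threshold and the rectangular template size are scene-dependent quantities that the patent leaves to be chosen in proportion to the image, so the values below are placeholders.

```python
import numpy as np
from scipy import ndimage

def extract_blur_object_mask(y, gray_threshold=128, template_shape=(15, 40)):
    """Threshold segmentation followed by morphological closing and opening."""
    mask = y > gray_threshold                          # Step 1: gray-threshold segmentation
    template = np.ones(template_shape, dtype=bool)     # Step 2: rectangular matching template
    mask = ndimage.binary_closing(mask, structure=template)   # Step 3: closing keeps large rectangles
    mask = ndimage.binary_opening(mask, structure=template)   #         opening removes small bodies
    return mask
```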
Fig. 7 shows an experimental demonstration of motion-blur object extraction with Algorithm 4 and of the restoration process, using a single blurred frame recorded by an actual multi-lane road-condition monitoring system. Fig. 7(a) is the original single motion-blurred image, and Fig. 7(b) is its spectrum. The extraction of the blurred object with Algorithm 4 is shown in Fig. 7(c) and Fig. 7(d): Fig. 7(c) is the result of segmenting the image with a suitable gray threshold, and Fig. 7(d) shows the outlines of the bodies extracted from the segmented image by template matching and mathematical-morphology operations, with bodies whose pixel extent is too small removed; a new image containing only the range of the blurred object is then generated, as shown in Fig. 7(e), and Fig. 7(f) is the spectrum of Fig. 7(e). The result of restoring it with the automatic identification algorithms and restoration method of this embodiment (Algorithm 2 and Algorithm 3) is shown in Fig. 7(g).
The automatic-identification image restoration method and working principle of this embodiment are the same as in Embodiment 1.
Embodiment 3
With reference to Fig. 1 to Fig. 8, this embodiment adopts a different target-object extraction method. The algorithm of Embodiment 2 has a small computational load and is quick and convenient to apply, but it is mainly suitable for single images in which the gray levels of the target object and the background differ markedly. For single images in which the gray values are close, segmentation by gray-threshold clipping may cause a large region error, so the extraction of the target object from such blurred images is realized with Algorithm 5:
Algorithm 5. Local motion-blur object extraction algorithm.
Step 1. Perform edge detection on the motion-blurred image using the Prewitt operator and the Canny operator combined with a logical AND operation (high accuracy, able to detect weak edges);
Step 2. Detect, with the Radon transform, all line segments in the binary edge map whose length exceeds L_min, and save their start points, end points and angles with respect to the horizontal direction (this overcomes the drawbacks of the traditional Hough transform, which is strongly disturbed by intermediate points and slow);
Step 3. Classify these line segments by angle, and match in pairs those segments whose angles are close (absolute difference smaller than θ_min) and whose distance apart exceeds L_min;
Step 4. For each pair of matched segments, check with the eight-neighborhood connectivity method whether their start points and end points are connected by line segments (an interruption error of a few pixels is allowed); if so, segment out the rectangular region enclosed by the four vertices of the two segments and save it as a new image containing the local motion-blur object; otherwise, continue the detection until no such matched segments remain.
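A partial sketch of Steps 1 and 2 of Algorithm 5 follows. The logical AND of the Prewitt and Canny edge maps matches the text above, while the probabilistic Hough transform is used here purely as a convenient stand-in for the patent's Radon-based detection of segments longer than L_min; the Prewitt threshold is an assumed value.

```python
import numpy as np
from skimage import feature, filters, transform

def detect_long_segments(y, l_min=50, prewitt_thresh=0.1):
    """Combined Prewitt/Canny edge map and detection of long line segments."""
    y = y.astype(float) / max(float(y.max()), 1e-12)                    # normalize to [0, 1]
    edges = feature.canny(y) & (filters.prewitt(y) > prewitt_thresh)    # Step 1: logical AND
    segments = transform.probabilistic_hough_line(edges, line_length=l_min, line_gap=3)
    result = []
    for (x0, y0), (x1, y1) in segments:                                 # Step 2: endpoints and angle
        angle = float(np.degrees(np.arctan2(y1 - y0, x1 - x0)))
        result.append(((x0, y0), (x1, y1), angle))
    return result
```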
For multi-target object extraction from a sequence-frame video stream with a complex background, a statistical algorithm combined with feature matching can be used to locate the target objects effectively and remove most of the background. The basic principle is as follows: if the camera position remains unchanged, the photographed background image is essentially constant over a very short time interval. For an image sequence, assume that the pixel process extracted at a given image position along the time axis satisfies a particular Gaussian distribution; the statistical information of the image sequence can then be obtained by estimating the mean and variance of the distribution of every pixel process, and thus the background can be estimated. On the basis of the estimated background image, feature matching between the background image and the target object to be identified is performed; matched regions are treated as background and removed from the object to be identified, and the foreground image is finally obtained. Because the feature matching is region-based, the resulting foreground image is more accurate than a foreground image obtained directly by differencing. Finally, part of the wrongly matched regions is removed according to connected regions, so that the target object to be identified can be segmented out and passed to the subsequent processing.
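A minimal per-pixel Gaussian background model in the spirit of the paragraph above is sketched below; the deviation threshold k and the simple mean/variance estimate over a fixed set of frames are assumptions of the sketch, not the patent's exact statistical procedure.

```python
import numpy as np

def gaussian_background_model(frames, k=2.5):
    """frames: (T, H, W) stack from a fixed camera; returns background stats and a mask function."""
    frames = frames.astype(float)
    mean = frames.mean(axis=0)                 # per-pixel distribution mean
    std = frames.std(axis=0) + 1e-6            # per-pixel distribution spread
    def foreground_mask(frame):
        # pixels deviating from the background by more than k standard deviations
        return np.abs(frame.astype(float) - mean) > k * std
    return mean, std, foreground_mask
```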
The other steps and the working principle of this embodiment are the same as in Embodiment 2.

Claims (8)

1. A restoration method for noisy motion-blurred images based on a radial basis function neural network, the restoration method comprising the following steps:
(1) define the noisy motion-blurred image as y(m,n); first apply a two-dimensional mean filter to the noisy motion-blurred image y(m,n) to generate the corresponding low-pass-filtered smoothed image:
$$s(m,n)=\frac{1}{(2K+1)^{2}}\sum_{i=-K}^{K}\sum_{j=-K}^{K}y(m+i,\,n+j)\qquad(1)$$
in formula (1), 2K+1 is the size of the filter window; the window size is kept odd in both directions so that the image is not shifted; i denotes the i-th row and j the j-th column;
(2) compute the error image e(m,n) = y(m,n) − s(m,n) between the noisy motion-blurred image y(m,n) and the low-pass-filtered smoothed image s(m,n);
(3) perform edge detection with the Canny operator to estimate the gradient f′(m,n) of y(m,n);
(4) compute the regularization parameter λ(m,n) of each pixel according to the magnitude of f′(m,n):
$$\lambda(m,n)=A\,e^{-a\,|f'(m,n)|}\qquad(2)$$
in formula (2), A and a are constants, A = qσ², where q is a proportionality constant; denoting the minimum of λ(m,n) by λ_min and the maximum of |f′(m,n)| by f′_max, a = log(qσ²/λ_min)/f′_max;
then, with reference to the error image e(m,n), generate the interpolation image f_λ(m,n) with the radial basis function neural network RBFN; for each pixel with parameter λ(m,n), f_λ(m,n) is computed as
$$f_{\lambda(m,n)}(m,n)=\big[(V\otimes V)(\Lambda\otimes\Lambda)\big]_{(m-1)N+n}\cdot\big(\Lambda\otimes\Lambda+\lambda(m,n)\,I\big)^{-1}\cdot\big(V^{T}\otimes V^{T}\big)\,y\qquad(3)$$
in formula (3), [(V⊗V)(Λ⊗Λ)]_{(m−1)N+n} denotes the ((m−1)N+n)-th row of the matrix (V⊗V)(Λ⊗Λ);
(5) superimpose the interpolation image f_λ(m,n) on the low-pass-filtered smoothed image s(m,n) to obtain the denoised motion-blurred image f(m,n):
$$f(m,n)=f_{\lambda}(m,n)+s(m,n)\qquad(4);$$
(6) automatically identify the motion-blur direction and motion-blur length of the motion-blurred image f(m,n), obtain the two-dimensional blur function of width H_L and height V_L, and obtain the restored image with an image restoration algorithm.
2. The restoration method for noisy motion-blurred images based on a radial basis function neural network as claimed in claim 1, characterized in that: in step (1), a local motion-blur object is first extracted from the noisy motion-blurred image, and the low-pass filtering is then applied to the local motion-blur object y(m,n); for a single image in which the gray levels of the target object and the background differ markedly, the extraction steps are:
(1.1) segment the noisy moving image with a suitable gray threshold;
(1.2) generate a suitable rectangular matching template in a preset proportion to the size of the image;
(1.3) apply a mathematical-morphology closing operation to the segmented image with the generated rectangular template to extract the larger rectangle-like bodies, then apply a morphological opening operation to delete the smaller bodies from the image.
3. The restoration method for noisy motion-blurred images based on a radial basis function neural network as claimed in claim 1, characterized in that: in step (1), a local motion-blur object is first extracted from the noisy motion-blurred image, and the low-pass filtering is then applied to the local motion-blur object y(m,n); for a single image in which the gray levels of the target object and the background are close, the extraction steps are:
(1.1) perform edge detection on the noisy motion-blurred image using the Prewitt operator and the Canny operator combined with a logical AND operation;
(1.2) detect, with the Radon transform, all line segments in the binary edge map whose length exceeds L_min, and save their start points, end points and angles with respect to the horizontal direction;
(1.3) classify the line segments by angle, and match in pairs those segments whose angle difference is smaller in absolute value than θ_min and whose distance apart exceeds L_min;
(1.4) for each pair of matched segments, check with the eight-neighborhood connectivity method whether their start points and end points are connected by line segments; if so, segment out the rectangular region enclosed by the four vertices of the two segments and save it as a new image containing the local motion-blur object; otherwise, continue the detection until no such matched segments remain.
4. The restoration method for noisy motion-blurred images based on a radial basis function neural network as claimed in any one of claims 1 to 3, characterized in that: in step (4), a threshold R is set; when |f′(m,n)| is smaller than R, the regularization parameter of the pixel is fixed at the value of λ(m,n) for |f′(m,n)| = 0; when |f′(m,n)| is greater than R, λ(m,n) is still computed by formula (2).
5. The restoration method for noisy motion-blurred images based on a radial basis function neural network as claimed in claim 4, characterized in that: in step (6), values of α are taken over α ∈ [−90°, 90°] with a set step size, and the motion-blur direction is identified automatically by the following steps:
(6.1) with the bicubic C-spline interpolation method, perform cubic C-spline interpolation along rows and columns at coordinates (i, j) to obtain the directional derivative image of the image:
$$\Delta f(i,j)_{\alpha}=f(i',j')-f(i,j)\qquad(5)$$
in formula (5), α is the direction angle of the directional differentiation, and the value of f(i′, j′) is obtained by interpolation of the blurred image f(i, j), where i′ = i + Δr·sin α, j′ = j + Δr·cos α, and Δr is the differentiation length of the directional differentiation;
(6.2) form the weighted sum of the absolute gray values of the directional derivative image Δf(i,j)_α:
$$I(\Delta f)_{\alpha}=\sum_{i=0}^{N-1}\sum_{j=0}^{M-1}\big|\Delta f(i,j)_{\alpha}\big|\cdot p(\Delta f)\qquad(6)$$
in formula (6), the frequency p(Δf) with which the gray level Δf occurs is used as the weighting coefficient;
(6.3) find the minimum min(I(Δf)_α); the corresponding α is the angle between the motion-blur direction and the horizontal axis, i.e.
$$\varphi=\arg\min_{\alpha} I(\Delta f)_{\alpha}.$$
6. The restoration method for noisy motion-blurred images based on a radial basis function neural network as claimed in claim 5, characterized in that: in step (6), after the automatic identification of the motion-blur direction is completed, the motion-blurred image is rotated back to the horizontal direction, and the motion-blur length is identified automatically by the following steps:
(6.4) let F(u,v) be the image after Fourier transformation of the motion-blurred image; compute log(|F(u,v)|) and shift the coordinates so that u = 0 lies at the center of the spectrogram;
(6.5) compute
$$S(u)=\sum_{v=0}^{mid}\log\big(|F(u,v)|\big),$$
then, starting from the center, search to the left for the k-th (k > 1) minimum point and record its u value u_Lk, and, starting from the center, search to the right for the k-th minimum point and record its u value u_Rk;
(6.6) compute the blur length of the motion-blurred image f(x,y) at any angle as L ≈ Round(2kN/|u_Lk − u_Rk|) (7).
7. The restoration method for noisy motion-blurred images based on a radial basis function neural network as claimed in claim 6, characterized in that: in step (6), after the automatic identification of the motion-blur direction and length is completed, the two-dimensional blur function of width H_L and height V_L is computed, and image restoration is performed with optimum-window Wiener filtering according to the correspondence between the optimum-window regions and the element values within each region.
8. The restoration method for noisy motion-blurred images based on a radial basis function neural network as claimed in claim 7, characterized in that: the noisy motion-blurred image is a black-and-white image or a color image.
CNB2006100534659A — filed 2006-09-08 (priority 2006-09-08) — Restoration method for noisy motion-blurred images based on a radial basis function neural network — Expired - Fee Related — granted as CN100474337C


Publications (2)

CN101079149A (en) — published 2007-11-28
CN100474337C — granted 2009-04-01





Legal Events

C06 / PB01 — Publication
C10 / SE01 — Entry into substantive examination
C14 / GR01 — Grant of patent or utility model (granted publication date: 2009-04-01)
CF01 — Termination of patent right due to non-payment of annual fee (termination date: 2017-09-08)