CN107220945B - Restoration method of multiple degraded extremely blurred image - Google Patents
- Publication number
- CN107220945B (application CN201710234684.5A)
- Authority
- CN
- China
- Prior art keywords
- sub
- degradation
- degradation function
- function
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/77—Retouching; Inpainting; Scratch removal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
Abstract
The invention discloses a method for restoring a multiply-degraded extremely blurred image, addressing the problem that conventional criminal-investigation software cannot effectively restore such images. The method considers the multiple blur factors caused by optical diffraction, quantization, defocusing, and relative motion. Each sub-degradation function is initialized from actually measured physical parameters; the optimal distribution of each sub-degradation function is solved with a variational Bayes algorithm in an alternate-iteration manner; and the sub-degradation functions are synthesized into the system point spread function using the relationship between the sub-degradation functions and the system degradation function. Finally, the image is restored with the L-R algorithm.
Description
Technical Field
The invention belongs to the field of image restoration, relates to a restoration method of a multiple degraded extremely-blurred image, and is particularly suitable for restoration of an extremely-blurred criminal investigation image.
Background
With the gradual expansion of its application range, video monitoring has become an important means of collecting criminal evidence and extracting clues in case detection [1-4]. When an image is blurred, investigators process it with Photoshop, police video-analysis platforms, Video investor, "doctor shadow", "VCS", vReveal Premium, and "Sobel" software [1]. These tools are generally effective for images blurred by a single degradation cause, but it is difficult for them to achieve an ideal result on extremely blurred images arising from multiple degradations.
Early methods typically assumed PSFs to have a simple parametric form. In practice, PSFs are far more complex and are poorly captured by simple parametric models. Dilip G. Warrier and Uday B. Desai treated the PSF as a matrix (the blur matrix) whose elements follow a random model, and obtained a closed-form expression for the mean blur element using a mean-field approximation. PSF estimation algorithms based on non-functional models then developed rapidly: early estimation methods based on the image complex cepstrum, methods based on two-dimensional ARMA models, multi-frame Wiener filtering of image sequences, and, more recently, methods based on singular value decomposition.
Recently, many good algorithms have estimated the PSF from the statistical characteristics of natural images. Fergus et al. modeled the gray-level gradients of natural images with a Gaussian mixture, estimated the PSF with variational Bayes, and then deconvolved with an L-R algorithm; this approach works well for small PSFs. Later, Krishnan and Fergus adopted a hyper-Laplacian prior for the heavy-tailed gradient distribution and applied an alternating-iteration scheme accelerated by lookup tables, but the restored images still show obvious staircase artifacts. Levin et al. generalized these methods and proposed optimizing an edge likelihood based on the maximum a posteriori probability, which proves more robust than the usual posterior. Although these methods work well, their computational cost is too high: because the latent image must be marginalized, optimizing the energy function usually requires rather complex iterative numerical algorithms, such as alternating PSF estimation and image restoration. Moreover, if the initial blur kernel is not well matched in shape or size, the iteration generally does not converge to the true global minimum.
Disclosure of Invention
The proposed method for restoring a multiply-degraded extremely blurred image takes measured scene values as initial values and accurately estimates each sub-degradation function with variational Bayes theory, so that the PSF of the imaging system is estimated accurately and extremely blurred criminal-investigation images can be restored. Because the multiple degradation factors are treated carefully, the image restoration quality is good.
A restoration method of a multiply-degraded extremely blurred image comprises: S1, acquiring data;
S01, acquiring external parameters;
S02, obtaining the blurred-image matrix g(x, y) and solving its gradient cepstrum C_∇g;
S2, establishing a system degradation function h (x, y) and each sub degradation function;
S3, initializing each sub-degradation function and solving its initial gradient cepstrum;
S4, taking the initial gradient cepstra of the sub-degradation functions and the gradient cepstrum of the blurred image as the input of variational Bayes, and solving the optimal distribution of each sub-degradation function;
S5, solving the system point spread function h(x, y) using the relationship between the sub-degradation functions and the system point spread function;
S6, taking h(x, y) and g(x, y) as the input of the L-R algorithm to obtain the sharp-image matrix f(x, y).
Further, the acquiring of the external parameter in S01 specifically includes:
S01, acquiring external parameters: obtaining the working parameters of the camera, namely the focal length F, the lens diameter D, the aperture coefficient F, the CCD imaging pixel sizes w_x, w_y, the object distance z, the image distance v, the pixel counts K, P, the length l of each pixel of the photosensitive element, the distance d moved by the object within the exposure time τ, and the movement direction θ of the target.
Further, it is defined that the respective degradation functions of S2 specifically include:
the sub-degradation function of the airy disk mode caused by optical diffraction is h1(x, y);
the sub-degradation function of the image sensor at quantization is h2(x, y);
the sub-degradation function of camera defocus is h3(x, y);
the sub-degradation function of the camera moving linearly at a constant speed relative to the target is h4(x, y).
Further defining, the sub-degradation function h1(x, y) of the airy disk mode caused by optical diffraction is specifically:
the sub-degradation function of the image sensor during quantization is h2(x, y) specifically:
the sub-degradation function of camera defocus is h3(x, y) specifically:
the sub-degradation function of the camera and the target which move linearly relatively at a constant speed is h4(x, y), which is specifically as follows:
further, the S3 specifically includes:
substituting the parameter values obtained in the step S01 into the corresponding sub-degradation functions in the step S2 to initialize each sub-degradation function, so as to obtain initial values h10, h20, h30 and h40 of each sub-degradation function;
solving the initial gradient cepstra C_h10, C_h20, C_h30, C_h40 of the sub-degradation functions, and setting the gradient cepstrum of the system degradation function h(x, y) as C_h.
Further, the S4 specifically includes:
taking the initial gradient cepstra C_h10, C_h20, C_h30, C_h40 of the sub-degradation functions and C_∇g as the input of variational Bayes to obtain C_h1, C_h2, C_h3, C_h4;
They satisfy the linear relationship:
C_h = k1·C_h1 + k2·C_h2 + k3·C_h3 + k4·C_h4, where k1, k2, k3, k4 are constants and at least one of them is not 0.
Further defined, the typical value of each parameter k1, k2, k3, k4 is 1; for a given imaging-system degradation, the k values vary around 1, with the specific values determined by experiment. The method is not limited to four degradation factors; if there are more than four, the number of k parameters is increased accordingly.
The invention has the beneficial effects that: by considering multiple degradation factors of image blur and accurately estimating each sub degradation function through the variational Bayes theory, the PSF of the imaging system is accurately estimated, and the restoration of the extremely blurred image is realized.
Detailed Description
Description of the drawings:
FIG. 1 is a schematic flow chart of the present invention.
Fig. 2 is an imaging optical path diagram of the camera.
Fig. 3 is a photographic picture of the camera during clear photographing.
Fig. 4 is a frame diagram of a blurred video of the camera.
FIG. 5 is a clear picture after processing by the method of the present invention.
As shown in fig. 1, a method for restoring an extremely blurred image with multiple degradation factors includes the following steps:
s1, acquiring data;
S01, acquiring external parameters: obtaining the working parameters of the camera, namely the focal length F, the lens diameter D, the aperture coefficient F, the CCD imaging pixel sizes ω_x, ω_y, the object distance z, the image distance v, the pixel counts K, P, the length l of each pixel of the photosensitive element, the distance d moved by the object within the exposure time τ, and the movement direction θ of the target;
s02, acquiring a blurred image matrix g (x, y);
s2, establishing a degradation function,
the system degradation function is h (x, y);
the sub-degradation function of the airy disk mode caused by optical diffraction is h1(x, y);
the sub-degradation function of the image sensor at quantization is h2(x, y);
the sub-degradation function of camera defocus is h3(x, y);
the sub-degradation function of the camera and the target which move linearly relatively at a constant speed is h4(x, y);
the sub-degradation function h1(x, y) of the airy disk mode caused by optical diffraction is specifically:
the sub-degradation function of the image sensor during quantization is h2(x, y) specifically:
the sub-degradation function of camera defocus is h3(x, y) specifically:
the sub-degradation function of the camera and the target which move linearly relatively at a constant speed is h4(x, y), which is specifically as follows:
s3, substituting the parameter values obtained in the S01 into the corresponding sub-degeneration functions in the S2 to initialize each sub-degeneration function, and obtaining initial values h10, h20, h30 and h40 of each sub-degeneration function;
solving the initial gradient cepstra C_h10, C_h20, C_h30, C_h40 of the sub-degradation functions; the gradient cepstrum of the system degradation function h(x, y) is C_h;
taking the initial gradient cepstra C_h10, C_h20, C_h30, C_h40 of the sub-degradation functions and C_∇g as the input of variational Bayes to obtain C_h1, C_h2, C_h3, C_h4;
They satisfy the linear relationship:
C_h = k1·C_h1 + k2·C_h2 + k3·C_h3 + k4·C_h4, where k1, k2, k3, k4 are constants, at least two of them are not 0, and their values are obtained by experiment.
And S4, taking h (x, y) and g (x, y) as input of the L-R algorithm, and obtaining a clear image matrix f (x, y).
The specific principle is as follows:
the degraded image g(x, y) of the sharp image f(x, y) can be expressed as:
g(x, y) = f(x, y) * h(x, y) + υ(x, y) (1)
where * denotes convolution, h(x, y) is the system point spread function (PSF), and υ(x, y) is the system noise. The gradient of a natural image f(x, y) follows a heavy-tailed distribution, represented by a Gaussian mixture:
where C is the number of Gaussian components, π_c is the weight of the c-th component, and v_c is the variance of the c-th component. To simplify later computation, the image is processed in the gradient-cepstrum domain; the cepstrum of the blurred image g(x, y) is defined as:
C_g(p, q) = FFT^(-1){ log[1 + |G(u, v)|] } (3)
Taking gradients of equation (3) in the x and y directions yields the gradient cepstrum of the blurred image.
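Equation (3) and its x/y gradients can be sketched in a few lines of NumPy; G(u, v) is the 2-D FFT of g, and np.log1p computes log(1 + ·). The use of np.gradient as the difference operator is an assumption, since the text does not specify how the gradients are taken:

```python
import numpy as np

def gradient_cepstrum(g):
    """Gradient cepstrum of a blurred image g, per Eq. (3).

    C_g = FFT^{-1}{ log[1 + |G(u, v)|] }, then gradients along x and y.
    """
    G = np.fft.fft2(g)
    C = np.real(np.fft.ifft2(np.log1p(np.abs(G))))  # Eq. (3)
    Cy, Cx = np.gradient(C)   # finite-difference gradients (assumed operator)
    return Cx, Cy

# toy example on a small random image
rng = np.random.default_rng(0)
g = rng.random((16, 16))
Cx, Cy = gradient_cepstrum(g)
```

The real part is taken after the inverse FFT because the log-magnitude spectrum of a real image is symmetric, so its inverse transform is real up to numerical noise.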
The system accounts for image degradation caused by optical diffraction, quantization, defocusing, relative motion, and similar factors, and constructs a model of the point spread function under multiple degradation factors. Denoting the sub-degradation functions of the imaging process as h1, h2, h3, h4, the degraded image g(x, y) under the combined action of the multiple degradations and the noise υ(x, y) is:
here, υ(x, y) is zero-mean white Gaussian noise, and h1(x, y) represents the sub-degradation function of the Airy-disk mode caused by optical diffraction:
where J1(·) is the first-order Bessel function; r_0 is the distance from the centre of the Airy disk to the main lobe; and b = πF/F (the first F is the lens focal length, the second is the aperture factor).
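A hedged sketch of the Airy-disk kernel h1, assuming the standard form h1(r) ∝ [2·J1(b·r)/(b·r)]², with the constant b (which the patent derives from the focal length and aperture factor) left as a free parameter; scipy.special.j1 is the first-order Bessel function:

```python
import numpy as np
from scipy.special import j1  # first-order Bessel function J1

def airy_psf(size, b):
    """Sketch of the Airy-disk sub-degradation h1(x, y).

    h1(r) ∝ [2·J1(b·r) / (b·r)]^2; the value of b depends on the lens
    focal length and aperture coefficient (exact form per the patent).
    """
    ax = np.arange(size) - size // 2
    X, Y = np.meshgrid(ax, ax)
    r = np.hypot(X, Y)
    r[size // 2, size // 2] = 1.0   # placeholder to avoid 0/0; centre fixed below
    h = (2.0 * j1(b * r) / (b * r)) ** 2
    h[size // 2, size // 2] = 1.0   # limit of [2·J1(x)/x]^2 as x -> 0
    return h / h.sum()              # normalise to unit energy

h1 = airy_psf(15, 0.8)
```

Normalising the kernel to unit sum keeps the overall image brightness unchanged when it is used in a convolution model.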
h2(x, y) represents the sub-degradation function of the rectangular sensing cell when quantized:
where ω_x and ω_y are the length and width of the rectangular pixel, respectively.
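A minimal sketch of the rectangular-cell quantization kernel h2, under the assumption (not stated explicitly in the text) that the cell responds uniformly over its ω_x × ω_y area, expressed here in whole pixels:

```python
import numpy as np

def box_psf(wx, wy):
    """Uniform kernel over a wx-by-wy rectangular sensing cell (h2 sketch).

    wx, wy are the cell's length and width in pixels; the cell is
    assumed to integrate light uniformly over its area.
    """
    h = np.ones((wy, wx), dtype=float)
    return h / h.sum()   # unit energy

h2 = box_psf(3, 2)
```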
h3(x, y) represents the sub-degradation function at defocus:
where R is the defocus spot radius, expressed as:
R=(1/f-1/v-1/z)Dv/2 (8)
where f is the lens focal length, z is the object distance, D is the diameter of the convex lens, and v is the image distance.
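Equation (8) and a uniform-disk defocus kernel can be sketched as follows; the uniform-disk form of h3 is an assumption (the patent's exact h3 formula is given only as an unrendered image), but Eq. (8) for the radius is taken directly from the text:

```python
import numpy as np

def defocus_radius(f, v, z, D):
    """Defocus spot radius per Eq. (8): R = (1/f - 1/v - 1/z) * D * v / 2."""
    return (1.0 / f - 1.0 / v - 1.0 / z) * D * v / 2.0

def defocus_psf(R, size):
    """Uniform disk of radius R (in pixels): a common defocus kernel h3.
    Assumed form; the patent's exact h3 is not reproduced here."""
    ax = np.arange(size) - size // 2
    X, Y = np.meshgrid(ax, ax)
    h = ((X ** 2 + Y ** 2) <= R ** 2).astype(float)
    return h / h.sum()

# when the thin-lens condition 1/f = 1/v + 1/z holds, R = 0 (in focus)
R0 = defocus_radius(f=2.0, v=4.0, z=4.0, D=1.0)
h3 = defocus_psf(2.0, 7)
```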
h4(x, y) represents a sub-degradation function when the imaging system moves linearly relative to the target at a constant speed:
where θ is the blur angle and L is the length of the degradation function. θ is estimated by analysing the target's motion across video frames, and L can be measured on site using the imaging proportional relation.
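A sketch of the uniform-linear-motion kernel h4 for a given length L and angle θ. Rasterising the motion path with nearest-pixel rounding is an assumption for brevity; a production kernel would use sub-pixel interpolation:

```python
import numpy as np

def motion_psf(L, theta_deg, size):
    """Line kernel of length L at angle theta: uniform linear-motion h4 sketch."""
    h = np.zeros((size, size))
    c = size // 2
    t = np.deg2rad(theta_deg)
    n = max(int(round(L)), 1)
    for i in range(n):
        s = i - (n - 1) / 2.0            # signed offset along the motion path
        x = int(round(c + s * np.cos(t)))
        y = int(round(c + s * np.sin(t)))
        h[y, x] = 1.0
    return h / h.sum()                   # unit energy

h4 = motion_psf(L=5, theta_deg=0.0, size=9)
```

For θ = 0 this produces a horizontal line of L equal weights through the kernel centre.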
In a single-frame image, the motion-blur length is the distance the vehicle moves during the exposure, and the motion time is the exposure time. The corresponding length of the motion blur on the photosensitive element is calculated from the imaging proportional relation, whose principle is as follows:
As shown in the imaging optical path diagram of Fig. 2, d is the distance the vehicle moves while being photographed, i.e. the motion-blur length; α is its angle with the focal plane; k is the length on the photosensitive element corresponding to d; p is the distance between the motion-blur edge and the normal on the element; f is the focal length and z the object distance. From similar triangles we obtain:
Since k and p are lengths but are actually obtained by counting pixels, let K and P be the pixel counts and l the length of each pixel of the photosensitive element, so that k = Kl and p = Pl.
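A simplified illustration of the imaging-proportion idea, under the assumption that the angle α with the focal plane is ignored (the patent's Fig. 2 derivation keeps it); with a pinhole proportion, the image-side blur satisfies k ≈ d·f/z, and dividing by the pixel length l gives the blur length in pixels via k = K·l. The motion distance d = 1 m below is an assumed value; f, z, and l are taken from the experiment section:

```python
def motion_blur_pixels(d, f, z, l):
    """Simplified pinhole-proportion estimate of the blur length in pixels.

    Assumes the angle alpha with the focal plane is negligible, so the
    image-side blur is k = d * f / z (metres); K = k / l converts to pixels.
    """
    k = d * f / z    # blur length on the sensor, metres
    return k / l     # pixel count K, since k = K * l

# assumed d = 1 m; f = 6 mm, z = 108 m, l = 6.4 um from the experiment section
K = motion_blur_pixels(d=1.0, f=0.006, z=108.0, l=6.4e-6)
```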
Denote the cepstra of h, h1, h2, h3, h4 by C_h, C_h1, C_h2, C_h3, C_h4; then the following relationship holds:
C_h = k1·C_h1 + k2·C_h2 + k3·C_h3 + k4·C_h4 (11)
where k1, k2, k3, k4 are constants. For blurred images caused by four degradation factors, the typical value of each k parameter is 1; for a given imaging-system degradation the k values vary around 1, with the specific values determined by experiment. The method is not limited to four degradation factors: if there are more than four, the number of k parameters is increased accordingly and the estimation method remains effective.
As equation (1) shows, recovering a sharp image when the system point spread function is unknown is a blind-deconvolution problem. Image restoration therefore proceeds in two steps: first, estimate the system point spread function; second, restore the image with the L-R algorithm. Variational Bayes theory is used to estimate the blur kernel with higher precision.
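The second step, L-R (Richardson-Lucy) deconvolution, can be sketched directly from its multiplicative-update form; this is a minimal implementation assuming nonnegative inputs, with scipy.signal.fftconvolve used for the convolutions:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(g, psf, num_iter=30, eps=1e-12):
    """Plain Richardson-Lucy (L-R) deconvolution.

    g: blurred image; psf: estimated system PSF (both nonnegative).
    Each iteration multiplies the estimate by the back-projected
    ratio of the data to the current re-blurred estimate.
    """
    f = np.full(g.shape, 0.5)            # flat initial estimate
    psf_mirror = psf[::-1, ::-1]         # adjoint (flipped) kernel
    for _ in range(num_iter):
        denom = fftconvolve(f, psf, mode="same")
        ratio = g / (denom + eps)        # data-fit ratio
        f = f * fftconvolve(ratio, psf_mirror, mode="same")
    return f

# sanity check: a delta PSF should leave the image essentially unchanged
delta = np.zeros((3, 3)); delta[1, 1] = 1.0
rng = np.random.default_rng(1)
g = rng.random((12, 12)) + 0.1
f_hat = richardson_lucy(g, delta, num_iter=5)
```

scikit-image also ships a richardson_lucy routine; the explicit loop is shown here to make the update rule visible.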
Interpretation of point spread function of variational Bayesian estimation system:
given a blurred image g (x, y), the PSF (system point spread function) and sharp image f (x, y) are estimated by finding the maximum a posteriori probability given the prior information of f (x, y).
For equation (4), it is cumbersome to directly calculate the posterior distribution p (Θ | g) assuming the hidden variables Θ ═ f, h1, h2, h3, h4, and VB approximates the posterior distribution p (Θ | g) with a simpler distribution q (Θ | g) whose KL divergence is:
q (Θ | g) is solved by minimizing equation (12). Since the integral variables are Θ and p (g) is constant, equation (12) can be expressed as:
Defining C_KL as a cost function, it can be expressed as:
assuming that q (Θ) and q (g) are independent of each other, q (Θ, g) is q (Θ) q (g), that is, formula (14) can be rewritten as:
by substituting Θ into equation (15), we can obtain:
assuming that the sub-degenerate functions are independent of each other, then
q(f, h1, h2, h3, h4) = q(f)q(h1)q(h2)q(h3)q(h4) (17)
Bringing (17) into (16) yields:
further derivation of equation (18) yields:
C_KL = ∫ q(f)q(h1)q(h2)q(h3)q(h4) [ln q(f) + ln q(h1) + ln q(h2) + ln q(h3) + ln q(h4) − ln p(f) − ln p(h1) − ln p(h2) − ln p(h3) − ln p(h4) − ln p(g|f, h1, h2, h3, h4)] dΘ (19)
when the cost function (19) is minimized, the solution is carried out by adopting an alternate iteration method, and when a variable is solved, the rest variables are assumed to be constant.
To solve for q(h1), assume q(f), q(h2), q(h3), and q(h4) are known constants. Assuming the noise in the degradation model is white Gaussian noise with intensity σ², we have:
where N () represents a gaussian distribution, we can obtain:
equation (19) then integrates only h1 and the rewritable cost function is as follows:
adding Lagrange to (22) according to the constraint condition ^ h1dh1 ═ 1Lambd multiplierh1And (5) obtaining an extremum of the cost function to obtain q (h 1):
It can be seen that q(h1) is a function of f, h2, h3, h4; alternate updates are therefore needed in the iteration. Similarly we obtain:
Therefore, with the sub-degradation-function values obtained by field test as initial values, the optimal distributions of the sub-degradation functions are obtained from equations (23)-(26) by alternate iteration, and the accurate value of each sub-degradation function is then obtained by computing its mathematical expectation.
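The alternate-iteration pattern above can be illustrated with a toy stand-in: each sub-component C_hi is updated in turn, the others held fixed, to reduce the mismatch with the observed mixture. This least-squares loop only mirrors the structure of the VB updates in Eqs. (23)-(26), not their probabilistic content, and the individual components are not identifiable without the real priors (only the mixture is matched):

```python
import numpy as np

def alternating_refine(C_obs, C_init, k, num_iter=25, step=0.5):
    """Toy alternating-update loop: in each sweep, every C_hi is
    nudged toward reducing || C_obs - sum_i k_i * C_hi ||, with the
    other components held fixed (structural stand-in for VB)."""
    C = [c.astype(float).copy() for c in C_init]
    kk = sum(kj * kj for kj in k)
    for _ in range(num_iter):
        for i in range(len(C)):
            residual = C_obs - sum(kj * Cj for kj, Cj in zip(k, C))
            C[i] += step * (k[i] / kk) * residual   # partial correction of C_hi
    return C

# synthetic check: start from perturbed initial values, match the mixture
rng = np.random.default_rng(2)
true = [rng.random((8, 8)) for _ in range(4)]
k = [1.2, 0.9, 0.8, 1.3]
C_obs = sum(ki * Ci for ki, Ci in zip(k, true))
init = [Ci + 0.3 * rng.random((8, 8)) for Ci in true]
est = alternating_refine(C_obs, init, k)
err0 = np.linalg.norm(C_obs - sum(ki * Ci for ki, Ci in zip(k, init)))
err1 = np.linalg.norm(C_obs - sum(ki * Ci for ki, Ci in zip(k, est)))
```

Each sweep multiplies the residual by a factor strictly below 1, so the mismatch shrinks monotonically, which is the behaviour the alternate-iteration scheme relies on.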
Because of this relationship, variational Bayes estimation of the sub-degradation functions is carried out in the gradient-cepstrum domain: the method estimates C_h1, C_h2, C_h3, C_h4, synthesises the system cepstrum C_h with equation (11), and so obtains the system point spread function h. Finally, the sharp image f(x, y) is obtained with the L-R algorithm.
The method comprises the following specific implementation steps:
First, obtain the parameters: the focal length F, the lens diameter D, the aperture coefficient F, the CCD imaging pixel sizes w_x, w_y, the object distance z, the image distance v, the pixel counts K, P, the length l of each pixel of the photosensitive element, the distance d moved by the object within the exposure time τ, and the movement direction θ of the target.
And secondly, obtaining initial values h10, h20, h30 and h40 of each sub-degradation function by using the function modes of the formulas (5) to (10).
Third, solve the initial gradient cepstra C_h10, C_h20, C_h30, C_h40 of the sub-degradation functions and the gradient cepstrum C_∇g of the blurred image.
Fourth, with C_h10, C_h20, C_h30, C_h40 and C_∇g as initial values, and using the hyper-Laplacian distribution of natural-image gray gradients, obtain C_h1, C_h2, C_h3, C_h4 by variational Bayes.
Fifth, obtain the point spread function of the system using equation (11).
Sixth, solve the distribution of the system point spread function in the spatial domain and compute its expectation to obtain the system point spread function.
Seventh, restore the image with the L-R algorithm to obtain the sharp image f(x, y).
Investigators do not need to understand the variational-Bayes point-spread-function estimation or the principle of the L-R restoration algorithm; the extremely-blurred-image restoration system can be used with the following steps:
(1) For an extremely blurred image obtained from monitoring, input the parameters: the focal length F, the lens diameter D, the aperture coefficient F, the CCD imaging pixel sizes ω_x, ω_y, the object distance z, the image distance v, the pixel counts K, P, the length l of each pixel of the photosensitive element, the distance d moved by the object within the exposure time τ, and the movement direction θ of the target.
(2) Input these parameters and the blurred image into the system; the restoration system then produces the restored image.
In this process, the test parameters F, D, f, z, v, and d must be accurate to 0.1 mm, τ to 0.001 s, and θ to 0.1°. Parameter testing requires a rigorous procedure; different operators following these steps will obtain the same result.
As shown in Fig. 4, a blurred video frame captured around 5:46 p.m. on 15 March 2017 is read from the hard disk of the monitoring system.
From the product specification, the focal length F of the lens of the monitoring system is 6 mm, the aperture coefficient F is 1.4, the imaging pixel size ω_x is 6.4 μm, and the exposure time τ of the image sensor is 0.01 s. The vehicles were measured on site at a distance of about 108 m from the lens. The k parameters obtained by experiment are: k1 = 1.2, k2 = 0.9, k3 = 0.8, k4 = 1.3.
The result obtained by the patented method is shown in Fig. 5; the signal-to-noise ratio is improved by about 9 dB.
Claims (4)
1. A restoration method of a multiple-degraded extremely blurred image is characterized in that:
s1, acquiring data:
s01, acquiring external parameters;
s02, acquiring a blurred image matrix g (x, y), and solving a gradient cepstrum of the blurred image matrix g (x, y);
s2, establishing a system degradation function h (x, y) and each sub degradation function;
each sub-degradation function of S2 specifically includes:
the sub-degradation function of the airy disk mode caused by optical diffraction is h1(x, y);
the sub-degradation function of the image sensor at quantization is h2(x, y);
the sub-degradation function of camera defocus is h3(x, y);
the sub-degradation function of the camera and the target which move linearly relatively at a constant speed is h4(x, y);
s3, initializing each sub-degradation function and obtaining an initial value of each sub-degradation function; solving initial gradient cepstrum of each sub-degradation function;
s3 specifically includes:
substituting the parameter values obtained in the step S01 into the corresponding sub-degradation functions in the step S2 to initialize each sub-degradation function, so as to obtain initial values h10, h20, h30 and h40 of each sub-degradation function;
solving the initial gradient cepstra C_h10, C_h20, C_h30, C_h40 of the sub-degradation functions, and setting the gradient cepstrum of the system degradation function h(x, y) as C_h;
S4, taking the initial gradient cepstrum of each sub-degradation function and the gradient cepstrum of the fuzzy image as the input of variational Bayes, and solving the optimal distribution of each sub-degradation function;
the S4 specifically includes:
taking the initial gradient cepstra C_h10, C_h20, C_h30, C_h40 of the sub-degradation functions and C_∇g as the input of variational Bayes to obtain C_h1, C_h2, C_h3, C_h4;
S5, solving a point spread function h (x, y) of the system by utilizing the relation between each sub-degradation function and the point spread function of the system;
each sub-degradation function satisfies a linear relationship:
C_h = k1·C_h1 + k2·C_h2 + k3·C_h3 + k4·C_h4, where k1, k2, k3, k4 are constants and at least one of them is not 0;
and S6, taking h (x, y) and g (x, y) as input of the L-R algorithm, and obtaining a clear image matrix f (x, y).
2. A method for restoring a multiple degraded extremely blurred image according to claim 1, wherein:
the acquiring of the external parameter in S01 specifically includes:
S01, acquiring external parameters: obtaining the working parameters of the camera, namely the focal length F, the lens diameter D, the aperture coefficient F, the CCD imaging pixel sizes w_x, w_y, the object distance z, the image distance v, the pixel counts K, P, the length l of each pixel of the photosensitive element, the distance d moved by the object within the exposure time τ, and the movement direction θ of the target.
3. A method for restoring a multiple degraded extremely blurred image according to claim 1, wherein:
among the sub-degradation functions of S2, the sub-degradation function h1(x, y) of the airy disk mode caused by optical diffraction is specifically:
the sub-degradation function of the image sensor during quantization is h2(x, y) specifically:
the sub-degradation function of camera defocus is h3(x, y) specifically:
the sub-degradation function of the camera and the target which move linearly relatively at a constant speed is h4(x, y), which is specifically as follows:
4. a method for restoring a multiple degraded extremely blurred image according to claim 1, wherein:
for blurred images caused by four degradation factors, k parameter k1、k2、k3、k4The typical value is 1, the value of the k parameter changes near 1 according to the degradation condition of the imaging system, and when the degradation factors are more than four, the number of the k parameters is correspondingly increased.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710234684.5A CN107220945B (en) | 2017-04-12 | 2017-04-12 | Restoration method of multiple degraded extremely blurred image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710234684.5A CN107220945B (en) | 2017-04-12 | 2017-04-12 | Restoration method of multiple degraded extremely blurred image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107220945A CN107220945A (en) | 2017-09-29 |
CN107220945B true CN107220945B (en) | 2020-09-22 |
Family
ID=59927558
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710234684.5A Expired - Fee Related CN107220945B (en) | 2017-04-12 | 2017-04-12 | Restoration method of multiple degraded extremely blurred image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107220945B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111179158B (en) * | 2019-12-30 | 2023-07-18 | 深圳市商汤科技有限公司 | Image processing method, device, electronic equipment and medium |
CN113313655B (en) * | 2021-06-28 | 2022-09-23 | 合肥工业大学 | Blind image deblurring method based on saliency mapping and gradient cepstrum technology |
CN117058038B (en) * | 2023-08-28 | 2024-04-30 | 北京航空航天大学 | Diffraction blurred image restoration method based on even convolution deep learning |
CN117058118B (en) * | 2023-08-28 | 2024-04-05 | 北京航空航天大学 | Method for evaluating image deterioration effect caused by plane shielding glass silk screen |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101968881A (en) * | 2010-10-27 | 2011-02-09 | 东南大学 | Motion blurring and defocusing composite blurring image restoration method |
CN102708550A (en) * | 2012-05-17 | 2012-10-03 | 浙江大学 | Blind deblurring algorithm based on natural image statistic property |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102023518B1 (en) * | 2013-02-06 | 2019-09-23 | 삼성전자주식회사 | Imaging apparatus, ultrasonic imaging apparatus, method for image processing and method for ultrasonic image processing |
-
2017
- 2017-04-12 CN CN201710234684.5A patent/CN107220945B/en not_active Expired - Fee Related
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101968881A (en) * | 2010-10-27 | 2011-02-09 | 东南大学 | Motion blurring and defocusing composite blurring image restoration method |
CN102708550A (en) * | 2012-05-17 | 2012-10-03 | 浙江大学 | Blind deblurring algorithm based on natural image statistic property |
Non-Patent Citations (1)
Title |
---|
"Total variation image restoration for mixed blur in motion imaging"; Shi Mingzhu et al.; Optics and Precision Engineering; 20110831; vol. 19, no. 8; page 3 of the text *
Also Published As
Publication number | Publication date |
---|---|
CN107220945A (en) | 2017-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Lee et al. | Iterative filter adaptive network for single image defocus deblurring | |
JP6957197B2 (en) | Image processing device and image processing method | |
Li et al. | Aod-net: All-in-one dehazing network | |
Wang et al. | Recent progress in image deblurring | |
Chakrabarti et al. | Depth and deblurring from a spectrally-varying depth-of-field | |
CN110097509B (en) | Restoration method of local motion blurred image | |
Kee et al. | Modeling and removing spatially-varying optical blur | |
CN107220945B (en) | Restoration method of multiple degraded extremely blurred image | |
Quan et al. | Gaussian kernel mixture network for single image defocus deblurring | |
JP2013531268A (en) | Measuring distance using coded aperture | |
CN108629741B (en) | Fuzzy kernel estimation method based on L0 and L1 regular terms | |
Gilles et al. | Atmospheric turbulence restoration by diffeomorphic image registration and blind deconvolution | |
Quan et al. | Neumann network with recursive kernels for single image defocus deblurring | |
El-Henawy et al. | A comparative study on image deblurring techniques | |
Gupta et al. | Motion blur removal via coupled autoencoder | |
US11967096B2 (en) | Methods and apparatuses of depth estimation from focus information | |
CN116029924A (en) | Image processing method of infrared system by single-chip diffraction | |
CN114037636A (en) | Multi-frame blind restoration method for correcting image by adaptive optical system | |
Ranipa et al. | A practical approach for depth estimation and image restoration using defocus cue | |
Fazlali et al. | Atmospheric turbulence removal in long-range imaging using a data-driven-based approach | |
Maik et al. | Blind deconvolution using maximum a posteriori (MAP) estimation with directional edge based priori | |
Liu et al. | Guided image deblurring by deep multi-modal image fusion | |
Li et al. | Joint motion deblurring with blurred/noisy image pair | |
CN101131429A (en) | Image restoration method for image quality degraded imaging system with optical aberration and small hole diffraction | |
HWANG et al. | Multi-aperture image processing using deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20200922 Termination date: 20210412 |