US20100266218A1 - Method for deblurring an image that produces less ringing - Google Patents


Info

Publication number
US20100266218A1
US20100266218A1 (application US 12/741,184; also published as US 2010/0266218 A1)
Authority
US
United States
Prior art keywords
image
blurred image
mask
blurred
extending
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/741,184
Other versions
US8687909B2 (en)
Inventor
Radka Tezaur
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Nikon Corp
Assigned to NIKON CORPORATION (assignment of assignors interest; see document for details). Assignors: TEZAUR, RADKA
Publication of US20100266218A1
Application granted
Publication of US8687909B2
Legal status: Active (expiration adjusted)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/73
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20004: Adaptive image processing
    • G06T2207/20012: Locally adaptive
    • G06T2207/20172: Image enhancement details
    • G06T2207/20192: Edge enhancement; Edge preservation

Definitions

  • Equation 10 is similar to Equation 3, except the weight (“w(m)”) has been added to reduce the influence of the bad pixels on the reconstructed image.
  • a fast implementation using FFT is again possible.
  • This masking technique can be used also in combination with other modifications of Lucy-Richardson method, such as damping or acceleration.
  • the weight w(m) assigned to each pixel is allowed to be any number between zero (“0”) and one (“1”).
  • this method is not designed to and does not reduce ringing artifacts 32 P near the image border 34 P.
  • A different algorithm (referred to as the “border method”) has been developed to reduce ringing artifacts 32 P around the image border 34 P.
  • g is defined as the blurred image padded by zeros (“0's”).
  • the raw blurred image is extended around the border by padding the boundary with zeros (“0's”).
  • the following blurred mask is computed:
  • α(n) = Σ_m h(m,n) M(m).  Eq. 14
  • In Equation 14, α is the blurred mask that results from the blurring of mask M. Subsequently, from the α mask, another mask β is created. More specifically, mask β is the reciprocal of the α mask, except that a small constant ε is chosen and used to prevent division by zero (“0”) in Equation 18.
  • the β mask is the reciprocal of the α mask.
  • the resulting image is cropped to have the same field of view as the original blurred image. Stated in another fashion, the resulting image is cropped to remove the extended boundary.
  • FFT can be used for a fast implementation of this method, both for the Lucy-Richardson iterations and for computing the blurred mask.
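The border-method steps described above (zero padding of the blurred image, a mask M marking the original field of view, the blurred mask of Eq. 14 and its guarded reciprocal of Eq. 18, masked iterations, then cropping) can be sketched in NumPy. This is an illustrative reading of the algorithm, not the patent's reference implementation: the function name, the padding width, the value of the guard constant, and the embedding of the PSF kernel are all assumptions of this sketch.

```python
import numpy as np

def deblur_border_method(g, h_small, pad, n_iter=20, eps_const=1e-3):
    """Sketch of the 'border method': pad the blurred image g with zeros,
    build a mask M (1 inside the original field of view, 0 outside),
    compute the blurred mask alpha(n) = sum_m h(m,n) M(m), take its guarded
    reciprocal beta, run corrected Lucy-Richardson iterations, and crop.

    h_small is the PSF as a small kernel; pad is the border width."""
    g_ext = np.pad(g, pad)                      # zero padding around the border
    M = np.pad(np.ones_like(g), pad)            # mask of the original field of view
    # embed the PSF kernel in the extended frame, centered at the origin for np.fft
    h = np.zeros_like(g_ext)
    kh, kw = h_small.shape
    h[:kh, :kw] = h_small
    h = np.roll(h, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    H = np.fft.fft2(h)
    # blurred mask (correlation of the PSF with M) and its guarded reciprocal
    alpha = np.real(np.fft.ifft2(np.conj(H) * np.fft.fft2(M)))
    beta = 1.0 / np.maximum(alpha, eps_const)   # small constant prevents 1/0
    f = np.full_like(g_ext, g.mean())           # initial guess on the extended grid
    tiny = 1e-12
    for _ in range(n_iter):
        gk = np.real(np.fft.ifft2(H * np.fft.fft2(f)))          # reblur estimate
        ratio = M * g_ext / np.maximum(gk, tiny)                # masked data ratio
        corr = np.real(np.fft.ifft2(np.conj(H) * np.fft.fft2(ratio)))
        f = beta * f * corr                     # beta corrects the missing border data
    return f[pad:-pad, pad:-pad]                # crop back to the original field of view
```

Because the extension is masked out by M and compensated by beta, pixel values outside the original field of view never distort the reconstruction near the border.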
  • the masking method for the suppression of ringing around object edges based on the technique for masking out bad pixels described in Equations 9-13, and the border method for elimination of ringing around image borders described in Equations 17-21 can be combined into a single new method that reduces blurring and both (i) inhibits ringing around object edges, and (ii) inhibits ringing at image borders.
  • Equation 14
  • Equation 18 for border method
  • Equation 10 for Lucy-Richardson method with masking
  • the ones (“1's”) marking the field of view of the original blurred image are replaced with an edge mask that has values between 0 and 1.
  • the pixels that are at or near object edges have weight close to 0, while pixels in smooth areas have weight close to 1.
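The combined mask described above, in which the 1's of the border method are replaced by edge weights between 0 and 1, can be sketched as a small helper. The name `combined_mask` and the use of `np.pad` are assumptions of this illustration, not identifiers from the patent.

```python
import numpy as np

def combined_mask(edge_mask, pad):
    """Mask for the combined method: inside the original field of view the
    1's of the border method are replaced by edge-mask weights in [0, 1]
    (near 0 at or near object edges, near 1 in smooth areas), while the
    zero-padded extension around the border stays 0. A single run of masked
    Lucy-Richardson iterations with this mask can then suppress both edge
    ringing and border ringing."""
    return np.pad(np.clip(edge_mask, 0.0, 1.0), pad)  # zeros outside the field of view
```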
  • FIG. 3 illustrates the blurred image 214 , an expanded blurred image 360 , an extended edge mask 362 and the resulting adjusted image 216 .
  • an edge mask M is created from the blurred image 214 .
  • values in the mask are between 0 and 1.
  • In the edge mask 362 illustrated in FIG. 3, for simplicity, only values of 0 and 1 are represented.
  • The 1's in FIG. 3 can represent values that are closer to 1 than 0, while the 0's in FIG. 3 can represent values that are closer to 0 than 1.
  • the edge mask 362 has been extended by padding with zeros (“0's”).
  • the blurred image 214 has been expanded to create the extended blurred image (g) 360 . If the blurred image 214 is used as the initial guess for the iterations, padding that does not introduce artificial edges needs to be used. For example, extension 368 of the image by symmetric reflection (as illustrated in FIG. 3 ) can be used.
  • α(n) = Σ_m h(m,n) M(m).  Eq. 14
  • As the initial guess, an initial array is utilized.
  • The initial array can be generated by extending the blurred image 214 using symmetric reflection, or in another fashion that does not introduce artificial edges.
  • the initial array can be a non-zero constant array 451 or a random array 452 as illustrated in FIG. 4 .
  • the arrays 451, 452 illustrated in FIG. 4 are merely small, non-exclusive examples of possible arrays.
  • the initial array will have the size and shape of the image plus the extension.
  • f(0) = constant;
  • f(0) = random array; or
  • f(0) = g with symmetric extension.
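The three initial-guess options listed above can be set up with NumPy as follows. This is a sketch: `g` is a tiny stand-in for the blurred image, and the padding width, variable names, and random seed are arbitrary choices of this example.

```python
import numpy as np

g = np.arange(16.0).reshape(4, 4)   # stand-in for the blurred image
pad = 2                             # width of the extension around the border
shape = (g.shape[0] + 2 * pad, g.shape[1] + 2 * pad)

# f(0) = constant array on the extended grid
f0_const = np.full(shape, g.mean())

# f(0) = random array on the extended grid
f0_rand = np.random.default_rng(0).uniform(0.0, g.max(), shape)

# f(0) = g extended by symmetric reflection: no artificial edges are
# introduced at the boundary between the image and its extension
f0_sym = np.pad(g, pad, mode="symmetric")
```

The symmetric-reflection option mirrors the border pixels outward, so intensity varies smoothly across the original image boundary, whereas zero padding of the image itself would create an artificial step edge there.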
  • The resulting adjusted image 216 is cropped to have the same field of view as the original blurred image 214.
  • FIG. 5 illustrates another embodiment of a deblurring system 570 having features of the present invention.
  • the image apparatus 210 again captures the blurred image 214 (illustrated in FIG. 2A).
  • the blurred image 214 is transferred to a computer 572 (e.g. a personal computer) that includes a computer control system 518 (illustrated in phantom) that uses the deblurring method disclosed herein to deblur the blurred image 214 and provide the adjusted image 216 (illustrated in FIG. 3).

Abstract

A method for reducing blurring in a blurred image (14) of a scene (12) includes the steps of: (i) creating an edge mask (362) from the blurred image (14); (ii) extending the edge mask (362); (iii) forming an initial array (e.g. by extending the blurred image (360)); and (iv) performing Lucy-Richardson iterations, with masking. With the deblurring method disclosed herein, the reconstructed adjusted image (16) (i) does not have (or has significantly less) ringing artifacts around the edges (22) of the captured object(s) (20C), and (ii) does not have (or has significantly less) ringing artifacts around the border (24). As a result thereof, the adjusted image (16) is more attractive and more accurately represents the scene (12).

Description

    BACKGROUND
  • Cameras are commonly used to capture an image of a scene. Unfortunately, some of the captured images are blurred. FIG. 1 is a simplified illustration of a prior art, blurred, captured image 14P of a starfish. In this illustration, the relatively thick lines are used to represent the blurring of the captured image 14P.
  • A number of different methods exist for deblurring an image. For example, blurring of an image is commonly modeled as convolution and many deconvolution methods exist that can help to reduce blurring. However, most of these methods tend to produce ringing artifacts in the reconstructed image.
  • One commonly used deconvolution method is the Lucy-Richardson deconvolution method. FIG. 1 also illustrates a prior art reconstructed image 26P that was deconvolved using the Lucy-Richardson method. Although this method works quite well in reducing blurring, the reconstructed image is left with two types of ringing artifacts, namely ringing artifacts around the edges and other boundaries of objects in the image, and ringing artifacts around boundaries of the image (e.g. the image border). As a result thereof, the resulting reconstructed image 26P has ringing artifacts 28P (illustrated with two dashed lines) around the edges 30P of the captured object(s) 20P in the reconstructed image 26P, and has ringing artifacts 32P (illustrated with two dashed lines) around the boundaries 34P of the reconstructed image 26P. The ringing in the reconstructed image 26P typically consists of multiple parallel lines around the object 20P and image edges 34P, with the first line being the strongest and subsequent lines gradually becoming weaker.
  • As a result thereof, the corrected image is not completely satisfactory.
  • SUMMARY
  • The present invention is directed to a method for reducing blurring in a blurred image. In one embodiment, the method comprises the steps of: (i) creating an edge mask from the blurred image; (ii) extending the edge mask; (iii) forming an initial array (e.g. extending the blurred image); and (iv) performing Lucy-Richardson iterations, with masking. Additionally, this method can include the step of cropping of the extended image so that the reconstructed image is approximately the same size as the blurred image. In certain embodiments, with the deblurring method disclosed herein, the reconstructed adjusted image (i) has significantly less ringing artifacts around the edges of the captured object(s), and (ii) has significantly less ringing artifacts around boundaries. As a result thereof, the adjusted image is more attractive and more accurately represents the scene.
  • In one embodiment, the step of performing Lucy-Richardson iterations, with masking, is done until a pre-selected stopping criterion is reached.
  • In another embodiment, the step of performing Lucy-Richardson iterations includes the step of computing a blurred mask
  • α(n) = Σ_m h(m,n) M(m).
  • This equation is described in more detail below.
  • As provided herein, the blurred image and the edge mask can be extended with padding that does not introduce artificial edges. For example, the blurred image can be extended with symmetric reflection and the edge mask can be extended by padding with zeros.
  • The present invention is also directed to a device for deblurring a blurred image.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The novel features of this invention, as well as the invention itself, both as to its structure and its operation, will be best understood from the accompanying drawings, taken in conjunction with the accompanying description, in which similar reference characters refer to similar parts, and in which:
  • FIG. 1 is a simplified prior art illustration of a captured blurred image and a prior art reconstructed image;
  • FIG. 2A is a simplified view of a scene, an image apparatus having features of the present invention, a captured blurred image, and an adjusted image;
  • FIG. 2B is a simplified front perspective view of the image apparatus of FIG. 2A;
  • FIG. 3 illustrates the blurred image, an extended blurred image, an extended masked image, and the resulting adjusted image;
  • FIG. 4 illustrates a constant array and a random array having features of the present invention; and
  • FIG. 5 is a simplified illustration of another embodiment having features of the present invention.
  • DESCRIPTION
  • FIG. 2A is a simplified perspective illustration of an image apparatus 210 having features of the present invention, and a scene 212. The image apparatus 210 (i) captures a raw, blurred (illustrated with thick lines) image 214 (illustrated away from the image apparatus 210), and (ii) provides an adjusted image 216 (illustrated away from the image apparatus 210). In one embodiment, the image apparatus 210 includes a control system 218 (illustrated in phantom) that uses a unique deblurring method on the blurred image 214 to reduce the amount of blur in the generated adjusted image 216. As a result thereof, the adjusted image 216 is more attractive and more accurately represents the scene 212.
  • The type of scene 212 captured by the image apparatus 210 can vary. For example, the scene 212 can include one or more objects 220, e.g. animals, plants, mammals, and/or environments. For simplicity, in FIG. 2A, the scene 212 is illustrated as including one object 220. Alternatively, the scene 212 can include more than one object 220. In FIG. 2A, the object 220 is a starfish.
  • It should be noted that with the deblurring method disclosed herein, the reconstructed adjusted image 216 (i) does not have (or has significantly less) ringing artifacts around the edges 222 of the captured object(s) 220C, and (ii) does not have (or has significantly less) ringing artifacts around boundaries 224.
  • FIG. 2B illustrates a simplified, front perspective view of one, non-exclusive embodiment of the image apparatus 210. In this embodiment, the image apparatus 210 is a digital camera, and includes an apparatus frame 236, an optical assembly 238, and a capturing system 240 (illustrated as a box in phantom), in addition to the control system 218 (illustrated as a box in phantom). The design of these components can be varied to suit the design requirements and type of image apparatus 210. Further, the image apparatus 210 could be designed without one or more of these components.
  • The apparatus frame 236 can be rigid and support at least some of the other components of the image apparatus 210. In one embodiment, the apparatus frame 236 includes a generally rectangular shaped hollow body that forms a cavity that receives and retains at least some of the other components of the image apparatus 210.
  • The apparatus frame 236 can include an aperture 244 and a shutter mechanism 246 that work together to control the amount of light that reaches the capturing system 240. The shutter mechanism 246 can be activated by a shutter button 248. The shutter mechanism 246 can include a pair of blinds (sometimes referred to as “blades”) that work in conjunction with each other to allow the light to be focused on the capturing system 240 for a certain amount of time. Alternatively, for example, the shutter mechanism 246 can be all electronic and contain no moving parts. For example, an electronic capturing system 240 can have a capture time controlled electronically to emulate the functionality of the blinds.
  • The optical assembly 238 can include a single lens or a combination of lenses that work in conjunction with each other to focus light onto the capturing system 240. In one embodiment, the image apparatus 210 includes an autofocus assembly (not shown) including one or more lens movers that move one or more lenses of the optical assembly 238 in or out until the sharpest possible image of the subject is received by the capturing system 240.
  • The capturing system 240 captures information for the raw image. The design of the capturing system 240 can vary according to the type of image apparatus 210. For a digital type camera, the capturing system 240 includes an image sensor 250 (illustrated in phantom), a filter assembly 252 (illustrated in phantom), and a storage system 254 (illustrated in phantom).
  • The image sensor 250 receives the light that passes through the aperture 244 and converts the light into electricity. One non-exclusive example of an image sensor 250 for digital cameras is known as a charge coupled device (“CCD”). An alternative image sensor 250 that may be employed in digital cameras uses complementary metal oxide semiconductor (“CMOS”) technology.
  • The image sensor 250, by itself, produces a grayscale image as it only keeps track of the total quantity of the light that strikes the surface of the image sensor 250. Accordingly, in order to produce a full color image, the filter assembly 252 is generally used to capture the colors of the image.
  • The storage system 254 stores the various raw images (illustrated in FIG. 2A) and/or the adjusted images 216 (illustrated in FIG. 2A) before these images are ultimately printed out, deleted, transferred or downloaded to an auxiliary storage system or a printer. The storage system 254 can be fixedly or removably coupled to the apparatus frame 236. Non-exclusive examples of suitable storage systems 254 include flash memory, a floppy disk, a hard disk, or a writeable CD or DVD.
  • The control system 218 is electrically connected to and controls the operation of the electrical components of the image apparatus 210. The control system 218 can include one or more processors and circuits, and the control system 218 can be programmed to perform one or more of the functions described herein. In FIG. 2B, the control system 218 is secured to the apparatus frame 236 and the rest of the components of the image apparatus 210. Further, the control system 218 is positioned within the apparatus frame 236.
  • In certain embodiments, the control system 218 includes software that reduces the blurring in the blurred image 214 to provide the adjusted image 216.
  • Referring back to FIG. 2A, the image apparatus 210 includes an image display 256 that displays the blurred image 214 and/or the adjusted images 216. With this design, the user can decide which images 214, 216 should be stored and which images 214, 216 should be deleted. In FIG. 2A, the image display 256 is fixedly mounted to the rest of the image apparatus 210. Alternatively, the image display 256 can be secured with a hinge mounting system (not shown) that enables the display 256 to be pivoted. One non-exclusive example of an image display 256 includes an LCD screen.
  • Further, the image display 256 can display other information that can be used to control the functions of the image apparatus 210.
  • Moreover, the image apparatus 210 can include one or more control switches 258 electrically connected to the control system 218 that allow the user to control the functions of the image apparatus 210. For example, one or more of the control switches 258 can be used to selectively switch the image apparatus 210 to the blur reduction mode in which the deblurring method disclosed herein is activated.
  • In this discussion of the new deblurring method, m,n are vectors that denote positions of elements within the 2-D pixel arrays. Multiplication of arrays in formulas is element-wise multiplication, not the matrix product. Similarly, the division of arrays is element-wise division.
  • A blurred image g that is blurred by a known point spread function (“PSF”) h can be expressed as:
  • g(m) = Σ_n h(m,n) f(n),  Eq. 1
  • in which f represents an ideal, sharp image.
  • A prior art standard Lucy-Richardson algorithm uses the following iterations to find the approximation of the sharp image f:
  • g^(k)(m) = Σ_n h(m,n) f^(k)(n),  Eq. 2
  • f^(k+1)(n) = (1 / Σ_m h(m,n)) (Σ_m h(m,n) g(m) / g^(k)(m)) f^(k)(n).  Eq. 3
  • The iterative process is usually started with an initial guess f(0)=constant or f(0)=g. Usually, the first term on the right hand side of Equation 3 is omitted because the PSF is commonly normalized to satisfy
  • Σ_m h(m,n) = 1,  Eq. 4
  • to preserve image brightness. Also, the PSF is often assumed to be spatially invariant. As a result thereof,

  • h(m,n)=h(m−n)  Eq.5
  • and h can be represented by a single 2-D array. The sums in the formulas above then become convolution and correlation with the PSF array h, respectively (depending on whether the summation is over n or m). In this case, a fast implementation of Equations 2 and 3 using the FFT (Fast Fourier Transform) is possible:
  • g^(k) = F^(-1) H F f^(k),  Eq. 6
  • f^(k+1) = [F^(-1) H* F(g/g^(k)) / u] f^(k),  Eq. 7
  • where F is the Fourier transform operator, H is the (discretized) optical transfer function corresponding to the given PSF (the Fourier transform of h padded with zeros and shifted), * denotes complex conjugation, and u is defined by the equation below,

  • u = F^(-1) H* F 1  Eq. 8
  • where 1 is an array of ones.
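  • The FFT form in Equations 6-8 rests on the convolution theorem. The following is an illustrative check, not part of the patent disclosure: it works in one dimension, uses a naive DFT as a stand-in for the FFT, and assumes a made-up signal and a normalized PSF that has been zero padded to the signal length and wrapped so its center sits at index 0.

```python
import cmath

def dft(x):
    # Naive discrete Fourier transform (stand-in for the FFT here).
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    # Inverse transform F^-1.
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]

# A 1-D "image" and a normalized PSF (entries sum to 1, per Eq. 4),
# padded and shifted as described for the optical transfer function H.
f = [0.0, 0.0, 4.0, 0.0, 0.0, 0.0, 0.0, 0.0]
h = [0.5, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25]
N = len(f)
Hk = dft(h)  # the (discretized) optical transfer function

# Eq. 6 applied to f itself: F^-1 H F f ...
g_spec = [v.real for v in idft([Hk[k] * Fk for k, Fk in enumerate(dft(f))])]

# ... equals direct circular convolution with the PSF.
g_direct = [sum(h[(m - n) % N] * f[n] for n in range(N)) for m in range(N)]

# Eq. 8: for a normalized PSF, u = F^-1 H* F 1 is an array of ones.
u = [v.real for v in idft([Hk[k].conjugate() * Ok
                           for k, Ok in enumerate(dft([1.0] * N))])]
```

The spectral product reproduces the circular convolution to machine precision, and u comes out as all ones, which is why the 1/u factor is often dropped for normalized PSFs.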
  • As described above, one of the main problems with the basic Lucy-Richardson algorithm is the unpleasant ringing artifacts that appear in the reconstructed image 26P (illustrated in FIG. 1). Referring back to FIG. 1, with the basic Lucy-Richardson method, there are two types of ringing artifacts, namely (i) the ringing 28P around object edges 30P inside the reconstructed image 26P, and (ii) the ringing 32P around image boundaries 34P of the reconstructed image 26P.
  • One technique that is used with Lucy-Richardson iterations helps to exclude (mask out) bad pixels within the observed image and to eliminate (or reduce) their influence on the reconstructed image. This technique (referred to as the "masking method") consists of introducing weights (w(m)) into the algorithm to reduce the influence of bad pixels on the reconstructed image. With weights, if the pixel at location m is considered to be bad (for example, because it is known that the sensor has a defect at this particular location), the weight is zero (w(m)=0). Alternatively, if the pixel at location m is considered to be good, the weight is one (w(m)=1). The weight mask inhibits bad pixel values from influencing the other pixel values during the deconvolution process. However, the good pixel values still influence the values reconstructed at the bad pixels. The same masking technique can be used to suppress ringing around object edges. This is accomplished by not allowing pixel values in edge regions to influence pixel values in smooth regions (where the resulting ringing is most visible). That is, the edge pixels are treated as the "bad" pixels and the pixels in the smooth regions as the good pixels. The edge mask can be created using any of many existing edge detection techniques that can identify locations of steep change in image intensity or color.
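  • As an illustration only (the text does not prescribe a particular edge detector), a weight mask can be built with a simple central-difference gradient and a threshold; both the threshold value and the tiny step-edge test image below are assumptions of this sketch.

```python
def edge_mask(img, threshold):
    # Assign weight 0 to pixels where the local intensity step exceeds
    # `threshold` (treated as "bad"/edge pixels) and weight 1 in smooth
    # regions, matching the 0/1 weighting described in the text.
    H, W = len(img), len(img[0])
    w = [[1.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            gx = img[y][min(x + 1, W - 1)] - img[y][max(x - 1, 0)]
            gy = img[min(y + 1, H - 1)][x] - img[max(y - 1, 0)][x]
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                w[y][x] = 0.0
    return w

# A vertical step edge: left half dark, right half bright. The two
# columns straddling the step are masked out; the flat regions keep w=1.
img = [[0.0, 0.0, 10.0, 10.0] for _ in range(4)]
w = edge_mask(img, 5.0)
```

A graded mask (values between 0 and 1, as in the variation described further below) could be obtained the same way by mapping the gradient magnitude smoothly to [0, 1] instead of thresholding it.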
  • The resulting formulas for the Lucy-Richardson iterations with masking look as follows:
  • g^(k)(m) = Σ_n h(m,n) f^(k)(n),  Eq. 9
  • f^(k+1)(n) = [1 / Σ_m w(m) h(m,n)] (Σ_m w(m) h(m,n) g(m)/g^(k)(m)) f^(k)(n).  Eq. 10
  • Basically, Equation 10 is similar to Equation 3, except that the weight ("w(m)") has been added to reduce the influence of the bad pixels on the reconstructed image. When the PSF is spatially invariant, a fast implementation using the FFT is again possible:
  • g^(k) = F^(-1) H F f^(k),  Eq. 11
  • f^(k+1) = [F^(-1) H* F(w g/g^(k)) / u] f^(k),  Eq. 12
  • u = F^(-1) H* F w.  Eq. 13
  • This masking technique can be used also in combination with other modifications of Lucy-Richardson method, such as damping or acceleration.
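  • One masked iteration (Equations 9 and 10) can be sketched as follows; this is an illustration, not the patented method, and the "hot pixel" scenario, image size, and box PSF are made-up assumptions. The sketch shows the behavior the text describes: the masked-out pixel's corrupt value does not leak into its neighbors, while the neighbors still determine the value reconstructed at the masked location.

```python
def conv2(img, ker, flip):
    # 'same'-size convolution (flip=True) / correlation (flip=False),
    # zero padded; all array products elsewhere are element-wise.
    H, W = len(img), len(img[0])
    kh, kw = len(ker), len(ker[0])
    cy, cx = kh // 2, kw // 2
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            for i in range(kh):
                for j in range(kw):
                    dy, dx = i - cy, j - cx
                    yy, xx = (y - dy, x - dx) if flip else (y + dy, x + dx)
                    if 0 <= yy < H and 0 <= xx < W:
                        out[y][x] += ker[i][j] * img[yy][xx]
    return out

def lr_masked_step(f, g, h, w, eps=1e-12):
    # One Lucy-Richardson iteration with masking (Eqs. 9 and 10),
    # including the leading 1 / sum_m w(m) h(m,n) factor.
    H, W = len(g), len(g[0])
    gk = conv2(f, h, flip=True)                                   # Eq. 9
    ratio = [[w[y][x] * g[y][x] / (gk[y][x] + eps) for x in range(W)]
             for y in range(H)]
    num = conv2(ratio, h, flip=False)
    den = conv2(w, h, flip=False)           # sum_m w(m) h(m,n)
    return [[f[y][x] * num[y][x] / (den[y][x] + eps)              # Eq. 10
             for x in range(W)] for y in range(H)]

h = [[1.0 / 9.0] * 3 for _ in range(3)]
g = [[1.0] * 5 for _ in range(5)]
g[2][2] = 100.0                  # a defective ("hot") pixel in a flat scene
w_bad = [[1.0] * 5 for _ in range(5)]
w_bad[2][2] = 0.0                # mask the defective pixel out
w_all = [[1.0] * 5 for _ in range(5)]
f0 = [[1.0] * 5 for _ in range(5)]
masked = lr_masked_step(f0, g, h, w_bad)
unmasked = lr_masked_step(f0, g, h, w_all)
```

With the mask, the reconstruction at the hot pixel is rebuilt from its good neighbors and stays near the true value of 1; without it, the corrupt observation inflates both the pixel and its neighborhood.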
  • Further, in a variation of the masking method described above, rather than only zero ("0") or one ("1"), the weight w(m) assigned to each pixel is allowed to be any number between zero ("0") and one ("1"). In this variation, to suppress the ringing 28P around object edges 30P, pixels at or near edges 30P get a weight of zero (w(m)=0) or close to zero ("0"), while pixels in smooth areas get a larger weight (w(m)=1 or close to one ("1")). However, this method is not designed to and does not reduce the ringing artifacts 32P near the image border 34P.
  • A different algorithm (referred to as the "border method") has been developed to reduce the ringing artifacts 32P around the image border 34P. First, in this algorithm, g is defined as the blurred image padded by zeros ("0's"). Stated in another fashion, the raw blurred image is extended around the border by padding the boundary with zeros ("0's"). Next, an array M is created such that M(m)=1 for pixel locations corresponding to the field of view of the blurred image before it was extended, and M(m)=0 at the locations corresponding to the extension. Next, the following equation is computed:
  • α(n) = Σ_m h(m,n) M(m).  Eq. 14
  • In Equation 14, α is the blurred mask that results from the blurring of mask M. Subsequently, from the α mask, another mask β is created. More specifically, mask β is the reciprocal of the α mask, except that a small constant σ is chosen and set to prevent division by zero ("0") in Equation 15,

  • β(n)=1/α(n) if α(n)>σ,  Eq.15

  • or

  • β(n)=0 otherwise.  Eq.16
  • As stated above, in these equations, the β mask is a thresholded reciprocal of the α mask.
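  • Building M, α, and β (Equations 14-16) can be sketched as follows; the image size, padding width, box PSF, and value of σ are illustrative assumptions of this sketch, not values prescribed by the text.

```python
def correlate(img, ker):
    # alpha(n) = sum_m h(m,n) M(m): for a spatially invariant PSF this is
    # a correlation of the mask with the PSF array, zero padded (Eq. 14).
    H, W = len(img), len(img[0])
    kh, kw = len(ker), len(ker[0])
    cy, cx = kh // 2, kw // 2
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            for i in range(kh):
                for j in range(kw):
                    yy, xx = y + i - cy, x + j - cx
                    if 0 <= yy < H and 0 <= xx < W:
                        out[y][x] += ker[i][j] * img[yy][xx]
    return out

def border_masks(height, width, pad, h, sigma=1e-3):
    # M is 1 over the original field of view and 0 over the extension;
    # alpha is the blurred mask (Eq. 14); beta is its thresholded
    # reciprocal (Eqs. 15 and 16).
    Hp, Wp = height + 2 * pad, width + 2 * pad
    M = [[1.0 if pad <= y < pad + height and pad <= x < pad + width else 0.0
          for x in range(Wp)] for y in range(Hp)]
    alpha = correlate(M, h)
    beta = [[1.0 / a if a > sigma else 0.0 for a in row] for row in alpha]
    return M, alpha, beta

# A 4x4 field of view padded by 2 on each side, with a 3x3 box PSF.
h = [[1.0 / 9.0] * 3 for _ in range(3)]
M, alpha, beta = border_masks(4, 4, 2, h)
```

Deep inside the field of view α is 1 (so β is 1 and the update is unchanged); in the far extension α is 0 and β is clamped to 0; just outside the field of view β exceeds 1, compensating for energy blurred out across the border.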
  • Subsequently, the algorithm is initialized with f(0)=constant.
  • Next, the following iterations are computed until a given stopping criterion is reached:
  • g^(k)(m) = Σ_n h(m,n) f^(k)(n),  Eq. 17
  • f^(k+1)(n) = β(n) (Σ_m h(m,n) g(m)/g^(k)(m)) f^(k)(n).  Eq. 18
  • Finally, the resulting image is cropped to have the same field of view as the original blurred image. Stated in another fashion, the resulting image is cropped to remove the extended boundary.
  • If the PSF is spatially invariant, FFT can be used for fast implementation of this method. It can be used to compute both the Lucy-Richardson iterations,
  • g^(k) = F^(-1) H F f^(k),  Eq. 19
  • f^(k+1) = β F^(-1) H* F(g/g^(k)) f^(k),  Eq. 20
  • and the array α becomes,

  • α = F^(-1) H* F M.  Eq. 21
  • As disclosed herein, in one embodiment, the masking method for the suppression of ringing around object edges (based on the technique for masking out bad pixels described in Equations 9-13) and the border method for the elimination of ringing around image borders (described in Equations 17-21) can be combined into a single new method that reduces blurring and both (i) inhibits ringing around object edges, and (ii) inhibits ringing at image borders.
  • One key to combining both methods is the unique observation that the border method described in Equations 17-21 can be reformulated to use a mask in exactly the same way as the masking method described in Equations 9-13. In the border method, because g is the original image with an extended boundary padded by zeros (“0's”), and M has ones (“1's”) in the locations corresponding to the field of view of the original blurred image and zeros (“0's”) in the locations corresponding to the extension,

  • g=Mg  Eq.22
  • As a reminder, the multiplication of arrays here is element-wise, the same as in all other formulas in this document. Setting aside the additional step in the border method in which the reciprocal mask β is created to prevent division by zero ("0"), then, by using Equation 14 and replacing g(m) with M(m)g(m), Equation 18 (for the border method) becomes Equation 10 (for the Lucy-Richardson method with masking), in which

  • w=M.  Eq.23
  • Thus, to combine both methods, in M, the ones ("1's") marking the field of view of the original blurred image are replaced with an edge mask that has values between 0 and 1. In this design, the pixels that are at or near object edges have a weight close to 0, while pixels in smooth areas have a weight close to 1. The resulting algorithm is described in more detail below.
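  • The reformulation above can be checked numerically. The sketch below is illustrative only (the 4x4 image values, padding of 1, and box PSF are assumptions): it computes one update of Equation 18 with β = 1/α and one update of Equation 10 with w = M on a zero-padded image, and confirms that they agree. The padding is small enough that α > 0 everywhere, so the σ-threshold is never needed.

```python
def conv2(img, ker, flip):
    # 'same'-size convolution (flip=True) / correlation (flip=False),
    # zero padded at the array border.
    H, W = len(img), len(img[0])
    kh, kw = len(ker), len(ker[0])
    cy, cx = kh // 2, kw // 2
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            for i in range(kh):
                for j in range(kw):
                    dy, dx = i - cy, j - cx
                    yy, xx = (y - dy, x - dx) if flip else (y + dy, x + dx)
                    if 0 <= yy < H and 0 <= xx < W:
                        out[y][x] += ker[i][j] * img[yy][xx]
    return out

h = [[1.0 / 9.0] * 3 for _ in range(3)]
n, pad = 4, 1
Np = n + 2 * pad
# A 4x4 blurred image extended by zero padding, so g = M g element-wise.
g = [[0.0] * Np for _ in range(Np)]
for y in range(n):
    for x in range(n):
        g[pad + y][pad + x] = float(y + x + 1)
M = [[1.0 if pad <= y < pad + n and pad <= x < pad + n else 0.0
      for x in range(Np)] for y in range(Np)]
alpha = conv2(M, h, flip=False)                       # Eq. 14
beta = [[1.0 / a for a in row] for row in alpha]      # alpha > 0 here
f = [[1.0] * Np for _ in range(Np)]                   # current estimate
gk = conv2(f, h, flip=True)

# Border-method update (Eq. 18).
r1 = [[g[y][x] / gk[y][x] for x in range(Np)] for y in range(Np)]
c1 = conv2(r1, h, flip=False)
upd_border = [[beta[y][x] * c1[y][x] * f[y][x] for x in range(Np)]
              for y in range(Np)]

# Masked update (Eq. 10) with w = M; since g = M g, the ratios coincide.
r2 = [[M[y][x] * g[y][x] / gk[y][x] for x in range(Np)] for y in range(Np)]
c2 = conv2(r2, h, flip=False)
den = conv2(M, h, flip=False)                         # sum_m w(m) h(m,n)
upd_masked = [[f[y][x] * c2[y][x] / den[y][x] for x in range(Np)]
              for y in range(Np)]
```

The two updates match element-wise, which is exactly the observation that lets the two methods share one set of iterations.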
  • FIG. 3 illustrates the blurred image 214, an expanded blurred image 360, an extended edge mask 362 and the resulting adjusted image 216. In the current method, an edge mask M is created from the blurred image 214. In the edge mask M, the values are between 0 and 1. In one embodiment, pixels in smooth regions are assigned relatively high values (i.e. w(m)=1 or values closer to 1 than 0) while pixels in the edge regions are assigned relatively small values (i.e. w(m)=0 or values closer to 0 than 1). In the edge mask 362 illustrated in FIG. 3, for simplicity, only values of 0 and 1 are represented. It should be noted that the 1's in FIG. 3 can represent values that are closer to 1 than 0, while the 0's in FIG. 3 can represent values that are closer to 0 than 1. It should also be noted that the edge mask 362 has been extended by padding with zeros ("0's").
  • Further, the blurred image 214 has been expanded to create the extended blurred image (g) 360. If the blurred image 214 is used as the initial guess for the iterations, padding that does not introduce artificial edges needs to be used. For example, extension 368 of the image by symmetric reflection (as illustrated in FIG. 3) can be used.
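  • Extension by symmetric reflection can be sketched as follows (illustrative only; the 2x2 test image is a made-up stand-in). Because the border is mirrored, the extension continues the image smoothly and introduces no artificial step edge at the seam.

```python
def reflect(i, n):
    # Map an out-of-range index onto [0, n) by symmetric reflection,
    # with the edge pixel repeated: ..., img[1], img[0] | img[0], img[1], ...
    j = i % (2 * n)   # Python's % already yields a value in [0, 2n)
    return j if j < n else 2 * n - 1 - j

def extend_symmetric(img, pad):
    # Extended blurred image g: the original padded on all sides by
    # symmetric reflection rather than by zeros.
    H, W = len(img), len(img[0])
    return [[img[reflect(y - pad, H)][reflect(x - pad, W)]
             for x in range(W + 2 * pad)] for y in range(H + 2 * pad)]

img = [[1.0, 2.0],
       [3.0, 4.0]]
padded = extend_symmetric(img, 1)
# padded == [[1, 1, 2, 2],
#            [1, 1, 2, 2],
#            [3, 3, 4, 4],
#            [3, 3, 4, 4]]
```

Zero padding the same image would create a sharp step from 0 to the border values, which is precisely the artificial edge this reflection avoids when the extended image is used as the initial guess.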
  • Next, compute
  • α(n) = Σ_m h(m,n) M(m).  Eq. 24
  • where α is the blurred mask that results from the blurring of mask M. Subsequently, mask β is created as the reciprocal of the α mask. A small constant σ is chosen and set to prevent division by zero ("0") in Equation 25,

  • β(n)=1/α(n) if α(n)>σ,  Eq.25

  • or

  • β(n)=0 otherwise.  Eq.26
  • Next, the initial guess is chosen, i.e. an initial array is provided. For example, the initial array can be generated by extending the blurred image 360 using symmetric reflection or in another fashion that does not introduce artificial edges. Alternatively, the initial array can be a non-zero constant array 451 or a random array 452 as illustrated in FIG. 4. It should be noted that the arrays 451, 452 illustrated in FIG. 4 are merely a small, non-exclusive portion of the possible arrays. In any case, the initial array will have the size and shape of the image plus the extension. Thus, for example, f(0)=constant, f(0)=random variable, or f(0)=g with symmetric extension.
  • Next, the Lucy-Richardson iterations with masking are repeated until a preselected stopping criterion is reached:
  • g^(k)(m) = Σ_n h(m,n) f^(k)(n),  Eq. 27
  • f^(k+1)(n) = β(n) (Σ_m h(m,n) M(m) g(m)/g^(k)(m)) f^(k)(n).  Eq. 28
  • Finally, the resulting adjusted image 216 is cropped to have the same field of view as the original blurred image 214.
  • This algorithm can also be combined with acceleration and damping techniques for Lucy-Richardson method. If the PSF is spatially invariant, then FFT can be used for fast implementation:
  • g^(k) = F^(-1) H F f^(k),  Eq. 29
  • f^(k+1) = β (F^(-1) H* F(M g/g^(k))) f^(k),  Eq. 30
  • and α can be computed using Equation 21.
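  • The combined method can be assembled end to end as a sketch (illustrative throughout, not the patented implementation: a box PSF, an all-ones "no edges" mask, and a flat test image are assumed). It follows the steps above: extend the edge mask with zeros, extend the blurred image by symmetric reflection, form α and β, run the masked iterations of Equations 27 and 28, and crop.

```python
def conv2(img, ker, flip):
    # 'same'-size convolution (flip=True) / correlation (flip=False),
    # zero padded at the array border.
    H, W = len(img), len(img[0])
    kh, kw = len(ker), len(ker[0])
    cy, cx = kh // 2, kw // 2
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            for i in range(kh):
                for j in range(kw):
                    dy, dx = i - cy, j - cx
                    yy, xx = (y - dy, x - dx) if flip else (y + dy, x + dx)
                    if 0 <= yy < H and 0 <= xx < W:
                        out[y][x] += ker[i][j] * img[yy][xx]
    return out

def reflect(i, n):
    # Symmetric-reflection index mapping for the image extension.
    j = i % (2 * n)
    return j if j < n else 2 * n - 1 - j

def deblur(blurred, edge_mask, h, pad, iters, sigma=1e-3, eps=1e-12):
    # Combined method: M = edge mask extended with zeros, g = blurred image
    # extended by symmetric reflection, masked Lucy-Richardson iterations
    # (Eqs. 27 and 28), then a crop back to the original field of view.
    H, W = len(blurred), len(blurred[0])
    Hp, Wp = H + 2 * pad, W + 2 * pad
    M = [[0.0] * Wp for _ in range(Hp)]
    for y in range(H):
        for x in range(W):
            M[pad + y][pad + x] = edge_mask[y][x]
    g = [[blurred[reflect(y - pad, H)][reflect(x - pad, W)]
          for x in range(Wp)] for y in range(Hp)]
    alpha = conv2(M, h, flip=False)                     # Eq. 24
    beta = [[1.0 / a if a > sigma else 0.0 for a in row] for row in alpha]
    f = [row[:] for row in g]                           # f(0) = extended g
    for _ in range(iters):
        gk = conv2(f, h, flip=True)                     # Eq. 27
        ratio = [[M[y][x] * g[y][x] / (gk[y][x] + eps)
                  for x in range(Wp)] for y in range(Hp)]
        num = conv2(ratio, h, flip=False)
        f = [[beta[y][x] * num[y][x] * f[y][x]          # Eq. 28
              for x in range(Wp)] for y in range(Hp)]
    return [row[pad:pad + W] for row in f[pad:pad + H]]

# Smoke test on a featureless 6x6 image: with no edges the mask is all
# ones, and the iterations should leave the flat image essentially unchanged.
h = [[1.0 / 9.0] * 3 for _ in range(3)]
blurred = [[1.0] * 6 for _ in range(6)]
mask = [[1.0] * 6 for _ in range(6)]
out = deblur(blurred, mask, h, pad=2, iters=10)
```

On a flat image every update multiplies by β·α ≈ 1 inside the field of view, so the output reproduces the input; on a real blurred image the same loop sharpens the interior while the β mask and the reflective extension suppress border ringing.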
  • FIG. 5 illustrates another embodiment of a deblurring system 570 having features of the present invention. In this embodiment, the image apparatus 210 again captures the blurred image 214 (illustrated in FIG. 1). However, in this embodiment, the blurred image 214 is transferred to a computer 572 (e.g. a personal computer) that includes a computer control system 518 (illustrated in phantom) that uses the deblurring method disclosed herein to deblur the blurred image 214 and provide the adjusted image 216 (illustrated in FIG. 3).
  • While the current invention is disclosed in detail herein, it is to be understood that it is merely illustrative of the presently preferred embodiments of the invention and that no limitations are intended to the details of construction or design herein shown other than as described in the appended claims.

Claims (19)

1. A method for reducing blurring in a blurred image, the method comprising the steps of:
creating an edge mask from the blurred image;
providing an extended initial array;
extending the edge mask;
performing Lucy-Richardson iterations, with masking using the extended edge mask and the extended initial array to create a deblurred image; and
cropping the deblurred image to remove the extended portions of the deblurred image.
2. The method of claim 1 wherein the step of performing Lucy-Richardson iterations, with masking, is done until a pre-selected stopping criterion is reached.
3. The method of claim 1 wherein the step of providing an extended initial array includes the step of extending the blurred image.
4. The method of claim 3 wherein the step of extending the blurred image is done with padding that does not introduce artificial edges.
5. The method of claim 3 wherein the step of extending the blurred image is done with symmetric reflection.
6. The method of claim 1 wherein the step of providing an extended initial array includes generating a constant array.
7. The method of claim 1 wherein the step of providing an extended initial array includes generating a random array.
8. The method of claim 1 wherein the step of extending the edge mask is done by padding with zeros.
10. The method of claim 1 wherein the step of performing Lucy-Richardson with masking includes computing
f^(k+1)(n) = β(n) (Σ_m h(m,n) M(m) g(m)/g^(k)(m)) f^(k)(n)
wherein f represents a sharp image, M is the edge mask, β is a mask, h is a known point spread function, g is the blurred image, and m,n denote positions of elements within a two dimensional pixel array of the blurred image.
11. The method of claim 1 wherein the step of performing Lucy-Richardson iterations includes computing
α(n) = Σ_m h(m,n) M(m);
wherein M is the edge mask, α is a blurred mask that results from the blurring of mask M, h is a known point spread function, and m,n denote positions of elements within a two dimensional pixel array of the blurred image.
12. A method for reducing blurring in a blurred image, the method comprising the steps of:
creating an edge mask from the blurred image;
extending the edge mask;
providing an extended initial array;
performing Lucy-Richardson iterations, with masking using the extended edge mask and the extended initial array until a pre-selected stopping criterion is reached to create a deblurred image; and
cropping the deblurred image to remove the extended portions of the deblurred image.
13. The method of claim 12 wherein the step of providing an extended initial array includes the step of extending the blurred image.
14. The method of claim 13 wherein the step of extending the blurred image is done with padding that does not introduce artificial edges.
15. The method of claim 13 wherein the step of extending the blurred image is done with symmetric reflection.
16. The method of claim 12 wherein the step of providing an extended initial array includes generating a constant array.
17. The method of claim 12 wherein the step of providing an extended initial array includes generating a random array.
18. The method of claim 12 wherein the step of extending the edge mask is done by padding with zeros.
19. The method of claim 12 wherein the step of performing Lucy-Richardson with masking includes computing
f^(k+1)(n) = β(n) (Σ_m h(m,n) M(m) g(m)/g^(k)(m)) f^(k)(n)
wherein f represents a sharp image, M is the edge mask, β is a mask, h is a known point spread function, g is the blurred image, and m,n denote positions of elements within a two dimensional pixel array of the blurred image.
20. The method of claim 12 wherein the step of performing Lucy-Richardson iterations includes computing
α(n) = Σ_m h(m,n) M(m);
wherein M is the edge mask, α is a blurred mask that results from the blurring of mask M, h is a known point spread function, and m,n denote positions of elements within a two dimensional pixel array of the blurred image.
US12/741,184 2008-04-16 2008-04-16 Method for deblurring an image that produces less ringing Active 2029-12-26 US8687909B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2008/004895 WO2009128798A1 (en) 2008-04-16 2008-04-16 Method for deblurring an image that produces less ringing

Publications (2)

Publication Number Publication Date
US20100266218A1 true US20100266218A1 (en) 2010-10-21
US8687909B2 US8687909B2 (en) 2014-04-01



