US9214011B2 - Camera defocus direction estimation - Google Patents
Camera defocus direction estimation
- Publication number
- US9214011B2 (application US14/269,990, published as US201414269990A)
- Authority
- US
- United States
- Prior art keywords
- image
- frequency domain
- determining
- defocus direction
- function
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Images
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/73—Deblurring; Sharpening (formerly G06T5/001)
- G06T5/10—Image enhancement or restoration using non-spatial domain filtering
- G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
- G06T2207/20048—Transform domain processing; G06T2207/20056—Discrete and fast Fourier transform [DFT, FFT]
- G06T2207/20081—Training; Learning
Definitions
- This disclosure pertains generally to digital camera focus control, and more particularly to estimating the direction of defocus within a digital camera.
- The present disclosure describes a mechanism for readily determining the direction of defocus, which overcomes the shortcomings of previous defocus estimation techniques.
- Methods and apparatus are described for determining defocus direction, that is, whether the object captured in an image was in front of, or behind, the camera's focus plane. Rapidly determining this direction from a single image allows autofocusing to be performed more readily, and the mechanism is applicable to other applications that benefit from single-image defocus direction estimation.
- Defocus direction is estimated from a frequency domain analysis of the camera defocus point spread functions (PSFs) of the captured image. Differences in the PSFs are evaluated in the frequency domain in relation to training images to estimate feature distributions. Statistics are then applied to determine whether the image was taken in front of, or behind, the focus plane for the object.
- FIG. 1 is a diagram of defocus direction and the ambiguity of lens position addressed according to embodiments of the present disclosure.
- FIG. 2 is a block diagram of a camera apparatus configured for performing camera defocus direction estimation according to an embodiment of the present disclosure.
- FIG. 3A and FIG. 3B are diagrams of image focus in relation to step edges utilized according to an embodiment of the present disclosure.
- FIG. 4A and FIG. 4B are images of point spread functions (PSFs) at 5 depths of field (DOFs) in front, and behind, the focus plane exemplifying a defocus direction problem resolved according to an embodiment of the present disclosure.
- FIG. 5 is a plot of step edges convolved with point spread functions (PSFs) in front, and behind, the focus plane exemplifying a defocus direction problem resolved according to an embodiment of the present disclosure.
- FIG. 6A and FIG. 6B are a plot and a magnified section thereof, respectively, of Fourier transform magnitude of point spread functions (PSFs) showing increased distinction in the frequency domain from which defocus direction is determined according to an embodiment of the present disclosure.
- FIG. 7A and FIG. 7B are images of point spread functions (PSFs) of Fourier transforms in front and behind the focus plane demonstrating aspects of discerning defocus direction according to an embodiment of the present disclosure.
- FIG. 8 is a plot of Fourier transform PSFs showing identification of frequency pairs utilized according to an embodiment of the present disclosure.
- FIG. 9 is a flow diagram of defocus direction estimation according to an embodiment of the present disclosure.
- FIG. 10 is a flow diagram of defocus direction estimation according to an embodiment of the present disclosure, showing increased particularity in relation to FIG. 9 .
- FIG. 11A and FIG. 11B are a flow diagram of defocus direction estimation according to an embodiment of the present disclosure, showing a level of particularity based on example equations.
- FIG. 1 illustrates a diagram 10 of the autofocus ambiguity issue in which the direction of defocus is ambiguous.
- An image sensor 12 is shown in relation to a first lens position 14 , yielding a first focus plane 16 , and a second lens position 18 and associated second focus position 20 .
- An ambiguity arises using traditional autofocus mechanisms in determining whether the lens is actually in the first position 14 or second position 18 in relation to object 22 .
- FIG. 2 illustrates an example embodiment 30 of an electronic device within which camera defocus direction estimation is performed, such as an image capture device, e.g., a digital still and/or video camera.
- An imager 32 is shown for outputting collected images to a computer processor 36 (e.g., one or more central processing units (CPUs), microcontrollers, and/or digital signal processors (DSPs)), which is coupled to at least one memory 38 and optionally to auxiliary memory 40 , such as removable media.
- Other elements of a conventional image capture system (camera) are also depicted, including a focus/zoom control 34 , and interfaces shown by way of example as an optional image display 42 , optional touch screen 44 , and optional non-touch-screen interface 46 , which exist on typical camera systems although they are not necessary for practicing the present technology.
- Computer processor 36 in combination with memory 38 (and/or auxiliary memory 40 ) performs defocus direction estimation, which can be utilized, for example, within an autofocusing process of imaging device 30 .
- The defocus direction estimation is performed in response to instructions executed from memory 38 and/or auxiliary memory 40 .
- Programming stored on memory 38 ( 40 ) is executable on computer processor 36 .
- The present technology is non-limiting with regard to the configuration of this memory, insofar as the memory is non-transitory in nature and thus does not constitute a transitory electronic signal.
- The present technology may include any form of computer-readable media, including those which are random access (e.g., RAM), require periodic refreshing (e.g., DRAM), degrade over time (e.g., EEPROM, FLASH, disk media), or store data for only short periods of time and/or only in the presence of power, with the only limitation being that the term "computer-readable media" does not encompass a transitory electronic signal.
- The technological teachings are not limited to the camera device exemplified in FIG. 2 , but may be utilized in any device configured for capturing and/or processing images, wherein information about the PSF of the image capture device can be obtained.
- Other devices upon which the present technology can be implemented include, but are not limited to: still cameras, video cameras, combination still and video cameras, camera equipped cell phones, camera equipped laptops/notebooks, scanners, security cameras, and applications for performing 2D to 3D image conversions.
- FIG. 3A depicts a condition 50 in which subject 52 is in focus, such that the captured image is the sharpest, as represented by the sharp contrast curve 54 , which is also referred to as the “edge profile” of the step edge.
- The calibration target, or subject, preferably provides a mechanism for simply determining the sharpness of focus based on contrast.
- A clear step-edge delineation is made between at least two colors, shades, or luminances, so that the sharpness of focus can be readily determined from the sharpness of the contrast profile.
- The target can be configured in any of a number of different ways, in a manner similar to the use of different chroma keys and color bar patterns in testing different aspects of video capture and output.
- FIG. 4A and FIG. 4B compare the point spread functions (PSFs) for a specific digital camera at two defocus positions, exemplified here as 5 DOF in front of the focus plane in FIG. 4A , and 5 DOF behind the focus plane in FIG. 4B .
- The PSF is the spatial-domain counterpart of the transfer function of the imaging system.
- FIG. 5 depicts an example plot of an observed signal for a step edge convolved with PSFs at 5 DOF in front of, and 5 DOF behind, the focus plane. It will be noted that the difference between the two signals is difficult to discern, making it problematic to directly distinguish which one was captured in front of the focus plane and which behind it.
- FIG. 6A and FIG. 6B depict magnitude plots of the Fourier transforms of the PSFs.
- In the frequency domain, the difference between the in-front and behind cases becomes more prominent.
- In FIG. 6A one can see a readily discernible distinction between the plots, while FIG. 6B depicts a magnified section of the plot in which the differences are even more readily apparent.
- FIG. 7A and FIG. 7B compare Fourier transforms of the PSFs at 5 DOF in-front of the focus plane in FIG. 7A , and 5 DOF behind the focus plane in FIG. 7B .
- Concentric circles are seen in these Fourier transform images. The circle locations differ between FIG. 7A , representing defocus of 5 DOF in front of the focus position, and FIG. 7B , representing defocus of 5 DOF behind the focus position.
- These concentric circles correspond to local minima and are at the same positions as the local minima seen in FIG. 6A and FIG. 6B .
- Let x denote the ideal image without defocus blur, f denote the PSF, and y denote the observed defocused image.
- In the frequency domain, ŷ = x̂·f̂, where ŷ, x̂ and f̂ are the Fourier transforms of y, x and f, and the convolution turns into a product.
- More generally, if the Fourier transforms of an image f(x,y) and a filter g(x,y) are F(u,v) and G(u,v), respectively, then in the Fourier domain the convolution operation becomes simple point-by-point multiplication, f(x,y)∗g(x,y) ↔ F(u,v)·G(u,v), which can be utilized for speeding up convolution calculations.
- Here u and v are the frequency coordinates. Accordingly, a null frequency or local minimum of f̂ will result in a local minimum of ŷ, regardless of the unknown ideal image x.
- The present technology utilizes these differences in the frequency domain to identify whether the defocus occurs in front of, or behind, the focus plane.
- Frequency pairs are first found where the difference between the frequency responses of the in-front and behind PSFs is large; these pairs are identified with the help of the Fourier transforms of the PSFs.
- The terms f F and f B are used to denote the defocus PSFs in front of and behind the focus plane, respectively; the terms f̂ F and f̂ B then denote the corresponding Fourier transforms of f F and f B .
- The magnitudes of the Fourier transform values are averaged at the same distance from the origin, yielding the following:
- N r is the number of pixels on the circle with radius r
- u and v are frequency coordinates.
- FIG. 8 depicts a magnitude plot of the Fourier transforms of the PSFs, showing four different frequency pairs marked with ellipses, the second of which illustrates r F2 and r B2 .
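As a rough sketch of how such frequency pairs might be located (a simplification: the patent pairs local minima r_Fi and r_Bi of the two PSF spectra, whereas this toy version merely ranks radii by magnitude difference; all function and variable names are illustrative):

```python
import numpy as np

def find_frequency_pairs(profile_front, profile_behind, count=4):
    """Return the radii at which the radially averaged Fourier magnitudes
    of the in-front and behind PSFs differ the most."""
    diff = np.abs(np.asarray(profile_front, dtype=float)
                  - np.asarray(profile_behind, dtype=float))
    # Sort radii by difference, largest first, and keep the top `count`.
    return list(np.argsort(diff)[::-1][:count])

# Toy radial magnitude profiles for the two PSFs:
front_profile = [1.0, 0.6, 0.1, 0.9, 0.5]
behind_profile = [1.0, 0.6, 0.9, 0.1, 0.5]
pairs = find_frequency_pairs(front_profile, behind_profile, count=2)
```

Here `pairs` picks out radii 2 and 3, where the two toy profiles disagree most.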
- This direction estimation feature is preferably determined as a ratio between radial components of the Fourier transforms. These components may be averaged over the angle; however, this is not necessary because the Fourier transforms are substantially symmetric, as seen in FIG. 8 . Alternatively, other "features," such as "frequency features," can be considered for describing the difference between the "in-front" and "behind" patterns in the frequency domain.
- The training images can be obtained either by convolving PSFs with ideal images, or directly from camera captures with known distance to focus.
- The direction estimation feature R i is then computed from each training image patch, and its sample average and standard deviation are utilized as estimates of the mean (μ) and standard deviation (σ) of the distribution of R i .
- The method obtains (μ Fi , σ Fi ) and (μ Bi , σ Bi ) so that the distributions of the direction estimation feature R i for in-front and behind focus can be described as:
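The parameter estimation step above can be sketched as follows (a minimal illustration; the patent specifies only that the sample average and standard deviation of R_i over training patches serve as the estimates of μ and σ, so the function name and data are hypothetical):

```python
import numpy as np

def fit_feature_distribution(feature_values):
    """Estimate (mu, sigma) for a direction estimation feature R_i from
    its values on training image patches, using the sample mean and
    sample standard deviation."""
    arr = np.asarray(feature_values, dtype=float)
    return float(arr.mean()), float(arr.std(ddof=1))

# e.g. feature R_i measured on three hypothetical in-front training patches:
mu_F, sigma_F = fit_feature_distribution([1.0, 1.2, 1.4])
```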
- FIG. 9 through FIG. 11B describe defocus direction estimation according to the present disclosure at different levels of particularity.
- The image is partitioned into blocks 90 , then each block is converted from a spatial image function into a frequency domain function 92 . This conversion is not limited to the Fourier transformation, as other forms of conversion can be utilized (e.g., the discrete cosine transform).
- A frequency difference feature 94 is then calculated to determine whether the image was captured at an in-front or behind position in relation to the focal point for the target in the image.
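The partition-and-convert steps can be sketched as below (a minimal illustration assuming a grayscale image held as a NumPy array; the block size of 64 and the use of a plain FFT magnitude are placeholder choices, not values from the patent):

```python
import numpy as np

def blocks_to_spectra(image, block=64):
    """Partition a grayscale image into square blocks and convert each
    block into a frequency domain function (its FFT magnitude)."""
    h, w = image.shape
    spectra = {}
    for top in range(0, h - block + 1, block):
        for left in range(0, w - block + 1, block):
            patch = image[top:top + block, left:left + block]
            # Magnitude spectrum of the block (DC term at index [0, 0]).
            spectra[(top, left)] = np.abs(np.fft.fft2(patch))
    return spectra

# A 128x128 image yields four 64x64 magnitude spectra.
spectra = blocks_to_spectra(np.ones((128, 128)))
```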
- The image, or video frame, is partitioned into blocks because frequency analysis cannot be performed at the pixel level.
- The present technology can be applied to blocks of various sizes, and to a single block or to all the blocks.
- The size of the block is important, because a block that is too small contains insufficient frequency information for determining defocus direction.
- Conversely, a block that is too large may contain multiple depths, complicating application of the present technology.
- The block size utilized in an autofocus camera application may encompass the size of the focus window.
- Block size selection is adjustable, and it can be automatically adjusted on the basis of a preliminary image analysis, such as of the extent of image detail. For example, if the image contains mainly flat areas, the block size can be enlarged; alternatively, if many fine details exist in the image, the window (block size) can be made smaller. Block size can also vary across the image, in response to the level of image detail.
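A heuristic along these lines might look as follows (purely illustrative: the patent does not specify a detail measure or thresholds, so the gradient-energy metric, the candidate sizes, and the cutoffs here are all assumptions):

```python
import numpy as np

def pick_block_size(image, sizes=(32, 64, 128), detail_thresh=1e-3):
    """Choose a larger block for mostly flat images and a smaller block
    for detail-rich images, using mean gradient energy as a crude
    measure of image detail."""
    gy, gx = np.gradient(np.asarray(image, dtype=float))
    detail = float(np.mean(gy * gy + gx * gx))
    if detail < detail_thresh:
        return sizes[-1]      # mainly flat areas: enlarge the window
    if detail < 10 * detail_thresh:
        return sizes[1]
    return sizes[0]           # many fine details: shrink the window

flat = np.zeros((64, 64))
busy = np.random.default_rng(0).random((64, 64))
```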
- The present technological teachings can be applied to any device involved with capturing images, or receiving images from an image capture element/device.
- The teachings are particularly well-suited for use on any device containing a camera (i.e., an image capture element/device), such as a still camera, video camera, cell phone containing a camera, laptop/notebook containing a camera, scanner, security camera, and so forth. It should be appreciated that in each case, information is required about the camera that captured the image (e.g., its PSF).
- FIG. 10 illustrates another example embodiment.
- The image (or frame) is partitioned into blocks 110 , within which frequency pairs in the PSF are found having the largest difference in transform magnitude, based on averaging captured image patches 112 .
- Although the Fourier transform is described, other frequency transform mechanisms can be utilized without departing from the teachings of the present technology.
- A direction estimation feature is extracted 114 , and a probability analysis is performed which determines 116 the probability of an in-front or behind focus position based on the learned distribution of the direction estimation feature.
- FIG. 11A and FIG. 11B illustrate this technology in detail in relation to the equations described in prior sections of the application.
- The image or frame is partitioned into blocks 130 .
- An average is taken of the magnitude of the Fourier transform performed on captured image portions (patches) 132 .
- Although the Fourier transform is described, other transform mechanisms can be utilized, as described in a prior section.
- Continuing in FIG. 11B , at least one pair of local minimum frequencies is found 134 which satisfies the given conditions.
- A direction estimation feature is extracted 136 , and the distribution of the direction estimation feature is estimated 138 in response to training images.
- A probability analysis is performed 140 on the in-front or behind focus position based on probabilities of the learned distribution of direction estimation features.
- Embodiments of the present technology may be described with reference to flowchart illustrations of methods and systems according to embodiments of the technology, and/or algorithms, formulae, or other computational depictions, which may also be implemented as computer program products.
- Each block or step of a flowchart, and combinations of blocks (and/or steps) in a flowchart, algorithm, formula, or computational depiction, can be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions embodied in computer-readable program code logic.
- Any such computer program instructions may be loaded onto a computer, including without limitation a general purpose computer or special purpose computer, or other programmable processing apparatus to produce a machine, such that the computer program instructions which execute on the computer or other programmable processing apparatus create means for implementing the functions specified in the block(s) of the flowchart(s).
- Blocks of the flowcharts, algorithms, formulae, or computational depictions support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, and computer program instructions, such as embodied in computer-readable program code logic means, for performing the specified functions. It will also be understood that each block of the flowchart illustrations, algorithms, formulae, or computational depictions and combinations thereof described herein, can be implemented by special purpose hardware-based computer systems which perform the specified functions or steps, or combinations of special purpose hardware and computer-readable program code logic means.
- These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the block(s) of the flowchart(s).
- The computer program instructions may also be loaded onto a computer or other programmable processing apparatus to cause a series of operational steps to be performed on the computer or other programmable processing apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable processing apparatus provide steps for implementing the functions specified in the block(s) of the flowchart(s), algorithm(s), formula(e), or computational depiction(s).
- An apparatus for determining defocus direction from an image comprising: a processor configured for processing an image which has been captured from an image capture element or device; programming executable on said processor for determining defocus direction of the image, said processing comprising: partitioning the image into blocks; converting a spatial image function of each said block of the image into a frequency domain function; and determining a frequency difference feature to indicate in-front or behind position of the image in relation to a correct focus position for that image.
- said apparatus comprises a camera device configured for still image capture, or for video image capture, or for a combination of still and video image capture.
- said apparatus comprises a device capable of capturing images selected from the group of electronic devices consisting of camera equipped cell phone, camera equipped laptop/notebook, scanner, security cameras.
- An apparatus for determining defocus direction from an image comprising: a processor configured for processing an image which has been captured from an image capture element or device; programming executable on said processor for determining defocus direction of the image, said processing comprising: partitioning the image into blocks; converting a spatial image function of each said block of the image into a frequency domain function; and determining a frequency difference feature to indicate in-front or behind position of the image in relation to a correct focus position for that image, performed in response to a statistical process estimating distribution of the difference feature on training images.
- said apparatus comprises a camera device configured for still image capture, or for video image capture, or for a combination of still and video image capture.
- said apparatus comprises a device capable of capturing images selected from the group of electronic devices consisting of camera equipped cell phone, camera equipped laptop/notebook, scanner, security cameras.
- A method of determining defocus direction from an image comprising: (a) partitioning an image into blocks within a device configured for video processing; (b) converting a spatial image function of each said block into a frequency domain function; and (c) determining a frequency difference feature to indicate in-front or behind position of the image, as a defocus direction, in relation to a correct focus position for at least one target within that image; (d) wherein said defocus direction of the image indicates whether the image was captured either in front of a focus plane for a target object, or behind the focus plane of that target object.
- said device configured for video processing comprises a device capable of capturing images selected from the group of electronic devices consisting of still cameras, video cameras, combination still and video cameras, camera equipped cell phones, camera equipped laptops/notebooks, scanners and security cameras.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
Description
y = x ∗ f  (1)
and in the frequency domain this is:
ŷ = x̂ · f̂  (2)
where ŷ, x̂ and f̂ are the Fourier transforms of y, x and f, and the convolution turns into a product. It should be appreciated that if the Fourier transforms of the image f(x,y) and the filter g(x,y) are F(u,v) and G(u,v), respectively, then in the Fourier domain the convolution operation becomes simple point-by-point multiplication, f(x,y)∗g(x,y) ↔ F(u,v)·G(u,v), and this can be utilized for speeding up convolution calculations. In the above, u and v are the frequency coordinates. Accordingly, a null frequency or local minimum of f̂ will result in a local minimum of ŷ, regardless of the unknown ideal image x. Thus, the present technology utilizes the differences in the frequency domain to identify whether the defocus occurs in front of, or behind, the focus plane.
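The convolution theorem invoked here is easy to verify numerically; the sketch below checks, for a small random image and a toy 2×2 box PSF, that circular convolution in the spatial domain matches point-by-point multiplication in the frequency domain, and that a null of f̂ forces a null of ŷ irrespective of x (all sizes and the PSF are illustrative, not taken from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
x = rng.random((n, n))                          # stand-in for the ideal image
f = np.zeros((n, n))
f[0, 0] = f[0, 1] = f[1, 0] = f[1, 1] = 0.25    # tiny 2x2 box PSF

# Direct circular convolution y = x (*) f in the spatial domain.
y = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        for k in range(n):
            for l in range(n):
                y[i, j] += x[k, l] * f[(i - k) % n, (j - l) % n]

# The spectrum of the blurred image equals the product of spectra.
lhs = np.fft.fft2(y)
rhs = np.fft.fft2(x) * np.fft.fft2(f)
print(np.allclose(lhs, rhs))           # True

# This 2x2 box PSF has nulls along the row u = n/2, so the observed
# image's spectrum inherits those nulls regardless of x.
print(np.allclose(lhs[n // 2, :], 0))  # True
```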
where N r is the number of pixels on the circle with radius r, while u and v are frequency coordinates. This averaging turns the two-dimensional functions f̂ F (u,v) and f̂ B (u,v) into one-dimensional functions
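This radial averaging step can be sketched as below (a minimal illustration assuming a centered magnitude spectrum; rounding each pixel's distance to the nearest integer radius is an assumption, since the patent does not state how the circles are discretized):

```python
import numpy as np

def radial_average(mag):
    """Average a centered 2-D magnitude spectrum over circles of constant
    radius, turning it into a 1-D function of r."""
    h, w = mag.shape
    cy, cx = h // 2, w // 2
    v, u = np.indices((h, w))
    r = np.rint(np.hypot(v - cy, u - cx)).astype(int)
    # Sum the magnitudes on each circle, then divide by N_r, the number
    # of pixels on the circle with radius r.
    sums = np.bincount(r.ravel(), weights=mag.ravel())
    counts = np.bincount(r.ravel())
    counts[counts == 0] = 1    # guard against any empty radius bin
    return sums / counts

# A flat spectrum averages to a flat radial profile.
profile = radial_average(np.ones((8, 8)))
```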
where a is a constant scalar which can be empirically obtained (for example, in these demonstrations a is set to 4), and I is the number of frequency pairs.
where D=1 represents in front of the focus plane, and D=0 represents behind the focus plane. For any input testing image, the direction estimation features {R i }, i=1, 2, . . . , I, can be determined, and the probabilities of D=1 and D=0 estimated as:
It is reasonable to assume prior probability p(D=1)=p(D=0)=0.5, since it is equally likely for defocus to occur in front or behind the focus plane.
If the estimated probability of D=1 exceeds that of D=0, then defocus is estimated to occur in front of the focus plane; otherwise defocus is considered to occur behind the focus plane.
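The decision rule described here amounts to comparing two Gaussian likelihoods under equal priors; a minimal sketch (the learned (μ, σ) values below are hypothetical, and since p(D=1)=p(D=0)=0.5 the priors cancel, leaving only the likelihood comparison):

```python
import math

def defocus_direction(features, front_params, behind_params):
    """Compare the log-likelihood of the observed direction estimation
    features {R_i} under the learned 'in front' and 'behind' Gaussian
    distributions.  Equal priors p(D=1) = p(D=0) = 0.5 cancel out."""
    def log_lik(params):
        return sum(-0.5 * ((r - mu) / sd) ** 2 - math.log(sd)
                   for r, (mu, sd) in zip(features, params))
    return "front" if log_lik(front_params) > log_lik(behind_params) else "behind"

# Hypothetical learned (mu, sigma) for two frequency pairs:
front_params = [(1.3, 0.10), (1.20, 0.15)]
behind_params = [(0.8, 0.10), (0.85, 0.15)]
print(defocus_direction([1.25, 1.10], front_params, behind_params))  # front
```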
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/269,990 US9214011B2 (en) | 2014-05-05 | 2014-05-05 | Camera defocus direction estimation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/269,990 US9214011B2 (en) | 2014-05-05 | 2014-05-05 | Camera defocus direction estimation |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150317770A1 US20150317770A1 (en) | 2015-11-05 |
US9214011B2 true US9214011B2 (en) | 2015-12-15 |
Family
ID=54355586
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/269,990 Active US9214011B2 (en) | 2014-05-05 | 2014-05-05 | Camera defocus direction estimation |
Country Status (1)
Country | Link |
---|---|
US (1) | US9214011B2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107545549B (en) * | 2017-07-21 | 2020-07-28 | 南京航空航天大学 | Method for estimating scattered focus point spread function based on one-dimensional spectrum curve |
- 2014-05-05 — US application US14/269,990 filed; issued as US9214011B2, status Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120243792A1 (en) | 2008-12-09 | 2012-09-27 | Mikhail Kostyukov | Detecting and Correcting Blur and Defocusing |
US20110188780A1 (en) * | 2010-02-04 | 2011-08-04 | Sony Corporation | 2D to 3D Image Conversion Based on Image Content |
WO2012161829A2 (en) | 2011-02-25 | 2012-11-29 | Board Of Regents, The University Of Texas System | Focus error estimation in images |
US20130329122A1 (en) * | 2011-02-25 | 2013-12-12 | Board Of Regents, The University Of Texas System | Focus error estimation in images |
US20130093850A1 (en) * | 2011-10-17 | 2013-04-18 | Novatek Microelectronics Corp. | Image processing apparatus and method thereof |
US20130100272A1 (en) * | 2011-10-25 | 2013-04-25 | Sanford-Burnham Medical Research Institute | Multifunction autofocus system and method for automated microscopy |
Non-Patent Citations (1)
Title |
---|
Burge and Geisler, "Optimal defocus estimation in individual natural images," Proceedings of the National Academy of Sciences, Oct. 4, 2011, vol. 108, No. 40, pp. 16849-16854. |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11699327B2 (en) | 2021-11-17 | 2023-07-11 | Lnw Gaming, Inc. | Gaming machine and method with persistent award modifier triggered and modified by appearance of a catalyst symbol |
US11721165B2 (en) | 2021-11-18 | 2023-08-08 | Lnw Gaming, Inc. | Gaming machine and method with symbol redistribution feature |
US11741788B2 (en) | 2021-11-24 | 2023-08-29 | Lnw Gaming, Inc. | Gaming machine and method with symbol conversion feature |
US11804104B2 (en) | 2021-12-03 | 2023-10-31 | Lnw Gaming, Inc. | Gaming machine and method with value-bearing symbol feature |
US12033472B2 (en) | 2021-12-14 | 2024-07-09 | Lnw Gaming, Inc. | Gaming machine and method with symbol array unlocking feature |
US11983983B2 (en) | 2022-01-20 | 2024-05-14 | Lnw Gaming, Inc. | Gaming machine and method with moving persistent symbols and win zone feature |
US11710370B1 (en) | 2022-01-26 | 2023-07-25 | Lnw Gaming, Inc. | Gaming machine and method with a symbol collection feature |
US12039833B2 (en) | 2022-01-26 | 2024-07-16 | Lnw Gaming, Inc. | Gaming machine and method with a symbol collection feature |
US11875645B2 (en) | 2022-02-02 | 2024-01-16 | Lnw Gaming, Inc. | Gaming systems and methods for dynamic award symbols |
US12027018B2 (en) | 2022-05-18 | 2024-07-02 | Lnw Gaming, Inc. | Gaming system and method with symbol catalyst feature |
Also Published As
Publication number | Publication date |
---|---|
US20150317770A1 (en) | 2015-11-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9214011B2 (en) | Camera defocus direction estimation |
EP3158532B1 (en) | Local adaptive histogram equalization | |
CN107409166B (en) | Automatic generation of panning shots | |
US9361680B2 (en) | Image processing apparatus, image processing method, and imaging apparatus | |
US9066002B2 (en) | System and method for utilizing enhanced scene detection in a depth estimation procedure | |
US8754963B2 (en) | Processing images having different focus | |
EP3783564A1 (en) | Image processing method, computer readable storage medium, and electronic device | |
US8891905B2 (en) | Boundary-based high resolution depth mapping | |
US7929804B2 (en) | System and method for tracking objects with a synthetic aperture | |
US20180276834A1 (en) | Methods and apparatus for controlling light field capture | |
CN108230333B (en) | Image processing method, image processing apparatus, computer program, storage medium, and electronic device | |
US8520953B2 (en) | Apparatus and method for extracting edges of image | |
JP5988068B2 (en) | System and method for performing depth estimation by utilizing an adaptive kernel | |
US9723197B2 (en) | Depth estimation from image defocus using multiple resolution Gaussian difference | |
EP3371741B1 (en) | Focus detection | |
Bailey et al. | Fast depth from defocus from focal stacks | |
KR20130061635A (en) | System and method for performing depth estimation utilizing defocused pillbox images | |
WO2020167314A1 (en) | Detection of projected infrared patterns using difference of gaussian and blob identification | |
US8564712B2 (en) | Blur difference estimation using multi-kernel convolution | |
US20160266348A1 (en) | Method for creating a camera capture effect from user space in a camera capture system | |
CN104754316A (en) | 3D imaging method and device and imaging system | |
KR20160024419A (en) | System and Method for identifying stereo-scopic camera in Depth-Image-Based Rendering | |
US20170345133A1 (en) | Image processing method and apparatus | |
CN112785651A (en) | Method and apparatus for determining relative pose parameters | |
US11842570B2 (en) | Image processing apparatus, image pickup apparatus, image processing method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WEI, JIANING;BERESTOV, ALEXANDER;SIGNING DATES FROM 20140429 TO 20140430;REEL/FRAME:033191/0331 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |