WO2009120718A1 - Methods, systems, and media for controlling depth of field in images - Google Patents

Methods, systems, and media for controlling depth of field in images

Info

Publication number
WO2009120718A1
Authority
WO
WIPO (PCT)
Prior art keywords
translating
image detector
image
integration period
detector
Prior art date
Application number
PCT/US2009/038140
Other languages
English (en)
Inventor
Shree K. Nayar
Hajime Nagahara
Sujit Kuthirummal
Changyin Zhou
Original Assignee
The Trustees Of Columbia University In The City Of New York
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Trustees Of Columbia University In The City Of New York
Publication of WO2009120718A1

Links

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/958 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N23/959 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/743 Bracketing, i.e. taking a series of images with varying exposure conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00 Still video cameras

Definitions

  • the disclosed subject matter relates to methods, systems, and media for controlling depth of field in images.
  • the depth of field (DOF) of an image is the range of scene depths that appear focused in the image.
  • DOF: depth of field
  • the DOF of an image can be increased by making the aperture of the camera smaller. However, this reduces the amount of light received by the detector, resulting in greater image noise (lower signal-to-noise ratio (SNR)).
  • SNR: signal-to-noise ratio
  • the aperture of the lens must be opened up to maintain the SNR, which causes the DOF to be reduced. This trade-off gets worse with increases in spatial resolution (decreases in pixel size).
  • Tilted DOFs in images can be realized for images taken of dominant scene planes from non-perpendicular angles (e.g., such as an image of the surface of a desk when taken from a non-perpendicular angle). While some cameras have an image detector that is tilted with respect to the lens, in many modern cameras (such as in very thin cameras), such physical tilting is impracticable or impossible.
  • an image detector is translated so that an image incident on the image detector changes focus during at least a portion of an integration period of the image detector. An image is then captured at the image detector during the integration period.
  • FIG. 1 is a diagram showing the physical translation of an image detector relative to a lens in accordance with some embodiments.
  • FIG. 2 is a diagram of a spinning refractive element being used to provide effective translation of an image detector in accordance with some embodiments.
  • FIG. 3 is a diagram of an example of the basic geometry of a camera in accordance with some embodiments.
  • FIG. 4 is a diagram of an example camera in accordance with some embodiments.
  • FIGS. 6(a) and (b) are diagrams of calculated integrated point spread functions in accordance with some embodiments.
  • FIG. 7 is a diagram of an integrated point spread function measured using a prototype camera in accordance with some embodiments.
  • FIGS. 8(a) and 8(b) are diagrams of a camera taking a picture of a scene having front, middle, and rear portions, and varying the points of focus of the camera during integration time in order to capture an image with discontinuous depths of field in accordance with some embodiments.
  • FIG. 9 is a diagram of scene and camera in which the dominant scene plane is inclined relative to the plane of the lens of the camera in accordance with some embodiments.
  • FIG. 10 is a diagram of hardware that can be used to implement some embodiments.
  • the position and/or orientation of an image detector in a camera can be physically and/or effectively varied prior to and/or during the integration time of the capture of an image.
  • the focal plane can be swept through a volume of a scene being captured causing all points within it to come into and go out of focus, while the image detector collects photons.
  • a scene 102 can be projected through a lens onto an image detector that is consecutively moving through positions 106, 108, 110, 112, and 114. At point 110, the scene is focused on the image detector and at other times the scene is defocused.
  • This varying of the position and/or orientation of a camera's image detector can enable the DOF of the camera to be controlled in various ways.
  • an extended depth of field can be effected in an image by moving an image detector with a global shutter (all pixels are exposed simultaneously and for the same duration) at a uniform (or nearly uniform) speed during image integration.
  • each scene point can be captured by the image detector under a continuous range of focus settings, including perfect focus.
  • the captured image can then be deconvolved with a single, known blur kernel to recover an image with significantly greater DOF.
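  • By way of illustration only (the disclosed subject matter does not mandate any particular deconvolution algorithm), a frequency-domain Wiener deconvolution such as the following Python sketch can recover the extended-DOF image from the captured image and an estimate of the integrated PSF; the image array, the kernel, and the snr parameter are assumptions made for this example.

```python
import numpy as np

def wiener_deconvolve(captured, ipsf, snr=100.0):
    """Recover an extended-DOF image by deconvolving the captured image with
    the (approximately depth-invariant) integrated PSF of the detector sweep."""
    # Embed the kernel in an image-sized array and shift its center to the origin
    # so that the FFT-based convolution model lines up with the captured image.
    kernel = np.zeros_like(captured, dtype=float)
    kh, kw = ipsf.shape
    kernel[:kh, :kw] = ipsf / ipsf.sum()
    kernel = np.roll(kernel, (-(kh // 2), -(kw // 2)), axis=(0, 1))

    H = np.fft.fft2(kernel)
    G = np.fft.fft2(captured.astype(float))
    # Wiener filter: invert the blur while damping frequencies the kernel suppressed.
    F = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr) * G
    return np.real(np.fft.ifft2(F))
```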
  • a discontinuous depth of field can be effected in an image by moving a camera's global-shutter image detector non-uniformly. In doing so, images that are focused for certain specified scene depths, but defocused for other (e.g., in-between) scene regions, can be captured.
  • For example, for a scene that includes a person in the foreground, a landscape in the background, and a dirty window in between the two, the person and the landscape can be captured in focus while the window region remains defocused.
  • a tilted depth of field can be effected in an image by uniformly translating an image detector with a rolling electronic shutter in which different rows or columns of the image detector are exposed at different time intervals.
  • a tilted image detector can be emulated without the need to physically tilt the image detector with respect to the lens.
  • an image can be captured with a tilted focal plane.
  • a non-planar image detector can be emulated. As a result, an image can be captured in which focus is maintained across a curved surface in the scene.
  • the focal plane of a camera can be swept through a large range of scene depths with a very small physical translation of the image detector. For instance, with a 12.5 mm focal length lens, the focal plane can be swept from a distance of 450 mm from the lens to infinity by physically translating the detector 360 microns. Because an image detector only weighs a few milligrams, a variety of micro-actuators (e.g., solenoids, piezoelectric stacks, ultrasonic transducers, DC motors, etc.) can be used to move an image detector over the required distance within a very short integration time (e.g., less than a millisecond if required). Suitable micro-actuators are already used in many consumer cameras for focus control, aperture control, and lens stabilization.
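  • As a quick check of the figures quoted above (not part of the original disclosure), the required translation follows directly from the Gaussian lens law, 1/u + 1/v = 1/f; the short Python sketch below reproduces the arithmetic for the assumed 12.5 mm lens and 450 mm near distance.

```python
# Detector positions that focus a 12.5 mm lens at 450 mm and at infinity.
f = 12.5        # focal length in mm
u_near = 450.0  # nearest scene depth in mm

v_near = 1.0 / (1.0 / f - 1.0 / u_near)  # detector distance focusing at 450 mm
v_far = f                                # detector distance focusing at infinity
print(round((v_near - v_far) * 1000))    # ~357 microns, i.e. roughly the 360 microns cited
```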
  • While physical translation of a camera's image detector can be used to control DOF, manipulating the focus setting on a camera can be used to provide the same effect as physically translating the camera's image detector in some embodiments. More particularly, when the focus setting is changed, the distance between the image detector and the focal plane of the camera is also changed. Therefore, by changing the focus setting during image integration, translation of the detector along the optical axis can be emulated. In some embodiments, this change in focus setting can be achieved by controlling the electronics already present in most cameras and/or lenses to realize auto-focus. For example, the motors that enable the lens to change focus setting during auto-focusing can be programmed so that the lens sweeps the focal plane through the scene during the integration time of a photograph.
  • In order to emulate translating the detector with uniform speed, the focal plane has to be swept through the scene at non-uniform speed due to the non-linearity of the thin lens law shown in equation 1 below.
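  • To make the non-uniformity concrete, the following sketch (an assumed 12.5 mm lens and an illustrative range of emulated detector positions) computes the focus distance the lens must be set to at each instant in order to mimic a detector translating at constant speed; the printed values show how unevenly the focus distance must change over the exposure.

```python
import numpy as np

f = 12.5                       # focal length in mm (assumed)
p0, p1 = 12.55, 12.857         # emulated detector positions over the exposure (mm, assumed)
t = np.linspace(0.0, 1.0, 6)   # normalized integration time

p = p0 + (p1 - p0) * t         # uniform-speed detector translation to emulate
u = 1.0 / (1.0 / f - 1.0 / p)  # scene distance in focus at each instant (thin lens law)
for ti, ui in zip(t, u):
    # u is strongly non-linear in t, so the focus motor cannot run at constant speed.
    print(f"t = {ti:.1f}  focus distance = {ui:.0f} mm")
```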
  • a spinning refractive element of non-uniform thickness positioned between an imaging lens and an image detector of a camera can be used to provide the same effect as physically translating the camera's image detector.
  • a refractive element 202 can be rotated about an axis 204 parallel to the lens' optical axis 206, making several complete rotations within the integration time of a photograph.
  • the refractive element can be synchronized with the integration timing using any suitable technique, such as by detecting markers on the refractive element using a suitable optical detector.
  • the location of q 214 depends on the thickness of the slab at the corresponding moment in time. In this way, spinning a refractive element with smoothly varying thickness sweeps the image of every scene point through a continuous range of distances along the optical axis. Thus, while the image detector is not physically translated, the effect is the same as physically translating the detector along the optical axis.
  • An advantage of using a spinning refractive element rather than physically translating an image detector is that the refractive element can be kept spinning in the same manner across multiple photographs (or frames of video) whereas physical movement of an image detector requires the detector to move alternately toward and away from the lens.
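  • The disclosure does not give a formula for the shift introduced by the slab; as a rough paraxial approximation, a plane-parallel plate of thickness t and refractive index n displaces the image along the optical axis by about t(1 - 1/n), so a smoothly varying thickness profile sweeps the effective detector position once per revolution. The Python snippet below uses a hypothetical sinusoidal thickness profile purely for illustration.

```python
import numpy as np

def focal_shift(thickness_mm, n=1.5):
    """Approximate paraxial longitudinal image shift caused by inserting a
    plane-parallel refractive slab of the given thickness and index n."""
    return thickness_mm * (1.0 - 1.0 / n)

angles = np.linspace(0.0, 2.0 * np.pi, 9)             # rotation angle of the element
thickness = 1.0 + 0.5 * np.sin(angles)                 # hypothetical thickness profile (mm)
print([round(focal_shift(t), 3) for t in thickness])   # shift swept smoothly over one turn
```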
  • effective translation of an image detector can be achieved by capturing multiple images of a scene at different focus settings relative to the image detector and then calculating a weighted average image from the multiple captured images.
  • the different focus settings of the scene relative to the image detector can be realized by physically translating a camera's image detector, manipulating the focus setting on a camera, using a spinning refractive element as described above, and/or using any other suitable technique.
  • the weights can be chosen to mimic changing the distance between the lens and the image detector at constant speed.
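  • As one way to pick such weights (a sketch under assumed focus settings, not a method prescribed by the disclosure), each captured image can be weighted by the stretch of detector travel it best represents, so that averaging the stack approximates a detector moved at constant speed:

```python
import numpy as np

def uniform_sweep_weights(focus_distances_mm, f_mm):
    """Weights for averaging a focal stack so the result mimics a detector
    translated at constant speed between the extreme focus settings."""
    # Convert each focus setting to its detector distance p = 1 / (1/f - 1/u).
    p = 1.0 / (1.0 / f_mm - 1.0 / np.asarray(focus_distances_mm, dtype=float))
    order = np.argsort(p)
    p_sorted = p[order]
    # Each image stands in for the interval of detector positions nearest to it.
    edges = np.concatenate(([p_sorted[0]],
                            0.5 * (p_sorted[1:] + p_sorted[:-1]),
                            [p_sorted[-1]]))
    widths = np.diff(edges)
    weights = np.empty_like(widths)
    weights[order] = widths / widths.sum()
    return weights

# Example: five focus settings (in mm) for an assumed 12.5 mm lens.
print(uniform_sweep_weights([450, 600, 900, 1800, 10000], f_mm=12.5))
```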
  • references to the translation of an image detector should be understood to include physical translation of the image detector and effective translation of the image detector (such as by manipulating the focus on the camera, using a refractive element, and/or calculating a weighted average of a set of images captured at different focus settings as described above) unless otherwise indicated.
  • characteristics of the translation, such as speed of translation, should be understood to include corresponding characteristics as applicable to the use of focus control, refractive elements, and the averaging of weighted images, such as the rate of change of focus.
  • In FIG. 3, an example of the basic geometry 300 of a camera in accordance with some embodiments is illustrated.
  • As shown, there is a scene point M 302, a camera aperture 304, a lens 306, a focal plane 308, and a translated image detector plane 310.
  • Focal plane 308 is at a distance v 312 from lens 306, lens 306 has a focal length f, and camera aperture 304 has a diameter a 314.
  • Scene point M 302 is imaged in perfect focus at m 316 if its distance u 318 from lens 306 satisfies the Gaussian lens law: 1/u + 1/v = 1/f (equation 1).
  • the distribution of light energy within the blur circle is referred to as the point spread function (PSF).
  • the PSF can be denoted as P(r, u, p), where r is the distance of an image point from the center of the blur circle.
  • FIG. 4 shows an example of a camera 400 in accordance with some embodiments.
  • camera 400 can include a lens 402, an image detector 404, and a micro-actuator 406.
  • Lens 402 can be any suitable lens in some embodiments.
  • Image detector 404 can be any suitable image detector in some embodiments.
  • detector 404 can be a 1/3" SONY CCD with 1024x768 pixels and having a global shutter that can be used to implement extended DOF and discontinuous DOF.
  • detector 404 can be a 1/2.5" Micron CMOS detector with 2592x1944 pixels and a rolling shutter that can be used to implement tilted and curved DOFs.
  • Micro-actuator 406 can be any suitable micro-actuator in some embodiments.
  • micro-actuator 406 can be a PHYSIK INSTRUMENTE M-111.1DG translation stage.
  • detector 404 can be mounted to micro-actuator 406 to enable translation of the detector in a translation direction 408 aligned with the optical axis of lens 402.
  • micro-actuator 406 can include a DC motor actuator that can translate detector 404 through a 15 mm range at a top speed of 2.7 mm/sec and can position it with an accuracy of 0.05 microns.
  • FIG. 5 shows a table 500 illustrating examples of image detector translations.
  • the detector can sweep very large depth ranges when moved by very small distances.
  • Using micro-actuators such as that illustrated above, such translations can be achieved within typical image integration times (a few milliseconds to a few seconds).
  • FIGS. 6(a) and 6(b) show examples of IPSFs for five scene points from 450 to
  • FIG. 7 is an example of an EDOF camera's IPSF measured for a 550 mm scene depth.
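  • For intuition (an illustrative numerical model rather than the measurement shown in the figures), the IPSF for a given scene depth can be approximated by integrating a pillbox defocus PSF over an assumed detector sweep; when the sweep covers each depth's in-focus position, the resulting IPSFs come out nearly identical across depths, which is what allows a single deconvolution kernel to be used.

```python
import numpy as np

def integrated_psf(u_mm, f_mm=12.5, aperture_mm=8.9, p0=12.5, p1=12.857,
                   radii=None, steps=200):
    """Numerically integrate the pillbox defocus PSF over a uniform detector
    sweep from p0 to p1 (all parameter values are illustrative assumptions)."""
    if radii is None:
        radii = np.linspace(0.0, 0.2, 400)       # radius on the image plane, in mm
    v = 1.0 / (1.0 / f_mm - 1.0 / u_mm)          # detector distance that focuses depth u
    ipsf = np.zeros_like(radii)
    for p in np.linspace(p0, p1, steps):
        b = max(0.5 * aperture_mm * abs(v - p) / v, 1e-6)  # blur-circle radius at position p
        ipsf += (radii <= b) / (np.pi * b ** 2)            # normalized pillbox contribution
    return ipsf / steps

# IPSFs for a few scene depths inside the swept range come out nearly the same.
for u in (450.0, 750.0, 2000.0):
    print(u, integrated_psf(u)[:3])
```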
  • a discontinuous depth of field can be effected in an image by moving a camera's global-shutter image detector non-uniformly.
  • micro-actuator 406 can be controlled to position image detector 404 at one position along translation path 408, stay there some portion of a camera's integration time, then move to another position along translation path 408 and stay there for the remainder of the integration time.
  • Any suitable combinations of movement can be used. For example, rather than stopping at two positions, in some embodiments, the micro-actuator can stop at any suitable number of positions.
  • If a large aperture is used and the motion of a camera's image detector is controlled such that it first focuses on the star and arrow for a part of the integration time (as represented by period 814 in FIG. 8(b)), and then moves quickly to another location during period 816 in FIG. 8(b) to focus on the backdrop for the remaining portion of the integration time (as represented by period 818 in FIG. 8(b)), an image with all of the star, arrow, and backdrop in focus, and the mesh eliminated, can be obtained. While this image may include some blurring, it can capture the high frequencies in two disconnected DOFs, the foreground and the background, but almost completely eliminates the wire mesh in between. In some embodiments, this can be achieved without any post-processing. As mentioned above, in some embodiments, this approach is not limited to two disconnected DOFs; by pausing the detector at several locations during image integration, more complex DOFs can be realized.
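  • A simple dwell schedule for such a two-position sweep might look like the following sketch; the positions, dwell times, and total integration time are illustrative values, not figures taken from the disclosure.

```python
integration_ms = 360.0

schedule = [
    # (detector position in mm from the lens, dwell time in ms)
    (12.84, 160.0),   # dwell focused on the foreground (star and arrow)
    (12.52, 160.0),   # dwell focused on the backdrop
]
move_time_ms = integration_ms - sum(dwell for _, dwell in schedule)  # brief transit in between

def detector_position(t_ms):
    """Piecewise detector position over the exposure: dwell, fast move, dwell."""
    if t_ms < schedule[0][1]:
        return schedule[0][0]
    if t_ms < schedule[0][1] + move_time_ms:
        frac = (t_ms - schedule[0][1]) / move_time_ms    # rapid transit, little exposure here
        return schedule[0][0] + frac * (schedule[1][0] - schedule[0][0])
    return schedule[1][0]

print([round(detector_position(t), 3) for t in (0, 100, 180, 250, 350)])
```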
  • a tilted depth of field can be effected in an image by uniformly translating an image detector with a rolling electronic shutter, without the need to physically tilt the image detector with respect to the lens.
  • a tilted image detector can be emulated. If this tilted detector makes an angle θ with the lens plane, a correspondingly tilted DOF is obtained, as given by equation 8.
  • FIG. 9 shows an example of a scene where the dominant scene plane - a table top 902 with a cup 904 and a block 906 - is inclined at an angle of 53 degrees with respect to a lens plane 908 of a camera 910.
  • a normal camera is unable to focus on the entire plane.
  • using a rolling-shutter detector (e.g., a 1/2.5" Micron CMOS sensor with a 70 msec exposure lag between the first and last rows of pixels in the sensor), a detector tilt of 2.6 degrees can be emulated and a desired DOF tilt of 53 degrees can be realized based on equation 8.
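  • As a back-of-envelope check (using assumed values: the 2.7 mm/sec top actuator speed mentioned earlier, the 70 msec rolling-shutter readout lag, and a sensor height of roughly 4.3 mm typical of the 1/2.5" format), the emulated detector tilt is the arctangent of the axial travel during readout divided by the sensor height, which lands close to the 2.6 degrees cited:

```python
import math

speed_mm_s = 2.7          # assumed detector translation speed along the optical axis
readout_lag_s = 0.070     # exposure lag between first and last rows (rolling shutter)
sensor_height_mm = 4.3    # approximate height of a 1/2.5" sensor

axial_travel = speed_mm_s * readout_lag_s              # first-row vs last-row displacement
detector_tilt_deg = math.degrees(math.atan(axial_travel / sensor_height_mm))
print(round(detector_tilt_deg, 1))                     # ~2.5 degrees
```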
  • a camera can include a series of distance measuring mechanisms (e.g., such as laser range finders) that can be used to determine the dominant scene plane or curved surface so that an appropriate translation of the detector can be emulated.
  • any suitable number of detectors, or one or more sweeping detectors, can be used.
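  • One way such range measurements could be turned into an emulated detector translation (a sketch, not a method prescribed by the disclosure) is to fit a plane to the measured 3D points and read off its inclination, which then determines the tilt or curve to emulate:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to 3D range measurements; returns the unit
    normal and the centroid of the points."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    return vt[-1], centroid

# Synthetic example: noisy samples of a table top whose depth grows with image row.
rng = np.random.default_rng(0)
xy = rng.uniform(-100.0, 100.0, size=(50, 2))
z = 500.0 + 1.3 * xy[:, 1] + rng.normal(0.0, 1.0, 50)
normal, _ = fit_plane(np.column_stack([xy, z]))
tilt_deg = np.degrees(np.arctan2(abs(normal[1]), abs(normal[2])))
print(round(tilt_deg, 1))   # ~52 degrees: the inclination of the dominant scene plane
```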
  • FIG. 10 illustrates an example of hardware 1000 that can be used in some embodiments.
  • hardware 1000 can include an image detector/shutter 1002, an analog to digital (A/D) converter 1004, a digital signal processor (DSP) 1006, a controller 1008, camera buttons/interface 1010, a digital to analog (D/A) converter 1012, a micro-actuator 1014, an auto-focus mechanism 1016, distance detector(s) 1018, and a memory/interface 1020.
  • Image detector/shutter 1002 can be any suitable image detector such as a 1/3" SONY CCD with 1024x768 pixels and having a global shutter, or a 1/2.5" Micron CMOS detector with 2592x1944 pixels and a rolling shutter.
  • A/D converter 1004 can be any suitable mechanism for interfacing image detector/shutter to DSP 1006, such as an analog to digital converter.
  • DSP 1006 can be any suitable device for processing (e.g., deconvolving) images received from image detector 1002, such as a digital signal processor, a microprocessor, a computer, a central processing unit, a programmable logic device, dedicated circuitry, etc.
  • Controller 1008 can be any suitable device for controlling the operation of the remainder of hardware 1000 (e.g., controlling the movement of micro- actuator 1014, the operation of the auto-focus mechanism 1016, the spinning of a refractive element (not shown), the capturing of images by detector/shutter 1002, etc.), such as a digital signal processor, a microprocessor, a computer, a central processing unit, a programmable logic device, dedicated circuitry, etc.
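  • For orientation only, a controller of this kind might sequence an extended-DOF capture roughly as in the class below; the actuator, shutter, and dsp objects are hypothetical stand-ins for the hardware interfaces described above, not APIs defined by the disclosure.

```python
class EDOFCaptureController:
    """Sketch of how a controller could coordinate the detector sweep with the
    exposure and hand the result to the DSP for deconvolution."""

    def __init__(self, actuator, shutter, dsp):
        self.actuator = actuator   # e.g., a translation-stage driver
        self.shutter = shutter     # exposes the detector and returns raw frames
        self.dsp = dsp             # deconvolves with the precomputed IPSF kernel

    def capture_extended_dof(self, p_start_mm, p_end_mm, integration_s):
        # Move to the start of the sweep, then expose while translating at an
        # approximately uniform speed so every depth passes through focus.
        self.actuator.move_to(p_start_mm)
        self.shutter.open()
        self.actuator.move_to(p_end_mm, duration_s=integration_s)
        self.shutter.close()
        raw = self.shutter.read_image()
        return self.dsp.deconvolve(raw)
```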
  • DSP 1006 and controller 1008 can be combined or further broken down into sub-processors/controllers.
  • Camera buttons/interface can be any suitable buttons or interface for receiving control input from users or remote devices.
  • D/A converter 1012 can be any suitable mechanism for interfacing controller 1008 to micro-actuator 1014, such as a digital to analog converter.
  • Micro-actuator 1014 can be any suitable device for physically translating image detector/shutter 1002, such as a PHYSIK INSTRUMENTE M-111.1DG translation stage.
  • Auto-focus mechanism 1016 can be any suitable hardware and/or software for controlling the operation of the focus in a camera for any suitable purpose, for example, to provide effective translation of the image detector.
  • Distance detector(s) 1018 can be any suitable mechanism for determining the distances to multiple points on a scene so that an angle of a scene plane or curve can be determined by controller 1008.
  • detector(s) 1018 can be laser range finders.
  • Memory/interface 1020 can be any suitable mechanism for storing images after processing (if any) by DSP 1006.
  • memory/interface 1020 can be non-volatile memory, a disk drive, an interface to an external device (such as a thumb drive, memory stick, a network server, or other storage or target devices for image transfers), a display (e.g., a display on a camera, computer, telephone, etc.), etc.
  • hardware 1000 can be implemented in any suitable device for capturing images and/or video, such as a portable camera, a video camera, a computer camera, a mobile telephone, a closed-circuit television camera, a security camera, an Internet Protocol camera, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to methods, systems, and media for controlling depth of field in images. In some embodiments, an image detector is translated such that an image incident on the image detector changes focus during at least a portion of an integration period of the image detector. An image is then captured at the image detector during the integration period.
PCT/US2009/038140 2008-03-24 2009-03-24 Methods, systems, and media for controlling depth of field in images WO2009120718A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US3880708P 2008-03-24 2008-03-24
US61/038,807 2008-03-24
US5240008P 2008-05-12 2008-05-12
US61/052,400 2008-05-12

Publications (1)

Publication Number Publication Date
WO2009120718A1 true WO2009120718A1 (fr) 2009-10-01

Family

ID=41114312

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/038140 WO2009120718A1 (fr) 2008-03-24 2009-03-24 Methods, systems, and media for controlling depth of field in images

Country Status (1)

Country Link
WO (1) WO2009120718A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010131142A1 (fr) * 2009-05-12 2010-11-18 Koninklijke Philips Electronics N.V. Camera, system comprising a camera, method of operating a camera and method for deconvoluting a recorded image
US20110292275A1 (en) * 2009-12-07 2011-12-01 Takashi Kawamura Imaging apparatus and method of controlling the same
EP2390720A3 (fr) * 2010-05-27 2012-03-07 Samsung Electro-Mechanics Co., Ltd Camera module
EP2511747A1 (fr) * 2009-12-07 2012-10-17 Panasonic Corporation Imaging device and method
CN102804751A (zh) * 2011-01-31 2012-11-28 Panasonic Corporation Image restoration device, imaging device, and image restoration method
WO2013162747A1 (fr) * 2012-04-26 2013-10-31 The Trustees Of Columbia University In The City Of New York Systems, methods, and media for providing interactive refocusing in images
US8754975B2 (en) 2010-12-14 2014-06-17 Axis Ab Method and digital video camera for improving the image quality of images in a video image stream
WO2017144503A1 (fr) * 2016-02-22 2017-08-31 Koninklijke Philips N.V. Apparatus for generating a synthetic 2D image of an object with an enhanced depth of field
DE102017220101A1 (de) 2016-11-23 2018-05-24 Mitutoyo Corporation Inspection system using machine vision for obtaining an image with extended depth of field
US10178321B2 (en) 2013-11-27 2019-01-08 Mitutoyo Corporation Machine vision inspection system and method for obtaining an image with an extended depth of field
US10623627B2 (en) 2016-02-22 2020-04-14 Koninklijke Philips N.V. System for generating a synthetic 2D image with an enhanced depth of field of a biological sample

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6201899B1 (en) * 1998-10-09 2001-03-13 Sarnoff Corporation Method and apparatus for extended depth of field imaging
US6445415B1 (en) * 1996-01-09 2002-09-03 Kjell Olsson Increased depth of field for photography
US6873446B2 (en) * 2000-11-29 2005-03-29 Geoffrey Donald Owen Refractive optical deflector
US20060291844A1 (en) * 2005-06-24 2006-12-28 Nokia Corporation Adaptive optical plane formation with rolling shutter
US7336430B2 (en) * 2004-09-03 2008-02-26 Micron Technology, Inc. Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and a centrally obscured aperture

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6445415B1 (en) * 1996-01-09 2002-09-03 Kjell Olsson Increased depth of field for photography
US6201899B1 (en) * 1998-10-09 2001-03-13 Sarnoff Corporation Method and apparatus for extended depth of field imaging
US6873446B2 (en) * 2000-11-29 2005-03-29 Geoffrey Donald Owen Refractive optical deflector
US7336430B2 (en) * 2004-09-03 2008-02-26 Micron Technology, Inc. Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and a centrally obscured aperture
US20060291844A1 (en) * 2005-06-24 2006-12-28 Nokia Corporation Adaptive optical plane formation with rolling shutter

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAPPEN M.F. ET AL.: "Exploiting the Sparse Derivative Prior for Super-Resolution and Image Demosaicing", THIRD INTERNATIONAL WORKSHOP ON STATISTICAL AND COMPUTATIONAL THEORIES OF VISION AT ICCV 2003, 2003, Retrieved from the Internet <URL:www.stat.ucla.edu/~yuille/meetings/2003_workshop.php> *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010131142A1 (fr) * 2009-05-12 2010-11-18 Koninklijke Philips Electronics N.V. Camera, system comprising a camera, method of operating a camera and method for deconvoluting a recorded image
CN102422629B (zh) * 2009-05-12 2015-04-29 Koninklijke Philips Electronics N.V. Camera, system comprising a camera, method of operating a camera, and method for deconvoluting a recorded image
US8605202B2 (en) 2009-05-12 2013-12-10 Koninklijke Philips N.V. Motion of image sensor, lens and/or focal length to reduce motion blur
CN102422629A (zh) * 2009-05-12 2012-04-18 Koninklijke Philips Electronics N.V. Camera, system comprising a camera, method of operating a camera, and method for deconvoluting a recorded image
EP2511747A1 (fr) * 2009-12-07 2012-10-17 Panasonic Corporation Imaging device and method
EP2511747A4 (fr) * 2009-12-07 2014-11-05 Panasonic Corp Imaging device and method
US20110292275A1 (en) * 2009-12-07 2011-12-01 Takashi Kawamura Imaging apparatus and method of controlling the same
US8576326B2 (en) * 2009-12-07 2013-11-05 Panasonic Corporation Imaging apparatus and method of controlling the image depth of field
EP2390720A3 (fr) * 2010-05-27 2012-03-07 Samsung Electro-Mechanics Co., Ltd Camera module
US8754975B2 (en) 2010-12-14 2014-06-17 Axis Ab Method and digital video camera for improving the image quality of images in a video image stream
CN102804751A (zh) * 2011-01-31 2012-11-28 Panasonic Corporation Image restoration device, imaging device, and image restoration method
EP2672696A4 (fr) * 2011-01-31 2015-07-08 Panasonic Corp Image restoration device, imaging device, and image restoration method
CN102804751B (zh) * 2011-01-31 2016-08-03 Panasonic Corporation Image restoration device, imaging device, and image restoration method
WO2013162747A1 (fr) * 2012-04-26 2013-10-31 The Trustees Of Columbia University In The City Of New York Systems, methods, and media for providing interactive refocusing in images
US10582120B2 (en) 2012-04-26 2020-03-03 The Trustees Of Columbia University In The City Of New York Systems, methods, and media for providing interactive refocusing in images
US10178321B2 (en) 2013-11-27 2019-01-08 Mitutoyo Corporation Machine vision inspection system and method for obtaining an image with an extended depth of field
WO2017144503A1 (fr) * 2016-02-22 2017-08-31 Koninklijke Philips N.V. Apparatus for generating a synthetic 2D image of an object with an enhanced depth of field
US10623627B2 (en) 2016-02-22 2020-04-14 Koninklijke Philips N.V. System for generating a synthetic 2D image with an enhanced depth of field of a biological sample
DE102017220101A1 (de) 2016-11-23 2018-05-24 Mitutoyo Corporation Inspection system using machine vision for obtaining an image with extended depth of field

Similar Documents

Publication Publication Date Title
WO2009120718A1 (fr) Methods, systems, and media for controlling depth of field in images
Nagahara et al. Flexible depth of field photography
CN109155842B (zh) Stereo camera and stereo camera control method
CA2639527C (fr) Security camera system and beam steering method for altering a field of view
US7215882B2 (en) High-speed automatic focusing system
EP1466210B1 (fr) Digital camera equipped with a viewfinder designed to improve the depth of field of photographs
TW201126453A (en) Autofocus with confidence measure
US7907205B2 (en) Optical apparatus with unit for correcting blur of captured image caused by displacement of optical apparatus in optical-axis direction
JP4874668B2 (ja) Autofocus unit and camera
WO2006050430A2 (fr) Optical tracking system using a variable focal length lens
WO2006136894A1 (fr) Adaptive optical plane formation using a mechanical or electronic shutter
JP2007228005A (ja) Digital camera
JP5938281B2 (ja) Imaging apparatus, control method therefor, and program
CN109564376A (zh) Time-multiplexed programmable field-of-view imaging
US9100562B2 (en) Methods and apparatus for coordinated lens and sensor motion
JP2005321797A (ja) Image stabilization system and method
KR20100015320A (ko) Apparatus for providing stabilized images in a portable camera
US20110158617A1 (en) Device for providing stabilized images in a hand held camera
JP6128109B2 (ja) Imaging device, imaging direction control method, and program
KR20220058593A (ko) System and method for acquiring smart panoramic images
JP2014130131A (ja) Imaging device, semiconductor integrated circuit, and imaging method
US20100128164A1 (en) Imaging system with a dynamic optical low-pass filter
US8582016B2 (en) Photographing apparatus and focus detecting method using the same
JP2007228007A (ja) Digital camera
JP5656507B2 (ja) Imaging system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09724150

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09724150

Country of ref document: EP

Kind code of ref document: A1