WO2011070755A1 - Imaging device and control method thereof - Google Patents

Imaging device and control method thereof

Info

Publication number
WO2011070755A1
WO2011070755A1 (PCT/JP2010/007058)
Authority
WO
WIPO (PCT)
Prior art keywords
period
displacement
image
exposure
imaging
Prior art date
Application number
PCT/JP2010/007058
Other languages
English (en)
Japanese (ja)
Inventor
河村 岳
Original Assignee
Panasonic Corporation (パナソニック株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation (パナソニック株式会社)
Priority to US13/147,894 (US8576326B2)
Priority to EP10835682.5A (EP2512117B1)
Priority to JP2011545075A (JP5367094B2)
Priority to CN201080006737.4A (CN102308569B)
Publication of WO2011070755A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/38 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08 Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091 Digital circuits
    • G03B7/097 Digital circuits for control of both exposure time and aperture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/958 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N23/959 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

Definitions

  • the present invention relates to an imaging apparatus and a control method thereof, and more particularly to an imaging apparatus that changes a focus position during an exposure period.
  • Techniques that expand the depth of field are hereinafter referred to as EDOF (Extended Depth of Field). Three representative EDOF methods are described below.
  • In the first method, blur in the depth direction is made uniform by inserting an optical element called a phase plate into the optical system.
  • an image restoration process is performed on the obtained image using a blur pattern measured in advance or a blur pattern calculated by simulation. Thereby, the method generates an EDOF image.
  • This method is called Wavefront Coding (hereinafter referred to as WFC).
  • the second method measures the distance with high accuracy for each partial area of the image by devising the aperture shape.
  • image restoration processing is performed on each partial region using a blur pattern corresponding to each distance measured in advance. Thereby, the method generates an EDOF image.
  • This method is referred to as coded aperture (hereinafter referred to as CA) (see Non-Patent Document 2).
  • In the third method, the focus lens or the image sensor is moved during the exposure time so that images focused at each depth are superimposed (convolved), which is equivalent to making the blur uniform at each depth.
  • an image restoration process is performed on the obtained image using a blur pattern measured in advance or a blur pattern calculated by simulation. Thereby, the method generates an EDOF image.
  • This method is called “Flexible DOF” (hereinafter referred to as “F-DOF”) (Non-patent Document 3).
  • As other EDOF methods, there are a method of performing depth estimation or image sharpness detection using the axial chromatic aberration of the lens and generating an image that is sharp as a whole by image processing (Non-Patent Document 4), and a method of making the blur in the depth direction uniform by means of the lens and performing image restoration processing using a blur pattern measured in advance or calculated by simulation (Non-Patent Document 5).
  • However, these methods have the drawback that, compared with the above three methods, their EDOF effect is small in principle.
  • In addition, a method called Focal Stack has existed for a long time. In this method, a plurality of images with different in-focus positions (focus positions) are captured, and a region that appears to be in focus is extracted from each image. The extracted regions are then combined to generate an EDOF image.
  • However, since this method requires a large number of images, it has the problems that imaging takes a relatively long time and that a large amount of memory is consumed.
  • The second method, CA, increases the distance measurement accuracy by using an unusual aperture shape. Because of this very characteristic, specific frequency components of the captured image, and hence of the image obtained after restoration processing, are lost; that is, this method has the drawback of degraded image quality. In addition, since the amount of light is reduced compared with normal shooting regardless of the aperture shape, this method is generally not suitable for shooting in dark places.
  • the third F-DOF among the above-mentioned three methods is a method that can obtain the best image quality, and has a high EDOF effect.
  • the off-axis characteristics also depend on the lens characteristics themselves, so it is easy to improve performance.
  • However, an image-side telecentric lens must be used, because the same subject needs to be convolved at the same image position even while the focus position moves during exposure.
  • As methods of using F-DOF in microscope applications, Patent Documents 1 to 4 disclose cases where the sample side (the subject) is moved during exposure and cases where the lens barrel is moved. Note that if image restoration processing after exposure is assumed, it is reasonable to control the movement so that the image blur is always uniform; an image restoration method using a single blur pattern can then be applied.
  • For this purpose, when the object to be moved is the image sensor, it must be moved at a constant speed (Patent Document 5). When the focus lens is moved instead, a focus displacement corresponding to constant-speed movement of the imaging surface must be performed (Non-Patent Document 3). It is known that the movement pattern may run from the far-side in-focus end position to the near-side in-focus end position, or vice versa.
  • Recent practical examples of EDOF technology include cameras mounted on mobile phones.
  • the camera can be downsized.
  • the EDOF effect can provide an omnifocal image (an image in which all subjects are in focus) without having an autofocus mechanism.
  • In this application, F-DOF itself is not adopted because it requires a mechanism to move the focus lens or the image sensor; instead, methods using WFC or axial chromatic aberration are adopted.
  • FIG. 1 shows a configuration of an imaging apparatus 500 when the focus lens is displaced during the exposure period.
  • An imaging apparatus 500 shown in FIG. 1 includes an imaging element 1, a lens 2, a shutter 3, a focus lens displacement control unit 4, a shutter opening/closing instruction unit 5, a release receiving unit 6, a focus lens initial position detection unit 7, an exposure time determination unit 8, a focus lens position initialization unit 18, a synchronization management unit 10, an image restoration processing unit 11, a PSF storage unit 12, and an imaging data recording unit 13.
  • the lens 2 includes a focus lens 20 and other lens groups.
  • the focus lens initial position detection unit 7 detects the current position (initial position) of the focus lens 20. After the detection, the focus lens position initialization unit 18 displaces the position of the focus lens 20 to a predetermined end position, for example, the nearest end or the farthest end.
  • Here, within the predetermined focusing range, the in-focus position closest to the imaging apparatus 500 is referred to as the nearest end, and the in-focus position farthest from the imaging apparatus 500 is referred to as the farthest end.
  • the exposure time determination unit 8 determines shooting parameters such as a shutter speed and an aperture value.
  • the synchronization management unit 10 issues an exposure start instruction to the focus lens displacement control unit 4 and the shutter opening / closing instruction unit 5.
  • The instruction issued by the synchronization management unit 10 depends on the end position to which the focus lens 20 was initialized by the focus lens position initialization unit 18: if the end position is the farthest end, an instruction is issued to displace the focus lens 20 from the farthest end to the nearest end within the exposure period.
  • FIG. 3 is a diagram showing the initialization of the position of the focus lens 20 before exposure and the displacement of the focal position (image plane side distance) on the imaging element surface during exposure. It is assumed that a displacement control instruction is issued to the focus lens 20 so that the in-focus position is displaced at a constant speed on the image sensor surface. As shown in FIG. 4, when the subject-to-lens distance is u, the lens-to-image-sensor distance is v, and the focal length is f, the following relationship (Equation 1) generally holds from the lens formula.
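  • Equation 1 is the standard lens formula relating these quantities:

    \[ \frac{1}{u} + \frac{1}{v} = \frac{1}{f} \quad\Longleftrightarrow\quad v = \frac{u\,f}{u - f} \]

  • For example, with f = 18 mm, a subject at u = 1 m is imaged at v = uf/(u - f) ≈ 18.33 mm, while a subject at infinity is imaged at v = f = 18 mm.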
  • FIG. 5 shows the relationship between u and v when f is 18 [mm].
  • Here, the image plane side distance v is the distance between the lens principal point and the image sensor.
  • instructing the focus lens 20 to perform displacement control so that the in-focus position is displaced at a constant speed on the imaging element surface means that the changing speed of the image plane side distance v is constant.
  • Note that the subject distance u, which is the distance between the subject-side focal plane and the lens principal point, is not displaced at a constant speed.
  • The vertical axis in FIG. 3 is the image plane side distance v. Note that the relationship between exposure time and the subject distance u runs opposite in magnitude to the relationship between exposure time and the image plane side distance v; that is, the nearest and farthest ends on the subject-distance side are interchanged relative to the image-plane side.
  • FIG. 2 is a diagram showing a configuration of the image pickup apparatus 501 when the image pickup element is displaced during the exposure time.
  • The imaging apparatus 501 shown in FIG. 2 includes an imaging element 1, a shutter 3, a shutter opening/closing instruction unit 5, a release receiving unit 6, an exposure time determination unit 8, an image restoration processing unit 11, a PSF storage unit 12, an imaging data recording unit 13, an image sensor initial position detection unit 14, a synchronization management unit 16, an image sensor displacement control unit 17, and an image sensor position initialization unit 19. Elements similar to those in FIG. 1 are denoted by the same reference numerals, and redundant description is omitted.
  • the image sensor initial position detecting unit 14 detects the current position (initial position) of the image sensor 1. After the detection, the image sensor position initialization unit 19 displaces the position of the image sensor 1 to a predetermined end position, for example, the nearest end or the farthest end. Simultaneously with the initialization operation of the image sensor 1, the exposure time determination unit 8 determines shooting parameters such as a shutter speed and an aperture value. As soon as the above operation is completed, the synchronization management unit 16 issues an instruction to start exposure to the image sensor displacement control unit 17 and the shutter opening / closing instruction unit 5.
  • The instruction issued by the synchronization management unit 16 depends on the end position to which the image sensor 1 was initialized by the image sensor position initialization unit 19: if the end position is the farthest end, an instruction is issued to displace the image sensor 1 from the farthest end to the nearest end within the exposure time. Note that the displacement speed of the image sensor 1 is constant.
  • Patent documents cited: German Patent No. 2301800 (application filed January 15, 1973); Japanese Patent Publication No. 5-27084; Japanese Patent No. 3191928; US Patent Application Publication No. 2008/0013941; Japanese Patent No. 3084130.
  • the F-DOF method is dominant among various EDOF methods for digital still cameras and digital video cameras.
  • During moving image shooting, frames must be captured continuously without a time lag between them. It is therefore known that EDOF moving image shooting is possible by performing a reciprocating displacement as shown in FIG. 6 and allocating the forward path and the return path alternately, one video frame at a time.
  • However, since the displacement pattern of the image sensor or of the focus lens shown in FIG. 6 includes sharp turnarounds at the nearest end and the farthest end, its feasibility is low. Displacement control that requires such a large torque is not practical in portable digital still cameras and digital video cameras. Moreover, wear of the drive unit becomes severe, which is often unacceptable in terms of quality.
  • FIG. 7 schematically shows the depth of field at the time of normal photographing, that is, the focus range, and is called a through focus characteristic.
  • the vertical axis represents the sharpness (sharpness of the image) and is generally expressed in MTF (Modulation Transfer Function).
  • The horizontal axis is the image plane side distance; when the horizontal axis is reversed and rescaled according to the relationship shown in FIG. 5, it becomes the subject distance. In normal shooting without displacement during exposure, when a certain subject distance is in focus, that part is the sharpest, and sharpness is progressively lost at subject distances in front of and behind it as they move away from the in-focus position.
  • FIG. 8 schematically shows the through focus characteristic when displacement is performed at a constant speed during exposure as shown in FIG.
  • a dotted line represents a state in which the through focus characteristic at the time of normal photographing expressed by the schematic diagram of FIG. 7 is displaced from the farthest end to the nearest end.
  • a solid line indicates a through focus characteristic obtained as a result of the displacement.
  • Non-Patent Document 3 shows that this solid-line through-focus characteristic is obtained by integrating the dotted-line characteristics.
  • The conventional displacement method shown in FIG. 6 therefore also has the problem that the restored image quality at subject positions at the farthest end and the nearest end deteriorates compared with the restored image quality at the center.
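  • The flattening produced by the sweep, and the drop in sharpness at the two ends of the range, can be reproduced with a small numerical sketch. This is an illustrative model only: the Gaussian through-focus profile and every number in it are assumptions, not values taken from this description.

```python
import numpy as np

# Model the static through-focus sharpness of FIG. 7 as a Gaussian in
# image-plane distance, then average it over a constant-speed sweep of the
# focus position (the dotted/solid curves of FIG. 8).
v_axis = np.linspace(-100.0, 200.0, 3001)   # image-plane distance axis [um]
sweep = np.linspace(0.0, 100.0, 501)        # focus positions visited during exposure [um]

def static_sharpness(v, focus, width=8.0):
    """Through-focus sharpness of a single, non-swept exposure."""
    return np.exp(-0.5 * ((v - focus) / width) ** 2)

# A constant-speed sweep spends equal time at every focus position, so the
# swept characteristic is (proportional to) the integral, i.e. the mean, of
# the shifted static curves.
swept = np.mean([static_sharpness(v_axis, f) for f in sweep], axis=0)

centre = swept[np.argmin(np.abs(v_axis - 50.0))]   # middle of the swept range
end = swept[np.argmin(np.abs(v_axis - 0.0))]       # one end of the swept range
print(f"sharpness at centre: {centre:.3f}, at end: {end:.3f}")
# The end value is roughly half of the centre value: this is the end-of-range
# degradation that the displacement pattern proposed below is meant to suppress.
```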
  • the present invention solves the above-described conventional problems, and an object of the present invention is to provide an imaging apparatus and a control method thereof that can suppress the degradation of restored image quality at the farthest end and the nearest end in F-DOF.
  • In order to achieve the above object, an imaging apparatus according to one aspect of the present invention includes: an imaging element; a lens that performs image formation and light condensing on the imaging element; a displacement control unit that displaces the in-focus position on the subject side of the imaging apparatus by displacing the image plane side distance, which is the distance between the imaging element and the lens; an exposure time determination unit that determines an exposure time based on an imaging scene; and a displacement pattern determination unit that determines a displacement pattern of the image plane side distance by the displacement control unit within a frame period including an exposure period of the determined exposure time, such that the in-focus position moves from one end to the other end of a predetermined focusing range during the exposure period. The displacement pattern determination unit determines the displacement pattern such that the displacement speed of the image plane side distance increases from zero during an acceleration period included in the frame period, the image plane side distance is displaced at a constant speed during a constant speed period after the acceleration period, and the displacement speed of the image plane side distance decreases to zero during a deceleration period after the constant speed period, the length of each of the acceleration period and the deceleration period being 1/10 or more of the length of the frame period.
  • According to this configuration, the imaging apparatus does not need to sharply displace the image plane side distance between the exposure period of one frame period and the exposure period of the next frame period. Furthermore, the imaging apparatus according to this aspect of the present invention can bring the image sharpness at the farthest end and the nearest end (the one end and the other end) of the focusing range closer to the sharpness of the intermediate region. In this way, the imaging apparatus according to one aspect of the present invention can suppress degradation of the restored image quality at the farthest end and the nearest end in F-DOF. Accordingly, it can make the sharpness of the image uniform within the focusing range, and can therefore generate a high-quality EDOF image.
  • The length of each of the acceleration period and the deceleration period may be 1/4 or less of the length of the frame period.
  • the imaging apparatus can suppress a reduction in the restored image quality at the center.
  • The displacement pattern determination unit may determine the displacement pattern such that the displacement speed of the image plane side distance increases at a constant acceleration during the acceleration period and decreases at a constant deceleration during the deceleration period.
  • According to this, a mechanism that performs the displacement according to the displacement pattern can be realized easily in the imaging apparatus.
  • the frame period includes the exposure period, a first non-exposure period before the exposure period, and a second non-exposure period after the exposure period, and at least a part of the acceleration period is the It may be included in the first non-exposure period, and at least a part of the deceleration period may be included in the second non-exposure period.
  • the imaging apparatus can obtain the same EDOF restoration processing image quality within the in-focus range.
  • the displacement control unit may displace the image plane side distance by moving the position of the lens.
  • According to this, there is no need to sharply displace the focus lens between the exposure period of one frame period and the exposure period of the next frame period.
  • the displacement control unit may displace the image plane side distance by moving the position of the image sensor.
  • the in-focus position can be displaced by displacing the imaging element.
  • The imaging apparatus may further include a PSF storage unit that stores a restoration PSF (Point Spread Function) in advance, and an image restoration processing unit that performs restoration processing, using the restoration PSF, on the image data generated by the imaging element.
  • The present invention can be realized not only as such an imaging apparatus but also as a method of controlling an imaging apparatus that has the characteristic means included in the imaging apparatus as steps, and as a program that causes a computer to execute those characteristic steps. Needless to say, such a program can be distributed via a non-transitory computer-readable recording medium such as a CD-ROM or via a transmission medium such as the Internet.
  • the present invention can be realized as a semiconductor integrated circuit (LSI) that realizes part of the functions of such an imaging apparatus.
  • the present invention can provide an imaging apparatus and a control method thereof that can suppress degradation of the restored image quality at the farthest end and the nearest end in the F-DOF.
  • FIG. 1 is a block diagram of a conventional imaging apparatus.
  • FIG. 2 is a block diagram of a conventional imaging apparatus.
  • FIG. 3 is a diagram illustrating an example of a displacement pattern in a conventional imaging apparatus.
  • FIG. 4 is a diagram illustrating a positional relationship between the subject distance and the image plane side distance.
  • FIG. 5 is a graph showing an example of the relationship between the subject distance and the image plane side distance.
  • FIG. 6 is a diagram illustrating an example of a displacement pattern at the time of moving image shooting by a conventional imaging device.
  • FIG. 7 is a diagram showing through focus characteristics obtained with a general lens configuration.
  • FIG. 8 is a diagram showing a through focus characteristic obtained by a conventional F-DOF displacement.
  • FIG. 9 is a block diagram of the imaging apparatus according to Embodiment 1 of the present invention.
  • FIG. 10 is a block diagram of the imaging apparatus according to Embodiment 1 of the present invention.
  • FIG. 11 is a diagram showing an example of a displacement pattern according to Embodiment 1 of the present invention.
  • FIG. 12 is a diagram showing an example of the displacement speed according to Embodiment 1 of the present invention.
  • FIG. 13 is a diagram showing the through focus characteristic according to Embodiment 1 of the present invention.
  • FIG. 14A is a diagram showing a relationship between acceleration / deceleration time and required acceleration according to Embodiment 1 of the present invention.
  • FIG. 14B is a diagram showing a relationship between the acceleration / deceleration time and the displacement speed at the central portion according to Embodiment 1 of the present invention.
  • FIG. 15A is a diagram showing the MTF after the restoration process at the end of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 15B is a diagram showing the MTF after the restoration process at the end of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 15C is a diagram showing the MTF after the restoration process at the end of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 15D is a diagram showing the MTF after the restoration process at the end of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 16A is a diagram showing an MTF before restoration processing at the center of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 16B is a diagram showing an MTF before restoration processing at the center of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 16C is a diagram showing an MTF before restoration processing at the center of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 16D is a diagram showing an MTF before restoration processing at the center of the in-focus range according to Embodiment 1 of the present invention.
  • FIG. 17 is a diagram showing a variation of the displacement pattern according to Embodiment 1 of the present invention.
  • FIG. 18 is a flowchart of an imaging operation performed by the imaging apparatus according to Embodiment 1 of the present invention.
  • FIG. 19 is a block diagram of a modification of the imaging device according to Embodiment 1 of the present invention.
  • FIG. 20 is a block diagram of an imaging apparatus according to Embodiment 2 of the present invention.
  • FIG. 21 is a block diagram of a modification of the imaging apparatus according to Embodiment 2 of the present invention.
  • the imaging apparatus according to Embodiment 1 of the present invention decelerates or accelerates the displacement speed of the image plane side distance at the end of the focusing range. As a result, the imaging apparatus according to Embodiment 1 of the present invention does not need to acutely displace the image plane side distance at the end of the focusing range. Furthermore, the imaging apparatus according to Embodiment 1 of the present invention can bring the image sharpness at the end of the focusing range closer to the sharpness of the intermediate area. As described above, the imaging apparatus according to Embodiment 1 of the present invention can suppress the degradation of the restored image quality at the farthest end and the nearest end in the F-DOF.
  • FIG. 9 is a diagram showing a schematic configuration of the imaging apparatus 100 according to Embodiment 1 of the present invention.
  • The imaging apparatus 100 shown in FIG. 9 includes an exposure time determination unit 8, a displacement pattern determination unit 21, a displacement control unit 22, an imaging element 1, and a lens 2.
  • the lens 2 performs image formation and condensing on the image sensor 1.
  • the displacement control unit 22 displaces the in-focus position (subject distance u) on the subject side of the imaging apparatus 100 by displacing the relative distance (image plane side distance v) between the imaging element 1 and the lens 2.
  • the focus position on the subject side is also simply referred to as the focus position.
  • the exposure time determination unit 8 determines the exposure time based on the imaging scene.
  • FIG. 10 is a diagram illustrating a detailed configuration of the imaging apparatus 100 according to Embodiment 1 of the present invention.
  • As shown in FIG. 10, the imaging apparatus 100 includes an imaging element 1, a lens 2, a shutter 3, a focus lens displacement control unit 4A, a shutter opening/closing instruction unit 5, a release receiving unit 6, a focus lens initial position detection unit 7, an exposure time determination unit 8, a focus lens displacement pattern determination unit 9, a synchronization management unit 10A, an image restoration processing unit 11, a PSF storage unit 12, an imaging data recording unit 13, and a focus lens position initialization unit 18.
  • the focus lens displacement control unit 4A is a specific example of the displacement control unit 22 shown in FIG.
  • The focus lens displacement control unit 4A displaces the in-focus position by moving the position of the focus lens 20.
  • focus lens displacement pattern determination unit 9 is a specific example of the displacement pattern determination unit 21 shown in FIG.
  • the shutter 3 physically starts and ends exposure to the image sensor 1.
  • the shutter opening / closing instruction unit 5 instructs opening / closing of the shutter 3.
  • the lens 2 is composed of a focus lens 20 and other lens groups in order to focus on a desired subject.
  • the focus lens 20 may be composed of a plurality of lenses.
  • the focus lens 20 can be displaced relative to other lens groups, and the in-focus position is displaced by displacing the relative position.
  • the release accepting unit 6 accepts an exposure start instruction (instruction to release the shutter) from the user.
  • the focus lens initial position detection unit 7 detects the current position (initial position) of the focus lens 20 when the release reception unit 6 receives an exposure start instruction from the user.
  • the focus lens position initialization unit 18 initializes the position of the focus lens 20 based on the initial position after the initial position is detected by the focus lens initial position detection unit 7. Specifically, the focus lens position initialization unit 18 displaces the position of the focus lens 20 to a predetermined end position, for example, the nearest end or the farthest end.
  • the distance closest to the imaging apparatus 100 with respect to the imaging apparatus 100 in the predetermined focusing range is set as the nearest end
  • the distance farthest from the imaging apparatus 100 is set as the farthest end.
  • the exposure time determination unit 8 immediately determines the exposure time based on the imaging scene when the release reception unit 6 receives an instruction to release the shutter. Further, simultaneously with the initialization operation of the focus lens 20, the exposure time determination unit 8 determines shooting parameters such as a shutter speed and an aperture value.
  • After the initialization of the focus lens position by the focus lens position initialization unit 18 is completed, the focus lens displacement pattern determination unit 9 determines a displacement pattern of the in-focus position such that the in-focus position moves across the focusing range during the exposure period of the exposure time determined by the exposure time determination unit 8. For example, the focus lens displacement pattern determination unit 9 determines a displacement pattern as shown in FIG. 11.
  • the frame period is one video frame period in moving image shooting or the shooting time of one still image based on the required burst rate during continuous shooting.
  • the focus lens displacement pattern determination unit 9 determines the displacement pattern of the focus lens 20 according to this displacement pattern, and notifies the synchronization management unit 10A of the displacement pattern. Then, the synchronization management unit 10A performs synchronization management at the start and end of exposure for the focus lens displacement control unit 4A and the shutter opening / closing instruction unit 5 based on the displacement pattern.
  • the synchronization management unit 10A detects that the displacement pattern of the focus lens 20 is determined by the focus lens displacement pattern determination unit 9 and the exposure time is determined by the exposure time determination unit 8, the synchronization management unit 10A immediately Then, an exposure start instruction is issued to the focus lens displacement control unit 4A and the shutter opening / closing instruction unit 5.
  • the shutter opening / closing instruction unit 5 performs control so that the shutter 3 opens as soon as an exposure start instruction is issued. After the exposure time has elapsed, the synchronization management unit 10A instructs the focus lens displacement control unit 4A to end the displacement of the focus lens 20, and simultaneously instructs the shutter opening / closing instruction unit 5 to end the exposure. The shutter opening / closing instruction unit 5 performs control so that the shutter 3 is closed as soon as an exposure end instruction is issued.
  • The formed optical image is converted by the imaging element 1 into an image signal, which is an electrical signal, and the converted image signal is sent to the image restoration processing unit 11.
  • At the same time, the synchronization management unit 10A notifies the image restoration processing unit 11 that the exposure has been completed and that imaging with focus displacement by F-DOF has been performed.
  • the PSF storage unit 12 stores in advance a restoration PSF (Point Spread Function) for restoring an image signal.
  • the image restoration processing unit 11 After receiving the image signal, the image restoration processing unit 11 reads the restoration PSF stored in the PSF storage unit 12 and performs image restoration processing on the image signal using the restoration PSF. Specifically, the PSF storage unit 12 holds a blur pattern due to focus displacement, which is measured in advance or obtained by simulation, as a restoration PSF. Various methods such as Wiener Filter and Lucy-Richardson are known as image restoration methods, but any method may be used.
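  • As an illustration of this restoration step, the following is a minimal sketch of Wiener-filter restoration with a stored PSF. It is one way of realizing the kind of method named above, not the specific implementation of the image restoration processing unit 11; the array sizes, the noise parameter k, and the synthetic input are assumptions.

```python
import numpy as np

def wiener_restore(image: np.ndarray, psf: np.ndarray, k: float = 1e-2) -> np.ndarray:
    """Frequency-domain Wiener restoration of `image` with blur kernel `psf`."""
    # Zero-pad the PSF to the image size and roll it so that the kernel peak
    # sits at the origin of the FFT grid (circular-convolution convention).
    psf_padded = np.zeros_like(image, dtype=float)
    ph, pw = psf.shape
    psf_padded[:ph, :pw] = psf / psf.sum()
    psf_padded = np.roll(psf_padded, (-(ph // 2), -(pw // 2)), axis=(0, 1))

    H = np.fft.fft2(psf_padded)    # transfer function of the blur
    G = np.fft.fft2(image)         # spectrum of the blurred image
    # Wiener filter: conj(H) / (|H|^2 + k) approximates the inverse filter
    # while limiting noise amplification where |H| is small.
    F_hat = np.conj(H) * G / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F_hat))

# Usage with synthetic stand-ins for a captured frame and a stored PSF.
rng = np.random.default_rng(0)
captured = rng.random((128, 128))                     # stand-in for the image signal
stored_psf = np.outer(np.hanning(9), np.hanning(9))   # hypothetical restoration PSF
restored = wiener_restore(captured, stored_psf)
```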
  • the imaging data recording unit 13 records the restored image signal as imaging data.
  • FIG. 11 is a diagram showing an example of a displacement pattern of the image plane side distance v according to Embodiment 1 of the present invention.
  • FIG. 12 is a diagram showing the displacement speed of the image plane side distance v in this case.
  • The time taken for the displacement from the farthest end to the nearest end is the same as the exposure time of one video frame period (one frame period).
  • one frame period includes an acceleration period, a subsequent constant speed period, and a subsequent deceleration period.
  • In practice, the focus lens displacement pattern determination unit 9 determines the displacement pattern of the focus lens 20 so as to realize this displacement pattern of the image plane side distance v.
  • control may be performed so as to shift from the nearest end to the farthest end during the exposure period of the next video frame.
  • the moving image shooting method shown here may be used.
  • the displacement pattern within the exposure period of one video frame shown here is an example, and other displacement patterns may be used.
  • The displacement pattern is not limited to the one-way pattern shown here; an equivalent effect is obtained with a pattern that makes an integer number of round trips plus a one-way pass within the exposure period of one video frame, for example one and a half or two and a half round trips.
  • the displacement speed of the image plane side distance v increases with a constant acceleration during the acceleration period. However, this acceleration may not be constant. Similarly, in the above description, the displacement speed of the image plane side distance v decreases with a constant deceleration during the deceleration period, but this deceleration may not be constant. However, since it is easy to realize the displacement pattern, it is preferable to use constant acceleration and deceleration.
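  • A minimal sketch of this constant-acceleration pattern of FIG. 11 and FIG. 12 is given below; the sampling, the 100 µm displacement range, and the T/4 ramp time are example values chosen for illustration, not prescriptions.

```python
import numpy as np

def trapezoid_position(T: float, t_ramp: float, Sd: float, n: int = 1000):
    """Image-plane-side displacement (0..Sd) sampled over one frame period T.

    Speed ramps up from zero for t_ramp, stays constant, then ramps back to zero.
    """
    v_const = Sd / (T - t_ramp)   # constant-speed value of the image plane side distance
    a = v_const / t_ramp          # constant acceleration (and deceleration)
    tt = np.linspace(0.0, T, n)
    pos = np.where(
        tt < t_ramp,
        0.5 * a * tt**2,                                    # acceleration period
        np.where(
            tt < T - t_ramp,
            0.5 * a * t_ramp**2 + v_const * (tt - t_ramp),  # constant speed period
            Sd - 0.5 * a * (T - tt)**2,                     # deceleration period
        ),
    )
    return tt, pos

tt, pos = trapezoid_position(T=1 / 60, t_ramp=(1 / 60) / 4, Sd=100e-6)
print(f"total image-plane displacement: {pos[-1] * 1e6:.1f} um")   # ~100 um, the full range
```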
  • The acceleration/deceleration time t, which is the length of each of the acceleration period and the deceleration period, is preferably 1/10 or more and 1/4 or less of the one frame time T, which is the length of one frame period. The reason is described below.
  • FIG. 13 is a diagram showing through focus characteristics when the displacement pattern according to Embodiment 1 of the present invention is employed. As shown in FIG. 13, the staying time at the end of the focus range increases as compared to the center, so that the sharpness at the end of the focus range increases from the center. As a result, uniform sharpness can be obtained in a desired focusing range.
  • FIG. 14A is a graph showing the relationship between the acceleration / deceleration time t and the required acceleration a when the displacement distance on the image plane side is 100 ⁇ m and one frame time T is 1/60 second.
  • FIG. 14B is a graph showing the relationship between the acceleration / deceleration time t and the displacement speed v at the center under the same conditions.
  • Here, the image-plane-side displacement distance Sd, the one frame time T, the acceleration/deceleration time t, the acceleration a required at the ends of the focusing range, and the displacement speed v at the center satisfy the following relationship.
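  • Assuming the constant-acceleration profile of FIG. 12 (the speed ramps up for a time t, stays constant for T - 2t, then ramps down for a time t), these quantities are related by

    \[ S_d = \tfrac{1}{2}a t^2 + a t (T - 2t) + \tfrac{1}{2}a t^2 = a\,t\,(T - t), \qquad v = a\,t = \frac{S_d}{T - t}, \qquad a = \frac{S_d}{t\,(T - t)} \]

  • Shortening t therefore makes the required acceleration a grow roughly as 1/t, while lengthening t raises the central displacement speed v; this is the trade-off plotted in FIG. 14A and FIG. 14B.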
  • In FIG. 14A and FIG. 14B, the acceleration/deceleration time t is plotted on the horizontal axis; the required acceleration a is plotted on the vertical axis of FIG. 14A, and the displacement speed v at the center on the vertical axis of FIG. 14B.
  • As shown in FIG. 14A, as the acceleration/deceleration time t is shortened, the acceleration (deceleration) a required at the ends of the focusing range increases rapidly. That is, if the acceleration/deceleration time t is made too short, a very large acceleration is needed, which is difficult to realize.
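  • The trend of FIG. 14A and FIG. 14B can be checked numerically from the relations above under the stated conditions (Sd = 100 µm, T = 1/60 s); the specific values of t below are examples, and the printed numbers follow from the assumed trapezoidal profile rather than being read off the figures.

```python
Sd = 100e-6      # image-plane-side displacement distance [m]
T = 1.0 / 60.0   # one frame time [s]

for ratio in (0.05, 0.10, 0.25, 0.40):
    t = ratio * T
    a = Sd / (t * (T - t))   # acceleration required at the ends of the focusing range [m/s^2]
    v = Sd / (T - t)         # displacement speed in the constant-speed (central) part [m/s]
    print(f"t = {ratio:.2f} T : a = {a:7.2f} m/s^2, v = {v * 1e3:5.2f} mm/s")

# Shortening t below about T/10 makes the required acceleration grow rapidly
# (FIG. 14A), while lengthening t raises the central displacement speed and
# hence reduces the dwell per focus position at the centre (FIG. 14B).
```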
  • FIG. 15A to FIG. 15D and FIG. 16A to FIG. 16D show experimental data that evaluate these phenomena quantitatively.
  • The experiment was conducted with the focal length set to 4.8 mm, the F value to 2.8, the in-focus position to infinity, and the wavelength to 550 nm, and with the in-focus subject distance displaced from infinity to 0.48 m as the focusing range. In this case, the displacement on the image plane side is 4.75 mm to 4.85 mm according to the lens formula.
  • FIG. 15A to FIG. 15D show, for the nearest end at 0.48 m, the MTF before the restoration process (the MTF of the image obtained by simply performing the displacement) and the MTF after the restoration process, where the restoration PSF adopted is the PSF obtained for the displacement assuming a subject at infinity. The panels correspond to different acceleration/deceleration times t, including the case of t = 0.4T.
  • It can be seen that providing the acceleration period and the deceleration period improves the sharpness at the end of the range, and that the sharpness at the end keeps improving as the acceleration/deceleration time t is lengthened up to a certain point; beyond that, however, lengthening t further changes the sharpness at the end only slightly (the improvement effect diminishes).
  • FIG. 16A to FIG. 16D show the MTF characteristics before the restoration process at the infinity position (that is, the EDOF central portion).
  • Comparing these figures, the spatial frequency that can be resolved at the center decreases as the acceleration/deceleration time t is lengthened, from about 160 lp/mm to about 145 lp/mm, and down to about 130 lp/mm in the case of t = 0.4T; that is, lengthening t degrades the sharpness at the center before restoration. For this reason, it is preferable to limit the acceleration/deceleration time t to between 0.1 and 0.25 of the one frame time (exposure time) T.
  • By making the acceleration/deceleration time t 1/10 or more of the one frame time T, the restored image quality at the farthest end and the nearest end can be improved. Furthermore, by making the acceleration/deceleration time t 1/4 or less of the one frame time T, deterioration of the restored image quality at the center can be suppressed.
  • the length of the acceleration period and the length of the deceleration period may be the same or different. In other words, the acceleration during the acceleration period and the deceleration during the deceleration period may be different.
  • an exposure period and a non-exposure period may be included in one frame period.
  • That is, there are cases where the exposure time of one video frame (or of one still image during continuous shooting) determined by the exposure time determination unit 8 is shorter than one frame period, which is set by the video frame rate (1/30 or 1/60 second) or by the shooting time of one still image based on the required burst rate. In such a case, the in-focus position may be displaced over a range larger by a predetermined amount than the actually required displacement amount (the distance between the farthest end and the nearest end), while the displacement over the required displacement amount (between the farthest end and the nearest end) is performed at a constant speed during the exposure period. In other words, instead of dwelling at the ends where the speed becomes zero as described above, control is performed so that the acceleration/deceleration time falls within the non-exposure time: the acceleration period is included in the non-exposure period that precedes the exposure period within one frame period, and the deceleration period is included in the non-exposure period that follows it.
  • a part of the acceleration period may be included in the non-exposure period, and the other part of the acceleration period may be included in the exposure period.
  • a part of the deceleration period may be included in the non-exposure period, and the other part of the deceleration period may be included in the exposure period.
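  • A minimal sketch of this arrangement is given below. The function name, the symmetric placement of any idle margin, and the example numbers are illustrative assumptions; as stated above, it is also acceptable for only part of each ramp to fall in the non-exposure time.

```python
def frame_schedule(T: float, exposure: float, t_ramp: float):
    """Place the ramps in the non-exposure time so that the exposure itself
    sees only the constant-speed displacement.

    Returns (accel_start, exposure_start, exposure_end, decel_end) in seconds,
    measured from the start of the frame period.
    """
    assert exposure + 2 * t_ramp <= T, "ramps must fit into the non-exposure time"
    slack = (T - exposure - 2 * t_ramp) / 2    # idle margin split evenly around the sweep
    accel_start = slack
    exposure_start = accel_start + t_ramp      # constant-speed displacement begins here
    exposure_end = exposure_start + exposure
    decel_end = exposure_end + t_ramp
    return accel_start, exposure_start, exposure_end, decel_end

# Example: 1/60 s frame, 1/120 s exposure, ramps of T/8 each.
print(frame_schedule(T=1 / 60, exposure=1 / 120, t_ramp=(1 / 60) / 8))
```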
  • FIG. 18 is a flowchart of the imaging operation by the imaging apparatus 100.
  • the exposure time determination unit 8 determines the exposure time based on the imaging scene (S101).
  • Next, the focus lens displacement pattern determination unit 9 determines the displacement pattern of the image plane side distance within the frame period including the exposure period, such that the in-focus position moves from one end of the predetermined focusing range to the other end during the exposure period of the exposure time determined in step S101 (S102).
  • the focus lens displacement pattern determining unit 9 determines the displacement pattern of the focus lens 20 according to the displacement pattern, and notifies the synchronization management unit 10A of the displacement pattern. Then, the synchronization management unit 10A performs synchronization management at the start and end of exposure for the focus lens displacement control unit 4A and the shutter opening / closing instruction unit 5 based on the displacement pattern. Thereby, the focus lens displacement control unit 4A displaces the image plane side distance within one frame period as in the displacement pattern determined in step S102 (S103).
  • the imaging apparatus 100 according to Embodiment 1 of the present invention can realize moving image and continuous still image shooting using EDOF adopting F-DOF by performing such control. Furthermore, the imaging apparatus 100 according to Embodiment 1 of the present invention can also improve the image quality at the end of the in-focus range.
  • imaging apparatus 100 may not include all the processing units illustrated in FIG.
  • FIG. 19 is a block diagram of an imaging apparatus 101 that is a modification of the imaging apparatus 100 according to Embodiment 1. Its configuration and operation are substantially the same as those in FIG. 10. Compared with FIG. 10, the imaging apparatus 101 is characterized in that an image obtained by exposure is recorded directly in the imaging data recording unit 13. This is intended to allow the image restoration processing to be performed not in the imaging apparatus but in a subsequent apparatus such as a personal computer, an image viewer, or a network server. Thereby, compared with the configuration shown in FIG. 10, the imaging apparatus 101 can greatly reduce the shutter time lag while suppressing the amount of computation in the imaging apparatus.
  • In Embodiment 2 of the present invention, an example in which the in-focus position is displaced by moving the position of the imaging element 1 will be described.
  • the schematic configuration of the imaging apparatus 200 according to Embodiment 2 of the present invention is the same as that shown in FIG.
  • FIG. 20 is a block diagram of the imaging apparatus 200 according to Embodiment 2 of the present invention.
  • The imaging apparatus 200 differs from the configuration of the imaging apparatus 100 shown in FIG. 10 in that the focus lens initial position detection unit 7, the focus lens position initialization unit 18, the focus lens displacement pattern determination unit 9, the synchronization management unit 10A, and the focus lens displacement control unit 4A are replaced with an image sensor initial position detection unit 14, an image sensor position initialization unit 19, an image sensor displacement pattern determination unit 15, a synchronization management unit 16A, and an image sensor displacement control unit 17A. Elements similar to those in FIG. 10 are denoted by the same reference numerals, and redundant description is omitted.
  • the imaging device 200 may include the lens 2.
  • the lens 2 is fixed, and the imaging element 1 can be displaced in a relative position with respect to the lens 2.
  • the image sensor displacement control unit 17A is a specific example of the displacement control unit 22 shown in FIG.
  • The image sensor displacement control unit 17A displaces the in-focus position by moving the position of the image sensor 1.
  • the image sensor displacement pattern determination unit 15 is a specific example of the displacement pattern determination unit 21 shown in FIG.
  • the image sensor initial position detecting unit 14 detects the current position (initial position) of the image sensor 1.
  • the image sensor position initialization unit 19 initializes the position of the image sensor 1 based on the initial position after the initial position is detected by the image sensor initial position detection unit 14. Specifically, the image sensor position initialization unit 19 displaces the position of the image sensor 1 to a predetermined end position, for example, the nearest end or the farthest end.
  • the exposure time determination unit 8 immediately determines the exposure time based on the imaging scene when the release reception unit 6 receives an instruction to release the shutter. Further, simultaneously with the initialization operation of the image sensor 1, the exposure time determination unit 8 determines photographing parameters such as a shutter speed and an aperture value.
  • After the image sensor position initialization unit 19 completes the initialization of the image sensor position, the image sensor displacement pattern determination unit 15 uses the exposure time information determined by the exposure time determination unit 8 to determine a displacement pattern of the in-focus position, for example as shown in FIG. 11. Then, the image sensor displacement pattern determination unit 15 determines the displacement pattern of the image sensor in accordance with the displacement pattern of the in-focus position, and notifies the synchronization management unit 16A of the displacement pattern.
  • the synchronization management unit 16A performs synchronization management at the start and end of exposure for the image sensor displacement control unit 17A and the shutter opening / closing instruction unit 5.
  • the synchronization management unit 16A issues an instruction to start exposure to the image sensor displacement control unit 17A and the shutter opening / closing instruction unit 5.
  • the shutter opening / closing instruction unit 5 performs control so that the shutter 3 opens as soon as an exposure start instruction is issued. After a predetermined exposure time has elapsed, the synchronization management unit 16A instructs the shutter opening / closing instruction unit 5 to end the exposure.
  • the shutter opening / closing instruction unit 5 performs control so that the shutter 3 is closed as soon as an exposure end instruction is issued.
  • The formed optical image is converted by the image sensor 1 into an image signal, which is an electrical signal, and the converted image signal is sent to the image restoration processing unit 11.
  • the synchronization management unit 16A notifies the image restoration processing unit 11 that the exposure has been completed and that imaging of the imaging element displacement by F-DOF has been performed.
  • The rest of the configuration conforms to the case of focus lens displacement shown in FIG. 10.
  • the displacement pattern of the imaging element conforms to the pattern of FIG. 11 shown in the first embodiment.
  • The difference from Embodiment 1 is the configuration in which the imaging element 1 itself is displaced; the displacement pattern is common to both embodiments.
  • As in Embodiment 1, the acceleration/deceleration time t is preferably limited to between 0.1 and 0.25 of the one frame time T.
  • By performing such control, the imaging apparatus 200 according to Embodiment 2 of the present invention can realize moving image shooting and continuous still image shooting using EDOF that adopts F-DOF, as in Embodiment 1. Furthermore, the imaging apparatus 200 according to Embodiment 2 of the present invention can at the same time improve the image quality at the ends of the focusing range.
  • imaging apparatus 200 may not include all the processing units illustrated in FIG.
  • FIG. 21 is a block diagram of an imaging apparatus 201 that is a modification of the imaging apparatus 200 according to Embodiment 2 of the present invention. Its configuration and operation are substantially the same as those in FIG. 20.
  • Compared with FIG. 20, the imaging apparatus 201 is characterized in that an image obtained by exposure is recorded directly in the imaging data recording unit 13. This is intended to allow the image restoration processing to be performed not in the imaging apparatus but in a subsequent apparatus such as a personal computer, an image viewer, or a network server. Accordingly, compared with the configuration shown in FIG. 20, the imaging apparatus 201 can greatly reduce the shutter time lag while suppressing the amount of computation in the imaging apparatus.
  • the imaging device according to the embodiment of the present invention has been described above, but the present invention is not limited to this embodiment.
  • At least some of the processing units included in the imaging apparatus according to the above embodiments may be realized as an LSI, which is an integrated circuit. They may be made into individual chips, or a single chip may include some or all of them.
  • The integrated circuit is not limited to an LSI and may be realized by a dedicated circuit or a general-purpose processor.
  • An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
  • a part of the functions of the imaging apparatus according to the embodiment of the present invention may be realized by a processor such as a CPU executing a program.
  • the present invention may be the above program or a non-transitory computer-readable recording medium on which the above program is recorded.
  • the program can be distributed via a transmission medium such as the Internet.
  • the present invention can be applied to an imaging apparatus and a control method thereof, and in particular, can be applied to moving image shooting and continuous still image shooting using the F-DOF method.
  • the present invention is useful in fields such as consumer and commercial imaging devices (digital still cameras).

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

An imaging device (100) includes a displacement control unit (22) that displaces the in-focus position of the imaging device (100) on the subject side by displacing the image plane side distance, and a displacement pattern determination unit (21) that determines a displacement pattern, within a frame period, for the displacement control unit (22). The displacement pattern determination unit (21) determines the displacement pattern of the in-focus position such that: during the acceleration period of the frame period, the displacement speed of the image plane side distance increases from zero; during the constant speed period, the image plane side distance is displaced at a constant speed; and during the deceleration period, the displacement speed of the image plane side distance decreases to zero. The acceleration period and the deceleration period each account for at least 1/10 of the length of the frame period.
PCT/JP2010/007058 2009-12-07 2010-12-03 Dispositif d'imagerie et son procédé de commande WO2011070755A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US13/147,894 US8576326B2 (en) 2009-12-07 2010-12-03 Imaging apparatus and method of controlling the image depth of field
EP10835682.5A EP2512117B1 (fr) 2009-12-07 2010-12-03 Dispositif d'imagerie et son procédé de commande
JP2011545075A JP5367094B2 (ja) 2009-12-07 2010-12-03 撮像装置及びその制御方法
CN201080006737.4A CN102308569B (zh) 2009-12-07 2010-12-03 摄像装置以及其控制方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009277638 2009-12-07
JP2009-277638 2009-12-07

Publications (1)

Publication Number Publication Date
WO2011070755A1 true WO2011070755A1 (fr) 2011-06-16

Family

ID=44145318

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/007058 WO2011070755A1 (fr) 2009-12-07 2010-12-03 Dispositif d'imagerie et son procédé de commande

Country Status (5)

Country Link
US (1) US8576326B2 (fr)
EP (1) EP2512117B1 (fr)
JP (1) JP5367094B2 (fr)
CN (1) CN102308569B (fr)
WO (1) WO2011070755A1 (fr)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5588461B2 (ja) * 2009-12-07 2014-09-10 パナソニック株式会社 撮像装置および撮像方法
JP5635844B2 (ja) * 2010-09-06 2014-12-03 キヤノン株式会社 焦点調整装置および撮像装置
CN102934003B (zh) * 2011-04-15 2016-06-08 松下电器产业株式会社 摄像装置、半导体集成电路以及摄像方法
US8810712B2 (en) 2012-01-20 2014-08-19 Htc Corporation Camera system and auto focus method
US8890996B2 (en) * 2012-05-17 2014-11-18 Panasonic Corporation Imaging device, semiconductor integrated circuit and imaging method
US9264630B2 (en) * 2013-01-04 2016-02-16 Nokia Technologies Oy Method and apparatus for creating exposure effects using an optical image stabilizing device
US9467616B2 (en) * 2013-04-15 2016-10-11 Panasonic Intellectual Property Management Co., Ltd. Distance measurement apparatus and distance measurement method
US20160299339A1 (en) * 2013-12-05 2016-10-13 B.G. Negev Technologies And Applications Ltd., At Ben-Gurion University Method for extended depth of field imaging
US9449234B2 (en) 2014-03-31 2016-09-20 International Business Machines Corporation Displaying relative motion of objects in an image
US9300857B2 (en) * 2014-04-09 2016-03-29 International Business Machines Corporation Real-time sharpening of raw digital images
JP6543916B2 (ja) * 2014-11-07 2019-07-17 株式会社ニコン 交換レンズおよび撮像装置
JP6724288B2 (ja) * 2014-11-07 2020-07-15 株式会社ニコン 交換レンズ、カメラ本体およびカメラ
CN105491280A (zh) * 2015-11-23 2016-04-13 英华达(上海)科技有限公司 机器视觉中进行图像采集的方法及装置
DE102017123511A1 (de) * 2017-10-10 2019-04-11 Carl Zeiss Microscopy Gmbh Mikroskop und Verfahren zum Erzeugen eines mikroskopischen Bildes mit einer erweiterten Schärfentiefe
US11172112B2 (en) 2019-09-09 2021-11-09 Embedtek, LLC Imaging system including a non-linear reflector
KR102323136B1 (ko) * 2021-01-07 2021-11-10 주식회사 오엠에스 플로우스캔 방식을 적용한 3d edof 스캐닝 장치

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2301800A1 (de) 1973-01-15 1974-10-10 Leitz Ernst Gmbh Verfahren zur erweiterung des schaerfentiefebereiches bei der optischen und elektronenmikroskopischen abbildung
JPH038413A (ja) 1989-06-05 1991-01-16 Mitsubishi Heavy Ind Ltd 排ガス中二酸化硫黄の還元方法
JPH0527084B2 (fr) 1983-05-13 1993-04-20 Shimadzu Corp
JPH0983858A (ja) * 1995-09-12 1997-03-28 Canon Inc 撮像装置および撮像方法
JPH10257373A (ja) * 1997-01-10 1998-09-25 Olympus Optical Co Ltd 画像入力装置
JP3191928B2 (ja) 1988-02-23 2001-07-23 オリンパス光学工業株式会社 画像入出力装置
US20080013941A1 (en) 2006-07-14 2008-01-17 Micron Technology, Inc. Method and apparatus for increasing depth of field for an imager

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3905619C2 (de) 1988-02-23 2000-04-13 Olympus Optical Co Bildeingabe-/Ausgabevorrichtung
DE3931934C2 (de) 1988-10-03 1994-11-10 Olympus Optical Co Bild-Ein/Ausgabevorrichtung
GB8917224D0 (en) 1989-07-27 1989-09-13 Unilever Plc Dispensing system
JPH03191928A (ja) 1989-12-20 1991-08-21 Matsushita Electric Ind Co Ltd 電気湯沸かし器
JPH0411704A (ja) 1990-04-28 1992-01-16 Murata Mfg Co Ltd コイル部品
JPH0527084A (ja) 1991-07-25 1993-02-05 Toshiba Eng & Constr Co Ltd 燃料集合体のチヤンネルボツクス載置確認装置
JP3084130B2 (ja) 1992-05-12 2000-09-04 オリンパス光学工業株式会社 画像入力装置
US6587148B1 (en) 1995-09-01 2003-07-01 Canon Kabushiki Kaisha Reduced aliasing distortion optical filter, and an image sensing device using same
JP2003259194A (ja) * 2002-03-06 2003-09-12 Tamron Co Ltd 画像安定化装置
JP2004153497A (ja) 2002-10-30 2004-05-27 Kyocera Corp ディジタルカメラの自動露出制御システム
KR101002538B1 (ko) * 2002-11-26 2010-12-17 파나소닉 주식회사 광 디스크 드라이브, 광의 초점을 얻는 방법, 프로세서 및 프로그램을 기억한 컴퓨터 판독 가능한 기억 매체
JP4927005B2 (ja) * 2008-03-04 2012-05-09 日東光学株式会社 変化要因情報のデータの生成法および信号処理装置
WO2009120718A1 (fr) * 2008-03-24 2009-10-01 The Trustees Of Columbia University In The City Of New York Procédés, systèmes et supports pour commander une profondeur de champ dans des images
US8289440B2 (en) * 2008-12-08 2012-10-16 Lytro, Inc. Light field data acquisition devices, and methods of using and manufacturing same

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2301800A1 (de) 1973-01-15 1974-10-10 Leitz Ernst Gmbh Verfahren zur erweiterung des schaerfentiefebereiches bei der optischen und elektronenmikroskopischen abbildung
JPH0527084B2 (fr) 1983-05-13 1993-04-20 Shimadzu Corp
JP3191928B2 (ja) 1988-02-23 2001-07-23 オリンパス光学工業株式会社 画像入出力装置
JPH038413A (ja) 1989-06-05 1991-01-16 Mitsubishi Heavy Ind Ltd 排ガス中二酸化硫黄の還元方法
JPH0983858A (ja) * 1995-09-12 1997-03-28 Canon Inc 撮像装置および撮像方法
JPH10257373A (ja) * 1997-01-10 1998-09-25 Olympus Optical Co Ltd 画像入力装置
US20080013941A1 (en) 2006-07-14 2008-01-17 Micron Technology, Inc. Method and apparatus for increasing depth of field for an imager

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
A. LEVIN, R. FERGUS, F. DURAND, W. T. FREEMAN: "Image and Depth from a Conventional Camera with a Coded Aperture", ACM TRANSACTIONS ON GRAPHICS, vol. 26, no. 3, 2007, pages 70 - 1,70-9
C. TISSE, H. P. NGUYEN, R. TESIERES, M. PYANET, F. GUICHARD: "Extended Depth-of-field (EDOF) using sharpness transport across colour channels", OPTICAL ENGINEERING + APPLICATIONS, PART OF SPIE OPTICS + PHOTONICS, SESSION 1 -IMAGING IN THE OPTICAL DESIGN PROCESS: DEPTH OF FIELD, 2008
E. R. DOWSKI, W. T. CATHEY, APPLIED OPTICS, vol. 34, no. 11, 1995, pages 1859 - 1866
H. NAGAHARA, S. KUTHIRUMMAL, C. ZHOU, S. NAYAR: "Flexible Depth of Field Photography", EUROPEAN CONFERENCE ON COMPUTER VISION (ECCV), OCT. 16TH, MORNING SESSION 2: COMPUTATIONAL PHOTOGRAPHY, 2008
See also references of EP2512117A4
W. CHI, N. GEORGE: "Computational imaging with the logarithmic asphere: theory", vol. 20, December 2003, OPTICAL SOCIETY OF AMERICA
Y. TAKAHASHI, R. OBANA, S. KOMATSU: "Optimized phase mask for wave-front coding: Extended DOF in off axis field", OPTICS AND PHOTONICS JAPAN 2007, EXTENDED ABSTRACTS, 2007, pages 464 - 465
Y. TAKAHASHI, S. KOMATSU: "Optics Letters", vol. 33, July 2008, OPTICAL SOCIETY OF AMERICA, article "Optimized free-form phase mask for extension of depth of field in wavefront-coded imaging"

Also Published As

Publication number Publication date
US20110292275A1 (en) 2011-12-01
EP2512117B1 (fr) 2015-08-05
EP2512117A4 (fr) 2013-05-22
CN102308569B (zh) 2014-11-12
JPWO2011070755A1 (ja) 2013-04-22
CN102308569A (zh) 2012-01-04
JP5367094B2 (ja) 2013-12-11
EP2512117A1 (fr) 2012-10-17
US8576326B2 (en) 2013-11-05

Similar Documents

Publication Publication Date Title
JP5367094B2 (ja) 撮像装置及びその制御方法
JP5588461B2 (ja) 撮像装置および撮像方法
JP5911531B2 (ja) 光学機器
US9531944B2 (en) Focus detection apparatus and control method thereof
JP2007139893A (ja) 合焦検出装置
JP5882898B2 (ja) 撮像装置、撮像方法、集積回路、コンピュータプログラム
WO2012105222A1 (fr) Dispositif de restauration d'image, dispositif d'imagerie et procédé de restauration d'image
JP2006053545A (ja) 光学機器
JP6459958B2 (ja) 撮像装置および交換レンズ
WO2017154366A1 (fr) Dispositif et procédé de commande de filtre passe-bas et dispositif de capture d'image
JP2019095480A (ja) 撮像装置及びその制御方法、プログラム、記憶媒体
JP6451191B2 (ja) 画像処理装置および撮像装置
JP5409588B2 (ja) 焦点調節方法、焦点調節プログラムおよび撮像装置
JP2010145495A (ja) カメラシステム
JP2008211678A (ja) 撮像装置およびその方法
JP2006023653A (ja) 光学機器
JP2019203929A (ja) 電子機器、電子機器の制御装置、制御プログラムおよび制御方法
JP5773659B2 (ja) 撮像装置および制御方法
JP2016122950A (ja) 撮像装置
JP2009031562A (ja) 受光素子、受光装置、焦点検出装置、カメラ
JP2019057942A (ja) 画像処理装置および撮像装置
JP2018110287A (ja) 撮像装置、その制御方法、プログラム、及び記録媒体
JP2016224357A (ja) レンズ制御装置、光学機器およびレンズ制御プログラム
JP2009094662A (ja) 固体撮像装置

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080006737.4

Country of ref document: CN

ENP Entry into the national phase

Ref document number: 2011545075

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 13147894

Country of ref document: US

Ref document number: 2010835682

Country of ref document: EP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10835682

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE