WO2009104415A1 - Imaging Device (撮像装置) - Google Patents
- Publication number
- WO2009104415A1 (PCT/JP2009/000740; JP2009000740W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- phase difference
- light
- lens
- image
- Prior art date
Classifications
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B27/646—Imaging systems using optical elements for stabilisation of the lateral and angular position of the image compensating for small deviations, e.g. due to vibration or shake
- G03B13/36—Autofocus systems
- H01L27/14618—Containers
- H01L27/14621—Colour filter arrangements
- H01L27/14623—Optical shielding
- H01L27/14625—Optical elements or arrangements associated with the device
- H01L27/14627—Microlenses
- H01L27/14645—Colour imagers
- H01L27/14685—Process for coatings or optical elements
- H01L27/14843—Interline transfer
- H01L2924/0002—Not covered by any one of groups H01L24/00, H01L24/00 and H01L2224/00
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/663—Remote control of cameras or camera parts, e.g. by remote control devices for controlling interchangeable camera parts based on electronic image sensor signals
- H04N23/672—Focus control based on electronic image sensor signals based on the phase difference signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
Definitions
- the present invention relates to an image pickup apparatus including an image pickup element that performs photoelectric conversion.
- In recent years, imaging apparatuses that use an image sensor such as a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor to convert a subject image into an electrical signal, and then digitize and record that signal, have become popular.
- Among these, the digital single-lens reflex camera has a phase difference detection unit that detects the phase difference of a subject image, and thus has a phase difference detection AF function that performs autofocus (hereinafter also simply referred to as AF).
- With phase difference detection, both the defocus direction and the defocus amount can be detected, so the travel of the focus lens can be shortened and focus can be achieved quickly (for example, Patent Document 1).
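The principle behind this can be illustrated outside the patent text: the phase difference detection unit compares two line-sensor signals of the subject formed through different pupil areas, and the sign and magnitude of their lateral shift give the defocus direction and amount. A minimal sketch under that assumption (all names are hypothetical, not from the patent):

```python
def estimate_shift(signal_a, signal_b, max_shift=16):
    """Estimate the lateral shift in pixels between two line-sensor
    signals by minimising the mean absolute difference over their
    overlap. The sign of the shift indicates the defocus direction;
    its magnitude scales with the defocus amount."""
    n = len(signal_a)
    best_shift, best_sad = 0, float("inf")
    # Try small shifts first so ties resolve to the smallest offset.
    for s in sorted(range(-max_shift, max_shift + 1), key=abs):
        lo, hi = max(0, s), min(n, n + s)
        sad = sum(abs(signal_a[i] - signal_b[i - s]) for i in range(lo, hi))
        sad /= (hi - lo)  # normalise by overlap length
        if sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift
```

In an actual camera the resulting shift would be mapped to a focus-lens drive distance through the geometry of the optics; this shows only the bare correlation step.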
- For that purpose, a movable mirror configured to be able to advance into and retreat from the optical path from the lens barrel to the image sensor is provided.
- By contrast, a so-called compact digital camera adopts an autofocus function by video AF using the image sensor (for example, Patent Document 2).
- The compact digital camera is miniaturized by eliminating the mirror for guiding the light from the subject to a phase difference detection unit.
- In this configuration, autofocus can be performed while the image sensor is exposed. That is, while autofocus is in progress, various processes using the image sensor can continue, for example obtaining an image signal from the subject image formed on the image sensor and displaying it on the image display unit provided on the back of the camera, or recording it in the recording unit.
- This video AF autofocus function is generally advantageous in that it achieves higher accuracy than phase difference detection AF.
- On the other hand, video AF, as used in the digital camera according to Patent Document 2, cannot detect the defocus direction instantaneously.
- In contrast detection AF, focus is found by detecting the contrast peak; however, unless the focus lens is moved back and forth from its current position, the direction toward the contrast peak, that is, the defocus direction, cannot be determined. Focus detection therefore takes time.
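The back-and-forth search described above can be sketched as a hill-climbing loop. This is an illustrative reconstruction, not code from the patent; `contrast_at` stands in for whatever focus metric a camera computes from the image signal:

```python
def hill_climb_focus(contrast_at, position, step=1.0, min_step=0.125):
    """Illustrative contrast-detection AF by hill climbing: probe the
    focus metric on both sides of the current lens position to learn
    which way contrast rises, then step that way, halving the step
    once the peak is bracketed. contrast_at(p) is the focus metric."""
    while step >= min_step:
        here = contrast_at(position)
        # The lens must be moved back and forth to find the peak
        # direction -- the very reason contrast AF cannot know the
        # defocus direction instantly.
        forward = contrast_at(position + step)
        backward = contrast_at(position - step)
        if forward > here and forward >= backward:
            position += step
        elif backward > here:
            position -= step
        else:
            step /= 2.0  # peak lies within +/- step: refine
    return position
```

The repeated probing on both sides of the current position is what makes this method slower than phase difference detection, which reads off direction and amount in one measurement.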
- Nevertheless, from the viewpoint of reducing the time required for focus detection, phase difference detection AF is more advantageous.
- However, in an imaging apparatus that employs phase difference detection AF, such as the digital single-lens reflex camera according to Patent Document 1, the movable mirror must be moved onto the optical path from the lens barrel to the imaging device in order to guide light to the phase difference detection unit. Consequently, the various processes that use the image sensor cannot be performed while phase difference detection AF is in progress.
- Thus, although phase difference detection AF shortens the time required for focus detection, the movable mirror must be moved whenever the incident light is switched between the path toward the phase difference detection unit and the path toward the image sensor, and a time lag arises for that mirror movement.
- The present invention has been made in view of these points, and an object thereof is to provide an imaging apparatus capable of detecting a phase difference while light remains incident on the imaging element.
- To that end, the present inventor configured the image sensor so as to allow light to pass through it, and arranged a phase difference detection unit for detecting a phase difference on the back side of the image sensor.
- In this way, the inventor arrived at an imaging apparatus that can quickly detect focus with the phase difference detection unit, using light that has passed through the imaging element, while the imaging element is being exposed, and then made the present invention in order to realize such an imaging apparatus with a simple configuration.
- Specifically, an imaging apparatus according to the present invention includes an imaging element that receives light, performs photoelectric conversion, and is configured to allow light to pass through; a holding unit that holds the imaging element; and a phase difference detection unit that detects a phase difference by receiving light that has passed through the imaging element and the holding unit. The holding unit is provided with a passing portion through which the light that has passed through the imaging element travels, so that this light passes through the holding unit via the passing portion and enters the phase difference detection unit.
- With this configuration, the holding unit that holds the imaging element is provided with a passing portion that lets through the light that has passed through the imaging element, and the phase difference detection unit receives the light that has passed through that passing portion.
- An imaging device capable of detecting a phase difference with the phase difference detection unit while light is incident on the imaging element can therefore be realized with a simple configuration.
- FIG. 1 is a cross-sectional view of an imaging unit according to Embodiment 1 of the present invention.
- FIG. 2 is a cross-sectional view of the image sensor.
- FIG. 3 is a plan view of the image sensor.
- FIG. 4 is a plan view of the phase difference detection unit.
- FIG. 5 is a perspective view of an imaging unit according to a modification.
- FIG. 6 is a cross-sectional view of an image sensor according to a modification.
- FIG. 7 is a cross-sectional view of an image sensor according to another modification.
- FIG. 8 is a cross-sectional view of a cross section corresponding to FIG. 1 of an imaging unit according to another modification.
- FIG. 9 is a cross-sectional view of a cross section orthogonal to the cross section corresponding to FIG. 1 of the imaging unit according to the other modification.
- FIG. 10 is a block diagram of a camera according to Embodiment 2 of the present invention.
- FIG. 11 is a flowchart showing a flow until the release button is fully pressed in the photographing operation by the phase difference detection method AF.
- FIG. 12 is a flowchart showing a basic flow after the release button is fully pressed in each photographing operation including the phase difference detection method AF.
- FIG. 13 is a flowchart showing a flow until the release button is fully pressed in the photographing operation by the contrast detection method AF.
- FIG. 14 is a flowchart showing a flow until the release button is fully pressed in the shooting operation by the hybrid AF.
- FIG. 15 is a flowchart showing a flow until the release button is fully pressed in the photographing operation by the phase difference detection method AF according to the modification.
- FIG. 16 is a flowchart showing a flow until the release button is fully pressed in the photographing operation by the hybrid AF according to the modified example.
- FIG. 17 is a flowchart showing a flow until the release button is fully pressed in the shooting operation in the continuous shooting mode.
- FIG. 18 is a flowchart showing a flow after the release button is fully pressed in the shooting operation in the continuous shooting mode.
- FIG. 19 is a flowchart showing a flow until the release button is fully pressed in the shooting operation in the low contrast mode.
- FIG. 20 is a flowchart showing a flow until the release button is fully pressed in the photographing operation for switching the AF function depending on the type of the interchangeable lens.
- FIG. 21 is a block diagram of a camera according to Embodiment 3 of the present invention.
- FIGS. 22A and 22B are explanatory diagrams for explaining the configuration of the quick return mirror and the light shielding plate, FIG. 22A showing a retracted position and FIG. 22B showing a retracted position and a reflective position.
- FIG. 23 is a flowchart showing a flow until the release button is fully pressed in the finder photographing mode.
- FIG. 24 is a flowchart showing a flow after the release button is fully pressed in the finder photographing mode.
- FIG. 25 is a flowchart showing a flow until the release button is fully pressed in the live view shooting mode.
- FIG. 26 is a flowchart showing a flow after the release button is fully pressed in the live view shooting mode.
- 1 Imaging unit (imaging device)
- 10, 210, 310 Image sensor
- 11a Substrate
- 11b Light receiving part
- 17 Transmission part (thin part)
- 20, 420 Phase difference detection unit
- 31 Package (holding part)
- 31c Opening (passage part)
- 100, 200 Camera (imaging device)
- Embodiment 1 of the Invention: An imaging unit 1 as an imaging device according to the present invention is shown in FIG. 1.
- The imaging unit 1 includes an imaging device 10 that converts a subject image into an electrical signal, a package 31 that holds the imaging device 10, and a phase difference detection unit 20 that performs focus detection by a phase difference detection method.
- The image sensor 10 is an interline CCD image sensor and, as shown in FIG. 2, includes a photoelectric conversion unit 11 made of a semiconductor material, a vertical register 12, a transfer path 13, a mask 14, a color filter 15, and a microlens 16.
- the photoelectric conversion unit 11 includes a substrate 11a and a plurality of light receiving units (also referred to as pixels) 11b, 11b,... Arranged on the substrate 11a.
- The substrate 11a is a Si (silicon) based substrate, composed of either a Si single-crystal substrate or an SOI (Silicon On Insulator) substrate.
- An SOI substrate has a sandwich structure of a Si thin film and a SiO2 thin film; an etching process or the like can be stopped at the SiO2 layer, which is advantageous for stable substrate processing.
- the light receiving portion 11b is composed of a photodiode and absorbs light to generate electric charges.
- the light receiving portions 11b, 11b,... are respectively provided in minute square pixel regions arranged in a matrix on the substrate 11a (see FIG. 3).
- The vertical register 12 is provided for each light receiving unit 11b and temporarily stores the charge accumulated in that light receiving unit 11b; that is, the charge accumulated in the light receiving unit 11b is transferred to the vertical register 12.
- The charge transferred to the vertical register 12 is transferred via the transfer path 13 to a horizontal register (not shown) and sent to an amplifier (not shown), where it is amplified and taken out as an electrical signal.
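The readout path just described (light receiving part, vertical register, transfer path, horizontal register, amplifier) can be mimicked with plain lists. A toy model for illustration only; the amplifier gain is an arbitrary assumed value:

```python
def read_out(pixels):
    """Illustrative interline-CCD readout: every pixel's charge is
    moved sideways into a masked vertical register, then the registers
    are shifted one row at a time into a horizontal register, which is
    clocked out through the amplifier. `pixels` is a list of rows of
    accumulated charge."""
    output = []
    # Step 1: transfer each column of charges into its vertical
    # register (freeing the photodiodes for the next exposure).
    vertical_registers = [list(col) for col in zip(*pixels)]
    for _ in range(len(pixels)):
        # Step 2: shift each vertical register down by one, filling the
        # horizontal register with the bottom-most charge of each column.
        horizontal_register = [col.pop() for col in vertical_registers]
        # Step 3: clock the horizontal register out through the amplifier.
        gain = 2.0  # hypothetical amplifier gain
        output.extend(gain * q for q in horizontal_register)
    return output
```

The point of the masked vertical registers in the real device is that charge transfer can proceed while light keeps falling on the light receiving parts, which the mask 14 described below shields from the registers.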
- the mask 14 is provided so as to cover the vertical register 12 and the transfer path 13 while exposing the light receiving unit 11b to the subject side, and prevents light from entering the vertical register 12 and the transfer path 13.
- the color filter 15 and the micro lens 16 are provided for each of the small square pixel regions corresponding to the light receiving portions 11b.
- The color filter 15 transmits only a specific color; either a primary color filter or a complementary color filter is used. In this embodiment, as shown in FIG. 3, a so-called Bayer-type primary color filter array is used. That is, over the entire image sensor 10, each repeating unit of four color filters 15, 15, ... arranged in two rows and two columns contains:
- two green color filters 15g (color filters whose transmittance in the green visible wavelength region is higher than in the visible wavelength regions of other colors), arranged diagonally;
- one red color filter 15r (a color filter whose transmittance in the red visible wavelength region is higher than in the visible wavelength regions of other colors); and
- one blue color filter 15b (a color filter whose transmittance in the blue visible wavelength region is higher than in the visible wavelength regions of other colors). As a whole, the green color filters 15g, 15g, ... are arranged at every other position both vertically and horizontally.
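The Bayer arrangement described above (two greens on one diagonal of each 2x2 unit, one red and one blue on the other, greens at every other position) can be written down as a simple rule. An illustrative sketch, not taken from the patent:

```python
def bayer_color(row, col):
    """Return the color filter at a pixel of a Bayer array whose 2x2
    repeating unit is [[G, R], [B, G]]: two greens on one diagonal,
    one red and one blue on the other."""
    if row % 2 == col % 2:
        return "G"  # greens occupy every other position
    return "R" if row % 2 == 0 else "B"

# Build a small patch of the mosaic to show the repeating pattern.
patch = [[bayer_color(r, c) for c in range(4)] for r in range(4)]
```

Half of all pixels are green, which is why a demosaicing step (interpolating the two missing colors at each pixel) is needed to recover a full-color image from the sensor output.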
- the microlens 16 collects light and makes it incident on the light receiving unit 11b.
- By means of the microlens 16, the light receiving portion 11b can be irradiated with light efficiently.
- The light condensed by the microlenses 16, 16, ... is incident on the color filters 15r, 15g, and 15b, and only the light of the color corresponding to each color filter is transmitted through that filter and irradiated onto the light receiving portions 11b, 11b, ....
- Each light receiving portion 11b absorbs light and generates electric charges.
- the electric charge generated in each light receiving unit 11b is sent to the amplifier via the vertical register 12 and the transfer path 13 and is output as an electric signal.
- the received light amount of the color corresponding to each color filter is obtained as an output from the light receiving portions 11b, 11b,.
- the imaging device 10 converts the subject image formed on the imaging surface into an electrical signal by performing photoelectric conversion at the light receiving portions 11b, 11b,... On the entire imaging surface.
- A plurality of transmission parts 17, 17, ... that transmit the irradiated light are formed in the substrate 11a.
- The transmission part 17 is formed by recessing the surface 11c of the substrate 11a opposite to the surface on which the light receiving parts 11b are provided (hereinafter also simply referred to as the back surface) into a concave shape by cutting, polishing, or etching, so that it is thinner than its periphery. More specifically, the transmission part 17 has a depressed surface 17a, which is the thinnest portion, and inclined surfaces 17b, 17b that connect the depressed surface 17a to the back surface 11c.
- By forming the transmission part 17 in the substrate 11a to a thickness that allows light to pass through, part of the light irradiated onto the photoelectric conversion unit 11, namely part of the light striking the transmission part 17, is transmitted through the photoelectric conversion unit 11 without being converted into charge. For example, by setting the thickness of the substrate 11a at the transmission part 17 to 2 to 3 μm, about 50% of the light on the longer-wavelength side of near-infrared can be transmitted.
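The relation between substrate thickness and transmitted fraction can be illustrated with the Beer-Lambert law. In the sketch below the absorption length is an assumed round number, chosen only so that a 2-3 μm slab passes roughly half the light as the figure above states; it is not a value from the patent:

```python
import math

def transmitted_fraction(thickness_um, absorption_length_um):
    """Beer-Lambert attenuation: the fraction of light surviving a slab
    of the given thickness, where absorption_length_um is the depth at
    which intensity falls to 1/e. In silicon this length grows with
    wavelength, which is why mainly near-infrared light gets through."""
    return math.exp(-thickness_um / absorption_length_um)

# With an assumed ~3.6 um absorption length, a 2.5 um substrate passes
# roughly half the light, consistent with the 2-3 um / ~50% figure.
f = transmitted_fraction(2.5, 3.6)
```

Thinning the substrate further would raise the transmitted fraction at the cost of mechanical strength, which is one motivation for recessing only local transmission parts rather than the whole back surface.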
- The inclined surfaces 17b, 17b are set at an angle such that light reflected by them while passing through the transmission part 17 does not travel to the condenser lenses 21a, 21a, ... described later. This prevents a spurious image, rather than a real image, from being formed on the line sensor 24a described later.
- The transmission part 17 thus constitutes a thin part that transmits, that is, lets pass, light incident on the image sensor 10.
- Here, "passing" is a concept that includes "transmission".
- the imaging device 10 configured in this way is held in a package 31 (see FIG. 1).
- the package 31 constitutes a holding unit.
- the package 31 is provided with a frame 32 on a flat bottom plate 31a, together with standing walls 31b, 31b, ... erected on the bottom plate.
- the image sensor 10 is mounted on the frame 32 so as to be surrounded on all sides by the standing walls 31b, 31b, ..., and is electrically connected to the frame 32 via bonding wires.
- a cover glass 33 is attached to the tips of the standing walls 31b, 31b,... Of the package 31 so as to cover the imaging surface of the imaging element 10 (surface on which the light receiving portions 11b, 11b,... Are provided).
- the cover glass 33 protects the image pickup surface of the image pickup device 10 from dust and the like.
- openings 31c, 31c, ..., equal in number to the transmission parts 17, 17, ..., are formed through the bottom plate 31a of the package 31 at positions corresponding to the transmission parts 17, 17, ....
- through the openings 31c, 31c, ..., the light transmitted through the image sensor 10 reaches a phase difference detection unit 20 described later.
- This opening 31c constitutes a passage part.
- the opening 31c is not necessarily formed through the bottom plate 31a of the package 31. That is, as long as the light transmitted through the image sensor 10 reaches the phase difference detection unit 20, a configuration such as forming a transparent portion or a semi-transparent portion on the bottom plate 31a may be used.
- the phase difference detection unit 20 is provided on the back side (opposite side of the subject) of the image sensor 10 and receives the passing light from the image sensor 10 to detect the phase difference. Specifically, the phase difference detection unit 20 converts the received passing light into an electric signal for distance measurement. This phase difference detection unit 20 constitutes a phase difference detection unit.
- the phase difference detection unit 20 includes a condenser lens unit 21, a mask member 22, a separator lens unit 23, a line sensor unit 24, and a module frame 25 to which the condenser lens unit 21, the mask member 22, the separator lens unit 23, and the line sensor unit 24 are attached.
- the condenser lens unit 21, the mask member 22, the separator lens unit 23, and the line sensor unit 24 are arranged in this order from the image sensor 10 side along the thickness direction of the image sensor 10.
- the condenser lens unit 21 is formed by integrating a plurality of condenser lenses 21a, 21a,.
- the condenser lenses 21a, 21a,... are provided in the same number as the transmission parts 17, 17,.
- each condenser lens 21a condenses incident light: it condenses the light that has passed through the image sensor 10 and is spreading, and guides it to a separator lens 23a, described later, of the separator lens unit 23.
- Each condenser lens 21a has an incident surface 21b formed in a convex shape, and the vicinity of the incident surface 21b is formed in a cylindrical shape.
- by condensing the light in this way, the light is made to enter the separator lens 23a more steeply (that is, at a smaller incident angle), so that the aberration of the separator lens 23a can be suppressed and the subject image interval on the line sensor 24a described later can be made small. As a result, the separator lens 23a and the line sensor 24a can be reduced in size.
- when the focus position of the subject image formed by the imaging optical system deviates greatly from the imaging unit 1 (specifically, when it deviates greatly from the image sensor 10 of the imaging unit 1), the contrast of the image is remarkably lowered.
- the condenser lens unit 21 may be omitted when high-accuracy phase difference detection is needed only in the vicinity of the focus position, or when the dimensions of the separator lens 23a, the line sensor 24a, and the like can be made sufficiently large.
- the mask member 22 is disposed between the condenser lens unit 21 and the separator lens unit 23.
- in the mask member 22, two mask openings 22a and 22a are formed at each position corresponding to a separator lens 23a. That is, the mask member 22 exposes only two areas of the lens surface of each separator lens 23a to the condenser lens 21a side, thereby dividing the light condensed by the condenser lens 21a into two light fluxes that enter the separator lens 23a.
- this mask member 22 can also prevent harmful light headed for one separator lens 23a from entering the adjacent separator lens 23a.
- the mask member 22 need not be provided.
- the separator lens unit 23 has a plurality of separator lenses 23a, 23a, ..., and these separator lenses 23a, 23a, ... are formed integrally.
- the separator lenses 23a, 23a,... are provided in the same number as the transmission parts 17, 17,.
- each separator lens 23a forms the two light fluxes incident on it through the mask member 22 into two identical subject images on the line sensor 24a.
- the line sensor unit 24 has a plurality of line sensors 24a, 24a,... And an installation section 24b for installing the line sensors 24a, 24a,.
- the number of line sensors 24a, 24a,... Is the same as the number of transmission parts 17, 17,.
- each line sensor 24a receives the image formed on its imaging surface and converts it into an electrical signal. That is, the interval between the two subject images can be detected from the output of the line sensor 24a, and based on that interval, the amount by which the focus of the subject image formed on the image sensor 10 deviates (that is, the defocus amount (Df amount)) and the direction in which the focus deviates (that is, the defocus direction) can be determined (hereinafter, the Df amount, the defocus direction, and the like are also referred to as defocus information).
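As a hedged illustration of how the interval between the two subject images might be measured from a line sensor readout (the patent does not specify an algorithm; the half-and-half profile layout and the window size are illustrative assumptions), a simple sum-of-absolute-differences search can be sketched as follows:

```python
# Hypothetical sketch: measuring the separation of the two subject images
# from a 1-D line sensor readout. The patent specifies no algorithm; the
# layout (one image per half of the profile) and window size are assumed.

def image_interval(profile, window=16):
    """Return the separation (in pixels) of the two subject images.

    The left half of `profile` is assumed to contain one image starting at
    index 0 and the right half the other; the best-matching shift of the
    right half against the start of the left half is found by a
    sum-of-absolute-differences search.
    """
    half = len(profile) // 2
    left, right = profile[:half], profile[half:]
    best_shift, best_score = 0, float("inf")
    for shift in range(half - window):
        # Compare a window at the start of the left half with a shifted
        # window in the right half.
        score = sum(abs(left[i] - right[i + shift]) for i in range(window))
        if score < best_score:
            best_score, best_shift = score, shift
    return half + best_shift
```

In practice the positions of the two image windows, the search range, and any sub-pixel refinement would depend on the actual sensor geometry.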
- the condenser lens unit 21, the mask member 22, the separator lens unit 23, and the line sensor unit 24 configured as described above are disposed in the module frame 25.
- the module frame 25 is a member formed in a frame shape, and an attachment portion 25a that protrudes inward is provided on the inner periphery thereof.
- a first mounting portion 25b and a second mounting portion 25c are formed stepwise on the imaging element 10 side of the mounting portion 25a. Further, a third mounting portion 25d is formed on the side of the mounting portion 25a opposite to the image sensor 10.
- the mask member 22 is attached to the second attachment portion 25c of the module frame 25, and the condenser lens unit 21 is attached to the first attachment portion 25b.
- when attached to the first attachment portion 25b and the second attachment portion 25c, the condenser lens unit 21 and the mask member 22 are formed so that their peripheral edges fit into the module frame 25, as shown in FIGS. 1 and 4, and are thereby positioned with respect to the module frame 25.
- the separator lens unit 23 is attached to the third attachment portion 25d of the module frame 25 from the opposite side of the image sensor 10.
- the third mounting portion 25d is provided with a positioning pin 25e and a direction reference pin 25f that protrude on the opposite side of the condenser lens unit 21.
- the separator lens unit 23 is formed with positioning holes 23b and direction reference holes 23c corresponding to the positioning pins 25e and the direction reference pins 25f, respectively.
- the diameters of the positioning pin 25e and the positioning hole 23b are set so that they fit tightly, while the diameters of the direction reference pin 25f and the direction reference hole 23c are set so that they fit loosely.
- by inserting the positioning pin 25e and the direction reference pin 25f of the third attachment portion 25d into the positioning hole 23b and the direction reference hole 23c, respectively, the orientation and posture of the separator lens unit 23 during attachment to the third attachment portion 25d are defined, and the separator lens unit 23 is positioned with respect to the third attachment portion 25d by the fitting of the positioning hole 23b and the positioning pin 25e.
- the condenser lens unit 21, the mask member 22, and the separator lens unit 23 are attached while being positioned with respect to the module frame 25. That is, the positional relationship between the condenser lens unit 21, the mask member 22, and the separator lens unit 23 is positioned via the module frame 25.
- the line sensor unit 24 is attached to the module frame 25 from the back side of the separator lens unit 23 (the side opposite to the condenser lens unit 21). At this time, the line sensor unit 24 is positioned so that the light transmitted through each separator lens 23a enters the corresponding line sensor 24a.
- by thus attaching the condenser lens unit 21, the mask member 22, the separator lens unit 23, and the line sensor unit 24 to the module frame 25, light incident on the condenser lenses 21a, 21a, ... passes through the mask member 22 and enters the separator lenses 23a, 23a, ..., and the light transmitted through the separator lenses 23a, 23a, ... forms images on the line sensors 24a, 24a, ....
- the image sensor 10 and the phase difference detection unit 20 configured as described above are joined to each other.
- the opening 31c of the package 31 in the image sensor 10 and the condenser lens 21a in the phase difference detection unit 20 are configured to fit each other. That is, the module frame 25 is bonded to the package 31 in a state where the condenser lenses 21a, 21a,... In the phase difference detection unit 20 are fitted in the openings 31c, 31c,.
- the imaging device 10 and the phase difference detection unit 20 can be joined in a state of being positioned with respect to each other.
- in other words, the condenser lenses 21a, 21a, ..., the separator lenses 23a, 23a, ..., the line sensors 24a, 24a, ..., and the like are unitized and attached to the package 31 as a single unit.
- preferably, the condenser lens 21a closest to the center of the imaging surface is fitted into its opening 31c to perform positioning within the imaging surface, and the condenser lens 21a farthest from the center of the imaging surface is further fitted into its opening 31c to perform positioning around the central condenser lens 21a and opening 31c (that is, positioning of the rotation angle).
- in the imaging unit 1 configured as described above, a condenser lens 21a, a pair of mask openings 22a and 22a of the mask member 22, a separator lens 23a, and a line sensor 24a are arranged on the back side of the substrate 11a for each transmission portion 17.
- the image sensor 10 converts the subject image formed on the imaging surface into an electrical signal for creating an image signal, with the light receiving units 11b over the entire imaging surface converting the received light into electrical signals.
- meanwhile, the light transmitted through the image sensor 10 enters the condenser lenses 21a, 21a, ... fitted in the openings 31c, 31c, ....
- the light condensed by passing through each condenser lens 21a is divided into two light beams when passing through each pair of mask openings 22a, 22a formed in the mask member 22, and enters each separator lens 23a.
- the light thus divided into pupils passes through the separator lens 23a and forms an identical subject image at two positions on the line sensor 24a.
- similar to the photoelectric conversion part 11, the line sensor 24a outputs the amount of light received at each of its light receiving portions as an electrical signal by photoelectric conversion.
- the electrical signal from the image sensor 10 is input to a control unit that processes it to create an image signal (not shown in the present embodiment, but corresponding, for example, to the body control unit 5 of Embodiment 2 described later).
- the control unit is not included in the imaging unit 1, but may be configured to be included in the imaging unit 1.
- the control unit acquires, from the entire imaging surface of the imaging element 10, the position information of each light receiving unit 11b and output data corresponding to the amount of light it received, thereby obtaining the subject image formed on the imaging surface as an electrical signal.
- since the accumulated charge amounts differ depending on the wavelength of the received light, the outputs from the light receiving units 11b, 11b, ... are corrected according to the type of the color filters 15r, 15g, and 15b provided over them.
- that is, an R pixel 11b provided with a red color filter 15r, a G pixel 11b provided with a green color filter 15g, and a B pixel 11b provided with a blue color filter 15b each receive light of the color corresponding to its color filter, and the correction amount of each pixel is set so that the outputs from the R pixel 11b, the G pixel 11b, and the B pixel 11b are at the same level when they receive the same amount of light.
- in the present embodiment, since the transmissive portions 17, 17, ... are formed in the substrate 11a, the photoelectric conversion efficiency at the transmissive portions 17, 17, ... is reduced. That is, even when the same amount of light is received, the accumulated charge amount of the pixels 11b, 11b, ... provided at positions corresponding to the transmission parts 17, 17, ... is smaller than that of the pixels 11b, 11b, ... provided elsewhere. As a result, if the output data from the pixels provided at positions corresponding to the transmission parts 17, 17, ... were subjected to the same image processing as the other output data, the images of the portions corresponding to the transmissive portions 17, 17, ... might not be captured properly (for example, they might be captured darkly).
- this decrease in output varies with the wavelength of the light: the longer the wavelength, the higher the transmittance of the substrate 11a, so the amount of light transmitted through the substrate 11a differs according to the type of color filter 15r, 15g, or 15b. Therefore, the correction for eliminating the influence of the transmissive part 17 on each pixel 11b corresponding to it uses a correction amount that depends on the wavelength of the light received by that pixel: the longer the wavelength, the larger the correction amount.
- that is, for each pixel 11b, a correction amount is set for eliminating the difference in accumulated charge amount according to the color of light received, and in addition to this per-color correction, the correction for eliminating the influence of the transmission part 17 is applied. In other words, the correction amount for eliminating the influence of the transmissive portion 17 is the difference between the correction amount for a pixel 11b corresponding to the transmissive portion 17 and the correction amount for a pixel 11b that is located elsewhere and receives light of the same color.
- at this time, a stable image output can be obtained by varying the correction amount for each color according to the following relationship: among the three colors red, green, and blue, red, which has the longest wavelength and therefore the highest transmittance through the substrate 11a, has the largest difference in correction amount between pixels corresponding to the transmissive portion 17 and other pixels, while blue, which has the lowest transmittance, has the smallest difference.
- in this way, the correction amount for the output of each pixel 11b of the image sensor 10 is determined according to whether the pixel 11b is located at a position corresponding to a transmission unit 17 and according to the color type of the color filter 15 corresponding to that pixel. Each correction amount is determined so that, for example, the white balance and/or luminance of the portions of the displayed image produced from outputs over the transmission units 17 and from elsewhere are equal.
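To make the two-part correction above concrete, the following sketch applies a per-color gain plus an extra gain for pixels over a transmissive portion. All gain values are invented for illustration; the patent only fixes the ordering (the extra correction is largest for red and smallest for blue) and the goal of equalized output levels.

```python
# Illustrative sketch of per-pixel output correction. All gain values are
# assumptions; only the ordering R > G > B for the transmissive-portion
# make-up gain and the goal of equalized outputs come from the text.

COLOR_GAIN = {"R": 1.8, "G": 1.0, "B": 1.5}          # per-color equalization (assumed)
TRANSMISSIVE_EXTRA = {"R": 2.0, "G": 1.6, "B": 1.3}  # substrate-loss make-up, R > G > B

def corrected_output(raw, color, over_transmissive_part):
    """Correct a raw pixel value according to its color filter and whether
    the pixel sits over a transmissive portion 17."""
    gain = COLOR_GAIN[color]
    if over_transmissive_part:
        # Pixels over the thinned substrate accumulate less charge, so an
        # extra, wavelength-dependent gain is applied.
        gain *= TRANSMISSIVE_EXTRA[color]
    return raw * gain
```

In a real implementation the gain tables would be calibrated per sensor so that a uniform white target yields equal levels over and off the transmissive portions.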
- the control unit corrects the output data from the light receiving units 11b, 11b, ... as described above, and then, based on the corrected output data, creates an image signal including the position information, color information, and luminance information of each light receiving unit, that is, each pixel 11b.
- thus, an image signal of the subject image formed on the imaging surface of the image sensor 10 is obtained.
- in this way, the image signal of the subject image can be acquired appropriately even with the image sensor 10 provided with the transmissive portions 17, 17, ....
- the electrical signal output from the line sensor unit 24 is input to the control unit.
- This control unit may be the same as the control unit of the image sensor 10 or may be different.
- the control unit obtains the interval between the two subject images formed on the line sensor 24a, and the focus state of the subject image formed on the image sensor 10 can be detected from the obtained interval.
- specifically, when the subject image formed on the image sensor 10 through the imaging lens is accurately in focus, the two subject images formed on the line sensor 24a are positioned at predetermined reference positions separated by a predetermined reference interval.
- when the focal point is in front of the imaging surface (so-called front pin), the interval between the two subject images is narrower than the reference interval at the time of focusing; when the focal point is behind the imaging surface (so-called rear pin), the interval is wider than the reference interval. That is, after the output from the line sensor 24a is amplified, an arithmetic circuit can calculate whether the image is in focus or out of focus, whether the focus is in front or behind (front pin or rear pin), and the Df amount.
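The comparison just described can be sketched as a small decision routine. The reference interval and the interval-to-Df conversion factor below are illustrative assumptions, not values from the patent.

```python
# Sketch of classifying the focus state from the measured image interval.
# REFERENCE_INTERVAL_PX and K_DF_UM_PER_PX are assumed, illustrative values.

REFERENCE_INTERVAL_PX = 120   # interval between the two images when in focus (assumed)
K_DF_UM_PER_PX = 5.0          # assumed Df per pixel of interval deviation

def focus_state(measured_interval_px):
    """Return (state, Df amount in micrometers) from a measured interval."""
    deviation = measured_interval_px - REFERENCE_INTERVAL_PX
    if deviation == 0:
        state = "in focus"
    elif deviation < 0:
        state = "front pin"   # interval narrower than the reference
    else:
        state = "rear pin"    # interval wider than the reference
    return state, abs(deviation) * K_DF_UM_PER_PX
```

A real arithmetic circuit would also account for noise around the reference interval (a small dead band) rather than testing for exact equality.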
- by forming the openings 31c, 31c, ... in the bottom plate 31a of the package 31, the light transmitted through the image sensor 10 can easily reach the back side of the package 31, and by providing the phase difference detection unit 20 on the back side of the package 31, a configuration in which the phase difference detection unit 20 receives the light transmitted through the image sensor 10 can be easily realized.
- the openings 31c, 31c, ... formed in the bottom plate 31a of the package 31 may have any configuration as long as it allows the light transmitted through the image sensor 10 to pass through to the back side of the package 31.
- however, by making the openings 31c, 31c, ... through holes, the light transmitted through the image sensor 10 can reach the back side of the package 31 without being attenuated.
- further, by fitting the condenser lenses 21a, 21a, ... into the openings 31c, 31c, ..., the phase difference detection unit 20 can be positioned with respect to the image sensor 10 using the openings 31c, 31c, .... In a configuration without condenser lenses, the separator lenses 23a, 23a, ... can similarly be configured to fit into the openings 31c, 31c, ... so that the phase difference detection unit 20 is positioned with respect to the image sensor 10.
- furthermore, since the condenser lenses 21a, 21a, ... can pass through the bottom plate 31a of the package 31 and come close to the substrate 11a, the imaging unit 1 can be configured compactly.
- in the present embodiment, the transmission part 17 is formed thinner than its peripheral part in the substrate 11a, but the configuration is not limited to this.
- the thickness of the entire substrate 11a may be set so that light irradiated to the substrate 11a passes through the substrate 11a and sufficiently reaches the phase difference detection unit 20 on the back side of the substrate 11a. In this case, the whole substrate 11a becomes a transmission part.
- in the present embodiment, three transmission parts 17, 17, and 17 are formed in the substrate 11a, and three sets of a condenser lens 21a, a separator lens 23a, and a line sensor 24a are provided corresponding to the transmission parts 17, 17, and 17, but the configuration is not limited to this. The number of sets is not limited to three and can be set arbitrarily; for example, as shown in FIG. 5, nine transmission parts 17, 17, ... may be formed in the substrate 11a, with nine sets of a condenser lens 21a, a separator lens 23a, and a line sensor 24a provided.
- the image sensor 10 is not limited to a CCD image sensor, and may be a CMOS image sensor as shown in FIG.
- the imaging device 210 is a CMOS image sensor, and includes a photoelectric conversion unit 211 made of a semiconductor material, transistors 212, signal lines 213, a mask 214, color filters 215, and microlenses 216.
- the photoelectric conversion unit 211 includes a substrate 211a and light receiving units 211b, 211b,.
- a transistor 212 is provided for each light receiving portion 211b.
- the charge accumulated in the light receiving portion 211b is amplified by the transistor 212 and output to the outside through the signal line 213.
- the mask 214, the color filter 215, and the micro lens 216 have the same configuration as the mask 14, the color filter 15, and the micro lens 16.
- transmission parts 17, 17,... That transmit the irradiated light are formed on the substrate 211a.
- the transmission part 17 is formed by recessing, into a concave shape by cutting, polishing, or etching, the surface 211c of the substrate 211a opposite to the surface on which the light receiving units 211b are provided (hereinafter also simply referred to as the back surface), so that it is thinner than its periphery.
- in the imaging device 210, by setting the amplification factor of each transistor 212 based on whether the corresponding light receiving portion 211b is located at a position corresponding to a transmission portion 17, or based on the color type of the color filter 215 corresponding to that light receiving portion 211b, it is possible to prevent the images of the portions corresponding to the transmissive portions 17, 17, ... from being captured improperly.
- the configuration of the image sensor through which light passes is not limited to the configuration in which the transmissive portions 17, 17,. Any configuration can be adopted as long as light passes through the image sensor (including transmission as described above).
- the imaging element 310 may include a passage portion 318 in which a plurality of through holes 318 a, 318 a,... Are formed in a substrate 311 a.
- the through holes 318a, 318a,... are formed so as to penetrate the substrate 311a in the thickness direction. Specifically, in a matrix-like pixel region on the substrate 311a, when four pixel regions adjacent to two rows and two columns are defined as one unit, light receiving portions 11b, 11b, and 11b are arranged in three pixel regions. In addition, a through hole 318a is formed in the remaining one pixel region.
- in each unit, three color filters 15r, 15g, and 15b are arranged corresponding to the three light receiving portions 11b, 11b, and 11b, respectively. More specifically, a green color filter 15g is disposed over the light receiving portion 11b positioned diagonally with respect to the through hole 318a, a red color filter 15r is disposed over one light receiving portion 11b adjacent to the through hole 318a, and a blue color filter 15b is disposed over the other light receiving portion 11b adjacent to the through hole 318a. No color filter is provided in the pixel region corresponding to the through hole 318a.
- the signal of the pixel corresponding to each through hole 318a is interpolated using the outputs of the light receiving portions 11b, 11b, ... adjacent to the through hole 318a. Specifically, the signal is interpolated using the average value of the outputs of the four light receiving portions 11b, 11b, ... that are provided with green color filters 15g and are diagonally adjacent to the through hole 318a (standard interpolation). Alternatively, among these four light receiving portions, the two pairs of light receiving portions 11b, 11b facing each other across the through hole 318a along the diagonals are compared, and the signal is interpolated using the average value of the outputs of the pair with the smaller change (tilt interpolation). If the pixel to be interpolated lies on the edge of an in-focus subject, using the pair with the larger change blurs the edge and gives an unfavorable result; therefore, when the change is greater than or equal to a predetermined threshold, the pair with the smaller change is used, and when the change is less than the threshold, the pair with the larger change is used.
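The two interpolation schemes above can be sketched as follows. The threshold value is an illustrative assumption, and `g_ul`, `g_ur`, `g_ll`, `g_lr` denote the outputs of the four diagonally adjacent G pixels (upper-left, upper-right, lower-left, lower-right of the through hole):

```python
# Sketch of the through-hole pixel interpolation described above.
# The threshold is an assumed value; g_ul/g_ur/g_ll/g_lr are the outputs of
# the four G pixels diagonally adjacent to the through hole 318a.

def standard_interpolation(g_ul, g_ur, g_ll, g_lr):
    """Average of all four diagonal G neighbors."""
    return (g_ul + g_ur + g_ll + g_lr) / 4

def tilt_interpolation(g_ul, g_ur, g_ll, g_lr, threshold=32):
    """Average of one diagonal pair, chosen by comparing the change along
    each diagonal against a threshold, as described in the text."""
    d1 = abs(g_ul - g_lr)             # change along one diagonal
    d2 = abs(g_ur - g_ll)             # change along the other diagonal
    smaller = (g_ul, g_lr) if d1 <= d2 else (g_ur, g_ll)
    larger = (g_ur, g_ll) if d1 <= d2 else (g_ul, g_lr)
    # A large change suggests an edge crossing one diagonal: use the
    # smoother pair so the edge is not blurred; otherwise use the other.
    chosen = smaller if max(d1, d2) >= threshold else larger
    return sum(chosen) / 2
```

For example, with one diagonal nearly flat (10 and 12) and the other crossing an edge (100 and 10), tilt interpolation keeps the flat pair and avoids smearing the edge into the interpolated pixel.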
- for the other pixel regions, the luminance information and color information of the pixel corresponding to each light receiving portion 11b are obtained using the output data of the light receiving portions 11b, 11b, ..., and predetermined image processing and synthesis are performed to create an image signal.
- the imaging device 310 configured as described above can allow incident light to pass through the plurality of through holes 318a, 318a,.
- the imaging element 310 through which light passes can also be configured by providing the substrate 311a with the passage portion 318 constituted by the plurality of through holes 318a, 318a, ... instead of the transmission portion 17.
- the passage portion 318 may be provided only at a position corresponding to the condenser lens 21a or the separator lens 23a of the phase difference detection unit 20, or may be provided on the entire substrate 311a.
- also in this case, by forming the openings 31c, 31c, ... in the bottom plate 31a of the package 31 that houses the imaging element 310, the light that has passed through the imaging element 310 can easily reach the back side of the package 31, and by disposing the phase difference detection unit 20 on the back side of the package 31, a configuration in which the phase difference detection unit 20 receives the light that has passed through the imaging element 310 can be easily realized.
- by configuring the condenser lens 21a, the separator lens 23a, and the line sensor 24a so that light from a plurality of through holes 318a, 318a, ... enters one set of them, the size of one set is not limited by the pixel size, which is preferable. That is, the size of one set of the condenser lens 21a, the separator lens 23a, and the line sensor 24a does not hinder increasing the pixel count of the image sensor 310 through pixel miniaturization.
- the phase difference detection unit 20 is not limited to the above-described configuration.
- as long as the condenser lens 21a and the separator lens 23a are positioned with respect to the transmission part 17 of the image sensor 10, the fitting between the condenser lens 21a and the opening 31c of the package 31 is not necessarily required.
- a configuration without a condenser lens may also be used.
- a condenser lens and a separator lens may be integrally formed.
- alternatively, the phase difference detection unit may be a phase difference detection unit 420 in which a condenser lens unit 421, a mask member 422, a separator lens unit 423, and a line sensor unit 424 are arranged on the back side of the image sensor 10, side by side in a direction parallel to the imaging surface of the image sensor 10.
- the condenser lens unit 421 integrally forms a plurality of condenser lenses 421a, 421a,..., And has an incident surface 421b, a reflective surface 421c, and an exit surface 421d. That is, the condenser lens unit 421 reflects the light collected by the condenser lenses 421a, 421a,... By the reflecting surface 421c at an angle of approximately 90 ° and emits the light from the emitting surface 421d.
- the light that has passed through the image sensor 10 and entered the condenser lens unit 421 has its optical path bent substantially vertically by the reflection surface 421c, and is emitted from the emission surface 421d to the separator lens 423a of the separator lens unit 423. Head.
- the light incident on the separator lens 423a passes through the separator lens 423a and forms an image on the line sensor 424a.
- the thus configured condenser lens unit 421, mask member 422, separator lens unit 423, and line sensor unit 424 are arranged in the module frame 425.
- the module frame 425 is formed in a box shape, and a step portion 425a for attaching the condenser lens unit 421 is formed therein.
- the condenser lens unit 421 is attached to the step portion 425a so that the condenser lenses 421a, 421a,... Face outward from the module frame 425.
- the module frame 425 is provided with an attachment wall portion 425b for attaching the mask member 422 and the separator lens unit 423 at a position facing the emission surface 421d of the condenser lens unit 421.
- An opening 425c is formed in the mounting wall portion 425b.
- the mask member 422 is attached to the attachment wall portion 425b from the condenser lens unit 421 side.
- the separator lens unit 423 is attached to the attachment wall portion 425b from the side opposite to the condenser lens unit 421.
- in this way, by bending the optical path of the light that has passed through the image sensor 10, the condenser lens unit 421, the mask member 422, the separator lens unit 423, the line sensor unit 424, and the like can be arranged on the back surface side of the image sensor 10 in a direction parallel to the imaging surface of the image sensor 10, rather than stacked in its thickness direction, so the dimension of the imaging unit in the thickness direction of the image sensor 10 can be reduced. That is, the imaging unit 401 can be formed compactly.
- in short, a phase difference detection unit of any configuration can be adopted as long as it can detect the phase difference by receiving, on the back side of the image sensor 10, the light that has passed through the image sensor 10.
- <<Embodiment 2 of the Invention>> Next, a camera as an imaging apparatus according to Embodiment 2 of the present invention will be described.
- the camera 100 is an interchangeable-lens single-lens reflex digital camera, and mainly includes a camera body 4 having the main functions of the camera system and an interchangeable lens 7 detachably mounted on the camera body 4.
- the interchangeable lens 7 is attached to a body mount 41 provided on the front surface of the camera body 4.
- the body mount 41 is provided with an electrical section 41a.
- the camera body 4 includes the imaging unit 1 according to Embodiment 1, which acquires a subject image as a captured image, a shutter unit 42 that adjusts the exposure state of the imaging unit 1, an IR-cut/OLPF (Optical Low Pass Filter) 43 for removing infrared light from the subject image incident on the imaging unit 1 and reducing the moire phenomenon, an image display unit 44 that includes a liquid crystal monitor and displays captured images, live view images, and various information, and a body control unit 5.
- the camera body 4 is provided with a power switch 40a for turning the power of the camera system on and off, a release button 40b operated by the photographer for focusing and release, and setting switches 40c and 40d for switching various shooting modes and functions on and off.
- the release button 40b is a two-stage type: half-pressing it performs autofocus and AE, described later, and fully pressing it performs release.
- the AF setting switch 40c is a switch for switching three autofocus functions described later.
- the camera body 4 is configured to set the autofocus function to any one of the three by switching the AF setting switch 40c.
- the continuous shooting setting switch 40d is a switch for setting / canceling a continuous shooting mode to be described later.
- the camera body 4 is configured to be able to switch between the normal shooting mode and the continuous shooting mode by operating the continuous shooting setting switch 40d.
- these setting switches 40c and 40d may be selection items in a menu for selecting various camera photographing functions.
- the imaging unit 1 is configured to be movable in a plane orthogonal to the optical axis X by the blur correction unit 45.
- the body control unit 5 includes the body microcomputer 50, a nonvolatile memory 50a, a shutter control unit 51 that controls driving of the shutter unit 42, an imaging unit control unit 52 that controls the operation of the imaging unit 1 and performs A/D conversion on the electrical signal from the imaging unit 1 before outputting it to the body microcomputer 50, an image reading/recording unit 53 that reads image data from and records image data in an image storage unit 58, which is, for example, a card-type recording medium or an internal memory, an image recording control unit 54, an image display control unit 55, a blur detection unit 56 that detects the amount of blur, and a correction unit control unit 57 that controls the blur correction unit 45.
- the body microcomputer 50 is the central control device of the camera body 4 and controls various sequences.
- the body microcomputer 50 is equipped with a CPU, a ROM, and a RAM, and can implement various functions by reading programs stored in the ROM into the CPU.
- the body microcomputer 50 is configured to receive input signals from the power switch 40a, the release button 40b, and the setting switches 40c and 40d, and to output control signals to the shutter control unit 51, the imaging unit control unit 52, the image reading / recording unit 53, the image recording control unit 54, the correction unit control unit 57, and so on, causing each of these units to execute its respective control.
- the body microcomputer 50 performs inter-microcomputer communication with a lens microcomputer 80 described later.
- the imaging unit control unit 52 performs A / D conversion on the electrical signal from the imaging unit 1 and outputs it to the body microcomputer 50.
- the body microcomputer 50 performs predetermined image processing on the captured electric signal to create an image signal.
- the body microcomputer 50 transmits the image signal to the image reading / recording unit 53 and instructs the image recording control unit 54 to record and display the image; the image signal is thereby stored in the image storage unit 58 and transmitted to the image display control unit 55.
- the image display control unit 55 controls the image display unit 44 based on the transmitted image signal, and causes the image display unit 44 to display an image.
- as part of the predetermined image processing, the body microcomputer 50 corrects the output of each light receiving unit 11b according to whether or not that light receiving unit 11b is provided at a position corresponding to a transmission unit 17, as described above, thereby eliminating the influence of the transmission units 17.
- the nonvolatile memory 50a stores various information (body information) related to the camera body 4.
- the body information includes, for example, information for identifying the camera body 4 (body specifying information) such as the manufacturer name, date of manufacture, and model number, the version of the software installed in the body microcomputer 50, and information on firmware upgrades.
- the body information also includes information on whether the camera body 4 is equipped with means for correcting image blur, such as the blur correction unit 45 and the blur detection unit 56, information on detection performance such as the model number and sensitivity of the blur detection unit 56, and an error history.
- These pieces of information may be stored in the memory unit in the body microcomputer 50 instead of the nonvolatile memory 50a.
- the shake detection unit 56 includes an angular velocity sensor that detects the movement of the camera body 4 caused by camera shake or the like.
- the angular velocity sensor outputs a positive / negative angular velocity signal according to the direction in which the camera body 4 moves with reference to the output when the camera body 4 is stationary.
- two angular velocity sensors are provided to detect the two directions of the yawing direction and the pitching direction.
- the output angular velocity signal is subjected to filter processing, amplifier processing, and the like, converted into a digital signal by an A / D conversion unit, and provided to the body microcomputer 50.
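The filter / amplify / A-D chain applied to the angular velocity signal can be sketched as below. This is an illustrative model only: the function name, the IIR filter coefficient, the gain, the converter range, and the bit depth are all assumptions, not values from the patent.

```python
def condition_gyro_signal(samples, gain=50.0, alpha=0.9, full_scale=2.5, bits=10):
    """Sketch of the angular-velocity signal chain: low-pass filtering,
    amplification, then A/D conversion to unsigned integer codes.
    All constants are hypothetical."""
    # Simple first-order IIR low-pass filter over the raw sensor samples.
    filtered = []
    state = 0.0
    for v in samples:
        state = alpha * state + (1.0 - alpha) * v
        filtered.append(state)
    # Amplify, clamp to the converter's input range, then quantize.
    codes = []
    max_code = (1 << bits) - 1
    for v in filtered:
        amplified = v * gain
        clamped = min(max(amplified, -full_scale), full_scale)
        codes.append(round((clamped + full_scale) / (2 * full_scale) * max_code))
    return codes
```

A stationary sensor (zero input) maps to the mid-scale code, so positive and negative angular velocities are distinguishable after conversion, as the text describes.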
- the interchangeable lens 7 constitutes an imaging optical system that forms a subject image on the imaging unit 1 in the camera body 4, and mainly includes a focus adjustment unit 7A that performs focusing, an aperture adjustment unit 7B that adjusts the aperture, a lens image blur correction unit 7C that corrects image blur by adjusting the optical path, and a lens control unit 8 that controls the operation of the interchangeable lens 7.
- the interchangeable lens 7 is attached to the body mount 41 of the camera body 4 via the lens mount 71.
- the lens mount 71 is provided with an electrical piece 71 a that is electrically connected to the electrical piece 41 a of the body mount 41 when the interchangeable lens 7 is attached to the camera body 4.
- the focus adjusting unit 7A includes a focus lens group 72 that adjusts the focus.
- the focus lens group 72 is movable in the direction of the optical axis X in a section from the closest focus position to the infinite focus position defined as the standard of the interchangeable lens 7.
- however, for focus position detection by the contrast detection method described later, the focus lens group 72 must be able to move back and forth in the optical axis X direction across the focus position; it therefore has a lens shift margin section that allows it to move further in the optical axis X direction beyond the section from the closest focus position to the infinite focus position.
- the aperture adjusting unit 7B includes an aperture unit 73 that adjusts the aperture or opening.
- the lens image blur correction unit 7C includes a blur correction lens 74 and a blur correction lens driving unit 74a that moves the blur correction lens 74 in a plane orthogonal to the optical axis X.
- the lens control unit 8 includes the lens microcomputer 80; a nonvolatile memory 80a; a focus lens group control unit 81 that controls the operation of the focus lens group 72; a focus drive unit 82 that receives a control signal from the focus lens group control unit 81 and drives the focus lens group 72; a diaphragm control unit 83 that controls the operation of the diaphragm unit 73; a blur detection unit 84 that detects blur of the interchangeable lens 7; and a blur correction lens unit control unit 85 that controls the blur correction lens driving unit 74a.
- the lens microcomputer 80 is the central control device of the interchangeable lens 7 and is connected to each unit mounted on the interchangeable lens 7.
- the lens microcomputer 80 is equipped with a CPU, a ROM, and a RAM, and various functions can be realized by reading a program stored in the ROM into the CPU.
- the lens microcomputer 80 has a function of setting a lens image blur correction device (such as the blur correction lens driving unit 74a) to a correctable state or an uncorrectable state based on a signal from the body microcomputer 50.
- the body microcomputer 50 and the lens microcomputer 80 are electrically connected through the contact between the electrical piece 71a provided on the lens mount 71 and the electrical piece 41a provided on the body mount 41, so that information can be transmitted and received between them.
- the lens information includes, for example, information for specifying the interchangeable lens 7 (lens specifying information) such as the manufacturer name, date of manufacture, and model number, the version of the software installed in the lens microcomputer 80, and information on firmware upgrades of the interchangeable lens 7.
- the lens information also includes information on whether the interchangeable lens 7 is equipped with means for correcting image blur, such as the blur correction lens driving unit 74a and the blur detection unit 84, information on detection performance such as the model number and sensitivity of the blur detection unit 84, information on correction performance such as the model number of the blur correction lens driving unit 74a and the maximum correctable angle (lens side correction performance information), and the version of the software for image blur correction.
- the lens information further includes information on the power consumption required to drive the blur correction lens driving unit 74a (lens side power consumption information) and information on the driving method of the blur correction lens driving unit 74a (lens side driving method information).
- the nonvolatile memory 80a can store information transmitted from the body microcomputer 50. These pieces of information may be stored in a memory unit in the lens microcomputer 80 instead of the nonvolatile memory 80a.
- the focus lens group control unit 81 includes an absolute position detection unit 81a that detects an absolute position of the focus lens group 72 in the optical axis direction, and a relative position detection unit 81b that detects a relative position of the focus lens group 72 in the optical axis direction.
- the absolute position detector 81 a detects the absolute position of the focus lens group 72 in the casing of the interchangeable lens 7.
- the absolute position detector 81a is constituted by, for example, a contact-type encoder substrate of several bits and a brush, and is configured to be able to detect the absolute position.
- the relative position detector 81b cannot detect the absolute position of the focus lens group 72 by itself, but can detect the moving direction of the focus lens group 72; it uses, for example, a two-phase encoder such as a rotary pulse encoder, an MR element, or a Hall element. Two such encoders, which alternately output binary signals at an equal pitch according to the position of the focus lens group 72 in the optical axis direction, are installed with their pitches shifted in phase from each other.
- the lens microcomputer 80 calculates the relative position of the focus lens group 72 in the optical axis direction from the output of the relative position detector 81b.
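A two-phase (quadrature) encoder of the kind described can be decoded by tracking transitions of the combined A/B state; the phase relation between the two channels gives the direction of movement. The following is a generic sketch of that calculation, not an implementation taken from the patent; the table and function names are illustrative.

```python
# Transition table for 2-bit (A, B) encoder states:
# each legal transition advances the position by +1 or -1 depending
# on which channel led the other (this is what reveals direction).
QUAD_STEPS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode_quadrature(states):
    """Accumulate a relative position from a sequence of 2-bit (A, B)
    states read from the two phase-shifted encoder channels."""
    position = 0
    prev = states[0]
    for s in states[1:]:
        position += QUAD_STEPS.get((prev, s), 0)  # ignore no-change/illegal
        prev = s
    return position
```

Reversing the sequence of states negates the accumulated count, which is how the lens microcomputer can infer the moving direction of the focus lens group even though the detector has no absolute reference.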
- the blur detection unit 84 includes an angular velocity sensor that detects the movement of the interchangeable lens 7 caused by camera shake.
- the angular velocity sensor outputs positive and negative angular velocity signals according to the direction in which the interchangeable lens 7 moves with reference to the output when the interchangeable lens 7 is stationary.
- two angular velocity sensors are provided to detect the two directions of the yawing direction and the pitching direction.
- the output angular velocity signal is subjected to filter processing, amplifier processing, and the like, converted into a digital signal by an A / D conversion unit, and provided to the lens microcomputer 80.
- the blur correction lens unit control unit 85 includes a movement amount detection unit (not shown).
- the movement amount detection unit is a detection unit that detects an actual movement amount of the blur correction lens 74.
- the blur correction lens unit control unit 85 performs feedback control of the blur correction lens 74 based on the output from the movement amount detection unit.
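The feedback loop described above (command the lens, measure the actual movement amount, correct the residual error) can be sketched as a simple proportional controller. This is a toy model under stated assumptions: the gain, step count, and function name are hypothetical, and a real controller would include integral/derivative terms and actuator limits.

```python
def settle_correction_lens(target, kp=0.5, steps=20):
    """Proportional feedback sketch: each cycle compares the target
    position with the position reported by the movement amount detector
    and applies a drive command proportional to the error."""
    position = 0.0  # measured position of the blur correction lens
    for _ in range(steps):
        error = target - position   # residual from the movement amount detector
        position += kp * error      # drive command applied to the lens
    return position
```

With any 0 < kp < 2 the error shrinks each cycle, so the lens converges on the target displacement computed from the blur signal.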
- the interchangeable lens 7 may be equipped with only one of the blur detection unit and the blur correction device, or with neither of them (in this case, the above-described sequence related to blur correction may be omitted).
- the camera 100 configured as described above has various shooting modes and functions. In the following, various shooting modes and functions of the camera 100 and the operation at that time will be described.
- the camera 100 focuses by AF.
- This AF provides three autofocus functions: phase difference detection AF, contrast detection AF, and hybrid AF. The photographer can select among these three autofocus functions by operating the AF setting switch 40c provided on the camera body 4.
- the normal shooting mode is the most basic shooting mode of the camera 100 for performing normal shooting, which is not the continuous shooting mode described later.
- Phase difference detection AF
- When the power switch 40a is turned on (step Sa1), communication between the camera body 4 and the interchangeable lens 7 is performed (step Sa2). Specifically, power is supplied to the body microcomputer 50 and the various units in the camera body 4, and the body microcomputer 50 is activated. At the same time, power is supplied to the lens microcomputer 80 and the various units in the interchangeable lens 7 via the electrical pieces 41a and 71a, and the lens microcomputer 80 is activated.
- the body microcomputer 50 and the lens microcomputer 80 are programmed to exchange information with each other at activation; for example, lens information relating to the interchangeable lens 7 is transmitted from the memory unit of the lens microcomputer 80 to the body microcomputer 50 and stored in the memory unit of the body microcomputer 50.
- Subsequently, the body microcomputer 50 positions the focus lens group 72 at a predetermined reference position set in advance via the lens microcomputer 80 (step Sa3), and in parallel opens the shutter unit 42 (step Sa4). Thereafter, the process proceeds to step Sa5 and waits until the release button 40b is half-pressed by the photographer.
- the light that has passed through the interchangeable lens 7 and entered the camera body 4 passes through the shutter unit 42, further passes through the IR cut / OLPF 43, and enters the imaging unit 1.
- the subject image formed by the imaging unit 1 is displayed on the image display unit 44, and the photographer can observe an erect image of the subject via the image display unit 44.
- the body microcomputer 50 reads an electrical signal from the image sensor 10 via the imaging unit control unit 52 at a constant cycle, performs predetermined image processing on the read electrical signal, and then creates an image signal. Then, the image display control unit 55 is controlled to display the live view image on the image display unit 44.
- part of the light incident on the imaging unit 1 passes through the transmission parts 17, 17,... Of the imaging element 10 and enters the phase difference detection unit 20.
- When the release button 40b is half-pressed by the photographer in step Sa5 (that is, when the S1 switch (not shown) is turned on), the body microcomputer 50 amplifies the output from the line sensor 24a of the phase difference detection unit 20 and then computes, with an arithmetic circuit, whether the image is in focus or out of focus, whether the defocus is front focus or back focus, and the Df amount (step Sa6).
- the body microcomputer 50 drives the focus lens group 72 through the lens microcomputer 80 in the defocus direction by the amount of Df acquired in step Sa6 (step Sa7).
- the phase difference detection unit 20 has three sets of the condenser lens 21a, the mask openings 22a and 22a, the separator lens 23a, and the line sensor 24a; that is, it has three distance measuring points for detecting the phase difference.
- the focus lens group 72 is driven based on the output of the line sensor 24a of the set corresponding to the distance measuring point arbitrarily selected by the photographer.
- alternatively, an automatic optimization algorithm may be set in the body microcomputer 50 so that the distance measuring point closest to the camera is selected from among the plurality of distance measuring points and the focus lens group 72 is driven accordingly. In this case, the probability of inadvertently focusing on the background between subjects can be reduced.
- this selection of the distance measuring point is not limited to the phase difference detection method AF, and can be adopted for any type of AF in which the focus lens group 72 is driven using the phase difference detection unit 20.
- In step Sa8, it is determined whether or not the subject is in focus. Specifically, when the Df amount obtained from the output of the line sensor 24a is equal to or less than a predetermined value, it is determined that the subject is in focus (YES) and the process proceeds to step Sa11; when the Df amount is larger than the predetermined value, it is determined that the subject is not in focus (NO), the process returns to step Sa6, and steps Sa6 to Sa8 are repeated.
- the detection of the focus state and the driving of the focus lens group 72 are repeated, and when the Df amount becomes a predetermined amount or less, it is determined that the focus is achieved, and the driving of the focus lens group 72 is stopped.
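The detect-drive-repeat loop of steps Sa6 to Sa8 can be summarized in a few lines. This is a minimal sketch, not the patent's firmware: the callback names, threshold, iteration cap, and the toy lens model are all assumptions for illustration.

```python
def phase_difference_af(read_df, drive_lens, df_threshold=10, max_iters=10):
    """Repeat steps Sa6-Sa8: read the Df amount, drive the focus lens
    by the defocus amount, and stop once the residual Df is within the
    in-focus threshold."""
    for _ in range(max_iters):
        df = read_df()                 # step Sa6: defocus from the line sensor
        if abs(df) <= df_threshold:    # step Sa8: in focus, stop driving
            return True
        drive_lens(df)                 # step Sa7: drive by the Df amount
    return False

# Toy lens model (hypothetical): each drive removes 80% of the error,
# so the loop converges in a few iterations.
state = {"error": 100}

def read_df():
    return state["error"]

def drive_lens(df):
    state["error"] -= int(df * 0.8)
```

Because each drive only approximately cancels the measured defocus, the loop structure (re-measure, re-drive) is what guarantees convergence below the threshold.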
- Next, photometry is performed (step Sa9) and image blur detection is started (step Sa10).
- In step Sa9, the amount of light incident on the image sensor 10 is measured by the image sensor 10 itself. That is, in the present embodiment, the phase difference detection method AF described above is performed using light that has entered the image sensor 10 and passed through it, so photometry can be performed with the image sensor 10 in parallel with the phase difference detection method AF.
- the body microcomputer 50 performs photometry by taking in an electrical signal from the image sensor 10 via the imaging unit controller 52 and measuring the intensity of subject light based on the electrical signal. Then, the body microcomputer 50 determines the shutter speed and aperture value at the time of exposure according to the photographing mode from the photometric result according to a predetermined algorithm.
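The "predetermined algorithm" that turns a photometry result into a shutter speed and aperture value is not detailed in the text; a conventional way to express it is the APEX relation Ev = Av + Tv split along a program line. The sketch below assumes that approach; the base aperture, the shutter limit, and the function name are illustrative choices, not values from the patent.

```python
import math

def exposure_from_photometry(ev):
    """Split a measured exposure value (APEX Ev) between aperture (Av)
    and shutter (Tv) along a simple program line: hold f/2.8 until the
    shutter reaches 1/512 s, then stop down instead."""
    av = 3.0               # f/2.8 -> Av = 2 * log2(2.8) ~ 3 (assumed base)
    tv = ev - av           # APEX: Ev = Av + Tv
    if tv > 9.0:           # faster than 1/512 s: stop down instead
        av += tv - 9.0
        tv = 9.0
    f_number = math.sqrt(2.0 ** av)   # Av = 2 * log2(N)  =>  N = sqrt(2^Av)
    shutter_s = 1.0 / (2.0 ** tv)     # Tv = log2(1 / t)  =>  t = 2^-Tv
    return f_number, shutter_s
```

For a bright scene (Ev 15) this program line yields roughly f/8 at 1/512 s; for a dimmer scene (Ev 12) it stays at the base aperture and lengthens nothing beyond the same shutter limit.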
- In step Sa10, image blur detection by the blur detection unit 56 is started.
- step Sa9 and step Sa10 may be performed in parallel.
- In step Sa11, the process waits until the photographer fully presses the release button 40b (that is, until the S2 switch (not shown) is turned on).
- When the release button 40b is fully pressed by the photographer, the body microcomputer 50 temporarily closes the shutter unit 42 (step Sa12).
- While the shutter unit 42 is in the closed state, the charges accumulated in the light receiving portions 11b, 11b, ... are reset in preparation for exposure.
- Next, the body microcomputer 50 starts image blur correction based on information communicated between the camera body 4 and the interchangeable lens 7 or on the photographer's designation (step Sa13). Specifically, the blur correction lens driving unit 74a in the interchangeable lens 7 is driven based on information from the blur detection unit 56 in the camera body 4. Alternatively, according to the photographer's intention, any of the following can be selected: (i) using the blur detection unit 84 and the blur correction lens driving unit 74a in the interchangeable lens 7; (ii) using the blur detection unit 56 and the blur correction unit 45 in the camera body 4; or (iii) using the blur detection unit 84 in the interchangeable lens 7 together with the blur correction unit 45 in the camera body 4.
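The three selectable detector/corrector pairings can be captured in a small lookup. The identifier strings below are illustrative labels for the units named in the text, not names used by the patent.

```python
# The three blur-correction configurations (i)-(iii) from the text.
# Keys and unit labels are hypothetical identifiers for illustration.
BLUR_CONFIGS = {
    "i":   ("lens_detector_84", "lens_corrector_74a"),   # all in the lens
    "ii":  ("body_detector_56", "body_corrector_45"),    # all in the body
    "iii": ("lens_detector_84", "body_corrector_45"),    # mixed
}

def select_blur_config(mode):
    """Return the (blur detector, blur corrector) pair for a mode."""
    detector, corrector = BLUR_CONFIGS[mode]
    return detector, corrector
```

Modeling the choice as a (detector, corrector) pair makes explicit that detection and correction are independently located, which is what allows the mixed configuration (iii).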
- By starting the driving of the image blur correcting means when the release button 40b is half-pressed, the movement of the subject to be focused on is reduced, and the phase difference detection AF can be performed more accurately.
- Next, the body microcomputer 50 stops down the diaphragm unit 73 via the lens microcomputer 80 to the aperture value obtained from the photometry result in step Sa9 (step Sa14).
- the body microcomputer 50 opens the shutter unit 42 based on the shutter speed obtained from the result of the photometry in step Sa9 (step Sa15).
- By opening the shutter unit 42, light from the subject enters the image sensor 10, and the image sensor 10 accumulates charges for a predetermined time (step Sa16).
- the body microcomputer 50 closes the shutter unit 42 based on the shutter speed, and ends the exposure (step Sa17).
- The body microcomputer 50 reads the image data from the imaging unit 1 via the imaging unit control unit 52 and, after predetermined image processing, outputs the image data to the image display control unit 55 via the image reading / recording unit 53.
- the captured image is displayed on the image display unit 44.
- the body microcomputer 50 stores image data in the image storage unit 58 via the image recording control unit 54 as necessary.
- the body microcomputer 50 ends the image blur correction (step Sa18) and opens the diaphragm 73 (step Sa19). Then, the body microcomputer 50 opens the shutter unit 42 (step Sa20).
- When the reset is completed, the lens microcomputer 80 notifies the body microcomputer 50 of the completion of the reset.
- The body microcomputer 50 waits for the reset completion information from the lens microcomputer 80 and for the completion of the series of post-exposure processes, then confirms that the release button 40b is not depressed, and ends the photographing sequence. Thereafter, the process returns to step Sa5 and waits until the release button 40b is half-pressed.
- When the power switch 40a is turned off, the body microcomputer 50 moves the focus lens group 72 to the predetermined reference position set in advance (step Sa22) and closes the shutter unit 42 (step Sa23). Then, the operation of the body microcomputer 50 and the various units in the camera body 4, and of the lens microcomputer 80 and the various units in the interchangeable lens 7, is stopped.
- As described above, in the phase difference detection method AF, photometry is performed by the image sensor 10 in parallel with the autofocus based on the phase difference detection unit 20. That is, since the phase difference detection unit 20 acquires defocus information by receiving light transmitted through the image sensor 10, the image sensor 10 is always irradiated with light from the subject while the defocus information is being acquired; photometry is therefore performed using the light incident on the image sensor 10 during autofocus. In this way, a separate photometric sensor is unnecessary, and since photometry can be performed before the release button 40b is fully pressed, the time from full press of the release button 40b until exposure is completed (hereinafter also referred to as the release time lag) can be shortened.
- Furthermore, unlike a configuration in which part of the light guided from the subject toward the image sensor is diverted by a mirror or the like to a phase difference detection unit provided outside the image sensor, here the focus state is detected by the phase difference detection unit 20 using the light guided to the imaging unit 1 as it is, so focus detection can be performed with very high accuracy.
- Contrast detection AF
- When the power switch 40a is turned on (step Sb1), communication between the camera body 4 and the interchangeable lens 7 is performed (step Sb2), the focus lens group 72 is positioned at the predetermined reference position (step Sb3), and in parallel the shutter unit 42 is opened (step Sb4); these steps, up until the release button 40b is half-pressed (step Sb5), are the same as steps Sa1 to Sa5 in the phase difference detection method AF.
- the body microcomputer 50 drives the focus lens group 72 via the lens microcomputer 80 (step Sb6). Specifically, the focus lens group 72 is driven so that the focus of the subject image moves in a predetermined direction along the optical axis (for example, the subject side).
- Next, the body microcomputer 50 obtains the contrast value of the subject image based on the output from the image sensor 10 taken in via the imaging unit control unit 52, and determines whether or not the contrast value has decreased (step Sb7). When the contrast value has decreased (YES), the process proceeds to step Sb8; when the contrast value has increased (NO), the process proceeds to step Sb9.
- A decreasing contrast value means that the focus lens group 72 is being driven in the direction opposite to the in-focus direction, so the focus lens group 72 is driven in reverse so that the focus of the subject image moves to the opposite side in the optical axis direction (for example, away from the subject) (step Sb8). Thereafter, it is determined whether or not a contrast peak has been detected (step Sb10), and while no contrast peak is detected (NO), the reverse driving of the focus lens group 72 (step Sb8) is repeated. When the contrast peak is detected (YES), the driving of the focus lens group 72 is stopped, the focus lens group 72 is moved to the position where the contrast value peaked, and the process proceeds to step Sa11.
- In step Sb9, the focus lens group 72 continues to be driven in the same direction, and it is determined whether or not a peak of the contrast value has been detected (step Sb10). While the contrast peak is not detected (NO), the driving of the focus lens group 72 is repeated (step Sb9). When the contrast peak is detected (YES), the driving of the focus lens group 72 is stopped, the focus lens group 72 is moved to the position where the contrast value peaked, and the process proceeds to step Sa11.
- That is, the focus lens group 72 is first driven tentatively (step Sb6); if the contrast value decreases, the drive direction of the focus lens group 72 is reversed to search for the peak of the contrast value (steps Sb8, Sb10), whereas if the contrast value increases, the focus lens group 72 is driven onward as it is to search for the peak of the contrast value (steps Sb9, Sb10).
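The tentative-drive / reverse / peak-search procedure of steps Sb6 to Sb10 is a hill climb, which can be sketched as follows. This is an illustrative model: `contrast_at` stands in for evaluating the image sensor output at a lens position, and the step size and iteration cap are assumptions.

```python
def contrast_af(contrast_at, start=0, step=1, limit=100):
    """Hill-climb to the contrast peak (steps Sb6-Sb10): drive one way
    tentatively; if contrast drops, reverse; stop at the position where
    the contrast value peaked (a rise followed by a fall)."""
    pos = start
    prev = contrast_at(pos)
    direction = step
    pos += direction                  # step Sb6: tentative drive
    cur = contrast_at(pos)
    if cur < prev:                    # moved away from focus: reverse (Sb8)
        direction = -direction
        pos, prev = start, contrast_at(start)
        pos += direction
        cur = contrast_at(pos)
    for _ in range(limit):            # keep driving while contrast rises (Sb9/Sb10)
        if cur < prev:
            return pos - direction    # previous position was the peak
        prev = cur
        pos += direction
        cur = contrast_at(pos)
    return pos
```

Whether the peak lies ahead of or behind the starting position, the initial tentative move tells the loop which way to climb, exactly as the text describes for the YES/NO branches of step Sb7.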
- Next, photometry is performed (step Sb11) and image blur detection is started (step Sb12).
- steps Sb11 and Sb12 are the same as steps Sa9 and Sa10 of the phase difference detection method AF.
- In step Sa11, the process waits until the release button 40b is fully pressed by the photographer.
- the flow after the release button 40b is fully pressed is the same as in the phase difference detection AF.
- This contrast detection method AF can capture the contrast peak directly, and unlike the phase difference detection method AF it does not require various correction calculations such as open-aperture correction (correction for the focus shift caused by the degree of opening of the aperture), so highly accurate focus performance can be obtained.
- On the other hand, since the contrast peak must be found by reciprocating the focus lens group 72 in the optical axis direction, it is necessary to remove the backlash generated in the drive system of the focus lens group.
- Hybrid AF
- The steps from when the power switch 40a is turned on until the release button 40b is half-pressed (steps Sc1 to Sc5) are the same as steps Sa1 to Sa5 in the phase difference detection method AF.
- When the release button 40b is half-pressed by the photographer (step Sc5), the body microcomputer 50 amplifies the output from the line sensor 24a of the phase difference detection unit 20 and then computes, with the arithmetic circuit, whether the image is in focus or out of focus (step Sc6). Furthermore, the body microcomputer 50 acquires defocus information by determining whether the defocus is front focus or back focus and how large the defocus amount is (step Sc7). Thereafter, the process proceeds to step Sc10.
- In parallel with steps Sc6 and Sc7, photometry is performed (step Sc8) and image blur detection is started (step Sc9).
- These steps Sc8 and Sc9 are the same as steps Sa9 and Sa10 of the phase difference detection method AF. Thereafter, the process proceeds to step Sc10. Alternatively, after step Sc9, the process may proceed to step Sa11 instead of step Sc10.
- In step Sc10, the body microcomputer 50 drives the focus lens group 72 based on the defocus information acquired in step Sc7.
- Then, the body microcomputer 50 determines whether or not a contrast peak has been detected (step Sc11). While the contrast peak is not detected (NO), the driving of the focus lens group 72 is repeated (step Sc10); when the contrast peak is detected (YES), the driving of the focus lens group 72 is stopped, the focus lens group 72 is moved to the position where the contrast value peaked, and the process proceeds to step Sa11.
- At this time, it is preferable to first move the focus lens group 72 at high speed based on the defocus direction and defocus amount calculated in step Sc7, and then detect the contrast peak while moving the focus lens group 72 at a lower speed.
- Note that, while in step Sa7 of the phase difference detection method AF the focus lens group 72 is moved to the position predicted as the in-focus position based on the defocus amount, in step Sc10 of the hybrid AF the focus lens group 72 is driven, based on the defocus amount, past the position predicted as the in-focus position, and the contrast peak is detected while the focus lens group 72 is being driven through the predicted in-focus position.
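The two phases described here (a fast coarse move based on the phase difference result, then a slow scan through the predicted position watching for the contrast peak) can be sketched together. This is a toy model under stated assumptions: `contrast_at` stands in for the image sensor contrast evaluation, and the margin, step, and function names are illustrative.

```python
def hybrid_af(predicted, contrast_at, fine_step=1, coarse_margin=3):
    """Hybrid AF sketch (step Sc10 onward): jump at high speed to just
    short of the in-focus position predicted from the defocus amount,
    then creep through it at low speed and stop where the contrast
    value peaks."""
    pos = predicted - coarse_margin       # coarse phase-difference move
    prev = contrast_at(pos)
    for _ in range(2 * coarse_margin + 10):
        pos += fine_step                  # slow scan through the prediction
        cur = contrast_at(pos)
        if cur < prev:                    # contrast fell: peak just passed
            return pos - fine_step
        prev = cur
    return pos
```

Because the coarse move lands near the peak, the slow scan covers only a few steps, which is why this scheme is faster than pure contrast detection AF while retaining its final accuracy.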
- In step Sa11, the process waits until the release button 40b is fully pressed by the photographer.
- the flow after the release button 40b is fully pressed is the same as in the phase difference detection AF.
- the defocus information is acquired by the phase difference detection unit 20, and the focus lens group 72 is driven based on the defocus information. Then, the position of the focus lens group 72 where the contrast value calculated based on the output from the image sensor 10 reaches a peak is detected, and the focus lens group 72 is positioned at this position.
- Since the defocus information can be detected before driving the focus lens group 72, there is no need to drive the focus lens group 72 tentatively as in the contrast detection method AF, and the processing time can be shortened.
- Since the focus is finally adjusted by the contrast detection method AF, it is possible to focus on a subject with a repetitive pattern or a subject with extremely low contrast with higher accuracy than with the phase difference detection method AF.
- Furthermore, although the hybrid AF includes phase difference detection, the defocus information is acquired by the phase difference detection unit 20 using light transmitted through the image sensor 10, so photometry by the image sensor 10 and acquisition of defocus information by the phase difference detection unit 20 can be performed in parallel. By performing photometry in parallel with the acquisition of defocus information, an increase in the processing time after the release button 40b is half-pressed can be prevented.
- In the phase difference detection AF and the hybrid AF described above, the aperture is stopped down after the release button 40b is fully pressed and immediately before exposure. A modified example in which the aperture is stopped down before autofocus, that is, before the release button 40b is fully pressed, will now be described.
- Phase difference detection AF
- First, the photographing operation of the camera system using the phase difference detection method AF according to this modification will be described with reference to FIG.
- The steps from when the power switch 40a is turned on until the release button 40b is half-pressed (steps Sd1 to Sd5) are the same as steps Sa1 to Sa5 in the phase difference detection method AF described above.
- When the release button 40b is half-pressed by the photographer (step Sd5), image blur detection is started (step Sd6), and in parallel with this, photometry is performed (step Sd7).
- These steps Sd6 and Sd7 are the same as steps Sa10 and Sa9 of the phase difference detection method AF, respectively.
- Next, an aperture value at the time of exposure is obtained based on the result of the photometry in step Sd7, and it is determined whether or not the obtained aperture value is larger than a predetermined aperture threshold value (step Sd8).
- When the obtained aperture value is larger than the predetermined aperture threshold (YES), the process proceeds to step Sd10; when the obtained aperture value is equal to or smaller than the predetermined aperture threshold (NO), the process proceeds to step Sd9.
- In step Sd9, the body microcomputer 50 drives the diaphragm unit 73 via the lens microcomputer 80 so as to realize the obtained aperture value.
- Here, the predetermined aperture threshold is set to the aperture value at which defocus information can still be acquired based on the output of the line sensor 24a of the phase difference detection unit 20. That is, when the aperture value obtained from the photometry result is larger than the aperture threshold, stopping the diaphragm unit 73 down to that aperture value would make it impossible for the phase difference detection unit 20 to acquire defocus information in the subsequent steps, so the process proceeds to step Sd10 without stopping down the diaphragm unit 73. On the other hand, when the aperture value obtained from the photometry result is equal to or smaller than the aperture threshold, the diaphragm unit 73 is stopped down to that aperture value and the process proceeds to step Sd10.
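The decision in steps Sd8/Sd9 amounts to a single threshold comparison, sketched below in APEX-style aperture values (larger Av = smaller opening). The representation and function name are illustrative assumptions; the patent states only the comparison against the threshold, not the units.

```python
def preset_aperture(metered_av, av_threshold):
    """Steps Sd8/Sd9: decide whether the aperture can be stopped down
    before autofocus. Returns the Av to apply now, or None, meaning
    stay open so the phase difference unit's line sensor still
    receives enough transmitted light (stop down later, in Sa14)."""
    if metered_av > av_threshold:
        return None          # too little light for the line sensor: defer
    return metered_av        # safe: stop down now and skip Sa14 later
```

Returning `None` corresponds to the YES branch (proceed to Sd10 with the aperture open); returning the metered value corresponds to the NO branch, which removes the stop-down step from the post-full-press sequence and shortens the release time lag.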
- In steps Sd10 to Sd12, similarly to steps Sa6 to Sa8 in the phase difference detection method AF described above, the body microcomputer 50 obtains defocus information based on the output from the line sensor 24a of the phase difference detection unit 20 (step Sd10), drives the focus lens group 72 based on the defocus information (step Sd11), and determines whether or not the subject is in focus (step Sd12). After focusing, the process proceeds to step Sa11.
- step Sa11 the process waits until the release button 40b is fully pressed by the photographer.
- the flow after the release button 40b is fully pressed is the same as the phase difference detection AF described above.
- step Sd8 Only when it is determined in step Sd8 that the aperture value obtained based on the photometric result is larger than the predetermined aperture threshold is the aperture unit 73 narrowed down in step Sa14. That is, when it is determined in step Sd8 that the aperture value obtained based on the photometric result is equal to or less than the predetermined aperture threshold, the aperture unit 73 has already been narrowed down in step Sd9, so there is no need to perform step Sa14.
- When the aperture value at the time of exposure obtained based on the photometric result is a value at which the phase difference detection method AF can still be performed, the aperture unit 73 is stopped down before autofocus rather than just before exposure. By doing so, it is not necessary to narrow down the aperture unit 73 after the release button 40b is fully pressed, and the release time lag can be shortened.
- steps Se1 to Se5 The operation from the time the power switch 40a is turned on until the release button 40b is half-pressed (steps Se1 to Se5) is the same as steps Sa1 to Sa5 in the phase difference detection AF described above.
- step Se5 When the release button 40b is half-pressed by the photographer (step Se5), image blur detection is started (step Se6), and in parallel with this, photometry is performed (step Se7).
- steps Se6 and Se7 are the same as steps Sa9 and Sa10 of the phase difference detection method AF.
- step Se8 an aperture value at the time of exposure is obtained based on the result of photometry in step Se7, and it is determined whether or not the obtained aperture value is larger than a predetermined aperture threshold value (step Se8).
- When the obtained aperture value is larger than the predetermined aperture threshold (YES), the process proceeds to step Se10, and when the obtained aperture value is equal to or less than the predetermined aperture threshold (NO), the process proceeds to step Se9.
- In step Se9, the body microcomputer 50 drives the diaphragm unit 73 via the lens microcomputer 80 so as to stop it down to the obtained aperture value.
- the predetermined aperture threshold is set to an aperture value at which the peak of the contrast value calculated from the output of the image sensor 10 can still be detected. That is, when the aperture value obtained from the photometric result is larger than the aperture threshold, stopping the aperture unit 73 down to that value would make it impossible to detect the contrast peak described later, so the process proceeds to step Se10 without narrowing the aperture unit 73. On the other hand, when the obtained aperture value is equal to or smaller than the aperture threshold, the aperture unit 73 is stopped down to that value and the process proceeds to step Se10.
- In steps Se10 to Se13, the body microcomputer 50 obtains defocus information based on the output from the line sensor 24a of the phase difference detection unit 20 (steps Se10, Se11), drives the focus lens group 72 based on the defocus information (step Se12), detects the contrast peak, and moves the focus lens group 72 to the position where the contrast value reached its peak (step Se13).
- step Sa11 the process waits until the release button 40b is fully pressed by the photographer.
- the flow after the release button 40b is fully pressed is the same as the normal phase difference detection AF described above.
- step Se8 Only when it is determined in step Se8 that the aperture value obtained based on the photometric result is larger than the predetermined aperture threshold is the aperture section 73 narrowed down in step Sa14. That is, when it is determined in step Se8 that the aperture value obtained based on the photometric result is equal to or smaller than the predetermined aperture threshold, the aperture unit 73 has already been narrowed down in step Se9, so there is no need to perform step Sa14.
- In this way, when possible, the diaphragm 73 is stopped down before autofocus. By doing so, it is not necessary to narrow down the diaphragm 73 after the release button 40b is fully pressed, and the release time lag can be shortened.
- the continuous shooting mode will be described with reference to FIGS.
- Here, the description assumes that the hybrid AF according to the modification is performed. However, the continuous shooting mode is not limited to the hybrid AF according to the modification, and can be adopted in any configuration such as the phase difference detection AF, the contrast detection AF, the hybrid AF, and the phase difference detection AF according to the modification.
- Steps Sf1 to Sf13 The same as Steps Se1 to Se13 in FIG.
- After focusing by the contrast detection method AF, the body microcomputer 50 stores in the memory unit the interval between the two subject images formed on the line sensor 24a at that time (that is, at the moment of focusing) (step Sf14).
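Step Sf14 can be sketched as follows using a toy one-dimensional line sensor. The peak finding is a deliberately simplified stand-in for the real separator optics of the phase difference detection unit 20; the function name and the data are hypothetical.

```python
def image_interval(line_sensor):
    """Return the distance between the two brightest samples on a toy
    1-D line sensor, a stand-in for the two subject images that the
    separator lenses form on the line sensor 24a."""
    # Indices of the two largest samples, in ascending index order.
    # Python's sorted() is stable, so ties resolve deterministically.
    a, b = sorted(sorted(range(len(line_sensor)),
                         key=lambda i: line_sensor[i])[-2:])
    return b - a

# Step Sf14 (sketch): remember the interval observed at the moment of
# contrast-AF focus as the in-focus reference for later frames.
reference_interval = image_interval([0, 9, 1, 0, 0, 9, 0])
```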
- step Sf15 the process waits until the photographer fully presses the release button 40b.
- the release button 40b is fully pressed by the photographer, exposure is performed in the same manner as in steps Sa12 to Sa17 in the phase difference detection method AF.
- Specifically, the body microcomputer 50 temporarily closes the shutter unit 42 (step Sf16), starts image blur correction (step Sf17), and, if the aperture unit 73 was not stopped down in step Sf9, narrows it down based on the photometric result (step Sf18). The shutter unit 42 is then opened (step Sf19), exposure is started (step Sf20), and after exposure the shutter unit 42 is closed (step Sf21).
- In step Sf22, it is determined whether or not the full press of the release button 40b has been released. When the release button 40b has been released (YES), the process proceeds to steps Sf29 and Sf30. When the release button 40b remains fully pressed (NO), the process proceeds to step Sf23 to perform continuous shooting.
- the body microcomputer 50 opens the shutter unit 42 (step Sf23) and performs phase difference detection AF (steps Sf24 to Sf26).
- the focus state of the subject image on the image sensor 10 is detected via the phase difference detection unit 20 (step Sf24), defocus information is acquired (step Sf25), and the focus lens group 72 is based on the defocus information. Is driven (step Sf26).
- defocus information is obtained by comparing the interval between two subject images formed on the line sensor 24a with a preset reference interval (step Sf11).
- In steps Sf24 and Sf25 after the release button 40b is fully pressed, however, the focus state and defocus information are obtained by comparison with the interval stored in step Sf14, that is, the interval between the two subject images formed on the line sensor 24a after focusing by the contrast detection method AF in the hybrid method AF.
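Steps Sf24 and Sf25 then reduce to comparing the current two-image interval against the stored reference. A sketch; the linear conversion factor k is an assumption (the patent does not specify the interval-to-defocus conversion), and the sign convention is illustrative.

```python
def defocus_from_interval(current_interval, reference_interval, k=1.0):
    """Sketch of steps Sf24/Sf25: convert the deviation of the current
    two-image interval on the line sensor 24a from the stored in-focus
    reference into a signed defocus amount.

    A positive/negative deviation indicates front or back focus, so
    both the defocus amount and its direction come out at once."""
    deviation = current_interval - reference_interval
    return k * deviation
```

A deviation of zero means the subject is already in focus and the focus lens group 72 need not move.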
- After performing the phase difference detection method AF in this manner, the body microcomputer 50 determines whether the timing has arrived for outputting a signal for starting exposure (that is, an exposure start signal) to the shutter control unit 51 and the imaging unit control unit 52 (step Sf27).
- the output timing of the exposure start signal is the continuous shooting timing in the continuous shooting mode.
- Until the output timing of the exposure start signal arrives, the phase difference detection AF is repeated (steps Sf24 to Sf26).
- When the output timing arrives, driving of the focus lens group 72 is stopped (step Sf28) and exposure is performed (step Sf20).
- Exposure is started either by the electronic shutter 11 sweeping out the charges of the light receiving portions 11b, 11b, ..., or by temporarily closing the shutter unit 42 and sweeping out the charges of the light receiving portions 11b, 11b, ....
- After the exposure is completed, it is determined again whether or not the full press of the release button 40b has been released (step Sf22), and the phase difference detection method AF and exposure are repeated as long as the release button 40b remains fully pressed (steps Sf23 to Sf28, Sf20, Sf21).
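The continuous-shooting loop of steps Sf22 to Sf28, Sf20, and Sf21 can be summarized as follows. The callables stand in for camera-body operations, and all names are illustrative assumptions.

```python
def continuous_shooting(still_pressed, phase_difference_af, expose,
                        frames_max=100):
    """Sketch of the loop in steps Sf22 to Sf28, Sf20, Sf21: as long as
    the release button 40b stays fully pressed, repeat phase difference
    AF between exposures and expose at each continuous-shooting timing.

    still_pressed, phase_difference_af, and expose are stand-ins for
    the body microcomputer 50's operations; frames_max is a safety
    bound for this sketch only."""
    frames = 0
    while frames < frames_max and still_pressed():
        phase_difference_af()   # steps Sf23 to Sf28
        expose()                # steps Sf20, Sf21
        frames += 1
    return frames
```

Because the AF between frames is the phase difference method, the defocus direction is known immediately and no focus-lens scan is needed between shots.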
- When the full press of the release button 40b is released, the image blur correction is ended (step Sf29), the aperture unit 73 is opened (step Sf30), and the shutter unit 42 is opened (step Sf31).
- After the reset is completed and the photographing sequence ends, the process returns to step Sa5 and waits until the release button 40b is half-pressed.
- When the power switch 40a is turned off, the body microcomputer 50 moves the focus lens group 72 to a predetermined reference position set in advance (step Sf33) and closes the shutter unit 42 (step Sf34). Then, the operations of the body microcomputer 50 and the various units in the camera body 4, as well as the lens microcomputer 80 and the various units in the interchangeable lens 7, are stopped.
- the phase difference detection AF can be performed between the exposures during the continuous shooting, so that high focus performance can be realized.
- Since the autofocus at this time is the phase difference detection method AF, the defocus direction can be acquired instantaneously, and focusing can be performed even in the short time between continuous shots.
- Even with the phase difference detection AF, it is not necessary to provide a movable mirror for phase difference detection as in the prior art, so the release time lag can be shortened and power consumption can be suppressed. Furthermore, since the prior art has a release time lag corresponding to the up and down movement of the movable mirror, when the subject is a moving object it is necessary to take the picture while predicting the movement of the subject during that release time lag. In this embodiment, since there is no release time lag corresponding to the raising and lowering of a movable mirror, it is possible to focus while tracking the movement of the subject until immediately before the exposure.
- Although the contrast detection method AF is used when the release button 40b is half-pressed in the above description, the autofocus for the first frame in the continuous shooting mode is not limited to the hybrid AF; either the phase difference detection AF or the contrast detection AF may be used.
- In that case, step Sf14 is not performed, and in steps Sf24 and Sf25 the interval between the two subject images formed on the line sensor 24a is compared with a preset reference interval to obtain the focus state and defocus information.
- Alternatively, it may be configured so that the phase difference detection AF is continued until the release button 40b is fully pressed, even after the subject has been focused.
- the camera 100 is configured to switch the autofocus method according to the contrast of the subject image.
- the camera 100 has a low contrast mode in which shooting is performed under a low contrast condition.
- the low contrast mode will be described with reference to FIG.
- the description will be made on the assumption that the hybrid AF is performed.
- However, the low contrast mode is not limited to the hybrid AF, and can be adopted in any configuration such as the phase difference detection AF, the contrast detection AF, the phase difference detection AF according to the modification, and the hybrid AF according to the modification.
- step Sg1 to Sg5 From the time when the power switch 40a is turned on until the release button 40b is half-pressed (steps Sg1 to Sg5), it is the same as the steps Sa1 to Sa5 in the phase difference detection AF.
- When the release button 40b is half-pressed by the photographer (step Sg5), the body microcomputer 50 amplifies the output from the line sensor 24a of the phase difference detection unit 20 and then processes it with an arithmetic circuit (step Sg6). It is then determined whether the scene is in a low contrast state (step Sg7). Specifically, it is determined, based on the output from the line sensor 24a, whether the contrast value is high enough to detect the positions of the two subject images formed on the line sensor 24a.
- When the contrast value is high enough (NO), the hybrid AF is performed; steps Sg8 to Sg10 are the same as steps Sc7, Sc10, and Sc11 in the hybrid AF.
- When the contrast value is not high enough to detect the positions of the two subject images (YES), it is determined that the scene is in a low contrast state, and the process proceeds to step Sg11 to perform the contrast detection AF. Steps Sg11 to Sg15 are the same as steps Sb6 to Sb10 in the contrast detection method AF.
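The branch of steps Sg6 and Sg7 can be sketched as a simple threshold selection. The names, the string return values, and the threshold are illustrative assumptions, not from the patent.

```python
def select_af(line_sensor_contrast, detect_threshold):
    """Sketch of steps Sg6/Sg7: if the output of the line sensor 24a
    has enough contrast to locate the two subject images, use the
    hybrid AF; otherwise the scene is treated as low contrast and the
    camera falls back to contrast detection AF."""
    if line_sensor_contrast >= detect_threshold:
        return "hybrid"            # steps Sg8 to Sg10
    return "contrast-detection"    # steps Sg11 to Sg15
```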
- step Sa11 After performing the hybrid method AF or the contrast detection method AF, the process proceeds to step Sa11.
- step Sg16 photometry is performed (step Sg16) and image blur detection is started (step Sg17).
- steps Sg16 and Sg17 are the same as steps Sa9 and Sa10 of the phase difference detection method AF. Thereafter, the process proceeds to Step Sa11.
- step Sa11 the process waits until the release button 40b is fully pressed by the photographer.
- the flow after the release button 40b is fully pressed is the same as the normal hybrid AF.
- As described above, the hybrid method AF is performed when the contrast at the time of photographing is high enough for the phase difference detection method AF, while the contrast detection method AF is performed when the contrast at the time of photographing is too low for the phase difference detection method AF. In the above configuration, the AF method is determined based on the output from the line sensor 24a, but the determination is not limited to this.
- For example, a contrast value may be obtained from the output of the image sensor 10, and it may be determined whether this contrast value is equal to or greater than a predetermined value. Here, the predetermined value is set to a contrast value at which the position of the subject image formed on the line sensor 24a can be detected.
- That is, the hybrid AF is performed when the contrast value obtained from the output of the image sensor 10 is equal to or greater than a value at which the in-focus state can be detected by the phase difference detection method, while the contrast detection method AF is performed when it is less than that value.
- In the above description, the hybrid method AF is performed when the in-focus state can be detected by the phase difference detection method, but the phase difference detection method AF may be performed instead.
- Thus, in the camera 100, both the phase difference detection AF (including the hybrid AF) and the contrast detection AF can be performed. Therefore, by selecting between the phase difference detection method AF and the contrast detection method AF according to the contrast, highly accurate focus performance can be realized.
- the camera 100 is configured to switch the autofocus method according to the type of the interchangeable lens 7 attached to the camera body 4.
- the AF switching function depending on the type of the interchangeable lens will be described with reference to FIG.
- the description will be made on the assumption that the hybrid AF is performed.
- However, the AF switching function by the interchangeable lens is not limited to the hybrid AF, and can be adopted in any configuration such as the phase difference detection AF, the contrast detection AF, the phase difference detection AF according to the modification, and the hybrid AF according to the modification.
- steps Sh1 to Sh5 The operation from the time the power switch 40a is turned on until the release button 40b is half-pressed (steps Sh1 to Sh5) is the same as steps Sa1 to Sa5 in the phase difference detection AF.
- step Sh5 When the release button 40b is half-pressed by the photographer (step Sh5), photometry is performed (step Sh6) and at the same time, image blur detection is started (step Sh7).
- steps Sh6 and Sh7 are the same as steps Sa9 and Sa10 of the phase difference detection method AF. Note that these photometry and image blur detection may be performed in parallel with an autofocus operation described later.
- the body microcomputer 50 determines whether the interchangeable lens 7 is a third-party reflective telephoto lens or STF (Smooth Trans-Focus) lens based on information from the lens microcomputer 80 (step Sh8).
- When the interchangeable lens 7 is a third-party reflective telephoto lens or an STF lens (YES), the process proceeds to step Sh13 to perform the contrast detection AF.
- Steps Sh13 to Sh17 are the same as steps Sb6 to Sb10 in the contrast detection method AF.
- When the interchangeable lens 7 is neither of these (NO), the process proceeds to step Sh9 to perform the hybrid AF; the subsequent steps from step Sh9 are the same as steps Sc6, Sc7, Sc10, and Sc11 in the hybrid AF.
- step Sa11 After performing the contrast detection AF or the hybrid AF, the process proceeds to step Sa11.
- step Sa11 the process waits until the release button 40b is fully pressed by the photographer.
- the flow after the release button 40b is fully pressed is the same as in the hybrid AF.
- That is, with a third-party reflective telephoto lens or an STF lens, the phase difference detection may not be performed with high accuracy, so the contrast detection AF is performed instead of the hybrid method AF (specifically, the phase difference detection method AF).
- When the interchangeable lens 7 is neither a third-party reflective telephoto lens nor an STF lens, the hybrid AF is performed. That is, the body microcomputer 50 determines whether there is a guarantee that the optical axis of the interchangeable lens 7 is aligned to the extent that the phase difference detection AF can be performed. Only an interchangeable lens 7 with this guarantee is subjected to the hybrid AF, while an interchangeable lens 7 without such a guarantee is subjected to the contrast detection AF.
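The lens-type check of step Sh8 can be sketched as follows. The dict keys, string values, and the rule that only third-party reflective telephoto and STF lenses are excluded are one reading of the text and are hypothetical, not a definitive specification.

```python
def af_for_lens(lens_info):
    """Sketch of step Sh8: lenses without a guarantee that their optics
    support accurate phase difference detection (modelled here as
    third-party reflective telephoto or STF lenses) get contrast
    detection AF; everything else gets hybrid AF."""
    third_party = lens_info.get("third_party", False)
    kind = lens_info.get("kind", "")
    if third_party and kind in ("reflective_telephoto", "stf"):
        return "contrast-detection"   # steps Sh13 to Sh17
    return "hybrid"                   # step Sh9 onward
```

In the real camera this information would come from the lens microcomputer 80 over the body-lens communication link rather than from a dict.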
- Thus, in the camera 100, both the phase difference detection AF (including the hybrid AF) and the contrast detection AF can be performed. Therefore, by selecting between the phase difference detection method AF and the contrast detection method AF according to the type of the interchangeable lens 7, high-precision focus performance can be realized.
- In the above description, whether to perform the hybrid AF or the contrast detection AF is determined depending on whether the interchangeable lens 7 is a third-party reflective telephoto lens or an STF lens, but the determination is not limited to this. Regardless of whether the lens is a reflective telephoto lens or an STF lens, it may be configured to determine whether to perform the hybrid AF or the contrast detection AF depending on whether or not the interchangeable lens 7 is made by a third party.
- Also, in the above description, the hybrid AF is performed when the interchangeable lens 7 is neither a third-party reflective telephoto lens nor an STF lens, but the phase difference detection AF may be performed instead.
- Embodiment 3 of the Invention Next, a camera as an imaging apparatus according to Embodiment 3 of the present invention will be described.
- the camera 200 according to the third embodiment includes a finder optical system 6 as shown in FIG.
- The camera body 204 includes a finder optical system 6 for viewing the subject image via the finder 65, and a semi-transparent quick return mirror 46 that guides incident light from the interchangeable lens 7 to the finder optical system 6.
- The camera body 204 has a viewfinder shooting mode in which shooting is performed while viewing the subject image via the finder optical system 6, and a live view shooting mode in which shooting is performed while viewing the subject image via the image display unit 44.
- the camera body 204 is provided with a finder mode setting switch 40g. When the finder mode setting switch 40g is turned ON, the finder shooting mode is set, and when the finder mode setting switch 40g is turned OFF, the live view shooting mode is set.
- The finder optical system 6 includes a finder screen 61 on which reflected light from the quick return mirror 46 forms an image, a pentaprism 62 that converts the subject image projected on the finder screen 61 into an erect image, an eyepiece 63 for magnified viewing of the subject image, an in-finder display unit 64 for displaying various information within the viewfinder field, and a finder 65 provided on the back side of the camera body 204.
- the subject image formed on the finder screen 61 can be observed from the finder 65 through the pentaprism 62 and the eyepiece lens 63.
- The body control unit 205 further includes a mirror control unit 260 that controls the flip-up drive of the quick return mirror 46, described later, based on a control signal from the body microcomputer 50.
- the quick return mirror 46 is a semi-transmissive mirror that can reflect and transmit incident light.
- the quick return mirror 46 is in front of the shutter unit 42 and reflects on the optical path X from the subject toward the imaging unit 1 (see the solid line in FIG. 21) and the optical path. It is configured to be rotatable between a retracted position outside X and close to the viewfinder optical system 6 (see the two-dot chain line in FIG. 21).
- the quick return mirror 46 divides incident light into reflected light that is reflected toward the viewfinder optical system 6 and transmitted light that is transmitted to the back side of the quick return mirror 46 at the reflection position.
- the quick return mirror 46 is disposed in front of the shutter unit 42 (that is, on the subject side), and is supported rotatably around an axis Y extending horizontally in front of the upper portion of the shutter unit 42.
- the quick return mirror 46 is biased toward the retracted position by a biasing spring (not shown).
- the quick return mirror 46 is moved to the reflection position when the urging spring is wound up by a motor (not shown) that opens and closes the shutter unit 42.
- the quick return mirror 46 moved to the reflection position is locked at the reflection position by an electromagnet or the like. Then, by releasing this locking, the quick return mirror 46 is rotated to the retracted position by the force of the biasing spring.
- the quick return mirror 46 when a part of the incident light is guided to the finder screen 61, the quick return mirror 46 is positioned at the reflection position by winding up the urging spring with a motor. On the other hand, when all of the incident light is guided to the imaging unit 1, the quick return mirror 46 is rotated to the retracted position by the elastic force of the biasing spring by releasing the locking by the electromagnet or the like.
- A light shielding plate 47 is connected to the quick return mirror 46 as shown in FIG.
- the light shielding plate 47 operates in conjunction with the quick return mirror 46, and covers the quick return mirror 46 from below (on the optical path X side from the subject to the imaging unit 1) when the quick return mirror 46 is located at the retracted position. It is configured as follows. By doing so, the light incident from the finder optical system 6 is prevented from reaching the imaging unit 1 when the quick return mirror 46 is located at the retracted position.
- Specifically, the light shielding plate 47 includes a first light shielding plate 48 pivotally connected to the end of the quick return mirror 46 on the side opposite to the rotation axis Y, and a second light shielding plate 49 rotatably connected to the first light shielding plate 48.
- the first light shielding plate 48 has a first cam follower 48a.
- the camera body 204 is provided with a first cam groove 48b with which the first cam follower 48a is engaged.
- the second light shielding plate 49 has a second cam follower 49a.
- the camera body 204 is provided with a second cam groove 49b with which the second cam follower 49a is engaged.
- the first light shielding plate 48 moves following the quick return mirror 46 and the second light shielding plate 49 moves following the first light shielding plate 48.
- the first and second light shielding plates 48 and 49 are interlocked with the quick return mirror 46 while the first and second cam followers 48a and 49a are guided by the first and second cam grooves 48b and 49b, respectively.
- When the quick return mirror 46 is located at the retracted position, the first and second light shielding plates 48 and 49 are aligned into a single flat plate below the quick return mirror 46 as shown in FIG. In this state, light is blocked between the quick return mirror 46 and the shutter unit 42, that is, the imaging unit 1. At this time, the first and second light shielding plates 48 and 49 are located outside the optical path X, as is the quick return mirror 46, and therefore do not affect the light incident on the imaging unit 1 from the subject.
- When the quick return mirror 46 moves from the retracted position to the reflection position, the first and second light shielding plates 48 and 49 are bent from the flat plate state as shown in FIG. 22(B), and when the quick return mirror 46 has rotated to the reflection position, they are bent to the extent that they face each other, as shown in FIG. 22(C).
- In this state, the first and second light shielding plates 48 and 49 are located outside the optical path X and on the opposite side of the finder screen 61 across the optical path X. Therefore, when the quick return mirror 46 is located at the reflection position, the first and second light shielding plates 48 and 49 do not affect the light reflected by the quick return mirror 46 toward the finder optical system 6 or the light transmitted through the quick return mirror 46.
- Thus, in the finder photographing mode, the subject image can be visually recognized via the finder optical system 6 before photographing while light is allowed to reach the imaging unit 1; at the time of photographing, the light from the subject can be guided to the imaging unit 1, and the light shielding plate 47 prevents light entering from the finder optical system 6 from reaching the imaging unit 1.
- Also in the live view shooting mode, the light shielding plate 47 prevents light entering from the finder optical system 6 from reaching the imaging unit 1.
- the camera 200 configured as described above has two shooting modes, namely, a finder shooting mode and a live view shooting mode, which differ in how the subject is visually recognized.
- operations of the two shooting modes of the camera 200 will be described.
- The operation from when the power switch 40a is turned on (step Si1), through the half press of the release button 40b by the photographer (step Si5) and its full press (step Si11), to the temporary closing of the shutter unit 42 (step Si12) is basically the same as steps Sa1 to Sa12 in the phase difference detection AF according to the second embodiment.
- the quick return mirror 46 is located at the reflection position on the optical path X. Therefore, a part of the light entering the camera body 204 is reflected and enters the finder screen 61.
- the light incident on the finder screen 61 is formed as a subject image.
- This subject image is converted into an erect image by the pentaprism 62 and enters the eyepiece lens 63. That is, the subject image is not displayed on the image display unit 44 as in the second embodiment, but the photographer can observe an erect image of the subject through the eyepiece lens 63.
- the image display unit 44 displays various types of information related to shooting, not the subject image.
- When the release button 40b is pressed halfway by the photographer (step Si5), various information related to photographing (such as the information on AF and photometry described below) is displayed on the in-finder display unit 64 observed through the eyepiece 63.
- the photographer can check various information related to photographing not only by the image display unit 44 but also by the finder display unit 64.
- Since the quick return mirror 46 is semi-transmissive, a part of the light incident on the camera body 204 is guided to the finder optical system 6 by the quick return mirror 46, while the remaining light passes through the quick return mirror 46 and is incident on the shutter unit 42. Then, when the shutter unit 42 is opened (step Si4), the light transmitted through the quick return mirror 46 enters the imaging unit 1. As a result, it is possible to perform autofocus (steps Si6 to Si8) and photometry (step Si9) with the imaging unit 1 while making it possible to visually recognize the subject image via the finder optical system 6.
- That is, the phase difference detection AF can be performed based on the output from the phase difference detection unit 20 of the imaging unit 1, and in parallel with this, photometry can be performed in step Si9 based on the output from the imaging element 10 of the imaging unit 1.
- Note that the phase detection width of the phase difference focus detection unit differs between when the quick return mirror 46 is retracted from the subject image optical path in the imaging state and when the quick return mirror 46 is positioned at the reflection position. Therefore, in the finder shooting mode, in which the quick return mirror 46 is inserted in the subject image optical path, the defocus information is output with a phase detection width changed by a predetermined amount from the phase detection width in the phase difference focus detection of the first embodiment (that is, the phase difference focus detection in the hybrid AF in the live view shooting mode described later). The phase detection width is the reference phase difference for which the defocus amount is calculated as 0, that is, for determining focus.
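The correction described above amounts to shifting the zero-defocus reference interval while the semi-transparent mirror sits in the optical path. A sketch with illustrative numbers; the offset, the conversion factor k, and the sign convention are assumptions, not values from the patent.

```python
def defocus_with_mirror(interval, base_width, mirror_in_path,
                        mirror_offset, k=1.0):
    """Sketch of the phase-detection-width correction: the reference
    two-image interval treated as defocus 0 (the phase detection
    width) is shifted by a predetermined amount whenever the quick
    return mirror 46 is inserted in the subject image optical path."""
    width = base_width + (mirror_offset if mirror_in_path else 0.0)
    return k * (interval - width)
```

With the mirror retracted (live view mode) the base width alone determines focus; with the mirror at the reflection position (finder mode) the same measured interval maps to a different defocus amount.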
- the steps Si6 to Si8 for performing the phase difference detection method AF are the same as the steps Sa6 to Sa8 in the phase difference detection method AF of the second embodiment.
- step Si9 the amount of light incident on the image sensor 10 is measured by the image sensor 10.
- At this time, the body microcomputer 50 obtains the amount of light from the subject by correcting the output from the image sensor 10 based on the reflection characteristics of the quick return mirror 46.
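This photometry correction can be sketched as follows. The transmittance value and function name are hypothetical; the patent only states that the correction is based on the mirror's reflection characteristics.

```python
def subject_light(sensor_output, mirror_transmittance):
    """Sketch of the correction in step Si9: in finder mode the image
    sensor 10 sees only the fraction of subject light transmitted
    through the semi-transparent quick return mirror 46, so dividing
    by that (assumed known) transmittance recovers the true amount of
    light from the subject."""
    if not 0.0 < mirror_transmittance <= 1.0:
        raise ValueError("transmittance must be in (0, 1]")
    return sensor_output / mirror_transmittance
```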
- After the release button 40b is fully pressed by the photographer (step Si11) and the shutter unit 42 is once closed (step Si12), image blur correction is started (step Si13) and the aperture unit 73 is stopped down (step Si14).
- Step Si15 the quick return mirror 46 is flipped up to the retracted position.
- steps Si16 to Si18 exposure is performed as in steps Sa15 to Sa17 in the phase difference detection method AF of the second embodiment.
- the quick return mirror 46 is moved to the reflection position in step Si21 in parallel with the end of the image blur correction (step Si19) and the opening of the diaphragm 73 (step Si20). By doing so, the photographer can again view the subject image via the finder optical system 6.
- step Si22 the shutter unit 42 is opened (step Si22).
- the process returns to step Si5 and waits until the release button 40b is half-pressed.
- steps Si23 to Si25 after the power switch 40a is turned off are the same as steps Sa21 to Sa23 in the phase difference detection method AF of the second embodiment.
- As described above, since the imaging unit 1 includes the phase difference detection unit 20, which detects the phase difference using light transmitted through the imaging element 10, making the quick return mirror 46 semi-transmissive so that a part of the light incident on it reaches the imaging unit 1 makes it possible to perform the phase difference detection method AF and photometry in parallel while allowing the subject image to be visually recognized via the finder optical system 6. By doing so, it is not necessary to separately provide a reflection mirror for phase difference detection AF and a sensor for photometry, and photometry can be performed in parallel with autofocus, so the release time lag can be shortened.
- Steps Sj1 to Sj4 cover the operation from the time the power switch 40a is turned on until the shutter unit 42 is opened.
- In step Sj5, the body microcomputer 50 flips the quick return mirror 46 up to the retracted position.
- Light incident on the camera body 4 from the subject then passes through the shutter unit 42 without being diverted to the finder optical system 6, further passes through the IR-cut/OLPF 43, and enters the imaging unit 1.
- The subject image formed on the imaging unit 1 is displayed on the image display unit 44, so the photographer can observe the subject image via the image display unit 44. Part of the light incident on the imaging unit 1 passes through the image sensor 10 and enters the phase difference detection unit 20.
- When the release button 40b is pressed halfway by the photographer (step Sj6), hybrid AF is performed, unlike in the finder shooting mode.
- Steps Sj7, Sj8, Sj11, and Sj12 of the hybrid AF are the same as steps Sc6, Sc7, Sc10, and Sc11 of the hybrid AF according to the second embodiment.
- The AF method is not limited to hybrid AF; contrast detection AF or phase difference detection AF may be performed instead.
- In parallel, photometry is performed (step Sj9) and image blur detection is started (step Sj10); these are the same as steps Sc8 and Sc9 of the hybrid AF according to the second embodiment.
- The operation from the time the photographer fully presses the release button 40b (step Sj13) until exposure and reset are completed (step Sj22) is basically the same as steps Si11 to Si22 of the finder shooting mode, except that there is no step of moving the quick return mirror 46 to the retracted position after the shutter unit 42 is closed (corresponding to step Si15) and no step of moving the quick return mirror 46 to the reflection position after exposure is completed (corresponding to step Si21).
- When the power switch 40a is turned off (step Sj23), the focus lens group 72 is moved to the reference position (step Sj24) and the shutter unit 42 is closed (step Sj25).
- In step Sj26, the quick return mirror 46 is moved to the reflection position. Thereafter, the body microcomputer 50 and the various units in the camera body 204, as well as the lens microcomputer 80 and the various units in the interchangeable lens 7, stop operating.
- The shooting operation of the camera system in this live view shooting mode is the same as that of the camera 100 according to the second embodiment, except for the operation of the quick return mirror 46. That is, although hybrid AF has been described above, the various shooting operations of the second embodiment can also be performed, with the same operations and effects.
- In the above, the finder optical system 6 is provided, but the present invention is not limited to this.
- For example, an EVF (Electronic View Finder) may be provided instead. That is, a small image display unit composed of a liquid crystal or the like is disposed in the camera body 204 at a position viewable through the viewfinder, and the image data acquired by the imaging unit 1 is displayed on it. This makes it possible to shoot while looking through the viewfinder without providing a complicated finder optical system 6. With such a configuration, the quick return mirror 46 is unnecessary.
- In that case, the shooting operation is the same as that of the camera 100 according to the second embodiment, except that there are two image display units.
- In the second and third embodiments, configurations in which the imaging unit 1 is mounted on a still camera have been described, but the present invention is not limited to this: the imaging unit 1 can also be mounted on a video camera.
- When the power switch 40a is turned on, the diaphragm unit and the shutter unit are opened, and the image sensor 10 of the imaging unit 1 starts capturing images. Photometry and white balance adjustment optimal for live view display are then performed, and the live view image is displayed on the image display unit. In parallel with imaging by the image sensor 10, the focus state is detected based on the output of the phase difference detection unit 20 built into the imaging unit 1, and the focus lens group is driven continuously to follow the movement of the subject. While the live view display and the phase difference detection AF continue in this way, the camera waits for the REC button to be pressed.
- Once recording starts, the image data acquired by the image sensor 10 is recorded while phase difference detection AF is repeated. A focused state can thus be maintained at all times, without the minute back-and-forth driving of the focus lens along the optical axis (wobbling) required by conventional digital video cameras, and without driving an actuator, such as a motor, that imposes a heavy load in terms of power.
- Specifically, after the shutter unit 42 is opened in step Sa4, the phase difference focus detection of step Sa6 and the focus lens drive of step Sa7 are performed repeatedly. In parallel, the determination of step Sa5, the photometry of step Sa9, the start of image blur detection of step Sa10, and the determination of step Sa11 are performed.
- By combining this with the display of the live view image, the live view image can be displayed in a focused state. Such an operation may be provided in the camera as an “always AF mode”, and the “always AF mode” may be switchable on and off.
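Because phase difference detection reports both the direction and the amount of defocus, the “always AF mode” loop can drive the focus lens straight toward the in-focus position with no wobbling. Below is a minimal numeric sketch of one such loop, assuming a hypothetical linear conversion factor from line-sensor phase shift to lens travel; all function names and values are invented for illustration.

```python
def defocus_from_phase(phase_shift_px, k=0.05):
    """Convert a line-sensor phase shift (pixels) into a lens drive amount.
    The conversion factor k is an illustrative placeholder, not a real
    calibration value."""
    return k * phase_shift_px

def always_af_step(lens_position, phase_shift_px):
    """One iteration of the continuous AF loop: move the lens directly by
    the estimated defocus, with no back-and-forth contrast search."""
    return lens_position + defocus_from_phase(phase_shift_px)

# As the loop converges on a subject, the measured phase shift shrinks
# toward zero and the lens settles without oscillating.
pos = 10.0
for shift in (4.0, 2.0, 1.0, 0.0):
    pos = always_af_step(pos, shift)
print(round(pos, 2))
```

The contrast with contrast detection AF is that the latter must repeatedly overshoot and compare sharpness values, which is exactly the wobbling this loop avoids.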
- A camera equipped with the imaging unit 1 has been described as an example of a camera that can simultaneously perform exposure of the image sensor and phase difference detection by the phase difference detection section.
- The present invention is not limited to this. For example, the camera may guide subject light to both the image sensor and the phase difference detection section by means of a light separation element (for example, a prism or a semi-transmissive mirror).
- Alternatively, the camera may be one in which some of the microlenses of the image sensor serve as separator lenses, arranged so that the pupil-divided subject light can each be received by a light-receiving section.
- The present invention is useful for an imaging apparatus including an image sensor that performs photoelectric conversion.
Abstract
Description
10, 210, 310 Image sensor
11a Substrate
11b Light-receiving section
17 Transmission section (thin-walled section)
20, 420 Phase difference detection unit (phase difference detection section)
21a Condenser lens
23a Separator lens
24a Line sensor (sensor)
31 Package (holding section)
31c Opening (passage section)
100, 200 Camera (imaging apparatus)
FIG. 1 shows an imaging unit 1 as an imaging apparatus according to the present invention. The imaging unit 1 includes an image sensor 10 for converting a subject image into an electric signal, a package 31 for holding the image sensor 10, and a phase difference detection unit 20 for performing phase-difference focus detection.
Here,
Rk: R-pixel correction amount in the transmission section 17 minus the R-pixel correction amount outside the transmission section 17
Gk: G-pixel correction amount in the transmission section 17 minus the G-pixel correction amount outside the transmission section 17
Bk: B-pixel correction amount in the transmission section 17 minus the B-pixel correction amount outside the transmission section 17.
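The quantities Rk, Gk, and Bk above are per-color differences: the correction amount applied to pixels over the transmission section 17 minus that applied to pixels elsewhere. A small numeric sketch follows; the gain values are invented purely for illustration.

```python
def correction_delta(gain_in_section, gain_elsewhere):
    """Rk / Gk / Bk: per-color correction amount over the transmission
    section minus the correction amount outside it."""
    return gain_in_section - gain_elsewhere

# Pixels above the thinned transmission section pass part of their light on
# to the phase difference detection unit, so they need a larger correction
# gain than pixels elsewhere. These gains are illustrative placeholders.
Rk = correction_delta(1.30, 1.00)
Gk = correction_delta(1.25, 1.00)
Bk = correction_delta(1.35, 1.00)
print(round(Rk, 2), round(Gk, 2), round(Bk, 2))
```

Because the light loss generally differs per color, the three deltas differ, which is why the correction is defined separately for R, G, and B rather than as a single gain.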
Next, a camera as an imaging apparatus according to Embodiment 2 of the present invention will be described.
The camera body 4 includes the imaging unit 1 according to Embodiment 1, which acquires the subject image as a captured image; a shutter unit 42 that adjusts the exposure state of the imaging unit 1; an IR-cut/OLPF (Optical Low Pass Filter) 43 that removes infrared light from the subject image incident on the imaging unit 1 and reduces moiré; an image display unit 44 composed of a liquid crystal monitor that displays captured images, live view images, and various information; and a body control unit 5.
The interchangeable lens 7 constitutes an imaging optical system that forms the subject image on the imaging unit 1 in the camera body 4, and mainly includes a focus adjustment section 7A that performs focusing, an aperture adjustment section 7B that adjusts the aperture, a lens-side image blur correction section 7C that corrects image blur by adjusting the optical path, and a lens control unit 8 that controls the operation of the interchangeable lens 7.
The camera 100 configured in this way has various shooting modes and functions. These shooting modes and functions, together with the operations performed in them, are described below.
When the release button 40b is pressed halfway, the camera 100 focuses by AF. As this AF, the camera has three autofocus functions: phase difference detection AF, contrast detection AF, and hybrid AF. The photographer can select among these three functions by operating the AF setting switch 40c provided on the camera body 4.
First, the shooting operation of the camera system using phase difference detection AF will be described with reference to FIGS. 11 and 12.
Next, the shooting operation of the camera system using contrast detection AF will be described with reference to FIG. 13.
Then, the shooting operation of the camera system using hybrid AF will be described with reference to FIG. 14.
In the above description, the aperture is stopped down after the release button 40b is fully pressed and immediately before exposure. Below, a modification is described in which, for phase difference detection AF and hybrid AF, the aperture is stopped down before the release button 40b is fully pressed and, further, before autofocus.
Specifically, first, the shooting operation of the camera system using the phase difference detection AF according to the modification will be described with reference to FIG. 15.
Next, the shooting operation of the camera system using the hybrid AF according to the modification will be described with reference to FIG. 16.
In the above description, one image is shot each time the release button 40b is fully pressed; the camera 100 also has a continuous shooting mode in which multiple images are shot with a single full press of the release button 40b.
The camera 100 according to this embodiment is configured to switch the autofocus method according to the contrast of the subject image. That is, the camera 100 has a low-contrast mode for shooting under low-contrast conditions.
Further, the camera 100 according to this embodiment is configured to switch the autofocus method according to the type of interchangeable lens 7 attached to the camera body 4.
Next, a camera as an imaging apparatus according to Embodiment 3 of the present invention will be described.
In addition to the configuration of the camera body 4 of Embodiment 1, the camera body 204 further includes a finder optical system 6 for viewing the subject image through the finder 65, and a semi-transmissive quick return mirror 46 that guides light incident from the interchangeable lens 7 to the finder optical system 6.
The camera 200 configured in this way has two shooting modes that differ in how the subject is viewed: a finder shooting mode and a live view shooting mode. The operations of these two shooting modes are described below.
First, the shooting operation of the camera system in the finder shooting mode will be described with reference to FIGS. 23 and 24.
Next, the shooting operation of the camera system in the live view shooting mode will be described with reference to FIGS. 25 and 26.
The present invention may also be configured as follows with respect to the above embodiments.
…may be used. Alternatively, the camera may be one in which some of the microlenses of the image sensor serve as separator lenses, arranged so that the pupil-divided subject light can each be received by a light-receiving section.
Claims (8)
- An imaging apparatus comprising: an image sensor that receives light and performs photoelectric conversion, and that is configured to let light pass through it;
a holding section that holds the image sensor; and
a phase difference detection section that receives light having passed through the image sensor and performs phase difference detection,
wherein the holding section is provided with a passage section that lets the light having passed through the image sensor pass through, and
the light having passed through the image sensor passes through the holding section via the passage section and enters the phase difference detection section. - The imaging apparatus according to claim 1, wherein
the image sensor further includes a light-receiving section and a substrate on which the light-receiving section is disposed, and
the substrate lets light pass through. - The imaging apparatus according to claim 2, wherein
the substrate has a thin-walled section formed thinner than its peripheral portion and lets light pass through the thin-walled section. - The imaging apparatus according to any one of claims 1 to 3, wherein
the passage section is a through hole formed in the holding section. - The imaging apparatus according to claim 3, wherein
the passage section is a through hole formed in the holding section,
a plurality of the thin-walled sections are formed,
the holding section is formed with through holes in a number corresponding to the thin-walled sections, and
the imaging apparatus includes phase difference detection sections in a number corresponding to the through holes. - The imaging apparatus according to claim 4 or 5, wherein
the phase difference detection section includes a separator lens that divides the light having passed through the passage section and a sensor that detects the phase difference of the light divided by the separator lens, and
the separator lens is fitted into the through hole. - The imaging apparatus according to claim 4 or 5, wherein
the phase difference detection section includes a condenser lens that condenses the light having passed through the passage section, a separator lens that divides the light condensed by the condenser lens, and a sensor that detects the phase difference of the light divided by the separator lens, and
the condenser lens is fitted into the through hole. - The imaging apparatus according to claim 7, wherein
the separator lens, the condenser lens, and the sensor are integrally unitized and are attached to the holding section in the unitized state.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/918,662 US8078047B2 (en) | 2008-02-22 | 2009-02-20 | Imaging apparatus |
JP2009554230A JP4902890B2 (ja) | 2008-02-22 | 2009-02-20 | 撮像装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008041573 | 2008-02-22 | ||
JP2008-041573 | 2008-02-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009104415A1 true WO2009104415A1 (ja) | 2009-08-27 |
Family
ID=40985301
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/000740 WO2009104415A1 (ja) | 2008-02-22 | 2009-02-20 | 撮像装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8078047B2 (ja) |
JP (1) | JP4902890B2 (ja) |
WO (1) | WO2009104415A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101952759B (zh) * | 2008-02-22 | 2012-12-19 | 松下电器产业株式会社 | 摄像装置 |
JP5097275B2 (ja) * | 2008-10-28 | 2012-12-12 | パナソニック株式会社 | 撮像ユニット |
JP5709532B2 (ja) | 2011-01-05 | 2015-04-30 | キヤノン株式会社 | 自動合焦装置及びそれを有するレンズ装置及び撮像システム |
US10148864B2 (en) | 2015-07-02 | 2018-12-04 | Pixart Imaging Inc. | Imaging device having phase detection pixels and regular pixels, and operating method thereof |
US9978154B2 (en) * | 2015-07-02 | 2018-05-22 | Pixart Imaging Inc. | Distance measurement device base on phase difference and distance measurement method thereof |
KR20210034822A (ko) * | 2019-09-23 | 2021-03-31 | 삼성전자주식회사 | 디스플레이 및 카메라 장치를 포함하는 전자 장치 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5962809A (ja) * | 1982-10-04 | 1984-04-10 | Olympus Optical Co Ltd | 焦点検出装置 |
JPH10161014A (ja) * | 1996-11-26 | 1998-06-19 | Kyocera Corp | 2次元センサを用いた自動焦点検出装置 |
JPH11352394A (ja) * | 1998-06-09 | 1999-12-24 | Minolta Co Ltd | 焦点検出装置 |
JP2004046132A (ja) * | 2002-05-17 | 2004-02-12 | Olympus Corp | 自動焦点調節装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6768867B2 (en) * | 2002-05-17 | 2004-07-27 | Olympus Corporation | Auto focusing system |
JP2005175976A (ja) | 2003-12-12 | 2005-06-30 | Canon Inc | 多層フォトダイオードを有する撮像素子を用いた撮像装置のオートフォーカスシステム |
JP2006106435A (ja) * | 2004-10-06 | 2006-04-20 | Canon Inc | 光学機器 |
JP2007135140A (ja) | 2005-11-14 | 2007-05-31 | Konica Minolta Photo Imaging Inc | 撮像装置 |
JP4834394B2 (ja) | 2005-12-09 | 2011-12-14 | キヤノン株式会社 | 撮像装置およびその制御方法 |
-
2009
- 2009-02-20 WO PCT/JP2009/000740 patent/WO2009104415A1/ja active Application Filing
- 2009-02-20 US US12/918,662 patent/US8078047B2/en not_active Expired - Fee Related
- 2009-02-20 JP JP2009554230A patent/JP4902890B2/ja not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5962809A (ja) * | 1982-10-04 | 1984-04-10 | Olympus Optical Co Ltd | 焦点検出装置 |
JPH10161014A (ja) * | 1996-11-26 | 1998-06-19 | Kyocera Corp | 2次元センサを用いた自動焦点検出装置 |
JPH11352394A (ja) * | 1998-06-09 | 1999-12-24 | Minolta Co Ltd | 焦点検出装置 |
JP2004046132A (ja) * | 2002-05-17 | 2004-02-12 | Olympus Corp | 自動焦点調節装置 |
Also Published As
Publication number | Publication date |
---|---|
JP4902890B2 (ja) | 2012-03-21 |
JPWO2009104415A1 (ja) | 2011-06-23 |
US20100329656A1 (en) | 2010-12-30 |
US8078047B2 (en) | 2011-12-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09712017 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009554230 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12918662 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09712017 Country of ref document: EP Kind code of ref document: A1 |