US20130044246A1 - Imaging apparatus - Google Patents
- Publication number
- US20130044246A1 (application US 13/657,239)
- Authority
- US
- United States
- Prior art keywords
- phase difference
- difference detection
- section
- imaging device
- lens
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G02B7/28—Systems for automatic generation of focusing signals
- G03B13/36—Autofocus systems
- G03B19/12—Reflex cameras with single objective and a movable reflector or a partly-transmitting mirror
- H01L27/14618—Imager structures; containers
- H01L27/14621—Colour filter arrangements
- H01L27/14623—Optical shielding
- H01L27/14627—Microlenses
- H01L27/14645—Colour imagers (photodiode arrays; MOS imagers)
- H01L27/14685—Process for coatings or optical elements
- H01L27/14843—Area CCD imagers; interline transfer
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/672—Focus control based on the phase difference signals
- H04N23/673—Focus control based on contrast or high frequency components of image signals, e.g. hill climbing method
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
- H04N25/134—Arrangement of colour filter arrays [CFA] based on three different wavelength filter elements
Definitions
- the present invention has been devised in view of the above, and it is therefore an object of the present invention to improve the convenience of an imaging apparatus including an imaging device and a phase difference detection section in relation to performing various types of processing using the imaging device and phase difference detection using the phase difference detection section.
- FIG. 2 is a cross-sectional view of an imaging unit.
- FIG. 3 is a cross-sectional view of an imaging device.
- FIG. 10 is a cross-sectional view illustrating a cross section of the imaging unit of another variation, which is perpendicular to the cross section corresponding to FIG. 2 .
- the release button 40 b operates as a two-stage switch. Specifically, autofocusing, AE (Automatic Exposure) or the like, which will be described later, is performed by pressing the release button 40 b halfway down, and releasing is performed by pressing the release button 40 b all the way down.
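The two-stage behaviour of the release button 40 b can be sketched as follows; the class and method names here are illustrative, not taken from the patent:

```python
# Sketch of the two-stage release button: a half-press triggers
# autofocusing and AE, a full press additionally performs releasing.
class Camera:
    def __init__(self):
        self.log = []

    def autofocus(self):
        self.log.append("AF")

    def auto_exposure(self):
        self.log.append("AE")

    def release(self):
        self.log.append("release")

    def press_release_button(self, stage):
        """stage 1 = halfway down, stage 2 = all the way down."""
        if stage >= 1:
            self.autofocus()      # AF on half-press
            self.auto_exposure()  # AE on half-press
        if stage == 2:
            self.release()        # releasing on full press

cam = Camera()
cam.press_release_button(1)   # half-press: AF and AE only
cam.press_release_button(2)   # full press: AF/AE then release
print(cam.log)                # → ['AF', 'AE', 'AF', 'AE', 'release']
```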
- the lens microcomputer 80 is a control device for controlling core functions of the interchangeable lens 7 , and is connected to each component mounted on the interchangeable lens 7 .
- the lens microcomputer 80 includes a CPU, a ROM, and a RAM and, when programs stored in the ROM are read by the CPU, various functions can be executed.
- the lens microcomputer 80 has the function of setting a lens image blur correction system (the blur correction lens driving section 74 a or the like) to be a correction possible state or a correction impossible state, based on a signal from the body microcomputer 50 .
- the relative position detection section 81 b employs, for example, a two-phase encoder.
- in the two-phase encoder, two rotary pulse encoders, two MR devices, two Hall devices, or the like, which alternately output binary signals with an equal pitch according to the position of the focus lens group 72 in the direction along the optical axis, are provided so that the phases of their pitches are different from each other.
- the lens microcomputer 80 calculates the relative position of the focus lens group 72 in the direction along the optical axis from an output of the relative position detection section 81 b.
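The relative-position calculation from a two-phase encoder output is standard quadrature decoding; the state table and names below are generic illustrations, not taken from the patent:

```python
# Minimal quadrature decoder: the two phase-shifted binary signals (A, B)
# step through the Gray-code sequence 00 -> 01 -> 11 -> 10 for forward
# travel, so the order of transitions gives the direction of lens movement.
FORWARD = {(0, 0): (0, 1), (0, 1): (1, 1), (1, 1): (1, 0), (1, 0): (0, 0)}

def decode(samples):
    """Accumulate a relative position count from (A, B) samples."""
    position = 0
    prev = samples[0]
    for cur in samples[1:]:
        if cur == prev:
            continue  # no transition, no movement
        position += 1 if FORWARD.get(prev) == cur else -1
        prev = cur
    return position

# Two forward transitions, then one reverse transition: net +1 count.
print(decode([(0, 0), (0, 1), (1, 1), (0, 1)]))  # → 1
```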
- the substrate 11 a is made of a Si (silicon) based substrate.
- the substrate 11 a is made of a Si single crystal substrate or an SOI (silicon-on-insulator) wafer.
- an SOI substrate has a sandwich structure of Si thin films and a SiO 2 thin film, and chemical reaction can be stopped at the SiO 2 film in etching or like processing.
- it is advantageous to use an SOI substrate.
- two green color filters 15 g (i.e., color filters having a higher transmittance in a green visible light wavelength range than in the other color visible light wavelength ranges),
- a red color filter 15 r (i.e., a color filter having a higher transmittance in a red visible light wavelength range than in the other color visible light wavelength ranges), and
- a blue color filter 15 b (i.e., a color filter having a higher transmittance in a blue visible light wavelength range than in the other color visible light wavelength ranges)
- every second color filter in the row and column directions is the green color filter 15 g.
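The arrangement described above is the standard Bayer pattern and can be sketched as follows; which diagonal carries red and which carries blue is an assumption about orientation:

```python
# Bayer colour filter layout: green on every second site in both the row
# and column directions, with red and blue on the remaining sites of
# alternating rows.
def bayer_filter(row, col):
    if (row + col) % 2 == 0:
        return "G"                       # green on every second site
    return "R" if row % 2 == 0 else "B"  # red/blue on alternating rows

for r in range(4):
    print(" ".join(bayer_filter(r, c) for c in range(4)))
# → G R G R
#   B G B G
#   G R G R
#   B G B G
```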
- the imaging device 10 performs photoelectric conversion at the light receiving sections 11 b provided throughout the entire imaging plane, thereby converting an object image formed on an imaging plane into an electrical signal.
- the same number of openings 31 c as light transmitting portions 17 are formed in the bottom plate 31 a of the package 31 ; the openings pass through the bottom plate 31 a at positions corresponding to those of the light transmitting portions 17 of the imaging device 10 .
- the openings 31 c serve as light passing portions.
- the phase difference detection unit 20 is provided in the back surface (an opposite surface to a surface facing an object) side of the imaging device 10 and receives light transmitted through the imaging device 10 to perform phase difference detection. Specifically, the phase difference detection unit 20 converts the received transmitted light into an electrical signal to be used for distance measurement.
- the phase difference detection unit 20 serves as a phase difference detection section.
- the mask member 22 is provided between the condenser lens unit 21 and the separator lens unit 23 .
- two mask openings 22 a are formed in a part thereof corresponding to each of the separator lenses 23 a . That is, the mask member 22 divides a lens surface of each of the separator lenses 23 a into two areas, so that only the two areas are exposed toward the condenser lenses 21 a . More specifically, the mask member 22 performs pupil division to divide light which has been collected by the condenser lenses 21 a into two light beams and causes the two light beams to enter the separator lens 23 a .
- the mask member 22 can prevent harmful light from one of adjacent two of the separator lenses 23 a from entering the other one of the adjacent two. Note that the mask member 22 does not have to be provided.
- the condenser lens unit 21 , the mask member 22 and the separator lens unit 23 are attached while being held in positions determined relative to the module frame 25 . That is, the positional relation of the condenser lens unit 21 , the mask member 22 and the separator lens unit 23 is determined by the module frame 25 .
- the imaging device 10 and the phase difference detection unit 20 are preferably configured so that one of the condenser lenses 21 a and one of the openings 31 c located closest to the center of the imaging plane closely fit each other to determine positions in the imaging plane, and furthermore, one of the condenser lenses 21 a and one of the openings 31 c located most distant from the center of the imaging plane closely fit each other to determine circumferential positions (rotation angles) of the condenser lens 21 a and the opening 31 c which are located at the center of the imaging plane.
- An electrical signal output from the line sensor unit 24 is also input to the body microcomputer 50 .
- the body microcomputer 50 can obtain a distance between two object images formed on the line sensor 24 a , based on the output from the line sensor unit 24 , and then, can detect an in-focus state of an object image formed on the imaging device 10 from the obtained distance. For example, when an object image is transmitted through an imaging lens and is correctly formed on the imaging device 10 (in focus), the two object images formed on the line sensor 24 a are located at predetermined reference positions with a predetermined reference distance therebetween. In contrast, when an object image is formed before the imaging device 10 in the direction along the optical axis (front focus), the distance between the two object images is smaller than the reference distance when the object image is in focus.
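The in-focus test above can be sketched as a comparison against the reference distance; classifying a separation larger than the reference as back focus is inferred by symmetry, and the names and tolerance value are illustrative:

```python
# Defocus classification from the distance between the two object images
# on the line sensor: at the reference distance the image is in focus;
# a smaller distance indicates front focus (image formed before the
# imaging device), a larger one back focus.
def focus_state(image_distance, reference_distance, tolerance=0.01):
    defocus = image_distance - reference_distance
    if abs(defocus) <= tolerance:
        return "in focus", 0.0
    return ("front focus" if defocus < 0 else "back focus"), defocus

print(focus_state(4.98, 5.00))  # smaller separation -> front focus
print(focus_state(5.00, 5.00))  # at the reference -> in focus
```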
- the imaging unit 1 (specifically, the imaging device 10 and the phase difference detection unit 20 ) includes three areas (hereinafter also referred to as “phase difference areas”) for detection of a phase difference, in which the condenser lens 21 a , a pair of the mask openings 22 a of the mask member 22 , the separator lens 23 a and the line sensor 24 a are arranged along the optical axis.
- three light transmitting portions 17 are formed in the substrate 11 a , and three phase difference areas are provided.
- the present invention is not limited thereto.
- the number of each of those components is not limited to three, but may be any number.
- nine light transmitting portions 17 may be formed in the substrate 11 a , and accordingly, nine sets of the condenser lens 21 a , the separator lens 23 a and the line sensor 24 a may be provided, thereby providing nine phase difference areas.
- three color filters 15 r , 15 g and 15 b corresponding to respective colors of the three light receiving sections 11 b are provided.
- a green color filter 15 g is provided in the light receiving section 11 b located in a diagonal position to the through hole 318 a
- a red color filter 15 r is provided in one of the light receiving sections 11 b located adjacent to the through hole 318 a
- a blue color filter 15 b is provided in the other one of the light receiving sections 11 b located adjacent to the through hole 318 a .
- No color filter is provided in the pixel region corresponding to the through hole 318 a.
- there may be a case where a pixel desired to be interpolated lies at an edge of an in-focus object. If interpolation is performed using the pair of the light receiving sections 11 b whose change in output is larger, the edge is undesirably caused to be loose. Therefore, the smaller change is used when each of the changes is equal to or larger than a predetermined threshold, and the larger change is used when each of the changes is smaller than the predetermined threshold, so that as small a change rate (slope) as possible is employed.
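The threshold rule above can be sketched as follows; the pair representation, the threshold value, and the handling of the mixed case (one change at or above the threshold, one below) are assumptions for illustration:

```python
# Interpolation of the pixel at a light transmitting portion from two
# candidate pairs of neighbouring light receiving sections: the pair
# with the smaller output change is chosen when both changes reach the
# threshold (to avoid softening an edge), and the pair with the larger
# change is chosen when both stay below it.
THRESHOLD = 16  # illustrative value; the patent gives no number

def interpolate(pair_a, pair_b, threshold=THRESHOLD):
    """Each pair is (output_1, output_2); returns the mean of the
    chosen pair."""
    change_a = abs(pair_a[0] - pair_a[1])
    change_b = abs(pair_b[0] - pair_b[1])
    if change_a >= threshold and change_b >= threshold:
        chosen = pair_a if change_a <= change_b else pair_b  # smaller slope
    else:
        chosen = pair_a if change_a >= change_b else pair_b  # larger slope
    return sum(chosen) / 2

print(interpolate((100, 140), (100, 180)))  # both >= 16: smaller-change pair
```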
- the imaging device 310 configured in the above-described manner can cause incident light to pass therethrough via the plurality of the through holes 318 a.
- the “normal shooting mode” is not a during-exposure AF shooting mode, a macro shooting mode, or a continuous shooting mode, which will be described later, but the most basic shooting mode of the camera 100 for normal shooting.
- the body microcomputer 50 stops down the aperture section 73 via the lens microcomputer 80 so as to attain an aperture value calculated based on a result of photometry in Step Sa 9 (Step Sa 14 ).
- a photometry sensor does not have to be additionally provided, and photometry can be performed before the release button 40 b is pressed all the way down, so that a time (hereinafter also referred to as a “release time lag”) from a time point when the release button 40 b is pressed all the way down to a time point when exposure is terminated can be reduced.
- after Steps Sb 6 -Sb 10 , photometry is performed (Step Sb 11 ), and also image blur detection is started (Step Sb 12 ).
- Steps Sb 11 and Sb 12 are the same as Steps Sa 9 and Sa 11 in phase difference detection AF.
- in Step Sa 11 , the body microcomputer 50 remains in a standby state until the release button 40 b is pressed all the way down by the user. A flow of steps after the release button 40 b is pressed all the way down is the same as that of phase difference detection AF.
- defocus information is obtained by the phase difference detection unit 20 using light transmitted through the imaging device 10
- photometry by the imaging device 10 can be performed in parallel with obtaining defocus information by the phase difference detection unit 20 , although hybrid AF includes phase difference detection.
- a mirror for dividing a part of light from an object does not have to be provided for phase difference detection, and also, a photometry sensor does not have to be additionally provided.
- photometry can be performed before the release button 40 b is pressed all the way down, so that a release time lag can be reduced. In the configuration in which photometry is performed before the release button 40 b is pressed all the way down, photometry can be performed in parallel with obtaining defocus information, thereby preventing increase in processing time after the release button 40 b is pressed halfway down.
- in Step Sd 8 , an aperture value at the time of exposure is obtained based on a result of photometry in Step Sd 7 , and whether or not the obtained aperture value is larger than a predetermined aperture threshold value is determined. When the obtained aperture value is larger than the predetermined aperture threshold value (YES), the process proceeds to Step Sd 10 . When the obtained value is equal to or smaller than the predetermined aperture threshold value (NO), the process proceeds to Step Sd 9 . In Step Sd 9 , the body microcomputer 50 drives the aperture section 73 via the lens microcomputer 80 to attain the obtained aperture value.
- only when it is determined in Step Se 8 that the aperture value obtained based on the result of photometry is larger than the predetermined aperture threshold value is stopping down of the aperture section 73 performed in Step Sa 14 . That is, when it is determined in Step Se 8 that the aperture value obtained based on the result of photometry is equal to or smaller than the predetermined aperture threshold value, Step Sa 14 does not have to be performed because stopping down of the aperture section 73 has been performed beforehand in Step Se 9 .
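The aperture branch of Steps Sd 8 /Se 8 can be sketched as below; the threshold value and the function name are illustrative:

```python
# When the metered aperture value is at or below the threshold, the
# aperture has already been stopped down during focusing (Step Sd 9 /
# Se 9) and the stop-down at exposure (Step Sa 14) is skipped; above the
# threshold, stopping down is deferred to Step Sa 14.
APERTURE_THRESHOLD = 5.6  # illustrative; the patent names no value

def aperture_plan(metered_value, threshold=APERTURE_THRESHOLD):
    if metered_value <= threshold:
        return "stopped down during focusing; skip Step Sa14"
    return "stop down at exposure (Step Sa14)"

print(aperture_plan(4.0))  # at or below threshold
print(aperture_plan(8.0))  # above threshold
```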
- in Step Sf 15 , the body microcomputer remains in a standby state until the release button 40 b is pressed all the way down by the user.
- when the release button 40 b is pressed all the way down by the user, exposure is performed in the same manner as in Steps Sa 12 -Sa 17 in phase difference detection AF.
- the body microcomputer 50 puts the shutter unit 42 into an open state (Step Sf 23 ), and phase difference detection AF is performed (Steps Sf 24 -Sf 26 ).
- in Step Sf 11 , a distance between two object images formed on the line sensor 24 a is compared to a reference distance which has been set beforehand to obtain the defocus information.
- in Steps Sf 24 and Sf 25 , after the release button 40 b is pressed all the way down, the distance between two object images formed on the line sensor 24 a is compared to the distance between two object images formed on the line sensor 24 a which has been stored in Step Sf 14 after contrast detection AF in hybrid AF, to obtain an in-focus state and defocus information.
- when the power switch 40 a is turned off (Step Sf 32 ), the body microcomputer 50 moves the focus lens group 72 to a predetermined reference position which has been set beforehand (Step Sf 33 ), and puts the shutter unit 42 into a closed state (Step Sf 34 ). Then, respective operations of the body microcomputer 50 and other units in the camera body 4 , and the lens microcomputer 80 and other units in the interchangeable lens 7 are halted.
- phase difference detection AF can be performed between exposures during continuous shooting, so that a high focus performance can be realized.
- a movable mirror for phase difference detection does not have to be provided.
- a release time lag can be reduced, and also, power consumption can be reduced.
- a release time lag corresponding to the vertical movement of the movable mirror is generated, and thus, when an object is a moving object, it is necessary to predict the movement of the moving object during the release time lag and then shoot an image.
- there is no release time lag corresponding to the vertical movement of the movable mirror and therefore, focus can be achieved while following the movement of an object until immediately before exposure.
- when the release button 40 b is pressed halfway down by a user (Step Sg 5 ), the body microcomputer 50 amplifies an output from the line sensor 24 a of the phase difference detection unit 20 , and then performs an operation by the arithmetic circuit (Step Sg 6 ). Then, whether or not a low contrast state has occurred is determined (Step Sg 7 ). Specifically, it is determined whether or not the contrast value is high enough to detect respective positions of two object images formed on the line sensor 24 a based on the output from the line sensor 24 a.
- when the contrast value is high enough to detect the positions of the two object images (NO), it is determined that a low contrast state has not occurred, and the process proceeds to Step Sg 8 to perform hybrid AF. Note that Steps Sg 8 -Sg 10 are the same as Steps Sc 7 , Sc 10 and Sc 11 in hybrid AF.
- in Step Sa 11 , the body microcomputer remains in a standby state until the release button 40 b is pressed all the way down by the user.
- a flow of steps after the release button 40 b is pressed all the way down is the same as that of normal hybrid AF.
- the camera system may be configured so that after the release button 40 b is pressed halfway down, the contrast value is obtained from an output of the imaging device 10 to determine whether or not the contrast value obtained from the output of the imaging device 10 is higher than a predetermined value before a phase difference focus is detected (i.e., between Steps Sg 5 and Sg 6 in FIG. 19 ).
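The branch on line sensor contrast can be sketched as below; the fallback to contrast detection AF in the low-contrast case is an assumption based on the surrounding description, and the names and threshold are illustrative:

```python
# Hybrid AF is chosen only when the contrast of the line sensor output
# is high enough to locate the two object images; otherwise the phase
# difference result is unreliable and contrast detection AF is used.
def choose_af(line_sensor_contrast, contrast_threshold=0.2):
    if line_sensor_contrast >= contrast_threshold:
        return "hybrid AF"          # both image positions detectable
    return "contrast detection AF"  # low contrast state

print(choose_af(0.5))   # → hybrid AF
print(choose_af(0.05))  # → contrast detection AF
```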
- the camera 100 of this embodiment is configured so that the autofocusing method is switched according to the type of the interchangeable lens 7 attached to the camera body 4 .
- AF switching function according to the type of the interchangeable lens will be described hereinafter with reference to FIG. 20 .
- hybrid AF is performed.
- the AF switching function according to the interchangeable lens is not limited to hybrid AF, but can be employed in any configuration using phase difference detection AF, contrast detection AF, phase difference detection AF according to the variation, hybrid AF according to the variation, or the like.
- Step Sh 5 When the release button 40 b is pressed halfway down by a user (Step Sh 5 ), photometry is performed (Step Sh 6 ), and in parallel with Step Sh 6 , image blur detection is started (Step Sh 7 ).
- Steps Sh 6 and Sh 7 are the same as Steps Sa 9 and Sa 10 in phase difference detection AF. Note that the photometry and image blur detection may be performed in parallel with an autofocus operation, which will be described later.
- Step Sh 9 When the interchangeable lens 7 is neither a reflecting telephoto lens produced by a third party nor an STF lens (NO), the process proceeds to Step Sh 9 to perform hybrid AF.
- Steps Sh 9 -Sh 12 are the same as Steps Sc 6 , Sc 7 , Sc 10 and Sc 11 in hybrid AF.
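The lens-type branch in Steps Sh 8 -Sh 9 can be sketched as a simple dispatch on the attached lens's identification data (read over the lens mount contacts). The argument names and the fallback branch are illustrative assumptions; the text above only states that the NO branch proceeds to hybrid AF:

```python
def choose_af_method(lens_type, maker):
    """Pick the AF method from the attached lens's identification data.
    The fallback to contrast detection AF alone is an assumed behavior
    for lenses unsuited to phase difference detection."""
    third_party_mirror_lens = (lens_type == "reflecting_telephoto"
                               and maker != "camera_maker")
    if third_party_mirror_lens or lens_type == "stf":
        return "contrast_af"   # assumed fallback for such lenses
    return "hybrid_af"         # NO branch: proceed to Step Sh9
```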
- the imaging device 10 is configured so that light passes through it, and the phase difference detection unit 20 is provided to receive the light which has passed through the imaging device 10 and to perform phase difference detection. Thus, when photometry using the imaging device 10 and phase difference detection using the phase difference detection unit 20 are performed in succession, switching between the two can be performed quickly and quietly under control of the body control section 5 .
- Step Sk 19 When the during-exposure AF flag is 1 (YES), the process proceeds to Step Sk 19 to perform exposure, and also to Step Sk 25 to perform phase difference detection AF in parallel with Step Sk 19 .
- phase difference detection AF is executed during exposure of the imaging device 10 .
- the phase difference detection AF is continuously performed while exposure is performed.
- “during exposure” can be also referred to as “during a period of still image shooting by the imaging device 10 ”, “during a period of video signal storing by the imaging device 10 ”, “during a period of electrical charge storing by the imaging device 10 ”, or the like.
- autofocusing is performed during exposure.
- an image blur in the direction along the optical axis X during exposure is reduced.
- since autofocusing during exposure is performed using phase difference detection AF, the defocus direction can be instantly obtained, and thus an object image can be brought into focus quickly even in the short time during which exposure is performed.
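Because the phase difference detection unit 20 keeps receiving light that has passed through the imaging device 10 , the focus operation can run concurrently with charge storage. A minimal concurrency sketch of Steps Sk 19 and Sk 25 (the callables and flag handling are illustrative assumptions, not the patent's firmware):

```python
import threading

def shoot(expose, phase_difference_af, during_exposure_af_flag):
    """When the during-exposure AF flag is set, run phase difference AF
    in parallel with exposure; otherwise the focus lens stays halted."""
    if during_exposure_af_flag:
        af = threading.Thread(target=phase_difference_af)
        af.start()   # Step Sk25: keep driving the focus lens group
        expose()     # Step Sk19: expose the imaging device
        af.join()
    else:
        expose()     # normal shooting mode: exposure only
```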
- a camera body 204 further includes, in addition to components of the camera body 4 of the first embodiment, a finder optical system 6 for viewing an object image through a finder 65 , and a semi-transparent quick return mirror 46 for guiding incident light from the interchangeable lens 7 to the finder optical system 6 .
- a body control section 205 further includes, in addition to components of the body control section 5 of the first embodiment, a mirror control section 260 for controlling flip-up of the quick return mirror 46 , which will be described later, based on a control signal from the body microcomputer 50 .
- the quick return mirror 46 is arranged in front of the shutter unit 42 (i.e., at an object side), and pivotally supported about an axis Y which is located above and in front of the shutter unit 42 and horizontally extends.
- the quick return mirror 46 is biased toward a retracted position by a bias spring (not shown).
- the quick return mirror 46 is moved to the reflection position by the bias spring being wound up by a motor (not shown) for opening and closing the shutter unit 42 .
- the quick return mirror 46 which has been moved to the reflection position is engaged with an electromagnet or the like at the reflection position. Then, this engagement is released, thereby causing the quick return mirror 46 to be pivotally moved to the retracted position by the force of the bias spring.
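The wind-up and release sequence above amounts to a two-state mechanism. A minimal state sketch (class and method names are illustrative, not the patent's terminology):

```python
RETRACTED, REFLECTION = "retracted", "reflection"

class QuickReturnMirror:
    """State sketch of the mechanism described above: the bias spring
    pulls toward the retracted position; the shutter motor winds the
    mirror to the reflection position, where an electromagnet holds it
    until the engagement is released."""

    def __init__(self):
        self.state = RETRACTED   # rest position, held by the bias spring

    def wind_up(self):
        self.state = REFLECTION  # motor winds the spring; magnet engages

    def release(self):
        self.state = RETRACTED   # magnet disengages; spring flips mirror back
```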
- Step Si 1 The power switch 40 a is turned on (Step Si 1 ), the release button 40 b is pressed halfway down by a user (Step Si 5 ), and then the release button 40 b is pressed all the way down by the user (Step Si 11 ), so that the shutter unit 42 is temporarily put into a closed state (Step Si 12 ).
- the above-described Steps Si 1 -Si 12 are basically the same as Steps Sa 1 -Sa 12 in phase difference detection AF of the first embodiment.
- defocus information is output with a phase detection width obtained by changing the phase detection width in phase difference focus detection of the first embodiment (i.e., a phase detection width in phase difference focus detection of hybrid AF in the live view shooting mode which will be described later) by a predetermined amount.
- the phase detection width means a reference phase difference used for determining that a calculated defocus amount is 0, i.e., an object is in focus.
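In code form, the role of the phase detection width might be sketched as follows. The tolerance band and the proportionality constant k are illustrative assumptions (the actual conversion is lens dependent):

```python
def in_focus(measured_phase_difference, phase_detection_width, tolerance=0.01):
    """An object is judged in focus when the measured phase difference
    matches the reference width, i.e. the calculated defocus amount is 0."""
    return abs(measured_phase_difference - phase_detection_width) <= tolerance

def defocus_amount(measured_phase_difference, phase_detection_width, k=1.0):
    """Defocus is proportional to the deviation from the reference width;
    shifting the width by a predetermined amount, as in the finder
    shooting mode, biases the reported defocus accordingly."""
    return k * (measured_phase_difference - phase_detection_width)
```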
- Step Sj 6 when the release button 40 b is pressed halfway down by the user (Step Sj 6 ), as opposed to the finder shooting mode, hybrid AF is performed.
- Steps Sj 7 , Sj 8 , Sj 11 and Sj 12 according to this hybrid AF are the same as Steps Sc 6 , Sc 7 , Sc 10 and Sc 11 in hybrid AF of the first embodiment.
- In the finder shooting mode, before shooting is performed, an object image can be viewed through the finder optical system 6 , and light can be caused to reach the imaging unit 1 . Also, when shooting is performed, incident light from the finder optical system 6 is prevented from reaching the imaging unit 1 by the light shielding plate 47 while light from an object is guided to the imaging unit 1 . In the live view shooting mode, incident light from the finder optical system 6 is likewise prevented from reaching the imaging unit 1 by the light shielding plate 47 .
- the present invention is useful particularly for an imaging apparatus including an imaging device for performing photoelectric conversion.
Abstract
An imaging apparatus with improved convenience, which can perform various types of processing using an imaging device while performing phase difference detection, is provided.
An imaging unit (1) includes an imaging device (10) for performing photoelectric conversion to convert light into an electrical signal, the imaging device (10) configured so that light passes through the imaging device (10), a phase difference detection unit (20) for receiving the light having passed through the imaging device (10) to perform phase difference detection, a focus lens group (72) for adjusting a focus position, and a body control section (5) for controlling the imaging device (10) and controlling driving of the focus lens group (72) at least based on a detection result of the phase difference detection unit. The body control section (5) performs a focus operation based on a detection result of the phase difference detection unit during exposure of the imaging device.
Description
- The present invention relates to an imaging apparatus including an imaging device for performing photoelectric conversion.
- In recent years, digital cameras that convert an object image into an electrical signal using an imaging device such as a charge coupled device (CCD) image sensor or a complementary metal-oxide semiconductor (CMOS) image sensor, digitize the electrical signal, and record the obtained digital signal have been widely used.
- Single-lens reflex digital cameras include a phase difference detection section for detecting a phase difference between object images, and have the phase difference detection AF function of performing autofocusing (hereinafter also simply referred to as "AF") by the phase difference detection section. Since the phase difference detection AF function allows detection of the defocus direction and the defocus amount, the moving time of a focus lens can be reduced, thereby realizing fast focusing (see, for example, Patent Document 1). In known single-lens reflex digital cameras, a movable mirror capable of moving into or out of an optical path from a lens tube to an imaging device is provided in order to guide light from an object to a phase difference detection section.
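The phase difference measurement itself can be pictured with a small sketch: find the shift that best aligns the two object images formed on a pair of line sensor segments. The sum-of-absolute-differences matching and all names are illustrative assumptions; the document does not specify the arithmetic at this level:

```python
def phase_shift(image_a, image_b, max_shift=8):
    """Return the shift (in sensor pixels) that best aligns the two object
    images, via a sum-of-absolute-differences search.  The sign gives the
    defocus direction; the magnitude relates to the defocus amount."""
    n = len(image_a)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        pairs = [(image_a[i], image_b[i + s])
                 for i in range(n) if 0 <= i + s < n]
        if len(pairs) < n // 2:
            continue  # demand a reasonable overlap to avoid degenerate matches
        err = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```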
- In so-called compact digital cameras, the autofocus function by video AF using an imaging device (see, for example, Patent Document 2) is employed. Therefore, in compact digital cameras, a mirror for guiding light from an object to a phase difference detection section is not provided, thus achieving reduction in the size of compact digital cameras. In such compact digital cameras, autofocusing can be performed with light incident on the imaging device, i.e., with the imaging device being exposed to light. That is, it is possible to perform various types of processing using the imaging device, including, for example, obtaining an image signal from an object image formed on the imaging device to display the object image on an image display section provided on a back surface of the camera or to record the object image in a recording section, while performing autofocusing. In general, this autofocus function by video AF advantageously has higher accuracy than that of phase difference detection AF.
-
- PATENT DOCUMENT 1: Japanese Patent Application No. 2007-163545
- PATENT DOCUMENT 2: Japanese Patent Application No. 2007-135140
- However, as in the digital camera according to PATENT DOCUMENT 2, a defocus direction cannot be instantaneously detected by video AF. For example, when contrast detection AF is employed, a focus is detected by detecting a contrast peak, but the direction toward the contrast peak, i.e., the defocus direction, cannot be detected unless the focus lens is shifted back and forth from its current position. Therefore, it takes a longer time to detect a focus. In view of reducing the time required for detecting a focus, phase difference detection AF is more advantageous. However, in an imaging apparatus such as the single-lens reflex digital camera according to Patent Document 1 employing phase difference detection AF, a movable mirror has to be moved onto the optical path from a lens tube to an imaging device in order to guide light from an object to a phase difference detection section. Thus, various types of processing using the imaging device, such as, for example, exposure of the imaging device, cannot be performed while phase difference detection AF is performed. Also, the movable mirror has to be moved when the optical path of incident light is switched between a path toward the phase difference detection section and a path toward the imaging device. Thus, disadvantageously, a time lag and noise are generated by moving the movable mirror. - That is, a known imaging apparatus for performing phase difference detection AF has not been convenient in relation to performing various types of processing using the imaging device.
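The back-and-forth probing described above is what makes contrast detection AF slower. A minimal hill-climb sketch (lens positions as integers, the contrast function supplied by the caller, all names illustrative):

```python
def contrast_af(read_contrast, position, step=1):
    """Climb to the contrast peak.  The defocus direction is unknown up
    front, so the lens must first be stepped once just to learn the slope;
    this probing is exactly the movement phase difference AF avoids."""
    here = read_contrast(position)
    if read_contrast(position + step) < here:
        step = -step                 # peak lies behind: reverse direction
    while True:
        nxt = read_contrast(position + step)
        if nxt <= here:              # contrast fell: the peak was here
            return position
        position, here = position + step, nxt
```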
- In view of the above-described points, the present invention has been devised, and it is therefore an object of the present invention to improve the convenience of an imaging apparatus including an imaging device and a phase difference detection section in relation to performing various types of processing using the imaging device and phase difference detection using the phase difference detection section.
- An imaging apparatus according to the present invention includes an imaging device for performing photoelectric conversion to convert light into an electrical signal, the device being configured so that light passes through the imaging device, a phase difference detection section for receiving light which has passed through the imaging device to perform phase difference detection, a focus lens for adjusting a focus position, and a control section for controlling the imaging device and controlling driving of the focus lens at least based on a detection result of the phase difference detection section to adjust the focus position.
- An imaging apparatus according to another aspect of the present invention has been devised to realize fast-focusing on an image object while performing exposure of the imaging device. Specifically, the imaging apparatus includes an imaging device for performing photoelectric conversion to convert light into an electrical signal, the device being configured so that light passes through the imaging device, a phase difference detection section for receiving light which has passed through the imaging device to perform phase difference detection, a focus lens for adjusting a focus position, and a control section for controlling driving of the focus lens based on a detection result of the phase difference detection section to perform a focus operation to focus an image object on the imaging device, and the control section performs the focus operation based on the detection result of the phase difference detection section during exposure of the imaging device.
- According to the present invention, an imaging device configured so that light passes through the imaging device, and a phase difference detection section for receiving light which has passed through the imaging device to perform phase difference detection are provided. Moreover, a control section controls the imaging device and also controls driving of the focus lens at least based on a detection result of the phase difference detection section. Thus, phase difference detection can be performed by the phase difference detection section using light which has passed through the imaging device, while causing light to enter the imaging device to perform various types of processing using the imaging device. Accordingly, various types of processing using the imaging device can be performed in parallel with phase difference detection by the phase difference detection section, or, switching between various types of processing using the imaging device and phase difference detection by the phase difference detection section can be performed quietly and quickly, so that the convenience of the imaging device can be improved.
- According to another aspect of the present invention, an imaging device configured so that light passes through the imaging device, and a phase difference detection section for receiving light which has passed through the imaging device to perform phase difference detection are provided. Thus, phase difference detection can be performed by the phase difference detection section using light which has passed through the imaging device to focus an image object on the imaging device, while performing exposure of the imaging device. Accordingly, the imaging object can be quickly focused while the imaging device is exposed to light.
-
FIG. 1 is a block diagram of a camera according to a first embodiment of the present invention. -
FIG. 2 is a cross-sectional view of an imaging unit. -
FIG. 3 is a cross-sectional view of an imaging device. -
FIG. 4 is a plan view of the imaging device. -
FIG. 5 is a plan view of a phase difference detection unit. -
FIG. 6 is a schematic perspective view of an imaging unit according to a variation. -
FIG. 7 is a cross-sectional view of an imaging device according to the variation. -
FIG. 8 is a cross-sectional view of an imaging device according to another variation. -
FIG. 9 is a cross-sectional view illustrating a cross section of an imaging unit according to the another variation, which corresponds to FIG. 2 . -
FIG. 10 is a cross-sectional view illustrating a cross section of the imaging unit of the another variation, which is perpendicular to the cross section corresponding to FIG. 2 . -
FIG. 11 is a flowchart of the steps in a shooting operation using phase difference detection AF before the release button is pressed all the way down. -
FIG. 12 is a flowchart showing the basic steps in each of shooting operations including a shooting operation using phase difference detection AF after the release button is pressed all the way down. -
FIG. 13 is a flowchart of the steps in a shooting operation using contrast detection AF before the release button is pressed all the way down. -
FIG. 14 is a flowchart of the steps in a shooting operation using hybrid AF before the release button is pressed all the way down. -
FIG. 15 is a flowchart of the steps in a shooting operation using phase difference detection AF according to the variation before the release button is pressed all the way down. -
FIG. 16 is a flowchart of the steps in a shooting operation using hybrid AF according to the variation before the release button is pressed all the way down. -
FIG. 17 is a flowchart of the steps in a shooting operation in a continuous shooting mode before the release button is pressed all the way down. -
FIG. 18 is a flowchart of the steps in a shooting operation in the continuous shooting mode after the release button is pressed all the way down. -
FIG. 19 is a flowchart of the steps in a shooting operation in a low contrast mode before the release button is pressed all the way down. -
FIG. 20 is a flowchart of the steps in a shooting operation of changing AF function according to a type of an interchangeable lens before the release button is pressed all the way down. -
FIG. 21 is a flowchart of the steps in a shooting operation in a during-exposure AF shooting mode before the release button is pressed all the way down. -
FIG. 22 is a flowchart of the steps in a shooting operation in the during-exposure AF shooting mode after the release button is pressed all the way down. -
FIG. 23 is a block diagram of a camera according to a second embodiment of the present invention. -
FIGS. 24(A) through 24(C) are perspective views illustrating a configuration of a quick return mirror and a shielding plate. FIG. 24(A) illustrates the quick return mirror in a retracted position. FIG. 24(B) illustrates the quick return mirror in a position between the retracted position and a reflection position. FIG. 24(C) illustrates the quick return mirror in the reflection position. -
FIG. 25 is a flowchart of the steps in a finder shooting mode before the release button is pressed all the way down. -
FIG. 26 is a flowchart of the steps in the finder shooting mode after the release button is pressed all the way down. -
FIG. 27 is a flowchart of the steps in a live view shooting mode before the release button is pressed all the way down. -
FIG. 28 is a flowchart of the steps in the live view shooting mode after the release button is pressed all the way down. -
-
- 1, 401 Imaging Unit
- 10, 210, 310 Imaging Device
- 20, 420 Phase Difference Detection Unit (Phase Difference Detection Section)
- 4 Camera Body (Imaging Apparatus Body)
- 40 e During-Exposure AF Setting Switch (Setting Switch)
- 44 Image Display Section
- 46 Quick Return Mirror (Movable Mirror)
- 47 Shielding Plate (Shielding Section)
- 5 Body Control Section (Control Section, Distance Detection Section)
- 6 Finder Optical System
- 7 Interchangeable Lens
- 72 Focus Lens Group (Focus Lens)
- 73 Aperture Section (Light Amount Adjustment Section)
- 100, 200 Camera (Imaging Apparatus)
- Embodiments of the present invention will be described hereinafter in detail with reference to the accompanying drawings.
- A camera as an imaging apparatus according to a first embodiment of the present invention will be described.
- As shown in
FIG. 1 , a camera 100 according to the first embodiment is a single-lens reflex digital camera with interchangeable lenses and includes, as major components, a camera body 4 having a major function as a camera system, and interchangeable lenses 7 removably attached to the camera body 4 . The interchangeable lenses 7 are attached to a body mount 41 provided on a front face of the camera body 4 . The body mount 41 is provided with an electric contact piece 41 a . - —Configuration of Camera Body—
- The
camera body 4 includes the imaging unit 1 for capturing an object image as a shooting image, a shutter unit 42 for adjusting an exposure state of the imaging unit 1 , an optical low pass filter (OLPF) 43 , serving also as an IR cutter, for removing infrared light of the object image entering the imaging unit 1 and reducing the moire phenomenon, an image display section 44 , comprised of a liquid crystal monitor, for displaying a shooting image, a live view image and various pieces of information, and a body control section 5 . The camera body 4 serves as an imaging apparatus body. - In the
camera body 4 , a power switch 40 a for turning on/off the camera system, a release button 40 b operated by a user when the user performs focusing and releasing operations, and setting switches 40 c - 40 f are provided. - When the camera system is turned on by the
power switch 40 a , power is supplied to each part of the camera body 4 and the interchangeable lens 7 . - The
release button 40 b operates as a two-stage switch. Specifically, autofocusing, AE (Automatic Exposure) or the like, which will be described later, is performed by pressing the release button 40 b halfway down, and releasing is performed by pressing the release button 40 b all the way down. - An
AF setting switch 40 c is a switch for switching an autofocus function from one to another of three autofocus functions, which will be described later. The camera body 4 is configured so that the autofocus function is set to be one of the three autofocus functions by switching the AF setting switch 40 c . - A continuous shooting
mode setting switch 40 d is a switch for setting/canceling a continuous shooting mode, which will be described later. The camera body 4 is configured so that a shooting mode can be switched between a normal shooting mode and a continuous shooting mode by operating the continuous shooting mode setting switch 40 d . - A during-exposure
AF setting switch 40 e is a switch for turning on/off during-exposure AF, which will be described later. The camera body 4 is configured so that, by operating the during-exposure AF setting switch 40 e , the shooting mode is switched between a during-exposure AF shooting mode in which exposure is performed while autofocusing is performed and a normal shooting mode in which the focus lens group 72 is halted and autofocusing is not performed while exposure is performed. The during-exposure AF setting switch 40 e serves as a setting switch for switching the shooting mode between the during-exposure focusing shooting mode and the normal shooting mode. - A
macro setting switch 40 f is a switch for setting/canceling a macro shooting mode, which will be described later. The camera body 4 is configured so that the shooting mode is switched between the normal shooting mode and the macro shooting mode, which is suitable for close-up shooting, by operating the macro setting switch 40 f .
- Furthermore, the
macro setting switch 40 f may be provided to theinterchangeable lens 7. - The
imaging unit 1 , which will be described in detail later, performs photoelectric conversion to convert an object image into an electrical signal. The imaging unit 1 is configured to be movable in a plane perpendicular to an optical axis X by a blur correction unit 45 . - The
body control section 5 includes a body microcomputer 50 , a nonvolatile memory 50 a , a shutter control section 51 for controlling driving of the shutter unit 42 , an imaging unit control section 52 for controlling the operation of the imaging unit 1 and performing A/D conversion of an electrical signal from the imaging unit 1 to output the converted signal to the body microcomputer 50 , an image reading/recording section 53 for reading image data from, for example, a card type recording medium or an image storage section 58 , which is an internal memory, and for recording image data in the image storage section 58 , an image recording control section 54 for controlling the image reading/recording section 53 , an image display control section 55 for controlling display of the image display section 44 , a blur detection section 56 for detecting an amount of an image blur generated due to shake of the camera body 4 , and a correction unit control section 57 for controlling the blur correction unit 45 . The body control section 5 serves as a control section. - The
body microcomputer 50 is a control device for controlling core functions of the camera body 4 , and performs control of various sequences. The body microcomputer 50 includes, for example, a CPU, a ROM and a RAM. Programs stored in the ROM are read by the CPU, and thereby the body microcomputer 50 can execute various functions. - The
body microcomputer 50 is configured to receive input signals from the power switch 40 a , the release button 40 b and the setting switches 40 c - 40 f , and to output control signals to the shutter control section 51 , the imaging unit control section 52 , the image reading/recording section 53 , the image recording control section 54 , the correction unit control section 57 and the like, thereby causing these sections to execute their respective control operations. The body microcomputer 50 performs inter-microcomputer communication with a lens microcomputer 80 , which will be described later.
body microcomputer 50, the imagingunit control section 52 performs A/D conversion of an electrical signal from theimaging unit 1 to output the converted signal to thebody microcomputer 50. Thebody microcomputer 50 performs predetermined image processing to the received electrical signal to generate an image signal. Then, thebody microcomputer 50 transmits the image signal to the image reading/recording section 53, and also instructs the imagerecording control section 54 to record and display an image, and thereby, the image signal is stored in theimage storage section 58 and is transmitted to the imagedisplay control section 55. The imagedisplay control section 55 controls theimage display section 44 based on the transmitted image signal to cause theimage display section 44 to display an image. - The
body microcomputer 50, which will be described in detail later, is configured to detect an object point distance to the object via alens microcomputer 80. - In the
nonvolatile memory 50 a , various pieces of information (unit information) for the camera body 4 are stored. The unit information includes, for example, model information (unit specific information) provided to specify the camera body 4 , such as the name of the manufacturer, production date and model number of the camera body 4 , version information for software installed in the body microcomputer 50 and firmware update information, information regarding whether or not the camera body 4 includes sections for correcting an image blur, such as the blur correction unit 45 , the blur detection section 56 and the like, information regarding a detection performance of the blur detection section 56 , such as a model number, detection capability and the like, error history and the like. Such information as listed above may be stored in a memory section of the body microcomputer 50 , instead of the nonvolatile memory 50 a . - The
blur detection section 56 includes an angular velocity sensor for detecting the movement of the camera body 4 due to hand shake and the like. The angular velocity sensor outputs a positive/negative angular velocity signal according to the direction in which the camera body 4 is moved, using as a reference an output in a state where the camera body 4 stands still. In this embodiment, two angular velocity sensors are provided to detect two directions, i.e., a yawing direction and a pitching direction. After being subjected to filtering, amplification and the like, the output angular velocity signal is converted into a digital signal by the A/D conversion section, and then is given to the body microcomputer 50 . - —Configuration of Interchangeable Lens—
- The
interchangeable lens 7 serves as an imaging optical system for forming an object image on the imaging unit 1 in the camera body 4 , and includes, as major components, a focus adjustment section 7A for performing focusing, an aperture adjustment section 7B for adjusting an aperture, a lens image blur correction section 7C for adjusting an optical path to correct an image blur, and a lens control section 8 for controlling an operation of the interchangeable lens 7 . - The
interchangeable lens 7 is attached to the body mount 41 of the camera body 4 via a lens mount 71 . The lens mount 71 is provided with an electric contact piece 71 a which is electrically connected to the electric contact piece 41 a of the body mount 41 when the interchangeable lens 7 is attached to the camera body 4 . - The
focus adjustment section 7A is comprised of a focus lens group 72 for adjusting focus. The focus lens group 72 is movable in the direction along the optical axis X in a zone from a closest focus position predetermined as a standard for the interchangeable lens 7 to an infinite focus position. When a focus position is detected using a contrast detection method, which will be described later, the focus lens group 72 has to be movable forward and backward from a focus position in the direction along the optical axis X. Therefore, the focus lens group 72 has a lens shift margin zone which allows the focus lens group 72 to move forward and backward in the direction along the optical axis X to a further distance beyond the zone ranging from the closest focus position to the infinite focus position. Note that the focus lens group 72 does not have to be comprised of a plurality of lenses, but may be comprised of a single lens. - The
aperture adjustment section 7B is comprised of an aperture section 73 for adjusting an aperture. The aperture section 73 serves as a light amount adjustment section. - The lens image
blur correction section 7C includes a blur correction lens 74 , and a blur correction lens driving section 74 a for moving the blur correction lens 74 in a plane perpendicular to the optical axis X. - The
lens control section 8 includes a lens microcomputer 80 , a nonvolatile memory 80 a , a focus lens group control section 81 for controlling an operation of the focus lens group 72 , a focus driving section 82 for receiving a control signal of the focus lens group control section 81 to drive the focus lens group 72 , an aperture control section 83 for controlling an operation of the aperture section 73 , a blur detection section 84 for detecting a blur of the interchangeable lens 7 , and a blur correction lens unit control section 85 for controlling the blur correction lens driving section 74 a . - The
lens microcomputer 80 is a control device for controlling core functions of the interchangeable lens 7 , and is connected to each component mounted on the interchangeable lens 7 . Specifically, the lens microcomputer 80 includes a CPU, a ROM, and a RAM and, when programs stored in the ROM are read by the CPU, various functions can be executed. For example, the lens microcomputer 80 has the function of setting a lens image blur correction system (the blur correction lens driving section 74 a or the like) to be in a correction possible state or a correction impossible state, based on a signal from the body microcomputer 50 . Due to the contact of the electric contact piece 71 a provided to the lens mount 71 with the electric contact piece 41 a provided to the body mount 41 , the body microcomputer 50 is electrically connected to the lens microcomputer 80 , so that information can be transmitted/received between the body microcomputer 50 and the lens microcomputer 80 . - In the
nonvolatile memory 80 a, various pieces of information (lens information) for the interchangeable lens 7 are stored. The lens information includes, for example, model information (lens specific information) provided to specify the interchangeable lens 7, such as the name of a manufacturer, the production date and the model number of the interchangeable lens 7, version information for software installed in the lens microcomputer 80 and firmware update information, and information regarding whether or not the interchangeable lens 7 includes sections for correcting an image blur, such as the blur correction lens driving section 74 a, the blur detection section 84, and the like. If the interchangeable lens 7 includes sections for correcting an image blur, the lens information further includes information regarding a detection performance of the blur detection section 84, such as a model number, detection capability and the like, information regarding a correction performance (lens side correction performance information) of the blur correction lens driving section 74 a, such as a model number, a maximum correctable angle and the like, version information for software for performing image blur correction, and the like. Furthermore, the lens information includes information (lens side power consumption information) regarding power consumption necessary for driving the blur correction lens driving section 74 a, and information (lens side driving method information) regarding a method for driving the blur correction lens driving section 74 a. The nonvolatile memory 80 a can store information transmitted from the body microcomputer 50. The information listed above may be stored in a memory section of the lens microcomputer 80, instead of the nonvolatile memory 80 a. - The focus lens
group control section 81 includes an absolute position detection section 81 a for detecting an absolute position of the focus lens group 72 in the direction along the optical axis, and a relative position detection section 81 b for detecting a relative position of the focus lens group 72 in the direction along the optical axis. The absolute position detection section 81 a detects an absolute position of the focus lens group 72 provided in a case of the interchangeable lens 7. For example, the absolute position detection section 81 a is comprised of a several-bit contact-type encoder substrate and a brush, and is capable of detecting an absolute position. The relative position detection section 81 b cannot detect the absolute position of the focus lens group 72 by itself, but can detect a moving direction of the focus lens group 72. The relative position detection section 81 b employs, for example, a two-phase encoder. As the two-phase encoder, two rotary pulse encoders, two MR devices, two Hall devices, or the like, for alternately outputting binary signals with an equal pitch according to the position of the focus lens group 72 in the direction along the optical axis, are provided so that the phases of their pitches are different from each other. The lens microcomputer 80 calculates the relative position of the focus lens group 72 in the direction along the optical axis from an output of the relative position detection section 81 b. - The
blur detection section 84 includes an angular velocity sensor for detecting the movement of the interchangeable lens 7 due to hand shake and the like. The angular velocity sensor outputs a positive/negative angular velocity signal according to the direction in which the interchangeable lens 7 moves, using as a reference an output in a state where the interchangeable lens 7 stands still. In this embodiment, two angular velocity sensors are provided to detect two directions, i.e., a yawing direction and a pitching direction. After being subjected to filtering, amplification and the like, the output angular velocity signal is converted into a digital signal by the A/D conversion section, and then is given to the lens microcomputer 80. - A blur correction lens
unit control section 85 includes a moving amount detection section (not shown). The moving amount detection section is a detection section for detecting an actual moving amount of the blur correction lens 74. The blur correction lens unit control section 85 performs feedback control of the blur correction lens 74 based on an output of the moving amount detection section. - An example in which the
blur detection sections and the blur correction units are provided to both of the camera body 4 and the interchangeable lens 7 has been described. However, such a blur detection section and blur correction unit may be provided to either one of the camera body 4 and the interchangeable lens 7. Also, a configuration in which such a blur detection section and blur correction unit are provided to neither the camera body 4 nor the interchangeable lens 7 may be employed (in such a configuration, a sequence regarding the above-described blur correction may be eliminated). - —Configuration of Imaging Unit—
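As a supplement to the lens control description above: the relative position detection section 81 b outputs two binary signals whose phases differ, i.e., a quadrature pair, and the lens microcomputer 80 accumulates a signed step count from their transitions to obtain the relative position and the moving direction. The following Python sketch is illustrative only; the function name and the table-driven approach are assumptions, not taken from the specification.

```python
# Hypothetical sketch of two-phase (quadrature) encoder decoding.
# Each sample is an (A, B) bit pair; the transition between consecutive
# samples determines whether the focus lens group moved one step forward,
# one step backward, or not at all.

# State transition table: (previous AB, current AB) -> step.
_STEP = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode_quadrature(samples):
    """Accumulate a relative position from a sequence of (A, B) bit pairs."""
    position = 0
    prev = samples[0][0] << 1 | samples[0][1]
    for a, b in samples[1:]:
        cur = a << 1 | b
        position += _STEP.get((prev, cur), 0)  # 0 for no change or invalid jump
        prev = cur
    return position
```

Because the two phases are offset, a forward pass through the states 00, 01, 11, 10 yields positive steps and the reverse order yields negative steps, which is exactly the direction information the section 81 b is described as providing.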
- As shown in
FIG. 2, the imaging unit 1 includes an imaging device 10 for converting an object image into an electrical signal, a package 31 for holding the imaging device 10, and a phase difference detection unit 20 for performing focus detection using phase difference detection. - The
imaging device 10 is an interline type CCD image sensor and, as shown in FIG. 3, includes a photoelectric conversion section 11 made of a semiconductor material, vertical registers 12, transfer paths 13, masks 14, color filters 15, and microlenses 16. - The
photoelectric conversion section 11 includes a substrate 11 a and a plurality of light receiving sections (also referred to as "pixels") 11 b arranged on the substrate 11 a. - The
substrate 11 a is a Si (silicon) based substrate. Specifically, the substrate 11 a is made of a Si single crystal substrate or an SOI (silicon-on-insulator) wafer. In particular, an SOI substrate has a sandwich structure of Si thin films and a SiO2 thin film, and chemical reaction can be stopped at the SiO2 film in etching or like processing. Thus, in terms of performing stable substrate processing, it is advantageous to use an SOI substrate. - Each of the
light receiving sections 11 b is made of a photodiode, and absorbs light to generate electrical charges. The light receiving sections 11 b are provided in micro pixel regions each having a square shape, arranged in a matrix on the substrate 11 a (see FIG. 4). - The
vertical register 12 is provided for each light receiving section 11 b, and serves to temporarily store electrical charges stored in the light receiving section 11 b. The electrical charges stored in the light receiving section 11 b are transferred to the vertical register 12. The electrical charges transferred to the vertical register 12 are transferred to a horizontal register (not shown) via the transfer path 13, and then to an amplifier (not shown). The electrical charges transferred to the amplifier are amplified and pulled out as an electrical signal. - The
mask 14 is provided so that the light receiving sections 11 b are exposed toward an object while the vertical register 12 and the transfer path 13 are covered by the mask 14, thereby preventing light from entering the vertical register 12 and the transfer path 13. - The
color filter 15 and the microlens 16 are provided in each micro pixel region having a square shape to correspond to an associated one of the light receiving sections 11 b. Each of the color filters 15 transmits only a specific color, and primary color filters or complementary color filters are used as the color filters 15. In this embodiment, as shown in FIG. 4, so-called Bayer primary color filters are used. That is, assuming that four color filters 15 arranged adjacent to one another in two rows and two columns (or in four pixel regions) are a repeat unit throughout the entire imaging device 10, two green color filters 15 g (i.e., color filters having a higher transmittance in the green visible light wavelength range than in the other color visible light wavelength ranges) are arranged in one diagonal direction, and a red color filter 15 r (i.e., a color filter having a higher transmittance in the red visible light wavelength range than in the other color visible light wavelength ranges) and a blue color filter 15 b (i.e., a color filter having a higher transmittance in the blue visible light wavelength range than in the other color visible light wavelength ranges) are arranged in the other diagonal direction. When the entire set of the color filters 15 is viewed, every second color filter in the row and column directions is a green color filter 15 g. - The
microlenses 16 collect light to cause the light to enter the light receiving sections 11 b. The light receiving sections 11 b can be efficiently irradiated with light by the microlenses 16. - In the
imaging device 10 configured in the above-described manner, light collected by each microlens 16 enters the associated color filter 15, and then the associated light receiving section 11 b is irradiated with the light. Each of the light receiving sections 11 b absorbs light to generate electrical charges. The electrical charges generated by the light receiving sections 11 b are transferred to the amplifier via the vertical register 12 and the transfer path 13, and are output as an electrical signal. That is, the amount of received light having a color corresponding to each color filter is obtained from each of the light receiving sections 11 b as an output. - Thus, the
imaging device 10 performs photoelectric conversion at the light receiving sections 11 b provided throughout the entire imaging plane, thereby converting an object image formed on an imaging plane into an electrical signal. - In this case, a plurality of light transmitting
portions 17 for transmitting irradiation light are formed in the substrate 11 a. The light transmitting portions 17 are formed by cutting, polishing or etching an opposite surface (hereinafter also referred to as a "back surface") 11 c of the substrate 11 a, opposite to a surface thereof on which the light receiving sections 11 b are provided, to provide concave-shaped recesses, and each of the light transmitting portions 17 has a smaller thickness than that of a part of the substrate 11 a located around each of the light transmitting portions 17. More specifically, each of the light transmitting portions 17 includes a recess-bottom surface 17 a having the smallest thickness and inclined surfaces 17 b for connecting the recess-bottom surface 17 a with the back surface 11 c. - Each of the
light transmitting portions 17 in the substrate 11 a is formed to have a thickness which allows light to transmit through the light transmitting portion 17, so that a part of irradiation light onto the light transmitting portions 17 is not converted into electrical charges and is transmitted through the photoelectric conversion section 11. For example, by forming the substrate 11 a so that each of its parts located in the light transmitting portions 17 has a thickness of 2-3 μm, about 50% of light having a longer wavelength than that of near infrared light can be caused to transmit through the light transmitting portions 17. - Each of the
inclined surfaces 17 b is set to be at an angle at which light reflected by the inclined surfaces 17 b is not directed to the condenser lenses 21 a of the phase difference detection unit 20, which will be described later, when light is transmitted through the light transmitting portions 17. Thus, formation of a non-real image on a line sensor 24 a, which will be described later, is prevented. - Each of the
light transmitting portions 17 serves as a reduced-thickness portion, which transmits light entering the imaging device 10, i.e., which allows light entering the imaging device 10 to pass therethrough. The term "passing" includes the concept of "transmitting" at least in this specification. - The
imaging device 10 configured in the above-described manner is held in the package 31 (see FIG. 2). The package 31 serves as a holding portion. - Specifically, the
package 31 includes a flat bottom plate 31 a provided with a frame 32, and upright walls 31 b provided in four directions. The imaging device 10 is mounted on the frame 32 to be surrounded by the upright walls 31 b in four directions, and is electrically connected to the frame 32 via bonding wires. - Moreover, a
cover glass 33 is attached to ends of the upright walls 31 b of the package 31 to cover the imaging plane of the imaging device 10 (on which the light receiving sections 11 b are provided). The cover glass 33 protects the imaging plane of the imaging device 10 from dust and the like. - In this case, the same number of
openings 31 c as the number of the light transmitting portions 17 are formed in the bottom plate 31 a of the package 31 to pass through the bottom plate 31 a and be located at positions corresponding to the positions of the light transmitting portions 17 of the imaging device 10. With the openings 31 c provided, light transmitted through the imaging device 10 reaches the phase difference detection unit 20, which will be described later. The openings 31 c serve as light passing portions. - In the
bottom plate 31 a of the package 31, the openings 31 c do not necessarily have to be formed to pass through the bottom plate 31 a. That is, as long as light transmitted through the imaging device 10 can reach the phase difference detection unit 20, a configuration in which transparent portions or semi-transparent portions are formed in the bottom plate 31 a, or a like configuration, may be employed. - The phase
difference detection unit 20 is provided in the back surface (an opposite surface to a surface facing an object) side of the imaging device 10 and receives light transmitted through the imaging device 10 to perform phase difference detection. Specifically, the phase difference detection unit 20 converts the received transmitted light into an electrical signal to be used for distance measurement. The phase difference detection unit 20 serves as a phase difference detection section. - As shown in
FIGS. 2 and 5, the phase difference detection unit 20 includes a condenser lens unit 21, a mask member 22, a separator lens unit 23, a line sensor unit 24, and a module frame 25 to which the condenser lens unit 21, the mask member 22, the separator lens unit 23 and the line sensor unit 24 are attached. The condenser lens unit 21, the mask member 22, the separator lens unit 23 and the line sensor unit 24 are arranged in this order along the thickness direction of the imaging device 10 from the imaging device 10 side. - The plurality of
condenser lenses 21 a integrated into a single unit form the condenser lens unit 21. The same number of the condenser lenses 21 a as the number of the light transmitting portions 17 are provided. Each of the condenser lenses 21 a collects incident light. The condenser lens 21 a collects light which has been transmitted through the imaging device 10 and is spreading out, and guides the light to a separator lens 23 a of the separator lens unit 23, which will be described later. Each of the condenser lenses 21 a is formed so that an incident surface 21 b of the condenser lens 21 a has a convex shape and a part thereof located close to the incident surface 21 b has a circular column shape. - Since an incident angle of light entering each of the
separator lenses 23 a is reduced by providing the condenser lenses 21 a, an aberration of the separator lens 23 a can be reduced, and a distance between object images on a line sensor 24 a, which will be described later, can be reduced. As a result, the size of each of the separator lenses 23 a and the line sensors 24 a can be reduced. Additionally, when a focus position of an object image from the imaging optical system greatly diverges from the imaging unit 1 (specifically, greatly diverges from the imaging device 10 of the imaging unit 1), the contrast of the image is remarkably reduced. According to this embodiment, however, due to the size-reduction effect of the condenser lenses 21 a and the separator lenses 23 a, reduction in contrast can be prevented, so that a focus detection range can be increased. If highly accurate phase difference detection around a focus position is performed, or if the separator lenses 23 a, the line sensors 24 a and the like are of sufficient dimensions, the condenser lens unit 21 does not have to be provided. - The
mask member 22 is provided between the condenser lens unit 21 and the separator lens unit 23. In the mask member 22, two mask openings 22 a are formed in a part thereof corresponding to each of the separator lenses 23 a. That is, the mask member 22 divides a lens surface of each of the separator lenses 23 a into two areas, so that only the two areas are exposed toward the condenser lenses 21 a. More specifically, the mask member 22 performs pupil division to divide light which has been collected by the condenser lenses 21 a into two light beams, and causes the two light beams to enter the separator lens 23 a. The mask member 22 can prevent harmful light from one of two adjacent separator lenses 23 a from entering the other one of the two. Note that the mask member 22 does not have to be provided. - The
separator lens unit 23 includes a plurality of separator lenses 23 a. In other words, the separator lenses 23 a are integrated into a single unit to form the separator lens unit 23. Like the condenser lenses 21 a, the same number of the separator lenses 23 a as the number of the light transmitting portions 17 are provided. Each of the separator lenses 23 a forms two identical object images on the line sensor 24 a from the two light beams which have passed through the mask member 22 and have entered the separator lens 23 a. - The
line sensor unit 24 includes a plurality of line sensors 24 a and a mounting portion 24 b on which the line sensors 24 a are mounted. Like the condenser lenses 21 a, the same number of the line sensors 24 a as the number of the light transmitting portions 17 are provided. Each of the line sensors 24 a receives an image formed on an imaging plane and converts the image into an electrical signal. That is, a distance between the two object images can be detected from an output of the line sensor 24 a, and a shift amount (defocus amount: Df amount) of a focus of an object image to be formed on the imaging device 10, and the direction (defocus direction) in which the focus is shifted, can be obtained based on the distance. (The Df amount, the defocus direction and the like will be hereinafter also referred to as "defocus information.") - The
condenser lens unit 21, the mask member 22, the separator lens unit 23 and the line sensor unit 24, configured in the above-described manner, are provided in the module frame 25. - The
module frame 25 is a member formed to have a frame shape, and an attaching section 25 a is provided on an inner circumference surface of the module frame 25 to inwardly protrude. A first attaching portion 25 b and a second attaching portion 25 c are formed into a step-like shape at a part of the attaching section 25 a located closer to the imaging device 10. Moreover, a third attaching portion 25 d is formed at a part of the attaching section 25 a located at an opposite side to the imaging device 10. - The
mask member 22 is attached to a side of the second attaching portion 25 c of the module frame 25 located closer to the imaging device 10, and the condenser lens unit 21 is attached to the first attaching portion 25 b. As shown in FIGS. 2 and 5, the condenser lens unit 21 and the mask member 22 are formed so that their edge portions fit in the module frame 25 when the condenser lens unit 21 and the mask member 22 are attached to the first attaching portion 25 b and the second attaching portion 25 c. Thus, the positions of the condenser lens unit 21 and the mask member 22 are determined relative to the module frame 25. - The
separator lens unit 23 is attached to a side of the third attaching portion 25 d of the module frame 25 located opposite to the imaging device 10. The third attaching portion 25 d is provided with positioning pins 25 e and direction reference pins 25 f, each protruding at an opposite side to the condenser lens unit 21 side. The separator lens unit 23 is provided with positioning holes 23 b and direction reference holes 23 c corresponding respectively to the positioning pins 25 e and the direction reference pins 25 f. Respective diameters of the positioning pins 25 e and the positioning holes 23 b are determined so that the positioning pins 25 e closely fit in the positioning holes 23 b. Respective diameters of the direction reference pins 25 f and the direction reference holes 23 c are determined so that the direction reference pins 25 f loosely fit in the direction reference holes 23 c. That is, the attitude of the separator lens unit 23, such as the direction in which the separator lens unit 23 is arranged when being attached to the third attaching portion 25 d, is defined by inserting the positioning pins 25 e and the direction reference pins 25 f of the third attaching portion 25 d in the positioning holes 23 b and the direction reference holes 23 c, and the position of the separator lens unit 23 is determined relative to the third attaching portion 25 d by the close fit of the positioning pins 25 e with the positioning holes 23 b. Thus, when the attitude and position of the separator lens unit 23 are determined and the separator lens unit 23 is then attached, the lens surface of each of the separator lenses 23 a is directed toward the condenser lens unit 21 and faces an associated one of the mask openings 22 a. - In the above-described manner, the
condenser lens unit 21, the mask member 22 and the separator lens unit 23 are attached while being held in positions determined relative to the module frame 25. That is, the positional relation of the condenser lens unit 21, the mask member 22 and the separator lens unit 23 is determined by the module frame 25. - Then, the
line sensor unit 24 is attached to a side of the module frame 25 located closer to the back surface side of the separator lens unit 23 (which is the opposite side to the condenser lens unit 21 side). In this case, the line sensor unit 24 is attached to the module frame 25 while being held in a position which allows light transmitted through each of the separator lenses 23 a to enter an associated one of the line sensors 24 a. - Thus, the
condenser lens unit 21, the mask member 22, the separator lens unit 23 and the line sensor unit 24 are attached to the module frame 25, and thereby the condenser lenses 21 a, the mask member 22, the separator lenses 23 a and the line sensors 24 a are arranged at determined positions so that incident light to the condenser lenses 21 a is transmitted through the condenser lenses 21 a to enter the separator lenses 23 a via the mask member 22, and then the light transmitted through the separator lenses 23 a forms an image on each of the line sensors 24 a. - The
imaging device 10 and the phase difference detection unit 20 configured in the above-described manner are joined together. Specifically, the imaging device 10 and the phase difference detection unit 20 are configured so that the openings 31 c of the package 31 in the imaging device 10 closely fit the condenser lenses 21 a in the phase difference detection unit 20. That is, with the condenser lenses 21 a in the phase difference detection unit 20 inserted in the openings 31 c of the package 31 in the imaging device 10, the module frame 25 is bonded to the package 31. Thus, the respective positions of the imaging device 10 and the phase difference detection unit 20 are determined, and then the imaging device 10 and the phase difference detection unit 20 are joined together while being held in those positions. As described above, the condenser lenses 21 a, the separator lenses 23 a and the line sensors 24 a are integrated into a single unit, and then are attached as a single unit to the package 31. - The
imaging device 10 and the phase difference detection unit 20 may be configured so that all of the openings 31 c closely fit the condenser lenses 21 a. Alternatively, the imaging device 10 and the phase difference detection unit 20 may also be configured so that only some of the openings 31 c closely fit associated ones of the condenser lenses 21 a, and the rest of the openings 31 c loosely fit associated ones of the condenser lenses 21 a. In the latter case, the imaging device 10 and the phase difference detection unit 20 are preferably configured so that one of the condenser lenses 21 a and one of the openings 31 c located closest to the center of the imaging plane closely fit each other to determine positions in the imaging plane, and furthermore, one of the condenser lenses 21 a and one of the openings 31 c located most distant from the center of the imaging plane closely fit each other to determine circumferential positions (rotation angles) of the condenser lens 21 a and the opening 31 c which are located at the center of the imaging plane. - As a result of joining the
imaging device 10 and the phase difference detection unit 20 together, the condenser lens 21 a, the pair of the mask openings 22 a of the mask member 22, the separator lens 23 a and the line sensor 24 a are arranged in the back surface side of the substrate 11 a to correspond to each of the light transmitting portions 17. - As described above, relative to the
imaging device 10 configured to transmit light therethrough, the openings 31 c are formed in the bottom plate 31 a of the package 31 for housing the imaging device 10, and thereby light transmitted through the imaging device 10 is easily caused to reach the back surface side of the package 31. Also, the phase difference detection unit 20 is arranged in the back surface side of the package 31, and thus a configuration in which light transmitted through the imaging device 10 is received at the phase difference detection unit 20 can be easily realized. - As long as light transmitted through the
imaging device 10 can pass through the openings 31 c formed in the bottom plate 31 a of the package 31 to the back surface side of the package 31, any configuration can be employed for the openings 31 c. However, by forming the openings 31 c as through holes, light transmitted through the imaging device 10 can be caused to reach the back surface side of the package 31 without attenuation. - With the
openings 31 c provided to closely fit the condenser lenses 21 a, positioning of the phase difference detection unit 20 relative to the imaging device 10 can be performed using the openings 31 c. If the condenser lenses 21 a are not provided, the separator lenses 23 a are configured to fit the openings 31 c. Thus, positioning of the phase difference detection unit 20 relative to the imaging device 10 can be performed in the same manner. - In addition, the
condenser lenses 21 a can be provided to pass through the bottom plate 31 a of the package 31 and reach a point close to the substrate 11 a. Thus, the imaging unit 1 can be configured as a compact imaging unit. - The operation of the
imaging unit 1 configured in the above-described manner will be described hereinafter. - When light enters the
imaging unit 1 from an object, the light is transmitted through the cover glass 33 and enters the imaging device 10. The light is collected by the microlenses 16 of the imaging device 10, and then is transmitted through the color filters 15, so that only light of a specific color reaches the light receiving sections 11 b. The light receiving sections 11 b absorb light to generate electrical charges. Generated electrical charges are transferred to the amplifier via the vertical register 12 and the transfer path 13, and are output as an electrical signal. Thus, each of the light receiving sections 11 b converts light into an electrical signal throughout the entire imaging plane, and thereby the imaging device 10 converts an object image formed on the imaging plane into an electrical signal for generating an image signal. - In the
light transmitting portions 17, a part of irradiation light to the imaging device 10 is transmitted through the imaging device 10. The light transmitted through the imaging device 10 enters the condenser lenses 21 a, which are provided to closely fit the openings 31 c of the package 31. The light transmitted through and collected by each of the condenser lenses 21 a is divided into two light beams when passing through each pair of mask openings 22 a formed in the mask member 22, and then enters each of the separator lenses 23 a. Light subjected to pupil division is transmitted through the separator lens 23 a, and identical object images are formed at two positions on the line sensor 24 a. The line sensor 24 a performs photoelectric conversion to generate an electrical signal from the object images and outputs the electrical signal. - In this case, the electrical signal output from the
imaging device 10 is input to the body microcomputer 50 via the imaging unit control section 52. The body microcomputer 50 obtains positional information of each of the light receiving sections 11 b and output data corresponding to the amount of light received by the light receiving section 11 b from the entire imaging plane of the imaging device 10, thereby obtaining an object image formed on the imaging plane as an electrical signal. - In this case, in the
light receiving sections 11 b, even when the same light amount is received, the amounts of accumulated charges differ among lights having different wavelengths. Thus, outputs from the light receiving sections 11 b of the imaging device 10 are corrected according to the types of the color filters 15 corresponding to the light receiving sections 11 b. For example, a correction amount for each pixel is determined so that, when each of an R pixel 11 b to which the red color filter 15 r is provided, a G pixel 11 b to which the green color filter 15 g is provided, and a B pixel 11 b to which the blue color filter 15 b is provided receives the same amount of light corresponding to the color of its color filter, the respective outputs of the R pixel 11 b, the G pixel 11 b and the B pixel 11 b are at the same level. - In this embodiment, the
light transmitting portions 17 are provided in the substrate 11 a, and thus the photoelectric conversion efficiency is reduced in the light transmitting portions 17 compared to the other portions. That is, even when the pixels 11 b receive the same light amount, the amount of accumulated charges is smaller in the ones of the pixels 11 b provided at positions corresponding to the light transmitting portions 17 than in the other ones of the pixels 11 b provided at positions corresponding to the other portions. Accordingly, when the same image processing as the image processing for output data from the pixels 11 b provided at positions corresponding to the other portions is performed on output data from the pixels 11 b provided at positions corresponding to the light transmitting portions 17, parts of an image corresponding to the light transmitting portions 17 might not be properly shot (for example, the shot image is dark). Therefore, an output of each of the pixels 11 b in the light transmitting portions 17 is corrected to eliminate or reduce the influence of the light transmitting portions 17 (for example, by amplifying the output of each of the pixels 11 b in the light transmitting portions 17 or by a like method). - Reduction in output varies depending on the wavelength of light. That is, as the wavelength increases, the transmittance of the
substrate 11 a increases. Thus, depending on the types of the color filters 15, the amount of light transmitted through the substrate 11 a differs. Therefore, when correction to eliminate or reduce the influence of the light transmitting portions 17 on each of the pixels 11 b corresponding to the light transmitting portions 17 is performed, the correction amount is changed according to the wavelength of light received by each of the pixels 11 b. That is, for each of the pixels 11 b corresponding to the light transmitting portions 17, the correction amount is increased as the wavelength of light received by the pixel 11 b increases. - As described above, in each of the
pixels 11 b, the correction amount for eliminating or reducing the difference in the amount of accumulated charges depending on the color of received light is determined. In addition to that correction, correction to eliminate or reduce the influence of the light transmitting portions 17 is performed. That is, the correction amount for eliminating or reducing the influence of the light transmitting portions 17 is the difference between the correction amount for each of the pixels 11 b corresponding to the light transmitting portions 17 and the correction amount for the pixels 11 b which correspond to the other portions and receive light of the same color. In this embodiment, different correction amounts are determined for different colors based on the following relationship, and thus, a stable image output can be obtained. -
Rk > Gk > Bk [Expression 1] - where Rk is the difference obtained by deducting the correction amount for R pixels in the portions other than the
light transmitting portions 17 from the correction amount for R pixels in the light transmitting portions 17; Gk is the difference obtained by deducting the correction amount for G pixels in the portions other than the light transmitting portions 17 from the correction amount for G pixels in the light transmitting portions 17; and Bk is the difference obtained by deducting the correction amount for B pixels in the portions other than the light transmitting portions 17 from the correction amount for B pixels in the light transmitting portions 17. - Specifically, since the transmittance of red light, which has the longest wavelength, is the highest of the transmittances of red, green and blue light, the difference in the correction amount for red pixels is the largest. Also, since the transmittance of blue light, which has the shortest wavelength, is the lowest of the transmittances of red, green and blue light, the difference in the correction amount for blue pixels is the smallest.
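As a concrete illustration of this relationship, the correction can be modeled as a per-pixel gain. The sketch below is hypothetical: the gain values are invented for illustration, and the embodiment's actual correction amounts are not specified in the text.

```python
# Hypothetical sketch of the per-pixel output correction. All gain values
# are illustrative assumptions, not values taken from the embodiment.

# Base gains that equalize R, G and B pixel outputs for equal incident light.
BASE_GAIN = {"R": 1.9, "G": 1.0, "B": 1.5}

# Extra gain for pixels located over a light transmitting portion. Because
# substrate transmittance rises with wavelength, the extra gain satisfies
# Rk > Gk > Bk, matching Expression 1.
EXTRA_GAIN = {"R": 0.6, "G": 0.4, "B": 0.2}

def corrected_output(raw, color, in_light_transmitting_portion):
    """Correct one pixel's raw output according to its color filter and to
    whether it sits over a light transmitting portion."""
    gain = BASE_GAIN[color]
    if in_light_transmitting_portion:
        gain += EXTRA_GAIN[color]
    return raw * gain

# The difference between the corrections for same-color pixels inside and
# outside the light transmitting portions is exactly the extra gain.
rk = corrected_output(1.0, "R", True) - corrected_output(1.0, "R", False)
gk = corrected_output(1.0, "G", True) - corrected_output(1.0, "G", False)
bk = corrected_output(1.0, "B", True) - corrected_output(1.0, "B", False)
assert rk > gk > bk
```

The design choice here is simply that the surcharge applied over the light transmitting portions is itself wavelength-ordered, which is exactly what Expression 1 requires.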
- That is, the correction amount of an output of each of the
pixels 11 b in the imaging device 10 is determined based on whether or not the pixel 11 b is provided at a position corresponding to the light transmitting portion 17, and on the color of the color filter 15 corresponding to the pixel 11 b. For example, the correction amount for the output of each of the pixels 11 b is determined so that the white balance and/or intensity is equal between an image produced by outputs from the light transmitting portion 17 and an image produced by outputs from the portions other than the light transmitting portion 17. - The
body microcomputer 50 corrects output data from the light receiving sections 11 b in the above-described manner, and then generates, based on the output data, an image signal including positional information, color information and intensity information for each of the light receiving sections, i.e., the pixels 11 b. Thus, an image signal of the object image formed on the imaging plane of the imaging device 10 is obtained. - By correcting an output from the imaging device 10 in the above-described manner, an object image can be properly captured even by the imaging device 10 provided with the light transmitting portions 17. - An electrical signal output from the
line sensor unit 24 is also input to the body microcomputer 50. The body microcomputer 50 can obtain the distance between the two object images formed on the line sensor 24 a, based on the output from the line sensor unit 24, and can then detect the in-focus state of the object image formed on the imaging device 10 from the obtained distance. For example, when an object image transmitted through the imaging lens is correctly formed on the imaging device 10 (in focus), the two object images formed on the line sensor 24 a are located at predetermined reference positions with a predetermined reference distance therebetween. In contrast, when an object image is formed in front of the imaging device 10 in the direction along the optical axis (front focus), the distance between the two object images is smaller than the reference distance obtained when the object image is in focus. When an object image is formed behind the imaging device 10 in the direction along the optical axis (back focus), the distance between the two object images is larger than the reference distance obtained when the object image is in focus. That is, an output from the line sensor 24 a is amplified, and an operation by the arithmetic circuit then obtains information regarding whether or not an object image has been brought into focus, whether the object is in front focus or back focus, and the Df amount. - According to this embodiment, three light transmitting
portions 17 are formed in theimaging device 10, and in the back surface side of each of thelight transmitting portions 17, thecondenser lens 21 a of the phasedifference detection unit 20, a pair ofmask openings 22 a of themask member 22, theseparator lens 23 a and theline sensor 24 a are provided to be arranged along the optical axis. That is, the imaging unit 1 (specifically, theimaging device 10 and the phase difference detection unit 20) includes three areas (hereinafter also referred to as “phase difference areas”) for detection of a phase difference, in which thecondenser lens 21 a, a pair of themask openings 22 a of themask member 22, theseparator lens 23 a and theline sensor 24 a are arranged along the optical axis. - According to this embodiment, each of the
light transmitting portions 17 is formed in the substrate 11 a to have a smaller thickness than that of a part of the substrate 11 a located around the light transmitting portion 17. However, the present invention is not limited thereto. For example, the thickness of the entire substrate 11 a may be determined so that a sufficient amount of the light irradiating the substrate 11 a is transmitted through the substrate 11 a to reach the phase difference detection unit 20 provided in the back surface side of the substrate 11 a. In such a case, the entire substrate 11 a serves as the light transmitting portion 17. - Also, according to this embodiment, three light transmitting
portions 17 are formed in thesubstrate 11 a, and three phase difference areas are provided. However, the present invention is not limited thereto. The number of each of those components is not limited to three, but may be any number. For example, as shown inFIG. 6 , ninelight transmitting portions 17 may be formed in thesubstrate 11 a, and accordingly, nine sets of thecondenser lens 21 a, theseparator lens 23 a and theline sensor 24 a may be provided, thereby providing nine phase difference areas. - Furthermore, the
imaging device 10 is not limited to a CCD image sensor but, as shown inFIG. 7 , may be a CMOS image sensor. - An
imaging device 210 is a CMOS image sensor, and includes aphotoelectric conversion section 211 made of a semiconductor material,transistors 212,signal lines 213,masks 214,color filters 215, and microlenses 216. - The
photoelectric conversion section 211 includes asubstrate 211 a, and light receivingsections 211 b each being comprised of a photodiode. Thetransistor 212 is provided for each of thelight receiving sections 211 b. Electrical charges accumulated in thelight receiving sections 211 b are amplified by thetransistors 212 and are output to the outside via the signal lines 213. Respective configurations of themasks 214, thecolor filters 215 and the microlenses 216 are the same as those of themask 14, thecolor filter 15 and themicrolens 16. - As in the CCD image sensor, the
light transmitting portions 17 for transmitting irradiation light are formed in the substrate 211 a. The light transmitting portions 17 are formed as concave-shaped recesses by cutting, polishing or etching a surface (hereinafter also referred to as a “back surface”) 211 c of the substrate 211 a opposite to the surface on which the light receiving sections 211 b are provided, and each of the light transmitting portions 17 is formed to have a smaller thickness than that of a part of the substrate 211 a located around it. - In the CMOS image sensor, an amplification rate of the
transistor 212 can be determined for each light receiving section 211 b. Therefore, by determining the amplification rate of each transistor 212 based on whether or not each light receiving section 211 b is located at a position corresponding to the light transmitting portion 17 and on the type of color of the color filter 215 corresponding to the light receiving section 211 b, parts of an image corresponding to the light transmitting portions 17 can be prevented from being improperly captured. - The configuration of an imaging device through which light passes is not limited to the configuration in which the
light transmitting portions 17 are provided in the manner described above. As long as light passes (or is transmitted, as described above) through the imaging device, any configuration can be employed. For example, as shown inFIG. 8 , animaging device 310 including light passingportions 318 each of which includes a plurality of throughholes 318 a formed in asubstrate 311 a may be employed. - Each of the through
holes 318 a is formed to pass through the substrate 311 a in the thickness direction of the substrate 311 a. Specifically, regarding the pixel regions formed on the substrate 311 a and arranged in matrix, when four pixel regions located in two adjacent columns and two adjacent rows are taken as a single unit, the light receiving sections 11 b are provided in three of the four pixel regions, and the through hole 318 a is formed in the remaining one of the four pixel regions. - In the three pixel regions of the four pixel regions in which the
light receiving sections 11 b are provided, three color filters 15 r, 15 g and 15 b corresponding to the light receiving sections 11 b are provided. Specifically, a green color filter 15 g is provided in the light receiving section 11 b located in a position diagonal to the through hole 318 a, a red color filter 15 r is provided in one of the light receiving sections 11 b located adjacent to the through hole 318 a, and a blue color filter 15 b is provided in the other one of the light receiving sections 11 b located adjacent to the through hole 318 a. No color filter is provided in the pixel region corresponding to the through hole 318 a. - In the
imaging device 10, a pixel corresponding to each through hole 318 a is interpolated using outputs of the light receiving sections 11 b located adjacent to the through hole 318 a. Specifically, interpolation (standard interpolation) of a signal of the pixel corresponding to the through hole 318 a is performed using an average value of outputs of the four light receiving sections 11 b each of which is located diagonally adjacent to the through hole 318 a and in which the green color filter 15 g is provided. Alternatively, for those four diagonally adjacent light receiving sections 11 b in which the green color filters 15 g are provided, the change in output of the pair of light receiving sections 11 b adjacent to each other in one diagonal direction is compared to the change in output of the other pair of light receiving sections 11 b adjacent to each other in the other diagonal direction, and then interpolation (slope interpolation) of a signal of the pixel corresponding to the through hole 318 a is performed using the average value of outputs of either the diagonal pair whose change in output is larger or the diagonal pair whose change in output is smaller. Assume that the pixel to be interpolated lies on an edge of an in-focus object. If interpolation is performed using the pair of light receiving sections 11 b whose change in output is larger, the edge is undesirably blurred. Therefore, the smaller change is used when each of the changes is equal to or larger than a predetermined threshold, and the larger change is used when each of the changes is smaller than the predetermined threshold, so that as small a change rate (slope) as possible is employed. - Then, after performing the interpolation of output data of the
light receiving sections 11 b corresponding to the throughholes 318 a, intensity information and color information for the pixel corresponding to each of thelight receiving sections 11 b are obtained using output data of each of thelight receiving sections 11 b and, furthermore, predetermined image processing or image synthesis is performed to generate an image signal. - Thus, it is possible to prevent parts of an image at the
light passing portions 318 from becoming dark. - The
imaging device 310 configured in the above-described manner can cause incident light to pass therethrough via the plurality of the throughholes 318 a. - As described above, also by providing, instead of the
light transmitting portions 17, thelight passing portions 318 made of the throughholes 318 a in thesubstrate 311 a, theimaging device 310 through which light passes can be configured. Moreover, theimaging device 310 is configured so that light from the plurality of throughholes 318 a enters a set of thecondenser lens 21 a, theseparator lens 23 a and theline sensor 24 a, and thus, advantageously, the size of one set of thecondenser lens 21 a, theseparator lens 23 a and theline sensor 24 a is not restricted by the size of pixels. That is, advantageously, the size of one set of thecondenser lens 21 a, theseparator lens 23 a and theline sensor 24 a does not cause any problem in increasing the resolution of theimaging device 310 by reducing the size of pixels. - The
light passing portions 318 may be provided only in each part of thesubstrate 311 a corresponding to thecondenser lenses 21 a and theseparator lens 23 a of the phasedifference detection unit 20, or may be provided throughout theentire substrate 311 a. - Furthermore, the phase
difference detection unit 20 is not limited to the above-described configuration. For example, as long as a configuration in which the positions of thecondenser lenses 21 a and theseparator lens 23 a are determined relative to thelight transmitting portions 17 of theimaging device 10 is provided, thecondenser lenses 21 a do not necessarily have to closely fit theopenings 31 c of thepackage 31. Also, a configuration which does not include a condenser lens may be employed. Alternatively, a configuration in which a condenser lens and a separator lens are integrated into a single unit may be employed. - As another example, as shown in
FIGS. 9 and 10 , a phasedifference detection unit 420 in which acondenser lens unit 421, amask member 422, aseparator lens unit 423 and aline sensor unit 424 are provided so as to be arranged in parallel to the imaging plane of theimaging device 10 in the back surface side of theimaging device 10 may be employed. - Specifically, the
condenser lens unit 421 is configured so that a plurality ofcondenser lenses 421 a are integrated into a single unit, and includes anincident surface 421 b, areflection surface 421 c and an output surface 421 d. That is, in thecondenser lens unit 421, light collected by thecondenser lenses 421 a is reflected by thereflection surface 421 c at an angle of about 90 degrees, and is output from the output surface 421 d. As a result, the light which has been transmitted through theimaging device 10 and has entered thecondenser lens unit 421 is bent substantially at a right angle, and output from the output surface 421 d to be directed to aseparator lens 423 a of aseparator lens unit 423. The light which has entered theseparator lens 423 a is transmitted through theseparator lens 423 a, and forms an image on theline sensor 424 a. - The
condenser lens unit 421, themask member 422, theseparator lens unit 423 and theline sensor unit 424, configured in the above-described manner, are provided within themodule frame 425. - The
module frame 425 is formed to have a box shape, and astep portion 425 a for attaching thecondenser lens unit 421 is provided in themodule frame 425. Thecondenser lens unit 421 is attached to thestep portion 425 a so that thecondenser lenses 421 a face outward from themodule frame 425. - Moreover, in the
module frame 425, an attachment wall portion 425 b for attaching the mask member 422 and the separator lens unit 423 is provided so as to extend upward at a part facing the output surface 421 d of the condenser lens unit 421. An opening 425 c is formed in the attachment wall portion 425 b. - The
mask member 422 is attached to the side of the attachment wall portion 425 b located closer to the condenser lens unit 421. The separator lens unit 423 is attached to the side of the attachment wall portion 425 b located opposite to the condenser lens unit 421. - Thus, the optical path of light which has passed through the
imaging device 10 is bent in the back surface side of the imaging device 10, and thus, the condenser lens unit 421, the mask member 422, the separator lens unit 423, the line sensor unit 424 and the like can be arranged not in the thickness direction of the imaging device 10 but in parallel to the imaging plane of the imaging device 10. Therefore, the dimension of the imaging unit 401 in the thickness direction of the imaging device 10 can be reduced. That is, the imaging unit 401 can be formed as a compact-size imaging unit. - As described above, as long as light which has passed through the
imaging device 10 can be received in the back surface side of theimaging device 10 and then phase difference detection can be performed, a phase difference detection unit having any configuration can be employed. - —Operation of Camera—
- The
camera 100 configured in the above-described manner has various shooting modes and functions. The various shooting modes and functions of thecamera 100, and the operation thereof at the time of each of the modes and functions will be described hereinafter. - —AF Function—
- When the
release button 40 b is pressed halfway down, thecamera 100 performs AF to focus. To perform AF, thecamera 100 has three autofocus functions, i.e., phase difference detection AF, contrast detection AF and hybrid AF. A user can select one of the three autofocus functions to be used by operating theAF setting switch 40 c provided to thecamera body 4. - Assuming that a camera system is in a normal shooting mode, the shooting operation of the camera system using each of the autofocus functions will be described hereinafter. The “normal shooting mode” is not a during-exposure AF shooting mode, a macro shooting mode, or a continuous shooting mode, which will be described later, but a most basic shooting mode of the
camera 100 for normal shooting. - (Phase Difference Detection AF)
- First, the shooting operation of the camera system using phase difference detection AF will be described with reference of
FIGS. 11 and 12 . - When the
power switch 40 a is turned on (Step Sa1), communication between the camera body 4 and the interchangeable lens 7 is performed (Step Sa2). Specifically, power is supplied to the body microcomputer 50 and each of the other units in the camera body 4 to start up the body microcomputer 50. At the same time, power is supplied to the lens microcomputer 80 and each of the other units in the interchangeable lens 7 via the electric contact pieces to start up the lens microcomputer 80. The body microcomputer 50 and the lens microcomputer 80 are programmed to transmit/receive information to/from each other at start-up time. For example, lens information for the interchangeable lens 7 is transmitted from the memory section of the lens microcomputer 80 to the body microcomputer 50, and is then stored in the memory section of the body microcomputer 50. - Subsequently, the
body microcomputer 50 positions thefocus lens group 72 at a predetermined reference position which has been determined in advance by the lens microcomputer 80 (Step Sa3), and also puts theshutter unit 42 into an open state (Step Sa4) in parallel with Step Sa3. Then, the process proceeds to Step Sa5, and thebody microcomputer 50 remains in a standby state until therelease button 40 b is pressed halfway down by the user. - Thus, light which has been transmitted through the
interchangeable lens 7 and has entered thecamera body 4 passes through theshutter unit 42, is transmitted through theOLPF 43 serving also as an IR cutter, and then enters theimaging unit 1. An object image formed in theimaging unit 1 is displayed at theimage display section 44, so that the user can observe an erected image of an object through theimage display section 44. Specifically, thebody microcomputer 50 reads an electrical signal from theimaging device 10 via the imagingunit control section 52 at constant intervals, and performs predetermined image processing to the electrical signal that has been read. Then, thebody microcomputer 50 generates an image signal, and controls the imagedisplay control section 55 to cause theimage display section 44 to display a live view image. - A part of the light which has entered the
imaging unit 1 is transmitted through thelight transmitting portions 17 of theimaging device 10, and enters the phasedifference detection unit 20. - In this case, when the
release button 40 b is pressed halfway down (i.e., S1 switch, which is not shown in the drawings, is turned on) by the user (Step Sa5), thebody microcomputer 50 amplifies an output from theline sensor 24 a of the phasedifference detection unit 20, and then an operation by the arithmetic circuit obtains information regarding whether or not an object image has been brought into focus, whether the object is in front focus or back focus, and the Df amount (Step Sa6). - Thereafter, the
body microcomputer 50 drives thefocus lens group 72 via thelens microcomputer 80 in the defocus direction by the Df amount obtained in Step Sa6 (Step Sa7). - In this case, the phase
difference detection unit 20 of this embodiment includes three sets of the condenser lens 21 a, the mask openings 22 a, the separator lens 23 a and the line sensor 24 a, i.e., has three phase difference areas at which phase difference detection is performed. In the phase difference detection performed in phase difference detection AF or hybrid AF, the focus lens group 72 is driven based on an output of the line sensor 24 a of the set corresponding to a distance measurement point arbitrarily selected by the user. - Alternatively, an automatic optimization algorithm may be installed in the
body microcomputer 50 beforehand to select one of the distance measurement points located closest to the camera and drive thefocus lens group 72. Thus, the rate of the occurrence of focusing on the background of an object instead of the object can be reduced. - Application of this selection of the distance measurement point is not limited to phase difference detection AF. As long as the
focus lens group 72 is driven using the phase difference detection unit 20, this selection can be employed in AF using any method. - Then, whether or not an object image has been brought into focus is determined (Step Sa8). Specifically, if the Df amount obtained based on the output of the
line sensor 24 a is equal to or smaller than a predetermined value, it is determined that an object image has been brought into focus (YES), and then, the process proceeds to Step Sa11. If the Df amount obtained based on the output of theline sensor 24 a is larger than the predetermined value, it is determined that an object has not been brought into focus (NO), the process returns to Step Sa6, and Steps Sa6-Sa8 are repeated. - In the above-described manner, detection of an in-focus state and driving of the
focus lens group 72 are repeated and, when the Df amount is equal to or smaller than the predetermined value, it is determined that an object image has been brought into focus, and driving of the focus lens group 72 is halted. - In parallel with phase difference detection AF in Steps Sa6-Sa8, photometry is performed (Step Sa9), and image blur detection is also started (Step Sa10).
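The detect-and-drive cycle of Steps Sa6-Sa8 can be sketched as follows. This is a hypothetical illustration, not the embodiment's firmware: `read_df` and `drive_focus_lens` stand in for the line sensor arithmetic and the lens drive commands, and the in-focus threshold is an assumed value.

```python
def phase_difference_af(read_df, drive_focus_lens, threshold=0.02, max_cycles=50):
    """Repeat phase difference detection (Step Sa6) and lens driving
    (Step Sa7) until the Df amount falls to or below the threshold
    (Step Sa8), then halt.

    read_df() returns a signed defocus (Df) amount: negative for front
    focus (line sensor images closer together than the reference
    distance), positive for back focus. drive_focus_lens(amount) moves
    the focus lens group by the given signed amount."""
    for _ in range(max_cycles):
        df = read_df()
        if abs(df) <= threshold:
            return True          # in focus: halt driving of the lens
        drive_focus_lens(-df)    # drive in the defocus direction by Df
    return False                 # did not converge within max_cycles

# Toy model: a lens whose measured defocus equals its position offset.
offset = [0.37]
in_focus = phase_difference_af(lambda: offset[0],
                               lambda amount: offset.__setitem__(0, offset[0] + amount))
```

Because phase difference detection yields a signed distance to focus, the loop typically converges in very few cycles, unlike the step-and-compare search of contrast detection AF.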
- Specifically, in Step Sa9, the amount of light entering the
imaging device 10 is measured by theimaging device 10. That is, in this embodiment, the above-described phase difference detection AF is performed using light which has entered theimaging device 10 and has been transmitted through theimaging device 10, and thus, photometry can be performed using theimaging device 10 in parallel with the above-described phase difference detection AF. - More specifically, the
body microcomputer 50 retrieves an electrical signal from theimaging device 10 via the imagingunit control section 52, and measures the intensity of object light based on the electrical signal, thereby performing photometry. According to a predetermined algorithm, thebody microcomputer 50 determines, from a result of photometry, a shutter speed and an aperture value, which correspond to a shooting mode at the time of exposure. - When photometry is terminated in Step Sa9, image blur detection is started in Step Sa10. Step Sa9 and Step Sa10 may be performed in parallel.
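The determination of a shutter speed and an aperture value from a photometry result can be illustrated with the APEX relation Ev = Av + Tv. The program line below (roughly half the exposure value to the aperture, the rest to the shutter) is purely an assumed example; the patent does not disclose the body microcomputer's actual algorithm.

```python
import math

def exposure_parameters(metered_ev, max_av=8.0, min_av=1.0):
    """Split a metered exposure value (Ev) into an aperture value (Av) and
    a time value (Tv) along a simple hypothetical program line, then
    convert them to an f-number and a shutter speed in seconds."""
    av = min(max(metered_ev / 2, min_av), max_av)  # give about half the Ev to the aperture
    tv = metered_ev - av                           # remainder goes to the shutter
    f_number = math.sqrt(2.0 ** av)                # Av = log2(N^2), so N = sqrt(2^Av)
    shutter_seconds = 2.0 ** -tv                   # Tv = log2(1/t), so t = 2^-Tv
    return f_number, shutter_seconds
```

For a bright scene metered at Ev 12, for instance, this sketch yields f/8 at 1/64 s.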
- When the
release button 40 b is pressed halfway down by the user, various pieces of information for shooting are displayed as well as a shooting image at theimage display section 44, and thus, the user can confirm each piece of information through theimage display section 44. - In Step Sa11, the
body microcomputer 50 remains in a standby state until therelease button 40 b is pressed all the way down (i.e., a S2 switch, which is not shown in the drawings, is turned on) by the user. When therelease button 40 b is pressed all the way down by the user, thebody microcomputer 50 temporarily puts theshutter unit 42 into a close state (Step Sa12). Then, while theshutter unit 42 is kept in a close state, electrical charges stored in thelight receiving sections 11 b of theimaging device 10 are transferred for exposure, which will be described later. - Thereafter, the
body microcomputer 50 starts correction of an image blur based on communication information between thecamera body 4 and theinterchangeable lens 7 or any information specified by the user (Step Sa13). Specifically, the blur correctionlens driving section 74 a in theinterchangeable lens 7 is driven based on information of theblur detection section 56 in thecamera body 4. According to the intention of the user, any one of (i) use of theblur detection section 84 and the blur correctionlens driving section 74 a in theinterchangeable lens 7, (ii) use of theblur detection section 56 and theblur correction unit 45 in thecamera body 4, and (iii) use of theblur detection section 84 in theinterchangeable lens 7 and theblur correction unit 45 in thecamera body 4 can be selected. - By starting driving of the image blur correction sections at a time when the
release button 40 b is pressed halfway down, the movement of an object desired to be in focus is reduced, and thus, phase difference detection AF can be performed with higher accuracy. - In parallel with starting of image blur correction, the
body microcomputer 50 stops down theaperture section 73 via thelens microcomputer 80 so as to attain an aperture value calculated based on a result of photometry in Step Sa9 (Step Sa14). - Thus, when the image blur correction is started and the aperture operation is terminated, the
body microcomputer 50 puts theshutter unit 42 into an open state based on the shutter speed obtained from the result of photometry in Step Sa9 (Step Sa15). In the above-described manner, theshutter unit 42 is put into an open state, so that light from the object enters theimaging device 10, and electrical charges are stored in theimaging device 10 only for a predetermined time (Step Sa16). - The
body microcomputer 50 puts theshutter unit 42 into a close state based on the shutter speed, to terminate exposure (Step Sa17). After the termination of the exposure, in thebody microcomputer 50, image data is read out from theimaging unit 1 via the imagingunit control section 52 and then, after performing predetermined image processing to the image data, the image data is output to the imagedisplay control section 55 via the image reading/recording section 53. Thus, a shooting image is displayed at theimage display section 44. Thebody microcomputer 50 stores the image data in theimage storage section 58 via the imagerecording control section 54 as necessary. - Thereafter, the
body microcomputer 50 terminates image blur correction (Step Sa18), and releases the aperture section 73 (Step Sa19). Then, thebody microcomputer 50 puts theshutter unit 42 into an open state (Step Sa20). - When a reset operation is terminated, the
lens microcomputer 80 notifies the body microcomputer 50 of the termination of the reset operation. The body microcomputer 50 waits for the reset termination information from the lens microcomputer 80 and for the series of post-exposure processes to be terminated. Thereafter, the body microcomputer 50 confirms that the release button 40 b is not in a pressed state, and terminates the shooting sequence. Then, the process returns to Step Sa5, and the body microcomputer 50 remains in a standby state until the release button 40 b is pressed halfway down. - When the
power switch 40 a is turned off (Step Sa21), thebody microcomputer 50 moves thefocus lens group 72 to a predetermined reference position which has been determined in advance (Step Sa22), and puts theshutter unit 42 into a close state (Step Sa23). Then, respective operations of thebody microcomputer 50 and other units in thecamera body 4, and thelens microcomputer 80 and other units in theinterchangeable lens 7 are halted. - As described above, in the shooting operation of the camera system using phase difference detection AF, photometry is performed by the
imaging device 10 in parallel with autofocusing based on the phasedifference detection unit 20. Specifically, the phasedifference detection unit 20 receives light transmitted through theimaging device 10 to obtain defocus information, and thus, whenever the phasedifference detection unit 20 obtains defocus information, theimaging device 10 is irradiated with light from an object. Therefore, photometry is performed using light transmitted through theimaging device 10 in autofocusing. By doing so, a photometry sensor does not have to be additionally provided, and photometry can be performed before therelease button 40 b is pressed all the way down, so that a time (hereinafter also referred to as a “release time lag”) from a time point when therelease button 40 b is pressed all the way down to a time point when exposure is terminated can be reduced. - Moreover, even in a configuration in which photometry is performed before the
release button 40 b is pressed all the way down, by performing photometry in parallel with autofocusing, increase in processing time after therelease button 40 b is pressed halfway down can be prevented. In such a case, a mirror for guiding light from an object to a photometry sensor or a phase difference detection unit does not have to be provided. - Conventionally, a part of light from an object to an imaging apparatus is directed to a phase difference detection unit provided outside the imaging apparatus by a mirror or the like. In contrast, according to this embodiment, an in-focus state can be detected by the phase
difference detection unit 20 using light guided to theimaging unit 1 as it is, and thus, the in-focus state can be detected with very high accuracy. - (Contrast Detection AF)
- Next, the shooting operation of the camera system using contrast detection AF will be described with reference to
FIG. 13 . - When the
power switch 40 a is turned on (Step Sb1), communication between thecamera body 4 and theinterchangeable lens 7 is performed (Step Sb2), thefocus lens group 72 is positioned at a predetermined reference position (Step Sb3), theshutter unit 42 is put into an open state (Step Sb4) in parallel with Step Sb3, and then, thebody microcomputer 50 remains in a standby state until therelease button 40 b is pressed halfway down (Step Sb5). The above-described steps are the same as Steps Sa1-Sa5. - When the
release button 40b is pressed halfway down by the user (Step Sb5), the body microcomputer 50 drives the focus lens group 72 via the lens microcomputer 80 (Step Sb6). Specifically, the body microcomputer 50 drives the focus lens group 72 so that a focal point of an object image is moved in a predetermined direction (e.g., toward an object) along the optical axis. - Then, the
body microcomputer 50 obtains a contrast value for the object image, based on an output from the imaging device 10 received by the body microcomputer 50 via the imaging unit control section 52, to determine whether or not the contrast value is reduced (Step Sb7). If the contrast value is reduced (YES), the process proceeds to Step Sb8. If the contrast value is increased (NO), the process proceeds to Step Sb9. - Reduction in contrast value means that the
focus lens group 72 is driven in an opposite direction to the direction in which the object image is brought into focus. Therefore, when the contrast value is reduced, the focus lens group 72 is reversely driven so that the focal point of the object image is moved in an opposite direction to the predetermined direction (e.g., toward the opposite side to the object) along the optical axis (Step Sb8). Thereafter, whether or not a contrast peak has been detected is determined (Step Sb10). If the contrast peak has not been detected (NO), reverse driving of the focus lens group 72 (Step Sb8) is repeated. If the contrast peak has been detected (YES), reverse driving of the focus lens group 72 is halted, and the focus lens group 72 is moved to a position where the contrast value has reached the peak. Then, the process proceeds to Step Sa11. - On the other hand, when the
focus lens group 72 is driven in Step Sb6 and the contrast value is increased, the focus lens group 72 is driven in the direction in which the object image is brought into focus. Therefore, driving of the focus lens group 72 is continued (Step Sb9), and whether or not a peak of the contrast value has been detected is determined (Step Sb10). If the contrast peak has not been detected (NO), driving of the focus lens group 72 (Step Sb9) is repeated. If the contrast peak has been detected (YES), driving of the focus lens group 72 is halted, and the focus lens group 72 is moved to a position where the contrast value has reached the peak. Then, the process proceeds to Step Sa11. - As has been described, in the contrast detection method, the
focus lens group 72 is tentatively driven (Step Sb6). Then, if the contrast value is reduced, the focus lens group 72 is reversely driven to search for the peak of the contrast value (Steps Sb8 and Sb10). If the contrast value is increased, driving of the focus lens group 72 is continued to search for the peak of the contrast value (Steps Sb9 and Sb10). - In parallel with this contrast detection AF (Steps Sb6-Sb10), photometry is performed (Step Sb11), and also image blur detection is started (Step Sb12). Steps Sb11 and Sb12 are the same as Steps Sa9 and Sa10 in phase difference detection AF.
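The hill-climbing search of Steps Sb6-Sb10 described above can be sketched as follows. This is an illustrative model only: the hypothetical scene_contrast(position) function stands in for the contrast value calculated from the output of the imaging device 10, and the actual lens drive performed by the body microcomputer 50 is not shown.

```python
# Hypothetical sketch of contrast detection AF (Steps Sb6-Sb10):
# tentatively drive the lens, reverse if the contrast falls, and stop
# at the position where the contrast value passes its peak.

def contrast_af(scene_contrast, position, step=1.0, limit=1000):
    """Return the focus lens position at the contrast peak."""
    prev = scene_contrast(position)
    direction = +1                     # tentative drive direction (Step Sb6)
    position += direction * step
    cur = scene_contrast(position)
    if cur < prev:                     # contrast reduced: reverse (Step Sb8)
        direction = -1
    # keep driving until the contrast passes its peak (Steps Sb9/Sb10)
    for _ in range(limit):
        prev, position = cur, position + direction * step
        cur = scene_contrast(position)
        if cur < prev:                 # peak passed: move back to the peak
            return position - direction * step
    return position
```

With a contrast curve peaking at a lens position of 12, the search returns 12.0; if the tentative drive starts in the wrong direction, the reversal in Step Sb8 recovers it.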
- In Step Sa11, the
body microcomputer 50 remains in a standby state until the release button 40b is pressed all the way down by the user. A flow of steps after the release button 40b is pressed all the way down is the same as that of phase difference detection AF. - In this contrast detection AF, a contrast peak can be directly obtained, and thus, as opposed to phase difference detection AF, various correction operations such as release back correction (for correcting an out-of-focus state due to the degree of aperture) and the like are not necessary, so that a highly accurate focusing performance can be achieved. However, to detect the peak of a contrast value, the
focus lens group 72 has to be driven until the focus lens group 72 passes through a position where the contrast value reaches its peak. Accordingly, the focus lens group 72 has to be moved beyond the position where the contrast value reaches the peak first and then be moved back to the position corresponding to the peak of the contrast value, and thus, a backlash generated in the focus lens group driving system by driving the focus lens group 72 back and forth has to be removed. - (Hybrid AF)
- Subsequently, the shooting operation of the camera system using hybrid AF will be described with reference to
FIG. 14. - Steps (Steps Sc1-Sc5) from the step in which the
power switch 40a is turned on to the step in which the body microcomputer remains in a standby state until the release button 40b is pressed halfway down are the same as Steps Sa1-Sa5 in phase difference detection AF. - When the
release button 40b is pressed halfway down by the user (Step Sc5), the body microcomputer 50 amplifies an output from the line sensor 24a of the phase difference detection unit 20, and then performs an operation by the arithmetic circuit, thereby determining whether or not an object image has been brought into focus (Step Sc6). Furthermore, the body microcomputer 50 obtains information regarding whether the object is in front focus or back focus and the Df amount, and then, obtains defocus information (Step Sc7). Thereafter, the process proceeds to Step Sc10. - In parallel with Steps Sc6 and Sc7, photometry is performed (Step Sc8), and also image blur detection is started (Step Sc9). Steps Sc8 and Sc9 are the same as Steps Sa9 and Sa10 in phase difference detection AF. Thereafter, the process proceeds to Step Sc10. Note that, after Step Sc9, the process may also proceed to Step Sa11, instead of Sc10.
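The determination in Steps Sc6 and Sc7 can be illustrated with a minimal sketch. The reference separation, the tolerance, and the conversion factor K below are invented values for illustration only (a real unit would obtain them from calibration of the phase difference detection unit 20), and the mapping of the sign of the separation error to front focus versus back focus is likewise an assumption.

```python
# Hypothetical sketch of Steps Sc6/Sc7: derive defocus information from
# the separation of the two object images on the line sensor.

REFERENCE_SEPARATION = 48.0   # assumed separation (pixels) when in focus
K = 0.01                      # assumed mm of defocus per pixel of error
IN_FOCUS_TOLERANCE = 0.5      # assumed in-focus tolerance in pixels

def defocus_info(separation):
    """Return (in_focus, direction, df_amount) for a measured separation."""
    error = separation - REFERENCE_SEPARATION
    if abs(error) <= IN_FOCUS_TOLERANCE:
        return True, "in-focus", 0.0
    # sign convention is an assumption for this sketch
    direction = "front focus" if error < 0 else "back focus"
    return False, direction, abs(error) * K
```

For example, a separation equal to the reference reports an in-focus state, while a separation 10 pixels too wide reports back focus with a Df amount of 10 × K.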
- As described above, in this embodiment, using light which has entered the
imaging device 10 and has been transmitted through the imaging device 10, the above-described focus detection based on a phase difference is performed. Thus, in parallel with the above-described focus detection, photometry can be performed using the imaging device 10. - In Step Sc10, the
body microcomputer 50 drives the focus lens group 72 based on the defocus information obtained in Step Sc7. - The
body microcomputer 50 determines whether or not a contrast peak has been detected (Step Sc11). If the contrast peak has not been detected (NO), driving of the focus lens group 72 (Step Sc10) is repeated. If the contrast peak has been detected (YES), driving of the focus lens group 72 is halted, and the focus lens group 72 is moved to a position where the contrast value has reached the peak. Then, the process proceeds to Step Sa11. - Specifically, in Steps Sc10 and Sc11, it is preferable that, based on the defocus direction and the defocus amount calculated in Step Sc7, the
focus lens group 72 is moved at high speed, and then, the focus lens group 72 is moved at a lower speed than the high speed to detect the contrast peak. - In this case, it is preferable that a moving amount of the
focus lens group 72 which is moved based on the calculated defocus amount (i.e., a position to which the focus lens group 72 is to be moved) is set to be different from that in Step Sa7 in phase difference detection AF. Specifically, in Step Sa7 in phase difference detection AF, the focus lens group 72 is moved to a position which is estimated as a focus position, based on the defocus amount. In contrast, in Step Sc10 in hybrid AF, the focus lens group 72 is driven to a position shifted forward or backward from the position estimated as a focus position based on the defocus amount. Thereafter, in hybrid AF, the contrast peak is detected while the focus lens group 72 is driven toward the position estimated as the focus position. - In Step Sa11, the
body microcomputer 50 remains in a standby state until the release button 40b is pressed all the way down by the user. A flow of steps after the release button 40b is pressed all the way down is the same as that of phase difference detection AF. - As has been described, in hybrid AF, first, defocus information is obtained by the phase
difference detection unit 20, and the focus lens group 72 is driven based on the defocus information. Then, the position of the focus lens group 72 at which the contrast value calculated based on an output from the imaging device 10 reaches its peak is detected, and the focus lens group 72 is moved to that position. Thus, defocus information can be detected before driving the focus lens group 72, and therefore, as opposed to contrast detection AF, the step of tentatively driving the focus lens group 72 is not necessary. This allows reduction in processing time for autofocusing. Moreover, an object image is eventually brought into focus by contrast detection AF, and therefore, particularly, an object having a repetitive pattern, an object having extremely low contrast, and the like can be brought into focus with higher accuracy than in phase difference detection AF. - Since defocus information is obtained by the phase
difference detection unit 20 using light transmitted through the imaging device 10, photometry by the imaging device 10 can be performed in parallel with obtaining defocus information by the phase difference detection unit 20, although hybrid AF includes phase difference detection. As a result, a mirror for dividing a part of light from an object does not have to be provided for phase difference detection, and also, a photometry sensor does not have to be additionally provided. Furthermore, photometry can be performed before the release button 40b is pressed all the way down, so that a release time lag can be reduced. In the configuration in which photometry is performed before the release button 40b is pressed all the way down, photometry can be performed in parallel with obtaining defocus information, thereby preventing increase in processing time after the release button 40b is pressed halfway down. - —Variations—
- In the above description, after the
release button 40b is pressed all the way down, stopping down is performed immediately before exposure. In the following description, a variation will be described in which, in phase difference detection AF and hybrid AF, stopping down is performed before autofocusing, i.e., before the release button 40b is pressed all the way down. - (Phase Difference Detection AF)
- Specifically, first, the shooting operation of the camera system in phase difference detection AF according to the variation will be described with reference to
FIG. 15. - Steps (Steps Sd1-Sd5) from the step in which the
power switch 40a is turned on to the step in which the body microcomputer remains in a standby state until the release button 40b is pressed halfway down are the same as Steps Sa1-Sa5 in phase difference detection AF which have been described above. - When the
release button 40b is pressed halfway down by a user (Step Sd5), image blur detection is started (Step Sd6), and in parallel with Step Sd6, photometry is performed (Step Sd7). Steps Sd6 and Sd7 are the same as Steps Sa10 and Sa9, respectively, in phase difference detection AF. - Thereafter, an aperture value at the time of exposure is obtained based on a result of photometry in Step Sd7, and whether or not the obtained aperture value is larger than a predetermined aperture threshold value is determined (Step Sd8). Then, when the obtained aperture value is larger than the predetermined aperture threshold value (YES), the process proceeds to Step Sd10. When the obtained value is equal to or smaller than the predetermined aperture threshold value (NO), the process proceeds to Step Sd9. In Step Sd9, the
body microcomputer 50 drives the aperture section 73 via the lens microcomputer 80 to attain the obtained aperture value. - In this case, the predetermined aperture threshold value is set to be about an aperture value at which defocus information can be obtained based on an output of the
line sensor 24a of the phase difference detection unit 20. That is, when the aperture value obtained based on the result of photometry is larger than the aperture threshold value, stopping the aperture section 73 down to that aperture value would prevent the phase difference detection unit 20 from obtaining defocus information. Therefore, the aperture section 73 is not stopped down, and the process proceeds to Step Sd10. On the other hand, when the aperture value obtained based on the result of photometry is equal to or smaller than the aperture threshold value, the aperture section 73 is stopped down to the aperture value, and then, the process proceeds to Step Sd10. - In Steps Sd10-Sd12, similarly to Steps Sa6-Sa8 in phase difference detection AF described above, the
body microcomputer 50 obtains defocus information based on an output from the line sensor 24a of the phase difference detection unit 20 (Step Sd10), drives the focus lens group 72 based on the defocus information (Step Sd11), and determines whether or not an object image has been brought into focus (Step Sd12). After an object image has been brought into focus, the process proceeds to Step Sa11. - In Step Sa11, the body microcomputer remains in a standby state until the
release button 40b is pressed all the way down by the user. A flow of steps after the release button 40b is pressed all the way down is the same as that of phase difference detection AF described above. - It should be noted that stopping down of the aperture section 73 is performed in Step Sa14 only when it is determined in Step Sd8 that the aperture value obtained based on the result of photometry is larger than the predetermined aperture threshold value. That is, when it is determined in Step Sd8 that the aperture value obtained based on the result of photometry is equal to or smaller than the predetermined aperture threshold value, Step Sa14 does not have to be performed because stopping down of the aperture section 73 has already been performed in Step Sd9. - As described above, in the shooting operation of the camera system in phase difference detection AF according to the variation, when the aperture value at the time of exposure obtained based on the result of photometry is about a value at which phase difference detection AF can be performed, the
aperture section 73 is stopped down in advance of exposure, before autofocusing. Thus, stopping down of the aperture section 73 does not have to be performed after the release button 40b is pressed all the way down, so that a release time lag can be reduced. - (Hybrid AF)
- Next, the shooting operation of the camera system in hybrid AF according to the variation will be described with reference to
FIG. 16. - Steps (Steps Se1-Se5) from the step in which the
power switch 40a is turned on to the step in which the body microcomputer remains in a standby state until the release button 40b is pressed halfway down are the same as Steps Sa1-Sa5 in phase difference detection AF which have been described above. - When the
release button 40b is pressed halfway down by a user (Step Se5), image blur detection is started (Step Se6), and in parallel with Step Se6, photometry is performed (Step Se7). Steps Se6 and Se7 are the same as Steps Sa10 and Sa9, respectively, in phase difference detection AF. - Thereafter, an aperture value at the time of exposure is obtained based on a result of photometry in Step Se7, and whether or not the obtained aperture value is larger than a predetermined aperture threshold value is determined (Step Se8). Then, when the obtained aperture value is larger than the predetermined aperture threshold value (YES), the process proceeds to Step Se10. When the obtained value is equal to or smaller than the predetermined aperture threshold value (NO), the process proceeds to Step Se9. In Step Se9, the
body microcomputer 50 drives the aperture section 73 via the lens microcomputer 80 to attain the obtained aperture value. - In this case, the predetermined aperture threshold value is set to be about an aperture value at which a peak of a contrast value calculated from an output of the
imaging device 10 can be detected. That is, when the aperture value obtained based on the result of photometry is larger than the aperture threshold value, stopping the aperture section 73 down to that aperture value would prevent the contrast peak detection, which will be described later, from being performed. Therefore, the aperture section 73 is not stopped down, and the process proceeds to Step Se10. On the other hand, when the aperture value obtained based on the result of photometry is equal to or smaller than the aperture threshold value, the aperture section 73 is stopped down to the aperture value, and then, the process proceeds to Step Se10. - In Steps Se10-Se13, similarly to Steps Sc6, Sc7, Sc10 and Sc11 in normal hybrid AF described above, the
body microcomputer 50 obtains defocus information based on an output from the line sensor 24a of the phase difference detection unit 20 (Steps Se10 and Se11), drives the focus lens group 72 based on the defocus information (Step Se12), and detects the contrast peak to move the focus lens group 72 to a position where the contrast value has reached the peak (Step Se13). - Thereafter, in Step Sa11, the body microcomputer remains in a standby state until the
release button 40b is pressed all the way down by the user. A flow of steps after the release button 40b is pressed all the way down is the same as that of normal phase difference detection AF described above. - It should be noted that stopping down of the aperture section 73 is performed in Step Sa14 only when it is determined in Step Se8 that the aperture value obtained based on the result of photometry is larger than the predetermined aperture threshold value. That is, when it is determined in Step Se8 that the aperture value obtained based on the result of photometry is equal to or smaller than the predetermined aperture threshold value, Step Sa14 does not have to be performed because stopping down of the aperture section 73 has already been performed in Step Se9. - As described above, in the shooting operation of the camera system in hybrid AF according to the variation, when the aperture value at the time of exposure obtained based on the result of photometry is about a value at which contrast detection AF can be performed, the
aperture section 73 is stopped down in advance of exposure, before autofocusing. Thus, stopping down of the aperture section 73 does not have to be performed after the release button 40b is pressed all the way down, and therefore, a release time lag can be reduced. - —Continuous Shooting Mode—
- In the above description, each time the
release button 40b is pressed all the way down, a single image is shot. The camera 100 has a continuous shooting mode in which a plurality of images are shot by pressing the release button 40b all the way down once. - The continuous shooting mode will be described hereinafter with reference to
FIGS. 17 and 18. In the following description, it is assumed that hybrid AF according to the variation is performed. Note that the continuous shooting mode is not limited to hybrid AF according to the variation, but can be employed in any configuration using phase difference detection AF, contrast detection AF, hybrid AF, phase difference detection AF according to the variation, or the like. - Steps (Steps Sf1-Sf13) from the step in which the
power switch 40a is turned on to the step in which the release button 40b is pressed halfway down and the focus lens group 72 is moved to the position where the contrast value has reached the peak are the same as Steps Se1-Se13 in hybrid AF according to the variation. - After the
focus lens group 72 is moved to the position where the contrast value has reached the peak, the body microcomputer 50 causes the memory section to store a distance between two object images formed on the line sensor 24a at that time (i.e., when an object image has been brought into focus using contrast detection AF) (Step Sf14). - Thereafter, in Step Sf15, the body microcomputer remains in a standby state until the
release button 40b is pressed all the way down by the user. When the release button 40b is pressed all the way down by the user, exposure is performed in the same manner as in Steps Sa12-Sa17 in phase difference detection AF. - Specifically, the
body microcomputer 50 temporarily puts the shutter unit 42 into a close state (Step Sf16), image blur correction is started (Step Sf17), and if the aperture section 73 has not been stopped down in Step Sf9, the aperture section 73 is stopped down based on a result of photometry (Step Sf18). Thereafter, the shutter unit 42 is put into an open state (Step Sf19), exposure is started (Step Sf20), and the shutter unit 42 is put into a close state (Step Sf21) to terminate the exposure. - After the exposure is terminated, whether or not the
release button 40b has been released from being pressed all the way down is determined (Step Sf22). When the release button 40b has been released (YES), the process proceeds to Steps Sf29 and Sf30. On the other hand, when the release button 40b is continuously pressed all the way down (NO), the process proceeds to Step Sf23 to perform continuous shooting. - When the
release button 40b is continuously pressed all the way down, the body microcomputer 50 puts the shutter unit 42 into an open state (Step Sf23), and phase difference detection AF is performed (Steps Sf24-Sf26). - Specifically, an in-focus state of an object image in the
imaging device 10 is detected via the phase difference detection unit 20 (Step Sf24), defocus information is obtained (Step Sf25), and the focus lens group 72 is driven based on the defocus information (Step Sf26). - In this case, in hybrid AF before the
release button 40b is pressed all the way down, a distance between two object images formed on the line sensor 24a is compared to a reference distance which has been set beforehand to obtain the defocus information (Step Sf11). In contrast, in Steps Sf24 and Sf25 after the release button 40b is pressed all the way down, the distance between the two object images formed on the line sensor 24a is compared to the distance between the two object images formed on the line sensor 24a which has been stored in Step Sf14 after contrast detection AF in hybrid AF, to obtain an in-focus state and defocus information. - After phase difference detection AF is performed in the above-described manner, the
body microcomputer 50 determines whether or not it is a timing of outputting a signal (i.e., an exposure start signal) for starting exposure from the body microcomputer 50 to the shutter control section 51 and the imaging unit control section 52 (Step Sf27). This output timing of the exposure start signal is the timing of performing continuous shooting in the continuous shooting mode. When it is not the output timing of the exposure start signal (NO), phase difference detection AF is repeated (Steps Sf24-Sf26). On the other hand, when it is the output timing of the exposure start signal (YES), driving of the focus lens group 72 is halted (Step Sf28) to perform exposure (Step Sf20). - Note that after the
focus lens group 72 is halted, it is necessary to sweep out, before starting exposure, electrical charges accumulated in the light receiving sections 11b of the imaging device 10 during phase difference detection AF. Therefore, electrical charges in the light receiving sections 11b are swept out using an electronic shutter, or the shutter unit 42 is temporarily put into a close state to sweep out electrical charges in the light receiving sections 11b, and then the shutter unit 42 is put into an open state to start exposure. - After the exposure is terminated, whether or not the
release button 40b has been released from being pressed all the way down is determined again (Step Sf22). As long as the release button 40b is pressed all the way down, phase difference detection AF and exposure are repeated (Steps Sf23-Sf28 and Steps Sf20 and Sf21). - When the
release button 40b has been released from being pressed all the way down, image blur correction is terminated (Step Sf29), and also, the aperture section 73 is opened up (Step Sf30) to put the shutter unit 42 into an open state (Step Sf31). - After this resetting is completed and the shooting sequence is terminated, the process returns to Step Sf5, and the body microcomputer remains in a standby state until the
release button 40b is pressed halfway down. - When the
power switch 40a is turned off (Step Sf32), the body microcomputer 50 moves the focus lens group 72 to a predetermined reference position which has been set beforehand (Step Sf33), and puts the shutter unit 42 into a close state (Step Sf34). Then, respective operations of the body microcomputer 50 and other units in the camera body 4, and the lens microcomputer 80 and other units in the interchangeable lens 7, are halted. - As described above, in the shooting operation of the camera system in the continuous shooting mode, phase difference detection AF can be performed between exposures during continuous shooting, so that a high focus performance can be realized.
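The alternation of exposure and between-frame phase difference AF (Steps Sf20-Sf28) can be sketched as the following loop, with the camera hardware stubbed out as callables. All names here are hypothetical stand-ins for illustration, not the actual control interfaces of the body microcomputer 50.

```python
# Hypothetical sketch of the continuous shooting loop: expose a frame,
# then run phase difference AF steps between frames until the next
# exposure timing, as long as the release button stays fully pressed.

def continuous_shooting(fully_pressed, af_step, exposure_due, expose):
    frames = []
    while True:
        frames.append(expose())       # exposure (Steps Sf20/Sf21)
        if not fully_pressed():       # release check (Step Sf22)
            return frames
        while not exposure_due():     # between frames (Steps Sf24-Sf27)
            af_step()                 # one phase difference AF iteration
        # lens driving halted here, then the next exposure starts (Sf28)
```

With stub callables, the loop exposes one frame per press interval and interleaves AF iterations between frames.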
- Also, since autofocusing is performed using phase difference detection AF in this case, the defocus direction can be instantly obtained, and thus, an object can be instantly brought into focus even in the short time between continuously performed shots.
- Furthermore, as opposed to a known technique, even in phase difference detection AF, a movable mirror for phase difference detection does not have to be provided. Thus, a release time lag can be reduced, and also, power consumption can be reduced. Moreover, according to the known technique, a release time lag corresponding to the vertical movement of the movable mirror is generated, and thus, when an object is a moving object, it is necessary to predict the movement of the moving object during the release time lag and then shoot an image. However, according to this embodiment, there is no release time lag corresponding to the vertical movement of the movable mirror, and therefore, focus can be achieved while following the movement of an object until immediately before exposure.
- In phase difference detection AF during continuous shooting, as the reference distance between two object images formed on the
line sensor 24a based on which whether or not an object image has been brought into focus is determined, the distance between the two object images formed on the line sensor 24a when the release button 40b is pressed halfway down and an object image has been brought into focus by contrast detection AF is used. Thus, highly accurate autofocusing which corresponds to actual equipment and actual shooting conditions can be performed. - At the time of shooting for the first frame in the continuous shooting mode, the autofocusing method is not limited to hybrid AF. Phase difference detection AF or contrast detection AF may be used. However, as described above, if AF such as hybrid AF or contrast detection AF, in which focus adjustment is eventually performed based on the contrast value, is employed at the time of shooting for the first frame, phase difference detection AF at the time of shooting for the second and subsequent frames can be performed based on the highly accurate in-focus state obtained at the time of shooting for the first frame. Note that when phase difference detection AF is used, Step Sf14 is not performed, and the distance between the two object images formed on the
line sensor 24a is compared to the reference distance which has been set beforehand to obtain an in-focus state and defocus information. - Not only in the continuous shooting mode but also in normal shooting, the camera system may be configured so that, when an object is a moving object, phase difference detection AF is performed until the
release button 40 b is pressed all the way down even after an object image has been brought into focus. - —Low Contrast Mode—
- The
camera 100 of this embodiment is configured so that the autofocusing method is switched according to the contrast of an object. That is, the camera 100 has a low contrast mode in which shooting is performed under a low contrast condition. - The low contrast mode will be described hereinafter with reference to
FIG. 19. In the following description, it is assumed that hybrid AF is performed. Note that the low contrast mode is not limited to hybrid AF, but can be employed in any configuration using phase difference detection AF, contrast detection AF, phase difference detection AF according to the variation, hybrid AF according to the variation, or the like. Steps (Steps Sg1-Sg5) from the step in which the power switch 40a is turned on to the step in which the body microcomputer remains in a standby state until the release button 40b is pressed halfway down are the same as Steps Sa1-Sa5 in phase difference detection AF. - When the
release button 40b is pressed halfway down by a user (Step Sg5), the body microcomputer 50 amplifies an output from the line sensor 24a of the phase difference detection unit 20, and then performs an operation by the arithmetic circuit (Step Sg6). Then, whether or not a low contrast state has occurred is determined (Step Sg7). Specifically, it is determined whether or not a contrast value is high enough to detect respective positions of two object images formed on the line sensor 24a based on the output from the line sensor 24a. - When the contrast value is high enough to detect the positions of the two object images (NO), it is determined that a low contrast state has not occurred, and the process proceeds to Step Sg8 to perform hybrid AF. Note that Steps Sg8-Sg10 are the same as Steps Sc7, Sc10 and Sc11 in hybrid AF.
- On the other hand, when the contrast value is not high enough to detect the positions of the two object images (YES), it is determined that a low contrast state has occurred, and the process proceeds to Step Sg11 to perform contrast detection AF. Note that Steps Sg11-Sg15 are the same as Steps Sb6-Sb10 in contrast detection AF.
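The branch in Step Sg7 can be sketched as a simple threshold test. The threshold value and the peak-to-trough contrast measure below are illustrative assumptions; in the embodiment, the actual determination is performed by the arithmetic circuit on the amplified output of the line sensor 24a.

```python
# Hypothetical sketch of the low contrast mode decision (Step Sg7):
# choose hybrid AF when the line sensor output has enough contrast to
# locate the two object images, otherwise fall back to contrast
# detection AF.

LOW_CONTRAST_THRESHOLD = 0.2  # assumed normalized contrast threshold

def select_af_method(line_sensor_output):
    """Pick the AF method from the peak-to-trough contrast of the output."""
    contrast = max(line_sensor_output) - min(line_sensor_output)
    if contrast >= LOW_CONTRAST_THRESHOLD:
        return "hybrid AF"            # Steps Sg8-Sg10
    return "contrast detection AF"    # Steps Sg11-Sg15
```

A strongly modulated sensor output selects hybrid AF, while a nearly flat output selects contrast detection AF.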
- After hybrid AF or contrast detection AF is performed in the above-described manner, the process proceeds to Step Sa11.
- In parallel with this autofocus operation (Steps Sg6-Sg15), photometry is performed (Step Sg16), and image blur detection is started (Step Sg17). Steps Sg16 and Sg17 are the same as Steps Sa9 and Sa10 in phase difference detection AF. Thereafter, the process proceeds to Step Sa11.
- In Step Sa11, the body microcomputer remains in a standby state until the
release button 40b is pressed all the way down by the user. A flow of steps after the release button 40b is pressed all the way down is the same as that of normal hybrid AF. - That is, in the low contrast mode, when the contrast at the time of shooting is high enough to perform phase difference detection AF, hybrid AF is performed. On the other hand, when the contrast at the time of shooting is so low that phase difference detection AF cannot be performed, contrast detection AF is performed.
- In this embodiment, first, it is determined whether or not an in-focus state can be detected using phase difference detection based on the output of the
line sensor 24a of the phase difference detection unit 20, and then, hybrid AF or contrast detection AF is selected. However, the present invention is not limited thereto. For example, the camera system may be configured so that, after the release button 40b is pressed halfway down, the contrast value is obtained from an output of the imaging device 10 to determine whether or not the contrast value obtained from the output of the imaging device 10 is higher than a predetermined value before a phase difference focus is detected (i.e., between Steps Sg5 and Sg6 in FIG. 19). The predetermined value is set to be about a contrast value at which a position of an object image formed on the line sensor 24a can be detected. That is, the camera system may be configured so that, when the contrast value obtained from the output of the imaging device 10 is approximately equal to or larger than a value at which an in-focus state can be detected using phase difference detection, hybrid AF is performed and, on the other hand, when the contrast value obtained from the output of the imaging device 10 is smaller than the value at which an in-focus state can be detected using phase difference detection, contrast detection AF is performed. - Also, in this embodiment, when an in-focus state can be detected using phase difference detection, hybrid AF is performed. However, the camera system may be configured so that, when an in-focus state can be detected using phase difference detection, phase difference detection AF is performed.
- As described above, in the
camera 100 including the imaging unit 1 for receiving light transmitting through the imaging device 10 by the phase difference detection unit 20, the movable mirror of the known technique for guiding light to the phase difference detection unit is not provided, but phase difference detection AF (including hybrid AF) and contrast detection AF can be performed. Thus, a highly accurate focus performance can be realized by selecting one of phase difference detection AF and contrast detection AF according to the contrast. - —AF Switching According to Interchangeable Lens—
- Furthermore, the
camera 100 of this embodiment is configured so that the autofocusing method is switched according to the type of the interchangeable lens 7 attached to the camera body 4. - An AF switching function according to the type of the interchangeable lens will be described hereinafter with reference to
FIG. 20. In the following description, it is assumed that hybrid AF is performed. Note that the AF switching function according to the interchangeable lens is not limited to hybrid AF, but can be employed in any configuration using phase difference detection AF, contrast detection AF, phase difference detection AF according to the variation, hybrid AF according to the variation, or the like. - Steps (Steps Sh1-Sh5) from the step in which the
power switch 40a is turned on to the step in which the body microcomputer remains in a standby state until the release button 40b is pressed halfway down are the same as Steps Sa1-Sa5 in phase difference detection AF. - When the
release button 40b is pressed halfway down by a user (Step Sh5), photometry is performed (Step Sh6), and in parallel with Step Sh6, image blur detection is started (Step Sh7). Steps Sh6 and Sh7 are the same as Steps Sa9 and Sa10 in phase difference detection AF. Note that the photometry and image blur detection may be performed in parallel with an autofocus operation, which will be described later. - Thereafter, the
body microcomputer 50 determines whether or not the interchangeable lens 7 is a reflecting telephoto lens produced by a third party or a smooth trans focus (STF) lens based on information from the lens microcomputer 80 (Step Sh8). When the interchangeable lens 7 is a reflecting telephoto lens produced by a third party or an STF lens (YES), the process proceeds to Step Sh13 to perform contrast detection AF. Note that Steps Sh13-Sh17 are the same as Steps Sb6-Sb10 in contrast detection AF. - On the other hand, when the
interchangeable lens 7 is neither a reflecting telephoto lens produced by a third party nor an STF lens (NO), the process proceeds to Step Sh9 to perform hybrid AF. Note that Steps Sh9-Sh12 are the same as Steps Sc6, Sc7, Sc10 and Sc11 in hybrid AF. - After contrast detection AF or hybrid AF is performed in the above-described manner, the process proceeds to Step Sa11.
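The Step Sh8 decision can be sketched as a predicate over lens attributes. This is an illustrative sketch under an assumption: the specification's phrasing leaves it slightly ambiguous whether "produced by a third party" also qualifies the STF lens, and the sketch treats any STF lens as triggering contrast detection AF; the function and parameter names are invented for illustration.

```python
def select_af_for_lens(third_party: bool, reflecting_telephoto: bool, stf: bool) -> str:
    """Step Sh8 as a predicate (names are illustrative): contrast detection
    AF is chosen when the attached lens is a third-party reflecting telephoto
    lens or an STF lens, since phase difference detection might not be
    performed with high accuracy for such lenses; hybrid AF otherwise."""
    if (third_party and reflecting_telephoto) or stf:
        return "contrast detection AF"  # Steps Sh13-Sh17
    return "hybrid AF"                  # Steps Sh9-Sh12
```

In practice these attributes would come from the lens information reported by the lens microcomputer 80 rather than being passed in directly.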
- In Step Sa11, the body microcomputer remains in a standby state until the
release button 40b is pressed all the way down by the user. A flow of steps after the release button 40b is pressed all the way down is the same as that of hybrid AF. - That is, when the
interchangeable lens 7 is a reflecting telephoto lens produced by a third party or an STF lens, phase difference detection might not be performed with high accuracy, and therefore, hybrid AF (specifically, phase difference detection AF) is not performed, but contrast detection AF is performed. On the other hand, when the interchangeable lens 7 is neither a reflecting telephoto lens produced by a third party nor an STF lens, hybrid AF is performed. That is, the body microcomputer 50 determines whether or not it is ensured that an optical axis of the interchangeable lens 7 properly extends so that phase difference detection AF can be performed. Then, only when it is ensured that the optical axis of the interchangeable lens 7 properly extends so that phase difference detection AF can be performed, hybrid AF is performed. If it is not ensured that the optical axis of the interchangeable lens 7 properly extends so that phase difference detection AF can be performed, contrast detection AF is performed. - As described above, in the
camera 100 including the imaging unit 1 for receiving light transmitting through the imaging device 10 by the phase difference detection unit 20, the movable mirror of the known technique for guiding light to the phase difference detection unit is not provided, but phase difference detection AF (including hybrid AF) and contrast detection AF can be performed. Thus, a highly accurate focus performance can be realized by selecting one of phase difference detection AF and contrast detection AF according to the type of the interchangeable lens 7. - According to this embodiment, it is determined which of hybrid AF and contrast detection AF is to be performed depending on whether or not the
interchangeable lens 7 is a reflecting telephoto lens produced by a third party or an STF lens. However, the present invention is not limited thereto. The camera system may be configured to determine which of hybrid AF and contrast detection AF is to be performed depending on only whether or not the interchangeable lens 7 is produced by a third party, regardless of whether or not the interchangeable lens 7 is a reflecting telephoto lens or an STF lens. - Also, according to this embodiment, the camera system is configured so that when the
interchangeable lens 7 is neither a reflecting telephoto lens produced by a third party nor an STF lens, hybrid AF is performed. However, the camera system may be configured so that when the interchangeable lens 7 is neither a reflecting telephoto lens produced by a third party nor an STF lens, phase difference detection AF is performed. - Therefore, according to this embodiment, the
imaging device 10 is configured so that light passes through the imaging device 10, and the phase difference detection unit 20 for receiving light which has passed through the imaging device 10 to perform phase difference detection is provided. Moreover, the body control section 5 controls the imaging device 10 and also controls driving of the focus lens group 72 at least based on a detection result of the phase difference detection unit 20 to perform focus adjustment. Thus, various types of processing using the imaging device 10 can be performed in parallel with autofocusing (phase difference detection AF and hybrid AF which have been described above) using the phase difference detection unit 20, so that the processing time can be reduced. - Also, in the above-described configuration, when light enters the
imaging device 10, light also enters the phase difference detection unit 20. Thus, even if various types of processing using the imaging device 10 are not performed in parallel with autofocusing using the phase difference detection unit 20, switching between various types of processing using the imaging device 10 and autofocusing using the phase difference detection unit 20 can be performed in a simple manner by changing a control mode of the body control section 5. That is, compared to a known configuration in which the direction in which light travels from an object is switched between the direction toward an imaging device and the direction toward a phase difference detection unit by moving a movable mirror forward/backward, the movable mirror does not have to be moved forward/backward, so that switching between various types of processing using the imaging device 10 and autofocusing using the phase difference detection unit 20 can be quickly performed. Also, noise is not caused by moving the movable mirror forward/backward, and thus, switching between various types of processing using the imaging device 10 and autofocusing using the phase difference detection unit 20 can be quietly performed. - Thus, the convenience of the
camera 100 can be improved. - Specifically, the
imaging device 10 is configured so that light passes through the imaging device 10, and the phase difference detection unit 20 for receiving light which has passed through the imaging device 10 to perform phase difference detection is provided, so that AF using the phase difference detection unit 20, such as the above-described phase difference detection AF, and photometry using the imaging device 10 can be performed in parallel. Thus, photometry does not have to be performed after pressing the release button 40b all the way down, thus resulting in reduction in a release time lag. Even in the configuration in which photometry is performed before the release button 40b is pressed all the way down, increase in processing time after the release button 40b is pressed halfway down can be prevented by performing photometry in parallel with autofocusing. Furthermore, since photometry is performed using the imaging device 10, there is no need to additionally provide a photometry sensor. Also, a movable mirror for guiding light from an object to the photometry sensor or the phase difference detection unit does not have to be provided. Therefore, power consumption can be reduced. - Also, the
imaging device 10 is configured so that light passes through the imaging device 10, and the phase difference detection unit 20 for receiving light which has passed through the imaging device 10 to perform phase difference detection is provided, so that the driving direction of the focus lens group 72 is determined based on a detection result of the phase difference detection unit 20, and then, contrast detection AF based on an output of the imaging device 10 can be quickly performed as in the above-described hybrid AF. That is, switching from phase difference detection by the phase difference detection unit 20 to contrast detection using the imaging device 10 can be quickly performed by control of the body control section 5 without switching the optical path using a movable mirror, or the like, in a known manner, thereby reducing a time required for hybrid AF. A movable mirror is not needed, and therefore, noise caused by such a movable mirror is not generated and hybrid AF can be performed quietly. - Furthermore, the
body control section 5 performs photometry using the imaging device 10 and controls the aperture section 73 based on the result of the photometry to adjust the amount of light, and then, phase difference detection is performed by the phase difference detection unit 20. Thus, stopping down does not have to be performed after the release button 40b is pressed all the way down, thus reducing a release time lag. Then, the imaging device 10 is configured so that light passes through the imaging device 10, and the phase difference detection unit 20 for receiving light which has passed through the imaging device 10 to perform phase difference detection is provided, so that when photometry using the imaging device 10 and phase difference detection using the phase difference detection unit 20 are performed in succession, switching between photometry by the imaging device 10 and phase difference detection by the phase difference detection unit 20 can be performed quickly and quietly by control of the body control section 5. - In the continuous shooting mode, the
body control section 5 performs, based on the detection result of the phase difference detection unit 20, phase difference detection AF at the time of shooting for second and subsequent frames. Thus, phase difference detection AF can be performed between frames during continuous shooting, so that the focusing performance can be improved. Then, the imaging device 10 is configured so that light passes through the imaging device 10, and the phase difference detection unit 20 for receiving light which has passed through the imaging device 10 to perform phase difference detection is provided, so that switching between exposure by the imaging device 10 and AF by the phase difference detection unit 20 can be quickly and quietly performed. Thus, phase difference detection AF between frames during continuous shooting can be realized. - Also, since autofocusing is performed using phase difference detection AF in this case, the defocus direction can be instantly obtained, and thus, an object can be instantly brought into focus even in a short time between shootings continuously performed.
- In shooting for the second and subsequent frames, phase difference detection AF is continuously performed until a next shooting timing, and thus, even if an object moves between shootings continuously performed, it is possible to follow the movement of the object and to bring the object into focus.
- Furthermore, since there is no release time lag corresponding to the movement of a movable mirror in the forward/backward directions, it is possible to follow the movement of the object to bring the object into focus until immediately before exposure. Thus, even if the object is a moving object, the object can be brought into focus with high accuracy without performing moving object prediction.
- At the time of shooting for a first frame in the continuous shooting mode, AF (i.e., contrast detection AF or hybrid AF) for eventually adjusting a focus based on a contrast value is performed. After an object image has been brought into focus, phase difference detection is performed by the phase
difference detection unit 20, and a result of the detection is stored. At the time of shooting for second and subsequent frames, phase difference detection AF is performed based on the detection result of the phase difference detection unit 20 after the focusing for the first frame. Thus, highly accurate autofocusing which corresponds to actual equipment and actual shooting conditions can be performed. - In the low contrast mode, the
body control section 5 performs focus adjustment at least based on the detection result of the phase difference detection unit 20 when the contrast value of an object is a predetermined value or larger, and performs focus adjustment not using the detection result of the phase difference detection unit 20 but based on an output of the imaging device 10 when the contrast value of the object is smaller than the predetermined value. Thus, the object can be brought into focus with high accuracy by AF using a suitable autofocusing method according to the contrast of the object. Specifically, the body control section 5 performs AF (i.e., phase difference detection AF or hybrid AF) using the phase difference detection unit 20 when the contrast value of the object is large enough to perform phase difference detection AF, and performs contrast detection AF when the contrast value of the object is so low that phase difference detection AF cannot be performed. Thus, the object can be brought into focus by AF using a suitable method which corresponds to the contrast of the object, so that the object can be brought into focus with high accuracy. - The
body control section 5 switches the AF method between AF at least based on a detection result of the phase difference detection unit 20 and AF based on an output of the imaging device 10 without using the detection result of the phase difference detection unit 20, according to the type of the interchangeable lens 7. Thus, focus can be achieved by AF using a suitable method which corresponds to the interchangeable lens 7. Specifically, the body control section 5 performs contrast detection AF when the interchangeable lens 7 is a reflecting telephoto lens produced by a third party (i.e., produced by a different manufacturer from a manufacturer of the camera body 4) or an STF lens, and performs AF (i.e., phase difference detection AF or hybrid AF) at least using the phase difference detection unit 20 when the interchangeable lens 7 is not a reflecting telephoto lens produced by a third party and also not an STF lens. In other words, only when an interchangeable lens 7 for which it is ensured that the optical axis is proper enough that phase difference detection AF can be performed is attached to the camera body 4, hybrid AF is performed. If it is not ensured that the optical axis of the interchangeable lens 7 is proper enough that phase difference detection AF can be performed, contrast detection AF is performed. Thus, the object image can be brought into focus by AF using a suitable method which corresponds to the interchangeable lens 7, so that focus can be achieved with high accuracy. - In the known configuration in which light directed from an object to the
imaging device 10 is guided to a phase difference detection unit provided at some other position than the back surface side of the imaging device 10 using a movable mirror or the like, accuracy in focus adjustment is not high because of a difference between the optical path at the time of exposure and the optical path at the time of phase difference detection, an arrangement error, and the like. In contrast, in this embodiment, the phase difference detection unit 20 receives light passing through the imaging device 10 to perform phase difference detection, and thus, phase difference detection can be performed using the same optical path as that at the time of exposure. Also, a member such as a movable mirror which causes an error is not provided. Thus, the accuracy of focus adjustment based on phase difference detection can be improved. - —During-Exposure AF Shooting Mode—
- In the above-described normal shooting mode, driving of the
focus lens group 72 is halted during exposure. However, the camera 100 has a during-exposure AF shooting mode in which autofocusing is also performed during exposure. - Specifically, the
camera 100 is configured so that switching between the during-exposure AF shooting mode in which exposure is performed while performing AF and the normal shooting mode in which the focus lens group 72 is halted at the time of exposure can be performed by turning on/off the during-exposure AF setting switch 40e by a user. - The shooting mode is switched to the during-exposure AF shooting mode by turning on the during-exposure
AF setting switch 40e, but is also automatically switched to the during-exposure AF shooting mode according to an object point distance (i.e., a distance from a lens to an object). That is, the camera body 4 is configured to be capable of calculating the object point distance and to perform, when the object point distance is smaller than a predetermined distance, during-exposure AF shooting even with the during-exposure AF setting switch being turned off. Specifically, the body microcomputer 50 calculates the object point distance based on a detection result from the absolute position detection section 81a and lens information of the interchangeable lens 7 and sets, when the calculated object point distance is a predetermined threshold or smaller, the shooting mode to be the during-exposure AF shooting mode. On the other hand, when the calculated object point distance is larger than the predetermined threshold, the body microcomputer 50 sets the shooting mode to be the normal shooting mode. That is, the body control section 5 and the absolute position detection section 81a serve as a distance detection section. Note that although the body control section 5 has both of the functions as the distance detection section for detecting a distance to an object and the control section for controlling the imaging device 10, a separate distance detection section for detecting a distance to the object may be additionally provided. - As described above, the
camera 100 is configured so that the shooting mode is switched between the during-exposure AF shooting mode and the normal shooting mode according to an operation by the user and also the shooting mode is automatically switched between the during-exposure AF shooting mode and the normal shooting mode according to the object point distance. Note that a method for detecting the object point distance is not limited to the above-described method, but any means and method can be employed. - Furthermore, in a macro shooting mode in which shooting (so-called close-up shooting) is performed with a setting suitable for shooting of an object close to a camera, the shooting mode is automatically switched to the during-exposure AF shooting mode. That is, the
camera 100 has the macro shooting mode which is suitable for close-up shooting, and is configured so that the shooting mode can be switched between the macro shooting mode and the normal shooting mode by turning on/off the macro setting switch 40f by the user. - Specifically, the
camera 100 is in the normal shooting mode when the macro setting switch 40f is in an off state, and sets a moving range of the focus lens group 72 at the time of autofocusing to be a range which allows focusing on an object at an object point distance of several cm to infinity. On the other hand, when the macro setting switch 40f is in an on state, the camera 100 is in the macro shooting mode, and sets the moving range of the focus lens group 72 to be a range which allows focusing on an object at an object point distance of several cm to several tens of cm. Shooting of an object at this object point distance is assumed to be close-up shooting. That is, in the macro shooting mode, the range in which the focus lens group 72 is moved at the time of autofocusing is set to be a limited range which is assumed to be a range of close-up shooting, and thus, the moving distance of the focus lens group 72 can be reduced and fast focusing can be realized. When the macro setting switch 40f is in an on state, the shooting mode is automatically set to be the during-exposure AF shooting mode. That is, when an ON signal is input from the macro setting switch 40f, the body microcomputer 50 sets the moving range of the focus lens group 72 at the time of autofocusing to be the above-described limited range, and the shooting mode to be the during-exposure AF shooting mode. - The shooting operation of the camera system in the during-exposure AF shooting mode will be described hereinafter with reference to
FIGS. 21 and 22.
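The focus range limitation imposed by the macro setting switch 40f, described above, can be sketched as follows. The numeric limits are stand-ins for the specification's "several cm" and "several tens of cm" (5 cm and 50 cm are assumed values chosen only for illustration), and the function name is invented:

```python
def focus_range_cm(macro_switch_on: bool):
    """Moving range of the focus lens group 72 at the time of autofocusing,
    as (near_limit_cm, far_limit_cm); None stands for infinity. The 5/50 cm
    limits are illustrative stand-ins, not values from the specification."""
    if macro_switch_on:
        return (5.0, 50.0)   # macro mode: limited range assumed for close-up shooting
    return (5.0, None)       # normal mode: several cm to infinity
```

Limiting the range in macro mode shortens the worst-case travel of the focus lens group, which is why fast focusing can be realized.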
- Steps (Steps Sk1-Sk11) from the step in which the
power switch 40a is turned on to the step in which the release button 40b is pressed halfway down and then autofocusing is completed, i.e., the focus lens group 72 is moved to the position where the contrast value has reached the peak, are the same as Steps Sc1-Sc11 in hybrid AF in the normal shooting mode. - Thereafter, in Step Sk12, the
body microcomputer 50 determines whether or not the during-exposure AF setting switch 40e is in an on state. When an ON signal from the during-exposure AF setting switch 40e is input (YES), the process proceeds to Step Sk16 to set a during-exposure AF flag to 1. When an ON signal from the during-exposure AF setting switch 40e is not input (NO), the process proceeds to Step Sk13. - In Step Sk13, the
body microcomputer 50 determines whether or not the shooting mode is set to be the macro shooting mode, i.e., whether or not the macro setting switch 40f is in an on state. When an ON signal from the macro setting switch 40f is input (YES), the process proceeds to Step Sk16 to set the during-exposure AF flag to 1. When an ON signal from the macro setting switch 40f is not input (NO), the process proceeds to Step Sk14. - In Step Sk14, the
body microcomputer 50 calculates an object point distance from the focus lens group 72 to an object based on a detection result from the absolute position detection section 81a and lens information of the interchangeable lens 7 to determine whether or not the calculated object point distance is a predetermined threshold or smaller. When the object point distance is the threshold or smaller (YES), the process proceeds to Step Sk16 to set the during-exposure AF flag to 1. When the object point distance is larger than the threshold (NO), the process proceeds to Step Sk15 to set the during-exposure AF flag to 0. The threshold is set to be an object point distance at which close-up shooting is assumed to be performed.
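Steps Sk12-Sk16 amount to setting the during-exposure AF flag from three conditions checked in order. The sketch below is illustrative only; the function and parameter names are assumptions, and in the camera the distance would be computed from the absolute position detection section 81a and lens information rather than passed in:

```python
def during_exposure_af_flag(during_exposure_switch_on: bool,
                            macro_switch_on: bool,
                            object_point_distance_cm: float,
                            close_up_threshold_cm: float) -> int:
    """Steps Sk12-Sk16 as one decision (names are illustrative). The flag
    becomes 1 when during-exposure AF was requested explicitly, when the
    macro shooting mode is set, or when the object point distance
    indicates close-up shooting; otherwise it is 0."""
    if during_exposure_switch_on:                          # Step Sk12
        return 1
    if macro_switch_on:                                    # Step Sk13
        return 1
    if object_point_distance_cm <= close_up_threshold_cm:  # Step Sk14
        return 1                                           # Step Sk16
    return 0                                               # Step Sk15
```

The ordering mirrors the flowchart: the explicit switch is checked first, so the distance calculation is only needed when both switches are off.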
- In Step Sk17, the
body microcomputer 50 remains in a standby state until the release button 40b is pressed all the way down by the user. When the release button 40b is pressed all the way down by the user, whether or not the during-exposure AF flag is 1 is determined in Step Sk18. When the during-exposure AF flag is 0 (NO), the process proceeds to Step Sk19 to perform exposure in the same manner as exposure in the above-described hybrid AF, i.e., as in Steps Sa12-Sa17 in phase difference detection AF.
- Specifically, an output from the
line sensor 24a of the phase difference detection unit 20 is amplified, and then, an operation by the arithmetic circuit obtains information regarding whether or not an object image has been brought into focus, whether the object is in front focus or back focus, and the Df amount (Step Sk25). Thereafter, the focus lens group 72 is driven in the defocus direction by the obtained Df amount via the lens microcomputer 80 (Step Sk26). Then, whether or not the shutter unit 42 is in a closed state is determined (Step Sk27). If the shutter unit 42 is not in a closed state (NO), the process returns to Step Sk25 to repeat phase difference detection AF. If the shutter unit 42 is in a closed state (YES), the process proceeds to Step Sk28 to terminate phase difference detection AF. After terminating phase difference detection AF, the process proceeds to Step Sk31. Note that in phase difference detection AF, light from an object has to enter the imaging unit 1, and therefore, phase difference detection AF is temporarily halted during Steps Sk19-Sk22 in which the shutter unit 42 is in a closed state.
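The Sk25-Sk28 loop is a simple repeat-until-shutter-closed control loop. The sketch below is illustrative, with the phase difference measurement, lens drive, and shutter state injected as callables (all names are assumptions, not from the specification):

```python
def during_exposure_af(read_df_amount, drive_focus_lens, shutter_closed) -> int:
    """Steps Sk25-Sk28 as a loop (sketch with injected callables): while
    the shutter unit remains open, obtain the Df amount from the line
    sensor output and drive the focus lens group in the defocus direction;
    terminate phase difference detection AF once the shutter closes.
    Returns the number of detect-and-drive iterations performed."""
    iterations = 0
    while not shutter_closed():   # Step Sk27: closed state ends the loop
        df = read_df_amount()     # Step Sk25: phase difference -> Df amount
        drive_focus_lens(df)      # Step Sk26: drive by Df via the lens side
        iterations += 1
    return iterations             # Step Sk28: AF terminated
```

A usage example: with a shutter stub that reports closed on the fourth poll, the loop performs three detect-and-drive iterations before terminating.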
- That is, in the during-exposure AF shooting mode, phase difference detection AF is executed during exposure of the
imaging device 10. The phase difference detection AF is continuously performed while exposure is performed. Herein, "during exposure" can also be referred to as "during a period of still image shooting by the imaging device 10", "during a period of video signal storing by the imaging device 10", "during a period of electrical charge storing by the imaging device 10", or the like. - The during-exposure AF shooting mode is intentionally set by the user by operating the during-exposure
AF setting switch 40e, and also, is automatically set when the shooting mode is the macro shooting mode or when the object point distance is very short. In many cases, so-called close-up shooting in which the shooting mode is the macro shooting mode or in which the object point distance is very short is performed indoors, compared to normal shooting other than close-up shooting. That is, close-up shooting is performed in a dark environment in many cases, and thus, the shutter speed has to be reduced to set an exposure time to be long. In such a condition, the influence of the movement (hereinafter also referred to as "camera shake") of the camera body 4 due to hand shake of the user and the like, and the movement of an object itself (hereinafter also referred to as "object shake") on shooting is increased. - In general, it has been known that an image blur correction mechanism is provided to reduce the influence of camera shake on shooting. The image blur correction mechanism is a mechanism for correcting an image blur in a plane perpendicular to an optical axis, and does not correct an image blur in the direction along the optical axis. In this embodiment, the
camera 100 includes the blur correction unit 45 for moving the imaging unit 1 in a plane perpendicular to the optical axis X, and the blur correction lens driving section 74a for moving the blur correction lens 74 in a plane perpendicular to the optical axis X. The blur correction unit 45 and the blur correction lens driving section 74a correct an image blur in a plane perpendicular to the optical axis X, but cannot correct an image blur in the direction along the optical axis X.
- Then, as described above, the
imaging device 10 is configured so that light passes through the imaging device 10, and the phase difference detection unit 20 is configured to receive light which has passed through the imaging device 10 to detect a phase difference, and thus, phase difference detection AF while performing exposure of the imaging device 10 can be realized. Note that in the case of AF such as contrast detection AF and the like using a signal from the imaging device 10, AF cannot be performed during exposure of the imaging device 10, in other words, during a period of image signal storing by the imaging device 10 (i.e., the electrical charge storing period in this embodiment). - When each of the during-exposure
AF setting switch 40e and the macro setting switch 40f is in an off state, whether or not to perform close-up shooting is determined based on the object point distance. Then, if it is determined to perform close-up shooting, the shooting mode is set to be the during-exposure AF shooting mode. Thus, even when the user does not realize that shooting is being performed in a condition where the influence of hand shake on the direction along the optical axis X is large, the camera 100 automatically determines that shooting is being performed in such a shooting condition to correct an image blur in the direction along the optical axis X. If close-up shooting is not to be performed, i.e., the object point distance is long, the influence of hand shake in the direction along the optical axis X on shooting is small, so that autofocusing during exposure is not performed, thus resulting in reduction in power consumption. - Note that although autofocusing performed before exposure, i.e., before the
release button 40b is pressed all the way down, may be any one of phase difference detection AF, contrast detection AF and hybrid AF, autofocusing during exposure is phase difference detection AF. - Also, in this embodiment, the during-exposure AF flag determination is performed immediately after the
release button 40b is pressed all the way down, but the present invention is not limited thereto. For example, the camera 100 may be configured so that the during-exposure AF flag determination is performed at the time of starting exposure and, if the shooting mode is the during-exposure AF shooting mode, phase difference detection AF is started simultaneously with the start of exposure. - Therefore, according to this embodiment, the
imaging device 10 is configured so that light passes through the imaging device 10, and the phase difference detection unit 20 is configured to receive light which has passed through the imaging device 10 to perform phase difference detection, thus realizing phase difference detection AF while performing exposure of the imaging device 10. In this case, autofocusing is phase difference detection AF, and thus, the defocus direction and the defocus amount can be instantly obtained, and an object can be quickly brought into focus. As a result, autofocusing can be performed even in a short time during which exposure is performed.
- Furthermore, by performing phase difference detection AF during exposure in the macro shooting mode or in close-up shooting, both of which are used when the object point distance is small, an image blur in the direction along the optical axis caused by camera shake or object shake can be reduced.
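A minimal sketch of the during-exposure AF flag determination described above may clarify the decision flow. The numeric close-up threshold is purely an assumption; the patent does not specify one.

```python
# Assumed threshold: the patent gives no numeric object point distance
# for "close-up", so 0.5 m here is purely illustrative.
CLOSE_UP_DISTANCE_M = 0.5

def during_exposure_af_flag(macro_switch_on, af_switch_on, object_distance_m):
    """Return True when the during-exposure AF shooting mode should be set."""
    if macro_switch_on or af_switch_on:
        # Macro mode or the user's during-exposure AF switch requests it.
        return True
    # Both switches off: fall back to the object point distance, so that
    # close-up shooting still gets axial image-blur correction.
    return object_distance_m < CLOSE_UP_DISTANCE_M

print(during_exposure_af_flag(False, False, 0.2))  # close-up: True
print(during_exposure_af_flag(False, False, 3.0))  # distant: False
```

Skipping during-exposure AF at long object point distances mirrors the power-consumption argument above: the axial blur is negligible there, so the extra lens driving is not worth its electrical cost.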
- Also, with the during-exposure
AF setting switch 40 e provided, the user can set the during-exposure AF shooting mode at the user's option. Therefore, the user can flexibly use the during-exposure AF shooting mode not only in close-up shooting but also when the user wants to reduce image blur in the direction along the optical axis. - Next, a camera as an imaging apparatus according to a second embodiment will be described.
- As shown in
FIG. 23 , a camera 200 according to the second embodiment includes a finder optical system 6. - —Configuration of Camera Body—
- A
camera body 204 further includes, in addition to the components of the camera body 4 of the first embodiment, a finder optical system 6 for viewing an object image through a finder 65 , and a semi-transparent quick return mirror 46 for guiding incident light from the interchangeable lens 7 to the finder optical system 6. - The
camera body 204 has a finder shooting mode in which shooting is performed while a user views an object image through the finder optical system 6 , and a live view shooting mode in which shooting is performed while a user views an object image through the image display section 44. The camera body 204 is provided with a finder mode setting switch 40 g. The shooting mode is set to the finder shooting mode by turning on the finder mode setting switch 40 g, and to the live view shooting mode by turning off the finder mode setting switch 40 g. - The finder
optical system 6 includes a finder screen 61 on which reflected light from the quick return mirror 46 forms an image, a pentaprism 62 for converting an object image projected on the finder screen 61 into an erected image, an eye lens 63 for enlarging the projected object image for viewing, an in-finder display section 64 for displaying various kinds of information in a finder viewing field, and a finder 65 provided on a back surface side of the camera body 204. - That is, the user can observe an object image formed on the
finder screen 61 through the finder 65 via the pentaprism 62 and the eye lens 63. - A
body control section 205 further includes, in addition to the components of the body control section 5 of the first embodiment, a mirror control section 260 for controlling flip-up of the quick return mirror 46 , which will be described later, based on a control signal from the body microcomputer 50. - The
quick return mirror 46 is a semi-transparent mirror capable of reflecting and transmitting incident light, and is configured to be capable of pivotally moving in front of the shutter unit 42 between a reflection position (see the solid line of FIG. 23 ) which is on an optical path X extending from an object to the imaging unit 1 and a retracted position (see the chain double-dashed line of FIG. 23 ) which is off the optical path X and is located adjacent to the finder optical system 6. At the reflection position, the quick return mirror 46 divides incident light into reflected light directed toward the finder optical system 6 and transmitted light directed to the back surface side of the quick return mirror 46. The quick return mirror 46 serves as a movable mirror. The reflection position corresponds to a first position, and the retracted position corresponds to a second position. - Specifically, the
quick return mirror 46 is arranged in front of the shutter unit 42 (i.e., at an object side), and is pivotally supported about an axis Y which is located above and in front of the shutter unit 42 and horizontally extends. The quick return mirror 46 is biased toward the retracted position by a bias spring (not shown). The quick return mirror 46 is moved to the reflection position by the bias spring being wound up by a motor (not shown) for opening and closing the shutter unit 42. The quick return mirror 46 which has been moved to the reflection position is engaged with an electromagnet or the like at the reflection position. Then, this engagement is released, thereby causing the quick return mirror 46 to be pivotally moved to the retracted position by the force of the bias spring. - That is, to guide a part of incident light to the
finder screen 61, the bias spring is wound up by the motor, thereby causing thequick return mirror 46 to be positioned at the reflection position. To guide the entire incident light to theimaging unit 1, the engagement with the electromagnet or the like is released, thereby causing thequick return mirror 46 to be pivotally moved to the retracted position by elastic force of the bias spring. - As shown in
FIGS. 24(A)-24(C) , a light shielding plate 47 is connected to the quick return mirror 46. The light shielding plate 47 is configured to interact with the quick return mirror 46 , and covers, when the quick return mirror 46 is positioned at the retracted position, the quick return mirror 46 from below (i.e., from a side closer to the optical path X extending from the object to the imaging unit 1 ). Thus, when the quick return mirror 46 is positioned at the retracted position, light entering from the finder optical system 6 is prevented from reaching the imaging unit 1. The light shielding plate 47 serves as a light shielding section. - Specifically, the
light shielding plate 47 includes a first light shielding plate 48 pivotally connected to an end portion of the quick return mirror 46 located at an opposite side to the pivot axis Y, and a second light shielding plate 49 pivotally connected to the first light shielding plate 48. The first light shielding plate 48 includes a first cam follower 48 a. In the camera body 204 , a first cam groove 48 b with which the first cam follower 48 a is to be engaged is provided. The second light shielding plate 49 includes a second cam follower 49 a. In the camera body 204 , a second cam groove 49 b with which the second cam follower 49 a is to be engaged is provided. - That is, when the
quick return mirror 46 is pivotally moved, the first light shielding plate 48 is moved to follow the quick return mirror 46 , and the second light shielding plate 49 is moved to follow the first light shielding plate 48. In this case, the first and second light shielding plates 48 and 49 move together with the quick return mirror 46 while the first and second cam followers 48 a and 49 a are guided by the first and second cam grooves 48 b and 49 b, respectively. - As a result, when the
quick return mirror 46 is positioned at the retracted position, as shown in FIG. 24(A) , the first and second light shielding plates 48 and 49 cover the quick return mirror 46 from below, thereby shielding light between the quick return mirror 46 and the shutter unit 42 , i.e., the imaging unit 1. In this case, similarly to the quick return mirror 46 , the first and second light shielding plates 48 and 49 are retracted from the optical path X, so that the light shielding plates 48 and 49 do not block light traveling to the imaging unit 1 from the object. - As the
quick return mirror 46 is moved from the retracted position to the reflection position, as shown in FIG. 24(B) , the first and second light shielding plates 48 and 49 are unfolded, following the quick return mirror 46. When the quick return mirror 46 is pivotally moved to the reflection position, as shown in FIG. 24(C) , the first and second light shielding plates 48 and 49 are fully extended, and the light shielding plates 48 and 49 lie between the quick return mirror 46 and the finder screen 61 , off the optical path X. Therefore, when the quick return mirror 46 is positioned at the reflection position, the first and second light shielding plates 48 and 49 block neither light reflected toward the finder optical system 6 by the quick return mirror 46 nor light transmitted through the quick return mirror 46. - As described above, with the semi-transparent
quick return mirror 46 and the light shielding plate 47 provided, in the finder shooting mode, the user can view an object image with the finder optical system 6 before shooting is performed, and light can be caused to reach the imaging unit 1. Also, when shooting is performed, incident light from the finder optical system 6 can be prevented from reaching the imaging unit 1 by the light shielding plate 47 while light from an object is directed to the imaging unit 1. In the live view shooting mode, incident light from the finder optical system 6 can be prevented from reaching the imaging unit 1 by the light shielding plate 47. - —Operation of Camera—
- The
camera 200 configured in the above-described manner has the two shooting modes, i.e., the finder shooting mode and the live view shooting mode, which employ different methods for viewing an object. The operations of the two shooting modes of the camera 200 will be described hereinafter. - —Finder Shooting Mode—
- First, the shooting operation of the camera system in the finder shooting mode will be described hereinafter with reference to
FIGS. 25 and 26 . - The
power switch 40 a is turned on (Step Si1), the release button 40 b is pressed halfway down by a user (Step Si5), and then, the release button 40 b is pressed all the way down by the user (Step Si11), so that the shutter unit 42 is temporarily put into a close state (Step Si12). The above-described Steps Si1-Si12 are basically the same as Steps Sa1-Sa12 in phase difference detection AF of the first embodiment. - When the
power switch 40 a is turned on, the quick return mirror 46 is positioned at the reflection position on the optical path X. Thus, a part of the light which has entered the camera body 204 is reflected and enters the finder screen 61. - Light which has entered the
finder screen 61 forms an object image. The object image is converted into an erected image by the pentaprism 62 , and enters the eye lens 63. That is, as opposed to the first embodiment, the object image is not displayed at the image display section 44 , but the user can observe the erected image of the object through the eye lens 63. In this case, the object image is not displayed, but various pieces of information regarding shooting are displayed at the image display section 44. - Then, when the
release button 40 b is pressed halfway down by the user (Step Si5), various pieces of information (such as information regarding AF and photometry, which will be described later, and the like) regarding shooting are displayed at the in-finder display section 64 , which the user can observe through the eye lens 63. That is, the user can identify each piece of information regarding shooting not only on the image display section 44 but also on the in-finder display section 64. - In this case, since the
quick return mirror 46 is semi-transparent, a part of the light which has entered the camera body 204 is directed to the finder optical system 6 by the quick return mirror 46 , but the rest of the light is transmitted through the quick return mirror 46 to enter the shutter unit 42. Then, when the shutter unit 42 is put into an open state (Step Si4), light transmitted through the quick return mirror 46 enters the imaging unit 1. As a result, viewing the object image through the finder optical system 6 is allowed, and autofocusing by the imaging unit 1 (Steps Si6-Si8) and photometry (Step Si9) can be performed. - Specifically, in Steps Si6-Si8, phase difference detection AF is performed based on an output from the phase
difference detection unit 20 of the imaging unit 1 and, in parallel with phase difference detection AF, photometry can be performed based on an output of the imaging device 10 of the imaging unit 1 in Step Si9. - In phase difference detection in Step Si6, the object image light is transmitted through the
quick return mirror 46, and accordingly, an optical length is increased by an amount corresponding to the thickness of thequick return mirror 46. Thus, a phase detection width of the phase difference detection section defers between when thequick return mirror 46 is retracted from an object image optical path and is put into an image capturing state and when thequick return mirror 46 is positioned at a reflection position. Therefore, in the finder shooting mode in which thequick return mirror 46 is interposed in the object image optical path, defocus information is output with a phase detection width obtained by changing the phase detection width in phase difference focus detection of the first embodiment (i.e., a phase detection width in phase difference focus detection of hybrid AF in the live view shooting mode which will be described later) by a predetermined amount. Note that the phase detection width means a reference phase difference used for determining that a calculated defocus amount is 0, i.e., an object is in focus. - Steps Si6-Si8 of performing phase difference detection AF is the same as Steps Sa6-Sa8 in phase difference detection AF of the first embodiment.
- In Step Si9, the amount of light entering the
imaging device 10 is measured by the imaging device 10. Note that in this embodiment, as opposed to the first embodiment, not all of the light from an object enters the imaging device 10 , and therefore, the body microcomputer 50 corrects the output from the imaging device 10 based on the reflection characteristics of the quick return mirror 46 to obtain the amount of light from the object. - Then, after the
release button 40 b is pressed all the way down by the user (Step Si11) and the shutter unit 42 is temporarily put into a close state (Step Si12), in parallel with the starting of image blur correction (Step Si13) and the stopping down of the aperture section 73 (Step Si14), the quick return mirror 46 is flipped up to the retracted position in Step Si15.
- After exposure is terminated, in parallel with terminating of image blur correction (Step Si19) and opening of the aperture section 73 (Step Si20), the
quick return mirror 46 is moved to the reflection position in Step Si21. Thus, the user can view an object image through the finder optical system 6 again. - Thereafter, the
shutter unit 42 is put into an open state (Step Si22). When the shooting sequence is terminated after resetting is completed, the process returns to Step Si5, and the body microcomputer remains in a standby state until the release button 40 b is pressed halfway down by the user. - Steps Si23-Si25 after the
power switch 40 a is turned off are the same as Steps Sa21-Sa23 in phase difference detection AF of the first embodiment. - As described above, the phase
difference detection unit 20 for detecting a phase difference using light transmitted through the imaging device 10 is provided in the imaging unit 1. Thus, even with the configuration in which light from an object is directed to the finder optical system 6 by the quick return mirror 46 so that an object image can be viewed through the finder optical system 6 , phase difference detection AF and photometry can be performed in parallel while the object image is viewed through the finder optical system 6 , because the semi-transparent quick return mirror 46 causes a part of the light entering it to reach the imaging unit 1. Therefore, there is no need to additionally provide a reflecting mirror for phase difference detection AF and a sensor for photometry, and also, photometry can be performed in parallel with autofocusing, so that the release time lag can be reduced. - —Live View Shooting Mode—
- Next, the shooting operation of the camera system in a live view shooting mode will be described with reference to
FIGS. 27 and 28 . - First, in steps (Steps Sj1-Sj4) from the step in which the
power switch 40 a is turned on to the step in which the shutter unit 42 is put into an open state, the same operation as the operation in hybrid AF of the first embodiment is performed. - In this case, in the
camera 200, immediately after thepower switch 40 a is turned on, thequick return mirror 46 is positioned at the reflection position, and thus, in Step Sj5, thebody microcomputer 50 flips up thequick return mirror 46 to the retracted position. - As a result, light entering the
camera body 204 from an object is not divided to be directed to the finder optical system 6 , but passes through the shutter unit 42 , is transmitted through the OLPF 43 , which also serves as an IR cutter, and then enters the imaging unit 1. An object image formed at the imaging unit 1 is displayed at the image display section 44 , so that the user can observe the object image through the image display section 44. A part of the light which has entered the imaging unit 1 is transmitted through the imaging device 10 and enters the phase difference detection unit 20. - Then, when the
release button 40 b is pressed halfway down by the user (Step Sj6), as opposed to the finder shooting mode, hybrid AF is performed. Steps Sj7, Sj8, Sj11 and Sj12 according to this hybrid AF are the same as Steps Sc6, Sc7, Sc10 and Sc11 in hybrid AF of the first embodiment. - Note that the autofocusing method employed in this case is not limited to hybrid AF, but contrast detection AF or phase difference detection AF may be performed.
- In parallel with hybrid AF, photometry is performed (Step Sj9), and image blur detection is started (Step Sj10). Steps Sj9 and Sj10 are the same as Steps Sc8 and Sc9 in hybrid AF of the first embodiment.
- Thus, when the
release button 40 b is pressed halfway down by the user, various pieces of information regarding shooting (such as information regarding AF and photometry and the like) are displayed at the image display section 44. - Thereafter, the steps from the step (Step Sj13) in which the
release button 40 b is pressed all the way down by the user to the step (Step Sj22) in which exposure is terminated to complete resetting are basically the same as Steps Si11-Si22 in the finder shooting mode, except that the step (corresponding to Step Si15) of moving the quick return mirror 46 to the retracted position after the shutter unit 42 is put into a close state is not included, and that the step (corresponding to Step Si21) of moving the quick return mirror 46 to the reflection position after the shutter unit 42 is put into a close state to terminate exposure is not included. - According to this embodiment, when the
power switch 40 a is turned off (Step Sj23), the focus lens group 72 is moved to the reference position (Step Sj24) and, in parallel with putting the shutter unit 42 into a close state (Step Sj25), the quick return mirror 46 is moved to the reflection position in Step Sj26. Thereafter, the respective operations of the body microcomputer 50 and other units in the camera body 204 , and the lens microcomputer 80 and other units in the interchangeable lens 7 are halted. - The shooting operation of the camera system in the live view shooting mode is the same as the shooting operation of the
camera 100 of the first embodiment, except for the operation of the quick return mirror 46. That is, although hybrid AF has been described above, the various shooting operations according to the first embodiment can be performed, and the same functional effects and advantages as those of the first embodiment can be achieved. - Therefore, according to this embodiment, the
camera 200 further includes the finder optical system 6 provided to the camera body 204 and the semi-transparent quick return mirror 46 , configured so that the position of the quick return mirror 46 can be switched between the reflection position located on an optical path from an object to the imaging device 10 and the retracted position located off the optical path, for reflecting a part of incident light at the reflection position to guide that part of the incident light to the finder optical system 6 and causing the rest of the incident light to pass therethrough to be guided to the imaging device 10. The body control section 5 is configured to be capable of switching the shooting mode between the finder shooting mode, in which shooting is performed in a state where an object can be viewed through the finder optical system 6 , and the live view shooting mode, in which shooting is performed in a state where an object can be viewed through the image display section. In the finder shooting mode, the quick return mirror 46 is positioned at the reflection position to guide a part of incident light to the finder optical system 6 , thereby allowing an object image to be viewed through the finder optical system 6 ; the rest of the incident light is guided to the imaging device 10 , and focus adjustment is performed based on a detection result of the phase difference detection unit 20 , which has received light which has passed through the imaging device 10. In the live view shooting mode, the quick return mirror 46 is positioned at the retracted position to cause incident light from an object to enter the imaging device 10 , thereby causing the image display section 44 to display an image based on an output of the imaging device 10 , and focus adjustment is performed at least based on a detection result of the phase difference detection unit 20.
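The mode-dependent mirror positioning described above can be summarized in a small sketch. The names and data structure are assumptions for illustration, not the patent's implementation.

```python
from dataclasses import dataclass

REFLECTION = "reflection"  # mirror on the optical path (first position)
RETRACTED = "retracted"    # mirror flipped up, off the path (second position)

@dataclass
class QuickReturnMirror:
    position: str = REFLECTION  # positioned at the reflection position at power-on

def set_shooting_mode(mirror, finder_switch_on):
    """Finder mode keeps the semi-transparent mirror in the path, so part of
    the light still reaches the imaging device and the phase difference
    detection unit behind it; live view mode flips the mirror up so all
    incident light reaches the imaging device."""
    if finder_switch_on:
        mirror.position = REFLECTION
        return "finder"
    mirror.position = RETRACTED
    return "live_view"

m = QuickReturnMirror()
print(set_shooting_mode(m, False), m.position)  # live_view retracted
```

Because the mirror is semi-transparent, phase difference detection through the imaging device remains available in both states, which is what allows focus adjustment in either shooting mode.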
Thus, in the camera 200 including the finder optical system 6 , various types of processing using the imaging device 10 can be performed in parallel with autofocusing (the above-described phase difference detection AF or hybrid AF) using the phase difference detection unit 20 , regardless of which of the finder shooting mode and the live view shooting mode is selected, so that the processing time can be reduced, and switching between the various types of processing using the imaging device 10 and autofocusing using the phase difference detection unit 20 can be performed quickly and quietly. As a result, the convenience of the camera 200 can be improved.
quick return mirror 46 and the light shielding plate 47 provided, in the finder shooting mode, before shooting is performed, an object image can be viewed through the finder optical system 6 , and light can be caused to reach the imaging unit 1. Also, when shooting is performed, incident light from the finder optical system 6 can be prevented from reaching the imaging unit 1 by the light shielding plate 47 while light from an object is guided to the imaging unit 1. In the live view shooting mode, incident light from the finder optical system 6 can be prevented from reaching the imaging unit 1 by the light shielding plate 47.
- Specifically, according to the second embodiment, the finder
optical system 6 is provided, but the present invention is not limited thereto. For example, a configuration including an electronic view finder (EVF), instead of the finder optical system 6 , may be employed. More specifically, a compact image display section comprised of a liquid crystal display or the like is arranged in the camera body 204 at a position where the user can view the image display section through the finder, and image data obtained in the imaging unit 1 is displayed at the image display section. Thus, even if the complex finder optical system 6 is not provided, shooting while viewing through the finder can be realized. In such a configuration, the quick return mirror 46 is not necessary. The shooting operation is the same as that of the camera 100 of the first embodiment, although two image display sections are provided. - In each of the above-described first and second embodiments, the configuration in which the
imaging unit 1 is mounted on a camera has been described. However, the present invention is not limited thereto. For example, the imaging unit 1 can be mounted on a video camera. - An example shooting operation of a video camera will be described. When the
power switch 40 a is turned on, an aperture section and a shutter unit are opened, and image capturing is started in the imaging device 10 of the imaging unit 1. Then, photometry and white balance adjustment optimal for displaying a live view are performed, and a live view image is displayed at the image display section. Then, in parallel with image capturing by the imaging device 10 , an in-focus state is detected based on an output of the phase difference detection unit 20 mounted in the imaging unit 1 , and driving of the focus lens group is continued according to the movement of an object or the like. In this manner, the video camera remains in a standby state until a REC button is pressed, while continuing to display a live view image and to perform phase difference detection AF. When the REC button is operated, image data captured by the imaging device 10 is recorded while phase difference detection AF is repeated. Thus, an in-focus state can be maintained at all times, and as opposed to a known digital camera, micro driving of a focus lens in the optical axis direction (wobbling) does not have to be performed, so that an actuator such as a motor, which has a large electric load, does not have to be driven. - Also, the configuration in which when the
release button 40 b is pressed halfway down by the user (i.e., the S1 switch is turned on), AF is started has been described. However, AF may be performed before the release button 40 b is pressed halfway down. Moreover, the configuration in which AF is terminated when it is determined that an object image has been brought into focus has been described. However, AF may be continued after the focus determination, and AF may also be performed continuously without performing the focus determination. A specific example will be described hereinafter. In FIGS. 11 and 12 , after the shutter unit 42 is opened in Step Sa4, the phase difference focus detection of Step Sa6 and the focus lens driving of Step Sa7 are performed repeatedly. In parallel with this operation, the determination of Step Sa5, the photometry of Step Sa9, the image blur detection of Step Sa10, and the determination of Step Sa11 are performed. Thus, an in-focus state can be achieved even before the release button 40 b is pressed halfway down by the user. For example, by using this operation with live view image display, a live view image can be displayed in an in-focus state. If phase difference detection AF is used, live view image display and phase difference detection AF can be used together. The above-described operation may be added as a "continuous AF mode" to the functions of a camera. A configuration in which the "continuous AF mode" can be switched between on and off may be employed. - In each of the above-described first and second embodiments, the configuration in which the
imaging unit 1 is mounted in a camera has been described. However, the present invention is not limited to the above-described configuration. The camera in which theimaging unit 1 is mounted is an example of cameras in which exposure of an imaging device and phase difference detection by a phase difference detection unit can be simultaneously performed. A camera according to the present invention is not limited thereto, but may have a configuration in which object light is guided to both of an imaging device and a phase difference detection unit, for example, by an optical isolation device (such as, for example, a prism, a semi-transparent mirror, and the like) for isolating light to the image device. Moreover, a camera in which a part of each microlens of an imaging device is used as a separator lens and is arranged so that pupil-divided object light can be received at light receiving sections may be employed. - Note that the above-described embodiments are essentially preferable examples which are illustrative and do not limit the present invention, its applications and the scope of use of the invention.
- As has been described, the present invention is useful particularly for an imaging apparatus including an imaging device for performing photoelectric conversion.
Claims (10)
1-20. (canceled)
21. An imaging apparatus, comprising:
an imaging section configured to convert light into an electrical signal by photoelectric conversion to output the electrical signal;
a phase difference detection section configured to receive light to perform phase difference detection; and
a control section configured to control the phase difference detection section to allow the phase difference detection section to perform the phase difference detection in parallel with obtaining an output from the imaging section.
22. The imaging apparatus of claim 21 , further comprising
a display section configured to display an image, wherein
the control section allows the display section to display an image based on the output from the imaging section in parallel with controlling the phase difference detection section to allow the phase difference detection section to perform the phase difference detection.
23. The imaging apparatus of claim 21 , further comprising
a focus lens configured to adjust a focus position, wherein
the control section is configured to perform focus adjustment using contrast detection based on the output from the imaging section and determines, based on a detection result of the phase difference detection, a direction in which the focus lens is driven when driving of the focus lens is started in the focus adjustment.
24. The imaging apparatus of claim 21 , wherein
the control section detects a contrast value of an image based on the output from the imaging section and performs, when the contrast value is equal to or larger than a predetermined value, focus adjustment at least based on a detection result of the phase difference detection section, and performs, when the contrast value is smaller than the predetermined value, focus adjustment not using the detection result of the phase difference detection section but based on the output from the imaging section.
25. The imaging apparatus of claim 21 , further comprising
an imaging apparatus body in which the imaging section, the phase difference detection section and the control section are provided; and
an interchangeable lens attached to the imaging apparatus body so as to be removable,
wherein
when the interchangeable lens is a reflecting telephoto lens and a product made by a different manufacturer from a manufacturer by which the imaging apparatus body is made, the control section performs focus adjustment not using a detection result of the phase difference detection section but based on the output from the imaging section, and when the interchangeable lens is not a reflecting telephoto lens or is a product made by the same manufacturer by which the imaging apparatus body is made, the control section performs focus adjustment at least based on the detection result of the phase difference detection section.
26. An imaging apparatus, comprising:
an imaging section configured to convert light into an electrical signal by photoelectric conversion to output the electrical signal;
a phase difference detection section configured to receive light to perform phase difference detection; and
a control section configured to perform photometry using the imaging section in parallel with controlling the phase difference detection section to allow the phase difference detection section to perform the phase difference detection.
27. The imaging apparatus of claim 26 , further comprising
a focus lens configured to adjust a focus position, wherein
the control section is configured to perform focus adjustment using contrast detection based on an output from the imaging section and determines, based on a detection result of the phase difference detection, a direction in which the focus lens is driven when driving of the focus lens is started in the focus adjustment.
28. The imaging apparatus of claim 26 , wherein
the control section detects a contrast value of an image based on an output from the imaging section and performs, when the contrast value is equal to or larger than a predetermined value, focus adjustment at least based on a detection result of the phase difference detection section, and performs, when the contrast value is smaller than the predetermined value, focus adjustment not using the detection result of the phase difference detection section but based on the output from the imaging section.
29. The imaging apparatus of claim 26 , further comprising:
an imaging apparatus body in which the imaging section, the phase difference detection section and the control section are provided; and
an interchangeable lens attached to the imaging apparatus body so as to be removable,
wherein
when the interchangeable lens is a reflecting telephoto lens and a product made by a different manufacturer from a manufacturer by which the imaging apparatus body is made, the control section performs focus adjustment not using a detection result of the phase difference detection section but based on an output from the imaging section, and when the interchangeable lens is not a reflecting telephoto lens or is a product made by the same manufacturer by which the imaging apparatus body is made, the control section performs focus adjustment at least based on the detection result of the phase difference detection section.
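Claim 29 gates the autofocus method on lens identity: a reflecting (catadioptric) telephoto lens from another manufacturer is the one combination that excludes the phase difference result. A sketch of that two-condition check, with invented flag names standing in for whatever lens-identification data the body would actually read from the interchangeable lens:

```python
def choose_af_for_lens(is_reflecting_telephoto, same_manufacturer):
    # Only the combination "reflecting telephoto AND different
    # manufacturer" excludes the phase difference detection result;
    # every other lens gets focus adjustment that at least consults it.
    if is_reflecting_telephoto and not same_manufacturer:
        return "contrast_detection"
    return "phase_difference"
```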
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/657,239 US20130044246A1 (en) | 2008-02-22 | 2012-10-22 | Imaging apparatus |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008041578 | 2008-02-22 | ||
JP2008-041576 | 2008-02-22 | ||
JP2008-041578 | 2008-02-22 | ||
JP2008041576 | 2008-02-22 | ||
PCT/JP2009/000671 WO2009104390A1 (en) | 2008-02-22 | 2009-02-18 | Imaging device |
US91895710A | 2010-08-23 | 2010-08-23 | |
US13/657,239 US20130044246A1 (en) | 2008-02-22 | 2012-10-22 | Imaging apparatus |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/000671 Division WO2009104390A1 (en) | 2008-02-22 | 2009-02-18 | Imaging device |
US91895710A Division | 2008-02-22 | 2010-08-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130044246A1 true US20130044246A1 (en) | 2013-02-21 |
Family
ID=40985278
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/918,957 Expired - Fee Related US8319870B2 (en) | 2008-02-22 | 2009-02-18 | Imaging apparatus |
US13/657,239 Abandoned US20130044246A1 (en) | 2008-02-22 | 2012-10-22 | Imaging apparatus |
US13/657,177 Abandoned US20130050550A1 (en) | 2008-02-22 | 2012-10-22 | Imaging apparatus |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/918,957 Expired - Fee Related US8319870B2 (en) | 2008-02-22 | 2009-02-18 | Imaging apparatus |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/657,177 Abandoned US20130050550A1 (en) | 2008-02-22 | 2012-10-22 | Imaging apparatus |
Country Status (4)
Country | Link |
---|---|
US (3) | US8319870B2 (en) |
JP (3) | JP5128616B2 (en) |
CN (1) | CN101952759B (en) |
WO (1) | WO2009104390A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120092545A1 (en) * | 2009-07-07 | 2012-04-19 | Canon Kabushiki Kaisha | Focus detection apparatus |
US20160080631A1 (en) * | 2014-09-15 | 2016-03-17 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
US9900494B2 (en) | 2014-09-09 | 2018-02-20 | Fujifilm Corporation | Imaging device and focus control method |
US10250793B2 (en) | 2011-06-29 | 2019-04-02 | Nikon Corporation | Focus adjustment device having a control unit that drives a focus adjustment optical system to a focused position acquired first by either a contrast detection system or a phase difference detection system |
Families Citing this family (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10298834B2 (en) | 2006-12-01 | 2019-05-21 | Google Llc | Video refocusing |
JPWO2009069752A1 (en) * | 2007-11-29 | 2011-04-21 | 京セラ株式会社 | Imaging apparatus and electronic apparatus |
JP5436149B2 (en) * | 2009-11-05 | 2014-03-05 | キヤノン株式会社 | Imaging device |
JP5653035B2 (en) * | 2009-12-22 | 2015-01-14 | キヤノン株式会社 | Imaging apparatus, focus detection method, and control method |
JP5537186B2 (en) * | 2010-02-24 | 2014-07-02 | オリンパスイメージング株式会社 | Attachment lens device and imaging device including the same |
JP5691247B2 (en) * | 2010-05-28 | 2015-04-01 | ソニー株式会社 | Interchangeable lens, imaging device, imaging system, interchangeable lens control method and program |
JP5778931B2 (en) * | 2011-01-25 | 2015-09-16 | キヤノン株式会社 | Imaging apparatus and control method thereof |
KR20130005882A (en) * | 2011-07-07 | 2013-01-16 | 삼성전자주식회사 | Digital photographing apparatus, method for the same, and method for auto-focusing |
JP2013050690A (en) * | 2011-08-04 | 2013-03-14 | Nikon Corp | Focus adjusting device and imaging apparatus |
JP5253688B1 (en) * | 2011-08-10 | 2013-07-31 | オリンパスメディカルシステムズ株式会社 | Endoscope device |
JP5900257B2 (en) * | 2012-09-11 | 2016-04-06 | ソニー株式会社 | Processing apparatus, processing method, and program |
JP5937690B2 (en) * | 2012-09-19 | 2016-06-22 | 富士フイルム株式会社 | Imaging apparatus and control method thereof |
JP5799178B2 (en) | 2012-11-29 | 2015-10-21 | 富士フイルム株式会社 | Imaging apparatus and focus control method |
EP2951987A4 (en) * | 2013-02-01 | 2016-08-10 | Canon Kk | Image pickup apparatus, image processing apparatus, image processing method, and non-transitory computer-readable storage medium |
US10334151B2 (en) * | 2013-04-22 | 2019-06-25 | Google Llc | Phase detection autofocus using subaperture images |
DE102013210078B4 (en) | 2013-05-29 | 2015-04-30 | Trumpf Werkzeugmaschinen Gmbh + Co. Kg | Apparatus, method and computer program product for determining the focus position of a high energy beam |
JP6410431B2 (en) * | 2014-01-30 | 2018-10-24 | オリンパス株式会社 | Camera system |
CN106464783B (en) * | 2014-05-26 | 2020-01-21 | 索尼公司 | Image pickup control apparatus, image pickup apparatus, and image pickup control method |
JP6346793B2 (en) * | 2014-06-03 | 2018-06-20 | オリンパス株式会社 | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM |
JP2016006449A (en) | 2014-06-20 | 2016-01-14 | キヤノン株式会社 | Image pickup apparatus and control method thereof |
US9648236B2 (en) * | 2015-02-19 | 2017-05-09 | Blackberry Limited | Device with a front facing camera having discrete focus positions |
US10341632B2 (en) | 2015-04-15 | 2019-07-02 | Google Llc | Spatial random access enabled video system with a three-dimensional viewing volume
US10469873B2 (en) | 2015-04-15 | 2019-11-05 | Google Llc | Encoding and decoding virtual reality video |
US10444931B2 (en) | 2017-05-09 | 2019-10-15 | Google Llc | Vantage generation and interactive playback |
US10419737B2 (en) | 2015-04-15 | 2019-09-17 | Google Llc | Data structures and delivery methods for expediting virtual reality playback |
US10567464B2 (en) | 2015-04-15 | 2020-02-18 | Google Llc | Video compression with adaptive view-dependent lighting removal |
US10546424B2 (en) | 2015-04-15 | 2020-01-28 | Google Llc | Layered content delivery for virtual and augmented reality experiences |
US10540818B2 (en) | 2015-04-15 | 2020-01-21 | Google Llc | Stereo image generation and interactive playback |
JP6600362B2 (en) * | 2015-10-30 | 2019-10-30 | オリンパス株式会社 | Endoscope apparatus and method for operating endoscope apparatus |
US9880447B2 (en) * | 2016-06-27 | 2018-01-30 | Google Llc | Camera module assembly with movable reflective elements |
US10015389B2 (en) * | 2016-09-22 | 2018-07-03 | Omnivision Technologies, Inc. | Image sensor with asymmetric-microlens phase-detection auto-focus (PDAF) detectors, associated PDAF imaging system, and associated method |
US10679361B2 (en) | 2016-12-05 | 2020-06-09 | Google Llc | Multi-view rotoscope contour propagation |
JP2020095069A (en) | 2017-03-31 | 2020-06-18 | 株式会社ニコン | Imaging device |
US10594945B2 (en) | 2017-04-03 | 2020-03-17 | Google Llc | Generating dolly zoom effect using light field image data |
US10474227B2 (en) | 2017-05-09 | 2019-11-12 | Google Llc | Generation of virtual reality with 6 degrees of freedom from limited viewer data |
US10354399B2 (en) | 2017-05-25 | 2019-07-16 | Google Llc | Multi-view back-projection to a light-field |
JP7230807B2 (en) * | 2017-08-09 | 2023-03-01 | ソニーグループ株式会社 | SIGNAL PROCESSING DEVICE, IMAGING DEVICE, SIGNAL PROCESSING METHOD AND PROGRAM |
US10965862B2 (en) | 2018-01-18 | 2021-03-30 | Google Llc | Multi-camera navigation interface |
EP3745705A4 (en) * | 2018-03-23 | 2020-12-30 | Huawei Technologies Co., Ltd. | Video image anti-shake method and terminal |
JP2022134228A (en) | 2021-03-03 | 2022-09-15 | キヤノン株式会社 | Control unit, imaging apparatus, lens device, control method, and program |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5962809A (en) * | 1982-10-04 | 1984-04-10 | Olympus Optical Co Ltd | Focus detecting device |
JPS6462830A (en) * | 1987-09-02 | 1989-03-09 | Sharp Kk | Optical detector |
JP3385172B2 (en) * | 1996-11-26 | 2003-03-10 | 京セラ株式会社 | Automatic focus detection device using two-dimensional sensor |
JPH11352394A (en) * | 1998-06-09 | 1999-12-24 | Minolta Co Ltd | Focus detector |
JP3592147B2 (en) * | 1998-08-20 | 2004-11-24 | キヤノン株式会社 | Solid-state imaging device |
JP2000101880A (en) | 1998-09-25 | 2000-04-07 | Olympus Optical Co Ltd | Image pickup device |
JP4007716B2 (en) * | 1999-04-20 | 2007-11-14 | オリンパス株式会社 | Imaging device |
JP4756721B2 (en) | 1999-05-17 | 2011-08-24 | キヤノン株式会社 | Camera focus detection device |
JP4323002B2 (en) | 1999-05-26 | 2009-09-02 | オリンパス株式会社 | Imaging device |
JP3977062B2 (en) * | 2001-11-21 | 2007-09-19 | キヤノン株式会社 | Imaging apparatus and focus adjustment method |
US6768867B2 (en) | 2002-05-17 | 2004-07-27 | Olympus Corporation | Auto focusing system |
JP2004046132A (en) | 2002-05-17 | 2004-02-12 | Olympus Corp | Automatic focusing system |
JP2005175976A (en) * | 2003-12-12 | 2005-06-30 | Canon Inc | Automatic focusing system of image pickup apparatus employing image pickup element having multilayer photodiode |
JP4478599B2 (en) * | 2005-03-22 | 2010-06-09 | キヤノン株式会社 | Imaging device |
JP2007135140A (en) | 2005-11-14 | 2007-05-31 | Konica Minolta Photo Imaging Inc | Imaging apparatus |
US7728903B2 (en) * | 2005-11-30 | 2010-06-01 | Nikon Corporation | Focus adjustment device, focus adjustment method and camera |
JP4834394B2 (en) | 2005-12-09 | 2011-12-14 | キヤノン株式会社 | Imaging apparatus and control method thereof |
JP4935078B2 (en) | 2006-01-10 | 2012-05-23 | 株式会社ニコン | Solid-state imaging device and electronic camera using the same |
US7751700B2 (en) * | 2006-03-01 | 2010-07-06 | Nikon Corporation | Focus adjustment device, imaging device and focus adjustment method |
JP2007248782A (en) * | 2006-03-15 | 2007-09-27 | Olympus Imaging Corp | Focusing device and camera |
US7725018B2 (en) * | 2006-08-01 | 2010-05-25 | Canon Kabushiki Kaisha | Focus control apparatus, image sensing apparatus and focus control method |
JP4349407B2 (en) * | 2006-11-17 | 2009-10-21 | ソニー株式会社 | Imaging device |
CN101779154B (en) * | 2007-08-13 | 2012-03-07 | 松下电器产业株式会社 | Imaging device and camera |
US8077233B2 (en) * | 2008-02-22 | 2011-12-13 | Panasonic Corporation | Imaging apparatus |
JP4902890B2 (en) * | 2008-02-22 | 2012-03-21 | パナソニック株式会社 | Imaging device |
US8068728B2 (en) * | 2008-02-22 | 2011-11-29 | Panasonic Corporation | Imaging apparatus |
US8077255B2 (en) * | 2008-02-22 | 2011-12-13 | Panasonic Corporation | Imaging apparatus |
CN102077577B (en) * | 2008-10-28 | 2013-05-15 | 松下电器产业株式会社 | Image pickup unit |
JP5147987B2 (en) * | 2009-02-18 | 2013-02-20 | パナソニック株式会社 | Imaging device |
JP5247522B2 (en) * | 2009-02-18 | 2013-07-24 | パナソニック株式会社 | Imaging device |
JP5604160B2 (en) * | 2010-04-09 | 2014-10-08 | パナソニック株式会社 | Imaging device |
2009
- 2009-02-18 US US12/918,957 patent/US8319870B2/en not_active Expired - Fee Related
- 2009-02-18 CN CN2009801059349A patent/CN101952759B/en not_active Expired - Fee Related
- 2009-02-18 JP JP2009554217A patent/JP5128616B2/en not_active Expired - Fee Related
- 2009-02-18 WO PCT/JP2009/000671 patent/WO2009104390A1/en active Application Filing
2012
- 2012-10-22 US US13/657,239 patent/US20130044246A1/en not_active Abandoned
- 2012-10-22 US US13/657,177 patent/US20130050550A1/en not_active Abandoned
- 2012-10-30 JP JP2012239334A patent/JP2013061661A/en active Pending
- 2012-10-30 JP JP2012239335A patent/JP2013054374A/en active Pending
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120092545A1 (en) * | 2009-07-07 | 2012-04-19 | Canon Kabushiki Kaisha | Focus detection apparatus |
US8730374B2 (en) * | 2009-07-07 | 2014-05-20 | Canon Kabushiki Kaisha | Focus detection apparatus |
US10250793B2 (en) | 2011-06-29 | 2019-04-02 | Nikon Corporation | Focus adjustment device having a control unit that drives a focus adjustment optical system to a focused position acquired first by either a contrast detection system or a phase difference detection system |
US10855905B2 (en) | 2011-06-29 | 2020-12-01 | Nikon Corporation | Focus adjustment device and imaging apparatus |
US11418698B2 (en) | 2011-06-29 | 2022-08-16 | Nikon Corporation | Focus adjustment device and imaging apparatus |
US9900494B2 (en) | 2014-09-09 | 2018-02-20 | Fujifilm Corporation | Imaging device and focus control method |
US20160080631A1 (en) * | 2014-09-15 | 2016-03-17 | Lg Electronics Inc. | Mobile terminal and method of controlling the same |
US9681038B2 (en) * | 2014-09-15 | 2017-06-13 | Lg Electronics Inc. | Mobile terminal and method for setting a focal point value |
Also Published As
Publication number | Publication date |
---|---|
CN101952759A (en) | 2011-01-19 |
JPWO2009104390A1 (en) | 2011-06-16 |
JP5128616B2 (en) | 2013-01-23 |
US20130050550A1 (en) | 2013-02-28 |
US8319870B2 (en) | 2012-11-27 |
CN101952759B (en) | 2012-12-19 |
WO2009104390A1 (en) | 2009-08-27 |
US20110001858A1 (en) | 2011-01-06 |
JP2013054374A (en) | 2013-03-21 |
JP2013061661A (en) | 2013-04-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8319870B2 (en) | Imaging apparatus | |
US20110304765A1 (en) | Imaging apparatus | |
US8077233B2 (en) | Imaging apparatus | |
US8988584B2 (en) | Imaging apparatus | |
US8384815B2 (en) | Imaging apparatus | |
JP5247663B2 (en) | Imaging device | |
US8077255B2 (en) | Imaging apparatus | |
US8593563B2 (en) | Imaging device and imaging apparatus including the same | |
JPWO2010050184A1 (en) | Imaging unit | |
US8078047B2 (en) | Imaging apparatus | |
JP2010113272A (en) | Imaging apparatus | |
JP4863370B2 (en) | Imaging device | |
US8068728B2 (en) | Imaging apparatus | |
JP2010113273A (en) | Imaging apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |