WO2009035147A1 - Imaging apparatus - Google Patents

Imaging apparatus

Info

Publication number
WO2009035147A1
WO2009035147A1 (PCT/JP2008/066915)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel group
focus
focus detection
imaging
image
Prior art date
Application number
PCT/JP2008/066915
Other languages
French (fr)
Inventor
Ken-Ichiro Amano
Original Assignee
Canon Kabushiki Kaisha
Priority date
Filing date
Publication date
Application filed by Canon Kabushiki Kaisha filed Critical Canon Kabushiki Kaisha
Priority to US12/670,178 (US8212917B2)
Priority to EP08830582.6A (EP2191318B1)
Priority to CN2008801071666A (CN101802673B)
Publication of WO2009035147A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/34Systems for automatic generation of focusing signals using different areas in a pupil plane
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/365Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals by analysis of the spatial frequency components of the image
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/673Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith
    • H04N25/703SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/704Pixels specially adapted for focusing, e.g. phase difference pixel sets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Abstract

An imaging apparatus includes an image pickup device that includes a first pixel group configured to photoelectrically convert an object image formed by a luminous flux from an imaging optical system, and a second pixel group which includes a plurality of pixels configured to photoelectrically convert a split pair of the luminous flux from the imaging optical system; and a detecting unit configured to implement a first detection control that changes an imaging state of the image pickup device while detecting a contrast of the object image based on an output of the second pixel group, and then a second detection control that changes the imaging state of the image pickup device while detecting the contrast of the object image based on an output of the first pixel group.

Description

DESCRIPTION
IMAGING APPARATUS
TECHNICAL FIELD
The present invention relates to an imaging apparatus, such as a digital camera and a video camera, and more particularly to an imaging apparatus which implements focus control by using an output from an image pickup device.
BACKGROUND ART
Japanese Patent Laid-Open No. ("JP") 2000-156823 discloses an imaging apparatus configured to give a part of the pixels in its image pickup device an optical characteristic different from that of the other pixels, and to implement focus detection based on an output from that part of the pixels.
The imaging apparatus disclosed in JP 2000-156823 arranges plural pairs of focus detection pixels in a part of the image pickup device. FIG. 6 shows an illustrative pixel arrangement of an image pickup device that arranges focus detection pixels in some of the rows of the pixel matrix. In FIG. 6, R, G, and B are normal imaging pixels in which a red filter, a green filter, and a blue filter are respectively arranged. S1 and S2 are first and second focus detection pixels which have optical characteristics different from those of the imaging pixels.
FIG. 7 shows a structure of the first focus detection pixel S1. In FIG. 7, a micro lens 501 is formed on the light incident side of the first focus detection pixel. 502 is a smoothing layer that has a flat surface configured to form the micro lens 501.
503 is a light shielding layer which includes a stop aperture part that is decentered in one direction with respect to a center O of the photoelectric conversion area 504 in the first focus detection pixel S1.
FIG. 8 shows a structure of the second focus detection pixel S2. In FIG. 8, a micro lens 601 is formed on the light incident side of the second focus detection pixel. 602 is a smoothing layer that has a flat surface configured to form the micro lens 601.
603 is a light shielding layer which has a stop aperture part decentered in the direction opposite to that of the light shielding layer 503 in the first focus detection pixel S1, with respect to the center O of the photoelectric conversion area 604 in the second focus detection pixel S2. In other words, the light shielding layers 503 and 603 in the first and second focus detection pixels S1 and S2 include stop aperture parts that are placed symmetrically with respect to the optical axis of each micro lens. This structure is equivalent to symmetrically splitting the pupil of the imaging optical system between the first focus detection pixel S1 and the second focus detection pixel S2.
In FIG. 6, the rows which include the first focus detection pixels S1 and those which include the second focus detection pixels S2 are set such that the two images can be closer to each other as the number of pixels in the image pickup device increases. The rows including the first focus detection pixels S1 and those including the second focus detection pixels S2 have the same outputs (or image signals) when the imaging optical system is in an in-focus state to the object.
On the other hand, when the imaging optical system is in an out-of-focus state, a phase difference occurs between the image signals derived from the rows including the first focus detection pixels S1 and those including the second focus detection pixels S2. The phase-difference directions are opposite between the front focus state and the back focus state. FIGs. 9A and 9B show relationships between focusing states and phase differences. In these figures, the focus detection pixels S1 and S2 are shown moved close to one another and are referred to as points A and B. The imaging pixels are omitted.
A luminous flux from a specific spot on the object is split into a luminous flux φLa that is incident upon a focus detection pixel A via the split pupil corresponding to the focus detection pixel A, and a luminous flux φLb that is incident upon a focus detection pixel B via the split pupil corresponding to the focus detection pixel B. These two luminous fluxes originate from the same point on the object, and reach one point on the image pickup device via the same micro lens, as shown in FIG. 9A, when the imaging optical system is in the in-focus state to the object. Accordingly, the image signals from the rows including the first focus detection pixels A (S1) and those including the second focus detection pixels B (S2) correspond to one another.
However, in an out-of-focus state by a distance x shown in FIG. 9B, the positions reached by the luminous fluxes φLa and φLb shift by an amount corresponding to the change in the incident angles of φLa and φLb on the micro lenses. Consequently, a phase difference occurs between the image signals from the rows including the first focus detection pixels A (S1) and those including the second focus detection pixels B (S2). With the foregoing in mind, the imaging apparatus disclosed in JP 2000-156823 implements focus control of a phase difference detection method that utilizes the image pickup device. JP 2001-305415 discloses an imaging apparatus that is suitable for detections of a horizontal line and a vertical line of an object, and that is configured to implement both focus control of a phase difference detection method that utilizes an output from the image pickup device and focus control of a contrast detection method.
However, the imaging apparatus disclosed in JP 2000-156823 has difficulty in properly obtaining a phase difference or a defocus amount because the two images formed on the first and the second focus detection pixels become asymmetrical to one another as the defocus amount increases.
JP 2001-305415 discloses an imaging apparatus that applies a contrast detection method to focus control when a defocus amount obtained by the phase difference method is less reliable, but this reference is silent about the proper use of the focus control of the contrast detection method that utilizes the focus detection pixels and the focus control of the contrast detection method that utilizes pixels other than the focus detection pixels.
DISCLOSURE OF INVENTION
The present invention provides an imaging apparatus that can implement both focus control of a contrast detection method that utilizes a focus detection pixel and focus control of a contrast detection method that utilizes a pixel other than the focus detection pixel, in addition to focus control of a phase difference detection method that utilizes a focus detection pixel.
An imaging apparatus according to one aspect of the present invention includes an image pickup device that includes a first pixel group configured to photoelectrically convert an object image formed by a luminous flux from an imaging optical system, and a second pixel group which includes a plurality of pixels configured to photoelectrically convert a split pair of the luminous flux from the imaging optical system; and a detecting unit configured to implement a first detection control that changes an imaging state of the image pickup device while detecting a contrast of the object image based on an output of the second pixel group, and then a second detection control that changes the imaging state of the image pickup device while detecting the contrast of the object image based on an output of the first pixel group. A control method for an imaging apparatus according to another aspect of the present invention is for an imaging apparatus that provides an image pickup device that includes a first pixel group configured to photoelectrically convert an object image formed by a luminous flux from an imaging optical system, and a second pixel group which includes a plurality of focus detection pixels configured to photoelectrically convert a split one of the luminous fluxes from the imaging optical system. The control method includes the steps of implementing a first focus control of a phase difference detection method which utilizes an output of the second pixel group, implementing a second focus control of a contrast detection method which utilizes an output of the first pixel group, implementing a third focus control of a contrast detection method which utilizes an output of the second pixel group, and changing a focus control to be implemented among the first to third focus controls.
Other features and advantages of the present invention will be apparent from the following description given in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing a configuration of a digital camera according to an embodiment of the present invention. FIG. 2 is a diagram showing an arrangement of an imaging pixel group and a focus detection pixel group according to the embodiment.
FIG. 3 is a flow chart of an operation of the camera according to the embodiment. FIG. 4 is a diagram showing AF types of the camera according to the embodiment.
FIG. 5 is a flow chart of an operation of the AF types of the camera according to the embodiment.
FIG. 6 is a diagram showing an arrangement of an imaging pixel group and a focus detection pixel group.
FIG. 7 is a diagram showing a structure of a first focus detection pixel.
FIG. 8 is a diagram showing a structure of a second focus detection pixel. FIG. 9A is a schematic view of a phase difference in image signals in accordance with an in-focus state.
FIG. 9B is a schematic view of a phase difference in image signals in accordance with a (front) focus state.
BEST MODE FOR CARRYING OUT THE INVENTION
Referring now to the accompanying drawings, a description will be given of an embodiment of the present invention. FIG. 1 shows a configuration of a digital camera as an imaging apparatus according to one embodiment of the present invention.
A camera 100 includes an imaging optical system 101 which forms an object image using a luminous flux from the object, a lens controller 102 which controls a position of a focus lens (not shown) included in the imaging optical system 101, a stop 103 which adjusts an incident light intensity from the imaging optical system 101, and an image pickup device 104 as a photoelectric conversion element comprising a CMOS sensor having a light receiving surface on which the object image is formed by the luminous flux from the imaging optical system 101.
The image pickup device 104 has an imaging pixel group (a first pixel group) 105 which has a plurality of imaging pixels, each provided with one of the R, G, and B color filters, for photoelectric conversion of the object image formed by the imaging optical system 101. The imaging pixel group 105 outputs an imaging signal used to generate the object image. The image pickup device 104 also has a focus detection pixel group (a second pixel group) 106 which outputs a pair of image signals used to detect a focusing state (or for a focus detection) of the imaging optical system 101.
The focus detection pixel group 106 includes a plurality of first and second focus detection pixels for photoelectric conversions of a luminous flux pupil-split by a pupil splitting optical system 107, which will be described later. The first phase difference sensor has a plurality of first focus detection pixels, and the second phase difference sensor has a plurality of second focus detection pixels. The first phase difference sensor outputs one of the above pair of image signals, whereas the second phase difference sensor outputs the other of the pair of image signals.
The image pickup device 104 includes a pupil splitting optical system 107 configured to make a pupil-split luminous flux among the luminous fluxes from the imaging optical system 101 incident upon the first and second phase difference sensors.
FIG. 2 shows a pixel arrangement in the image pickup device 104 in this embodiment. In FIG. 2, S1 is the first focus detection pixel, and S2 is the second focus detection pixel in the focus detection pixel group 106. The first and second focus detection pixels S1 and S2 have structures similar to those shown in FIGs. 7 and 8. In other words, the light shielding layers in the first and the second focus detection pixels S1 and S2 have stop aperture parts placed symmetrically to one another with respect to the optical axis of the micro lens serving as the pupil splitting optical system 107.
In FIG. 2, the pixel rows into which the first focus detection pixels S1 are discretely inserted form the first phase difference sensor. The pixel rows that are separated from the first phase difference sensor by a predetermined interval (or an interval of one pixel in FIG. 2) and that discretely arrange the second focus detection pixels S2 form the second phase difference sensor. One focus detection pixel group (second pixel group) including the first and second phase difference sensors forms one focus detection area. In FIG. 2, the first focus detection area and the second focus detection area are arranged at the top and the bottom of the image pickup device 104, respectively.
The camera 100 has a focus detector (or focus detecting unit) 108 which calculates a phase difference between a pair of image signals output from the first and second phase difference sensors in each focus detection area by applying a correlation operation.
The "pair of image signals output from the first and second phase difference sensors (in other words, the focus detection pixel group 106) , " as used herein, means primarily a pair of image signals generated exclusively from output signals of the focus detection pixels Sl and S2. A pair of image signal may also be generated from output signals of the entire focus detection pixel group.
The focus detector 108 further calculates, based on a phase difference, a defocus amount indicative of a focusing state of the imaging optical system 101 to the object whose image is formed on the focus detection area.
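The particular correlation operation and the shift-to-defocus conversion are not specified in the text. The following is a minimal Python sketch, assuming a sum-of-absolute-differences search and a hypothetical conversion factor that would in practice be determined by the pupil-splitting geometry:

    import numpy as np

    def phase_difference(signal_a, signal_b, max_shift=20):
        # Estimate the shift (in pixels) between the pair of image signals
        # from the first and second phase difference sensors by minimising
        # the sum of absolute differences; max_shift must stay well below
        # the signal length. The search form is illustrative only.
        signal_a = np.asarray(signal_a, dtype=float)
        signal_b = np.asarray(signal_b, dtype=float)
        best_shift, best_score = 0, float("inf")
        for s in range(-max_shift, max_shift + 1):
            a = signal_a[max(0, s):len(signal_a) + min(0, s)]
            b = signal_b[max(0, -s):len(signal_b) + min(0, -s)]
            score = np.abs(a - b).mean()
            if score < best_score:
                best_shift, best_score = s, score
        return best_shift

    def defocus_from_shift(shift_pixels, pixel_pitch_um, conversion_factor):
        # Convert the image shift into a defocus amount. The conversion
        # factor (set by the base length of the split pupils) is a
        # hypothetical parameter, not a value taken from the patent.
        return shift_pixels * pixel_pitch_um * conversion_factor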
In this embodiment, the focus detector 108 calculates a defocus amount, but the focus detector 108 may instead calculate a phase difference between the image signals, and the camera controller 117, which will be described later, may calculate a defocus amount based on the phase difference. In addition, this embodiment regards a focusing state as a defocus amount, but a phase difference may be regarded as a focusing state.
In this way, the focus detector 108 provides individual focus detections for each focus detection area (or calculates a defocus amount).
As understood from FIGs. 7 and 8, the focus detection pixel group 106 (including the first and second focus detection pixels S1 and S2) has a view limited by the light shielding layer provided in each pixel and has no color filter. Accordingly, the level of the image signal from the focus detection pixel group 106 is different from that of the image signal output from a plurality of pixels near the focus detection pixel group 106 (hereinafter referred to as an adjacent pixel group) in the imaging pixel group 105.
For this reason, the camera 100 has a gain controller 111 configured to control a gain to the image signal from the focus detection pixel group 106 in order to make the level of the image signal from the focus detection pixel group 106 conform to that of the image signal from the adjacent pixel group.
The camera 100 further includes a spatial frequency detector (or frequency component detector) 109 configured to detect the intensity of a specific frequency component (or high frequency component) contained in the image signal from the adjacent pixel group (or imaging pixel group 105). The high frequency component represents a spatial frequency component of an object image formed in the adjacent pixel group (or the first pixel group).
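The text does not define the measure used by the spatial frequency detector 109; one simple, illustrative stand-in is the mean absolute value of a horizontal first difference over the adjacent pixel block:

    import numpy as np

    def high_frequency_intensity(pixel_block):
        # Rough measure of the high-frequency (contrast) content of a 2-D
        # block of imaging-pixel values: mean absolute horizontal difference.
        # This is an assumed stand-in, not the detector's actual filter.
        block = np.asarray(pixel_block, dtype=float)
        return np.abs(np.diff(block, axis=1)).mean()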
The camera 100 further includes a determination switch 110. The determination switch 110 switches a determination criterion of an interpolation performed at a pixel interpolator 112, which will be described later, between the focusing state detected by the focus detection pixel group 106 and the intensity of the high frequency component detected by the spatial frequency detector 109.
The pixel interpolator (or image generator) 112 interpolates and generates image data corresponding to the focus detection pixel group 106 based on an output of the adjacent pixel group. In other words, based on the output of the imaging pixel group 105 (the adjacent pixel group), the pixel interpolator 112 generates a partial image corresponding to the focus detection pixel group 106 among the entire image obtained from an output of the image pickup device 104.
The "image data corresponding to the focus detection pixel group 106 (or the partial image)" may be image data to an area which entirely covers the focus detection pixel group 106 or image data for each of the focus detection pixels Sl or S2.
The camera 100 further includes an image processor 113 configured to provide image processes, such as a gamma correction, a white balance adjustment, resampling for display, and an image compression and encoding, to an image signal output from the imaging pixel group 105.
The camera 100 further includes a display 114 configured to display (still) image data output from the image processor 113, a recorder 115 configured to record image data in a recording medium, such as a semiconductor memory or an optical disc, an operating part 116 which accepts a user's inputs, and a camera controller 117 as a controller configured to control the entire camera 100. The camera controller 117 calculates a driving amount of the focus lens so as to obtain an in-focus state based on a defocus amount obtained from the focus detector 108. The calculated driving amount is output to the lens controller 102, which, in turn, moves the focus lens based on the driving amount.
In this way, as shown in circle "1" in FIG. 4, the camera controller 117 applies AF of a phase difference detection method (first focus control, simply referred to as "phase difference AF" hereinafter), which utilizes an output (or image signal) from the focus detection pixel group 106, so as to obtain an in-focus state. However, this phase difference AF has a wider in-focus range than AF of a contrast detection method, which will be described later. The camera 100 further includes a sharpness detector 118. The sharpness detector 118 detects (or extracts) a high frequency component contained in an image signal from the imaging pixel group 105 or the focus detection pixel group 106. Then, the sharpness detector 118 generates focus assessment information (or an AF assessment value signal) based on the high frequency component, and outputs it to the camera controller 117. The focus assessment information represents a contrast state, in other words the sharpness, of the object image. The camera controller 117 obtains the in-focus state by moving the focus lens to the position that provides the largest value of the focus assessment information. This is AF of the contrast detection method, and the in-focus state can be obtained more precisely and rapidly when it is combined with the phase difference AF.
For example, the camera controller 117 initially performs the phase difference AF to move the focus lens to the vicinity of an in-focus position, and then performs AF of a contrast detection method to move the focus lens to a more precise in-focus position. The AF of the contrast detection method is also effective in maintaining the in-focus state when the camera 100 takes motion pictures.
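A minimal sketch of this two-stage sequence, with hypothetical lens and detector interfaces (none of these names appear in the text): a single coarse move derived from the phase-difference defocus amount, followed by a hill-climbing search on the AF assessment value:

    def hybrid_autofocus(lens, focus_detector, sharpness_detector,
                         step_um=10, max_steps=50):
        # Stage 1: phase difference AF -- one defocus measurement, one coarse move.
        lens.move_by(-focus_detector.defocus_amount())

        # Stage 2: contrast AF -- climb toward the peak of the AF assessment value.
        best_pos = lens.position()
        best_val = sharpness_detector.assessment_value()
        for direction in (+1, -1):
            pos = best_pos
            for _ in range(max_steps):
                pos += direction * step_um
                lens.move_to(pos)
                val = sharpness_detector.assessment_value()
                if val <= best_val:
                    break  # passed the peak in this direction
                best_pos, best_val = pos, val
        lens.move_to(best_pos)
        return best_pos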
"2" in FIG. 4, AF of a contrast detection method (which is second focus control simply referred to as an "imaging pixel contrast AF" hereinafter) that utilizes an output
(or imaging signal) from the imaging pixel group 105. An alternative embodiment may provide, as shown in a circle "3" in FIG. 4, AF of a contrast detection method (which is third focus control simply referred to as a "focus detection pixel contrast AF" hereinafter) which utilizes an output (or image signal) from the focus detection pixel group 106. In the focus detection pixel contrast AF, any one of the outputs from the first and second focus detection pixels Sl and S2 may be used, or these outputs may be alternately used.
FIG. 3 shows an operation of the camera 100 (mainly the camera controller 117) according to this embodiment. The operation is implemented in accordance with a computer program which is stored in a memory (not shown) in the camera controller 117.
The camera controller 117 starts the operation from the step S301 in response to an AF command signal (e.g., a signal output by pressing the release button halfway) input from the operating part 116. Although not specifically described, an imaging preparation, such as an exposure calculation, is performed in parallel with the AF operation.
In the step S301, the camera controller 117 implements an AF operation. When a defocus amount before the AF operation is implemented is small, the in-focus state is available by the phase difference AF which utilizes an output from the focus detection pixel group 106 described above. However, if a defocus amount before the AF operation is implemented is large, two images formed on the first and second focus detection pixels (or phase difference sensors) are asymmetrical to one another, and it will be difficult to accurately obtain a phase difference or a defocus amount.
For this reason, in the step S301, this embodiment performs the AF operation shown in FIG. 5.
In the step S401 shown in FIG. 5, the camera controller 117 instructs the focus detection pixel group 106 in the image pickup device 104 to initiate charge accumulations. After the charge accumulations are completed, the camera controller 117 outputs an image signal from the focus detection pixel group 106 to the focus detector 108. As described above, the focus detector 108 calculates a defocus amount, and outputs it to the camera controller 117. The camera controller 117 obtains the defocus amount from the focus detector 108.
Next, in the step S402, the camera controller 117 determines whether or not the obtained defocus amount is larger than a predetermined amount. If the defocus amount is smaller than the predetermined amount, the flow proceeds to the step S405 which implements the phase difference AF "1." If the defocus amount is larger than the predetermined amount, the flow proceeds to the step S403 in which the camera controller 117 implements the focus detection pixel contrast AF "3." Then, the flow proceeds to the step S404.
In the step S404, the camera controller 117 implements the imaging pixel contrast AF "2." In this way, the flow proceeds to the step S302 once the defocus amount has been precisely reduced to the in-focus level.
When the in-focus state is precisely obtained by the focus detection pixel contrast AF "3," the imaging pixel contrast AF "2" may be omitted. In other words, the imaging pixel contrast AF "2" may follow only when the focus detection pixel contrast AF "3" cannot provide a sufficiently precise in-focus state.
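The branching of FIG. 5 (steps S401 to S405), including the optional omission of the imaging pixel contrast AF "2" noted above, might be sketched as follows; the camera interface is hypothetical and only the branching mirrors the text:

    def af_operation(camera, defocus_threshold):
        defocus = camera.obtain_defocus_amount()               # S401
        if abs(defocus) <= defocus_threshold:                  # S402
            camera.run_phase_difference_af()                   # S405: AF "1"
        else:
            camera.run_contrast_af(pixels="focus_detection")   # S403: AF "3"
            if not camera.in_focus_precisely():
                camera.run_contrast_af(pixels="imaging")       # S404: AF "2"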
Thus, this embodiment can provide the in-focus state by the contrast AF which effectively utilizes the focus detection pixel group 106, by implementing the focus detection pixel contrast AF "3" and the imaging pixel contrast AF "2" even if the defocus amount is large. In other words, a proper in-focus state is available regardless of the level of the defocus amount through switching among the phase difference AF "1" which utilizes the output from the focus detection pixel group 106, the focus detection pixel contrast AF "3," and the imaging pixel contrast AF "2." Although this embodiment discusses switching of the AF to be implemented among the AFs "1" to "3" according to the defocus amount, the AF to be implemented may be switched according to the object. For example, as will be described later, since the phase difference AF is less viable when the object has a repetitive pattern, the contrast AFs "2" and "3" may be implemented in that case.
Since it is conceivable that an exposure condition may change due to a change in an object image after the AF operation moves the focus lens, the exposure calculation is repeated at a new focus-lens position, and the flow proceeds to the step S302.
In the step S302, the camera controller 117 determines whether or not an imaging command signal (e.g., a signal output by completely pressing the release button) is input from the operating part 116. If the imaging command signal has not yet been input, the determination of this step is repeated. When the imaging command signal is input, the flow proceeds to the step S303.
In the step S303, the camera controller 117 instructs the imaging pixel group 105 and the focus detection pixel group 106 in the image pickup device 104 to initiate charge accumulations. After the charge accumulations are completed, the camera controller 117 outputs an image signal of the imaging pixel group 105 to the spatial frequency detector 109 and the pixel interpolator 112, and outputs an image signal of the focus detection pixel group 106 to the focus detector 108 and the gain controller 111. After these outputs are made, the flow proceeds to the step S304.
In the step S304, the camera controller 117 initializes a counter (n=1). The numerical value "n" of the counter corresponds to the "n" focus detection areas in the image pickup device 104.
In the step S305, the camera controller 117 and the determination switch 110 determine whether or not the focus detection can be made in the nth focus detection area. When a defocus amount is obtained from a phase difference between the image signals obtained from the focus detection pixel group 106, the focus detection may not be properly implemented for an object having a repetitive pattern, and whether the image data interpolation should be performed according to the defocus amount cannot be determined from such an improper focus detection. Therefore, in such a case, this embodiment determines the availability of the image data interpolation based on the detection result of the high frequency component detected by the spatial frequency detector 109. This is because a high intensity of the high frequency component can be regarded as indicating a small defocus amount in the imaging optical system 101. The flow proceeds to the step S306 when the focus detection is available in the nth focus detection area, but proceeds to the step S311 when the focus detection is unavailable.
In the step S306, the camera controller 117 obtains a defocus amount in the nth focus detection area from the focus detector 108, and determines whether or not the defocus amount is smaller than a predetermined threshold (or first predetermined value). This determination also serves to decide whether the spatial frequency of the object image formed on the nth focus detection area has a value that allows a good entire image to be obtained through the image data interpolation by the pixel interpolator 112, which generates the image data corresponding to the focus detection area (or focus detection pixel group 106). Usually, an image signal (or object image) having a large defocus amount has a small amount of high frequency component (low contrast). On the other hand, an image signal (or object image) having a small defocus amount (near the in-focus state) has a large amount of high frequency component (high contrast). As described above, a drop in the sharpness of the image is not so conspicuous when the pixel interpolator 112 interpolates image data where the spatial frequency of the object image is low, but it is conspicuous where the spatial frequency of the object image is high. Accordingly, this embodiment controls the image data interpolation by the pixel interpolator 112 according to the level of the defocus amount when the focus detection has been successful: The flow proceeds to the step S307 without the image data interpolation for a defocus amount smaller than the threshold, and proceeds to the step S309 with the image data interpolation for a defocus amount equal to or larger than the threshold.
In the step S307, the camera controller 117 compares the average image signal in the nth focus detection area (referred to as the "nth focus detection pixel group 106" hereinafter) with the average image signal in the adjacent pixel group. Then, the gain controller 111 controls the gain applied to the image signal of the nth focus detection pixel group 106 such that these signal levels are identical or can be deemed identical. Alternatively, peak values of the image signals may be compared rather than comparing the average image signals of the pixel groups with each other. The flow proceeds to the step S308 after the gain is thus controlled.
In the step S308, the camera controller 117 inserts the image signal of the nth focus detection pixel group 106, whose gain was controlled in the step S307, into the area (or position) which corresponds to the nth focus detection pixel group 106 in the image signal of the imaging pixel group 105. This step generates synthesized image data that combines the image based on the image signal from the imaging pixel group 105 with the partial image based on the (gain-controlled) image signal from the nth focus detection pixel group 106. The camera controller 117 outputs the synthesized image data to the image processor 113. Then, the flow proceeds to the step S313.
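Steps S307 and S308 amount to level matching followed by splicing. A minimal sketch, assuming NumPy arrays and row-based indexing (neither of which is specified in the text):

    import numpy as np

    def splice_focus_pixels(image, focus_signal, focus_rows, adjacent_rows):
        # S307: scale the focus detection pixel signal so that its average
        # level matches that of the adjacent imaging pixels.
        focus_mean = focus_signal.mean()
        gain = image[adjacent_rows].mean() / focus_mean if focus_mean else 1.0
        # S308: insert the gain-controlled signal at the rows the focus
        # detection pixel group occupies in the full image.
        out = image.copy()
        out[focus_rows] = focus_signal * gain
        return out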
In the step S309, the camera controller 117 directs the pixel interpolator 112 to generate the partial image data for the interpolation corresponding to the nth focus detection pixel group 106 through an interpolation calculation using the image signal of the adjacent pixel group of the nth focus detection pixel group 106. In other words, the pixel interpolator 112 generates the partial image corresponding to the nth focus detection pixel group 106, among the entire image obtained from the output of the image pickup device 104, based on the output of the imaging pixel group 105 (the adjacent pixel group). This embodiment needs to interpolate a pixel signal of a green component particularly for the focus detection pixels S1 and S2 due to the periodic color-filter arrangement of the imaging pixel group 105. Accordingly, pixel signals corresponding to the positions of the focus detection pixels S1 and S2 are generated based on signals having green components which are obliquely adjacent to the focus detection pixels S1 and S2 in the adjacent pixel group in FIG. 2. The surrounding pixels used for the interpolation are not limited to those having green components which are obliquely adjacent to the focus detection pixels S1 and S2 described above. In other words, an edge may be detected from a positional change of the signal level, and an interpolation calculation that takes the edge position of the object image into account may be made by using more distant pixels having green components.
The flow proceeds to the step S310 after the partial image data for the interpolation is generated.
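A minimal sketch of the interpolation of the step S309, assuming a simple average of the four obliquely adjacent green pixels; the edge-aware variant mentioned above and border handling are omitted:

    import numpy as np

    def interpolate_focus_pixel(raw_image, row, col):
        # Replace the focus detection pixel at (row, col) with the average of
        # its four obliquely adjacent pixels, which carry green components in
        # the arrangement of FIG. 2. Assumes (row, col) is not on the border.
        neighbours = [raw_image[row - 1, col - 1], raw_image[row - 1, col + 1],
                      raw_image[row + 1, col - 1], raw_image[row + 1, col + 1]]
        return float(np.mean(neighbours))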
In the step S310, the camera controller 117 inserts the image signal of the partial image data for the interpolation for the nth focus detection pixel group 106, generated in the step S309, into the area (or position) which corresponds to the nth focus detection pixel group 106 in the image signal of the imaging pixel group 105, thereby creating synthesized image data of the image based on the image signal from the imaging pixel group 105 and the partial image for the interpolation for the nth focus detection pixel group 106. The camera controller 117 outputs the synthesized data to the image processor 113. Then, the flow proceeds to the step S313. In the step S311, the camera controller 117 directs the spatial frequency detector 109 to detect a high frequency component from the image signal of the adjacent pixel group for the nth focus detection pixel group 106.
In the step S312, the camera controller 117 determines whether or not the intensity of the high frequency component detected in the step S311 is higher than a predetermined threshold (or a second predetermined value).
If the intensity of the high frequency component is higher than the threshold as described above (that is, when the contrast is high), the defocus amount of the imaging optical system 101 is considered small, and the flow proceeds as if the step S306 had determined that the defocus amount is smaller than the threshold. On the other hand, if the intensity of the high frequency component is lower than the threshold (that is, when the contrast is low), the defocus amount of the imaging optical system 101 is considered large, and the flow proceeds as if the step S306 had determined that the defocus amount is larger than the threshold.
In other words, the flow proceeds to the step S307 without the image data interpolation when the detected intensity is higher than the threshold, and proceeds to the step S309 for the image data interpolation if the detected intensity is lower than the threshold.
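The per-area decision of the steps S305 to S312 can be summarized as follows; the camera interface is hypothetical, and the function simply returns whether the image data interpolation (steps S309 and S310) is to be performed for the nth focus detection area:

    def needs_interpolation(camera, n, defocus_threshold, hf_threshold):
        if camera.focus_detection_available(n):                    # S305
            # S306: interpolate only when the defocus amount is large
            # (low contrast, so the interpolation is inconspicuous).
            return abs(camera.defocus_amount(n)) >= defocus_threshold
        # S311/S312: fall back to the high frequency component when the
        # focus detection itself is unreliable (e.g. repetitive patterns).
        return camera.high_frequency_intensity(n) <= hf_threshold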
In the step S313, the camera controller 117 determines whether or not the processes in the steps S305 to S312 have been completed for all of the "n" focus detection areas. When the processes have not been completed for all of the focus detection areas, the flow proceeds to the step S314, increments the counter value by one, and returns to the step S305, so that the above processes are performed for the next focus detection area. On the other hand, the flow proceeds to the step S315 if the processes have been completed for all of the focus detection areas.
In the step S315, the camera controller 117 directs the image processor 113 to implement a gamma correction, a white balance adjustment, resampling for display, and an image compression and encoding to the synthesized image data. The image processor 113 outputs the image data in which the above processes have been implemented to the display 114. The display 114 displays the image data so that a user can check the taken image.
The image processor 113 further outputs the image data in which the above processes are implemented to the recorder 115. The recorder 115 records the image data in the recording medium.
The above operation can provide a good image having a high degree of sharpness even if the image pickup device 104 has many focus detection pixels.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions. This application claims a foreign priority benefit based on Japanese Patent Application No. 2007-238948, filed on September 14, 2007, which is hereby incorporated by reference herein in its entirety as if fully set forth herein.
FIELD OF INDUSTRIAL APPLICATION
The present invention provides an imaging apparatus that can implement both focus control of a contrast detection method that utilizes a focus detection pixel and focus control of a contrast detection method that utilizes a pixel other than the focus detection pixel, in addition to focus control of a phase difference detection method that utilizes a focus detection pixel.

Claims

CLAIMS :
1. An imaging apparatus comprising: an image pickup device that includes a first pixel group configured to photoelectrically convert an object image formed by a luminous flux from an imaging optical system, and a second pixel group which includes a plurality of pixels configured to photoelectrically convert a split pair of the luminous flux from the imaging optical system; and a detecting unit configured to implement a first detection control that changes an imaging state of the image pickup device while detecting a contrast of the object image based on an output of the second pixel group, and then a second detection control that changes the imaging state of the image pickup device while detecting the contrast of the object image based on an output of the first pixel group.
2. An imaging apparatus comprising: an image pickup device that includes a first pixel group configured to photoelectrically convert an object image formed by a luminous flux from an imaging optical system, and a second pixel group which includes a plurality of pixels configured to photoelectrically convert a split pair of the luminous flux from the imaging optical system; and a controller configured to selectively implement one of a first focus control of a phase difference detection method that utilizes an output of the second pixel group, a second focus control of a contrast detection method that utilizes an output of the first pixel group, and a third focus control of a contrast detection method that utilizes an output of the second pixel group.
3. An imaging apparatus according to claim 2, wherein the controller selects the first focus control or a combination of the second and third focus controls based on a defocus amount of the imaging optical system.
4. An imaging apparatus according to claim 2, wherein the controller implements the first focus control when the defocus amount is smaller than a threshold, and implements the second and third focus controls when the defocus amount is larger than the threshold.
5. An imaging apparatus according to any one of claims 2 and 3, wherein the controller implements the third focus control prior to the second focus control in implementing the second and third focus controls.
6. A control method for an imaging apparatus which provides an image pickup device that includes a first pixel group configured to photoelectrically convert an object image formed by a luminous flux from an imaging optical system, and a second pixel group which includes a plurality of focus detection pixels configured to photoelectrically convert a split one of the luminous fluxes from the imaging optical system, the control method comprising steps of: implementing a first focus control of a phase difference detection method which utilizes an output of the second pixel group; implementing a second focus control of a contrast detection method which utilizes an output of the first pixel group; implementing a third focus control of a contrast detection method which utilizes an output of the second pixel group; and changing a focus control to be implemented among the first to third focus controls.
PCT/JP2008/066915 2007-09-14 2008-09-12 Imaging apparatus WO2009035147A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/670,178 US8212917B2 (en) 2007-09-14 2008-09-12 Imaging apparatus
EP08830582.6A EP2191318B1 (en) 2007-09-14 2008-09-12 Imaging apparatus
CN2008801071666A CN101802673B (en) 2007-09-14 2008-09-12 Imaging apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007-238948 2007-09-14
JP2007238948A JP5264131B2 (en) 2007-09-14 2007-09-14 Imaging device

Publications (1)

Publication Number Publication Date
WO2009035147A1 (en)

Family

ID=40452145

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2008/066915 WO2009035147A1 (en) 2007-09-14 2008-09-12 Imaging apparatus

Country Status (5)

Country Link
US (1) US8212917B2 (en)
EP (1) EP2191318B1 (en)
JP (1) JP5264131B2 (en)
CN (1) CN101802673B (en)
WO (1) WO2009035147A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102472881A (en) * 2009-07-07 2012-05-23 佳能株式会社 Focus detection apparatus
US20120229696A1 (en) * 2010-01-15 2012-09-13 Canon Kabushiki Kaisha Image Capturing Apparatus

Families Citing this family (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5029274B2 (en) * 2007-10-10 2012-09-19 株式会社ニコン Imaging device
US8279318B2 (en) * 2007-12-14 2012-10-02 Canon Kabushiki Kaisha Image pickup apparatus and display control method for the same
JP5219865B2 (en) * 2008-02-13 2013-06-26 キヤノン株式会社 Imaging apparatus and focus control method
JP2010091943A (en) * 2008-10-10 2010-04-22 Canon Inc Imaging apparatus
JP5322783B2 (en) * 2009-06-05 2013-10-23 キヤノン株式会社 IMAGING DEVICE AND CONTROL METHOD OF IMAGING DEVICE
JP5653035B2 (en) * 2009-12-22 2015-01-14 キヤノン株式会社 Imaging apparatus, focus detection method, and control method
JP5454223B2 (en) * 2010-02-25 2014-03-26 株式会社ニコン camera
JP5126261B2 (en) * 2010-03-18 2013-01-23 株式会社ニコン camera
JP5322995B2 (en) 2010-05-10 2013-10-23 キヤノン株式会社 Imaging apparatus and control method thereof
JP2012003087A (en) * 2010-06-17 2012-01-05 Olympus Corp Imaging apparatus
JP2012150289A (en) * 2011-01-19 2012-08-09 Olympus Corp Image processing device, imaging apparatus, and image processing method
JP5539585B2 (en) * 2011-03-24 2014-07-02 富士フイルム株式会社 Color imaging device, imaging device, and imaging program
TW201245768A (en) * 2011-03-29 2012-11-16 Sony Corp Image pickup apparatus, image pickup device, image processing method, aperture control method, and program
JP5396566B2 (en) 2011-03-30 2014-01-22 富士フイルム株式会社 Imaging apparatus and autofocus control method thereof
WO2012137650A1 (en) * 2011-04-01 2012-10-11 富士フイルム株式会社 Imaging device and program
JP5967950B2 (en) * 2011-04-20 2016-08-10 キヤノン株式会社 Imaging device and imaging apparatus
JP2012226246A (en) * 2011-04-22 2012-11-15 Nikon Corp Focus detector and imaging apparatus
KR101777351B1 (en) * 2011-05-16 2017-09-11 삼성전자주식회사 Image pickup device, digital photographing apparatus using the device, auto-focusing method, and computer-readable storage medium for performing the method
JP5966283B2 (en) * 2011-09-02 2016-08-10 株式会社ニコン Camera body and camera
KR20130038035A (en) * 2011-10-07 2013-04-17 삼성전자주식회사 Image sensor
JP5973708B2 (en) * 2011-10-21 2016-08-23 オリンパス株式会社 Imaging apparatus and endoscope apparatus
JP5888940B2 (en) * 2011-11-11 2016-03-22 オリンパス株式会社 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
JP2014026062A (en) * 2012-07-26 2014-02-06 Sony Corp Imaging apparatus and imaging method
JP6202927B2 (en) * 2012-08-31 2017-09-27 キヤノン株式会社 Distance detection device, imaging device, program, recording medium, and distance detection method
WO2014041733A1 (en) * 2012-09-11 2014-03-20 ソニー株式会社 Imaging device and focus control method
JP2014106476A (en) * 2012-11-29 2014-06-09 Canon Inc Focus detection device, imaging device, imaging system, focus detection method, program and storage medium
JP6239862B2 (en) * 2013-05-20 2017-11-29 キヤノン株式会社 Focus adjustment apparatus, focus adjustment method and program, and imaging apparatus
TW201514599A (en) * 2013-10-07 2015-04-16 Novatek Microelectronics Corp Image sensor and image capturing system
JP6270400B2 (en) * 2013-10-09 2018-01-31 オリンパス株式会社 Image processing apparatus, image processing method, and image processing program
JP6223160B2 (en) * 2013-12-10 2017-11-01 キヤノン株式会社 Imaging device, control method thereof, and control program
JP6525498B2 (en) * 2013-12-12 2019-06-05 キヤノン株式会社 Image pickup apparatus, control method thereof and control program
JP6476547B2 (en) * 2014-01-28 2019-03-06 株式会社ニコン Focus detection device and imaging device
WO2015146214A1 (en) * 2014-03-25 2015-10-01 富士フイルム株式会社 Imaging device and focusing control method
JP6405243B2 (en) * 2014-03-26 2018-10-17 キヤノン株式会社 Focus detection apparatus and control method thereof
JP2016009043A (en) * 2014-06-24 2016-01-18 ソニー株式会社 Image sensor, arithmetic method, and electronic device
JP6486041B2 (en) * 2014-09-11 2019-03-20 キヤノン株式会社 Imaging apparatus and control method thereof
EP3245547A4 (en) 2015-01-14 2018-12-26 Invisage Technologies, Inc. Phase-detect autofocus
JPWO2016157569A1 (en) * 2015-03-27 2017-11-16 オリンパス株式会社 Imaging apparatus and focus evaluation apparatus
US20160295122A1 (en) 2015-04-03 2016-10-06 Canon Kabushiki Kaisha Display control apparatus, display control method, and image capturing apparatus
JP6808333B2 (en) * 2015-04-03 2021-01-06 キヤノン株式会社 Display control device and method, and imaging device
EP3098638B1 (en) * 2015-05-29 2022-05-11 Phase One A/S Adaptive autofocusing system
JP6702669B2 (en) * 2015-07-29 2020-06-03 キヤノン株式会社 Imaging device and control method thereof
JP6700973B2 (en) * 2016-05-24 2020-05-27 キヤノン株式会社 Imaging device and control method thereof
CN106210527B (en) * 2016-07-29 2017-08-25 广东欧珀移动通信有限公司 The PDAF calibration methods and device moved based on MEMS
CN106161958A (en) * 2016-08-24 2016-11-23 广西小草信息产业有限责任公司 A kind of electronically controlled image capture method and device thereof
JP6237858B2 (en) * 2016-10-25 2017-11-29 株式会社ニコン Focus detection apparatus and imaging apparatus
CN109297940A (en) * 2018-09-06 2019-02-01 中国科学院沈阳自动化研究所 One kind laser defocusing amount self-checking device and its adjusting method under micro-meter scale
KR20220036630A (en) * 2020-09-16 2022-03-23 삼성전자주식회사 Image processing device and image processing method for color correction, and image processing system including the same
CN112383715B (en) * 2020-12-07 2022-05-17 Oppo(重庆)智能科技有限公司 Image acquisition device, terminal and image acquisition method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0943507A (en) * 1995-08-02 1997-02-14 Canon Inc Electric still camera and focusing control method thereof
JP2000292686A (en) * 1999-04-06 2000-10-20 Olympus Optical Co Ltd Image pickup device
JP2003244712A (en) * 2002-02-19 2003-08-29 Canon Inc Imaging apparatus and system
JP2004046132A (en) * 2002-05-17 2004-02-12 Olympus Corp Automatic focusing system
JP2005106994A (en) * 2003-09-29 2005-04-21 Canon Inc Focal point detecting device, imaging device, and method for controlling them
JP2006154065A (en) * 2004-11-26 2006-06-15 Canon Inc Imaging apparatus and its control method
JP2007158692A (en) * 2005-12-05 2007-06-21 Nikon Corp Solid state imaging device and electronic camera using the same

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0743605A (en) * 1993-08-02 1995-02-14 Minolta Co Ltd Automatic focusing device
JP3592147B2 (en) 1998-08-20 2004-11-24 キヤノン株式会社 Solid-state imaging device
JP2000155256A (en) * 1998-11-19 2000-06-06 Olympus Optical Co Ltd Interchangeable photographing lens device, camera body and camera system
US6819360B1 (en) 1999-04-01 2004-11-16 Olympus Corporation Image pickup element and apparatus for focusing
JP4908668B2 (en) * 2000-04-19 2012-04-04 キヤノン株式会社 Focus detection device
JP2003295047A (en) * 2002-04-05 2003-10-15 Canon Inc Image pickup device and image pickup system
US6768867B2 (en) 2002-05-17 2004-07-27 Olympus Corporation Auto focusing system
JP2004240054A (en) * 2003-02-04 2004-08-26 Olympus Corp Camera
JP4324402B2 (en) * 2003-04-08 2009-09-02 Hoya株式会社 Camera autofocus device
EP1684503B1 (en) * 2005-01-25 2016-01-13 Canon Kabushiki Kaisha Camera and autofocus control method therefor
JP2007150643A (en) * 2005-11-28 2007-06-14 Sony Corp Solid state imaging element, driving method therefor, and imaging apparatus
US7728903B2 (en) * 2005-11-30 2010-06-01 Nikon Corporation Focus adjustment device, focus adjustment method and camera
JP5168797B2 (en) * 2006-03-01 2013-03-27 株式会社ニコン Imaging device
JP2007248782A (en) * 2006-03-15 2007-09-27 Olympus Imaging Corp Focusing device and camera
JP4349407B2 (en) * 2006-11-17 2009-10-21 ソニー株式会社 Imaging device
JP4321579B2 (en) * 2006-11-28 2009-08-26 ソニー株式会社 Imaging device
JP5288752B2 (en) * 2007-09-07 2013-09-11 キヤノン株式会社 Imaging device
JP2009069255A (en) * 2007-09-11 2009-04-02 Sony Corp Imaging device and focusing control method
JP5194688B2 (en) * 2007-10-01 2013-05-08 株式会社ニコン Solid-state imaging device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0943507A (en) * 1995-08-02 1997-02-14 Canon Inc Electric still camera and focusing control method thereof
JP2000292686A (en) * 1999-04-06 2000-10-20 Olympus Optical Co Ltd Image pickup device
JP2003244712A (en) * 2002-02-19 2003-08-29 Canon Inc Imaging apparatus and system
JP2004046132A (en) * 2002-05-17 2004-02-12 Olympus Corp Automatic focusing system
JP2005106994A (en) * 2003-09-29 2005-04-21 Canon Inc Focal point detecting device, imaging device, and method for controlling them
JP2006154065A (en) * 2004-11-26 2006-06-15 Canon Inc Imaging apparatus and its control method
JP2007158692A (en) * 2005-12-05 2007-06-21 Nikon Corp Solid state imaging device and electronic camera using the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2191318A4 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102472881A (en) * 2009-07-07 2012-05-23 佳能株式会社 Focus detection apparatus
US8730374B2 (en) 2009-07-07 2014-05-20 Canon Kabushiki Kaisha Focus detection apparatus
US20120229696A1 (en) * 2010-01-15 2012-09-13 Canon Kabushiki Kaisha Image Capturing Apparatus
CN102741724A (en) * 2010-01-15 2012-10-17 佳能株式会社 Image capturing apparatus
US8599304B2 (en) * 2010-01-15 2013-12-03 Canon Kabushiki Kaisha Image capturing apparatus having focus detection pixels producing a focus signal with reduced noise

Also Published As

Publication number Publication date
EP2191318B1 (en) 2018-11-14
CN101802673B (en) 2012-04-18
JP2009069577A (en) 2009-04-02
US20100194967A1 (en) 2010-08-05
CN101802673A (en) 2010-08-11
US8212917B2 (en) 2012-07-03
EP2191318A4 (en) 2012-11-21
JP5264131B2 (en) 2013-08-14
EP2191318A1 (en) 2010-06-02

Similar Documents

Publication Publication Date Title
EP2191318B1 (en) Imaging apparatus
EP2179581B1 (en) Image-pickup apparatus and control method thereof
US8018524B2 (en) Image-pickup method and apparatus having contrast and phase difference focusing methods wherein a contrast evaluation area is changed based on phase difference detection areas
US9319659B2 (en) Image capturing device and image capturing method
JP5424679B2 (en) Imaging apparatus and signal processing apparatus
JP4823167B2 (en) Imaging device
US8466992B2 (en) Image processing apparatus, image processing method, and program
US8854533B2 (en) Image capture apparatus and control method therefor
US9578231B2 (en) Image capture apparatus and method for controlling the same
JP4823168B2 (en) Imaging device
WO2011013725A1 (en) Image pickup apparatus that performs automatic focus control and control method for the image pickup apparatus
US8902294B2 (en) Image capturing device and image capturing method
JP4823169B2 (en) Imaging device
JP6486041B2 (en) Imaging apparatus and control method thereof
JP5207893B2 (en) Imaging apparatus and control method thereof
JP2016071275A (en) Image-capturing device and focus control program
JP2014206601A (en) Imaging apparatus and focus adjustment method

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200880107166.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08830582

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 12670178

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2008830582

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE