WO2012073727A1 - Imaging device and focal position detection method - Google Patents

Imaging device and focal position detection method

Info

Publication number
WO2012073727A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase difference
curve
difference detection
correlation calculation
divided
Prior art date
Application number
PCT/JP2011/076721
Other languages
French (fr)
Japanese (ja)
Inventor
Takashi Aoki (貴嗣 青木)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation
Publication of WO2012073727A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/02 Viewfinders
    • G03B 13/10 Viewfinders adjusting viewfinders field
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/18 Focusing aids
    • G03B 13/20 Rangefinders coupled with focusing arrangements, e.g. adjustment of rangefinder automatically focusing camera
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith
    • H04N 25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N 25/704 Pixels specially adapted for focusing, e.g. phase difference pixel sets

Definitions

  • The present invention relates to an imaging apparatus that detects the distance to a subject and controls the focal position of a photographic lens, and to an in-focus position detection method for such an apparatus.
  • Focus position detection methods for detecting the distance to the main subject include the contrast method and the phase difference AF method.
  • The phase difference AF method is often used in single-lens reflex cameras because it can detect the in-focus position faster and more accurately than the contrast method.
  • In the phase difference AF method employed in conventional single-lens reflex cameras, as described for example in Patent Document 1 below, a pair of phase difference detection line sensors is arranged on the left and right, separately from the solid-state image sensor that captures the subject image, and the distance to the main subject is detected based on the phase difference between the detection information of the first line sensor and that of the second line sensor.
  • The phase difference AF method of Patent Document 1, however, requires line sensors for phase difference detection in addition to the solid-state imaging device, which increases parts and manufacturing costs and enlarges the apparatus.
  • In Patent Document 2, a solid-state imaging device in which phase difference detection pixels are provided on the light receiving surface has been proposed.
  • By using such a solid-state image sensor, with phase difference detection pixels formed on the sensor that captures the subject image, an external phase difference detection sensor becomes unnecessary and cost can be reduced.
  • However, the technique described in Patent Document 2 is intended for single-lens reflex cameras and presupposes a large-area solid-state imaging device.
  • In a phase difference detection pixel as described in Patent Document 2, the light shielding film opening of each pixel of an adjacent pair is made small, and the opening positions of the two pixels are shifted from each other in the phase difference detection direction (normally the horizontal direction) so that the phase difference can be detected.
  • A large-format (large-area) solid-state imaging device, in which the light receiving area of each pixel can be made large, can obtain phase difference information at high speed and with high accuracy even if the light shielding film aperture is somewhat reduced.
  • In a small-area device, by contrast, the light receiving area of each pixel cannot be increased and the light shielding film aperture is small to begin with; if phase difference information is acquired at high speed, the accuracy of that information, that is, the focus position detection accuracy, is lowered.
  • An object of the present invention is to provide an imaging apparatus that can acquire phase difference information and obtain the in-focus position at high speed and with high accuracy even when a small-area solid-state image sensor is used, and an in-focus position detection method thereof.
  • The method of the invention concerns an image sensor in which pupil-divided pairs of a first phase difference detection pixel and a second phase difference detection pixel are arranged two-dimensionally in a phase difference detection area provided on a light receiving surface that captures an image of a subject;
  • a focus lens that is placed in front of the image sensor in the optical path and forms a focused optical image of the subject on the light receiving surface;
  • and a control unit that obtains a phase difference from the correlation curves and drives the focus lens to the in-focus position based on that phase difference. The in-focus position detection method of such an imaging apparatus comprises: dividing the phase difference detection area into a plurality of divided areas in the direction perpendicular to the phase difference detection direction; for each divided area, adding the output signals of the first phase difference detection pixels lined up in the perpendicular direction, position by position along the arrangement direction, to obtain a first addition signal, and likewise adding the output signals of the second phase difference detection pixels to obtain a second addition signal; calculating the correlation between the first addition signal and the second addition signal to obtain a correlation calculation curve for each divided area; obtaining a comprehensive evaluation curve by performing a predetermined calculation process on an arbitrary plurality of the correlation calculation curves calculated for the divided areas; and obtaining from the comprehensive evaluation curve a defocus amount for driving the focus lens to the in-focus position.
  • According to the present invention, an evaluation curve (correlation calculation curve) is obtained for each divided area, and a predetermined calculation process is performed on the plurality of evaluation curves to obtain an overall evaluation curve, from which the defocus amount is calculated. Because pixels are added only within each of the divided areas, the number of added pixels does not become excessive and the pattern of the subject is not averaged out, so phase difference AF can be performed at high speed and with high accuracy even when a small solid-state image sensor is mounted.
  • FIG. 1 is a functional block diagram of a digital camera according to an embodiment of the present invention.
  • The digital camera 10 of the present embodiment has functions for capturing still images or moving images of a subject and digitally processing the captured image signal within the camera 10. It includes: a photographic lens 20 comprising a telephoto lens and a focus lens; a solid-state imaging device 21 that captures the subject image formed through the photographic lens 20; an analog signal processing unit 22 that performs automatic gain adjustment (AGC), correlated double sampling, and the like on the analog image data output from the imaging device 21; an analog/digital conversion unit (A/D) 23 that converts the analog image data output from the analog signal processing unit 22 into digital image data; a driving unit 24 that drives and controls the A/D 23, the analog signal processing unit 22, the solid-state imaging device 21, and the photographic lens 20 in response to instructions from a system control unit (CPU) 29 described later; and a flash 25 that emits light in response to an instruction from the CPU 29.
  • The digital camera 10 further includes: a digital signal processing unit 26 that takes in the digital image data output from the A/D 23 and performs interpolation processing, white balance correction, RGB/YC conversion processing, and the like; a compression/decompression processing unit 27 that compresses the image data into JPEG format or decompresses it; a display unit 28 that displays menus, through images, and captured images; the system control unit (CPU) 29 that performs overall control of the entire digital camera; an internal memory 30 such as a frame memory; a media interface (I/F) unit 31 that performs interface processing with a recording medium 32 storing JPEG image data and the like; and an operation unit 33 for inputting instructions from the user to the system control unit 29. These units are connected to one another.
  • The system control unit 29 analyzes the captured image data (through image), output by the solid-state imaging device 21 in the moving image state and processed by the digital signal processing unit 26, using the digital signal processing unit 26 and the like under its control as described later, obtains evaluation curves (correlation calculation curves), and detects the distance to the main subject. The system control unit 29 then controls, via the driving unit 24, the position of the focus lens placed in front of the solid-state imaging device 21 in the optical path within the photographic lens 20, so that an optical image focused on the subject is formed on the light receiving surface of the solid-state imaging device 21.
  • The solid-state image pickup device 21 is of the CMOS type in this embodiment. Its output signal is processed by an analog signal processing unit (AFE: analog front end) 22; the AFE portion (a circuit for correlated double sampling and clamp processing, a signal amplifier circuit for gain control, and the like) is generally provided as a peripheral circuit on the solid-state imaging device chip.
  • A horizontal scanning circuit, a vertical scanning circuit, a noise reduction circuit, a synchronization signal generation circuit, and the like are also formed as peripheral circuits around the light receiving unit on the chip of the solid-state imaging device 21, and the A/D conversion unit 23 of FIG. 1 may be formed there as well.
  • The solid-state imaging device 21 may instead be of the CCD type; the embodiments described below can be applied to it as they are.
  • FIG. 2 is an explanatory diagram of the light receiving surface of the solid-state image sensor 21.
  • a large number of pixels (light receiving elements: photodiodes) (not shown) are arranged in a two-dimensional array.
  • a plurality of pixels are arranged in a square lattice pattern.
  • the pixel array is not limited to a square lattice array, and may be a so-called honeycomb array in which even-numbered pixel rows are shifted by 1/2 pixel pitch with respect to odd-numbered pixel rows.
  • A rectangular phase difference detection area 40 is provided at an arbitrary partial position on the light receiving surface, at the center in the illustrated example. In this example, only one phase difference detection area 40 is provided on the light receiving surface, but phase difference detection areas may be provided at a plurality of locations so that AF can be performed anywhere on the imaging screen, or the entire light receiving surface may be used as a phase difference detection area.
  • The phase difference detection area 40 is divided into four in the direction (the vertical direction y) perpendicular to the phase difference detection direction (in this example the horizontal direction x). A correlation calculation described later is performed for each of the divided areas I, II, III, and IV. The number of divisions is not limited to four; it may be six, seven, or any other number.
  • FIG. 3 is a schematic enlarged surface view of a portion indicated by a dotted rectangular frame 41 in FIG. 2 in the phase difference detection area 40.
  • a large number of pixels are arranged in a square lattice on the light receiving surface of the solid-state imaging device 21, and the same applies to the phase difference detection area 40.
  • each pixel is indicated by R (red), G (green), and B (blue).
  • R, G, and B represent the colors of the color filters stacked on each pixel. The color filter array is a Bayer array in this example, but is not limited to it; other color filter arrays, such as stripe arrays, may be used.
  • Pixels of the same color adjacent to each other are designated as the phase difference detection pixels 1x and 1y.
  • paired pixels for phase difference detection are provided at discrete and periodic positions in the area 40, in the illustrated example, at checkered positions.
  • The same-color pixels are diagonally adjacent here because the color filter array is a Bayer array; in the case of a horizontal stripe array, the same-color pixels would be adjacent in the horizontal direction.
  • Alternatively, the two pixels constituting a pair may be provided separately in the nearest same-color filter rows in the vertical direction; the same applies to a vertical stripe arrangement.
  • The phase difference detection pixels 1x and 1y are provided among the pixels carrying G filters, the most numerous of R, G, and B, and are arranged every eight pixels in the horizontal direction (x direction) and every eight pixels in the vertical direction (y direction), forming a checkered pattern overall. Accordingly, when viewed along the phase difference detection direction (left-right direction), the phase difference detection pixels 1x appear every four pixels.
  • FIG. 4 is a diagram schematically showing only the phase difference detection pixels 1x and 1y of FIG.
  • The phase difference detection pixels 1x and 1y constituting the pair are formed such that their light shielding film openings 2x and 2y are smaller than those of the other pixels (the pixels other than the phase difference detection pixels); the light shielding film opening 2x of the pixel 1x is decentered to the left, and the light shielding film opening 2y of the pixel 1y is decentered to the right (along the phase difference detection direction).
  • The curve X shown in the lower part of FIG. 4 plots the detection signal amounts of the phase difference detection pixels 1x arranged in a horizontal row, and the curve Y plots the detection signal amounts of the phase difference detection pixels 1y paired with those pixels 1x.
  • Since the paired pixels 1x and 1y are adjacent and very close together, they can be regarded as receiving light from the same subject. The curves X and Y are therefore considered to have the same shape, and their shift in the left-right direction (the phase difference detection direction) is the phase difference between the image seen by one pixel 1x of the pupil-divided pair and the image seen by the other pixel 1y.
  • the phase difference amount (lateral shift amount) can be obtained, and the distance to the subject can be calculated from the phase difference amount.
  • a known method for example, a method described in Patent Document 1 or a method described in Patent Document 2 may be adopted as a method for obtaining the evaluation value of the correlation amount between the curve X and the curve Y.
  • For example, the integrated value of the absolute value of the difference between each point X(i) constituting the curve X and each point Y(i+j) constituting the curve Y is used as the evaluation value, and the shift amount j that gives the maximum evaluation (here, the minimum integrated value) is taken as the phase difference amount.
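  • This evaluation is a sum of absolute differences (SAD). The following Python sketch (the function name, signal values, and search range are illustrative, not taken from the patent) shows how the evaluation value can be computed for each shift j and the phase difference read off as the shift giving the minimum:

```python
import numpy as np

def sad_curve(x, y, max_shift):
    # Evaluation value for each shift j: sum over i of |X(i) - Y(i + j)|,
    # restricted to the indices where both curves are defined.
    # The j with the smallest sum (the "maximum evaluation" in the
    # patent's terminology) is taken as the phase difference amount.
    n = len(x)
    evals = {}
    for j in range(-max_shift, max_shift + 1):
        lo, hi = max(0, -j), min(n, n - j)
        evals[j] = float(np.abs(x[lo:hi] - y[lo + j:hi + j]).sum())
    return evals

# Hypothetical example: curve Y is curve X shifted right by 3 samples,
# as happens when the image is defocused.
x = np.asarray([0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0, 0], dtype=float)
y = np.roll(x, 3)
curve = sad_curve(x, y, 5)
phase = min(curve, key=curve.get)  # shift giving the maximum evaluation
```

  • The dictionary `curve` plays the role of one evaluation curve; its minimum sits at j = 3, the imposed shift.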
  • If the detection signals of the pixels 1x at the same horizontal position are added over a plurality of pixels in the vertical direction, and likewise the detection signals of the pixels 1y at the same horizontal position are added in the vertical direction, the influence of noise can be reduced and the detection accuracy (AF accuracy) of the in-focus position improved.
  • However, if too many pixels are added, the arrangement area of the added phase difference detection pixels extends far in the vertical direction within the phase difference detection area 40.
  • The subject pattern usually differs between the image formed in the upper part of the phase difference detection area 40, that in the middle part, and that in the lower part. If all of these pixels were added, the pattern after pixel addition would be averaged in the phase difference detection direction (left-right direction), lowering the evaluation value from which the phase difference is obtained.
  • For this reason, in this embodiment the phase difference detection area 40 is divided into four, the pixel addition range is limited to each divided area, and pixels are not added across divided areas.
  • Pixels are added within each divided area I, II, III, IV to obtain a divided area evaluation curve (correlation calculation curve), and the divided area evaluation curves are summed to obtain the evaluation curve of the entire phase difference detection area 40 (the comprehensive evaluation curve).
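  • Under the same SAD assumption as above, the per-area pixel addition and the summing of the divided area evaluation curves can be sketched as follows (a hypothetical illustration; the array shapes and names, and the four vertical strips, merely mirror the example in the text):

```python
import numpy as np

def area_curve(x_rows, y_rows, shifts):
    # x_rows / y_rows: 2D arrays holding the 1x / 1y detection signals of
    # one divided area (rows = vertical positions, columns = horizontal).
    # Summing the rows yields the first and second addition signals.
    x, y = x_rows.sum(axis=0), y_rows.sum(axis=0)
    n = len(x)
    curve = {}
    for j in shifts:
        lo, hi = max(0, -j), min(n, n - j)
        curve[j] = float(np.abs(x[lo:hi] - y[lo + j:hi + j]).sum())
    return curve

def comprehensive_curve(x_pixels, y_pixels, n_areas, shifts):
    # Pixels are added only within their own divided area (vertical strips);
    # the per-area evaluation curves are then summed into the total curve V.
    xs = np.array_split(x_pixels, n_areas, axis=0)
    ys = np.array_split(y_pixels, n_areas, axis=0)
    curves = [area_curve(xa, ya, shifts) for xa, ya in zip(xs, ys)]
    total = {j: sum(c[j] for c in curves) for j in shifts}
    return curves, total

# Hypothetical signals: eight rows of 1x pixels, and 1y pixels shifted by 2.
base = np.asarray([0, 2, 5, 9, 5, 2, 0, 0, 0, 0], dtype=float)
x_pixels = np.tile(base, (8, 1))
y_pixels = np.tile(np.roll(base, 2), (8, 1))
curves, total = comprehensive_curve(x_pixels, y_pixels, 4, range(-3, 4))
```

  • Each entry of `curves` corresponds to one divided area evaluation curve, and `total` to the comprehensive curve V; all have their minimum at the imposed shift of 2.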
  • FIG. 5 is a graph showing the respective evaluation curves I, II, III, IV for each divided area, and a total evaluation curve (evaluation curve for all areas) V obtained by summing up these four divided area evaluation curves.
  • The divided area evaluation curve I is the evaluation curve obtained by the correlation calculation, within the divided area I, between the curve X of FIG. 4 obtained by adding the detection signals of the phase difference detection pixels 1x in the vertical direction (for example, reference numeral 45 in FIG. 3) and the curve Y of FIG. 4 obtained by adding the detection signals of the phase difference detection pixels 1y in the vertical direction (for example, reference numeral 46 in FIG. 3).
  • In this correlation calculation, the maximum evaluation appears as the minimum value of the curve.
  • Similarly, the divided area evaluation curve II is the evaluation curve obtained in the divided area II, the divided area evaluation curve III is that obtained in the divided area III, and the divided area evaluation curve IV is that obtained in the divided area IV.
  • The number of added pixels used to obtain each of the four divided area evaluation curves I, II, III, and IV is roughly the number of phase difference detection pixels 1x arranged in the vertical direction of the phase difference detection area 40 divided by the number of divided areas, so there is little possibility that the subject pattern is averaged, and the evaluation value can be calculated with high accuracy.
  • A total evaluation curve V is obtained by summing these four divided area evaluation curves I, II, III, and IV, and a sub-pixel interpolation calculation is then performed on the total evaluation curve V to obtain the phase difference amount (defocus amount).
  • That is, the true minimum value (maximum evaluation), i.e., the phase difference amount, is calculated by an interpolation that takes into account the position of the minimum of the comprehensive evaluation curve V and the slopes of the portions of the curve extending to the right and to the left of this minimum. As a result, the phase difference amount can be obtained with finer-than-pixel precision.
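  • The text states only that the position of the minimum and the slopes on both sides of it are taken into account; a three-point parabola fit is one common way to realize such sub-pixel interpolation, used here as a hedged sketch (not necessarily the patent's exact formula):

```python
def subpixel_phase(curve):
    # Refine the integer minimum of the comprehensive evaluation curve V
    # by fitting a parabola through the minimum and its two neighbours.
    shifts = sorted(curve)
    j0 = min(shifts, key=lambda j: curve[j])
    if j0 in (shifts[0], shifts[-1]):
        return float(j0)              # minimum at the edge: no interpolation
    ym, y0, yp = curve[j0 - 1], curve[j0], curve[j0 + 1]
    denom = ym - 2.0 * y0 + yp
    if denom == 0.0:
        return float(j0)              # flat neighbourhood: keep integer shift
    return j0 + 0.5 * (ym - yp) / denom

# Hypothetical curve whose true minimum lies between integer shifts, at 1.25.
v = {j: (j - 1.25) ** 2 for j in range(-3, 4)}
defocus = subpixel_phase(v)
```

  • For a quadratic-shaped valley the fit recovers the fractional minimum exactly; on real SAD curves it is an approximation.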
  • In the above, the four divided area evaluation curves I, II, III, and IV are all summed to obtain the overall evaluation curve V, and the maximum evaluation is obtained from it.
  • the correlation calculation is performed for each of the divided areas I, II, III, and IV, there may be a divided area where only a correlation calculation result with low reliability can be obtained.
  • Whether the reliability is high is determined by quantifying it and checking whether it exceeds a predetermined threshold. For example, when a single divided area evaluation curve has a plurality of minima with the same minimum evaluation value, the reliability is judged to be low.
  • If the phase difference were calculated from evaluation curves reflecting two subjects at different distances, neither subject A nor subject B would be brought into focus. Therefore, either subject A or subject B is selected.
  • In this embodiment, the subject closer to the camera is determined to be the main subject. That is, the subject on the side with the smaller positional deviation (see the horizontal axis of FIG. 5: the side closer to the imaging device) is selected; only the divided area evaluation curves I and III are summed to obtain the total evaluation curve V, and sub-pixel interpolation is performed on it to calculate the phase difference amount.
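  • A minimal sketch of this selection, assuming the SAD-style curve dictionaries used above (the reliability test and the nearer-subject rule paraphrase the text; all names and numbers are illustrative):

```python
def is_reliable(curve):
    # A per-area evaluation curve is judged unreliable when more than one
    # shift attains the same minimum evaluation value (tied minima).
    m = min(curve.values())
    return sum(1 for v in curve.values() if v == m) == 1

def select_comprehensive(curves):
    # Keep only the reliable per-area curves. If their minima disagree
    # (subjects at different distances), keep the group whose minimum lies
    # at the smaller positional deviation, i.e. the subject nearer the
    # camera, and sum those curves into the comprehensive curve V.
    ok = [c for c in curves if is_reliable(c)]
    if not ok:
        return None
    minima = [min(c, key=c.get) for c in ok]
    target = min(set(minima), key=abs)   # smaller deviation = nearer subject
    chosen = [c for c, m in zip(ok, minima) if m == target]
    return {j: sum(c[j] for c in chosen) for j in chosen[0]}

# Four hypothetical divided area curves: two areas see subject A (shift 1),
# one sees a farther subject B (shift 4), one has tied minima (unreliable).
curves = [
    {j: float((j - 1) ** 2) for j in range(-2, 6)},
    {j: float((j - 1) ** 2) + 0.5 for j in range(-2, 6)},
    {j: float((j - 4) ** 2) for j in range(-2, 6)},
    {j: float((abs(j) - 2) ** 2) for j in range(-2, 6)},
]
v = select_comprehensive(curves)
```

  • Only the two curves agreeing on the nearer subject contribute to `v`, whose minimum therefore sits at shift 1.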
  • FIG. 9 is a diagram showing a pixel arrangement of phase difference detection pixels according to another embodiment of the present invention.
  • In the embodiment of FIG. 3, the phase difference detection pixels 1x and 1y were provided with the horizontal direction (x direction) as the only phase difference detection direction. In the embodiment of FIG. 9, the horizontal direction is the first phase difference detection direction, and the vertical direction (y direction) is used as a second phase difference detection direction as well.
  • Of four mutually adjacent pixels, the upper left pixel A (corresponding to the phase difference detection pixel 1x) and the lower right pixel B (corresponding to the phase difference detection pixel 1y) detect the phase difference in the horizontal direction (x direction), while the upper right pixel C and the lower left pixel D detect the phase difference in the vertical direction (y direction). That is, the small light shielding film opening of the pixel A is decentered to the left, that of the pixel B to the right, that of the pixel C upward, and that of the pixel D downward.
  • Phase differences are thus detected in both the vertical and horizontal directions.
  • In this embodiment, the rectangular phase difference detection area 40 is divided into a mesh; in the example shown in FIG. 2, it is divided into four areas I, II, III, and IV. Within each of the four divided areas, there is considered to be a high probability that the same evaluation curve is obtained in the vertical and horizontal directions.
  • FIG. 12 is a diagram illustrating an example in which a total of eight divided area evaluation curves, a horizontal one and a vertical one for each of the four divided areas, are obtained.
  • From the divided area I, a horizontal divided area evaluation curve Ia and a vertical divided area evaluation curve Ib are obtained; from the divided area II, curves IIa and IIb; from the divided area III, curves IIIa and IIIb; and from the divided area IV, curves IVa and IVb.
  • As described above, the imaging apparatus of the embodiment divides the phase difference detection area into a plurality of regions, obtains a divided area evaluation curve for each divided area, sums only the reliable divided area evaluation curves to obtain the comprehensive evaluation curve, and calculates the phase difference amount by sub-pixel interpolation of the comprehensive evaluation curve, thereby performing focusing control of the photographing lens.
  • For this reason, even when a small-area image sensor is used, high-speed, high-precision AF performance comparable to that of a single-lens reflex camera can be obtained.
  • The pupil-divided pair pixels constituting the phase difference detection pixels have been described using the example in which their reduced light shielding film openings are shifted in opposite directions, but the way of configuring the pair pixels is not limited to this; for example, a single microlens may be mounted over a pair of pixels to divide the pupil.
  • The phase difference detection pixels also need not be provided at periodic, discrete positions; they may be at random positions (even if the phase difference detection pixels provided in the same row are at random positions, the curves X and Y of FIG. 4 can still be obtained), or all the pixels may be used as phase difference detection pixels.
  • As described above, the imaging apparatus and the in-focus position detection method thereof concern an imaging apparatus comprising: an image sensor in which pupil-divided pairs of a first phase difference detection pixel and a second phase difference detection pixel are arranged two-dimensionally in a phase difference detection area provided on a light receiving surface that captures an image of a subject; a focus lens that is placed in front of the image sensor in the optical path and forms a focused optical image of the subject on the light receiving surface; and a control unit that obtains a phase difference from the correlation curves and drives the focus lens to the in-focus position based on that phase difference. The method is characterized by: dividing the phase difference detection area into a plurality of divided areas in the direction perpendicular to the phase difference detection direction; for each divided area, adding the output signals of the first phase difference detection pixels lined up in the perpendicular direction, position by position along the arrangement direction, to obtain a first addition signal, and likewise adding the output signals of the second phase difference detection pixels to obtain a second addition signal; calculating the correlation between the first addition signal and the second addition signal to obtain a correlation calculation curve for each divided area; obtaining a comprehensive evaluation curve by performing a predetermined calculation process on an arbitrary plurality of those correlation calculation curves; and obtaining from the comprehensive evaluation curve a defocus amount for driving the focus lens to the in-focus position.
  • In the imaging apparatus and the in-focus position detection method of the embodiment, the reliability of the correlation calculation curve of each divided area is evaluated, and only the per-area correlation calculation curves whose reliability is higher than a predetermined threshold are subjected to the predetermined calculation process to obtain the comprehensive evaluation curve.
  • In the imaging apparatus and the in-focus position detection method of the embodiment, with the evaluation value expressed as the absolute value of the difference between the first addition signal and the second addition signal, the comprehensive evaluation curve is obtained by selectively using only those per-area correlation calculation curves whose maximum-evaluation positions fall within the same range and whose reliability is higher than a predetermined threshold.
  • In the imaging apparatus and its in-focus position detection method, a majority decision is taken on the positions giving the maximum evaluation value of the correlation calculation curves of the divided areas, and the comprehensive evaluation curve is obtained by selectively using only the correlation calculation curves belonging to the majority.
  • When the groups of correlation calculation curves whose maximum-evaluation-value positions agree become equal in number, the correlation calculation curve having the smaller phase difference is selected.
  • In the imaging apparatus and its in-focus position detection method, when a plurality of correlation calculation curves whose maximum-evaluation-value positions fall within the same range are obtained, the correlation calculation curve of the divided area containing the subject closer to the image sensor is selected from among the plurality of divided areas.
  • A pair pixel consisting of the first phase difference detection pixel and the second phase difference detection pixel for detecting a phase difference is provided in the phase difference detection area of the image sensor; the phase difference detection area is divided into a plurality of mesh-shaped divided areas; and for each divided area a horizontal correlation calculation curve and a vertical correlation calculation curve are obtained, the comprehensive evaluation curve being obtained on the basis of these curves.
  • The comprehensive evaluation curve is obtained from the correlation calculation curves of the remaining divided areas after excluding the correlation calculation curves of certain divided areas.
  • With this configuration, the phase difference amount can be detected, and phase difference AF can be realized at high speed and with high accuracy even with a small, small-area solid-state imaging device.
  • Since the in-focus position detection method according to the present invention can provide high-speed, high-accuracy AF performance, it is useful to apply it to digital cameras, particularly compact digital cameras, camera-equipped mobile phones, electronic devices with cameras, imaging devices for endoscopes, and the like.
  • This application is based on Japanese Patent Application No. 2010-267932 filed on Nov. 30, 2010, the contents of which are incorporated herein by reference.
  • 1x, 1y: phase difference detection pixels; 2x, 2y: light shielding film openings of the phase difference detection pixels; 10: imaging device; 20: shooting lens; 21: solid-state imaging device; 24: drive unit; 26: digital signal processing unit; 29: system control unit; 40: phase difference detection area; I, II, III, IV: divided areas
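The majority-decision variant claimed above can be sketched in Python. This is a hypothetical illustration only: the "same range" bucketing of peak positions and all array shapes are assumptions, not the patent's literal implementation; smaller evaluation values are treated as better matches, as in the described SAD-style correlation.

```python
import numpy as np

def comprehensive_curve(area_curves, bucket=3):
    """Take a majority decision on the positions giving each divided-area
    curve's best (minimum) evaluation value, and sum only the curves whose
    best positions fall into the majority group (assumed bucketing)."""
    best = [int(np.argmin(c)) for c in area_curves]
    groups = {}
    for idx, pos in enumerate(best):
        groups.setdefault(pos // bucket, []).append(idx)  # "same range" bucket
    majority = max(groups.values(), key=len)
    return np.sum([area_curves[i] for i in majority], axis=0)

# three areas agree on a best position of 5; one outlier area is excluded
curves = [np.ones(12) for _ in range(4)]
for c in curves[:3]:
    c[5] = 0.0
curves[3][10] = 0.0
total = comprehensive_curve(curves)
assert int(np.argmin(total)) == 5
```

The outlier curve's minimum at position 10 lands in a different bucket, so it is dropped before the summation, which is the intended effect of the majority decision.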


Abstract

The purpose of the present invention is to achieve high-speed, high-precision phase-difference autofocus. Provided is an imaging device comprising: an imaging element in which pupil-divided first and second phase difference detection pixels (x, y) are arranged two-dimensionally inside a phase difference detection area (40) on the light-receiving surface; a focus lens placed in front of the imaging element in the optical path; and a control means that drives the focus lens to the in-focus position. The control means divides the area (40) into a plurality of divided areas (I, II, III, IV); for each divided area, it adds the output signals of the pixels (x) lined up in a row in the direction perpendicular to the phase difference detection direction onto each of the pixels (x) lined up in the phase difference detection direction in that divided area, and similarly adds the output signals of the pixels (y). The control means then calculates, for each divided area, the correlation between the added pixel (x) signals and the added pixel (y) signals to obtain an evaluation curve, performs a prescribed arithmetic process on the evaluation curves calculated for the divided areas, and finds an overall evaluation curve for the whole area. The amount of defocus is found from this overall evaluation curve.

Description

Imaging apparatus and in-focus position detection method thereof
The present invention relates to an imaging apparatus that detects the distance to a subject and controls the focal position of a photographic lens, and to an in-focus position detection method thereof.
In-focus position detection methods for detecting the distance to the main subject include the contrast method and the phase difference AF method. The phase difference AF method can detect the in-focus position faster and more accurately than the contrast method, and is therefore widely used in single-lens reflex cameras.
The phase difference AF method employed in conventional single-lens reflex cameras, as described for example in Patent Document 1 below, provides two phase difference detection line sensors arranged left and right, separately from the solid-state image sensor that captures the subject image, and detects the distance to the main subject based on the phase difference between the detection information of the first line sensor and that of the second line sensor.
The phase difference AF method described in Patent Document 1 requires a line sensor for phase difference detection in addition to the solid-state imaging device, which increases parts and manufacturing costs and enlarges the apparatus.
In contrast, as described in Patent Document 2 below, it has been proposed to provide phase difference detection pixels on the light receiving surface of the solid-state imaging device itself. By adopting a solid-state image sensor in which phase difference detection pixels are formed as the sensor that captures the subject image, an external phase difference detection sensor becomes unnecessary, and the cost can be reduced.
Patent Document 1: Japanese Unexamined Patent Publication No. 2010-8443. Patent Document 2: Japanese Unexamined Patent Publication No. 2010-91991.
However, the conventional technique described in Patent Document 2 is intended for single-lens reflex cameras and presupposes a large-area solid-state imaging device. As described in Patent Document 2, the phase difference detection pixels detect the phase difference by making the light shielding film opening of each of a pair of adjacent pixels small, and by shifting the opening position of one pixel relative to the other in the phase difference detection direction (normally the left-right direction).
With a large-format (large-area) solid-state imaging device in which each pixel has a large light receiving area, phase difference information can be obtained at high speed and with high accuracy even if the light shielding film aperture is made somewhat smaller. However, in a solid-state image sensor whose pixels cannot have a large light receiving area, for example one mounted on a compact camera, the original light shielding film aperture is already small. If it is made still smaller and the light receiving time shortened to acquire phase difference information quickly, the accuracy of the phase difference information, that is, the in-focus position detection accuracy, deteriorates.
An object of the present invention is to provide an imaging apparatus that can acquire phase difference information and determine the in-focus position at high speed and with high accuracy even when applied to a small-area solid-state image sensor, and an in-focus position detection method thereof.
The imaging apparatus and in-focus position detection method of the present invention comprise: an image sensor in which pair pixels, each consisting of a pupil-divided first phase difference detection pixel and second phase difference detection pixel, are two-dimensionally arrayed in a phase difference detection area provided on a light receiving surface that captures an image of a subject;
a focus lens that is placed in front of the image sensor in the optical path and forms a focused optical image of the subject on the light receiving surface; and
a control unit that obtains the phase difference between a first distribution curve of the output signals of the first phase difference detection pixels along one arrangement direction of the pair pixels and a second distribution curve of the output signals of the second phase difference detection pixels along the same arrangement direction, and drives the focus lens to the in-focus position based on the phase difference.
The in-focus position detection method divides the phase difference detection area into a plurality of divided areas along the direction perpendicular to the phase difference detection direction;
for each divided area, adds the output signals of the plural first phase difference detection pixels aligned in a column in the perpendicular direction onto each of the plural first phase difference detection pixels aligned in the arrangement direction within that divided area, and likewise adds the output signals of the plural second phase difference detection pixels aligned in a column in the perpendicular direction onto each of the plural second phase difference detection pixels aligned in the arrangement direction;
computes, for each divided area, the correlation between the first addition signal obtained by adding the output signals of the first phase difference detection pixels and the second addition signal obtained by adding the output signals of the second phase difference detection pixels, for each of the first and second phase difference detection pixels, to calculate a correlation calculation curve per divided area; and
obtains a comprehensive evaluation curve by applying a predetermined calculation process to an arbitrary plurality of the correlation calculation curves calculated for the divided areas, and obtains from the comprehensive evaluation curve a defocus amount for driving the focus lens to the in-focus position.
According to the present invention, an evaluation curve (correlation calculation curve) is obtained for each divided area, a predetermined calculation process is applied to the plural evaluation curves to obtain a comprehensive evaluation curve for the whole (the plural areas), and the defocus amount is calculated from this comprehensive evaluation curve. Because the pixel addition count therefore never becomes so large that the subject pattern is averaged out, phase difference AF can be performed at high speed and with high accuracy even when a small solid-state image sensor is mounted.
Fig. 1 is a functional block diagram of an imaging device according to an embodiment of the present invention. Fig. 2 is an explanatory diagram of a phase difference detection area provided on the light receiving surface of the solid-state image sensor shown in Fig. 1. Fig. 3 is an enlarged schematic surface view of the portion within the dotted rectangular frame of Fig. 2. Fig. 4 is a diagram explaining the concept of the phase difference amount obtained by extracting only the phase difference detection pixels of Fig. 3 and their detection signals. Fig. 5 is an explanatory diagram of the evaluation curves for the divided areas and the total evaluation curve. Fig. 6 is an explanatory diagram of the reliability of the evaluation curve for each divided area. Fig. 7 is an explanatory diagram of a majority-decision example when evaluation curves for different subjects are obtained among the per-divided-area evaluation curves. Fig. 8 is an explanatory diagram of the case where the numbers of evaluation curves for different subjects become equal. Fig. 9 is an explanatory diagram of phase difference detection pixels whose phase difference detection directions are the horizontal and vertical directions. Fig. 10 is an explanatory diagram of the phase difference directions of Fig. 9. Fig. 11 is a diagram showing an example of division of the phase difference detection area when the phase difference detection pixels of Fig. 9 are used. Fig. 12 is a diagram showing an example of the horizontal and vertical evaluation curves obtained for each divided area shown in Fig. 9.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
FIG. 1 is a functional block diagram of a digital camera according to an embodiment of the present invention. The digital camera 10 of this embodiment has the function of capturing still or moving images of a subject and digitally processing the captured image signal within the camera 10. It comprises: a photographic lens 20 including a telephoto lens and a focus lens; a solid-state image sensor 21 placed behind the photographic lens 20 at its imaging plane; an analog signal processing unit 22 that performs analog processing such as automatic gain control (AGC) and correlated double sampling on the analog image data output from each pixel of the solid-state image sensor 21; an analog-to-digital converter (A/D) 23 that converts the analog image data output from the analog signal processing unit 22 into digital image data; a drive unit 24 that controls the driving of the A/D 23, the analog signal processing unit 22, the solid-state image sensor 21, and the photographic lens 20 in accordance with instructions from a system control unit (CPU) 29 described later; and a flash 25 that emits light in response to instructions from the CPU 29.
The digital camera 10 of this embodiment further comprises: a digital signal processing unit 26 that takes in the digital image data output from the A/D 23 and performs interpolation, white balance correction, RGB/YC conversion, and the like; a compression/decompression processing unit 27 that compresses image data into a format such as JPEG and decompresses it; a display unit 28 that displays menus, through images, and captured images; a system control unit (CPU) 29 that performs overall control of the entire digital camera; an internal memory 30 such as a frame memory; a media interface (I/F) unit 31 that performs interface processing with a recording medium 32 storing JPEG image data and the like; and a bus 39 interconnecting these. An operation unit 33 for inputting user instructions is connected to the system control unit 29.
The system control unit 29 analyzes, as described later, the captured image data (through image) output in moving-image form from the solid-state image sensor 21 and processed by the digital signal processing unit 26, using the subordinate digital signal processing unit 26 and the like, to obtain evaluation curves (correlation calculation curves) and detect the distance to the main subject. The system control unit 29 then controls, via the drive unit 24, the position of the focus lens of the photographic lens 20 placed in front of the solid-state image sensor 21 in the optical path, so that an optical image focused on the subject is formed on the light receiving surface of the solid-state image sensor 21.
In this embodiment the solid-state image sensor 21 is of CMOS type, and its output signal is processed by an analog signal processing unit (AFE: analog front end) 22; this AFE portion (circuits performing correlated double sampling and clamping, a signal amplifier for gain control, and so on) is usually provided as a peripheral circuit on the solid-state image sensor chip. In addition, a horizontal scanning circuit, a vertical scanning circuit, a noise reduction circuit, a synchronization signal generation circuit, and the like are formed as peripheral circuits around the light receiving section on the chip of the solid-state image sensor 21, and the A/D converter 23 of FIG. 1 may also be formed there. The solid-state image sensor 21 may also be of CCD type, to which the embodiments described below are equally applicable.
FIG. 2 is an explanatory diagram of the light receiving surface of the solid-state image sensor 21. On the light receiving surface, a large number of pixels (light receiving elements: photodiodes), not shown, are arrayed two-dimensionally. In this embodiment the pixels are arranged in a square lattice. The pixel arrangement is not limited to a square lattice; a so-called honeycomb arrangement, in which even-numbered pixel rows are shifted by half a pixel pitch relative to odd-numbered pixel rows, may also be used.
A rectangular phase difference detection area 40 is provided at an arbitrary partial position of the light receiving surface, in the illustrated example at the center. In this example only one phase difference detection area 40 is provided on the light receiving surface, but a plurality may be provided so that AF is possible anywhere in the shooting screen, or the entire light receiving surface may be used as the phase difference detection area.
In this embodiment, the phase difference detection area 40 is divided into four in the direction (vertical direction y) perpendicular to the phase difference detection direction (here the left-right direction, i.e. the x direction), and a correlation calculation described later is performed for each of the divided areas I, II, III, and IV. The number of divisions is not limited to four; it may be six, seven, or any other number.
FIG. 3 is an enlarged schematic surface view of the portion of the phase difference detection area 40 indicated by the dotted rectangular frame 41 in FIG. 2. A large number of pixels are arranged in a square lattice on the light receiving surface of the solid-state image sensor 21, and likewise within the phase difference detection area 40.
In the illustrated example each pixel is labeled R (red), G (green), or B (blue). R, G, and B denote the colors of the color filters stacked on the pixels. The color filter array is a Bayer array in this example, but it is not limited to a Bayer array; other arrays such as stripe arrays may be used.
The pixel array and color filter array inside the phase difference detection area 40 are the same as those on the light receiving surface outside the area, but within the phase difference detection area 40, diagonally adjacent same-color pixels forming a pair are used as phase difference detection pixels 1x and 1y. In this embodiment the pair pixels for phase difference detection are provided at discrete, periodic positions within the area 40, in the illustrated example at checkered positions.
In the illustrated example the same-color pixels are diagonally adjacent because the color filter array is a Bayer array; with a horizontal stripe array, same-color pixels line up horizontally, so the two pixels forming a pair would be horizontally adjacent. Alternatively, instead of placing the two pixels of a pair within the same color filter row of a horizontal stripe array, the two pixels may be placed apart, one in each of the vertically nearest filter rows of the same color. The same applies to a vertical stripe array.
In this embodiment, the phase difference detection pixels 1x and 1y are provided on pixels carrying G filters, the most numerous among R, G, and B, at every eight pixels in the horizontal direction (x direction) and every eight pixels in the vertical direction (y direction), arranged overall in a checkered pattern. Therefore, viewed along the phase difference detection direction (left-right direction), the phase difference detection pixels 1x appear every four pixels.
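The checkered layout described above can be sketched numerically. The grid size (64 × 64) and origin below are arbitrary assumptions for illustration; only the 8-pixel pitch and the half-pitch offset of alternate rows come from the text.

```python
# toy sketch of the checkered pair-pixel layout: 8-pixel pitch per row,
# alternate rows offset by half a pitch (4 pixels)
pitch = 8
positions = [(x, y)
             for row, y in enumerate(range(0, 64, pitch))
             for x in range((row % 2) * pitch // 2, 64, pitch)]

# projected onto the phase difference detection direction (x), the pair
# pixels appear every 4 columns, matching the statement in the text
xs = sorted({x for x, _ in positions})
assert all(b - a == 4 for a, b in zip(xs, xs[1:]))
```

The half-pitch row offset is what halves the effective sampling interval along the detection direction.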
FIG. 4 schematically shows only the phase difference detection pixels 1x and 1y extracted from FIG. 3. As in Patent Document 2, the phase difference detection pixels 1x and 1y forming a pair have light shielding film openings 2x and 2y formed smaller than those of the other pixels (pixels other than the phase difference detection pixels); the light shielding film opening 2x of pixel 1x is offset to the left, and the light shielding film opening 2y of pixel 1y is offset to the right (the phase difference detection direction).
The curve X shown in the lower part of FIG. 4 plots the detection signal amounts of the phase difference detection pixels 1x lined up in one horizontal row, and the curve Y plots the detection signal amounts of the phase difference detection pixels 1y paired with those pixels 1x.
Since the pair pixels 1x and 1y are adjacent and extremely close, they can be considered to receive light from the same subject. Curves X and Y are therefore expected to have the same shape, and their shift in the left-right direction (the phase difference detection direction) is the phase difference between the image seen by one pixel 1x of the pupil-divided pair and the image seen by the other pixel 1y.
By performing a correlation calculation between curve X and curve Y, the phase difference amount (lateral shift amount) can be obtained, and from this phase difference amount the distance to the subject can be calculated. A known method (for example, the methods described in Patent Documents 1 and 2) may be used to obtain the evaluation value of the correlation between curves X and Y. For example, the integrated value of the absolute differences between each point X(i) of curve X and each point Y(i+j) of curve Y is used as the evaluation value, and the value of j giving the maximum evaluation value is taken as the phase difference amount (lateral shift amount).
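The sum-of-absolute-differences evaluation just described can be sketched as follows. This is an illustrative implementation, not the patent's: the normalization by overlap length is an added assumption to keep partial overlaps comparable, and here the best evaluation value is the smallest one.

```python
def phase_shift(curve_x, curve_y, max_shift):
    """Return the shift j whose evaluation value, the summed absolute
    difference between X(i) and Y(i+j), is best (smallest)."""
    n = len(curve_x)
    best_j, best_val = 0, float("inf")
    for j in range(-max_shift, max_shift + 1):
        lo, hi = max(0, -j), min(n, n - j)        # valid overlap of X and Y
        val = sum(abs(curve_x[i] - curve_y[i + j]) for i in range(lo, hi))
        val /= (hi - lo)   # normalize by overlap length (assumption)
        if val < best_val:
            best_j, best_val = j, val
    return best_j

# the same pattern shifted right by 2 is recovered as a phase shift of 2
x = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0]
y = [0, 0, 0, 1, 4, 9, 4, 1, 0, 0]
assert phase_shift(x, y, 4) == 2
```

With real sensor data the minimum is not exactly zero, so the lateral shift is usually refined by interpolating around the best j.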
However, when the light receiving area of each pixel is small, the individual signal amounts become small and the proportion of noise increases, making it difficult to detect the phase difference amount accurately even with a correlation calculation. Therefore, within the phase difference detection area 40 of FIG. 2, if the detection signals of the pixels 1x at the same horizontal position are added over a plurality of pixels in the vertical direction, and the detection signals of the pixels 1y at the same horizontal position are likewise added in the vertical direction, the influence of noise can be reduced and the in-focus position detection accuracy (AF accuracy) improved.
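The noise-suppressing effect of vertical addition can be seen in a small numerical sketch. All quantities here are hypothetical (a synthetic subject pattern, Gaussian noise, 8 rows per column); the point is only that per-column addition averages out noise while preserving the pattern along the detection direction.

```python
import numpy as np

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0.0, 3.0, 32))            # subject pattern along x
# hypothetical raw 1x signals: 8 rows (perpendicular direction) x 32 columns
raw_1x = clean + 0.2 * rng.standard_normal((8, 32))

added = raw_1x.sum(axis=0)                           # vertical addition per column
noise_single = np.abs(raw_1x[0] - clean).mean()      # error of one row alone
noise_added = np.abs(added / 8 - clean).mean()       # error after addition
assert noise_added < noise_single                    # addition suppresses noise
```

Averaging N rows reduces independent noise by roughly a factor of sqrt(N), which is why the correlation curves computed from added signals are more reliable.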
However, simply increasing the number of added pixels is not the answer: the more pixels are added, the further the region of phase difference detection pixels subject to addition extends in the vertical direction of the phase difference detection area 40. The subject pattern imaged in the upper part of the phase difference detection area 40 usually differs from the pattern imaged in the middle part and the pattern imaged in the lower part. If all of these are added together, the post-addition subject pattern is averaged along the phase difference detection direction (left-right direction), and the evaluation value for obtaining the phase difference deteriorates.
Therefore, in this embodiment, as shown in FIG. 2, the phase difference detection area 40 is divided into four, the pixel addition range is limited to each divided area, and pixels are not added across divided areas. That is, pixels are added within each divided area I, II, III, and IV to obtain a divided area evaluation curve (correlation calculation curve), and the divided area evaluation curves are then added together to obtain the overall evaluation curve (comprehensive evaluation curve) of the whole phase difference detection area 40.
FIG. 5 is a graph showing the evaluation curves I, II, III and IV of the respective divided areas, together with a total evaluation curve V (the evaluation curve of all areas) obtained by summing these four divided-area evaluation curves. The divided-area evaluation curve I is obtained by a correlation calculation between the curve X of FIG. 4, obtained by adding the detection signals of the phase difference detection pixels 1x within the divided area I in the vertical direction (for example, reference numeral 45 in FIG. 3), and the curve Y of FIG. 4, obtained by likewise adding the detection signals of the phase difference detection pixels 1y within the divided area I in the vertical direction (for example, reference numeral 46 in FIG. 3). In this example, the maximum evaluation value is obtained as a minimum of the curve.
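A correlation calculation of this kind is commonly implemented as a sum of absolute differences between the two profiles at each trial shift; the sketch below assumes that formulation (the disclosure does not fix the exact correlation formula, and the function name and shift convention are illustrative):

```python
import numpy as np

def correlation_curve(x, y, max_shift):
    """Evaluation curve between profile X (from pixels 1x) and profile Y
    (from pixels 1y): sum of absolute differences over the overlapping
    samples at each trial shift. As in the text, the best match is the
    shift giving the *minimum* value of the curve."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    curve = {}
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = x[s:], y[:len(y) - s]
        else:
            a, b = x[:len(x) + s], y[-s:]
        curve[s] = float(np.abs(a - b).sum())
    return curve
```

Applying this per divided area yields the curves I through IV of FIG. 5.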
Similarly, the divided-area evaluation curve II is the evaluation curve obtained in the divided area II, the divided-area evaluation curve III is the evaluation curve obtained in the divided area III, and the divided-area evaluation curve IV is the evaluation curve obtained in the divided area IV.
The number of pixels added to obtain each of these four divided-area evaluation curves I, II, III and IV is approximately the number of phase difference detection pixels 1x arranged in the vertical direction of the phase difference detection area 40 divided by the number of divided areas. There is therefore little risk of the subject pattern being averaged out, and the evaluation values can be calculated with high accuracy.
In the present embodiment, the total evaluation curve V obtained by summing these four divided-area evaluation curves I, II, III and IV is then computed, and a sub-pixel interpolation calculation is further performed on the total evaluation curve V to obtain the phase difference amount (defocus amount). This makes it possible to calculate the phase difference with high accuracy while remaining robust against noise and preserving the evaluation value of each divided area of the subject, thereby improving the AF accuracy.
One unit on the horizontal axis of the graph of FIG. 5 corresponds to the pixel interval of the phase difference detection pixels of FIG. 3 (a checkered arrangement with an 8-pixel period, giving a 4-pixel interval). Accordingly, the position that gives the true minimum value (maximum evaluation value), that is, the phase difference amount, is calculated by sub-pixel interpolation, taking into account the position of the minimum of the total evaluation curve V and the gradients of the portions of the curve extending to the right and to the left of that minimum. This makes it possible to obtain the phase difference amount in units of a single pixel of FIG. 3.
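One common way to realize such a sub-pixel estimate is a three-point parabolic fit around the discrete minimum; the disclosure does not fix the interpolation formula, so the following is only an assumed sketch (function name and the parabolic choice are illustrative):

```python
def subpixel_minimum(curve, pixel_pitch=4.0):
    """Estimate the true minimum position of a total evaluation curve in
    pixel units by fitting a parabola through the discrete minimum and
    its two neighbors. `curve` maps shift index -> evaluation value;
    `pixel_pitch` converts one index step into pixels (4 here, per the
    checkered 8-pixel arrangement described in the text)."""
    s0 = min(curve, key=curve.get)
    if s0 - 1 not in curve or s0 + 1 not in curve:
        return s0 * pixel_pitch  # minimum at the edge: no interpolation
    y_l, y_0, y_r = curve[s0 - 1], curve[s0], curve[s0 + 1]
    denom = y_l - 2.0 * y_0 + y_r
    frac = 0.0 if denom == 0 else 0.5 * (y_l - y_r) / denom
    return (s0 + frac) * pixel_pitch
```

The left/right gradients mentioned in the text enter through the `y_l - y_r` term, which pulls the estimate toward the steeper side.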
In the embodiment described above, the four divided-area evaluation curves I, II, III and IV are summed to obtain the total evaluation curve V, from which the maximum evaluation value is obtained. However, when the correlation calculation is performed for each of the divided areas I, II, III and IV, there may be divided areas for which only a correlation calculation result of low reliability can be obtained.
For example, as shown in FIG. 6, when the correlation curve (evaluation curve) II is viewed on its own, the position on the horizontal axis that gives the maximum evaluation value (the minimum value in this example) may be unclear or indeterminate (for example, because a plurality of candidate points exist). In such a case, only the three highly reliable correlation curves I, III and IV, excluding the unreliable correlation curve II, are added to obtain the total evaluation curve V (since the correlation calculation curve of the divided area II is not used, this is a correlation calculation curve of a plurality of areas rather than of "all" areas), and this total evaluation curve V is subjected to sub-pixel interpolation processing to calculate the phase difference amount, that is, the defocus amount, for driving the focus lens to the in-focus position. This makes it possible to further improve the AF accuracy.
Whether or not reliability is high is determined by quantifying the reliability and judging whether it exceeds a predetermined threshold. For example, when a single divided-area evaluation curve contains a plurality of local minima whose evaluation values are comparable to the minimum value, the reliability is judged to be low.
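The ambiguity test just described can be sketched as follows; the text only states that reliability is quantified and compared with a threshold, so the particular tolerance and separation values below are assumptions for illustration:

```python
def is_reliable(curve, rel_tol=0.05, min_separation=2):
    """Flag a divided-area evaluation curve as unreliable when two or
    more well-separated shifts give nearly the same minimum value,
    i.e. the position of the maximum evaluation value is ambiguous.
    `curve` maps shift index -> evaluation value (minimum = best match)."""
    best = min(curve.values())
    near_minima = [s for s, v in curve.items() if v <= best * (1.0 + rel_tol)]
    # Reliable only if all near-minimum shifts cluster in one place.
    return max(near_minima) - min(near_minima) < min_separation
```

A curve like curve II of FIG. 6, with comparable minima at distant shifts, would be rejected by such a test and excluded from the total evaluation curve.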
A plurality of subjects at different shooting distances may be present within the phase difference detection area 40 of FIG. 2. Suppose that subjects A and B are present. In such a case, as shown for example in FIG. 7, the evaluation curves I, II and III calculated for the divided areas I, II and III give their maximum evaluation values for subject A, while the evaluation curve IV calculated for the divided area IV gives its maximum evaluation value for subject B. Unlike the case of FIG. 6, the evaluation curve IV of FIG. 7 is not of low reliability; its reliability is high.
However, if the total evaluation curve V were obtained in such a case by summing all of the divided-area evaluation curves I, II, III and IV, erroneous focusing would result, with the risk of an incorrect in-focus position. Therefore, in such a case, a majority vote is taken over the positions on the horizontal axis at which the individual evaluation curves I, II, III and IV take their maximum evaluation values (these need not be exact positions; positions considered to lie within roughly the same range suffice), the divided-area evaluation curve IV is excluded, and the total evaluation curve V is obtained by summing the majority curves I, II and III. This makes it possible to prevent erroneous focusing even when subjects at different shooting distances are present in the phase difference detection area (AF area).
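The majority vote over minimum positions could be sketched as below; grouping positions by a simple distance threshold is an illustrative choice, since the text only requires that positions "within roughly the same range" be counted together (function name and threshold are assumptions):

```python
def majority_curves(curves, same_range=1):
    """Group divided-area curves by the shift of their minimum
    (positions within `same_range` of each other count as the same
    subject) and keep only the largest group, as in the majority-vote
    step. `curves` maps an area name to its evaluation curve
    (shift index -> evaluation value)."""
    min_pos = {name: min(c, key=c.get) for name, c in curves.items()}
    groups = []
    for name, pos in sorted(min_pos.items(), key=lambda kv: kv[1]):
        if groups and abs(pos - min_pos[groups[-1][0]]) <= same_range:
            groups[-1].append(name)  # same subject as the previous group
        else:
            groups.append([name])    # new candidate subject
    return max(groups, key=len)
```

In the FIG. 7 situation, areas I, II and III form the majority group and area IV is excluded before the curves are summed.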
In the description of FIG. 7, a majority vote was taken over the four divided-area evaluation curves. However, since the number of divided areas is four, a 2-to-2 tie can also occur. This case is described with reference to FIG. 8. In FIG. 8, there are two divided-area evaluation curves I and III (taken as the evaluation curves for subject A) and two divided-area evaluation curves II and IV (taken as the evaluation curves for subject B).
In such a case, if the total evaluation curve V were obtained by summing the four divided-area evaluation curves I, II, III and IV, the result would focus on neither subject A nor subject B. One of subject A and subject B must therefore be chosen. In the present embodiment, of subjects A and B, the subject closer to the camera is judged to be the main subject. That is, the subject on the side with the smaller positional displacement amount (see the horizontal axis of FIG. 5: the side closer to the imaging device) is selected, only the divided-area evaluation curves I and III are added to obtain the total evaluation curve V, and sub-pixel interpolation is performed to calculate the phase difference amount.
As a result, even when subjects at different shooting distances are present in the AF area, erroneous focusing can be prevented, and AF can be preferentially adjusted to the nearer subject.
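The 2-to-2 tie-break can be sketched as a comparison of the two groups' minimum positions; this assumes, as stated for the axis of FIG. 5, that a smaller positional displacement corresponds to the subject nearer the camera (names and structure are illustrative, not from the disclosure):

```python
def pick_near_side(group_a, group_b, curves):
    """When the majority vote ties 2-2, keep the group whose evaluation
    curves reach their minimum at the smaller shift, i.e. the subject
    nearer the camera per the embodiment. `curves` maps area names to
    evaluation curves (shift index -> evaluation value)."""
    def group_shift(group):
        # Representative displacement of a group: its smallest minimum position.
        return min(min(curves[name], key=curves[name].get) for name in group)
    return group_a if group_shift(group_a) <= group_shift(group_b) else group_b
```

For FIG. 8, the group {I, III} (subject A, smaller shift) would be kept and summed into the total evaluation curve V.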
FIG. 9 is a diagram showing the pixel arrangement of phase difference detection pixels according to another embodiment of the present invention. In each of the embodiments described above, the phase difference detection pixels 1x and 1y are provided with the horizontal direction (x direction) as the only phase difference detection direction. In the present embodiment, as shown in FIG. 10, the horizontal direction is used as a first phase difference detection direction and the vertical direction (y direction) is also used as a second phase difference detection direction.
For this purpose, as shown in FIG. 9, among the four mutually adjacent pixels A, B, C and D of the square arrangement in this embodiment, the upper-left pixel A (corresponding to the phase difference detection pixel 1x) and the lower-right pixel B (corresponding to the phase difference detection pixel 1y) detect the phase difference in the horizontal direction (x direction), while the upper-right pixel C and the lower-left pixel D detect the phase difference in the vertical direction (y direction). That is, the small light-shielding-film opening of pixel A is offset to the left, that of pixel B to the right, that of pixel C upward, and that of pixel D downward.
By arranging the sets of phase difference detection pixels A, B, C and D shown in FIG. 9 at periodic, discrete positions within the phase difference detection area 40 of FIG. 10, phase differences in both the vertical and horizontal directions are detected. In this case, the rectangular phase difference detection area 40 is divided in a mesh pattern; in the example shown in FIG. 11, it is divided in two both vertically and horizontally, giving a total of four areas I, II, III and IV. For each of these four divided areas, there is considered to be a high probability that similar evaluation curves are obtained in the vertical and horizontal directions within the same area.
FIG. 12 is a diagram showing an example in which a total of eight divided-area evaluation curves, namely a horizontal divided-area evaluation curve and a vertical divided-area evaluation curve for each of the four divided areas, are obtained. From the divided area I, a horizontal divided-area evaluation curve Ia and a vertical divided-area evaluation curve Ib are obtained; from the divided area II, a horizontal divided-area evaluation curve IIa and a vertical divided-area evaluation curve IIb; from the divided area III, a horizontal divided-area evaluation curve IIIa and a vertical divided-area evaluation curve IIIb; and from the divided area IV, a horizontal divided-area evaluation curve IVa and a vertical divided-area evaluation curve IVb.
In the example of FIG. 12, since the reliability of the evaluation values of the divided area IV is low, the six divided-area evaluation curves of the divided areas I, II and III are added to obtain the total evaluation curve V, and sub-pixel interpolation is performed to calculate the phase difference amount, that is, the defocus amount. Since the in-focus position is thus obtained by detecting phase differences in different directions, the AF accuracy can be further improved.
Furthermore, since divided areas of low reliability are excluded and not used in the AF calculation, the AF accuracy can be improved still further.
As in the embodiments described above, the imaging device divides the phase difference detection area into a plurality of regions, obtains a divided-area evaluation curve for each divided area, adds only the highly reliable divided-area evaluation curves to obtain the total evaluation curve, and calculates the phase difference amount by sub-pixel interpolation of this total evaluation curve, thereby performing focusing control of the photographing lens. Consequently, even when an image sensor of small area is used, AF performance as fast and as accurate as that of a single-lens reflex camera can be obtained.
In the embodiments described above, the pupil-divided pair pixels constituting the phase difference detection pixels were described using an example in which reduced light-shielding-film openings are shifted in mutually opposite directions. However, the method of pupil division for constituting the phase difference detection pixels is not limited to this; for example, the pupil may be divided by mounting a single microlens on a pair of pixels.
Further, in the embodiments described above, when the required calculation process is applied to the correlation calculation curves of the individual areas to obtain the total evaluation curve of all areas (or of a plurality of areas), the process described was summation; alternatively, an average value or a product may be computed.
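These three aggregation variants can be sketched together; the embodiment uses summation, and the mean and product modes below merely illustrate the stated alternatives (function name and interface are assumptions):

```python
import numpy as np

def total_curve(curves, mode="sum"):
    """Combine the selected divided-area correlation curves into the
    total evaluation curve V. `curves` is a sequence of equal-length
    evaluation-value sequences sampled at the same shifts."""
    stack = np.vstack([np.asarray(c, dtype=float) for c in curves])
    if mode == "sum":       # the process used in the embodiment
        return stack.sum(axis=0)
    if mode == "mean":      # stated alternative: average value
        return stack.mean(axis=0)
    if mode == "product":   # stated alternative: multiplied value
        return stack.prod(axis=0)
    raise ValueError(f"unknown mode: {mode}")
```

For curves with a common minimum position, all three modes preserve that position, so the subsequent sub-pixel interpolation step is unchanged.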
Furthermore, in the embodiments described above, the pair pixels that detect the phase difference were provided at discrete, periodic positions within the phase difference detection area. However, they need not necessarily be provided at periodic, discrete positions; they may be at random positions (even if the phase difference detection pixels provided in the same row are at random positions, the curves X and Y of FIG. 4 can still be obtained), or all pixels may be made phase difference detection pixels.
The imaging device and its in-focus position detection method according to the embodiments described above comprise:
an image sensor in which pair pixels, each composed of a pupil-divided first phase difference detection pixel and second phase difference detection pixel, are two-dimensionally arranged within a phase difference detection area provided on a light receiving surface that captures an image of a subject;
a focus lens that is placed in front of the image sensor in the optical path and forms a focused optical image of the subject on the light receiving surface; and
control means for obtaining a phase difference between a first distribution curve of the output signals of the first phase difference detection pixels along one arrangement direction of the pair pixels and a second distribution curve of the output signals of the second phase difference detection pixels along the same arrangement direction, and for driving the focus lens to the in-focus position on the basis of the phase difference, wherein the in-focus position detection method comprises:
dividing the phase difference detection area into a plurality of divided areas in the direction perpendicular to the phase difference detection direction;
for each divided area, adding the output signals of the plurality of first phase difference detection pixels aligned in the perpendicular direction within that divided area, for each of the first phase difference detection pixels arranged in the one arrangement direction within that divided area, and likewise adding the output signals of the plurality of second phase difference detection pixels aligned in the perpendicular direction within that divided area, for each of the second phase difference detection pixels arranged in the one arrangement direction within that divided area;
calculating, for each divided area, a correlation calculation curve by evaluating, for each of the first and second phase difference detection pixels, the correlation between a first addition signal obtained by adding the output signals of the first phase difference detection pixels and a second addition signal obtained by adding the output signals of the second phase difference detection pixels; and
obtaining a total evaluation curve by applying a predetermined calculation process to an arbitrary plurality of the correlation calculation curves calculated for the divided areas, and obtaining from the total evaluation curve a defocus amount for driving the focus lens to the in-focus position.
In the imaging device and its in-focus position detection method according to the embodiments, when the correlation calculation curves of the plurality of areas are calculated, the reliability of each correlation calculation curve of each divided area is evaluated, and the predetermined calculation process is applied only to those correlation calculation curves of the divided areas whose reliability is higher than a predetermined threshold, to obtain the total evaluation curve.
Further, in the imaging device and its in-focus position detection method according to the embodiments, the total evaluation curve is obtained by selectively using only those correlation calculation curves of divided areas whose reliability is at least a predetermined threshold and in which the positions at which the evaluation value, expressed as the absolute value of the difference between the first addition signal and the second addition signal, becomes maximum lie within substantially the same range.
Further, in the imaging device and its in-focus position detection method according to the embodiments, a majority vote is taken over the positions that give the maximum evaluation values of the correlation calculation curves of the divided areas, and the total evaluation curve is obtained by selectively using only the correlation calculation curves of the divided areas forming the majority.
Further, in the imaging device and its in-focus position detection method according to the embodiments, when a plurality of correlation calculation curves are found in which the positions at which the evaluation value becomes maximum lie within substantially the same range, the correlation calculation curve having the smaller phase difference at the position of the maximum evaluation value is selected.
Further, in the imaging device and its in-focus position detection method according to the embodiments, when a plurality of correlation calculation curves are found in which the positions at which the evaluation value becomes maximum lie within substantially the same range, the correlation calculation curve of the divided area, among the plurality of divided areas, containing the subject whose maximum-evaluation-value position is nearer to the image sensor is selected.
Further, in the imaging device and its in-focus position detection method according to the embodiments, in addition to the pair pixels of the first and second phase difference detection pixels that detect the horizontal phase difference, pair pixels of the first and second phase difference detection pixels that detect the vertical phase difference are provided within the phase difference detection area of the image sensor; the phase difference detection area is divided in a mesh pattern into a plurality of divided areas; for each divided area, a horizontal correlation calculation curve and a vertical correlation calculation curve are obtained; and the total evaluation curve is obtained on the basis of the correlation calculation curves of the divided areas.
Further, when, among the correlation calculation curves of the divided areas, the positions at which the evaluation values of the horizontal correlation calculation curve and the vertical correlation calculation curve of the same divided area become maximum differ from each other, the total evaluation curve is obtained from the correlation calculation curves of the remaining divided areas, excluding the correlation calculation curves of that divided area.
According to the embodiments described above, since pixel addition is performed for each divided area, the number of added pixels never becomes so large that the subject pattern is averaged out, and since pixel addition is performed, a highly accurate phase difference amount can be detected; phase-difference AF can therefore be realized at high speed and with high accuracy even with a small, small-area solid-state image sensor.
Since the in-focus position detection method according to the present invention can achieve fast and highly accurate AF performance, it is useful when applied to digital cameras, in particular compact digital cameras, camera-equipped mobile phones, camera-equipped electronic devices, endoscope imaging devices, and the like.
This application is based on Japanese Patent Application No. 2010-267932 filed on November 30, 2010, the contents of which are incorporated herein by reference.
1x, 1y Phase difference detection pixels
2x, 2y Light-shielding-film openings of the phase difference detection pixels
10 Imaging device
20 Photographing lens
21 Solid-state image sensor
24 Drive unit
26 Digital signal processing unit
29 System control unit
40 Phase difference detection area
I, II, III, IV Divided areas

Claims (16)

  1.  An imaging device comprising:
     an image sensor in which pair pixels, each composed of a pupil-divided first phase difference detection pixel and second phase difference detection pixel, are two-dimensionally arranged within a phase difference detection area provided on a light receiving surface that captures an image of a subject;
     a focus lens that is placed in front of the image sensor in the optical path and forms a focused optical image of the subject on the light receiving surface; and
     control means for obtaining a phase difference between a first distribution curve of the output signals of the first phase difference detection pixels along one arrangement direction of the pair pixels and a second distribution curve of the output signals of the second phase difference detection pixels along the same arrangement direction, and for driving the focus lens to the in-focus position on the basis of the phase difference,
     wherein the control means comprises:
     means for dividing the phase difference detection area into a plurality of divided areas in the direction perpendicular to the phase difference detection direction;
     means for adding, for each divided area, the output signals of the plurality of first phase difference detection pixels aligned in the perpendicular direction within that divided area, for each of the first phase difference detection pixels arranged in the one arrangement direction within that divided area, and for likewise adding the output signals of the plurality of second phase difference detection pixels aligned in the perpendicular direction within that divided area, for each of the second phase difference detection pixels arranged in the one arrangement direction within that divided area;
     means for calculating, for each divided area, a correlation calculation curve by evaluating, for each of the first and second phase difference detection pixels, the correlation between a first addition signal obtained by adding the output signals of the first phase difference detection pixels and a second addition signal obtained by adding the output signals of the second phase difference detection pixels; and
     means for obtaining a total evaluation curve by applying a predetermined calculation process to an arbitrary plurality of the correlation calculation curves calculated for the divided areas, and for obtaining from the total evaluation curve a defocus amount for driving the focus lens to the in-focus position.
  2.  The imaging device according to claim 1,
     wherein, when the correlation calculation curves of the plurality of areas are calculated, the reliability of each correlation calculation curve of each divided area is evaluated, and the predetermined calculation process is applied only to those correlation calculation curves of the divided areas whose reliability is higher than a predetermined threshold, to obtain the total evaluation curve.
  3.  The imaging device according to claim 1,
     wherein the total evaluation curve is obtained by selectively using only those correlation calculation curves of divided areas whose reliability is at least a predetermined threshold and in which the positions at which the evaluation value, expressed as the absolute value of the difference between the first addition signal and the second addition signal, becomes maximum lie within substantially the same range.
4.  The imaging apparatus according to claim 3, wherein a majority decision is taken among the positions giving the maximum evaluation value of the correlation calculation curves of the divided areas, and the comprehensive evaluation curve is obtained by selectively using only the correlation calculation curves of the divided areas belonging to the majority.
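Claim 4's majority decision could be sketched like this. How the claim's "substantially the same range" groups nearby peaks is unspecified, so coarse bucketing of peak indices with a `tol` parameter is an assumption:

```python
import numpy as np
from collections import Counter

def majority_vote_curves(curves, tol=1):
    """Keep only the curves whose peak position belongs to the majority.

    Peaks within `tol` shift steps of one another are treated as the
    same position by bucketing the peak indices.
    """
    peaks = [int(np.argmax(c)) for c in curves]
    buckets = [p // (tol + 1) for p in peaks]   # coarse grouping of peaks
    winner, _ = Counter(buckets).most_common(1)[0]
    return [c for c, b in zip(curves, buckets) if b == winner]
```

Only the returned majority curves would then feed the comprehensive evaluation curve.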
5.  The imaging apparatus according to claim 3, wherein, when a plurality of correlation calculation curves whose maximum-evaluation-value positions fall within substantially the same range are obtained, the correlation calculation curve having the smaller phase difference at the position where the evaluation value is maximized is selected.
6.  The imaging apparatus according to claim 3, wherein, when a plurality of correlation calculation curves whose maximum-evaluation-value positions fall within substantially the same range are obtained, the correlation calculation curve of the divided area, among the plurality of divided areas, that contains the subject whose maximum-evaluation-value position indicates it is closer to the imaging element is selected.
7.  The imaging apparatus according to claim 1, wherein, in addition to the pair pixels of the first phase difference detection pixel and the second phase difference detection pixel that detect a phase difference in the horizontal direction, pair pixels of the first phase difference detection pixel and the second phase difference detection pixel that detect a phase difference in the vertical direction are provided in the phase difference detection area of the imaging element; the phase difference detection area is divided into a plurality of divided areas in a mesh shape; a horizontal correlation calculation curve and a vertical correlation calculation curve are obtained for each divided area; and the comprehensive evaluation curve is obtained based on the correlation calculation curves of the divided areas.
8.  The imaging apparatus according to claim 7, wherein, when, among the correlation calculation curves of the divided areas, the position giving the maximum evaluation value of the horizontal correlation calculation curve differs from that of the vertical correlation calculation curve in the same divided area, the comprehensive evaluation curve is obtained from the correlation calculation curves of the remaining divided areas, excluding the correlation calculation curves of that divided area.
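The exclusion step of claims 7-8 can be sketched as a consistency check between each mesh cell's horizontal and vertical curves. The tolerance on how far the two peak positions may disagree is an assumption, as the claim only requires that they "differ":

```python
import numpy as np

def drop_inconsistent_areas(h_curves, v_curves, tol=1):
    """For a mesh of divided areas, each with a horizontal and a vertical
    correlation calculation curve, discard any area whose two peak
    positions disagree by more than `tol` shift steps, and return the
    surviving (horizontal, vertical) curve pairs."""
    kept = []
    for h, v in zip(h_curves, v_curves):
        if abs(int(np.argmax(h)) - int(np.argmax(v))) <= tol:
            kept.append((h, v))
    return kept
```

The comprehensive evaluation curve would then be built only from the surviving pairs.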
9.  A focal position detection method for an imaging apparatus, the imaging apparatus comprising: an imaging element in which pair pixels, each composed of a pupil-divided first phase difference detection pixel and a pupil-divided second phase difference detection pixel, are two-dimensionally arranged in a phase difference detection area provided on a light receiving surface that captures an image of a subject; a focus lens that is placed upstream of the imaging element in the optical path and forms a focused optical image of the subject on the light receiving surface; and control means that obtains a phase difference between a first distribution curve, along one arrangement direction of the pair pixels, of the output signals of the first phase difference detection pixels and a second distribution curve, along the same arrangement direction, of the output signals of the second phase difference detection pixels, and that drive-controls the focus lens to the in-focus position based on the phase difference, the method comprising:
    dividing the phase difference detection area into a plurality of divided areas in a direction perpendicular to the phase difference detection direction;
    adding, for each divided area and for each first phase difference detection pixel arranged in the one arrangement direction within the divided area, the output signals of the plurality of first phase difference detection pixels aligned in a column in the perpendicular direction, and likewise adding, for each second phase difference detection pixel arranged in the one arrangement direction within the divided area, the output signals of the plurality of second phase difference detection pixels aligned in a column in the perpendicular direction;
    calculating, for each divided area, a correlation calculation curve by computing, for each of the first and second phase difference detection pixels, the correlation between a first addition signal obtained by adding the output signals of the first phase difference detection pixels and a second addition signal obtained by adding the output signals of the second phase difference detection pixels; and
    obtaining a comprehensive evaluation curve by performing a predetermined calculation process on an arbitrary plurality of the correlation calculation curves calculated for the respective divided areas, and obtaining from the comprehensive evaluation curve a defocus amount for drive-controlling the focus lens to the in-focus position.
10.  The focal position detection method according to claim 9, wherein, when the correlation calculation curves of the plurality of divided areas are calculated, a reliability evaluation of each correlation calculation curve is performed, and the comprehensive evaluation curve is obtained by applying the predetermined calculation process only to the correlation calculation curves of the divided areas whose reliability is higher than a predetermined threshold.
11.  The focal position detection method according to claim 9, wherein the comprehensive evaluation curve is obtained by selectively using only those correlation calculation curves of the divided areas whose reliability is at least a predetermined threshold, namely the curves in which the positions giving the maximum of an evaluation value, expressed as the absolute value of the difference between the first addition signal and the second addition signal, fall within substantially the same range.
12.  The focal position detection method according to claim 11, wherein a majority decision is taken among the positions giving the maximum evaluation value of the correlation calculation curves of the divided areas, and the comprehensive evaluation curve is obtained by selectively using only the correlation calculation curves of the divided areas belonging to the majority.
13.  The focal position detection method according to claim 11, wherein, when a plurality of correlation calculation curves whose maximum-evaluation-value positions fall within substantially the same range are obtained, the correlation calculation curve having the smaller phase difference at the position where the evaluation value is maximized is selected.
14.  The focal position detection method according to claim 11, wherein, when a plurality of correlation calculation curves whose maximum-evaluation-value positions fall within substantially the same range are obtained, the correlation calculation curve of the divided area, among the plurality of divided areas, that contains the subject whose maximum-evaluation-value position indicates it is closer to the imaging element is selected.
15.  The focal position detection method according to claim 9, wherein, in addition to the pair pixels of the first phase difference detection pixel and the second phase difference detection pixel that detect a phase difference in the horizontal direction, pair pixels of the first phase difference detection pixel and the second phase difference detection pixel that detect a phase difference in the vertical direction are provided in the phase difference detection area of the imaging element; the phase difference detection area is divided into a plurality of divided areas in a mesh shape; a horizontal correlation calculation curve and a vertical correlation calculation curve are obtained for each divided area; and the comprehensive evaluation curve is obtained based on the correlation calculation curves of the divided areas.
16.  The focal position detection method according to claim 15, wherein, when, among the correlation calculation curves of the divided areas, the position giving the maximum evaluation value of the horizontal correlation calculation curve differs from that of the vertical correlation calculation curve in the same divided area, the comprehensive evaluation curve is obtained from the correlation calculation curves of the remaining divided areas, excluding the correlation calculation curves of that divided area.
PCT/JP2011/076721 2010-11-30 2011-11-18 Imaging device and focal position detection method WO2012073727A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-267932 2010-11-30
JP2010267932A JP2014032214A (en) 2010-11-30 2010-11-30 Imaging apparatus and focus position detection method

Publications (1)

Publication Number Publication Date
WO2012073727A1 true WO2012073727A1 (en) 2012-06-07

Family

ID=46171671

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/076721 WO2012073727A1 (en) 2010-11-30 2011-11-18 Imaging device and focal position detection method

Country Status (2)

Country Link
JP (1) JP2014032214A (en)
WO (1) WO2012073727A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106133576B (en) * 2014-03-25 2018-09-11 富士胶片株式会社 Photographic device and focusing control method
JP6465562B2 (en) * 2014-04-30 2019-02-06 キヤノン株式会社 Imaging apparatus and imaging method
JP6506560B2 (en) 2015-01-20 2019-04-24 キヤノン株式会社 Focus control device and method therefor, program, storage medium
JP2017129788A (en) 2016-01-21 2017-07-27 キヤノン株式会社 Focus detection device and imaging device
JP6740019B2 (en) * 2016-06-13 2020-08-12 キヤノン株式会社 Imaging device and control method thereof
JP7279313B2 (en) * 2018-07-20 2023-05-23 株式会社ニコン Focus detection device and imaging device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008103885A (en) * 2006-10-18 2008-05-01 Nikon Corp Imaging device, focus detecting device, and imaging apparatus
JP2009086424A (en) * 2007-10-01 2009-04-23 Nikon Corp Imaging sensor and imaging device
JP2009244429A (en) * 2008-03-28 2009-10-22 Canon Inc Imaging apparatus
JP2010139520A (en) * 2008-12-09 2010-06-24 Canon Inc Hybrid af device and method for controlling the same
JP2010152161A (en) * 2008-12-25 2010-07-08 Canon Inc Imaging device
JP2010191883A (en) * 2009-02-20 2010-09-02 Nikon Corp Image-capturing device and image processing program
JP2010213038A (en) * 2009-03-11 2010-09-24 Nikon Corp Imaging apparatus

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015141081A1 (en) * 2014-03-18 2015-09-24 富士フイルム株式会社 Imaging device and focus control method
JP5982600B2 (en) * 2014-03-18 2016-08-31 富士フイルム株式会社 Imaging apparatus and focus control method
US9819853B2 (en) 2014-03-18 2017-11-14 Fujifilm Corporation Imaging device and focusing control method
CN109073859A (en) * 2016-04-07 2018-12-21 富士胶片株式会社 Focusing control apparatus, lens assembly, photographic device, focusing control method, focusing control program
CN109073859B (en) * 2016-04-07 2021-07-16 富士胶片株式会社 Focus control device, lens device, imaging device, focus control method, and focus control program
US11256064B2 (en) 2016-04-07 2022-02-22 Fujifilm Corporation Focusing control device, lens device, imaging device, focusing control method, focusing control program

Also Published As

Publication number Publication date
JP2014032214A (en) 2014-02-20

Similar Documents

Publication Publication Date Title
JP5493010B2 (en) Imaging apparatus and focus position detection method thereof
JP5493011B2 (en) Imaging apparatus and focus position detection method thereof
US9742984B2 (en) Image capturing apparatus and method of controlling the same
JP5572765B2 (en) Solid-state imaging device, imaging apparatus, and focusing control method
JP5860936B2 (en) Imaging device
JP5547349B2 (en) Digital camera
JP5159700B2 (en) Optical apparatus and focus detection method
EP2762942B1 (en) Solid-state image capture element, image capture device, and focus control method
WO2012073727A1 (en) Imaging device and focal position detection method
JP5361535B2 (en) Imaging device
JP5629832B2 (en) Imaging device and method for calculating sensitivity ratio of phase difference pixel
JP2014139679A (en) Imaging apparatus and method for controlling focusing of the same
JP2012215785A (en) Solid-state image sensor and image capturing apparatus
US20120099006A1 (en) Image pickup apparatus
EP2690873A1 (en) Color imaging element, imaging device, and imaging program
JP2015169708A (en) Image-capturing device, control method thereof, and control program
JP5539585B2 (en) Color imaging device, imaging device, and imaging program
EP2690875A1 (en) Color image sensor, imaging device, and control program for imaging device
WO2013069445A1 (en) Three-dimensional imaging device and image processing method
JP5634614B2 (en) Imaging device and imaging apparatus
JP5539583B2 (en) Color imaging device, imaging device, and imaging program
JP6748529B2 (en) Imaging device and imaging device

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11844931

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11844931

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP