JP2011033975A - Imaging apparatus - Google Patents

Imaging apparatus

Info

Publication number
JP2011033975A
Authority
JP
Japan
Prior art keywords
focus detection
pixel
pixels
array
imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2009182346A
Other languages
Japanese (ja)
Other versions
JP5381472B2 (en)
Inventor
Yosuke Kusaka
洋介 日下
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2009182346A priority Critical patent/JP5381472B2/en
Publication of JP2011033975A publication Critical patent/JP2011033975A/en
Application granted granted Critical
Publication of JP5381472B2 publication Critical patent/JP5381472B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical



Abstract

PROBLEM TO BE SOLVED: To obtain, for each of the data arrays of two focus detection pixel arrays on an image sensor using a pupil-division phase difference detection method, data at the intersection pixel position where the two focus detection pixel arrays cross each other.

SOLUTION: When a first focus detection pixel is arranged at the intersection pixel position where the first focus detection pixel array and the second focus detection pixel array cross each other, among the data array α14, α24, ..., α84 of the first focus detection pixel array, the data α44 of the first focus detection pixel array at the intersection pixel position is obtained as the own data of the first focus detection pixel at that position. Among the data array β41, β42, ..., β48 of the second focus detection pixel array, the data β44 of the second focus detection pixel array at the intersection pixel position is obtained on the basis of the data α44 at the intersection pixel position and the data α34, α54, β43, and β45 at the pixel positions adjacent to the intersection pixel position.

Description

The present invention relates to an imaging apparatus.

An imaging apparatus is known that includes an image sensor in which imaging pixels and pupil-division focus detection pixels are intermixed (see, for example, Patent Document 1). In such an apparatus, the pupil-division focus detection pixels are arranged along a straight line, and the focus adjustment state of the image formed on the focus detection pixel array is detected on the basis of the output of that array.

Patent Document 1: JP 2008-224801 A

However, with the conventional imaging apparatus described above, the focus adjustment state cannot be detected when the image has no contrast variation in the direction of the focus detection pixel array. To avoid this problem, focus detection pixel arrays extending in a plurality of directions could be arranged so that they intersect, but at a pixel position where the arrays cross, only a focus detection pixel belonging to one of the arrays can be placed. Consequently, when the focus adjustment state is detected with any of the other focus detection pixel arrays, the output of the focus detection pixel at the intersection pixel position is missing, and the detection accuracy of the focus adjustment state deteriorates.

An imaging apparatus according to claim 1 comprises: an image sensor in which a plurality of imaging pixels, each outputting an imaging signal relating to a subject image, and a plurality of focus detection pixels, each outputting a focus detection signal relating to a divided pupil image, are arranged in a two-dimensional array, and in which a plurality of focus detection pixel arrays, each formed by arranging focus detection pixels adjacent to one another along a straight line, intersect at a predetermined pixel position; focus detection signal acquisition means for acquiring, on the basis of the focus detection signal output by at least one of the focus detection pixels arranged in the vicinity of the predetermined pixel position and the focus detection signal at the predetermined pixel position, specific focus detection signal data at the predetermined pixel position included in the focus detection signal data array corresponding to each of the focus detection pixel arrays; and focus detection means for detecting the focus adjustment state of the subject image by a pupil-division phase difference detection method on the basis of the focus detection signal data arrays.

According to the present invention, the above drawback is overcome and the focus adjustment state can be detected with high accuracy in a plurality of directions.

Cross-sectional view showing the configuration of the digital still camera of an embodiment.
Diagram showing the focus detection positions on the shooting screen of the interchangeable lens.
Front view showing the detailed configuration of the image sensor.
Diagram showing the shape of the microlenses of the imaging pixels and focus detection pixels.
Front view of an imaging pixel.
Front view of focus detection pixels.
Cross-sectional view of an imaging pixel.
Cross-sectional view of focus detection pixels.
Diagram for explaining the imaging light flux received by an imaging pixel.
Diagram for explaining the light flux received by a focus detection pixel.
Flowchart showing the imaging operation of the digital still camera.
Diagram showing the relationship of the correlation amount C(k) to the shift amount k of a pair of data.
Diagram showing an 8 × 8 pixel region centered on the pixel position where the first and second focus detection pixel arrays intersect.
Front view showing the detailed configuration of an image sensor.
Diagram showing an 8 × 8 pixel region centered on the pixel position where the first and second focus detection pixel arrays intersect.
Front view showing the detailed configuration of an image sensor.
Front view of focus detection pixels.
Cross-sectional view of focus detection pixels.
Diagram showing an 8 × 8 pixel region centered on the pixel position where the first and second focus detection pixel arrays intersect.
Front view showing the detailed configuration of an image sensor.
Diagram showing an 8 × 8 pixel region centered on the pixel position where the first and second focus detection pixel arrays intersect.
Diagram showing the focus detection positions on the shooting screen of the interchangeable lens.
Front view showing the detailed configuration of an image sensor.
Front view of focus detection pixels.
Diagram showing an 8 × 8 pixel region centered on the pixel position where the first and second focus detection pixel arrays intersect.
Front view showing the detailed configuration of an image sensor.
Front view showing the detailed configuration of an image sensor.
Front view of focus detection pixels and imaging pixels.

As an imaging apparatus according to an embodiment, an interchangeable-lens digital still camera is described as an example. FIG. 1 is a cross-sectional view showing the configuration of the digital still camera of the present embodiment. The digital still camera 201 of the present embodiment comprises an interchangeable lens 202 and a camera body 203, and the interchangeable lens 202 is attached to the camera body 203 via a mount unit 204. Interchangeable lenses 202 having various photographing optical systems can be attached to the camera body 203 via the mount unit 204.

The interchangeable lens 202 includes a lens 209, a zooming lens 208, a focusing lens 210, an aperture 211, a lens drive control device 206, and the like. The lens drive control device 206 comprises a microcomputer, a memory, a drive control circuit, and the like (not shown). It performs drive control for adjusting the focus of the focusing lens 210 and the opening diameter of the aperture 211, detects the states of the zooming lens 208, the focusing lens 210, and the aperture 211, and, by communicating with a body drive control device 214 described later, transmits lens information (described later) and receives camera information (defocus amount, aperture value, and so on). The aperture 211 forms an opening whose diameter is variable about the optical axis in order to adjust the amount of light and the amount of blur.

The camera body 203 includes an image sensor 212, the body drive control device 214, a liquid crystal display element drive circuit 215, a liquid crystal display element 216, an eyepiece lens 217, a memory card 219, and the like. In the image sensor 212, imaging pixels are arranged two-dimensionally, and focus detection pixels are incorporated in the portions corresponding to focus detection positions (focus detection areas). The image sensor 212 is described in detail later.

The body drive control device 214 comprises a microcomputer, a memory, a drive control circuit, and the like. It repeatedly performs drive control of the image sensor 212, reading of the image signal and the focus detection signals, focus detection computation based on the focus detection signals, and focus adjustment of the interchangeable lens 202, and also performs processing and recording of the image signal, operation control of the digital still camera 201, and so on. The body drive control device 214 also communicates with the lens drive control device 206 via an electrical contact 213 to receive lens information and transmit camera information.

The liquid crystal display element 216 functions as an electronic viewfinder (EVF). The liquid crystal display element drive circuit 215 displays a through image from the image sensor 212 on the liquid crystal display element 216, and the photographer can observe the through image via the eyepiece lens 217. The memory card 219 is an image storage that stores images captured by the image sensor 212.

A subject image is formed on the light receiving surface of the image sensor 212 by the light flux that has passed through the interchangeable lens 202. The subject image is photoelectrically converted by the image sensor 212, and an image signal and focus detection signals are sent to the body drive control device 214.

The body drive control device 214 calculates a defocus amount on the basis of the focus detection signals from the focus detection pixels of the image sensor 212 and sends this defocus amount to the lens drive control device 206. The body drive control device 214 also processes the image signal from the image sensor 212 to generate image data, which is stored in the memory card 219, and sends a through image signal from the image sensor 212 to the liquid crystal display element drive circuit 215 so that the through image is displayed on the liquid crystal display element 216. Furthermore, the body drive control device 214 sends aperture control information to the lens drive control device 206 to control the opening of the aperture 211.

The lens drive control device 206 updates the lens information according to the focusing state, zooming state, aperture setting state, maximum aperture F-number, and so on. Specifically, it detects the positions of the zooming lens 208 and the focusing lens 210 and the aperture value of the aperture 211, and either computes the lens information from these lens positions and the aperture value or selects the lens information corresponding to the lens positions and aperture value from a lookup table prepared in advance.

The lens drive control device 206 calculates a lens drive amount on the basis of the received defocus amount and drives the focusing lens 210 to the in-focus position according to the lens drive amount. The lens drive control device 206 also drives the aperture 211 according to the received aperture value.

FIG. 2 shows the focus detection positions (focus detection areas) on the shooting screen of the interchangeable lens 202, that is, an example of the regions (focus detection areas, focus detection positions) in which the focus detection pixel rows on the image sensor 212, described later, sample the image on the shooting screen during focus detection. In this example, cross-shaped focus detection areas 101 to 103 are arranged at three positions on the rectangular shooting screen 100: at the center (on the optical axis) and at the left and right. Focus detection pixels are arranged linearly in the horizontal and vertical directions along the longitudinal directions of the focus detection areas shown as rectangles.

FIG. 3 is a front view showing the detailed configuration of the image sensor 212, with the pixel array in the vicinity of the focus detection area 102 in FIG. 2 shown enlarged. Imaging pixels 310 are densely arranged on the image sensor 212 in a two-dimensional square lattice. The imaging pixels 310 consist of red pixels (R), green pixels (G), and blue pixels (B), arranged according to the Bayer array rule. For focus detection in the vertical direction, focus detection pixels 313 and 314, which have the same pixel size as the imaging pixels, are arranged alternately and consecutively on a vertical straight line where green pixels and blue pixels would otherwise be arranged consecutively. Likewise, for focus detection in the horizontal direction, focus detection pixels 315 and 316, which have the same pixel size as the imaging pixels, are arranged alternately and consecutively on a horizontal straight line where green pixels and red pixels would otherwise be arranged consecutively.

At the pixel position where the horizontal focus detection pixel array and the vertical focus detection pixel array intersect (a position where a green pixel would be placed under the Bayer array rule), a focus detection pixel 313 for vertical focus detection is arranged. The pixel arrays in the focus detection areas 101 and 103 are basically the same as the array shown in FIG. 3.
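The arrangement rule described above can be illustrated with the following minimal sketch in Python. The coordinate origin, the parity convention, and the pixel-type labels are assumptions chosen so that the crossing falls on a green site of the Bayer pattern, the vertical line replaces green/blue sites, and the horizontal line replaces green/red sites, as in FIG. 3.

def pixel_type(row, col, cross_row, cross_col):
    if row == cross_row and col == cross_col:
        return "313"                    # intersection: a vertical-array focus detection pixel
    if col == cross_col:                # vertical focus detection line (green/blue column)
        return "313" if (row - cross_row) % 2 == 0 else "314"
    if row == cross_row:                # horizontal focus detection line (green/red row)
        return "316" if (col - cross_col) % 2 == 0 else "315"
    # ordinary imaging pixels follow the Bayer rule
    if row % 2 == cross_row % 2:
        return "G" if col % 2 == cross_col % 2 else "R"
    return "B" if col % 2 == cross_col % 2 else "G"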

FIG. 4 shows the shape of the microlens 10 of the imaging pixels and focus detection pixels. The microlens 10 has a shape cut out of a circular microlens 9, which is originally larger than the pixel size, into a square corresponding to the pixel size; the cross section along a diagonal passing through the optical axis of the microlens 10 and the cross section along a horizontal line passing through the optical axis are shown in FIGS. 4(a) and 4(b), respectively.

As shown in FIG. 5, the imaging pixel 310 consists of the rectangular microlens 10, a photoelectric conversion unit 11 whose light receiving area is limited to a square by a light shielding mask described later, and a color filter (not shown). There are three types of color filters, red (R), green (G), and blue (B), each having its own spectral sensitivity characteristic. Imaging pixels 310 provided with these color filters are arranged on the image sensor 212 in a Bayer array.

The focus detection pixels 313, 314, 315, and 316 are provided with a white filter that transmits all visible light so that focus detection can be performed for all colors; the white filter has a spectral sensitivity characteristic corresponding to the sum of the spectral sensitivity characteristics of the green, red, and blue pixels. In other words, the wavelength region in which the focus detection pixels 313, 314, 315, and 316 show high spectral sensitivity encompasses the wavelength regions in which the green, red, and blue pixels show high spectral sensitivity.

As shown in the front view of FIG. 6(a), the focus detection pixel 313 consists of the rectangular microlens 10, a photoelectric conversion unit 13 whose light receiving area is limited by the light shielding mask described later to the upper half of a square (the upper half obtained by bisecting the square with a horizontal line), and a white filter (not shown).

As shown in the front view of FIG. 6(b), the focus detection pixel 314 consists of the rectangular microlens 10, a photoelectric conversion unit 14 whose light receiving area is limited by the light shielding mask described later to the lower half of a square (the lower half obtained by bisecting the square with a horizontal line), and a white filter (not shown).

When the front views of the focus detection pixels 313 and 314 are superimposed with the microlens 10 as a reference, the photoelectric conversion units 13 and 14, whose light receiving areas are limited by the light shielding mask, are aligned in the vertical direction. In FIGS. 6(a) and 6(b), if the remaining half of the square (the broken-line portion) is added to the half-square light receiving area, the result is a square of the same size as the light receiving area of an imaging pixel.

As shown in the front view of FIG. 6(c), the focus detection pixel 315 consists of the rectangular microlens 10, a photoelectric conversion unit 15 whose light receiving area is limited by the light shielding mask described later to the left half of a square (the left half obtained by bisecting the square with a vertical line), and a white filter (not shown).

As shown in the front view of FIG. 6(d), the focus detection pixel 316 consists of the rectangular microlens 10, a photoelectric conversion unit 16 whose light receiving area is limited by the light shielding mask described later to the right half of a square (the right half obtained by bisecting the square with a vertical line), and a white filter (not shown).

When the front views of the focus detection pixels 315 and 316 are superimposed with the microlens 10 as a reference, the photoelectric conversion units 15 and 16, whose light receiving areas are limited by the light shielding mask, are aligned in the horizontal direction. In FIGS. 6(c) and 6(d), if the remaining half of the square (the broken-line portion) is added to the half-square light receiving area, the result is a square of the same size as the light receiving area of an imaging pixel.

FIG. 7 is a cross-sectional view of the imaging pixel 310, taken along a vertical straight line through the imaging pixel array. In the imaging pixel 310, a light shielding mask 30 is formed close above the photoelectric conversion unit 11 for imaging, and the photoelectric conversion unit 11 receives the light that has passed through an opening 30a of the light shielding mask 30. A planarizing layer 31 is formed on the light shielding mask 30, and a color filter 38 is formed on it. A planarizing layer 32 is formed on the color filter 38, and the microlens 10 is formed on it. The shape of the opening 30a is projected forward by the microlens 10. The photoelectric conversion unit 11 is formed on a semiconductor circuit substrate 29.

FIG. 8 is a cross-sectional view of the focus detection pixels 313 and 314, taken along a vertical straight line through the focus detection pixel array consisting of the focus detection pixels 313 and 314. In the focus detection pixels 313 and 314, the light shielding mask 30 is formed close above the photoelectric conversion units 13 and 14 for focus detection, and the photoelectric conversion units 13 and 14 receive the light that has passed through openings 30b and 30c of the light shielding mask 30. The planarizing layer 31 is formed on the light shielding mask 30, and a white filter 39 is formed on it. The planarizing layer 32 is formed on the white filter 39, and the microlens 10 is formed on it. The shapes of the openings 30b and 30c are projected forward by the microlens 10. The photoelectric conversion units 13 and 14 are formed on the semiconductor circuit substrate 29. The structure of the focus detection pixels 315 and 316 is simply the structure of the focus detection pixels 313 and 314 rotated by 90 degrees, and is basically the same as the focus detection pixel structure shown in FIG. 8.

FIG. 9 illustrates the imaging light flux received by the imaging pixel 310 shown in FIGS. 3 and 7, with the imaging pixel array sectioned along a vertical straight line. The photoelectric conversion units 11 of all the imaging pixels arranged on the image sensor receive the light flux that has passed through the light shielding mask opening 30a arranged close to the photoelectric conversion unit 11. The shape of the light shielding mask opening 30a is projected by the microlens 10 of each imaging pixel onto a region 95, common to all imaging pixels, on the exit pupil 90 located at the distance-measuring pupil distance d from the microlens 10.

Accordingly, the photoelectric conversion unit 11 of each imaging pixel receives the light flux 71 that passes through the region 95 and the microlens 10 of that imaging pixel, and outputs a signal corresponding to the intensity of the image formed on the microlens 10 by the light flux 71 passing through the region 95 toward the microlens 10 of that imaging pixel.

FIG. 10 illustrates, in comparison with FIG. 9, the focus detection light fluxes received by the focus detection pixels 313 and 314 shown in FIGS. 3 and 8, with the focus detection pixel array sectioned along a vertical straight line.

The photoelectric conversion units 13 and 14 of all the focus detection pixels arranged on the image sensor receive the light fluxes that have passed through the light shielding mask openings 30b and 30c arranged close to the photoelectric conversion units 13 and 14. The shape of the light shielding mask opening 30b is projected by the microlens 10 of each focus detection pixel 313 onto a region 93, common to all the focus detection pixels 313, on the exit pupil 90 located at the distance-measuring pupil distance d from the microlens 10. Likewise, the shape of the light shielding mask opening 30c is projected by the microlens 10 of each focus detection pixel 314 onto a region 94, common to all the focus detection pixels 314, on the exit pupil 90 located at the distance-measuring pupil distance d from the microlens 10. The pair of regions 93 and 94 is called the distance-measuring pupils.

Accordingly, the photoelectric conversion unit 13 of each focus detection pixel 313 receives the light flux 73 that passes through the distance-measuring pupil 93 and the microlens 10 of that pixel, and outputs a signal corresponding to the intensity of the image formed on the microlens 10 by the light flux 73 passing through the distance-measuring pupil 93 toward the microlens 10. Similarly, the photoelectric conversion unit 14 of each focus detection pixel 314 receives the light flux 74 that passes through the distance-measuring pupil 94 and the microlens 10 of that pixel, and outputs a signal corresponding to the intensity of the image formed on the microlens 10 by the light flux 74 passing through the distance-measuring pupil 94 toward the microlens 10.

The region obtained by combining the distance-measuring pupils 93 and 94 on the exit pupil 90, through which the light fluxes 73 and 74 received by the pair of focus detection pixels 313 and 314 pass, coincides with the region 95 on the exit pupil 90 through which the light flux 71 received by the imaging pixel 310 passes; on the exit pupil 90, the light fluxes 73 and 74 are complementary to the light flux 71.

By arranging a large number of the above-described pairs of focus detection pixels 313 and 314 alternately in a straight line and grouping the outputs of the photoelectric conversion units of the focus detection pixels into a pair of output groups corresponding to the distance-measuring pupils 93 and 94, information on the intensity distributions of the pair of images formed on the focus detection pixel array (in the vertical direction) by the pair of light fluxes passing through the distance-measuring pupils 93 and 94 is obtained. By applying the image shift detection computation (correlation computation, phase difference detection) described later to this information, the image shift amount of the pair of images is detected by the so-called pupil-division phase difference detection method. Further, by applying to the image shift amount a conversion corresponding to the proportional relationship between the centroid separation of the pair of distance-measuring pupils and the distance-measuring pupil distance, the deviation (defocus amount) between the planned image-forming plane and the actual image-forming plane at the focus detection position (vertical direction) is calculated.

The focus detection light fluxes received by the focus detection pixels 315 and 316 are simply the pair of focus detection light fluxes 73 and 74 received by the focus detection pixels 313 and 314 rotated by 90 degrees, and are basically the same as the light fluxes shown in FIG. 10; a pair of distance-measuring pupils obtained by rotating the distance-measuring pupils 93 and 94 by 90 degrees is set. By arranging a large number of pairs of focus detection pixels 315 and 316 alternately in a straight line and grouping the outputs of their photoelectric conversion units into a pair of output groups corresponding to the pair of distance-measuring pupils, information on the intensity distributions of the pair of images formed on the focus detection pixel array (in the horizontal direction) by the pair of light fluxes passing through the pair of distance-measuring pupils is obtained. Based on this information, the deviation (defocus amount) between the planned image-forming plane and the actual image-forming plane at the focus detection position (horizontal direction) is calculated.

FIG. 11 is a flowchart showing the imaging operation of the digital still camera 201. When the power of the digital still camera 201 is turned on in step S100, the body drive control device 214 starts the imaging operation from step S110 onward. In step S110, all pixel data are read out from the image sensor 212. In the following step S120, data obtained by thinning out the imaging pixel data are displayed on the electronic viewfinder. In step S130, in the selected focus detection area, the data of a virtual second focus detection pixel at the pixel position where the second focus detection pixel array (focus detection pixels 315 and 316 arranged in the horizontal direction) intersects the first focus detection pixel array is interpolated on the basis of the first focus detection pixels (focus detection pixels 313 and 314 arranged in the vertical direction). Details are described later. It is assumed that the photographer has selected one of the focus detection areas 101 to 103 in advance using a focus detection area selection member (not shown).

In step S135, the image shift detection computation (correlation computation, phase difference detection) described later and the defocus conversion computation are performed on the basis of the pair of image data of the first focus detection pixel array and the pair of image data of the second focus detection pixel array including the virtual second focus detection pixel, and the defocus amounts in the vertical and horizontal directions in the selected focus detection area are calculated. The defocus amounts in the two directions are then averaged to obtain the final defocus amount. If the reliability of the defocus amount in one direction is low, or if that defocus amount could not be calculated, the defocus amounts are not averaged and the other defocus amount is adopted. If the reliability of the defocus amounts is low in both directions, or if neither could be calculated, focus detection is impossible.
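The combination rule just described can be summarized as in the following minimal sketch, where None stands for a result that is unreliable or could not be calculated; the function name is illustrative, not from the patent.

def combine_defocus(defocus_vertical, defocus_horizontal):
    usable = [d for d in (defocus_vertical, defocus_horizontal) if d is not None]
    if not usable:
        return None                       # focus detection impossible
    return sum(usable) / len(usable)      # average, or the single reliable value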

In step S140, it is determined whether the camera is near the in-focus state, that is, whether the absolute value of the calculated defocus amount is within a predetermined value. If it is determined that the camera is not near the in-focus state, the process proceeds to step S150, the defocus amount is transmitted to the lens drive control device 206, and the focusing lens 210 of the interchangeable lens 202 is driven to the in-focus position. The process then returns to step S110 and the above operation is repeated.

When focus detection is impossible, the process also branches to this step, a scan drive command is transmitted to the lens drive control device 206, and the focusing lens 210 of the interchangeable lens 202 is scan-driven between infinity and the close-up end. The process then returns to step S110 and the above operation is repeated.

If it is determined in step S140 that the camera is near the in-focus state, the process proceeds to step S160, where it is determined whether a shutter release has been performed by operating a shutter button (not shown). If it is determined that no shutter release has been performed, the process returns to step S110 and the above operation is repeated. If it is determined that a shutter release has been performed, the process proceeds to step S170, an aperture adjustment command is transmitted to the lens drive control device 206, and the aperture value of the interchangeable lens 202 is set to the control F-number (an F-number set by the photographer or automatically). When the aperture control is completed, the image sensor 212 is caused to perform an imaging operation, and image data are read out from the imaging pixels 310 and all the focus detection pixels 313, 314, 315, and 316 of the image sensor 212.

In step S180, the data of a virtual imaging pixel at each pixel position in the focus detection pixel rows are interpolated on the basis of the data of the imaging pixels 310 surrounding the focus detection pixels 313, 314, 315, and 316 and the data of the focus detection pixels 313, 314, 315, and 316. This interpolation is described later. In the following step S190, image data consisting of the data of the imaging pixels 310 and the interpolated imaging pixel data are stored in the memory card 219, and the process returns to step S110 to repeat the above operation.

Details of the image shift detection computation (correlation computation, phase difference detection) in step S135 of FIG. 11 are described below. For simplicity, the processing for one of the two focus detection pixel arrays (the first focus detection pixel array and the second focus detection pixel array) is described; the processing for the other is the same.

In the pair of images detected by the focus detection pixels 313 and 314, or by the focus detection pixels 315 and 316, the light amount balance may be disturbed because the distance-measuring pupils are vignetted by the aperture opening of the lens. A type of correlation computation that maintains the image shift detection accuracy against such a light amount imbalance is therefore applied.
The correlation formula (1) disclosed in JP 2007-333720 A is applied to the pair of data strings (A1_1 to A1_M and A2_1 to A2_M, where M is the number of data) read out from the focus detection pixel rows, and the correlation amount C(k) is computed.

C(k) = Σ |A1_n × A2_(n+1+k) − A2_(n+k) × A1_(n+1)|   (1)

In equation (1), the Σ operation is accumulated over n, and the range of n is limited to the range in which the data A1_n, A1_(n+1), A2_(n+k), and A2_(n+1+k) exist for the image shift amount k. The image shift amount k is an integer and is a relative shift amount in units of the data interval of the data strings.
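As an illustration, the following minimal sketch in Python (function and variable names are hypothetical; it is not the reference implementation of the patent or of JP 2007-333720 A) evaluates equation (1) over a range of integer shift amounts, accumulating only those n for which all four samples exist.

def correlation_amounts(A1, A2, k_min, k_max):
    """Compute C(k) of equation (1) for integer shifts k_min..k_max (0-based arrays)."""
    M = len(A1)
    C = {}
    for k in range(k_min, k_max + 1):
        total = 0.0
        terms = 0
        for n in range(M - 1):                      # A1[n] and A1[n+1] must exist
            if 0 <= n + k and n + 1 + k < M:        # A2[n+k] and A2[n+1+k] must exist
                total += abs(A1[n] * A2[n + 1 + k] - A2[n + k] * A1[n + 1])
                terms += 1
        if terms > 0:
            C[k] = total
    return C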

As shown in FIG. 12(a), the result of equation (1) is that the correlation amount C(k) takes a minimum (the smaller the value, the higher the degree of correlation) at the shift amount at which the correlation between the pair of data is high (k = kj = 2 in FIG. 12(a)). Using the three-point interpolation method of equations (2) to (5), the shift amount ks that gives the minimum value C(ks) of the continuous correlation amount is obtained.

ks = kj + D/SLOP   (2)
C(ks) = C(kj) − |D|   (3)
D = {C(kj−1) − C(kj+1)}/2   (4)
SLOP = MAX{C(kj+1) − C(kj), C(kj−1) − C(kj)}   (5)
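A minimal sketch of the three-point interpolation of equations (2) to (5), assuming C is the dictionary of correlation amounts computed above and kj is the integer shift at which C is smallest; the names are illustrative, not from the patent.

def three_point_interpolation(C, kj):
    D = (C[kj - 1] - C[kj + 1]) / 2.0                     # equation (4)
    SLOP = max(C[kj + 1] - C[kj], C[kj - 1] - C[kj])      # equation (5)
    ks = kj + D / SLOP                                    # equation (2)
    C_ks = C[kj] - abs(D)                                 # equation (3)
    return ks, C_ks, SLOP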

Whether the shift amount ks calculated by equation (2) is reliable is determined as follows. As shown in FIG. 12(b), when the degree of correlation between the pair of data is low, the interpolated minimum value C(ks) of the correlation amount becomes large. Therefore, if C(ks) is equal to or greater than a predetermined threshold, the reliability of the calculated shift amount is judged to be low, and the calculated shift amount ks is canceled.

Alternatively, in order to normalize C(ks) by the contrast of the data, if the value obtained by dividing C(ks) by SLOP, which is proportional to the contrast, is equal to or greater than a predetermined value, the reliability of the calculated shift amount is judged to be low and the calculated shift amount ks is canceled. Or, if SLOP, which is proportional to the contrast, is equal to or smaller than a predetermined value, the subject is judged to have low contrast, the reliability of the calculated shift amount is judged to be low, and the calculated shift amount ks is canceled.
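The reliability tests just described can be summarized as in the following sketch; the three thresholds are only "predetermined values" in the text, so they are left as parameters here.

def shift_is_reliable(C_ks, SLOP, c_max, ratio_max, slop_min):
    if C_ks >= c_max:             # interpolated minimum too large: low correlation
        return False
    if SLOP <= slop_min:          # low-contrast subject
        return False
    if C_ks / SLOP >= ratio_max:  # contrast-normalized minimum too large
        return False
    return True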

As shown in FIG. 12(c), when the degree of correlation between the pair of data is low and there is no dip in the correlation amount C(k) over the shift range kmin to kmax, the minimum value C(ks) cannot be obtained; in such a case it is judged that focus detection is impossible.

If the calculated shift amount ks is judged to be reliable, it is converted into the image shift amount shft by equation (6).

shft = PY × ks   (6)

In equation (6), PY is twice the pixel pitch of the focus detection pixels 313 and 314, or of the focus detection pixels 315 and 316 (the detection pitch). The image shift amount calculated by equation (6) is multiplied by a predetermined conversion coefficient Kd to convert it into the defocus amount def as expressed by equation (7). The conversion coefficient Kd is the value obtained by dividing the distance-measuring pupil distance d by the centroid separation of the pair of distance-measuring pupils.

def = Kd × shft   (7)
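A minimal sketch of equations (6) and (7); the pixel pitch, the distance-measuring pupil distance d, and the centroid separation of the distance-measuring pupils are passed in as parameters because their values depend on the sensor and the lens, and the function name is illustrative.

def shift_to_defocus(ks, pixel_pitch, pupil_distance_d, centroid_separation):
    PY = 2.0 * pixel_pitch                       # detection pitch
    shft = PY * ks                               # equation (6): image shift amount
    Kd = pupil_distance_d / centroid_separation  # conversion coefficient
    return Kd * shft                             # equation (7): defocus amount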

Details of the focus detection pixel interpolation in step S130 of FIG. 11 are described below. FIG. 13 shows an 8 × 8 pixel region centered on the pixel position where the first focus detection pixel array and the second focus detection pixel array intersect. The data of the first focus detection pixel array are denoted α14, α24, ..., α84 as in FIG. 13, and the data of the second focus detection pixel array are denoted β41, β42, ..., β48. Since a first focus detection pixel is arranged at the intersection pixel position (that is, no second focus detection pixel is arranged there), the data that a second focus detection pixel 316 would produce if it were virtually arranged at this pixel position is denoted β44.

As described above, the light flux obtained by adding the focus detection light flux 73 received by the first focus detection pixel 313 and the focus detection light flux 74 received by the first focus detection pixel 314 is equal to the light flux obtained by adding the focus detection light flux received by the second focus detection pixel 315 and the focus detection light flux received by the second focus detection pixel 316.

Therefore, the data γ obtained by adding the data generated by the focus detection light flux 73 received by the first focus detection pixel 313 at the intersection pixel position and the data that would be generated by the focus detection light flux 74 if a first focus detection pixel 314 were virtually arranged at that intersection pixel position is calculated from the data of the first focus detection pixels at and near the intersection pixel position as follows.

γ = α44 + (α34 + α54)/2   (8)

On the other hand, the data δ generated by the light flux that a second focus detection pixel 315 would receive at the intersection pixel position is calculated by averaging the data of the second focus detection pixels 315 on either side of the intersection pixel position, as follows.

δ = (β43 + β45)/2   (9)

From the above, the data β44 for the case where a second focus detection pixel 316 is virtually arranged at the intersection pixel position is obtained as follows.

β44 = γ − δ = α44 + (α34 + α54)/2 − (β43 + β45)/2   (10)
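A minimal sketch of equations (8) to (10), using the data names of FIG. 13; the function name is illustrative.

def interpolate_beta44(a34, a44, a54, b43, b45):
    gamma = a44 + (a34 + a54) / 2.0   # equation (8): summed first-array flux at the crossing
    delta = (b43 + b45) / 2.0         # equation (9): estimated pixel-315 datum at the crossing
    return gamma - delta              # equation (10): virtual pixel-316 datum beta44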

Since the data β44 obtained by equation (10) is determined using the data at the intersection pixel position and the data of the pixels adjacent to the intersection pixel position, the interpolation accuracy is markedly improved compared with simply averaging the data of the focus detection pixels 316 near the intersection pixel position, for example β44 = (β42 + β46)/2, and the accuracy of the focus detection computation using this interpolated data is also improved.

Details of the imaging pixel interpolation in step S180 of FIG. 11 are described below. In FIG. 13, when the imaging pixels are arranged according to the Bayer array rule, a green pixel would be placed at the intersection pixel position. Denoting the data of a green pixel virtually arranged at the intersection pixel position by G44, G44 can be obtained by averaging the four green pixel data G33, G35, G53, and G55 nearest to the intersection pixel position.

G44 = (G33 + G35 + G53 + G55)/4   (11)

The data of a green pixel virtually arranged according to the Bayer array rule at a position on the focus detection pixel arrays other than the intersection pixel position can likewise be obtained by averaging the data of the four nearest green pixels. The data of a red or blue pixel virtually arranged according to the Bayer array rule on the focus detection pixel arrays other than the intersection pixel position is basically obtained by averaging the data of the two red or blue pixels nearest to the virtual pixel position. For example, denoting the data of a blue pixel virtually arranged at the pixel position immediately below the intersection pixel position by B54, B54 is obtained by the following equation.

B54 = (B52 + B56)/2   (12)

If there are also blue pixels at the nearest diagonal positions, they may be included in the average. For example, denoting the data of a blue pixel virtually arranged at the pixel position immediately below the intersection pixel position by B54, B54 is obtained by the following equation.

B54 = (B52 + B56 + B32 + B36 + B72 + B76)/6   (13)

Also, for example, denoting the data of a red pixel virtually arranged at the pixel position immediately to the right of the intersection pixel position by R45, R45 is obtained by the following equation.

R45 = (R25 + R65)/2   (14)

If there are also red pixels at the nearest diagonal positions, they may be included in the average. For example, denoting the data of a red pixel virtually arranged at the pixel position immediately to the right of the intersection pixel position by R45, R45 is obtained by the following equation.

R45 = (R25 + R65 + R23 + R63 + R27 + R67)/6   (15)

In the imaging pixel interpolation above, the data of imaging pixels of the same color near the virtual imaging pixel are used, but the data of the focus detection pixels may also be used. For example, let B54 be the data of a blue pixel virtually arranged at the pixel position immediately below the intersection pixel position. Assuming that the hue does not change greatly in the vicinity of this pixel position, the data B54 can be obtained as follows by multiplying the luminance data at this pixel position obtained from the focus detection pixel data, (α54 + (α44 + α64)/2), by the blue/white component ratio obtained from the data of the neighboring imaging pixels, (B52 + B56)/(B52 + B56 + G53 + G55 + R63 + R65).

B54 = (α54 + (α44 + α64)/2) × (B52 + B56)/(B52 + B56 + G53 + G55 + R63 + R65)   (16)
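A minimal sketch of the imaging pixel interpolation of equations (11), (12) and (16); the arguments are the pixel data named in FIG. 13, and the function names are illustrative.

def interpolate_G44(G33, G35, G53, G55):
    return (G33 + G35 + G53 + G55) / 4.0     # equation (11)

def interpolate_B54(B52, B56):
    return (B52 + B56) / 2.0                 # equation (12)

def interpolate_B54_from_focus_pixels(a44, a54, a64, B52, B56, G53, G55, R63, R65):
    luminance = a54 + (a44 + a64) / 2.0      # white-light level at the B54 position
    blue_ratio = (B52 + B56) / (B52 + B56 + G53 + G55 + R63 + R65)
    return luminance * blue_ratio            # equation (16)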

In this way, by using the focus detection pixel data at and near the pixel position of the virtual imaging pixel, the imaging pixel interpolation accuracy for images containing high spatial frequency components can be improved.

When a plurality of focus detection pixel arrays intersect as in the above embodiment, the data of the focus detection pixel at the intersection pixel position can be interpolated with high accuracy, so that focus detection can be performed accurately in a plurality of directions.

Furthermore, since the imaging pixels are arranged according to the Bayer array rule and the intersection pixel position of the plurality of focus detection pixel arrays is a pixel position at which a green pixel would be placed, the data of a virtual imaging pixel at the intersection pixel position can also be interpolated with high accuracy.

また画面中心(撮影光軸)から離れた焦点検出エリアにおいて瞳分割型の焦点検出を行う場合には、一対の焦点検出光束の並び方向が焦点検出エリアと画面中心を結ぶ直線に直交する方向に並ぶほうが、交換レンズの射出瞳位置が変化した場合に、一対の焦点検出光束の口径蝕のアンバランスが生じにくいので有利である。上記実施形態においては画面中心(撮影光軸)から離れた焦点検出エリア102、103において、交差画素位置には焦点検出エリアと画面中心を結ぶ直線に対し直交する方向(角度が大きい方向)に配列された第1の焦点検出画素配列に属する焦点検出画素313を配置することにより、像高の高い焦点検出エリアにおいて焦点検出光束のアンバランスを防止し、確実に焦点検出を行うことができる。   When pupil-division focus detection is performed in a focus detection area away from the screen center (imaging optical axis), it is advantageous for the pair of focus detection light beams to be aligned in the direction orthogonal to the straight line connecting the focus detection area and the screen center, because an imbalance in the vignetting of the pair of focus detection light beams is then less likely to occur when the exit pupil position of the interchangeable lens changes. In the above embodiment, in the focus detection areas 102 and 103 away from the screen center (imaging optical axis), the intersecting pixel position is occupied by a focus detection pixel 313 belonging to the first focus detection pixel array, which is arranged in the direction orthogonal to (forming the larger angle with) the straight line connecting the focus detection area and the screen center; this prevents an imbalance of the focus detection light beams in focus detection areas of high image height and allows focus detection to be performed reliably.
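The placement rule above can be illustrated with a small geometric check: compare the angle each array direction makes with the line from the screen center to the focus detection area and give the crossing to the array with the larger angle. This is only a hedged sketch; the vector convention and helper names are assumptions made for illustration.

# Sketch of the crossing-placement rule: the array more orthogonal to the line joining
# the focus detection area to the screen center gets the pixel at the crossing.

import math

def angle_between_line_and_direction(line_vec, dir_vec):
    """Smallest angle in degrees between a line and an array direction (2-D vectors)."""
    dot = abs(line_vec[0] * dir_vec[0] + line_vec[1] * dir_vec[1])   # abs(): unsigned line
    norm = math.hypot(*line_vec) * math.hypot(*dir_vec)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def array_for_crossing(area_center, screen_center, first_dir, second_dir):
    """Return 'first' or 'second': the array making the larger angle with the center line."""
    line = (area_center[0] - screen_center[0], area_center[1] - screen_center[1])
    a1 = angle_between_line_and_direction(line, first_dir)
    a2 = angle_between_line_and_direction(line, second_dir)
    return "first" if a1 >= a2 else "second"

# Example: an off-axis area directly to the right of the screen center; the vertical
# (first) array is orthogonal to that line, so its pixel occupies the crossing.
print(array_for_crossing((100, 0), (0, 0), first_dir=(0, 1), second_dir=(1, 0)))  # 'first'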

《発明の他の実施の形態》
<交差画素位置に第3の焦点検出画素を配置>
図14は、別実施形態の撮像素子212の詳細な構成を示す正面図であり、図3に対応する図である。図3と異なる点は、垂直方向の焦点検出画素配列(第1の焦点検出画素配列)と水平方向の焦点検出画素配列(第2の焦点検出画素配列)が交差する画素位置(ベイヤー配列の配置規則で緑画素が配置されるべき画素位置)に、第1の焦点検出画素および第2の焦点検出画素とは異なる焦点検出画素317(第3の焦点検出画素)が配置される点である。
<< Other Embodiments of the Invention >>
<A third focus detection pixel is arranged at the intersection pixel position>
FIG. 14 is a front view illustrating a detailed configuration of an image sensor 212 according to another embodiment, and corresponds to FIG. 3. It differs from FIG. 3 in that, at the pixel position where the vertical focus detection pixel array (first focus detection pixel array) and the horizontal focus detection pixel array (second focus detection pixel array) intersect (a pixel position where a green pixel would be placed under the Bayer arrangement rule), a focus detection pixel 317 (third focus detection pixel) different from the first focus detection pixels and the second focus detection pixels is arranged.

焦点検出画素317は、フィルタ以外は撮像画素310と同様な構造を有しており、フィルタは他の焦点検出画素と同じ白色フィルタである。焦点検出画素317の受光領域は焦点検出画素313、314、315、316の2倍のサイズの正方形になっており、焦点検出画素317(第3の焦点検出画素)が配置された画素位置に仮想的に焦点検出画素313と314または焦点検出画素315と316が配置されているとした場合に、焦点検出画素317のデータは仮想的な焦点検出画素313、314のデータを加算した値、または仮想的な焦点検出画素315、316のデータを加算した値となる。   The focus detection pixel 317 has the same structure as the imaging pixel 310 except for its filter, which is the same white filter as that of the other focus detection pixels. The light receiving region of the focus detection pixel 317 is a square twice the size of that of the focus detection pixels 313, 314, 315, and 316. Assuming that focus detection pixels 313 and 314, or focus detection pixels 315 and 316, are virtually arranged at the pixel position where the focus detection pixel 317 (third focus detection pixel) is placed, the data of the focus detection pixel 317 equals the sum of the data of the virtual focus detection pixels 313 and 314, or the sum of the data of the virtual focus detection pixels 315 and 316.

図15は、第1の焦点検出画素配列と第2の焦点検出画素配列とが交差する画素位置を中心とした8画素×8画素の領域を示した図である。第1の焦点検出画素配列のデータを図のようにα14、α24、・・・、α84で示しており、第2の焦点検出画素配列のデータをβ41、β42、・・・、β48で示している。   FIG. 15 is a diagram illustrating an area of 8 pixels × 8 pixels centering on a pixel position where the first focus detection pixel array and the second focus detection pixel array intersect. The data of the first focus detection pixel array is indicated by α14, α24,..., Α84 as shown in the figure, and the data of the second focus detection pixel array is indicated by β41, β42,. Yes.

焦点検出画素317(第3の焦点検出画素)のデータをη44とすると、焦点検出画素317が配置された画素位置における仮想的な第1の焦点検出画素313のデータα44、および焦点検出画素317が配置された画素位置における仮想的な第2の焦点検出画素315のデータβ44は、以下のように算出される。
α44=η44−(α34+α54)/2 (17)
β44=η44−(β43+β45)/2 (18)
If the data of the focus detection pixel 317 (third focus detection pixel) is η44, the data α44 of the virtual first focus detection pixel 313 and the data β44 of the virtual second focus detection pixel 315 at the pixel position where the focus detection pixel 317 is arranged are calculated as follows.
α44 = η44− (α34 + α54) / 2 (17)
β44 = η44− (β43 + β45) / 2 (18)
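Equations (17) and (18) subtract an estimate of the complementary pixel of each array, taken from the two neighbors, from the output of the third focus detection pixel. A minimal sketch, with variable names mirroring the text; the function is an illustrative assumption.

# Sketch of equations (17) and (18): recover the virtual first- and second-array data
# at the crossing from the third focus detection pixel output eta44 and its neighbors.

def split_third_pixel(eta44, a34, a54, b43, b45):
    alpha44 = eta44 - (a34 + a54) / 2.0   # equation (17)
    beta44 = eta44 - (b43 + b45) / 2.0    # equation (18)
    return alpha44, beta44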

上記実施形態においては、交差画素位置に第1の焦点検出画素を配置する場合に比較して、仮想的な第2の焦点検出画素のデータの補間精度が向上する。   In the above embodiment, the interpolation accuracy of the data of the virtual second focus detection pixel is improved as compared with the case where the first focus detection pixel is arranged at the intersection pixel position.

<1つの焦点検出画素に一対の受光領域を備える>
図3に示す撮像素子212の部分拡大図では、各画素に1つの光電変換部を有する一対の焦点検出画素313,314および一対の焦点検出画素315,316を備える例を示したが、ひとつの焦点検出画素内に一対の光電変換部を備えるようにしてもよい。図16は、このような撮像素子212の部分拡大図であり、焦点検出画素311および312は一対の光電変換部を備える。第1の焦点検出画素311の画素配列と第2の焦点検出画素312の画素配列とが交差する画素位置には、第1の焦点検出画素311が配置される。
<Each focus detection pixel has a pair of light receiving regions>
In the partially enlarged view of the image sensor 212 illustrated in FIG. 3, an example was shown in which the pairs of focus detection pixels 313, 314 and 315, 316 each have a single photoelectric conversion unit per pixel; alternatively, a pair of photoelectric conversion units may be provided within one focus detection pixel. FIG. 16 is a partially enlarged view of such an image sensor 212; the focus detection pixels 311 and 312 each include a pair of photoelectric conversion units. The first focus detection pixel 311 is disposed at the pixel position where the pixel array of the first focus detection pixels 311 and the pixel array of the second focus detection pixels 312 intersect.

図17(a)に示す焦点検出画素311は、図6(a)、(b)に示す焦点検出画素313と焦点検出画素314のペアに相当した機能を果たし、図17(b)に示す焦点検出画素312は、図6(c)、(d)に示す焦点検出画素315と焦点検出画素316のペアに相当した機能を果たす。焦点検出画素311、312は、図17(a)、(b)に示すように、マイクロレンズ10と、一対の光電変換部13,14または一対の光電変換部15,16とから構成される。焦点検出画素311、312には白色フィルタが配置されており、その分光感度特性は、光電変換を行うフォトダイオードの分光感度特性と、赤外カットフィルタ(不図示)の分光感度特性とを総合した分光感度特性となる。つまり、上述した緑画素、赤画素および青画素の分光感度特性を加算したような分光感度特性となり、その感度の光波長領域は緑画素、赤画素および青画素の感度の光波長領域を包括している。   The focus detection pixel 311 shown in FIG. 17(a) performs the function of the pair of focus detection pixels 313 and 314 shown in FIGS. 6(a) and 6(b), and the focus detection pixel 312 shown in FIG. 17(b) performs the function of the pair of focus detection pixels 315 and 316 shown in FIGS. 6(c) and 6(d). As shown in FIGS. 17(a) and 17(b), the focus detection pixels 311 and 312 each consist of a microlens 10 and a pair of photoelectric conversion units 13, 14 or a pair of photoelectric conversion units 15, 16. The focus detection pixels 311 and 312 are provided with white filters, and their spectral sensitivity characteristic combines the spectral sensitivity characteristic of the photodiodes performing the photoelectric conversion with that of an infrared cut filter (not shown). That is, it resembles the sum of the spectral sensitivity characteristics of the green, red, and blue pixels described above, and its sensitive light wavelength region encompasses the sensitive light wavelength regions of the green, red, and blue pixels.

図18は、図17(a)に示した焦点検出画素311の断面図であって、光電変換部13,14の上に近接して遮光マスク30が形成され、遮光マスク30の開口部30dを通過した光を光電変換部13,14は受光する。遮光マスク30の上には平坦化層31が形成され、その上に白色フィルタ39が形成される。白色フィルタ39の上には平坦化層32が形成され、その上にマイクロレンズ10が形成される。マイクロレンズ10により開口部30dに制限された光電変換部13,14の形状が前方に投影されて、一対の測距瞳を形成する。光電変換部13,14は半導体回路基板29上に形成される。焦点検出画素312の構造も焦点検出画素311の構造を90度回転しただけであって、基本的に図18に示す焦点検出画素の構造と同様である。   FIG. 18 is a cross-sectional view of the focus detection pixel 311 shown in FIG. 17(a). A light shielding mask 30 is formed in close proximity above the photoelectric conversion units 13 and 14, and the photoelectric conversion units 13 and 14 receive the light that passes through an opening 30d of the light shielding mask 30. A planarizing layer 31 is formed on the light shielding mask 30, and a white filter 39 is formed thereon. A planarizing layer 32 is formed on the white filter 39, and the microlens 10 is formed thereon. The shapes of the photoelectric conversion units 13 and 14, restricted by the opening 30d, are projected forward by the microlens 10 to form a pair of distance measuring pupils. The photoelectric conversion units 13 and 14 are formed on a semiconductor circuit substrate 29. The structure of the focus detection pixel 312 is the structure of the focus detection pixel 311 rotated by 90 degrees, and is otherwise the same as the structure of the focus detection pixel shown in FIG. 18.

図19は、第1の焦点検出画素311の画素配列と第2の焦点検出画素312の画素配列とが交差する画素位置を中心とした8画素×8画素の領域を示した図である。交差画素位置を含む連続した3画素の第1の焦点検出画素311の配列の一対のデータを、図のように(A34、B34)、(A44、B44)、(A54、B54)とし、交差画素位置を挟む2画素の第2の焦点検出画素312の配列の一対のデータを、図のように(C43、D43)、(C45、D45)とすると、焦点交差画素位置における仮想的な第2の焦点検出画素312の一対のデータ(C44、D44)は、以下のように算出される。
S=2×(A44+B44)/(C43+C45+D43+D45) (19)
C44=S×(C43+C45)/2 (20)
D44=S×(D43+D45)/2 (21)
FIG. 19 is a diagram illustrating an 8 pixel × 8 pixel region centered on the pixel position where the pixel array of the first focus detection pixels 311 and the pixel array of the second focus detection pixels 312 intersect. Denoting the pairs of data of the three consecutive first focus detection pixels 311 including the intersecting pixel position as (A34, B34), (A44, B44), and (A54, B54), and the pairs of data of the two second focus detection pixels 312 on either side of the intersecting pixel position as (C43, D43) and (C45, D45), as shown in the figure, the virtual pair of data (C44, D44) of the second focus detection pixel 312 at the intersecting pixel position is calculated as follows.
S = 2 × (A44 + B44) / (C43 + C45 + D43 + D45) (19)
C44 = S × (C43 + C45) / 2 (20)
D44 = S × (D43 + D45) / 2 (21)
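Equations (19) to (21) scale the neighboring second-array pairs by the ratio of the measured pair sum at the crossing to the sum of the neighboring pairs. A minimal sketch, with argument names mirroring FIG. 19; the function is an illustrative assumption.

# Sketch of equations (19)-(21): virtual pair (C44, D44) of the second focus detection
# pixel 312 at the crossing, from the measured pair (A44, B44) and the neighboring pairs.

def virtual_second_pair(a44, b44, c43, d43, c45, d45):
    s = 2.0 * (a44 + b44) / (c43 + c45 + d43 + d45)   # equation (19)
    c44 = s * (c43 + c45) / 2.0                       # equation (20)
    d44 = s * (d43 + d45) / 2.0                       # equation (21)
    return c44, d44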

<交差画素位置に第3の焦点検出画素を配置>
図20は、別実施形態の撮像素子212の詳細な構成を示す正面図である。図16と異なる点は、垂直方向の焦点検出画素配列(第1の焦点検出画素配列)と水平方向の焦点検出画素配列(第2の焦点検出画素配列)とが交差する画素位置(ベイヤー配列の配置規則で緑画素が配置されるべき位置)には、焦点検出画素317(第3の焦点検出画素)が配置される点である。
<A third focus detection pixel is arranged at the intersection pixel position>
FIG. 20 is a front view showing a detailed configuration of an image sensor 212 according to another embodiment. It differs from FIG. 16 in that a focus detection pixel 317 (third focus detection pixel) is arranged at the pixel position where the vertical focus detection pixel array (first focus detection pixel array) and the horizontal focus detection pixel array (second focus detection pixel array) intersect (a position where a green pixel would be placed under the Bayer arrangement rule).

焦点検出画素317は、フィルタ以外は撮像画素310と同様な構造を有しており、フィルタは他の焦点検出画素と同じ白色フィルタである。焦点検出画素317の受光領域のサイズは、焦点検出画素311の一対の受光部および焦点検出画素312の一対の受光部のサイズと同じサイズになっており、焦点検出画素317(第3の焦点検出画素)が配置された画素位置に、仮想的に焦点検出画素311または焦点検出画素312が配置されているとした場合に、焦点検出画素317のデータは、仮想的な焦点検出画素311の一対のデータを加算した値、または仮想的な焦点検出画素312の一対のデータを加算した値となる。   The focus detection pixel 317 has the same structure as the imaging pixel 310 except for its filter, which is the same white filter as that of the other focus detection pixels. The size of the light receiving region of the focus detection pixel 317 is equal to the combined size of the pair of light receiving portions of the focus detection pixel 311, and likewise to that of the pair of light receiving portions of the focus detection pixel 312. Assuming that a focus detection pixel 311 or a focus detection pixel 312 is virtually arranged at the pixel position where the focus detection pixel 317 (third focus detection pixel) is placed, the data of the focus detection pixel 317 equals the sum of the pair of data of the virtual focus detection pixel 311, or the sum of the pair of data of the virtual focus detection pixel 312.

図21は、第1の焦点検出画素311の画素配列と第2の焦点検出画素312の画素配列とが交差する画素位置を中心とした8画素×8画素の領域を示した図である。焦点検出画素317(第3の焦点検出画素)のデータをη44とし、交差画素位置の上下の第1の焦点検出画素311の配列の一対のデータを、図のように(A34、B34)、(A54、B54)とし、交差画素位置の左右の2画素の第2の焦点検出画素312の配列の一対のデータを、図のように(C43、D43)、(C45、D45)とすると、焦点交差画素位置における仮想的な第1の焦点検出画素311の一対のデータ(A44、B44)、および仮想的な第2の焦点検出画素312の一対のデータ(C44、D44)は、以下のように算出される。
S1=2×η44/(A34+B34+A54+B54) (22)
A44=S1×(A34+A54)/2 (23)
B44=S1×(B34+B54)/2 (24)
S2=2×η44/(C43+C45+D43+D45) (25)
C44=S2×(C43+C45)/2 (26)
D44=S2×(D43+D45)/2 (27)
FIG. 21 is a diagram illustrating an 8 pixel × 8 pixel region centered on the pixel position where the pixel array of the first focus detection pixels 311 and the pixel array of the second focus detection pixels 312 intersect. Denoting the data of the focus detection pixel 317 (third focus detection pixel) as η44, the pairs of data of the first focus detection pixels 311 above and below the intersecting pixel position as (A34, B34) and (A54, B54), and the pairs of data of the two second focus detection pixels 312 to the left and right of the intersecting pixel position as (C43, D43) and (C45, D45), as shown in the figure, the virtual pair of data (A44, B44) of the first focus detection pixel 311 and the virtual pair of data (C44, D44) of the second focus detection pixel 312 at the intersecting pixel position are calculated as follows.
S1 = 2 × η44 / (A34 + B34 + A54 + B54) (22)
A44 = S1 × (A34 + A54) / 2 (23)
B44 = S1 × (B34 + B54) / 2 (24)
S2 = 2 × η44 / (C43 + C45 + D43 + D45) (25)
C44 = S2 × (C43 + C45) / 2 (26)
D44 = S2 × (D43 + D45) / 2 (27)
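Equations (22) to (27) apply the same scaling idea to both arrays, using the third focus detection pixel output η44 as the reference level. A minimal sketch, with argument names mirroring FIG. 21; the function is an illustrative assumption.

# Sketch of equations (22)-(27): virtual pairs (A44, B44) and (C44, D44) at the crossing,
# derived from the third focus detection pixel output eta44 and the neighboring pairs.

def virtual_pairs_from_third_pixel(eta44, a34, b34, a54, b54, c43, d43, c45, d45):
    s1 = 2.0 * eta44 / (a34 + b34 + a54 + b54)   # equation (22)
    a44 = s1 * (a34 + a54) / 2.0                 # equation (23)
    b44 = s1 * (b34 + b54) / 2.0                 # equation (24)
    s2 = 2.0 * eta44 / (c43 + c45 + d43 + d45)   # equation (25)
    c44 = s2 * (c43 + c45) / 2.0                 # equation (26)
    d44 = s2 * (d43 + d45) / 2.0                 # equation (27)
    return (a44, b44), (c44, d44)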

<×字型の焦点検出エリア、交差画素位置=緑>
図22は、交換レンズ202の撮影画面上における焦点検出位置(焦点検出エリア)を示す図であり、詳細な構成を後述する撮像素子212上の焦点検出画素列が、焦点検出の際に撮影画面上で像をサンプリングする領域(焦点検出エリア、焦点検出位置)の一例を示す。この例では、矩形の撮影画面100上の中央および上下左右の5箇所に焦点検出エリア111〜115が配置される。焦点検出画素が、長方形で示す焦点検出エリアの長手方向に対して、斜め右上がり45度方向および斜め左上がり45度方向に直線的に配列される。2つの方向に配列した焦点検出画素配列は各焦点検出エリアの中心で交差する。交差する画素位置では、交差する画素位置と画面中心とを結ぶ直線に直交する方向に配列された焦点検出画素配列に属する焦点検出画素が配置される。
<X-shaped focus detection area, cross pixel position = green>
FIG. 22 is a diagram showing focus detection positions (focus detection areas) on the imaging screen of the interchangeable lens 202, and shows an example of the regions (focus detection areas, focus detection positions) in which the focus detection pixel arrays on the image sensor 212, whose detailed configuration is described later, sample the image on the imaging screen for focus detection. In this example, focus detection areas 111 to 115 are arranged at five locations on the rectangular imaging screen 100: the center, top, bottom, left, and right. The focus detection pixels are arranged linearly in the diagonal up-right 45-degree direction and the diagonal up-left 45-degree direction relative to the longitudinal direction of each focus detection area, which is indicated by a rectangle. The focus detection pixel arrays of the two directions intersect at the center of each focus detection area. At each intersecting pixel position, a focus detection pixel is placed that belongs to the focus detection pixel array arranged in the direction orthogonal to the straight line connecting the intersecting pixel position and the screen center.

図23は、撮像素子212の詳細な構成を示す正面図であり、撮像素子212上の焦点検出エリア114の近傍を拡大して示す。撮像素子212には撮像画素310が二次元正方格子状に稠密に配列される。撮像画素310は赤画素(R)、緑画素(G)、青画素(B)からなり、ベイヤー配列の配置規則によって配置されている。焦点検出エリア114に対応する位置には、撮像画素と同一の画素サイズを有し、白色フィルタを備えた焦点検出画素323と324が、交互に、本来緑画素が連続的に配置されるべき斜め右上がり45度方向の直線上に連続して配列されるとともに、白色フィルタを備えた焦点検出画素325と326が、交互に、本来緑画素が連続的に配置されるべき斜め左上がり45度方向の直線上に連続して配列される。焦点検出画素323と324、および焦点検出画素325と326は、本来緑画素が配置される画素位置に配置されるとともに、交差画素位置も、本来緑画素が配置されるべき画素位置になっている。   FIG. 23 is a front view showing a detailed configuration of the image sensor 212, with the vicinity of the focus detection area 114 on the image sensor 212 enlarged. Imaging pixels 310 are densely arranged on the image sensor 212 in a two-dimensional square lattice. The imaging pixels 310 consist of red pixels (R), green pixels (G), and blue pixels (B) arranged according to the Bayer arrangement rule. At the position corresponding to the focus detection area 114, focus detection pixels 323 and 324, which have the same pixel size as the imaging pixels and are provided with white filters, are arranged alternately and continuously on a straight line in the diagonal up-right 45-degree direction along which green pixels would otherwise be arranged continuously, and focus detection pixels 325 and 326, also provided with white filters, are arranged alternately and continuously on a straight line in the diagonal up-left 45-degree direction along which green pixels would otherwise be arranged continuously. The focus detection pixels 323 and 324 and the focus detection pixels 325 and 326 are placed at pixel positions where green pixels would otherwise be arranged, and the intersecting pixel position is likewise a pixel position where a green pixel would otherwise be arranged.

焦点検出画素323は、正面図を表した図24(a)に示すように、矩形のマイクロレンズ10と後述の遮光マスクで受光領域を正方形の半分(正方形を左上がり45度方向の対角線で2等分した場合の右上半分)に制限された光電変換部23、および白色フィルタ(不図示)とから構成される。   As shown in the front view of FIG. 24(a), the focus detection pixel 323 consists of the rectangular microlens 10, a photoelectric conversion unit 23 whose light receiving region is restricted by a light shielding mask (described later) to half of a square (the upper right half when the square is bisected by its diagonal running up-left at 45 degrees), and a white filter (not shown).

また、焦点検出画素324は、正面図を表した図24(b)に示すように、矩形のマイクロレンズ10と後述の遮光マスクで受光領域を正方形の半分(正方形を左上がり45度方向の対角線で2等分した場合の左下半分)に制限された光電変換部24、および白色フィルタ(不図示)とから構成される。   Similarly, as shown in the front view of FIG. 24(b), the focus detection pixel 324 consists of the rectangular microlens 10, a photoelectric conversion unit 24 whose light receiving region is restricted by a light shielding mask (described later) to half of a square (the lower left half when the square is bisected by its diagonal running up-left at 45 degrees), and a white filter (not shown).

焦点検出画素323の正面図と焦点検出画素324の正面図とをマイクロレンズ10を基準に重ね合わせて表示すると、遮光マスクで受光領域を制限された光電変換部23と24が右上がり斜め45度方向に並んでいる。また図24(a)、(b)において、正方形を半分にした受光領域の部分に正方形を半分にした残りの部分(破線部分)を加えると、撮像画素の受光領域と同じサイズの正方形となる。   When the front view of the focus detection pixel 323 and the front view of the focus detection pixel 324 are superimposed with the microlens 10 as a reference, the photoelectric conversion units 23 and 24, whose light receiving regions are restricted by the light shielding masks, line up in the diagonal up-right 45-degree direction. Also, in FIGS. 24(a) and 24(b), adding the remaining half of the square (the broken-line portion) to the half-square light receiving region yields a square of the same size as the light receiving region of an imaging pixel.

焦点検出画素325、326の正面図も、焦点検出画素323、324の正面図を90度回転しただけであって、基本的に図24(a)、(b)に示す焦点検出画素323、324の正面図と同様である。   The front views of the focus detection pixels 325 and 326 are simply the front views of the focus detection pixels 323 and 324 rotated by 90 degrees, and are otherwise the same as the front views of the focus detection pixels 323 and 324 shown in FIGS. 24(a) and 24(b).

図25は、第1の焦点検出画素323,324の画素配列と第2の焦点検出画素325,326の画素配列とが交差する画素位置を中心とした8画素×8画素の領域を示した図である。第1の焦点検出画素配列のデータを図のようにE81、E72、・・・、E18で示しており、第2の焦点検出画素配列のデータをF21、F32、・・・、F87で示している。なお、交差画素位置には第1の焦点検出画素323が配置されているので、この画素位置に仮想的に第2の焦点検出画素325が配置された場合のデータをF54としている。   FIG. 25 is a diagram illustrating an 8 pixel × 8 pixel region centered on the pixel position where the pixel array of the first focus detection pixels 323 and 324 and the pixel array of the second focus detection pixels 325 and 326 intersect. The data of the first focus detection pixel array are denoted E81, E72, ..., E18 as shown in the figure, and the data of the second focus detection pixel array are denoted F21, F32, ..., F87. Since the first focus detection pixel 323 is placed at the intersecting pixel position, F54 denotes the data when a second focus detection pixel 325 is virtually arranged at this pixel position.

交差画素位置において、第1の焦点検出画素323のデータと第1の焦点検出画素324のデータとを加算した値が、第2の焦点検出画素325のデータと第2の焦点検出画素326のデータとを加算した値と等しくなるという前提に基づくと、交差画素位置に仮想的に第2の焦点検出画素325を配置した場合のデータF54は、以下のように求められる。
F54=E54+(E45+E63)/2―(F43+F65)/2 (28)
Based on the premise that, at the intersecting pixel position, the sum of the data of the first focus detection pixel 323 and the data of the first focus detection pixel 324 is equal to the sum of the data of the second focus detection pixel 325 and the data of the second focus detection pixel 326, the data F54 when a second focus detection pixel 325 is virtually arranged at the intersecting pixel position is obtained as follows.
F54 = E54 + (E45 + E63) / 2− (F43 + F65) / 2 (28)
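Equation (28) rests on that premise: the missing second-array value follows from the measured E54 and the nearest complementary data of both diagonal arrays. A minimal sketch, with argument names mirroring FIG. 25; the function is an illustrative assumption.

# Sketch of equation (28): virtual second-array data F54 at the crossing of the two
# diagonal focus detection pixel arrays.

def virtual_f54(e54, e45, e63, f43, f65):
    return e54 + (e45 + e63) / 2.0 - (f43 + f65) / 2.0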

図25において、撮像画素がベイヤー配列の配置規則に従って配置される場合、交差画素位置には緑画素が配置される。交差画素位置に仮想的に緑画素が配置された場合のデータをG54とすると、データG54は、交差画素位置に最隣接する4つの緑画素データG34、G52、G56、G74を平均して求めることができる。
G54=(G34+G52+G56+G74)/4 (29)
In FIG. 25, when the imaging pixels are arranged according to the arrangement rule of the Bayer array, a green pixel would be placed at the intersecting pixel position. Denoting the data when a green pixel is virtually arranged at the intersecting pixel position as G54, the data G54 can be obtained by averaging the four green pixel data G34, G52, G56, and G74 closest to the intersecting pixel position.
G54 = (G34 + G52 + G56 + G74) / 4 (29)

交差画素位置から斜め方向に1画素離れた画素位置に、仮想的に緑画素が配置された場合のデータについては、たとえば、交差画素位置の右斜め45度の隣接画素位置のデータをG45とすると、データG45は、該画素位置に最隣接する4つの緑画素データG34、G25、G56、G47を平均して求めることができる。
G45=(G34+G25+G56+G47)/4 (30)
As for the data when a green pixel is virtually arranged at a pixel position one pixel away from the intersecting pixel position in a diagonal direction, for example, denoting the data at the adjacent pixel position at 45 degrees diagonally to the right of the intersecting pixel position as G45, the data G45 can be obtained by averaging the four green pixel data G34, G25, G56, and G47 closest to that pixel position.
G45 = (G34 + G25 + G56 + G47) / 4 (30)

交差画素位置から斜め方向に2画素離れた画素位置に、仮想的に緑画素が配置された場合のデータについては、たとえば、交差画素位置の右斜め45度に2画素離れた画素位置のデータをG36とすると、データG36は、該画素位置に最隣接する6つの緑画素データG34、G25、G56、G47、G16、G38を平均して求めることができる。
G36=(G34+G25+G56+G47+G16+G38)/6 (31)
As for the data when a green pixel is virtually arranged at a pixel position two pixels away from the intersecting pixel position in a diagonal direction, for example, denoting the data at the pixel position two pixels away at 45 degrees diagonally to the right of the intersecting pixel position as G36, the data G36 can be obtained by averaging the six green pixel data G34, G25, G56, G47, G16, and G38 closest to that pixel position.
G36 = (G34 + G25 + G56 + G47 + G16 + G38) / 6 (31)
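Equations (29) to (31) are plain averages of the nearest available green imaging pixels around the crossing. The following is a minimal sketch, assuming a dictionary `green` keyed by (row, column) with the crossing at (5, 4); the keys simply spell out the pixel labels used in the text.

# Sketch of equations (29)-(31): virtual green imaging pixel data near the crossing.

def g54(green):
    # equation (29): crossing position, four nearest green pixels
    return (green[(3, 4)] + green[(5, 2)] + green[(5, 6)] + green[(7, 4)]) / 4.0

def g45(green):
    # equation (30): one pixel diagonally away from the crossing
    return (green[(3, 4)] + green[(2, 5)] + green[(5, 6)] + green[(4, 7)]) / 4.0

def g36(green):
    # equation (31): two pixels diagonally away from the crossing, six nearest green pixels
    return (green[(3, 4)] + green[(2, 5)] + green[(5, 6)] + green[(4, 7)]
            + green[(1, 6)] + green[(3, 8)]) / 6.0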

上記の実施形態では、焦点検出画素は青画素、赤画素に比較して高密度に配置された緑画素の画素位置に配置されるので、撮像画素データの補間誤差が目立ちにくくなる。   In the above embodiment, since the focus detection pixels are arranged at the pixel positions of the green pixels arranged at a higher density than the blue pixels and the red pixels, the interpolation error of the imaged pixel data is less noticeable.

<×字型の焦点検出エリア、交差画素位置=青>
図26は、図23に示す実施形態の変形例であって、本来青画素が配置されるべき画素位置を交差画素位置にした場合を示した図である。このような配置において、焦点検出画素は、本来青画素および赤画素が配置されるべき画素位置に配置されるので、仮想的な青画素および仮想的な赤画素のデータは、該仮想的な青画素および仮想的な赤画素の画素位置を基準として、水平方向および垂直方向に2画素離れた4つの青画素のデータまたは4つの赤画素のデータを平均することによって、画一的な補間演算処理により求めることができる。
<X-shaped focus detection area, cross pixel position = blue>
FIG. 26 shows a modification of the embodiment shown in FIG. 23, in which the intersecting pixel position is a pixel position where a blue pixel would otherwise be arranged. In such an arrangement, the focus detection pixels occupy pixel positions where blue pixels and red pixels would otherwise be arranged, so the data of a virtual blue pixel or a virtual red pixel can be obtained by a uniform interpolation operation that averages the data of the four blue pixels or the four red pixels located two pixels away in the horizontal and vertical directions from the pixel position of the virtual blue pixel or virtual red pixel.
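For this variant, the interpolation of every replaced blue or red imaging pixel reduces to one uniform rule: average the four same-color pixels two positions away horizontally and vertically. A hedged sketch under the same (row, column) dictionary assumption as above.

# Sketch of the uniform interpolation for the blue-crossing variant: same-color average
# of the four pixels two positions away along the row and column directions.

def interpolate_same_color(pixels, row, col):
    neighbors = [(row - 2, col), (row + 2, col), (row, col - 2), (row, col + 2)]
    values = [pixels[p] for p in neighbors if p in pixels]
    return sum(values) / len(values) if values else None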

<水平方向と斜め方向の焦点検出画素配列の交差>
以上の実施形態においては、2つの焦点検出画素配列が直交して交差していたが、必ずしも直交しなくても構わない。図27は、水平方向の焦点検出画素配列と斜め右上がり45度方向の焦点検出画素配列とが交差する実施形態の撮像素子212の詳細な構成を示す正面図であり、撮像素子212上の焦点検出エリアの近傍を拡大して示している。このような焦点検出画素配列に対しても、上述した実施形態と同様にして、交差画素位置の仮想的な焦点検出画素のデータを求めることができる。また、焦点検出画素の数は2つに限定されたものではなく、たとえば、水平方向、垂直方向、斜め右上がり45度方向、斜め左上がり45度方向の4つの方向に配列した焦点検出画素配列を、1つの交差画素位置で交差させるようにした場合においても、本発明を適用することができる。
<Intersection of horizontal and diagonal focus detection pixel arrays>
In the embodiments above, the two focus detection pixel arrays intersect at right angles, but they need not necessarily do so. FIG. 27 is a front view showing a detailed configuration of an image sensor 212 according to an embodiment in which a horizontal focus detection pixel array and a focus detection pixel array in the diagonal up-right 45-degree direction intersect, with the vicinity of the focus detection area on the image sensor 212 enlarged. For such focus detection pixel arrays as well, the data of the virtual focus detection pixel at the intersecting pixel position can be obtained in the same manner as in the embodiments described above. Furthermore, the number of focus detection pixel arrays is not limited to two; the present invention can also be applied, for example, to a case where focus detection pixel arrays in four directions, namely horizontal, vertical, diagonal up-right 45 degrees, and diagonal up-left 45 degrees, intersect at a single intersecting pixel position.

<焦点検出画素の受光領域形状が半円>
また、必ずしも撮像画素の光電変換部の受光領域を正方形として、焦点検出画素の光電変換部の受光領域の形状を、該正方形を2等分した矩形または三角形にする必要はない。例えば、図3の実施形態において、撮像画素310の光電変換部11の受光領域の形状が、図28(e)に示すような円形であるとした場合には、図3における焦点検出画素313、314、315,316に対応した焦点検出画素333、334、335,336の光電変換部33、34、35,36の受光領域は、図28(a)、(b)、(c)、(d)に示すように、撮像画素11の受光領域を水平線で2等分した半円領域(光電変換部33、34)および垂直線で2等分した半円領域(光電変換部35、36)とすることができる。
<The shape of the light receiving area of the focus detection pixel is a semicircle>
In addition, the light receiving region of the photoelectric conversion unit of the imaging pixel need not be a square, and the light receiving region of the photoelectric conversion unit of the focus detection pixel need not be a rectangle or triangle obtained by bisecting that square. For example, in the embodiment of FIG. 3, when the light receiving region of the photoelectric conversion unit 11 of the imaging pixel 310 is circular as shown in FIG. 28(e), the light receiving regions of the photoelectric conversion units 33, 34, 35, and 36 of focus detection pixels 333, 334, 335, and 336, corresponding to the focus detection pixels 313, 314, 315, and 316 in FIG. 3, can be semicircular regions obtained by bisecting the light receiving region of the photoelectric conversion unit 11 with a horizontal line (photoelectric conversion units 33 and 34) and semicircular regions obtained by bisecting it with a vertical line (photoelectric conversion units 35 and 36), as shown in FIGS. 28(a), (b), (c), and (d).

上述した実施形態における撮像素子では、焦点検出画素が白色フィルタを備えた例を示したが、撮像画素と同じ色フィルタ(例えば緑フィルタ)を備えるようにした場合にも本発明を適用することができる。   In the image sensors of the embodiments described above, the focus detection pixels are provided with white filters; however, the present invention can also be applied when they are provided with the same color filter as the imaging pixels (for example, a green filter).

上述した実施形態における撮像素子では、撮像画素がベイヤー配列の色フィルタを備えた例を示したが、色フィルタの構成や配列はこれに限定されることはなく、補色フィルタ(緑:G、イエロー:Ye、マゼンタ:Mg,シアン:Cy)の配列やベイヤー配列以外の配列にも本発明を適用することができる。また、色フィルタを備えないモノクロの撮像素子にも適用することができる。   In the image sensors of the embodiments described above, the imaging pixels are provided with Bayer-array color filters; however, the configuration and arrangement of the color filters are not limited to this, and the present invention can also be applied to arrangements of complementary color filters (green: G, yellow: Ye, magenta: Mg, cyan: Cy) and to arrangements other than the Bayer array. It can also be applied to a monochrome image sensor without color filters.

上述した実施形態においては、撮像素子として、CCDイメージセンサやCMOSイメージセンサなどを適用することができる。   In the above-described embodiment, a CCD image sensor, a CMOS image sensor, or the like can be applied as the imaging element.

上述した実施形態においては、撮像素子と光学系との間に光学要素を何も配置していないが、適宜必要な光学要素を挿入することが可能である。例えば、赤外カットフィルタや光学的ローパスフィルタやハーフミラーなどを設置してもよい。   In the above-described embodiment, no optical element is arranged between the image sensor and the optical system, but a necessary optical element can be appropriately inserted. For example, an infrared cut filter, an optical low-pass filter, a half mirror, or the like may be installed.

なお、撮像装置としては、上述したような、カメラボディに交換レンズが装着される構成のデジタルスチルカメラやフィルムスチルカメラに限定されない。例えば、レンズ一体型のデジタルスチルカメラ、フィルムスチルカメラ、あるいはビデオカメラにも本発明を適用することができる。さらには、携帯電話などに内蔵される小型カメラモジュール、監視カメラやロボット用の視覚認識装置、車載カメラなどにも適用できる。   Note that the imaging apparatus is not limited to the digital still camera or the film still camera having the configuration in which the interchangeable lens is mounted on the camera body as described above. For example, the present invention can also be applied to a lens-integrated digital still camera, film still camera, or video camera. Furthermore, the present invention can be applied to a small camera module built in a mobile phone, a surveillance camera, a visual recognition device for a robot, an in-vehicle camera, and the like.

9、10 マイクロレンズ、11、13、14、15、16、23、24、33、34、35、36 光電変換部、29 半導体回路基板、30 遮光マスク、30a、30b、30c、30d 開口部、31、32 平坦化層、38 色フィルタ、39 白色フィルタ、71、73、74 光束、90 射出瞳、91 交換レンズの光軸、93、94 測距瞳、95 領域、100 撮影画面、101、102、103、111、112、113、114、115 焦点検出エリア、201 デジタルスチルカメラ、202 交換レンズ、203 カメラボディ、204 マウント部、206 レンズ駆動制御装置、208 ズーミング用レンズ、209 レンズ、210 フォーカシング用レンズ、211 絞り、212 撮像素子、213 電気接点、214 ボディ駆動制御装置、215 液晶表示素子駆動回路、216 液晶表示素子、217 接眼レンズ、219 メモリカード、310 撮像画素、311、312、313、314、315、316、317、323、324、325、326、333、334、335、336 焦点検出画素
9, 10 Microlens, 11, 13, 14, 15, 16, 23, 24, 33, 34, 35, 36 Photoelectric conversion unit, 29 Semiconductor circuit substrate, 30 Light shielding mask, 30a, 30b, 30c, 30d Opening, 31, 32 Planarizing layer, 38 Color filter, 39 White filter, 71, 73, 74 Light flux, 90 Exit pupil, 91 Optical axis of interchangeable lens, 93, 94 Distance measuring pupil, 95 Region, 100 Imaging screen, 101, 102, 103, 111, 112, 113, 114, 115 Focus detection area, 201 Digital still camera, 202 Interchangeable lens, 203 Camera body, 204 Mount unit, 206 Lens drive control device, 208 Zooming lens, 209 Lens, 210 Focusing lens, 211 Aperture, 212 Image sensor, 213 Electrical contact, 214 Body drive control device, 215 Liquid crystal display element drive circuit, 216 Liquid crystal display element, 217 Eyepiece lens, 219 Memory card, 310 Imaging pixel, 311, 312, 313, 314, 315, 316, 317, 323, 324, 325, 326, 333, 334, 335, 336 Focus detection pixel

Claims (13)

各々が被写体像に関する撮像信号を出力する複数の撮像画素と、各々が分割瞳像に関する焦点検出信号を出力する複数の焦点検出画素とが二次元配列に従って配置され、前記複数の焦点検出画素が互いに隣接して複数の直線列に配置されることによって形成される複数の焦点検出画素配列が所定の画素位置で交差するように配置された撮像素子と、
前記所定の画素位置の近傍に配置された前記複数の焦点検出画素のうちの、少なくとも1つが出力する前記焦点検出信号と、該所定の画素位置における前記焦点検出信号とに基づいて、前記複数の焦点検出画素配列の各々に対応する焦点検出信号データ配列に含まれる、該所定の画素位置における特定の焦点検出信号データを取得する焦点検出信号取得手段と、
前記焦点検出信号データ配列に基づき、瞳分割型位相差検出方式によって、前記被写体像に関する焦点調節状態を検出する焦点検出手段とを備えることを特徴する撮像装置。
An imaging apparatus comprising: an image sensor in which a plurality of imaging pixels, each outputting an imaging signal relating to a subject image, and a plurality of focus detection pixels, each outputting a focus detection signal relating to a divided pupil image, are arranged according to a two-dimensional array, and in which a plurality of focus detection pixel arrays, formed by the plurality of focus detection pixels being arranged adjacent to one another in a plurality of straight lines, intersect at a predetermined pixel position;
focus detection signal acquisition means for acquiring, based on the focus detection signal output by at least one of the plurality of focus detection pixels arranged in the vicinity of the predetermined pixel position and the focus detection signal at the predetermined pixel position, specific focus detection signal data at the predetermined pixel position included in the focus detection signal data array corresponding to each of the focus detection pixel arrays; and
focus detection means for detecting a focus adjustment state relating to the subject image by a pupil-division phase difference detection method based on the focus detection signal data arrays.
請求項1に記載の撮像装置において、
前記複数の焦点検出画素配列は、第1の焦点検出画素配列と第2の焦点検出画素配列とを含み、
前記第1の焦点検出画素配列は、第1の方向に並ぶ前記分割瞳像に関する前記焦点検出信号を出力する前記複数の焦点検出画素を含み、
前記第2の焦点検出画素配列は、第2の方向に並ぶ前記分割瞳像に関する前記焦点検出信号を出力する前記複数の焦点検出画素を含み、
前記所定の画素位置に配置される焦点検出画素は、前記第1の方向に並ぶ前記分割瞳像に関する前記焦点検出信号を出力する前記複数の焦点検出画素の1つであって、
前記焦点検出信号取得手段は、前記第1の焦点検出画素配列に対応する前記焦点検出信号データ配列に含まれる前記特定の焦点検出信号データとして、前記所定の画素位置における前記焦点検出信号を取得し、前記所定の画素位置の近傍の前記第2の焦点検出画素配列に配置された、前記複数の焦点検出画素が出力する前記焦点検出信号と、該所定の画素位置における前記焦点検出信号とに基づいて、前記第2の焦点検出画素配列に対応する前記焦点検出信号データ配列に含まれる前記特定の焦点検出信号データを取得することを特徴とする撮像装置。
The imaging device according to claim 1,
The plurality of focus detection pixel arrays include a first focus detection pixel array and a second focus detection pixel array;
The first focus detection pixel array includes the plurality of focus detection pixels that output the focus detection signals related to the divided pupil images arranged in a first direction,
The second focus detection pixel array includes the plurality of focus detection pixels that output the focus detection signals related to the divided pupil images arranged in a second direction,
The focus detection pixel arranged at the predetermined pixel position is one of the plurality of focus detection pixels that outputs the focus detection signal related to the divided pupil images arranged in the first direction,
wherein the focus detection signal acquisition means acquires, as the specific focus detection signal data included in the focus detection signal data array corresponding to the first focus detection pixel array, the focus detection signal at the predetermined pixel position, and acquires the specific focus detection signal data included in the focus detection signal data array corresponding to the second focus detection pixel array based on the focus detection signals output by the plurality of focus detection pixels arranged in the second focus detection pixel array in the vicinity of the predetermined pixel position and the focus detection signal at the predetermined pixel position.
請求項1に記載の撮像装置において、
前記複数の焦点検出画素配列は、第1の焦点検出画素配列と第2の焦点検出画素配列とを含み、
前記第1の焦点検出画素配列は、第1の方向に並ぶ前記分割瞳像に関する前記焦点検出信号を出力する前記複数の焦点検出画素を含み、
前記第2の焦点検出画素配列は、第2の方向に並ぶ前記分割瞳像に関する前記焦点検出信号を出力する前記複数の焦点検出画素を含み、
前記所定の画素位置に配置される焦点検出画素は、前記第1の方向に並ぶ前記分割瞳像に関する前記焦点検出信号を出力する前記複数の焦点検出画素および前記第2の方向に並ぶ前記分割瞳像に関する前記焦点検出信号を出力する前記複数の焦点検出画素のいずれとも異なる焦点検出画素であって、
前記焦点検出信号取得手段は、前記所定の画素位置の近傍の前記第1の焦点検出画素配列に配置された、前記複数の焦点検出画素が出力する前記焦点検出信号と、該所定の画素位置における前記焦点検出信号とに基づいて、前記第1の焦点検出画素配列に対応する前記焦点検出信号データ配列に含まれる前記特定の焦点検出信号データを取得するとともに、前記所定の画素位置の近傍の前記第2の焦点検出画素配列に配置された、前記複数の焦点検出画素が出力する前記焦点検出信号と、該所定の画素位置における前記焦点検出信号とに基づいて、前記第2の焦点検出画素配列に対応する前記焦点検出信号データ配列に含まれる前記特定の焦点検出信号データを取得することを特徴とする撮像装置。
The imaging device according to claim 1,
The plurality of focus detection pixel arrays include a first focus detection pixel array and a second focus detection pixel array;
The first focus detection pixel array includes the plurality of focus detection pixels that output the focus detection signals related to the divided pupil images arranged in a first direction,
The second focus detection pixel array includes the plurality of focus detection pixels that output the focus detection signals related to the divided pupil images arranged in a second direction,
the focus detection pixel arranged at the predetermined pixel position is a focus detection pixel different from any of the plurality of focus detection pixels that output the focus detection signals relating to the divided pupil images aligned in the first direction and from any of the plurality of focus detection pixels that output the focus detection signals relating to the divided pupil images aligned in the second direction, and
the focus detection signal acquisition means acquires the specific focus detection signal data included in the focus detection signal data array corresponding to the first focus detection pixel array based on the focus detection signals output by the plurality of focus detection pixels arranged in the first focus detection pixel array in the vicinity of the predetermined pixel position and the focus detection signal at the predetermined pixel position, and acquires the specific focus detection signal data included in the focus detection signal data array corresponding to the second focus detection pixel array based on the focus detection signals output by the plurality of focus detection pixels arranged in the second focus detection pixel array in the vicinity of the predetermined pixel position and the focus detection signal at the predetermined pixel position.
請求項2または3に記載の撮像装置において、
該所定の画素位置の近傍とは、該所定の画素位置に隣接する画素位置であることを特徴とする撮像装置。
In the imaging device according to claim 2 or 3,
The vicinity of the predetermined pixel position is a pixel position adjacent to the predetermined pixel position.
請求項1〜4に記載の撮像装置において、
前記複数の焦点検出画素の出力する前記焦点検出信号と該複数の焦点検出画素に近接した前記複数の撮像画素の出力する前記撮像信号とに基づいて、該複数の焦点検出画素の画素位置における前記撮像信号を補間する撮像信号補間手段と、
前記複数の撮像画素の出力する前記撮像信号と前記撮像信号補間手段によって補間された前記撮像信号とに基づいて、撮像画像信号を生成する撮像画像生成手段とをさらに備えることを特徴とする撮像装置。
In the imaging device according to claims 1 to 4,
the imaging apparatus further comprising: imaging signal interpolation means for interpolating the imaging signal at the pixel positions of the plurality of focus detection pixels based on the focus detection signals output from the plurality of focus detection pixels and the imaging signals output from the plurality of imaging pixels adjacent to the plurality of focus detection pixels; and
captured image generation means for generating a captured image signal based on the imaging signals output from the plurality of imaging pixels and the imaging signals interpolated by the imaging signal interpolation means.
請求項2に記載の撮像装置において、
前記二次元配列は、正方格子状の配列であって、
前記第1の焦点検出画素配列の配列方向は、前記撮像素子の外形の1辺に対して平行な方向または垂直方向であって、
前記第2の焦点検出画素配列の配列方向は、前記第1の焦点検出画素配列の配列方向と直交する方向であることを特徴とする撮像装置。
The imaging device according to claim 2,
The two-dimensional array is a square lattice array,
The arrangement direction of the first focus detection pixel arrangement is a direction parallel to or perpendicular to one side of the outer shape of the imaging element,
An imaging apparatus, wherein an arrangement direction of the second focus detection pixel array is a direction orthogonal to an arrangement direction of the first focus detection pixel array.
請求項2に記載の撮像装置において、
前記二次元配列は、正方格子状の配列であって、
前記第1の焦点検出画素配列の配列方向は、前記撮像素子の外形の1辺に対して斜め45度方向であって、
前記第2の焦点検出画素配列の配列方向は、前記第1の焦点検出画素配列の配列方向と直交する方向であることを特徴とする撮像装置。
The imaging device according to claim 2,
The two-dimensional array is a square lattice array,
The array direction of the first focus detection pixel array is a 45 degree oblique direction with respect to one side of the outer shape of the image sensor,
An imaging apparatus, wherein an arrangement direction of the second focus detection pixel array is a direction orthogonal to an arrangement direction of the first focus detection pixel array.
請求項2〜7のいずれか1項に記載の撮像装置において、
前記第1の焦点検出画素配列は、前記撮像素子に入射して前記第1の方向に並ぶ前記分割瞳像を形成する第1の一対の光束のうちの一方を受光する第1の焦点検出画素と、前記第1の一対の光束のうちの他方を受光する第2の焦点検出画素とを含み、かつ前記第1の焦点検出画素と前記第2の焦点検出画素とが交互に繰り返して前記第1の方向と平行に配置される配列であって、
前記第2の焦点検出画素配列は、前記撮像素子に入射して前記第2の方向に並ぶ前記分割瞳像を形成する第2の一対の光束のうちの一方を受光する第3の焦点検出画素と、前記第2の一対の光束のうちの他方を受光する第4の焦点検出画素とを含み、かつ前記第3の焦点検出画素と前記第4の焦点検出画素とが交互に繰り返して前記第2の方向と平行に配置される配列であることを特徴とする撮像装置。
In the imaging device according to any one of claims 2 to 7,
the first focus detection pixel array includes first focus detection pixels each receiving one of a first pair of light beams that are incident on the image sensor and form the divided pupil images aligned in the first direction, and second focus detection pixels each receiving the other of the first pair of light beams, the first focus detection pixels and the second focus detection pixels being arranged alternately and repeatedly in parallel with the first direction, and
the second focus detection pixel array includes third focus detection pixels each receiving one of a second pair of light beams that are incident on the image sensor and form the divided pupil images aligned in the second direction, and fourth focus detection pixels each receiving the other of the second pair of light beams, the third focus detection pixels and the fourth focus detection pixels being arranged alternately and repeatedly in parallel with the second direction.
請求項2〜7のいずれか1項に記載の撮像装置において、
前記第1の焦点検出画素配列は、前記撮像素子に入射して前記第1の方向に並ぶ前記分割瞳像を形成する第1の一対の光束のうちの一方を受光する第1の受光部と前記第1の一対の光束のうちの他方を受光する第2の受光部とを有する第5の焦点検出画素を含み、かつ前記第1の受光部と前記第2の受光部とが交互に繰り返して前記第1の方向と平行に配置されるように、前記第5の焦点検出画素が配置される配列であって、
前記第2の焦点検出画素配列は、前記撮像素子に入射して前記第2の方向に並ぶ前記分割瞳を形成する第2の一対の光束のうちの一方を受光する第3の受光部と前記第2の一対の光束のうちの他方を受光する第4の受光部とを有する第6の焦点検出画素を含み、かつ前記第3の受光部と前記第4の受光部とが交互に繰り返して前記第2の方向と平行に配置されるように、前記第6の焦点検出画素が配置される配列であることを特徴とする撮像装置。
In the imaging device according to any one of claims 2 to 7,
the first focus detection pixel array includes fifth focus detection pixels each having a first light receiving portion that receives one of a first pair of light beams which are incident on the image sensor and form the divided pupil images aligned in the first direction, and a second light receiving portion that receives the other of the first pair of light beams, the fifth focus detection pixels being arranged so that the first light receiving portions and the second light receiving portions alternate repeatedly in parallel with the first direction, and
the second focus detection pixel array includes sixth focus detection pixels each having a third light receiving portion that receives one of a second pair of light beams which are incident on the image sensor and form the divided pupils aligned in the second direction, and a fourth light receiving portion that receives the other of the second pair of light beams, the sixth focus detection pixels being arranged so that the third light receiving portions and the fourth light receiving portions alternate repeatedly in parallel with the second direction.
請求項8または9に記載の撮像装置において、
前記第1の一対の光束および前記第2の一対の光束の各々は、前記撮像素子に入射して前記被写体像を形成する撮影光束に関して互いに相補的な光束であることを特徴とする撮像装置。
The imaging device according to claim 8 or 9,
Each of the first pair of light beams and the second pair of light beams is a light beam complementary to each other with respect to a photographing light beam that is incident on the image sensor and forms the subject image.
請求項1〜10のいずれか1項に記載の撮像装置において、
前記複数の撮像画素は、緑色光に感度を有する緑画素と、赤色光に感度を有する赤画素と、青色光に感度を有する青画素と含み、前記緑画素と前記赤画素と前記青画素とによって形成されるベイヤー配列に従って、前記撮像素子に配置されることを特徴とする撮像装置。
In the imaging device according to any one of claims 1 to 10,
the plurality of imaging pixels include green pixels sensitive to green light, red pixels sensitive to red light, and blue pixels sensitive to blue light, and are arranged on the image sensor in accordance with a Bayer array formed by the green pixels, the red pixels, and the blue pixels.
請求項11に記載の撮像装置において、
前記所定の画素位置は、前記ベイヤー配列に従って前記緑画素が配置されるべき画素位置と一致することを特徴とする撮像装置。
The imaging device according to claim 11,
The imaging apparatus according to claim 1, wherein the predetermined pixel position matches a pixel position where the green pixel is to be arranged according to the Bayer array.
請求項2に記載の撮像装置において、
前記撮像素子上に前記被写体像を形成する光学系をさらに備え、
前記光学系の光軸と前記撮像素子の受光面とが交差する光軸位置と前記所定の画素位置とを結ぶ線分と、前記第1の焦点検出画素配列の配列方向とによって形成される第1の交角の角度、および前記線分と、前記第2の焦点検出画素配列の配列方向とによって形成される第2の交角の角度のうち、他方に比して角度が大きい一方の交角を形成する焦点検出画素配列に含まれる前記複数の焦点検出画素が前記所定の画素位置に配置されることを特徴とする撮像装置。
The imaging device according to claim 2,
further comprising an optical system that forms the subject image on the image sensor,
wherein, of a first intersection angle formed between the arrangement direction of the first focus detection pixel array and a line segment connecting the predetermined pixel position to the optical axis position at which the optical axis of the optical system intersects the light receiving surface of the image sensor, and a second intersection angle formed between the arrangement direction of the second focus detection pixel array and the line segment, the plurality of focus detection pixels included in the focus detection pixel array forming the larger one of the two intersection angles are arranged at the predetermined pixel position.
JP2009182346A 2009-08-05 2009-08-05 Imaging device Active JP5381472B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009182346A JP5381472B2 (en) 2009-08-05 2009-08-05 Imaging device


Publications (2)

Publication Number Publication Date
JP2011033975A true JP2011033975A (en) 2011-02-17
JP5381472B2 JP5381472B2 (en) 2014-01-08

Family

ID=43763088

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009182346A Active JP5381472B2 (en) 2009-08-05 2009-08-05 Imaging device

Country Status (1)

Country Link
JP (1) JP5381472B2 (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03218430A (en) * 1989-11-14 1991-09-26 Nikon Corp Focal point detection photometry device
WO2008032820A1 (en) * 2006-09-14 2008-03-20 Nikon Corporation Imaging element and imaging device
JP2007293370A (en) * 2007-07-30 2007-11-08 Olympus Corp Imaging apparatus

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9060118B2 (en) 2012-05-30 2015-06-16 Samsung Electronics Co., Ltd. Image systems and sensors having focus detection pixels therein
US9571762B2 (en) 2013-05-22 2017-02-14 Sony Corporation Signal processor and signal processing method, solid-state imaging apparatus, and electronic device
US10212332B2 (en) 2014-06-24 2019-02-19 Sony Corporation Image sensor, calculation method, and electronic device for autofocus
US9420164B1 (en) 2015-09-24 2016-08-16 Qualcomm Incorporated Phase detection autofocus noise reduction
US9729779B2 (en) 2015-09-24 2017-08-08 Qualcomm Incorporated Phase detection autofocus noise reduction
US9804357B2 (en) 2015-09-25 2017-10-31 Qualcomm Incorporated Phase detection autofocus using masked and unmasked photodiodes
CN111683234A (en) * 2020-06-04 2020-09-18 深圳开立生物医疗科技股份有限公司 Endoscope imaging method and device and related equipment

Also Published As

Publication number Publication date
JP5381472B2 (en) 2014-01-08

Similar Documents

Publication Publication Date Title
JP5454223B2 (en) camera
JP5012495B2 (en) IMAGING ELEMENT, FOCUS DETECTION DEVICE, FOCUS ADJUSTMENT DEVICE, AND IMAGING DEVICE
JP5381472B2 (en) Imaging device
JP5423111B2 (en) Focus detection apparatus and imaging apparatus
JP5609098B2 (en) Imaging device
JP5278123B2 (en) Imaging device
JP5600941B2 (en) Focus detection apparatus and imaging apparatus
JP5804105B2 (en) Imaging device
JP5338112B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5962830B2 (en) Focus detection device
JP5407314B2 (en) Focus detection apparatus and imaging apparatus
JP2010091848A (en) Focus detecting apparatus and imaging apparatus
JP5338113B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5804104B2 (en) Focus adjustment device
JP5685892B2 (en) Focus detection device, focus adjustment device, and imaging device
JP5614227B2 (en) Focus detection apparatus and imaging apparatus
JP5338119B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5332384B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5399627B2 (en) Focus detection device, focus adjustment device, and imaging device
JP5338118B2 (en) Correlation calculation device, focus detection device, and imaging device
JP4968009B2 (en) Correlation calculation method, correlation calculation device, focus detection device, and imaging device
JP4968010B2 (en) Correlation calculation method, correlation calculation device, focus detection device, and imaging device
JP2011053519A (en) Imaging device, and imaging apparatus
JP5476702B2 (en) Imaging device and imaging apparatus
JP2019091093A (en) Focus adjustment device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20120806

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20130621

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130625

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130814

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130903

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130916

R150 Certificate of patent or registration of utility model

Ref document number: 5381472

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
