JP2010139624A - Imaging element and imaging apparatus - Google Patents

Imaging element and imaging apparatus

Info

Publication number
JP2010139624A
Authority
JP
Japan
Prior art keywords
focus detection
imaging
pixel
pixels
detection pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2008314315A
Other languages
Japanese (ja)
Other versions
JP5407314B2 (en)
Inventor
Yosuke Kusaka
洋介 日下
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2008314315A
Publication of JP2010139624A
Application granted
Publication of JP5407314B2
Status: Expired - Fee Related
Anticipated expiration


Abstract

PROBLEM TO BE SOLVED: To detect defocus up to a large defocus amount while keeping image degradation to a minimum.

SOLUTION: An image sensor has, in part of a two-dimensional array of imaging pixels on an imaging surface, at least one focus detection pixel row in which first focus detection pixels 313, which receive one of a pair of light beams passing through a pair of regions of the exit pupil of an optical system forming an image on the imaging surface, and second focus detection pixels 314, which receive the other light beam, are arranged in a line. The array of first focus detection pixels 313 and second focus detection pixels 314 is divided into a plurality of regions; in at least one of the regions the first focus detection pixels 313 and the second focus detection pixels 314 are arranged alternately, and in at least one other region no first focus detection pixels 313 are arranged and the second focus detection pixels 314 alternate with one or more imaging pixels.

COPYRIGHT: (C)2010,JPO&INPIT

Description

The present invention relates to an image sensor and an imaging apparatus.

There is known an imaging apparatus with a focus detection function of the so-called pupil-division phase-difference detection type. It comprises an image sensor in which an array of focus detection pixels, which generate a pair of image signals corresponding to a pair of images formed by a pair of light beams passing through an optical system, is mixed into the array of imaging pixels; an image signal is generated from the outputs of the imaging pixels, and the focus adjustment state of the optical system is detected from the shift amount between the pair of image signals generated by the focus detection pixels (see, for example, Patent Document 1).
In such an imaging apparatus, each focus detection pixel consists of a microlens and a photoelectric conversion unit arranged behind it. A shift amount is calculated from the degree of correlation obtained when the pair of image signals generated by the focus detection pixel array is relatively shifted, and focus detection is performed by converting this shift amount into a defocus amount. When image data is generated, the imaging pixel data at each focus detection pixel position is obtained by interpolation from the pixel data of the imaging pixels arranged around the focus detection pixel.
JP 2007-270312 A

However, in order to detect a large defocus amount, the shift range of the pair of image signals must also be made large, which inevitably requires a large number of focus detection pixels. As a result, the amount of interpolated pixel data also increases, and image quality tends to deteriorate in proportion to the number of interpolated pixel values.

(1) The invention of claim 1 is an image sensor in which at least one focus detection pixel row is arranged in part of an array of imaging pixels arrayed two-dimensionally on an imaging surface, the row consisting of first focus detection pixels, which receive one of a pair of light beams passing through a pair of regions of the exit pupil of an optical system that forms an image on the imaging surface, and second focus detection pixels, which receive the other light beam, arranged in a line. The array of first focus detection pixels and second focus detection pixels is divided into a plurality of regions; in at least one of the regions the first focus detection pixels and the second focus detection pixels are arranged alternately, and in at least one other region no first focus detection pixels are arranged and the second focus detection pixels alternate with one or more imaging pixels.
(2) The invention of claim 2 is the image sensor according to claim 1, in which imaging pixels having a green filter, imaging pixels having a red filter and imaging pixels having a blue filter are arranged on the imaging surface in a Bayer array; the first focus detection pixels are placed at positions of imaging pixels having a red filter or a blue filter in the Bayer array, and the second focus detection pixels are placed at positions of imaging pixels having a green filter in the Bayer array.
(3) The invention of claim 3 is the image sensor according to claim 1 or 2, in which each of the first focus detection pixels and the second focus detection pixels has a microlens and a photoelectric conversion unit.
(4) The invention of claim 4 is the image sensor according to any one of claims 1 to 3, in which the region in which the second focus detection pixels alternate with one or more imaging pixels is a region that includes an end of the focus detection pixel row.
(5) The invention of claim 5 is the image sensor according to any one of claims 1 to 4, in which the region in which the first focus detection pixels and the second focus detection pixels are arranged alternately is the central region of the focus detection pixel row.
(6) The invention of claim 6 is an imaging apparatus comprising: the image sensor according to any one of claims 1 to 5; shift amount calculation means that calculates the shift amount between the pair of images formed by the pair of light beams by shifting a second data string, generated by the array of second focus detection pixels, relative to a first data string, generated by the array of first focus detection pixels; conversion means that converts the shift amount into a defocus amount between the imaging surface and the focusing plane of the optical system; and interpolation means that interpolates the output data of virtual imaging pixels at the positions of the first focus detection pixels and the second focus detection pixels from the output data of the imaging pixels surrounding those focus detection pixels.
(7) The invention of claim 7 is the imaging apparatus according to claim 6, in which the shift amount calculation means shifts the second data string relative to the first data string in units of the arrangement pitch of the second focus detection pixels, calculates the degree of correlation between the first data string and the second data string, and calculates the relative shift amount between the first data string and the second data string according to the shift amount showing the highest degree of correlation.
(8) The invention of claim 8 is the imaging apparatus according to claim 7, in which the shift amount calculation means interpolates the shift amount to a resolution finer than the arrangement pitch on the basis of the correlation values at the shift amount showing the highest degree of correlation and at the shift amounts before and after it, and calculates the shift amount between the first data string and the second data string on the basis of the interpolated shift amount.
(9) The invention of claim 9 is the imaging apparatus according to any one of claims 6 to 8, in which, when the interpolation means interpolates the output data of a virtual imaging pixel at the position of a second focus detection pixel from the output data of the imaging pixels around that second focus detection pixel and an imaging pixel exists between that second focus detection pixel and another second focus detection pixel, the output data of that imaging pixel is also used for the interpolation.

According to the present invention, in an image sensor in which pupil-division focus detection pixels are mixed with imaging pixels, a large defocus amount can be detected while the degree of image degradation is kept to a minimum.

An interchangeable-lens digital still camera is described as an example of an imaging apparatus equipped with the image sensor of an embodiment. FIG. 1 is a cross-sectional view showing the configuration of the camera of the embodiment. A digital still camera 201 of the embodiment consists of an interchangeable lens 202 and a camera body 203; the interchangeable lens 202 is mounted on the camera body 203 via a mount unit 204. Interchangeable lenses 202 with various photographing optical systems can be mounted on the camera body 203 via the mount unit 204.

The interchangeable lens 202 includes a lens 209, a zooming lens 208, a focusing lens 210, an aperture 211, a lens drive control device 206, and so on. The lens drive control device 206 consists of a microcomputer (not shown), memory, drive control circuits, and the like. It performs drive control for focus adjustment of the focusing lens 210 and for adjusting the opening diameter of the aperture 211, detects the states of the zooming lens 208, the focusing lens 210 and the aperture 211, and transmits lens information and receives camera information through communication with a body drive control device 214 described later. The aperture 211 forms an opening of variable diameter centered on the optical axis in order to adjust the amount of light and the amount of blur.

The camera body 203 includes an image sensor 212, the body drive control device 214, a liquid crystal display element drive circuit 215, a liquid crystal display element 216, an eyepiece lens 217, a memory card 219, and so on. In the image sensor 212, imaging pixels are arranged two-dimensionally, and focus detection pixels are incorporated at portions corresponding to focus detection positions. The image sensor 212 is described in detail later.

The body drive control device 214 consists of a microcomputer, memory, drive control circuits, and the like. It repeatedly performs drive control of the image sensor 212, reading of the image signal and the focus detection signals, focus detection calculation based on the focus detection signals, and focus adjustment of the interchangeable lens 202, and it also performs processing and recording of the image signal, operation control of the camera, and so on. The body drive control device 214 communicates with the lens drive control device 206 via an electrical contact 213 to receive lens information and to transmit camera information (defocus amount, aperture value, and so on).

The liquid crystal display element 216 functions as an electronic viewfinder (EVF: Electronic View Finder). The liquid crystal display element drive circuit 215 displays a through image from the image sensor 212 on the liquid crystal display element 216, and the photographer can observe the through image via the eyepiece lens 217. The memory card 219 is image storage in which images captured by the image sensor 212 are stored.

A subject image is formed on the light-receiving surface of the image sensor 212 by the light beam that has passed through the interchangeable lens 202. The subject image is photoelectrically converted by the image sensor 212, and an image signal and focus detection signals are sent to the body drive control device 214.

The body drive control device 214 calculates the defocus amount based on the focus detection signals from the focus detection pixels of the image sensor 212 and sends this defocus amount to the lens drive control device 206. It also processes the image signal from the image sensor 212 to generate an image and stores it in the memory card 219, and it sends the through image signal from the image sensor 212 to the liquid crystal display element drive circuit 215 so that the through image is displayed on the liquid crystal display element 216. Furthermore, the body drive control device 214 sends aperture control information to the lens drive control device 206 to control the opening of the aperture 211.

The lens drive control device 206 updates the lens information according to the focusing state, the zooming state, the aperture setting state, the maximum-aperture F-number, and so on. Specifically, it detects the positions of the zooming lens 208 and the focusing lens 210 and the aperture value of the aperture 211, and either calculates the lens information from these lens positions and the aperture value or selects lens information corresponding to the lens positions and the aperture value from a lookup table prepared in advance.

The lens drive control device 206 calculates a lens drive amount based on the received defocus amount and drives the focusing lens 210 to the in-focus position according to the lens drive amount. It also drives the aperture 211 according to the received aperture value.

FIG. 2 shows focus detection positions on the photographing screen of the interchangeable lens 202, and gives an example of the regions (focus detection areas, focus detection positions) in which a focus detection pixel row on the image sensor 212, described later, samples the image on the photographing screen during focus detection. In this example, focus detection areas 101 to 103 are arranged at three locations on the rectangular photographing screen 100: at the center, and above and below it. Focus detection pixels are arranged linearly in the longitudinal direction of each focus detection area shown as a rectangle. Although the focus detection areas 101 to 103 are shown here as rectangles extending in the vertical direction of the figure, they are not limited to this; they may be rectangles extending in the horizontal or diagonal direction.

FIG. 3 is a front view showing the detailed configuration of the image sensor 212, with the vicinity of the focus detection area 101 on the image sensor 212 enlarged. Since the focus detection pixel array is long, the vicinity of its upper end, its center, and its lower end are shown separately. In the image sensor 212, imaging pixels 310 are densely arranged in a two-dimensional square lattice (a Bayer array of green, red and blue pixels), and at the position corresponding to the focus detection area 101, focus detection pixels 313 and 314 are arranged alternately on a vertical straight line. However, near the upper end and the lower end of the focus detection pixel array, only focus detection pixels 314 are arranged, and imaging pixels 310 (blue pixels) are placed at the pixel positions where focus detection pixels 313 would otherwise be. The focus detection pixels 313 and 314 are placed at the positions where the blue pixels and the green pixels of the imaging pixels would otherwise be, respectively. Although not illustrated, the configurations in the vicinity of the focus detection areas 102 and 103 are the same as the configuration shown in FIG. 3.
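Purely as an illustration (not taken from the patent), the column layout just described can be sketched in a few lines of Python: Bayer blue and green sites alternate along the column, focus detection pixels 313 occupy blue sites only in the central region, and focus detection pixels 314 occupy every green site. The column length and the size of the central region are arbitrary choices made for this sketch.

```python
# Illustrative sketch of one focus detection pixel column of the kind shown in FIG. 3.
# Blue sites (even positions here) hold pixel 313 only in the central region and
# revert to ordinary blue imaging pixels (B) near the ends; green sites always
# hold pixel 314. Length and region split are arbitrary illustration values.

def focus_detection_column(n_pixels=31, center_fraction=1/3):
    start = int(n_pixels * (1 - center_fraction) / 2)
    end = n_pixels - start
    column = []
    for i in range(n_pixels):
        blue_site = (i % 2 == 0)
        if blue_site:
            column.append("313" if start <= i < end else "B")
        else:
            column.append("314")
    return column

if __name__ == "__main__":
    print(" ".join(focus_detection_column()))
```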

As shown in FIG. 4, the imaging pixel 310 consists of a microlens 10, a photoelectric conversion unit 11, and a color filter (not shown). There are three types of color filters, red (R), green (G) and blue (B), and their spectral sensitivities have the characteristics shown in FIG. 6. In the image sensor 212, imaging pixels 310 with these color filters are arranged in a Bayer array.

As shown in FIG. 5(a), the focus detection pixel 313 consists of the microlens 10 and a photoelectric conversion unit 13, and the photoelectric conversion unit 13 is rectangular. As shown in FIG. 5(b), the focus detection pixel 314 consists of the microlens 10 and a photoelectric conversion unit 14, and the photoelectric conversion unit 14 is rectangular. If the focus detection pixel 313 and the focus detection pixel 314 are drawn with their microlenses 10 superimposed, the photoelectric conversion units 13 and 14 line up in the vertical direction.

The focus detection pixels 313 and 314 are not provided with color filters, in order to gather as much light as possible. Their spectral characteristics are the combination of the spectral sensitivity of the photodiode that performs the photoelectric conversion and the spectral characteristics of an infrared cut filter (not shown) (see FIG. 7). That is, the spectral characteristics are like the sum of the spectral characteristics of the green, red and blue pixels shown in FIG. 6, and the wavelength range over which they are sensitive covers the sensitive wavelength ranges of the green, red and blue pixels.

The focus detection pixels 313 and 314 are arranged in a column where the B and G imaging pixels 310 would otherwise be arranged. The reason is that, when an interpolation error occurs in the interpolation processing that obtains the imaging signal at a focus detection pixel position, the interpolation error of a blue pixel is, because of human visual characteristics, less noticeable than that of a red pixel.

The photoelectric conversion unit 11 of the imaging pixel 310 is designed with a shape such that, via the microlens 10, it receives the entire light beam passing through the exit pupil of the brightest interchangeable lens (for example F1.0). The photoelectric conversion units 13 and 14 of the focus detection pixels 313 and 314 are designed with shapes such that, via the microlens 10, they receive the light beams passing through a pair of predetermined regions of the exit pupil of the interchangeable lens.

FIG. 8 is a cross-sectional view of the imaging pixel 310. In the imaging pixel 310, the microlens 10 is placed in front of the photoelectric conversion unit 11 for imaging, and the shape of the photoelectric conversion unit 11 is projected forward by the microlens 10. The photoelectric conversion unit 11 is formed on a semiconductor circuit substrate 29. The color filter (not shown) is placed between the microlens 10 and the photoelectric conversion unit 11.

FIG. 9(a) is a cross-sectional view of the focus detection pixel 313. In the focus detection pixel 313 arranged in the focus detection area 101 at the center of the screen, the microlens 10 is placed in front of the photoelectric conversion unit 13, and the shape of the photoelectric conversion unit 13 is projected forward by the microlens 10. The photoelectric conversion unit 13 is formed on the semiconductor circuit substrate 29, and the microlens 10 is formed on it integrally and fixedly by the manufacturing process of the semiconductor image sensor. The cross-sectional structure of the focus detection pixels 313 arranged in the focus detection areas 102 and 103 at the top and bottom of the screen is the same as that shown in FIG. 9(a).

FIG. 9(b) is a cross-sectional view of the focus detection pixel 314. In the focus detection pixel 314 arranged in the focus detection area 101 at the center of the screen, the microlens 10 is placed in front of the photoelectric conversion unit 14, and the shape of the photoelectric conversion unit 14 is projected forward by the microlens 10. The photoelectric conversion unit 14 is formed on the semiconductor circuit substrate 29, and the microlens 10 is formed on it integrally and fixedly by the manufacturing process of the semiconductor image sensor. The cross-sectional structure of the focus detection pixels 314 arranged in the focus detection areas 102 and 103 at the top and bottom of the screen is the same as that shown in FIG. 9(b).

FIG. 10 shows the configuration of a focus detection optical system of the pupil-division phase-difference detection type using microlenses. The focus detection pixel portion is shown enlarged. In the figure, 90 is the exit pupil set at a distance d in front of the microlenses arranged on the planned imaging plane of the interchangeable lens 202 (see FIG. 1). This distance d is determined by the curvature and refractive index of the microlens, the distance between the microlens and the photoelectric conversion unit, and so on, and is called the distance-measuring pupil distance in this specification. 91 is the optical axis of the interchangeable lens, 10 are microlenses, 11, 13 and 14 are photoelectric conversion units, 310 are imaging pixels, 313 and 314 are focus detection pixels, 71 is the photographing light beam, and 73 and 74 are focus detection light beams.

95 is the region onto which the photoelectric conversion unit 11 is projected by the microlens 10. 93 is the region onto which the photoelectric conversion unit 13 is projected by the microlens 10, and it is called a distance-measuring pupil in this specification. Similarly, 94 is the region onto which the photoelectric conversion unit 14 is projected by the microlens 10, and it is likewise called a distance-measuring pupil. In FIG. 10, the regions 93, 94 and 95 are shown as ellipses for ease of explanation, but in reality each has the shape of the enlarged projection of the corresponding photoelectric conversion unit.

FIG. 10 schematically shows four pixels adjacent to the photographing optical axis (the focus detection pixels 313 and 314 and two imaging pixels 310), but in the other focus detection pixels and imaging pixels of the focus detection area 101, and likewise in the focus detection pixels and imaging pixels of the focus detection areas 102 and 103 at the periphery of the screen, each photoelectric conversion unit is configured to receive the light beam arriving at its microlens from the corresponding region 95 or distance-measuring pupil 93 or 94. The arrangement direction of the focus detection pixels is made to coincide with the direction in which the pair of distance-measuring pupils are aligned, that is, the direction in which the pair of photoelectric conversion units are aligned.

The microlenses 10 are arranged near the planned imaging plane of the interchangeable lens 202 (see FIG. 1). The shapes of the photoelectric conversion units 11, 13 and 14 arranged behind them are projected by the microlenses 10 onto the exit pupil 90, which is separated from the microlenses by the distance-measuring pupil distance d, and the projected shapes form the region 95 and the distance-measuring pupils 93 and 94. In other words, the relative positional relationship between the microlens and the photoelectric conversion unit in each focus detection pixel is determined so that the projected shapes (distance-measuring pupils 93 and 94) of the photoelectric conversion units of the focus detection pixels coincide on the exit pupil 90 at the projection distance d, and the projection direction of the photoelectric conversion unit in each focus detection pixel is thereby determined.

The photoelectric conversion unit 13 outputs a signal corresponding to the intensity of the image formed on its microlens 10 by the light beam 73 that passes through the distance-measuring pupil 93 and travels toward the microlens 10 of the focus detection pixel 313. The photoelectric conversion unit 14 outputs a signal corresponding to the intensity of the image formed on its microlens 10 by the light beam 74 that passes through the distance-measuring pupil 94 and travels toward the microlens 10 of the focus detection pixel 314. Similarly, the photoelectric conversion unit 11 outputs a signal corresponding to the intensity of the image formed on its microlens 10 by the light beam 92 that passes through the region 95 and travels toward the microlens 10 of the imaging pixel.

By arranging a large number of the two types of focus detection pixels described above in a straight line and grouping the outputs of their photoelectric conversion units into an output group corresponding to the distance-measuring pupil 93 and an output group corresponding to the distance-measuring pupil 94, information is obtained about the intensity distributions of the pair of images formed on the pixel row by the focus detection light beams passing through the distance-measuring pupil 93 and the distance-measuring pupil 94, respectively. By applying the image shift detection calculation processing (correlation processing, phase-difference detection processing) described later to this information, the image shift amount of the pair of images is detected by the so-called pupil-division phase-difference detection method. Furthermore, by applying to the image shift amount a conversion that depends on the distance between the centers of gravity of the pair of distance-measuring pupils, the deviation (defocus amount) of the current imaging plane (the imaging plane at the focus detection position corresponding to the position of the microlens array on the planned imaging plane) from the planned imaging plane is calculated.

FIG. 11 is a flowchart showing the imaging operation of the digital still camera (imaging apparatus) of the embodiment. When the power of the camera is turned on in step 100, the body drive control device 214 starts the imaging operation from step 110 onward. In step 110, the data of the imaging pixels is read out with thinning and displayed on the electronic viewfinder. In the following step 120, a pair of image data corresponding to the pair of images is read out from the focus detection pixel row. It is assumed that the photographer has selected one of the focus detection areas 101 to 103 in advance with a focus detection area selection member (not shown).

In step 130, the image shift detection calculation processing (correlation processing) described later is performed on the basis of the pair of image data that has been read out, and the image shift amount is calculated and converted into a defocus amount. In step 140 it is checked whether the camera is near the in-focus state, that is, whether the absolute value of the calculated defocus amount is within a predetermined value. If it is determined that the camera is not near the in-focus state, the flow proceeds to step 150, the defocus amount is transmitted to the lens drive control device 206, and the focusing lens 210 of the interchangeable lens 202 is driven to the in-focus position. The flow then returns to step 110 and the above operation is repeated.

When focus detection is impossible, the flow also branches to this step (step 150); a scan drive command is transmitted to the lens drive control device 206, and the focusing lens 210 of the interchangeable lens 202 is scan-driven between infinity and the closest distance. The flow then returns to step 110 and the above operation is repeated.

If it is determined in step 140 that the camera is near the in-focus state, the flow proceeds to step 160, and it is determined whether a shutter release has been performed by operating a shutter button (not shown). If it is determined that a shutter release has not been performed, the flow returns to step 110 and the above operation is repeated. If it is determined that a shutter release has been performed, the flow proceeds to step 170, an aperture adjustment command is transmitted to the lens drive control device 206, and the aperture value of the interchangeable lens 202 is set to the control F-number (an F-number set by the photographer or automatically). When the aperture control has finished, the image sensor 212 is made to perform an imaging operation, and image data is read out from the imaging pixels 310 and all the focus detection pixels 313 and 314 of the image sensor 212.

In step 180, the pixel data at each pixel position of the focus detection pixel row is interpolated on the basis of the data of the imaging pixels around the focus detection pixels. In the following step 190, the image data consisting of the imaging pixel data and the interpolated data is stored in the memory card 219, and the flow returns to step 110 and the above operation is repeated.
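For orientation only, the control flow of steps 110 to 190 can be summarized in schematic Python. Every function and object name below (camera, lens, read_thinned_image, and so on) is a hypothetical placeholder invented for this sketch; only the branch structure follows the flowchart described above.

```python
# Schematic sketch of the imaging/AF loop of FIG. 11 (steps 110-190).
# All calls are hypothetical placeholders; only the control flow mirrors the text.

def imaging_loop(camera, lens, focus_threshold=0.05):
    while camera.powered_on():
        camera.show_live_view(camera.read_thinned_image())      # step 110
        a1, a2 = camera.read_focus_detection_data()             # step 120
        defocus, reliable = camera.compute_defocus(a1, a2)      # step 130

        if not reliable:                                        # focus detection failed
            lens.scan_drive()                                   # scan from infinity to closest
            continue
        if abs(defocus) > focus_threshold:                      # step 140: not near focus
            lens.drive_focus(defocus)                           # step 150
            continue

        if camera.shutter_released():                           # step 160
            lens.set_aperture(camera.control_f_number())        # step 170
            raw = camera.capture_full_frame()
            image = camera.interpolate_focus_pixels(raw)        # step 180
            camera.memory_card.store(image)                     # step 190
```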

Next, the details of the image shift detection calculation processing (correlation processing) in step 130 of FIG. 11 are described below.

For image shift detection, the array of focus detection pixels 313, which exist only in the central part of the focus detection areas 101 to 103, serves as the reference data string, while the array of focus detection pixels 314, which exist in the upper end part, the central part and the lower end part, serves as the secondary data string (see FIG. 3). The reference data string (A1n, n = 0 to N) generated by the array of focus detection pixels 313 is held fixed, and the secondary data string (A2n, n = −m to N+m) generated by the array of focus detection pixels 314 is shifted relative to it in steps of the data interval (data pitch) of the data strings; the correlation amount at each shift is calculated by equation (1) below.
C(k) = Σ|A1n − A2n+k|   …(1)
In equation (1), the Σ operation sums over n from 0 to N. The shift amount k is an integer, and the calculation of equation (1) is performed in the range k = −m to +m.
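As an illustration only (not part of the patent), equation (1) can be transcribed directly into Python. The sketch below assumes the reference data A1_0..A1_N and the secondary data A2_(−m)..A2_(N+m) are held in plain lists; the variable names and the indexing offset are choices made for this sketch.

```python
# Sketch of the correlation amount C(k) of equation (1).
# a1: reference data string A1_0 .. A1_N from focus detection pixels 313
# a2: secondary data string A2_(-m) .. A2_(N+m) from focus detection pixels 314,
#     stored as a plain list so that a2[n + m] corresponds to A2_n.

def correlation(a1, a2, m):
    n_max = len(a1) - 1                      # N
    assert len(a2) == n_max + 1 + 2 * m
    c = {}
    for k in range(-m, m + 1):
        c[k] = sum(abs(a1[n] - a2[n + k + m]) for n in range(n_max + 1))
    return c                                 # dict: shift k -> correlation amount C(k)
```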

FIG. 12 is a graph showing the waveforms of the reference data string and the secondary data string, with the data value on the vertical axis and the data position on the horizontal axis; the open circles (○) indicate the reference data string corresponding to the focus detection pixels 313, and the filled circles (●) indicate the secondary data string corresponding to the focus detection pixels 314. To outline the calculation of equation (1) with FIG. 12: the range α of the reference data string is fixed, and the calculation of equation (1) is performed while the range of the secondary data string is changed to β1, β2, ..., β8 according to the shift amount k (k = −4, −3, ..., +3).

As shown in FIG. 13(a), the result of equation (1) is that the correlation amount C(k) becomes minimal (the smaller the value, the higher the degree of correlation) at the shift amount at which the correlation between the pair of data is high (k = kj = 2 in FIG. 13(a)). The shift amount x that gives the minimum value C(x) of the continuous correlation amount is obtained by the three-point interpolation method of equations (2) to (5) below.
x = kj + D/SLOP   …(2)
C(x) = C(kj) − |D|   …(3)
D = {C(kj−1) − C(kj+1)}/2   …(4)
SLOP = MAX{C(kj+1) − C(kj), C(kj−1) − C(kj)}   …(5)

Whether the shift amount x calculated by equation (2) is reliable is determined as follows. As shown in FIG. 13(b), when the degree of correlation between the pair of data is low, the interpolated minimum value C(x) of the correlation amount becomes large. Therefore, if C(x) is equal to or greater than a predetermined threshold, the calculated shift amount is judged to have low reliability, and the calculated shift amount x is canceled. Alternatively, in order to normalize C(x) by the contrast of the data, if the value obtained by dividing C(x) by SLOP, which is proportional to the contrast, is equal to or greater than a predetermined value, the calculated shift amount is judged to have low reliability, and the calculated shift amount x is canceled. Alternatively again, if SLOP, which is proportional to the contrast, is equal to or less than a predetermined value, the subject is judged to have low contrast and the calculated shift amount to have low reliability, and the calculated shift amount x is canceled. As shown in FIG. 13(c), when the degree of correlation between the pair of data is low and there is no dip in the correlation amount C(k) within the shift range kmin to kmax, the minimum value C(x) cannot be obtained; in such a case, focus detection is judged to be impossible.
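The reliability tests just described might be sketched as follows, using the C(x) and SLOP values from the previous snippet; the threshold values are arbitrary placeholders and are not specified in the patent.

```python
# Sketch of the reliability checks on the interpolated minimum C(x).
# The numerical thresholds are illustrative placeholders only.

def is_reliable(cx, slop, cx_threshold=100.0, normalized_threshold=0.5,
                slop_threshold=10.0):
    if cx >= cx_threshold:                 # minimum too large: low correlation
        return False
    if slop <= slop_threshold:             # low-contrast subject
        return False
    if cx / slop >= normalized_threshold:  # C(x) normalized by contrast too large
        return False
    return True
```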

If the calculated shift amount x is judged to be reliable, it is converted into an image shift amount shft by equation (6) below.
shft = PY·(x − 0.5)   …(6)
In equation (6), PY is the detection pitch (the actual interval between the focus detection pixels 313, or between the focus detection pixels 314), and −0.5 is an offset amount that corrects for the fact that the focus detection pixels 313 and the focus detection pixels 314 are offset from each other by half the detection pitch. The image shift amount calculated by equation (6) is multiplied by a predetermined conversion coefficient a and converted into a defocus amount def.
def = a·shft   …(7)

When the distance-measuring pupils are vignetted by the aperture opening of the lens and the light-amount balance of the pair of images detected by the focus detection pixels is lost, a correlation calculation such as equation (8), which maintains image shift detection accuracy against such an imbalance of light amounts, may be applied instead.
C(k) = Σ|A1n·A2n+1+k − A2n+k·A1n+1|   …(8)
In equation (8), the Σ operation sums over n from 0 to N−1.
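Equation (8) differs from equation (1) only in the term being summed. A sketch under the same list and indexing convention as the earlier correlation snippet:

```python
# Sketch of the cross-term correlation of equation (8), which is less sensitive
# to a light-amount imbalance between the two image signals.
# Same convention as before: a2[n + m] corresponds to A2_n.

def correlation_vignetting_robust(a1, a2, m):
    n_max = len(a1) - 1                          # N
    c = {}
    for k in range(-m, m + 1):
        c[k] = sum(abs(a1[n] * a2[n + 1 + k + m] - a2[n + k + m] * a1[n + 1])
                   for n in range(n_max))        # n = 0 .. N-1
    return c
```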

The details of the interpolation calculation of the image data at the focus detection pixel positions in step 180 of FIG. 11 are described with reference to FIGS. 14 and 15. In the following, the processing is described for one particular focus detection pixel, but the same processing is applied to calculating the image data at every focus detection pixel position.

FIG. 14 is an enlarged view of a region that includes focus detection pixels (hatched) in the central part of the focus detection pixel array, and shows the imaging pixel data (B11 to B75) corresponding to a region of 7 rows and 5 columns, in pixel units, laid out according to the pixel layout. B13a, G23a, B33a, G43a, B53a, G63a and B73a are the data of the virtual imaging pixels assumed to be located at the focus detection pixel positions (the imaging pixel data to be interpolated). In the following, the processing for interpolating B33a and G43a is described as representative; by applying the same processing, the data of the virtual imaging pixels at the other focus detection pixel positions can be interpolated.
B33a = (B11 + B31 + B51 + B15 + B35 + B55)/6,
G43a = (G32 + G41 + G52 + G34 + G45 + G54)/6   …(9)

FIG. 15 is an enlarged view of a region that includes focus detection pixels (hatched) at the upper end or lower end of the focus detection pixel array, and shows the imaging pixel data (B11 to B75) corresponding to a region of 7 rows and 5 columns, in pixel units, laid out according to the pixel layout. G23a, G43a and G63a are the data of the virtual imaging pixels assumed to be located at the focus detection pixel positions (the imaging pixel data to be interpolated). In the following, the processing for interpolating G43a is described as representative; by applying the same processing, the data of the virtual imaging pixels at the other focus detection pixel positions can be interpolated.
G43a = (G32 + G41 + G52 + G34 + G45 + G54)/6   …(10)

As described above, in the embodiment, as shown in FIG. 3, both of the pair of focus detection pixels are arranged alternately in the central part of the focus detection pixel array, and only one of the pair is arranged in the upper end part and the lower end part. In the image shift detection calculation, the data string of the focus detection pixels arranged only in the central part is used as the reference, and the data string of the focus detection pixels arranged in the central part, the upper end part and the lower end part is shifted relative to it; a large defocus amount (a large image shift amount) can therefore be detected. At the same time, in the pixel interpolation that obtains the image data at the focus detection pixel positions in the upper end part and the lower end part (the output data of the virtual imaging pixels assumed to be placed there), only the pixel data at one type of focus detection pixel position needs to be interpolated, so degradation of image quality can be suppressed. Furthermore, because the focus detection pixels arranged in the upper end part and the lower end part are placed at the positions of the green pixels, which have a high arrangement density in the Bayer array, data of imaging pixels closer to the pixel positions to be interpolated can be used for the pixel interpolation than if they were placed at the positions of blue or red pixels, so degradation of image quality can be suppressed even further.

<< Other Embodiments of the Invention >>
In the embodiment described above, the pair of focus detection pixels is placed at positions where the green pixels and the blue pixels of the Bayer array are arranged; instead, the pair of focus detection pixels may be placed at positions where the green pixels and the red pixels of the Bayer array are arranged.

The arrangement of the focus detection areas on the image sensor is not limited to that of FIG. 2; focus detection areas may also be arranged in diagonal directions, or horizontally and vertically at other positions.

In the focus detection pixel array shown in FIG. 3, at the ends of the array the focus detection pixels 314 (one of the pair of focus detection pixels 313 and 314) and imaging pixels are arranged alternately. By further thinning out the focus detection pixels 314 at the ends of the array and placing imaging pixels in their place, the density of interpolated pixel data at the ends of the focus detection pixel array can be reduced and the interpolation accuracy of the interpolated pixel data improved, so that degradation of image quality can be suppressed even further.

FIG. 16 shows the configuration of an image sensor 212A of a modification. FIG. 16 shows only the configuration of the end part of the focus detection pixel array corresponding to the image sensor 212 of FIG. 3. At the end of the focus detection pixel array, focus detection pixels 314 and imaging pixels are arranged alternately, but in the "outermost part" the focus detection pixels 314 are thinned out further, so that one in four pixels is a focus detection pixel 314. Between the focus detection pixels 314, imaging pixels are arranged according to the Bayer array.

FIG. 17 is an enlarged view of a region that includes focus detection pixels (hatched) in the above-mentioned "outermost part" of the focus detection pixel array, and shows the imaging pixel data (B11 to G105) corresponding to a region of 10 rows and 5 columns, in pixel units, laid out according to the pixel layout. G43a and G83a are the data of the virtual imaging pixels assumed to be located at the focus detection pixel positions (the imaging pixel data to be interpolated). In the following, the processing for interpolating G43a is described as representative; by applying the same processing, the data of the virtual imaging pixels at the other focus detection pixel positions can be interpolated.
G43a = (G32 + G41 + G52 + G23 + G63 + G34 + G45 + G54)/8   …(11)

In equation (11), the interpolated green pixel data G43a is interpolated from the green pixel data in four directions around the pixel position: the horizontal green pixel data (G41, G45), the vertical green pixel data (G23, G63), the green pixel data along the 45-degree diagonal rising to the right (G32, G54), and the green pixel data along the 45-degree diagonal rising to the left (G34, G52). High interpolation accuracy can therefore be expected regardless of the directionality of the image.
In the image shift detection calculation, at the large shift amounts (that is, large defocus amounts) that use the secondary data from the outermost part of the focus detection pixel array as described above, the high-frequency components of the image have been removed by the defocus and mainly low-frequency components remain; there is therefore little risk that thinning out the focus detection pixels when sampling the image will significantly reduce the image shift detection accuracy. For shift values at which the thinned-out secondary data must also be used, the image shift detection calculation of equation (1) is performed between correspondingly thinned-out reference data and secondary data.

In the embodiment and the modification described above, both of the pair of focus detection pixels are arranged alternately in the central part of the focus detection pixel array and only one of the pair is arranged in the upper end part and the lower end part; however, the region in which both of the pair are arranged alternately need not be the central region. For example, the focus detection pixel array may be divided into two regions, with both of the pair of focus detection pixels arranged alternately in one region and only one of the pair in the other region. Alternatively, the focus detection pixel array may be divided into three or more regions, with both of the pair arranged alternately in one or more of the regions and only one of the pair in the remaining regions. However, to enable precise focus detection near the in-focus state, it is preferable to make the region in which both of the pair of focus detection pixels are arranged alternately the central region.

When the focus detection pixel array is partitioned into such regions, the image shift detection calculation may use, as the reference, the data string of the focus detection pixels arranged in only one or more of the regions, and relatively shift the data string of the focus detection pixels arranged in all of the regions. This makes it possible to detect a large defocus amount (a large image shift amount), and, in the pixel interpolation that obtains the image data at the focus detection pixel positions (the output data of the virtual imaging pixels assumed to be placed there), only the pixel data at the positions of one of the focus detection pixels needs to be interpolated, so that degradation of the image quality can be suppressed.
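A minimal Python sketch of this asymmetric shift search: the short reference data string is assumed to come from the focus detection pixels present only in the densely populated region or regions, and the longer sub data string from the focus detection pixels present in all regions; the simplified index alignment and the SAD measure are assumptions for illustration.

    def best_shift(reference, sub, max_shift):
        # Shift the long sub data string (all regions) relative to the short
        # reference data string (one or a few regions) and return the shift
        # giving the highest correlation (smallest SAD here).
        def sad(shift):
            pairs = [(r, sub[i + shift]) for i, r in enumerate(reference)
                     if 0 <= i + shift < len(sub)]
            if not pairs:
                return float("inf")
            return sum(abs(r - s) for r, s in pairs) / len(pairs)
        return min(range(-max_shift, max_shift + 1), key=sad)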

In the data interpolation calculations for the virtual imaging pixels described above, examples were shown in which a simple average of the data of the surrounding imaging pixels of the same color as the virtual imaging pixel is used. Instead, a weighted average may be performed with weighting coefficients corresponding to the positional relationship (direction, distance) between the virtual imaging pixel and the imaging pixels used for the interpolation. Alternatively, the directionality of the image around the virtual imaging pixel may be determined using the data of the surrounding imaging pixels, and a weighted average may be performed with weighting coefficients corresponding to the determined directionality.
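A minimal Python sketch of the first alternative, using inverse-distance weights as one possible choice of weighting coefficients; the patent does not specify the weights, and the function name and the (value, distance) representation are assumptions for illustration.

    def weighted_interpolate(neighbors):
        # neighbors: list of (pixel_value, distance_to_virtual_pixel) pairs
        # of the same color as the virtual imaging pixel.  Closer pixels
        # receive larger weights (inverse-distance weighting).
        weights = [1.0 / d for _, d in neighbors]
        return sum(w * v for w, (v, _) in zip(weights, neighbors)) / sum(weights)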

In the above-described embodiment and its modification, an example was shown in which the imaging pixels are arranged in an RGB Bayer array; however, a pixel array other than a Bayer array, such as a complementary color filter array, may also be used.

In the focus detection pixels 313 and 314 shown in FIG. 5 of the above-described embodiment, an example was shown in which the shape of the photoelectric conversion unit is rectangular; however, the shape of the photoelectric conversion unit of a focus detection pixel is not limited to this, and other shapes may be used. For example, the shape of the photoelectric conversion unit of a focus detection pixel may be semicircular, elliptical, or polygonal.

In the focus detection pixels 313 and 314 shown in FIG. 5 of the above-described embodiment, pupil division is performed by projecting the shapes of the photoelectric conversion units 13 and 14 forward with the microlens 10. Alternatively, the focus detection pixels may be configured so that the shapes of the photoelectric conversion units 13 and 14 are the same as the shape of the photoelectric conversion unit 11 of the imaging pixel 310, a mask having openings of the same shapes as the photoelectric conversion units 13 and 14 shown in FIG. 5 is placed in front of the photoelectric conversion units 13 and 14, and the shape of this mask is projected forward by the microlens 10. In this case, the photoelectric conversion units 13 and 14 receive the light flux that has passed through the openings of the mask.

Furthermore, in the imaging element 212 shown in FIG. 3 and the imaging element 212A shown in FIG. 16, examples were shown in which the imaging pixels and the focus detection pixels are arranged in a dense square lattice; a dense hexagonal lattice arrangement may also be used.

In the above-described embodiment, the focus detection operation of the pupil division method using microlenses has been described; however, the present invention is not limited to focus detection of this method, and is also applicable to a focus detection device of the pupil division phase difference detection type using polarizing elements, as disclosed in Japanese Laid-Open Patent Publication No. 2008-015157. Microlenses may also be provided only for the focus detection pixels.

The imaging apparatus is not limited to a digital still camera or a film still camera in which an interchangeable lens is mounted on the camera body as described above. For example, the present invention can also be applied to a lens-integrated digital still camera, a film still camera, or a video camera. Furthermore, it can be applied to a small camera module built into a mobile phone or the like, a surveillance camera, a visual recognition device for robots, an in-vehicle camera, and so on.

In the embodiments and modifications described above, any combination of the embodiments and the modifications is possible.

According to the above-described embodiment and its modifications, the following operational effects can be obtained. First, the imaging elements 212 and 212A are imaging elements in which at least one focus detection pixel row is disposed in a part of the array of imaging pixels arranged two-dimensionally on the imaging surface, the row consisting of focus detection pixels 313 that receive one light flux and focus detection pixels 314 that receive the other light flux of a pair of light fluxes having passed through a pair of regions of the exit pupil of the interchangeable lens 202 that forms an image on the imaging surface. The array of the focus detection pixels 313 and 314 is divided into a plurality of regions; in at least one of the plurality of regions the focus detection pixels 313 and the focus detection pixels 314 are arranged alternately, and in at least one other region the focus detection pixels 314 and one or a plurality of imaging pixels are arranged alternately without arranging the focus detection pixels 313. As a result, in an imaging element in which pupil-division focus detection pixels are mixed with imaging pixels, a large defocus amount can be detected while the degree of image degradation is kept to a minimum.

Next, the focus detection pixels 313 are arranged at the positions of imaging pixels having a red filter or imaging pixels having a blue filter in the Bayer array, and the focus detection pixels 314 are arranged at the positions of imaging pixels having a green filter in the Bayer array. Compared with the case where the focus detection pixels 314 are arranged at the positions of blue or red imaging pixels, data of imaging pixels closer to the pixel positions to be interpolated can be used for the pixel interpolation, so that degradation of the image quality can be suppressed further.
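A minimal Python sketch of the distance argument behind this choice, assuming a standard Bayer pattern with green pixels on the checkerboard positions; the helper functions are assumptions introduced for illustration and simply show that the nearest same-color pixels are closer to a green site than to a red or blue site.

    def bayer_color(row, col):
        # Green on the checkerboard; red on even rows, blue on odd rows.
        if (row + col) % 2 == 0:
            return "G"
        return "R" if row % 2 == 0 else "B"

    def nearest_same_color_distance(row, col, search=2):
        color = bayer_color(row, col)
        return min((dr * dr + dc * dc) ** 0.5
                   for dr in range(-search, search + 1)
                   for dc in range(-search, search + 1)
                   if (dr, dc) != (0, 0) and bayer_color(row + dr, col + dc) == color)

    print(nearest_same_color_distance(0, 0))  # green site: about 1.41 (diagonal neighbors)
    print(nearest_same_color_distance(0, 1))  # red site: 2.0 (same-color pixels two apart)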

FIG. 1 is a cross-sectional view showing the configuration of the camera of one embodiment.
FIG. 2 is a diagram showing the focus detection positions on the photographic screen of the interchangeable lens 202.
FIG. 3 is a front view showing the detailed configuration of the imaging element 212.
FIG. 4 is a diagram showing the configuration of the imaging pixel 310.
FIG. 5 is a diagram showing the configuration of the focus detection pixels 313 and 314.
FIG. 6 is a diagram showing the spectral characteristics of the imaging pixel 310.
FIG. 7 is a diagram showing the spectral characteristics of the focus detection pixels 313 and 314.
FIG. 8 is a cross-sectional view showing the structure of the imaging pixel 310.
FIG. 9 is a cross-sectional view showing the structure of the focus detection pixels 313 and 314.
FIG. 10 is a diagram showing the configuration of a focus detection optical system of the pupil division phase difference detection type using microlenses.
FIG. 11 is a flowchart showing the imaging operation of the digital still camera (imaging apparatus) of one embodiment.
FIG. 12 is a graph showing the waveforms of the reference data string and the sub data string, with the data value on the vertical axis and the data position on the horizontal axis.
FIG. 13 is a diagram for explaining the reliability of the focus detection result.
FIG. 14 is a diagram for explaining the details of the interpolation calculation of the image data at the focus detection pixel positions in step 180 of FIG. 11.
FIG. 15 is a diagram for explaining the details of the interpolation calculation of the image data at the focus detection pixel positions in step 180 of FIG. 11.
FIG. 16 is a diagram showing the configuration of the imaging element 212A of the modification.
FIG. 17 is an enlarged view of a region at the above-mentioned "endmost part" of the focus detection pixel array that includes focus detection pixels (shaded portions).

Explanation of symbols

10: microlens; 11, 13, 14: photoelectric conversion unit; 201: camera; 202: interchangeable lens; 203: camera body; 212, 212A: imaging element; 214: body drive control device; 310: imaging pixel; 313, 314: focus detection pixel

Claims (9)

1. An imaging element in which at least one focus detection pixel row is disposed in a part of an array of imaging pixels arranged two-dimensionally on an imaging surface, the focus detection pixel row being a row in which first focus detection pixels that receive one light flux and second focus detection pixels that receive the other light flux, of a pair of light fluxes having passed through a pair of regions of an exit pupil of an optical system that forms an image on the imaging surface, are arranged in a line, wherein
the focus detection pixel row is divided into a plurality of regions; in at least one region of the plurality of regions the first focus detection pixels and the second focus detection pixels are arranged alternately; and in at least one other region the second focus detection pixels and one or a plurality of the imaging pixels are arranged alternately, without arranging the first focus detection pixels.
2. The imaging element according to claim 1, wherein
the imaging pixels having a green filter, the imaging pixels having a red filter and the imaging pixels having a blue filter are arranged on the imaging surface in a Bayer array,
the first focus detection pixels are arranged at positions of the imaging pixels having a red filter or the imaging pixels having a blue filter in the Bayer array, and
the second focus detection pixels are arranged at positions of the imaging pixels having a green filter in the Bayer array.
3. The imaging element according to claim 1 or 2, wherein
each of the first focus detection pixels and the second focus detection pixels has a microlens and a photoelectric conversion unit.
4. The imaging element according to any one of claims 1 to 3, wherein
among the plurality of regions, the region in which the second focus detection pixels and one or a plurality of the imaging pixels are arranged alternately is a region that includes an end of the focus detection pixel row.
5. The imaging element according to any one of claims 1 to 4, wherein
among the plurality of regions, the region in which the first focus detection pixels and the second focus detection pixels are arranged alternately is a region at the central part of the focus detection pixel row.
6. An imaging apparatus comprising:
the imaging element according to any one of claims 1 to 5;
a shift amount calculation means that calculates a shift amount between a pair of images formed by the pair of light fluxes by shifting a second data string generated by the array of the second focus detection pixels relative to a first data string generated by the array of the first focus detection pixels;
a conversion means that converts the shift amount into a defocus amount between the imaging surface and a focusing plane of the optical system; and
an interpolation means that interpolates output data of virtual imaging pixels at the positions of the first focus detection pixels and the second focus detection pixels from output data of the imaging pixels around the first focus detection pixels and the second focus detection pixels.
7. The imaging apparatus according to claim 6, wherein
the shift amount calculation means shifts the second data string relative to the first data string by shift amounts in units of the arrangement pitch of the second focus detection pixels, calculates degrees of correlation between the first data string and the second data string, and calculates the relative shift amount between the first data string and the second data string according to the shift amount showing the highest degree of correlation.
8. The imaging apparatus according to claim 7, wherein
the shift amount calculation means interpolates a shift amount in units equal to or smaller than the arrangement pitch on the basis of the degrees of correlation at the shift amount showing the highest degree of correlation and at the shift amounts before and after it, and calculates the shift amount between the first data string and the second data string on the basis of the interpolated shift amount.
9. The imaging apparatus according to any one of claims 6 to 8, wherein
when interpolating the output data of a virtual imaging pixel at the position of a second focus detection pixel from the output data of the imaging pixels around that second focus detection pixel, if an imaging pixel exists between that second focus detection pixel and an adjacent second focus detection pixel, the interpolation means also uses the output data of that imaging pixel for the interpolation.
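A minimal Python sketch of the shift amount calculation described in claims 7 and 8: an integer-pitch search followed by interpolation of the shift amount below the pitch unit from the correlation amounts at the best shift and its two neighbors. A SAD-type correlation and an equal-angle three-point interpolation are assumed here; the patent's actual correlation formula and interpolation expression are not reproduced in this excerpt.

    import math

    def sad(ref, sub, k):
        # Correlation amount at integer shift k (smaller = higher correlation).
        pairs = [(r, sub[i + k]) for i, r in enumerate(ref) if 0 <= i + k < len(sub)]
        if not pairs:
            return float("inf")
        return sum(abs(r - s) for r, s in pairs) / len(pairs)

    def image_shift(ref, sub, max_shift):
        # Integer-pitch search (claim 7) followed by sub-pitch interpolation (claim 8).
        shifts = range(-max_shift, max_shift + 1)
        c = {k: sad(ref, sub, k) for k in shifts}
        k = min(shifts, key=c.get)              # shift with the highest correlation
        if (k - 1 not in c or k + 1 not in c
                or not math.isfinite(c[k - 1] + c[k] + c[k + 1])):
            return float(k)                     # no neighbors available for interpolation
        d = (c[k - 1] - c[k + 1]) / 2.0
        slope = max(c[k + 1] - c[k], c[k - 1] - c[k])
        return k + (d / slope if slope > 0 else 0.0)   # shift in arrangement-pitch units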
JP2008314315A 2008-12-10 2008-12-10 Focus detection apparatus and imaging apparatus Expired - Fee Related JP5407314B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008314315A JP5407314B2 (en) 2008-12-10 2008-12-10 Focus detection apparatus and imaging apparatus

Publications (2)

Publication Number Publication Date
JP2010139624A (en) 2010-06-24
JP5407314B2 (en) 2014-02-05

Family

ID=42349847

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008314315A Expired - Fee Related JP5407314B2 (en) 2008-12-10 2008-12-10 Focus detection apparatus and imaging apparatus

Country Status (1)

Country Link
JP (1) JP5407314B2 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02907A (en) * 1988-05-16 1990-01-05 Minolta Camera Co Ltd Autofocusing device
JP2007155929A (en) * 2005-12-01 2007-06-21 Nikon Corp Solid-state imaging element and imaging apparatus using the same
JP2007279312A (en) * 2006-04-05 2007-10-25 Nikon Corp Imaging element, image sensor and imaging apparatus
JP2008224801A (en) * 2007-03-09 2008-09-25 Nikon Corp Focus detector and imaging apparatus

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012118447A (en) * 2010-12-03 2012-06-21 Nikon Corp Image pickup device and camera
WO2014038258A1 (en) * 2012-09-06 2014-03-13 富士フイルム株式会社 Imaging device and focus control method
CN104603662A (en) * 2012-09-06 2015-05-06 富士胶片株式会社 Imaging device and focus control method
US9432569B2 (en) 2012-09-06 2016-08-30 Fujifilm Corporation Imaging device and focus control method
WO2014091854A1 (en) * 2012-12-11 2014-06-19 富士フイルム株式会社 Image processing device, image capture device, image processing method, and image processing program
CN104813648A (en) * 2012-12-11 2015-07-29 富士胶片株式会社 Image processing device, image capture device, image processing method, and image processing program
CN104813648B (en) * 2012-12-11 2018-06-05 富士胶片株式会社 Image processing apparatus, photographic device and image processing method

Also Published As

Publication number Publication date
JP5407314B2 (en) 2014-02-05

Similar Documents

Publication Publication Date Title
JP5012495B2 (en) IMAGING ELEMENT, FOCUS DETECTION DEVICE, FOCUS ADJUSTMENT DEVICE, AND IMAGING DEVICE
JP5092685B2 (en) Imaging device and imaging apparatus
JP5029274B2 (en) Imaging device
JP5157400B2 (en) Imaging device
JP4952060B2 (en) Imaging device
JP5163068B2 (en) Imaging device
JP5454223B2 (en) camera
JP5374862B2 (en) Focus detection apparatus and imaging apparatus
JP2009128892A (en) Imaging sensor and image-capturing device
JP5699480B2 (en) Focus detection device and camera
JP5423111B2 (en) Focus detection apparatus and imaging apparatus
JP5381472B2 (en) Imaging device
JP5278123B2 (en) Imaging device
JP5407314B2 (en) Focus detection apparatus and imaging apparatus
JP2011232371A (en) Imaging apparatus
JP5228777B2 (en) Focus detection apparatus and imaging apparatus
JP5338112B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5804105B2 (en) Imaging device
JP5338113B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5685892B2 (en) Focus detection device, focus adjustment device, and imaging device
JP2009162845A (en) Imaging device, focus detecting device and imaging apparatus
JP5332384B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5338119B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5476702B2 (en) Imaging device and imaging apparatus
JP5338118B2 (en) Correlation calculation device, focus detection device, and imaging device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20111207

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120619

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20121024

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20121030

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20121228

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130514

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130712

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20131008

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20131021

R150 Certificate of patent or registration of utility model

Ref document number: 5407314

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees