JP4992481B2 - Focus detection apparatus and imaging apparatus - Google Patents

Focus detection apparatus and imaging apparatus

Info

Publication number
JP4992481B2
Authority
JP
Japan
Prior art keywords
focus detection
distance
region
photoelectric conversion
pupil
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2007059641A
Other languages
Japanese (ja)
Other versions
JP2008224801A (en)
Inventor
Yosuke Kusaka (日下 洋介)
Current Assignee
Nikon Corp
Original Assignee
Nikon Corp
Priority date
Filing date
Publication date
Application filed by Nikon Corp
Priority to JP2007059641A
Publication of JP2008224801A
Application granted
Publication of JP4992481B2
Legal status: Active

Description

The present invention relates to a focus detection apparatus and an imaging apparatus.

A focus detection apparatus is known in which focus detection pixels of the pupil-division phase-difference detection type, each consisting of a microlens and a pair of photoelectric conversion units placed behind it, are arrayed on the imaging plane of an imaging optical system; the image shift amount of the pair of images formed by a pair of light beams arriving from the imaging optical system is calculated from the outputs of the focus detection pixels, and the focus adjustment state of the imaging optical system is thereby detected (see, for example, Patent Document 1).
In this type of focus detection apparatus, the focus detection pixel row detects the image shift amount of the pair of images that a pair of light beams, passing through a pair of regions on a plane (the ranging pupil plane) located a predetermined distance from the pixel array plane in the direction of the exit pupil of the imaging optical system, forms on the plane in which the focus detection pixels are arrayed. In addition, even when an imbalance arises between the pair of focus detection light beams, the outputs of the focus detection pixels are filtered to prevent a drop in focus detection accuracy.
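The patent does not specify the filter used in this prior-art balancing step. As a rough, hypothetical illustration of the idea only, the following Python sketch normalizes each of the two focus detection signals by its own local mean so that a smooth gain difference (light-quantity imbalance) between them is largely cancelled before the image shift is computed; the function name, window size, and choice of filter are assumptions, not taken from the patent or Patent Document 1.

```python
import numpy as np

def balance_pair(a, b, window=9):
    """Hypothetical pre-filter: divide each focus-detection signal by its
    local mean so that a slowly varying light-quantity imbalance between
    the two signals is largely removed before correlation."""
    kernel = np.ones(window) / window
    a_mean = np.convolve(a, kernel, mode="same")
    b_mean = np.convolve(b, kernel, mode="same")
    eps = 1e-6  # guard against division by zero in dark regions
    return a / (a_mean + eps), b / (b_mean + eps)
```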

Prior art documents related to the invention of this application include the following.
JP 2004-138968 A

However, the conventional pupil-division phase-difference focus detection apparatus described above is used in combination with various imaging optical systems. When the exit pupil distance of the imaging optical system differs greatly from the ranging pupil distance mentioned above, the imbalance between the pair of light beams becomes large at the periphery of the photographing screen of the imaging optical system, and the imbalance between the pair of images also becomes large, so there is a limit to how far filtering the outputs of the focus detection pixels can prevent a drop in focus detection accuracy. Furthermore, when the imbalance between the pair of focus detection light beams becomes large enough, only one of the pair of light beams passes through the exit pupil of the imaging optical system, one of the pair of images disappears, and focus detection becomes impossible.

The focus detection apparatus according to the invention of claim 1 comprises: a focus detection element which is disposed on the predetermined focal plane of the imaging optical system in an imaging apparatus to which a plurality of types of imaging optical systems having different exit pupil distances can be interchangeably mounted, and in which a plurality of first focus detection pixels and second focus detection pixels are arrayed, the pixels receiving a pair of light beams passing through the imaging optical system and outputting a pair of image signals used for phase difference detection for detecting the focus adjustment state of the imaging optical system; and focus detection computation means for detecting the focus adjustment state of the imaging optical system based on the output signals of the plurality of first focus detection pixels and the plurality of second focus detection pixels. Each first focus detection pixel consists of a first microlens and a first photoelectric conversion unit that receives light through the first microlens, and the first photoelectric conversion unit receives a light beam passing through a first region that is set in common to all of the first focus detection pixels on a pupil plane (referred to as the ranging pupil plane) set at a predetermined distance (referred to as the ranging pupil distance) from the predetermined focal plane. Each second focus detection pixel consists of a second microlens, different from the first microlens, and a second photoelectric conversion unit that receives light through the second microlens, and the second photoelectric conversion unit receives a light beam passing through a second region, different from the first region, that is set in common to all of the second focus detection pixels on the ranging pupil plane. The ranging pupil distance is an average of the exit pupil distances of the plurality of types of imaging optical systems. The first region and the second region include, as a common region, a third region containing the optical axis of the imaging optical system, and the width of the third region in the direction connecting the centroid of the first region and the centroid of the second region is set shorter than the diameter of the region through which the light beam of the largest F value among the maximum-aperture F values of the plurality of types of imaging optical systems passes on the ranging pupil plane. The first focus detection pixels and the second focus detection pixels are arranged alternately along a straight line in the direction connecting the centroid of the first region and the centroid of the second region. The plurality of first focus detection pixels output a first signal data string and the plurality of second focus detection pixels output a second signal data string. The focus detection computation means calculates first computation data by multiplying first data in the first signal data string by data in the vicinity of the second data, in the second signal data string, that corresponds to the first data, and calculates second computation data by multiplying second data in the second signal data string by data in the vicinity of the first data, in the first signal data string, that corresponds to the second data; it then computes the degree of correlation between the first computation data and the second computation data and calculates the defocus amount based on the correlation amount.
The imaging apparatus according to the invention of claim 4 comprises the focus detection apparatus according to claim 1, focus adjustment means for performing focus adjustment of the imaging optical system based on the defocus amount calculated by the focus detection computation means, and an image sensor that captures the image formed by the imaging optical system.
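The correlation recited in claim 1 multiplies data of one signal string by data in the vicinity of the corresponding data of the other string. The Python sketch below shows one possible concrete form of such a multiplicative correlation; the specific neighborhood offsets, the use of an absolute difference to compare the two products, and the simple minimum search are assumptions for illustration, not the definitive computation of the patent.

```python
import numpy as np

def correlation_amount(a, b, k):
    """One possible realization of the claim-1 style correlation at trial
    shift k: a[i] is multiplied by a neighbour of its counterpart b[i+k]
    ("first computation data"), b[i+k] by a neighbour of a[i] ("second
    computation data"), and the two products are compared."""
    n = len(a)
    i_lo = max(0, -k)
    i_hi = min(n - 2, n - 2 - k)
    total, count = 0.0, 0
    for i in range(i_lo, i_hi + 1):
        first = a[i] * b[i + k + 1]
        second = b[i + k] * a[i + 1]
        total += abs(first - second)
        count += 1
    return total / max(count, 1)

def image_shift(a, b, max_shift=10):
    """Return the integer shift that minimizes the correlation amount
    (sub-pixel interpolation omitted for brevity)."""
    shifts = list(range(-max_shift, max_shift + 1))
    amounts = [correlation_amount(a, b, k) for k in shifts]
    return shifts[int(np.argmin(amounts))]
```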

According to the present invention, when the image shift amount of the pair of images formed by a pair of light beams arriving from the imaging optical system is obtained by correlation computation and focus detection of the imaging optical system is performed, the probability that focus detection by the correlation computation remains possible can be increased even when the imbalance between the pair of light beams at the periphery of the photographing screen of the imaging optical system is large.

An interchangeable-lens digital still camera will be described as an example of an imaging apparatus equipped with the focus detection apparatus of one embodiment. FIG. 1 is a cross-sectional view showing the configuration of the camera of the embodiment. A digital still camera 201 of the embodiment consists of an interchangeable lens 202 and a camera body 203, and the interchangeable lens 202 is attached to the camera body 203 via a mount unit 204.

The interchangeable lens 202 includes a lens 209, a zooming lens 208, a focusing lens 210, an aperture 211, a lens drive control device 206, and so on. The lens drive control device 206 consists of a microcomputer, memory, drive control circuit, and the like (not shown); it performs drive control of the focusing lens 210 and the aperture 211 and state detection of the zooming lens 208, the focusing lens 210, and the aperture 211, and it also transmits lens information to, and receives camera information from, a body drive control device 214 described later.

The camera body 203 includes an image sensor 212, the body drive control device 214, a liquid crystal display element drive circuit 215, a liquid crystal display element 216, an eyepiece lens 217, a memory card 219, and so on. In the image sensor 212, imaging pixels are arranged two-dimensionally, and focus detection pixels are incorporated in the portions corresponding to the focus detection positions.

The body drive control device 214 consists of a microcomputer, memory, drive control circuit, and the like; it performs drive control of the image sensor 212, readout of the image signal and focus detection signals, processing and recording of the image signal, focus detection computation based on the focus detection signals, focus adjustment of the interchangeable lens 202, and operation control of the camera. The body drive control device 214 also communicates with the lens drive control device 206 via an electrical contact 213 to receive lens information and transmit camera information (defocus amount, aperture value, and so on).

The liquid crystal display element 216 functions as a liquid crystal viewfinder (EVF: electronic viewfinder). The liquid crystal display element drive circuit 215 displays the through image from the image sensor 212 on the liquid crystal display element 216, and the photographer can observe the through image via the eyepiece lens 217. The memory card 219 is image storage that stores the images captured by the image sensor 212.

A subject image is formed on the light-receiving surface of the image sensor 212 by the light flux that has passed through the interchangeable lens 202. This subject image is photoelectrically converted by the image sensor 212, and the image signal and focus detection signals are sent to the body drive control device 214.

The body drive control device 214 calculates the defocus amount based on the focus detection signals from the focus detection pixels of the image sensor 212 and sends this defocus amount to the lens drive control device 206. The body drive control device 214 also processes the image signal from the image sensor 212 and stores it on the memory card 219, and sends the through image signal from the image sensor 212 to the liquid crystal display element drive circuit 215 to display the through image on the liquid crystal display element 216. In addition, the body drive control device 214 sends aperture control information to the lens drive control device 206 to control the opening of the aperture 211.

The lens drive control device 206 changes the lens information according to the focusing state, zooming state, aperture setting state, maximum-aperture F value, and so on. Specifically, it detects the positions of the zooming lens 208 and the focusing lens 210 and the aperture value of the aperture 211, and either computes the lens information from these lens positions and the aperture value or selects the lens information corresponding to the lens positions and the aperture value from a lookup table prepared in advance.
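The table structure is not given in the patent; as a minimal sketch, assuming a hypothetical table keyed by discrete zoom and focus steps and the set aperture value, the lookup could resemble the following. The key layout, field names, and fallback rule are illustrative assumptions only.

```python
# Hypothetical lookup table: (zoom step, focus step, aperture value) ->
# lens information reported to the body (exit pupil distance in mm, open F value).
LENS_INFO_TABLE = {
    (0, 0, 2.8): {"exit_pupil_distance": 95.0, "open_f_number": 2.8},
    (0, 1, 2.8): {"exit_pupil_distance": 92.0, "open_f_number": 2.8},
    (1, 0, 4.0): {"exit_pupil_distance": 110.0, "open_f_number": 4.0},
}

def lens_info(zoom_step, focus_step, f_number):
    """Select the lens information for the detected lens state, falling back
    to the nearest tabulated aperture value if the exact one is absent."""
    key = (zoom_step, focus_step, f_number)
    if key in LENS_INFO_TABLE:
        return LENS_INFO_TABLE[key]
    candidates = [k for k in LENS_INFO_TABLE if k[:2] == (zoom_step, focus_step)]
    if not candidates:
        raise KeyError("lens state not tabulated")
    nearest = min(candidates, key=lambda k: abs(k[2] - f_number))
    return LENS_INFO_TABLE[nearest]
```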

The lens drive control device 206 calculates a lens drive amount based on the received defocus amount and drives the focusing lens 210 to the in-focus position according to that drive amount. It also drives the aperture 211 according to the received aperture value.
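How the defocus amount maps to a drive amount depends on the lens; as a minimal sketch under assumed values, a proportional conversion with clamping might look like this. The sensitivity coefficient and step limit are hypothetical, not values from the patent.

```python
def lens_drive_amount(defocus_mm, sensitivity_mm_per_step=0.002, max_steps=5000):
    """Convert a defocus amount (image-plane deviation, mm) into focusing-lens
    drive steps, assuming a simple proportional relation and clamping the
    result to an assumed mechanical range."""
    steps = round(defocus_mm / sensitivity_mm_per_step)
    return max(-max_steps, min(max_steps, steps))
```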

Interchangeable lenses 202 having various imaging optical systems can be mounted on the camera body 203 via the mount unit 204, and the camera body 203 detects the focus adjustment state of the interchangeable lens 202 based on the outputs of the focus detection pixels incorporated in the image sensor 212.

FIG. 2 shows the arrangement of the focus detection pixels on the photographing screen, that is, the region (focus detection area, focus detection position) in which the image is sampled on the photographing screen when focus detection is performed based on the output of the focus detection pixel row described later. In this embodiment, a focus detection area 101 is arranged in the horizontal direction so as to include the center 102 of the photographing screen 100. The focus detection pixels are arrayed linearly in the longitudinal direction of the focus detection area 101, shown as a rectangle. The end point 103 of the focus detection area 101 is located at a distance S in the horizontal direction from the screen center 102, and focus detection pixels are arrayed up to this point.

FIG. 3 is a front view showing the detailed configuration of the image sensor 212, an enlargement of the vicinity of the focus detection area 101 on the image sensor 212 shown in FIG. 2. In FIG. 3 the vertical and horizontal directions (pixel rows and columns) correspond to the vertical and horizontal directions of the photographing screen shown in FIG. 2. The image sensor 212 consists of imaging pixels 310 (green, blue, and red pixels) and focus detection pixels 312 and 313; in the focus detection area 101 (see FIG. 2) the focus detection pixels 312 and 313 are arranged alternately in the horizontal direction. The focus detection pixels 312 and 313 are arranged linearly in a row where green and blue imaging pixels 310 would otherwise be placed.

In this embodiment, an element in which focus detection pixels are arrayed within a part of a two-dimensional array of imaging pixels is described as an example, and this element is called an image sensor for convenience; since it is an element constituting the focus detection apparatus, it can also be called a "focus detection element".

As shown in FIG. 4, the imaging pixel 310 consists of a microlens 10, a photoelectric conversion unit 11, and a color filter (not shown). There are three kinds of color filters, red (R), green (G), and blue (B), and their spectral sensitivities have the characteristics shown in FIG. 6. The imaging pixels with these color filters are arranged in a Bayer array.

As shown in FIG. 5(a), the focus detection pixel 312 consists of the microlens 10 and a photoelectric conversion unit 12, and the photoelectric conversion unit 12 is rectangular. As shown in FIG. 5(b), the focus detection pixel 313 consists of the microlens 10 and a photoelectric conversion unit 13, and the photoelectric conversion unit 13 is rectangular. When drawn with their microlenses 10 superimposed, the photoelectric conversion units 12 and 13 line up horizontally, and the left part of the photoelectric conversion unit 12 and the right part of the photoelectric conversion unit 13 overlap each other. In the focus detection pixel row on the image sensor 212, the focus detection pixels 312 and 313 are arranged alternately in the horizontal direction (the direction in which the photoelectric conversion units 12 and 13 line up).

No color filter is placed on the focus detection pixels 312 and 313 in order to gather more light. Their spectral characteristic is the combination of the spectral sensitivity of the photodiode that performs the photoelectric conversion and the spectral sensitivity of an infrared cut filter (not shown) (see FIG. 7); it is as if the spectral sensitivities of the green, red, and blue pixels shown in FIG. 6 were added together, and its sensitive wavelength range encompasses the sensitive wavelength ranges of the green, red, and blue pixels.

The photoelectric conversion unit 11 of the imaging pixel 310 is shaped so that, via the microlens 10, it receives the entire light beam passing through the exit pupil of the brightest interchangeable lens (for example F1.0). The photoelectric conversion units 12 and 13 of the focus detection pixels 312 and 313 are shaped so that, via the microlens 10, they receive the entire light beam passing through a predetermined region of the exit pupil of the interchangeable lens (for example F2.8).

FIG. 8 is a cross-sectional view of the imaging pixel 310. In the imaging pixel 310, the microlens 10 (about 8 to 12 μm in diameter) is placed in front of the imaging photoelectric conversion unit 11, and the shape of the photoelectric conversion unit 11 is projected forward by the microlens 10. The photoelectric conversion unit 11 is formed on a semiconductor circuit substrate 29, and the color filter (not shown) is placed between the microlens 10 and the photoelectric conversion unit 11.

FIG. 9(a) is a cross-sectional view of a focus detection pixel 312 placed at the center of the screen. In this focus detection pixel 312, the microlens 10 (about 8 to 12 μm in diameter) is placed in front of the photoelectric conversion unit 12, and the shape of the photoelectric conversion unit 12 is projected forward by the microlens 10. The photoelectric conversion unit 12 is formed on the semiconductor circuit substrate 29, and the microlens 10 is formed on it integrally and fixedly in the semiconductor image sensor manufacturing process.

FIG. 9(b) is a cross-sectional view of a focus detection pixel 313 placed at the center of the screen. In this focus detection pixel 313, the microlens 10 (about 8 to 12 μm in diameter) is placed in front of the photoelectric conversion unit 13, and the shape of the photoelectric conversion unit 13 is projected forward by the microlens 10. The photoelectric conversion unit 13 is formed on the semiconductor circuit substrate 29, and the microlens 10 is formed on it integrally and fixedly in the semiconductor image sensor manufacturing process.

The photoelectric conversion unit 12 of the focus detection pixel 312 and the photoelectric conversion unit 13 of the focus detection pixel 313 are placed at positions symmetric about the optical axis 300 of the microlens 10, and when the cross sections of the focus detection pixels 312 and 313 are superimposed with the optical axis 300 of the microlens 10 as the reference, the photoelectric conversion units 12 and 13 partly overlap. Accordingly, the light beam 322 corresponding to the projection of the photoelectric conversion unit 12 and the light beam 323 corresponding to the projection of the photoelectric conversion unit 13 partly overlap each other and, as a whole, head in the direction of the optical axis 300 of the microlens. In other words, the photoelectric conversion unit 12 receives the light beam 322 and the photoelectric conversion unit 13 receives the light beam 323.

FIG. 10(a) is a cross-sectional view of a focus detection pixel 312 placed at the end of the focus detection area. As shown in FIG. 2, in the focus detection pixel 312 placed at the end point 103 of the focus detection area 101, at the distance S from the screen center, the microlens 10 (about 8 to 12 μm in diameter) is placed in front of the photoelectric conversion unit 12, and the shape of the photoelectric conversion unit 12 is projected forward by the microlens 10. The photoelectric conversion unit 12 is formed on the semiconductor circuit substrate 29, and the microlens 10 is formed on it integrally and fixedly in the semiconductor image sensor manufacturing process.

FIG. 10(b) is a cross-sectional view of a focus detection pixel 313 placed at the end of the focus detection area. As shown in FIG. 2, in the focus detection pixel 313 placed at the end point 103 of the focus detection area 101, at the distance S from the screen center, the microlens 10 (about 8 to 12 μm in diameter) is placed in front of the photoelectric conversion unit 13, and the shape of the photoelectric conversion unit 13 is projected forward by the microlens 10. The photoelectric conversion unit 13 is formed on the semiconductor circuit substrate 29, and the microlens 10 is formed on it integrally and fixedly in the semiconductor image sensor manufacturing process.

The photoelectric conversion unit 12 of the focus detection pixel 312 and the photoelectric conversion unit 13 of the focus detection pixel 313 are placed at positions asymmetric about the optical axis 300 of the microlens 10, and when the cross sections of the focus detection pixels 312 and 313 are superimposed with the optical axis 300 of the microlens 10 as the reference, the photoelectric conversion units 12 and 13 partly overlap. Accordingly, the light beam 332 corresponding to the projection of the photoelectric conversion unit 12 and the light beam 333 corresponding to the projection of the photoelectric conversion unit 13 partly overlap each other and, as a whole, head in a direction different from the optical axis 300 of the microlens (below the optical axis 300 in FIG. 10). In other words, the photoelectric conversion unit 12 receives the light beam 332 and the photoelectric conversion unit 13 receives the light beam 333.

FIG. 11 shows the configuration of a pupil-division phase-difference focus detection optical system using microlenses. In the figure, 90 is a pupil plane set at a distance d in front of the microlenses placed on the planned imaging plane of the interchangeable lens 202 (see FIG. 1); in this specification it is called the "ranging pupil plane". The distance d is determined by the curvature and refractive index of the microlens, the distance between the microlens and the photoelectric conversion unit, and so on; in this specification it is called the "ranging pupil distance". 91 is the optical axis of the interchangeable lens 202, 10a to 10d are microlenses placed near the center of the screen, 12a, 12b, 13a, and 13b are the photoelectric conversion units belonging to the microlenses 10a to 10d, 312a, 312b, 313a, and 313b are focus detection pixels, and 72, 73, 82, and 83 are focus detection light beams.

92 is the region of the photoelectric conversion units 12a and 12b projected onto the ranging pupil plane 90 by the microlenses 10a and 10c; in this specification it is called a "ranging pupil". 93 is the region of the photoelectric conversion units 13a and 13b projected onto the ranging pupil plane 90 by the microlenses 10b and 10d, that is, the other ranging pupil. In FIG. 11 the ranging pupils are drawn as elliptical regions for clarity, but in reality they have the enlarged, projected shape of the photoelectric conversion units. The ranging pupils 92 and 93 overlap in the vicinity of the optical axis 91.

FIG. 11 schematically illustrates four adjacent focus detection pixels 312a, 312b, 313a, and 313b near the center of the screen, but in the other focus detection pixels as well, each photoelectric conversion unit receives the light beam arriving at its microlens from the corresponding ranging pupil 92 or 93. Consequently, in a focus detection pixel located away from the center of the screen, the photoelectric conversion unit is positioned asymmetrically with respect to the optical axis of its microlens and receives a light beam shifted from the microlens optical axis toward the optical axis 91 of the interchangeable lens. The array direction of the focus detection pixels is made to coincide with the direction in which the pair of ranging pupils line up, that is, the direction in which the pair of photoelectric conversion units line up.

The microlenses 10a to 10d are placed near the planned imaging plane of the interchangeable lens 202 (see FIG. 1). The shapes of the photoelectric conversion units 12a, 12b, 13a, and 13b placed behind them are projected by the microlenses 10a to 10d onto the pupil plane 90 separated from the microlenses by the ranging pupil distance d, and the projected shapes form the ranging pupils 92 and 93. That is, the projection direction of the photoelectric conversion unit in each pixel is determined so that the projected shapes of the photoelectric conversion units of the individual pixels (the ranging pupils 92 and 93) coincide on the pupil plane 90 at the projection distance d.

The photoelectric conversion unit 12a receives the light beam 72 that passes through the ranging pupil 92 toward the microlens 10a, and outputs a signal corresponding to the intensity of the image formed on the microlens 10a by the light beam 72. The photoelectric conversion unit 12b receives the light beam 82 that passes through the ranging pupil 92 toward the microlens 10c, and outputs a signal corresponding to the intensity of the image formed on the microlens 10c by the light beam 82. Similarly, the photoelectric conversion unit 13a receives the light beam 73 that passes through the ranging pupil 93 toward the microlens 10b, and outputs a signal corresponding to the intensity of the image formed on the microlens 10b by the light beam 73. The photoelectric conversion unit 13b receives the light beam 83 that passes through the ranging pupil 93 toward the microlens 10d, and outputs a signal corresponding to the intensity of the image formed on the microlens 10d by the light beam 83.

By arranging a large number of the two types of focus detection pixels described above in a straight line and grouping the outputs of their photoelectric conversion units into output groups corresponding to the ranging pupil 92 and the ranging pupil 93, information is obtained on the intensity distributions of the pair of images that the focus detection light beams passing through the ranging pupil 92 and the ranging pupil 93, respectively, form on the pixel row. By applying the image shift detection computation (correlation computation, phase difference detection) described later to this information, the image shift amount of the pair of images is detected by the so-called pupil-division phase-difference detection method. By then applying to the image shift amount a conversion that depends on the opening angle subtended by the centroid separation of the pair of ranging pupils, the deviation (defocus amount) of the current imaging plane (the imaging plane at the focus detection position corresponding to the position of the microlens array on the planned imaging plane) from the planned imaging plane is calculated.
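The conversion from image shift to defocus uses a coefficient fixed by the ranging pupil geometry (the ranging F value d/G discussed with FIG. 16 later in the text). A minimal sketch of that conversion is given below; the pixel pitch, the sign convention, and the small-angle approximation are assumptions added for illustration.

```python
def defocus_from_shift(shift_pixels, pixel_pitch_mm,
                       ranging_pupil_distance_d, centroid_separation_g):
    """Convert a detected image shift into a defocus amount: the shift (mm)
    multiplied by the ranging F value (d / G) approximates the deviation of
    the current imaging plane from the planned imaging plane."""
    shift_mm = shift_pixels * pixel_pitch_mm
    ranging_f_value = ranging_pupil_distance_d / centroid_separation_g
    return shift_mm * ranging_f_value
```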

FIG. 12 illustrates the relationship between defocus and image shift in the pupil-division phase-difference detection method. In FIG. 12(a), the light flux forming the image is divided into a light beam 72 passing through the ranging pupil 92 and a light beam 73 passing through the ranging pupil 93. With this configuration, when, for example, a line pattern (a white line on a black background) lying on the optical axis 91 and perpendicular to the plane of FIG. 12 is imaged by the imaging optical system, the light beam 72 passing through the ranging pupil 92 and the light beam 73 passing through the ranging pupil 93 form high-contrast line image patterns at the same position on the optical axis 91 on the in-focus plane P0, as shown in FIG. 12(c).

On the other hand, on a plane P1 in front of the in-focus plane P0, the light beam 72 passing through the ranging pupil 92 and the light beam 73 passing through the ranging pupil 93 form blurred line image patterns at different positions, as shown in FIG. 12(b). On a plane P2 behind the in-focus plane P0, they form blurred line image patterns at positions separated in the direction opposite to that of FIG. 12(b), as shown in FIG. 12(d). Therefore, by separately detecting the two images formed by the light beam 72 passing through the ranging pupil 92 and the light beam 73 passing through the ranging pupil 93 and calculating their relative positional relationship (image shift amount), the focus adjustment state (defocus amount) of the optical system at the plane on which the two images were detected can be determined.

FIG. 13 illustrates the relationship between an imaging pixel and the ranging pupil plane. Elements similar to those shown in FIG. 11 carry the same reference numerals and are not described again. In FIG. 13, 70 is a microlens placed on the optical axis 91 of the interchangeable lens 202 shown in FIG. 1, 71 is the photoelectric conversion unit of the imaging pixel formed with the microlens 70, 81 is the imaging light beam, and 94 is the region onto which the photoelectric conversion unit 71 is projected by the microlens 70. FIG. 13 schematically shows the imaging pixel (consisting of the microlens 70 and the photoelectric conversion unit 71) on the optical axis 91, but in the other imaging pixels as well, each photoelectric conversion unit receives the light beam arriving at its microlens from the region 94.

The microlens 70 is placed near the planned imaging plane of the imaging optical system. The shape of the photoelectric conversion unit 71 placed behind the microlens 70 on the optical axis 91 is projected by the microlens 70 onto the ranging pupil plane 90 separated from the microlens 70 by the projection distance d, and the projected shape forms the region 94. The photoelectric conversion unit 71 receives the light beam 81 that passes through the region 94 toward the microlens 70, and outputs a signal corresponding to the intensity of the image formed on the microlens 70 by the light beam 81. By arranging a large number of such imaging pixels two-dimensionally, image information is obtained from the outputs of the photoelectric conversion units of the pixels.

FIG. 14 is a front view showing the projection relationships on the ranging pupil plane. The ranging pupils 92 and 93, which are the projections of the photoelectric conversion units of the focus detection pixels onto the ranging pupil plane 90 by their microlenses, are contained within the region 94, which is the projection of the photoelectric conversion unit of an imaging pixel onto the ranging pupil plane 90 by its microlens. The ranging pupils 92 and 93 have a region 95 in which they partly overlap each other.

FIGS. 15 and 16 illustrate the effect of overlapping the ranging pupils and the appropriate amount of overlap. In the figures, the plane PF is the planned focal plane of the imaging optical system on which the image sensor is placed. The plane PA is the ranging pupil plane, separated from the planned focal plane PF by the distance d. The plane PL is the exit pupil plane of the imaging optical system, separated from the planned focal plane PF by a distance f. 91 is the optical axis of the imaging optical system, and the position AC is the point where the planned focal plane PF and the optical axis 91 intersect, that is, the screen center. The position AE is the point on the planned focal plane PF at the distance S from the screen center AC, the end position up to which the focus detection pixels are arrayed (see FIG. 2). The lines 70 and 72 are the boundary lines indicating the maximum-aperture F value of the imaging optical system, that is, the lines subtending the maximum-aperture F value from the screen center.

FIG. 15(a) shows the case in which the ranging pupils are not overlapped. The pair of distributions that the pair of light beams received by the focus detection pixel at the screen-center position AC forms on the ranging pupil plane PA, namely the ranging pupils 150 and 160, and the pair of distributions that the pair of light beams received by the focus detection pixel at the position AE, away from the screen center, forms on the ranging pupil plane PA, again the ranging pupils 150 and 160, coincide as shown in FIG. 15(b), and the pair of ranging pupils 150 and 160 has no overlapping portion.

At the screen-center position AC, the focus detection pixel that receives the light beam passing through the ranging pupil 150 receives the light beam within the range bounded on the outside by the line 52 and on the inside by the optical axis 91 in FIG. 15(a). The focus detection pixel at the position AC that receives the light beam passing through the ranging pupil 160 receives the light beam within the range bounded on the outside by the line 50 and on the inside by the optical axis 91 in FIG. 15(a). On the other hand, the focus detection pixel at the position AE, at the distance S from the screen center, that receives the light beam passing through the ranging pupil 150 receives the light beam within the range bounded on the outside by the line 62 and on the inside by the line 90 in FIG. 15(a). The focus detection pixel at the position AE that receives the light beam passing through the ranging pupil 160 receives the light beam within the range bounded on the outside by the line 60 and on the inside by the line 90 in FIG. 15(a).

When the exit pupil distance f of the imaging optical system coincides with the ranging pupil distance d, as shown in FIG. 15(b), the light beams passing through the inside of the circle 170, which indicates the maximum-aperture F value of the imaging optical system on the ranging pupil plane PA, within the pair of ranging pupils 150 and 160 are received by the focus detection pixels arrayed from the position AC to the position AE. A pair of images of matched intensity levels is formed by a pair of light beams of equal light quantity, so the image shift amount of the pair of images can be detected with high accuracy from the outputs of the focus detection pixels.

When the exit pupil distance f of the imaging optical system differs from the ranging pupil distance d (d > f in FIG. 15(a)), the focus detection pixel at the screen-center position AC that receives the light beams passing through the ranging pupils 150 and 160 receives, as shown in FIG. 15(c), the light beams passing through the inside of the circle 171, which indicates the maximum-aperture F value of the imaging optical system on the exit pupil plane PL, within the distributions 151 and 161 that the light beams passing through the pair of ranging pupils 150 and 160 form on the exit pupil plane PL. As a result, at the screen-center position AC a pair of images of matched intensity levels is formed by a pair of light beams of equal light quantity, and the image shift amount of the pair of images can be detected with high accuracy from the outputs of the focus detection pixels.

However, the focus detection pixel at the position AE, at the end of the focus detection area, that receives the light beams passing through the ranging pupils 150 and 160 receives, as shown in FIG. 15(d), the light beams passing through the inside of the circle 171, which indicates the maximum-aperture F value of the imaging optical system on the exit pupil plane PL, within the distributions 152 and 162 that the light beams passing through the pair of ranging pupils 150 and 160 form on the exit pupil plane PL. In this case, as is clear from FIG. 15(d), the common region of the distribution 152 and the circle 171 disappears, so at the position AE at the end of the focus detection area the pair of images needed for detecting the image shift amount is not formed and focus detection becomes impossible. A pair of images is formed between the position AC and the position AE, but near the position AE the light quantity balance of the pair of images is badly disturbed, and the detection accuracy of the image shift amount deteriorates.
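To see numerically why the distributions that the ranging pupils form on the exit pupil plane PL shift relative to the open-F circle when f differs from d, a small similar-triangles sketch can help. This is not taken from the patent: the open-F circle is assumed to be centered on the axis with radius f/(2·F), and signs and units are illustrative.

```python
def pupil_center_on_exit_plane(pixel_height_s, pupil_center_height_h,
                               ranging_pupil_distance_d, exit_pupil_distance_f):
    """Project the center of a ranging pupil (defined on plane PA at distance d)
    onto the exit pupil plane PL at distance f, along the ray from an off-axis
    pixel at height S on the focal plane PF (simple similar-triangles model)."""
    ratio = exit_pupil_distance_f / ranging_pupil_distance_d
    return pixel_height_s + (pupil_center_height_h - pixel_height_s) * ratio

def open_f_radius(exit_pupil_distance_f, open_f_number):
    """Assumed radius on plane PL of the circle subtending the maximum-aperture
    F value from the screen center (circle 171)."""
    return exit_pupil_distance_f / (2.0 * open_f_number)

# Example with placeholder numbers: d=100, f=70, S=10, pupil center h=5 (all mm).
# The projected center lands at 10 + (5 - 10) * 0.7 = 6.5 mm, which can be
# compared with open_f_radius(70, 5.6) ~= 6.25 mm to test whether the
# distribution still intersects the open-F circle.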

Next, FIG. 16(a) shows the case in which the ranging pupils are overlapped. The pair of distributions that the pair of light beams received by the focus detection pixel at the screen-center position AC forms on the ranging pupil plane PA, namely the ranging pupils 250 and 260, and the pair of distributions that the pair of light beams received by the focus detection pixel at the position AE, at the distance S from the screen center, forms on the ranging pupil plane PA, again the ranging pupils 250 and 260, coincide as shown in FIG. 16(b), and the pair of ranging pupils 250 and 260 has an overlapping portion 240 (the hatched portion in the figure).

At the screen-center position AC, the focus detection pixel that receives the light beam passing through the ranging pupil 250 receives the light beam within the range bounded on the outside by the line 52 and on the inside by the line 53 in FIG. 16(a). The focus detection pixel at the position AC that receives the light beam passing through the ranging pupil 260 receives the light beam within the range bounded on the outside by the line 50 and on the inside by the line 51 in FIG. 16(a). On the other hand, at the position AE, at the distance S from the screen center, the focus detection pixel that receives the light beam passing through the ranging pupil 250 receives the light beam within the range bounded on the outside by the line 62 and on the inside by the line 63 in FIG. 16(a). The focus detection pixel at the position AE that receives the light beam passing through the ranging pupil 260 receives the light beam within the range bounded on the outside by the line 60 and on the inside by the line 61 in FIG. 16(a).

When the exit pupil distance f of the imaging optical system coincides with the ranging pupil distance d, as shown in FIG. 16(b), the light beams passing through the inside of the circle 170, which indicates the maximum-aperture F value of the imaging optical system, among the light beams passing through the pair of ranging pupils 250 and 260 on the ranging pupil plane PA are received by the focus detection pixels arrayed from the position AC to the position AE. A pair of images of matched intensity levels is formed by a pair of light beams of equal light quantity, so the image shift amount of the pair of images can be detected with high accuracy from the outputs of the focus detection pixels.

When the exit pupil distance f of the imaging optical system differs from the ranging pupil distance d (d > f in FIG. 16(a)), the focus detection pixel at the screen-center position AC that receives the light beams passing through the ranging pupils 250 and 260 receives, as shown in FIG. 16(c), the light beams passing through the inside of the circle 171, which indicates the maximum-aperture F value of the imaging optical system on the exit pupil plane PL, within the distributions 251 and 261 that the light beams passing through the pair of ranging pupils 250 and 260 form on the exit pupil plane PL. At the screen-center position AC a pair of images of matched intensity levels is formed by a pair of light beams of equal light quantity, so the image shift amount of the pair of images can be detected with high accuracy from the outputs of the focus detection pixels.

The focus detection pixel at the position AE, at the end of the focus detection area, that receives the light beams passing through the ranging pupils 250 and 260 receives, as shown in FIG. 16(d), the light beams passing through the inside of the circle 171, which indicates the maximum-aperture F value of the imaging optical system on the exit pupil plane PL, within the distributions 252 and 262 that the light beams passing through the pair of ranging pupils 250 and 260 form on the exit pupil plane PL. In this case, as is clear from FIG. 16(d), the distribution 252 extends toward the optical axis 91, so a common region with the circle 171 can be secured; even at the position AE at the end of the focus detection area, a pair of images whose light quantity balance is not badly disturbed is formed, and focus detection can be performed with the focus detection accuracy maintained. Likewise, focus detection with maintained accuracy is possible for the focus detection pixels arrayed from the position AC to the position AE. By overlapping the ranging pupils, it is desirable to improve the light quantity balance (light quantity ratio) of the pair of images at the position AE, away from the screen center, to at least 1:4, preferably at least 1:2.

FIG. 16 describes the case of an imaging optical system whose exit pupil distance f is shorter than the ranging pupil distance d, but for an imaging optical system whose exit pupil distance f is longer than the ranging pupil distance d as well, partly overlapping the ranging pupils secures focus detection accuracy at the periphery of the screen. In this way, partly overlapping the ranging pupils (the light quantity distributions of the pair of light beams on the ranging pupil plane) makes focus detection possible even at the periphery of the screen for imaging optical systems having various exit pupils and maximum-aperture F values.

The amount by which the distance-measuring pupils may be overlapped is limited as follows. If only the overlapping portion of the distance-measuring pupils falls inside the circle representing the maximum-aperture F-number of the imaging optical system on the exit pupil plane PL, the centroids of the pair of light fluxes coincide, so no image shift occurs between the pair of images formed by the pair of light fluxes and focus detection becomes impossible. To prevent this, the overlap amount of the distance-measuring pupils is limited so that portions of the pupils other than their overlapping portion also fall inside the maximum-aperture circle, so that the centroid positions of the light-amount distributions of the pair of light fluxes passing through the pair of distance-measuring pupils differ inside that circle.

For example, in FIG. 16(b), making the width of the overlapping portion 240 shorter than the diameter of the circle 170 separates the centroid position 253 of the common part of the circle 170 and the distance-measuring pupil 250 from the centroid position 263 of the common part of the circle 170 and the distance-measuring pupil 260 by a distance G1 on the distance-measuring pupil plane PA. In FIG. 16(c), the width of the overlapping portion 241 is shorter than the diameter of the circle 171, and the centroid position 254 of the common part of the circle 171 and the light-amount distribution 251 is separated from the centroid position 264 of the common part of the circle 171 and the light-amount distribution 261 by a distance G2 on the exit pupil plane PL. Further, in FIG. 16(d), the centroid position 255 of the common part of the circle 171 and the light-amount distribution 252 is separated from the centroid position 264 of the common part of the circle 171 and the light-amount distribution 262 by a distance G3 on the exit pupil plane PL.

In general, the maximum-aperture F-number with the smallest aperture diameter (that is, the darkest maximum aperture among the applicable lenses) is assumed, and the value obtained by dividing the distance-measuring pupil distance d by the difference between the centroid positions of the light-amount distributions of the pair of distance-measuring pupils falling inside the circle of that maximum-aperture F-number on the distance-measuring pupil plane PA (corresponding to G1 in FIG. 16(b)), called the distance-measuring F-number, is set to be smaller than the F-number required to maintain the prescribed focus detection accuracy. The distance-measuring F-number is the conversion coefficient that converts the detected image shift amount into the defocus amount of the imaging optical system; the larger this value, the lower the focus detection accuracy (that is, the accuracy of the calculated defocus amount).

For example, if the standard restricts the interchangeable lenses applicable to the focus detection system to those having a bright imaging optical system with a maximum aperture of F5.6 or brighter, the overlap amount of the distance-measuring pupils, that is, the difference between the centroid positions, can be specified so that the distance-measuring F-number at a maximum aperture of F5.6 becomes F20 or more.

Further, by setting the distance-measuring pupil distance d of the distance-measuring pupil plane PA to match the average position of the exit pupils of the imaging optical systems built into the plural types of interchangeable lenses applicable to the focus detection system, the constraint described above arising from the maximum-aperture F-number can be relaxed. This makes it possible to apply the focus detection system of the present invention to groups of interchangeable lenses whose exit pupil distances are scattered over a wider range, or whose maximum apertures are darker, and to extend the focus-detectable region on the screen further toward the periphery.

Here, an example of the overlap amount of the distance-measuring pupils is described. Assume a distance-measuring pupil distance d of 100 mm, a uniform light-amount distribution for each of the pair of light fluxes, a square aperture shape for the maximum aperture (a circular aperture makes the centroid calculation cumbersome, so a square is used here), and an aperture width (length of one side of the square), converted onto the distance-measuring pupil plane, of 18 mm (equivalent to F5.6). Then, to make the difference between the centroid positions of the pair of light fluxes 5 mm (equivalent to F20), the light-amount distributions of the pair of light fluxes should overlap by 8 mm on the distance-measuring pupil plane. If the projection magnification of the microlens constituting the focus detection pixel (the magnification at which the photoelectric conversion unit is projected onto the distance-measuring pupil plane) is 10000x, the overlap amount at the photoelectric conversion units is 0.8 μm.
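As a quick numerical check of the worked example above, the following short Python sketch reproduces the overlap calculation under the stated assumptions (uniform light-amount distributions, square maximum aperture). The variable names and the centroid formula derivation in the comments are ours, not taken from the patent.

```python
# Worked example from the text, under its stated assumptions:
# uniform light-amount distributions and a square maximum aperture of side W
# converted onto the distance-measuring pupil plane.

d = 100.0        # distance-measuring pupil distance [mm]
W = 18.0         # aperture width on the pupil plane [mm] (about F5.6)
FA = 20.0        # target distance-measuring F-number

G = d / FA       # required centroid separation of the two clipped fluxes -> 5.0 mm

# Each pupil covers half the aperture plus half the overlap v, so the centroid
# separation of the two clipped, uniform distributions is (W - v) / 2.
v = W - 2.0 * G  # overlap on the pupil plane -> 8.0 mm

magnification = 10_000.0                            # microlens projection magnification
overlap_at_pixel_um = v / magnification * 1000.0    # mm -> micrometres, -> 0.8

print(G, v, overlap_at_pixel_um)
```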

FIG. 17 is a flowchart showing the operation of the digital still camera (imaging apparatus) shown in FIG. 1. The body drive control device 214 starts this operation when the power switch (not shown) of the camera is turned on in step 100. In step 110, the subject luminance is detected based on the output signal level of the image sensor 212, and the exposure time of the image sensor 212 is controlled according to the subject luminance. In the following step 120, the data of the imaging pixels and the focus detection pixels of the image sensor 212 are read out. In step 130, the data of the imaging pixels are displayed on the electronic viewfinder.

In step 140, the image shift detection calculation processing (correlation calculation processing) described later is performed based on the pair of image data from the focus detection pixels, and the defocus amount is calculated. In the following step 150, it is determined whether the camera is near the in-focus state, that is, whether the absolute value of the defocus amount is within a predetermined value. If it is determined that the in-focus state has not been reached, the flow proceeds to step 160, where the defocus amount is transmitted to the lens drive control device 206, the focusing lens 210 of the interchangeable lens 202 is driven to the in-focus position, and the flow returns to step 110 to repeat the above operations. The flow also branches to this step when focus detection is impossible; in that case a scan drive command is transmitted to the lens drive control device 206, the focusing lens 210 of the interchangeable lens 202 is scan-driven between infinity and the closest distance, and the flow returns to step 110 to repeat the above operations.

On the other hand, if it is determined that the camera is near the in-focus state, the flow proceeds to step 170, where it is determined whether a shutter release has been performed by operating the shutter button (not shown). If not, the flow returns to step 110 and the above operations are repeated. If the shutter release operation has been performed, the flow proceeds to step 180, where the shooting parameters are determined according to the subject luminance, aperture control information is transmitted to the lens drive control device 206, and the aperture value of the interchangeable lens 202 is set to the shooting aperture value. When the aperture control is completed, the image sensor 212 is exposed for the determined charge accumulation time.

In step 190, the data of the imaging pixels are read out from the image sensor 212, and in the following step 200, the image data at the focus detection pixel positions are obtained by interpolation based on the data of the imaging pixels surrounding the focus detection pixels, so that the image data for the entire screen are generated. In step 210, the image data are stored in the memory card 219, and the flow returns to step 110 to repeat the above operations.
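The flowchart can be summarised as a simple control loop. The Python sketch below is only an outline of the steps described above; the object and method names are placeholders invented for illustration and do not correspond to any actual camera API.

```python
def camera_main_loop(camera, lens, in_focus_threshold):
    """Outline of the FIG. 17 flow: expose, read pixels, detect focus,
    drive the lens, and shoot once focus is near and the shutter is released."""
    while camera.powered_on():
        camera.control_exposure_from_sensor_level()       # step 110
        imaging_data, focus_data = camera.read_pixels()   # step 120
        camera.update_viewfinder(imaging_data)            # step 130
        defocus = camera.compute_defocus(focus_data)      # step 140
        if defocus is None:                               # focus detection impossible
            lens.scan_drive()                             # step 160 (scan drive)
            continue
        if abs(defocus) > in_focus_threshold:             # step 150
            lens.drive_by(defocus)                        # step 160
            continue
        if camera.shutter_released():                     # step 170
            camera.expose_and_store()                     # steps 180 to 210
```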

The details of the image shift detection calculation processing (correlation calculation processing, phase difference detection processing) executed in step 140 of FIG. 17 are described with reference to FIG. 18. Because the light-amount balance of the pair of images detected by the focus detection pixels may be disturbed, a type of correlation calculation processing that maintains image shift detection accuracy against such a light-amount imbalance, as described below, is applied. The pair of data strings output from the focus detection pixel row (α1 to αM and β1 to βM, where M is the number of data) is subjected to the high-frequency cut filter processing shown in equation (1) to generate a first data string and a second data string (A1 to AN, B1 to BN), thereby removing noise components and high-frequency components that adversely affect the correlation processing. This processing can be omitted when the calculation time is to be shortened, or when the image is already known to be largely defocused and to contain few high-frequency components.
A(n) = α(n) + 2×α(n+1) + α(n+2),
B(n) = β(n) + 2×β(n+1) + β(n+2)   ... (1)
In equation (1), n = 1 to N.
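The filter in equation (1) is a simple 1-2-1 weighted sum of neighbouring samples. Below is a minimal Python sketch of it; the example data and names are illustrative assumptions, not values from the patent.

```python
def high_freq_cut(raw):
    """1-2-1 filter of equation (1): out(n) = raw(n) + 2*raw(n+1) + raw(n+2).
    The filtered string is two samples shorter than the input."""
    return [raw[n] + 2 * raw[n + 1] + raw[n + 2] for n in range(len(raw) - 2)]

# Illustrative raw outputs of the two focus detection pixel groups.
alpha = [10, 12, 18, 30, 26, 15, 11, 9, 8]
beta  = [11, 10, 13, 19, 29, 27, 14, 11, 9]

A = high_freq_cut(alpha)   # first data string  A1..AN
B = high_freq_cut(beta)    # second data string B1..BN
```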

The correlation calculation processing shown in equation (2) is applied to the data strings A(n) and B(n) to calculate the correlation amount C(k).
C(k) = Σ|A(n)×B(n+1+k) − B(n+k)×A(n+1)|   ... (2)
In equation (2), the Σ operation is accumulated over n, and the range of n is limited, according to the shift amount k, to the range in which the data A(n), A(n+1), B(n+k) and B(n+1+k) exist. The shift amount k is an integer and is a relative shift amount in units of the data interval of the data strings. As shown in FIG. 18(a), the calculation result of equation (2) gives a minimum of the correlation amount C(k) (the smaller the value, the higher the degree of correlation) at the shift amount at which the correlation between the pair of data is high (k = kj = 2 in FIG. 18(a)).
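A minimal Python sketch of equation (2) follows. The cross-multiplied form makes C(k) insensitive to a constant gain difference between the two data strings, which is why it tolerates the light-amount imbalance discussed above. The data string A is, for instance, the filtered first string from the sketch above; B is constructed here as A shifted by one sample and attenuated by 30 %, mimicking that imbalance. The shift range is an illustrative assumption.

```python
def correlation(A, B, k_min=-3, k_max=3):
    """Correlation amount C(k) of equation (2):
    C(k) = sum over n of |A(n)*B(n+1+k) - B(n+k)*A(n+1)|,
    accumulated only where all four indexed samples exist."""
    C = {}
    for k in range(k_min, k_max + 1):
        total, count = 0.0, 0
        for n in range(len(A) - 1):
            if 0 <= n + k and n + 1 + k < len(B):
                total += abs(A[n] * B[n + 1 + k] - B[n + k] * A[n + 1])
                count += 1
        if count:
            C[k] = total
    return C

A = [52.0, 78.0, 104.0, 97.0, 67.0, 46.0, 37.0]
B = [0.7 * v for v in [41.0, 52.0, 78.0, 104.0, 97.0, 67.0, 46.0]]

C = correlation(A, B)
k_j = min(C, key=C.get)   # discrete shift with the best correlation; 1 for this data
```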

Next, using the three-point interpolation method of equations (3) to (6), the shift amount x that gives the minimum value C(x) of the continuous correlation amount is obtained.
x = kj + D/SLOP   ... (3)
C(x) = C(kj) − |D|   ... (4)
D = {C(kj−1) − C(kj+1)}/2   ... (5)
SLOP = MAX{C(kj+1) − C(kj), C(kj−1) − C(kj)}   ... (6)
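A short Python sketch of equations (3) to (6), assuming C is a mapping from integer shift k to correlation amount and that the discrete minimum kj is not at either end of the shift range. The numeric values are illustrative, not from the patent.

```python
def three_point_interpolation(C, kj):
    """Equations (3)-(6): sub-sample position x of the minimum of C(k).
    C maps integer shifts to correlation amounts; kj is the discrete minimum
    and must have both neighbouring shifts present in C."""
    D = (C[kj - 1] - C[kj + 1]) / 2.0                    # (5)
    SLOP = max(C[kj + 1] - C[kj], C[kj - 1] - C[kj])     # (6)
    x = kj + D / SLOP                                    # (3)
    Cx = C[kj] - abs(D)                                  # (4)
    return x, Cx, SLOP

# Illustrative correlation amounts with a discrete minimum at k = 2.
C = {0: 410.0, 1: 180.0, 2: 60.0, 3: 260.0, 4: 520.0}
x, Cx, SLOP = three_point_interpolation(C, kj=2)   # x = 1.8, Cx = 20.0, SLOP = 200.0
```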

Whether the shift amount x calculated by equation (3) is reliable is judged as follows. As shown in FIG. 18(b), when the degree of correlation between the pair of data is low, the interpolated minimum value C(x) of the correlation amount becomes large. Therefore, if C(x) is equal to or greater than a predetermined threshold, the calculated shift amount is judged to have low reliability and the calculated shift amount x is cancelled. Alternatively, in order to normalize C(x) by the contrast of the data, if the value obtained by dividing C(x) by SLOP, which is proportional to the contrast, is equal to or greater than a predetermined value, the calculated shift amount is judged to have low reliability and the calculated shift amount x is cancelled. As a further alternative, if SLOP, which is proportional to the contrast, is equal to or smaller than a predetermined value, the subject is judged to be of low contrast, the reliability of the calculated shift amount is judged to be low, and the calculated shift amount x is cancelled.
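The three rejection tests can be expressed as a small guard function; a minimal sketch follows, in which the threshold values are placeholders, not values from the patent.

```python
def shift_is_reliable(Cx, SLOP, cx_max=100.0, ratio_max=0.5, slop_min=10.0):
    """The three rejection tests described in the text (thresholds are illustrative)."""
    if Cx >= cx_max:              # interpolated minimum itself too large
        return False
    if SLOP <= slop_min:          # low-contrast subject
        return False
    if Cx / SLOP >= ratio_max:    # minimum too large relative to the contrast
        return False
    return True

print(shift_is_reliable(Cx=20.0, SLOP=200.0))   # True for these illustrative values
```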

As shown in FIG. 18(c), when the degree of correlation between the pair of data is low and the correlation amount C(k) shows no dip over the shift range kmin to kmax, the minimum value C(x) cannot be obtained; in such a case it is judged that focus detection is impossible.

When it is judged that the calculated shift amount x is reliable, the defocus amount DEF of the subject image plane with respect to the planned image-forming plane can be obtained from equation (7).
DEF = FA × PY × x   ... (7)
In equation (7), PY is the detection pitch (the pitch of the focus detection pixels), and FA is the distance-measuring F-number (conversion coefficient).
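Equation (7) is a single multiplication. The sketch below ties it to the earlier steps; the numeric values of FA, PY and x are illustrative assumptions, not values taken from the patent.

```python
# Equation (7): DEF = FA * PY * x
FA = 20.0     # distance-measuring F-number (conversion coefficient), illustrative
PY = 0.012    # detection pitch of the focus detection pixels in mm, illustrative
x = 1.8       # interpolated image shift amount from equation (3), in pixel pitches
DEF = FA * PY * x
print(DEF)    # 0.432 (mm); the sign of x gives the direction of defocus
```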

<< Modification of the Embodiment of the Invention >>
FIG. 19 is a front view showing the detailed configuration of an image sensor 212A of a modification, with the vicinity of the focus detection area enlarged. In the figure, the vertical and horizontal directions of the image sensor 212A correspond to the vertical and horizontal directions of the screen shown in FIG. 2. The image sensor 212 shown in FIG. 3 uses the pair of focus detection pixels 312 and 313 shown in FIGS. 5(a) and 5(b). In contrast, in the image sensor 212A of the modification shown in FIG. 19, each focus detection pixel has a pixel structure in which a pair of photoelectric conversion units is provided under a single microlens.

The image sensor 212A is composed of imaging pixels 310 and focus detection pixels 311. As shown in FIG. 20, each focus detection pixel 311 consists of a microlens 10 and a pair of photoelectric conversion units 22 and 23. The photoelectric conversion units 22 and 23 are shaped so that, via the microlens 10, they together receive the entire light flux passing through a predetermined region (for example, F2.8) of the exit pupil of the interchangeable lens 202. The photoelectric conversion units 22 and 23 are arranged side by side in the horizontal direction, and a left part of the photoelectric conversion unit 22 and a right part of the photoelectric conversion unit 23 have a region 24 in which their sensitivity distributions overlap. The focus detection pixels 311 are arrayed in the horizontal direction (the direction in which the photoelectric conversion units 22 and 23 are arranged).

FIG. 21 shows cross sections of the focus detection pixel 311. FIG. 21(a) is a cross-sectional view of a focus detection pixel 311 arranged near the position AC shown in FIG. 2; the microlens 10 is arranged in front of the photoelectric conversion units 22 and 23 for focus detection, and the photoelectric conversion units 22 and 23 are projected forward by the microlens 10. The photoelectric conversion units 22 and 23 are formed on a semiconductor circuit substrate 29. The sensitivity overlap portion 24 lies on the optical axis of the microlens 10, and the photoelectric conversion units 22 and 23 as a whole receive the focus detection light flux arriving from the direction centered on the optical axis of the microlens 10.

FIG. 21(b) is a cross-sectional view of a focus detection pixel 311 arranged near the position AE shown in FIG. 2; here too, the microlens 10 is arranged in front of the photoelectric conversion units 22 and 23 for focus detection, and the photoelectric conversion units 22 and 23 are projected forward by the microlens 10. The photoelectric conversion units 22 and 23 are formed on the semiconductor circuit substrate 29. In this case the sensitivity overlap portion 24 does not lie on the optical axis of the microlens 10, and the photoelectric conversion units 22 and 23 as a whole receive the focus detection light flux arriving from a direction shifted from the optical axis of the microlens 10 toward the optical axis of the optical system.

A focus detection method based on the pupil division method using microlenses is described with reference to FIG. 22. The basic principle is the same as the focus detection method described with FIG. 11. The figure schematically shows the focus detection pixel on the optical axis 91 (consisting of a microlens 50 and a pair of photoelectric conversion units 52 and 53) and an adjacent focus detection pixel; in the other focus detection pixels as well, the pair of photoelectric conversion units receives the light fluxes arriving at each microlens from the pair of distance-measuring pupils. The array direction of the focus detection pixels is made to coincide with the direction in which the pair of distance-measuring pupils 92 and 93 is arranged, that is, the direction in which the pair of photoelectric conversion units is arranged. The distance-measuring pupils 92 and 93 partially overlap each other.

The microlens 50 is arranged near the planned image-forming plane of the imaging optical system. The shapes of the pair of photoelectric conversion units 22 and 23 arranged behind the microlens 50 on the optical axis 91 are projected by the microlens 50 onto the exit pupil plane 90 separated from the microlens 50 by the projection distance d, and the projected shapes form the distance-measuring pupils 92 and 93. The photoelectric conversion unit 22 outputs a signal corresponding to the intensity of the image formed on the microlens 50 by the focus detection light flux 42 that passes through the distance-measuring pupil 92 toward the microlens 50, and the photoelectric conversion unit 23 outputs a signal corresponding to the intensity of the image formed on the microlens 50 by the focus detection light flux 43 that passes through the distance-measuring pupil 93 toward the microlens 50.

By arranging a large number of such focus detection pixels in a straight line and grouping the outputs of the pair of photoelectric conversion units of each pixel into output groups corresponding to the distance-measuring pupil 92 and the distance-measuring pupil 93, information on the intensity distributions of the pair of images that the focus detection light fluxes passing through the distance-measuring pupil 92 and the distance-measuring pupil 93 respectively form on the focus detection pixel row is obtained.
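A minimal sketch of this grouping step, assuming each focus detection pixel delivers the outputs of its two photoelectric conversion units as a tuple; the data and names are illustrative, not from the patent.

```python
# Each focus detection pixel yields (left_output, right_output), one value per
# photoelectric conversion unit of the pair behind its microlens.
pixel_outputs = [(12, 11), (18, 14), (30, 24), (26, 31), (15, 20)]

# Group the outputs into the two image strings corresponding to the
# distance-measuring pupils 92 and 93; these are the alpha/beta strings
# that feed the correlation processing described earlier.
alpha = [left for left, _ in pixel_outputs]    # image through pupil 92
beta  = [right for _, right in pixel_outputs]  # image through pupil 93
```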

FIG. 23 is a cross-sectional view showing the detailed structure of the focus detection pixel, that is, a focus detection pixel having a pair of photoelectric conversion units behind one microlens. The focus detection light flux is condensed onto the semiconductor substrate 29 by the microlens 10. An oxide film having an opening is formed on the semiconductor substrate 29 and shields the areas other than the light receiving region from light. The semiconductor substrate 29 is a p-type semiconductor, and a pair of n-type regions 32 and 33 is formed in the light receiving region. The region 34 between the pair of n-type regions 32 and 33 is left as a p-type region without providing an isolation region.

In this structure, PN junctions are formed by the n-type regions 32 and 33 and the p-type substrate, constituting a pair of photoelectric conversion units (photodiodes). Charge generated by the light flux incident on the region 34 flows by diffusion into the n-type region 32 or the n-type region 33. As a result, the photosensitivity distributions 34 and 35 of the n-type region 32 and the n-type region 33 overlap in the region 34.

FIG. 24 is a diagram showing the configuration of an imaging apparatus 201A of a modification. In the imaging apparatus 201 shown in FIG. 1, the image sensor 212 serves both for focus detection and for imaging, whereas in the imaging apparatus 201A of this modification, as shown in FIG. 24, an image sensor 212B dedicated to imaging and a focus detection element 211 dedicated to focus detection are provided. A half mirror 221 that splits the photographing light flux is arranged in the camera body 203, with the image sensor 212B dedicated to imaging arranged on the transmission side and the focus detection element 211 dedicated to focus detection arranged on the reflection side.

Before photographing, focus detection is performed according to the output of the focus detection element 211. At the time of release, image data are generated according to the output of the image sensor 212B dedicated to imaging. The half mirror 221 may instead be a total reflection mirror that is retracted from the photographing optical path during photographing. The arrangement of the focus detection element 211 and the image sensor 212B may also be reversed, with the image sensor 212B dedicated to imaging arranged on the reflection side and the image sensor 211 for focus detection and electronic viewfinder display arranged on the transmission side.

In the embodiment described above, the region obtained by projecting the shape of the photoelectric conversion unit onto the distance-measuring pupil plane through the microlens is treated as the distance-measuring pupil, and the distance-measuring pupils 92 and 93 are drawn with clearly defined outlines as shown in FIG. 14. In practice, however, the outline of each distance-measuring pupil is somewhat blurred by the aberrations and diffraction of the microlens, and the distance-measuring pupil is defined by the distribution indicating the proportion of the light flux passing through it.

The number of focus detection areas and their positions are not limited to the arrangement shown in FIG. 2. A plurality of focus detection areas can be provided by arraying focus detection pixels in the horizontal or vertical direction at a plurality of arbitrary positions in the screen.

In the image sensors shown in FIGS. 3 and 19, the focus detection pixels are arrayed without gaps in the focus detection area, but focus detection pixels may instead be arranged every few pixels. Although the focus detection accuracy decreases somewhat as the pitch of the focus detection pixels increases, the density of the focus detection pixels becomes lower, so the image quality after image interpolation improves.

In the image sensors shown in FIGS. 3 and 19, the imaging pixels and the focus detection pixels are arranged in a dense square lattice, but they may instead be arranged in a dense hexagonal lattice.

In the image sensors shown in FIGS. 3 and 19, the imaging pixels are provided with color filters in a Bayer arrangement, but the configuration and arrangement of the color filters are not limited to this; complementary color filters (green: G, yellow: Ye, magenta: Mg, cyan: Cy) may also be adopted. In that case the focus detection pixels are arranged at pixel positions where cyan and magenta (the colors containing a blue component, for which output errors are relatively inconspicuous) would otherwise be placed.

In the embodiment described above, the data of the focus detection pixels are not used as image data, but the dynamic range may be expanded by combining them, as a luminance signal, with the image data of the image sensor.

In the embodiment described above, the data of the focus detection pixels are used for focus detection, but they can also be used as photometric data for the exposure control of the image sensor. In that case the photometric sensor can be omitted.

In the embodiment described above, the data of the focus detection pixels are used for focus detection, but they can also be used as blur detection data for detecting image blur at the image sensor. In that case an image sensor dedicated to image blur detection can be omitted.

In the embodiment described above, the data of the focus detection pixels are used for image-shift focus detection, but they can also be used as data for contrast detection.

In the embodiment described above, the spectral sensitivity of the focus detection pixels is close to white as shown in FIG. 7, but the sensitivity of the focus detection pixels may instead be set to one of the spectral sensitivities shown in FIG. 6. Focus detection pixels having different spectral sensitivities may also coexist in one focus detection element.

In the embodiment described above, a single distance-measuring pupil plane is set by the focus detection pixels, but a plurality of types of focus detection pixels may be provided corresponding to a plurality of distance-measuring pupil planes having different distance-measuring pupil distances.

In the embodiment described above, the image data of the imaging pixels are not used for focus detection, but they may be used as focus detection data for a contrast detection method, and the focus detection result of the pupil division method using the focus detection pixels may be used together with the focus detection result of the contrast method.

In the imaging apparatus 201A of the modification shown in FIG. 24, the half mirror 221 splits the light flux without wavelength dependence, but the light flux may instead be split by a spectral (dichroic) mirror. For example, using a mirror that separates the light into an infrared component and a visible component, the visible component may be received by the image sensor and the infrared component by the focus detection element, and the focus detection result of the focus detection element may be corrected according to the aberration amount for infrared light so as to obtain a focus detection result valid for visible light. In this way, the amount of visible light received by the image sensor is not reduced by the half mirror.

In the imaging apparatus 201A of the modification shown in FIG. 24, the half mirror 221 may be formed of a multilayer film, or it may be formed as a polarizing mirror (for example, one that reflects the polarization component parallel to the plane of the drawing of FIG. 1 and transmits the polarization component perpendicular to it).

In the embodiment described above, a lens built into the interchangeable lens is moved in the optical-axis direction for focus adjustment, but the image sensor in the camera body may instead be moved in the optical-axis direction, or coarse focus adjustment may be performed by moving the lens in the optical-axis direction and fine focus adjustment by moving the image sensor.

The present invention is not limited to image sensors and focus detection elements having focus detection pixels of the pupil division phase difference detection type using microlenses; it can also be applied to image sensors and focus detection elements having focus detection pixels of other pupil division phase difference detection types. For example, it can be applied to image sensors and focus detection elements having focus detection pixels of a pupil division phase difference detection type using polarized light, or to a pupil division phase difference detection method using a re-imaging lens.

In the embodiment described above, the image sensor and the focus detection element may be CCD image sensors or CMOS image sensors.

The imaging apparatus is not limited to a digital still camera composed of a camera body with a detachable interchangeable lens. The invention can also be applied to lens-integrated digital still cameras and video cameras, to small camera modules built into mobile phones and the like, and to surveillance cameras. It can further be applied to focus detection devices other than cameras, to distance measuring devices, and to stereo distance measuring devices.

FIG. 1 Cross-sectional view showing the configuration of a camera according to an embodiment
FIG. 2 Diagram showing the arrangement of focus detection pixels on the photographing screen
FIG. 3 Front view showing the detailed configuration of the image sensor
FIG. 4 Diagram showing the configuration of an imaging pixel
FIG. 5 Diagram showing the configuration of focus detection pixels
FIG. 6 Diagram showing the spectral sensitivity characteristics of the green, red and blue imaging pixels
FIG. 7 Diagram showing the spectral sensitivity characteristic of the focus detection pixels
FIG. 8 Cross-sectional view of an imaging pixel
FIG. 9 Cross-sectional view of a focus detection pixel
FIG. 10 Cross-sectional view of a focus detection pixel
FIG. 11 Diagram showing the configuration of a focus detection optical system of the pupil division phase difference detection type using microlenses
FIG. 12 Diagram for explaining the relationship between defocus and image shift in the pupil division phase difference detection method
FIG. 13 Diagram for explaining the relationship between the imaging pixels and the distance-measuring pupil plane
FIG. 14 Front view showing the projection relationship on the distance-measuring pupil plane
FIG. 15 Diagram for explaining the effect of overlapping the distance-measuring pupils and the appropriate overlap amount
FIG. 16 Diagram for explaining the effect of overlapping the distance-measuring pupils and the appropriate overlap amount
FIG. 17 Flowchart showing the operation of the digital still camera (imaging apparatus)
FIG. 18 Diagram explaining the details of the image shift detection calculation processing (correlation calculation processing, phase difference detection processing)
FIG. 19 Front view showing the detailed configuration of an image sensor of a modification
FIG. 20 Front view showing the detailed configuration of an image sensor of a modification
FIG. 21 Diagram showing cross sections of the focus detection pixel
FIG. 22 Diagram explaining the focus detection method by the pupil division method using microlenses
FIG. 23 Cross-sectional view showing the detailed structure of the focus detection pixel
FIG. 24 Diagram showing the configuration of an imaging apparatus of a modification

Explanation of Symbols

10 Microlens
11, 12, 13, 22, 23 Photoelectric conversion unit
24 Overlapping region
201, 201A Digital still camera
202 Interchangeable lens
206 Lens drive control device
212, 212A, 212B Image sensor
310 Imaging pixel
311, 312, 313 Focus detection pixel
214 Body drive control device

Claims (4)

A focus detection apparatus comprising:
a focus detection element which is arranged at a planned focal plane of an imaging optical system in an imaging apparatus onto which a plurality of types of imaging optical systems having different exit pupil distances can be interchangeably mounted, and in which a plurality of first focus detection pixels and second focus detection pixels are arrayed, the pixels receiving a pair of light fluxes passing through the imaging optical system and outputting a pair of image signals used for phase difference detection for detecting a focus adjustment state of the imaging optical system; and
focus detection calculation means for detecting the focus adjustment state of the imaging optical system based on output signals of the plurality of first focus detection pixels and the plurality of second focus detection pixels, wherein:
each first focus detection pixel is composed of a first microlens and a first photoelectric conversion unit that receives light via the first microlens, the first photoelectric conversion unit receiving a light flux that passes through a first region set in common for all the first focus detection pixels on a pupil plane (referred to as a distance-measuring pupil plane) set at a predetermined distance (referred to as a distance-measuring pupil distance) from the planned focal plane;
each second focus detection pixel is composed of a second microlens different from the first microlens and a second photoelectric conversion unit that receives light via the second microlens, the second photoelectric conversion unit receiving a light flux that passes through a second region, different from the first region, set in common for all the second focus detection pixels on the distance-measuring pupil plane;
the distance-measuring pupil distance is an average distance of the exit pupil distances of the plurality of types of imaging optical systems;
the first region and the second region include, as a common region, a third region containing the optical axis of the imaging optical system, and the width of the third region in the direction connecting the centroid of the first region and the centroid of the second region is set shorter than the diameter of the region through which the light flux of the largest F-number among the maximum-aperture F-numbers of the plurality of types of imaging optical systems passes on the distance-measuring pupil plane;
the first focus detection pixels and the second focus detection pixels are arranged alternately along a straight line in the direction connecting the centroid of the first region and the centroid of the second region;
the plurality of first focus detection pixels output a first signal data string and the plurality of second focus detection pixels output a second signal data string; and
the focus detection calculation means calculates first calculation data by multiplying first data in the first signal data string by data in the second signal data string in the vicinity of second data corresponding to the first data, calculates second calculation data by multiplying second data in the second signal data string by data in the first signal data string in the vicinity of first data corresponding to the second data, calculates the degree of correlation between the first calculation data and the second calculation data, and calculates the defocus amount based on the correlation amount.
The focus detection apparatus according to claim 1, wherein the largest F-number among the maximum-aperture F-numbers of the plurality of types of imaging optical systems is F5.6, and the width of the third region is determined so that the value obtained by dividing the distance-measuring pupil distance by the distance between the centroids of the first region and the second region as limited by the exit pupil of the imaging optical system whose maximum-aperture F-number is F5.6 (hereinafter referred to as the distance-measuring F-number) is F20 or more.
The focus detection apparatus according to claim 1 or claim 2, wherein a plurality of third focus detection pixels and fourth focus detection pixels, which receive a pair of light fluxes passing through the imaging optical system and output a pair of image signals used for phase difference detection for detecting the focus adjustment state of the imaging optical system, are further arrayed, and a distance-measuring pupil distance relating to the third focus detection pixels and the fourth focus detection pixels differs from the distance-measuring pupil distance relating to the first focus detection pixels and the second focus detection pixels.
An imaging apparatus comprising: the focus detection apparatus according to claim 1; a focus adjustment means for performing focus adjustment of the imaging optical system based on the defocus amount calculated by the focus detection calculation means; and an image sensor that captures an image formed by the imaging optical system.
JP2007059641A 2007-03-09 2007-03-09 Focus detection apparatus and imaging apparatus Active JP4992481B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007059641A JP4992481B2 (en) 2007-03-09 2007-03-09 Focus detection apparatus and imaging apparatus

Publications (2)

Publication Number Publication Date
JP2008224801A JP2008224801A (en) 2008-09-25
JP4992481B2 true JP4992481B2 (en) 2012-08-08

Family

ID=39843539

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007059641A Active JP4992481B2 (en) 2007-03-09 2007-03-09 Focus detection apparatus and imaging apparatus

Country Status (1)

Country Link
JP (1) JP4992481B2 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5228777B2 (en) * 2008-10-09 2013-07-03 株式会社ニコン Focus detection apparatus and imaging apparatus
CN102227665B (en) * 2008-11-27 2015-02-25 佳能株式会社 Solid-state image sensing element and image sensing apparatus
JP2010128205A (en) * 2008-11-27 2010-06-10 Nikon Corp Imaging apparatus
JP5407314B2 (en) * 2008-12-10 2014-02-05 株式会社ニコン Focus detection apparatus and imaging apparatus
JP5159700B2 (en) 2009-05-19 2013-03-06 キヤノン株式会社 Optical apparatus and focus detection method
JP2011059337A (en) * 2009-09-09 2011-03-24 Fujifilm Corp Image pickup apparatus
JP2011077770A (en) * 2009-09-30 2011-04-14 Fujifilm Corp Controller of solid-state imaging device and operation control method thereof
JP5434761B2 (en) * 2010-04-08 2014-03-05 株式会社ニコン Imaging device and imaging apparatus
JP2011221290A (en) * 2010-04-09 2011-11-04 Olympus Corp Imaging device and camera
JP2012182332A (en) * 2011-03-02 2012-09-20 Sony Corp Imaging element and imaging device
JP2014164258A (en) * 2013-02-27 2014-09-08 Nikon Corp Focus detection device and amount of image deviation detection device
EP2848974B1 (en) 2012-05-07 2018-11-21 Nikon Corporation Focus detection device
US9036080B2 (en) * 2012-09-04 2015-05-19 Canon Kabushiki Kaisha Apparatus and method for acquiring information about light-field data

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61240213A (en) * 1985-04-17 1986-10-25 Nippon Kogaku Kk <Nikon> Focus detecting device
JP2959142B2 (en) * 1991-02-22 1999-10-06 ソニー株式会社 Solid-state imaging device
JPH04274404A (en) * 1991-03-01 1992-09-30 Nikon Corp Focus detection device
JPH04345279A (en) * 1991-05-22 1992-12-01 Konica Corp Automatic focusing device for still video camera
JP3617065B2 (en) * 1994-02-23 2005-02-02 株式会社ニコン Focus detection device
JP2985126B2 (en) * 1997-07-08 1999-11-29 株式会社ニコン Focus detection camera
JP4500434B2 (en) * 2000-11-28 2010-07-14 キヤノン株式会社 Imaging apparatus, imaging system, and imaging method
JP4532968B2 (en) * 2004-04-13 2010-08-25 キヤノン株式会社 Focus detection device
JP4721769B2 (en) * 2005-05-17 2011-07-13 Hoya株式会社 Digital camera

Also Published As

Publication number Publication date
JP2008224801A (en) 2008-09-25

Similar Documents

Publication Publication Date Title
JP4992481B2 (en) Focus detection apparatus and imaging apparatus
JP5012495B2 (en) IMAGING ELEMENT, FOCUS DETECTION DEVICE, FOCUS ADJUSTMENT DEVICE, AND IMAGING DEVICE
JP5552214B2 (en) Focus detection device
JP5163068B2 (en) Imaging device
JP5374862B2 (en) Focus detection apparatus and imaging apparatus
JP4826507B2 (en) Focus detection apparatus and imaging apparatus
JP5157400B2 (en) Imaging device
JP4858008B2 (en) FOCUS DETECTION DEVICE, FOCUS DETECTION METHOD, AND IMAGING DEVICE
JP5045007B2 (en) Imaging device
JP2007317951A (en) Optical sensing element and photographing device
JP2008268403A (en) Focus detection device, focus detection method, and imaging apparatus
JP5067148B2 (en) Imaging device, focus detection device, and imaging device
JP4983271B2 (en) Imaging device
JP4858529B2 (en) Imaging device and imaging apparatus
JP4858179B2 (en) Focus detection apparatus and imaging apparatus
JP2013054120A (en) Focus detection device and focus adjustment device
JP2009063952A (en) Imaging device, focus detecting device and imaging apparatus
JP2010210903A (en) Image capturing apparatus
JP5866760B2 (en) Imaging device
JP5278123B2 (en) Imaging device
JP5338112B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5332384B2 (en) Correlation calculation device, focus detection device, and imaging device
JP5399627B2 (en) Focus detection device, focus adjustment device, and imaging device
JP5685892B2 (en) Focus detection device, focus adjustment device, and imaging device
JP5949893B2 (en) Imaging device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20100121

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110121

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110520

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110524

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110713

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20110713

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20111018

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20111219

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120410

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20120423

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20150518

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Ref document number: 4992481

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
