JP2015043026A - Image capturing device and control method therefor - Google Patents


Info

Publication number
JP2015043026A
Authority
JP
Japan
Prior art keywords
focus detection
area
focus
signal
image
Prior art date
Legal status
Granted
Application number
JP2013174507A
Other languages
Japanese (ja)
Other versions
JP6271911B2 (en)
Inventor
Teruhiro Nishio (西尾 彰宏)
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to JP2013174507A
Publication of JP2015043026A
Application granted
Publication of JP6271911B2
Legal status: Active
Anticipated expiration

Landscapes

  • Focusing (AREA)
  • Automatic Focus Adjustment (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide an image capturing device that uses an image sensor having focus detection pixels and achieves both high-speed crop shooting and a large focus detection area.

SOLUTION: An image capturing device with a crop shooting feature has an image sensor 158 with focus detection pixels of the imaging-plane phase-difference detection type, and controls the reading of signals from focus detection pixels located outside the crop shooting area. A focus detection area setting unit 163 determines whether to acquire signals from the focus detection pixels outside the crop shooting area, using area information on the focus detection pixels, information on the effective image area of the imaging lens mounted on the camera body 150, and information on the crop shooting area, and sets the focus detection area accordingly. A focus detection signal extraction unit 164 obtains the positional information of the focus detection pixels from the focus detection area setting unit 163, extracts the signals of the focus detection pixels in the set focus detection area from the output signals of the image sensor 158, and outputs them to a camera-side CPU 166 via an AF method selection unit 165.

Description

The present invention relates to an image capturing apparatus that performs phase-difference focus detection using focus detection pixels of an image sensor, and to a control method therefor.

In so-called live view shooting, in which still images or movies are captured while the subject image is observed in real time, accurate and fast focus detection is desired. Patent Document 1 discloses a mechanism for performing phase-difference focus detection during live view shooting. In interchangeable-lens digital cameras, however, when an imaging optical system that does not match the size of the image sensor is used, image data from regions affected by vignetting is also recorded. For this reason, Patent Document 2 discloses an apparatus that records the image of the effective screen area by trimming the recorded image area to limit the used area (hereinafter referred to as crop shooting).

Patent Document 1: JP 2004-191629 A
Patent Document 2: JP 2005-175683 A
Patent Document 3: JP 2009-244862 A

In conventional crop shooting, limiting the readout area of the image signal simultaneously narrows the readout area for the focus detection pixel signals used in phase-difference focus detection. The length of the focus detection signal waveform available for correlation calculation (hereinafter referred to as the detection length) is therefore restricted at focus detection points corresponding to subjects near the edge of the screen, and focus detection accuracy may deteriorate. Moreover, to secure a sufficient detection length, the focus detection points must be placed closer to the center of the imaging screen, with the result that the area in which focus detection is possible becomes narrow.

An object of the present invention is to achieve, in an image capturing apparatus using an image sensor having focus detection pixels, both high speed during crop shooting and a wide focus detection area.

To solve the above problem, an apparatus according to the present invention includes an image sensor having focus detection pixels that output focus detection signals for phase-difference detection, and outputs a captured image by acquiring signals from the imaging pixels in a first area during crop shooting. The apparatus comprises control means for acquiring signals from the focus detection pixels in the first area and from the focus detection pixels in a second area that includes the first area, and performing a focus detection calculation, and focus adjustment means for adjusting the focus of the imaging optical system according to the result of the focus detection calculation performed by the control means.

According to the present invention, an image capturing apparatus using an image sensor having focus detection pixels can achieve both high speed during crop shooting and a wide focus detection area.

FIG. 1 is a block diagram of an image capturing apparatus according to an embodiment of the present invention.
FIG. 2 illustrates an arrangement of focus detection pixels that uses masks on the image sensor.
FIG. 3 illustrates an arrangement of focus detection pixels in which each photoelectric conversion unit of the image sensor is divided into two.
FIG. 4 illustrates an arrangement of focus detection pixels in which each photoelectric conversion unit of the image sensor is divided into four and two signals are added.
FIG. 5 illustrates the pupil projection state of a focus detection pixel.
FIG. 6 shows the signal waveforms of a pair of focus detection pixel groups used for correlation calculation and their centroid interval (image shift).
FIG. 7 is a schematic diagram showing the relationship between the focus detection waveform and the detection length when the imaging optical system is in focus.
FIG. 8 is a schematic diagram showing the relationship between the focus detection waveform and the detection length when the subject image exhibits the maximum blur the imaging optical system can produce.
FIG. 9 shows the effective imaging area, the effective image area, the crop shooting area, and the area in which focus detection is possible at maximum blur.
FIG. 10 illustrates how the focus-detectable area is extended in the present embodiment.
FIG. 11 is a flowchart explaining the focus detection processing and focusing control operation during shooting in the present embodiment.

Embodiments of the present invention are described below with reference to the accompanying drawings. The present invention is applicable to image capturing apparatuses with a crop shooting function, such as single-lens reflex cameras, compact digital cameras, and video cameras, that use an image sensor having phase-difference focus detection pixels.
FIG. 1 is a block diagram of an image capturing apparatus according to the present invention. The apparatus consists of an interchangeable lens unit 100 and a camera body 150 that uses an image sensor having focus detection pixels, and can record still images and moving images. Descriptions of parts not directly related to the technical content of the present invention, such as the image recording circuit, display drive circuit, and operation members, are omitted. The camera body 150 may also include contrast-type focus detection means that uses the image from the image sensor.

The lens group 101 is the optical member constituting the imaging optical system. The iris diaphragm 102, whose aperture diameter is controlled by the aperture diameter drive unit 106, adjusts the amount of light during shooting. In this imaging optical system, the focus position is adjusted by moving the lens group 101 along the optical axis with the focus drive unit 105. The focus drive unit 105 and the aperture diameter drive unit 106 receive control commands from the lens-side CPU (central processing unit) 104 and perform drive control. As described later, communication is performed between the camera-side transmission unit 168 and the lens-side transmission unit 103, and exposure adjustment and focus adjustment are carried out based on the photometric value and the focus detection evaluation value obtained in the camera body 150.

The memory in the lens unit 100 stores effective image area information 107 and optical system information 108. The effective image area information 107 indicates the area (effective image area) in which both the focus detection image and the recorded image are valid, determined by the peripheral illumination characteristics and image circle diameter of the imaging optical system. The optical system information 108 includes the following:
- the exit pupil position of the lens unit 100 and the F-number range corresponding to the settable aperture diameters;
- optical characteristic information such as the focal length of the imaging optical system (the focal length range in the case of an interchangeable zoom lens);
- identification information of the imaging optical system.
The effective image area information 107 and the optical system information 108 are transmitted by communication from the lens-side transmission unit 103 to the camera body 150 via the camera-side transmission unit 168.
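As a rough illustration of the payload exchanged here, the static lens information could be modelled as follows. This is a minimal sketch: every field name and value below is an assumption for illustration, not the patent's data format.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class EffectiveImageArea:
    # Usable region of the sensor (pixels), dictated by the lens's
    # image circle and peripheral illumination characteristics.
    left: int
    top: int
    width: int
    height: int

@dataclass
class LensInfo:
    # Static optical-system information sent to the camera body,
    # e.g. when the lens is mounted (all names are illustrative).
    lens_id: str                                # identification information
    exit_pupil_position_mm: float               # exit pupil position
    f_number_range: Tuple[float, float]         # settable aperture range
    focal_length_range_mm: Tuple[float, float]  # equal endpoints for primes
    effective_image_area: EffectiveImageArea

# Hypothetical full-frame prime whose image circle covers the sensor:
lens = LensInfo("ID-0001", 60.0, (1.8, 22.0), (50.0, 50.0),
                EffectiveImageArea(0, 0, 6000, 4000))
```

Dynamic values such as the current aperture diameter would be sent separately during operation, as the text describes.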

Next, the configuration of the camera body 150, which takes the form of an interchangeable-lens single-lens reflex camera, is described.
Light rays from the subject passing through the lens unit 100 are deflected or not depending on the position of the flip-up reflecting member 155. In the reference state, in which the reflecting member 155 lies in the shooting optical path, the light from the subject is deflected and a subject image is formed on the focusing screen 154, which has a display area adjustment function. The user observes this subject image through the eyepiece 152 via the pentagonal reflecting prism 151. Part of the subject image formed on the focusing screen 154 is also imaged onto an internal sensor through the optical system of the photometric unit 153, generating an electrical signal for photometry. Based on this signal, the exposure amount determination unit 160 determines an appropriate exposure and generates signals for adjusting the gain of the image sensor drive unit 161 and for controlling the aperture diameter of the lens unit 100.

Part of the reflecting surface of the reflecting member 155 is semi-transmissive; the transmitted light is deflected by the sub-reflecting member 156 and guided to the focus detection unit 157. When the reflecting member 155 is retracted upward out of the shooting optical path (see the broken line in the figure), the sub-reflecting member 156 folds up in conjunction with it, so that the light from the subject entering the image sensor 158 is not obstructed.

The light guided to the focus detection unit 157 is received by a focus detection sensor inside the unit. The focus detection sensor generates a pair of image signals (focus detection signals) for phase-difference detection and outputs them to the AF (autofocus) method selection unit 165.
The AF method selection unit 165 switches between the phase-difference detection method using the dedicated focus detection sensor and the imaging-plane phase-difference detection method based on so-called pupil division, in which focus detection pixels are arranged on the image sensor 158. Focus detection and focus adjustment are performed according to the selected method.

The image sensor 158 consists of a CMOS (complementary metal oxide semiconductor) sensor from which pixel signals can be read out partially, together with its peripheral circuits. Its light-receiving pixels are arranged in a square grid of M pixels horizontally and N pixels vertically. A two-dimensional single-chip color sensor with an on-chip Bayer-pattern primary-color mosaic filter is used. As described in detail later, focus detection pixels are arranged within the image sensor.

The following description assumes the state in which the reflecting member 155 has moved upward and retracted from the shooting optical path (see the broken line in FIG. 1), so that the image sensor 158 receives the subject image. First, the effective image area information 107 and optical system information 108 from the lens unit 100, together with dynamic optical characteristics output by the lens-side CPU 104 such as the aperture diameter (F-number) and focus position, are transmitted to the camera body 150. This information reaches the imaging area setting unit 162 and the focus detection area setting unit 163 via the lens-side transmission unit 103 and the camera-side transmission unit 168. Information that does not change dynamically, unlike the aperture diameter, is transferred, for example, when the lens unit 100 is mounted on the camera body 150.

The image sensor drive unit 161 sets the area of the pixel array of the image sensor 158 from which image signals are acquired and drives the sensor accordingly. The imaging area setting unit 162 determines the imaging area to be used for recording and outputs area information for displaying the subject image on the focusing screen 154, which adjusts its display accordingly.
The focus detection area setting unit (hereinafter, area setting unit) 163 calculates the detection length required for the focus detection calculation using the effective image area information specific to the imaging optical system and its various characteristic information, and defines the position information of the focus detection pixels needed for focus detection. The focus detection signal extraction unit (hereinafter, signal extraction unit) 164 extracts, from the pixel signals obtained by the image sensor 158, the pair of electrical signals for phase-difference detection at the focus detection pixel positions defined by the area setting unit 163. These focus detection signals are output to the AF method selection unit 165.

The AF method selection unit 165 selects the focus detection signal from either the focus detection unit 157 or the signal extraction unit 164. If a contrast method is also implemented, a focus detection signal based on a contrast evaluation value is added to the options. In selecting the AF method to use, for example, the position of the reflecting member 155 is detected and the light-guiding state of the subject rays is taken into account. The contrast method and the imaging-plane phase-difference detection method are also selected automatically according to an evaluation of the subject image during live view. Manual selection by user operation is of course also possible. The focus detection signal selected by the AF method selection unit 165 is output to the camera-side CPU 166.
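That selection logic, mirror position first, then live-view evaluation, with a manual override taking precedence, might be sketched as follows. The function and return-value names are illustrative assumptions, not the patent's terminology:

```python
def select_af_method(mirror_down, manual_choice=None, contrast_better=False):
    """Pick the source of the focus detection signal (sketch only).

    With the reflecting member in the shooting optical path, light
    reaches the dedicated AF sensor, so its phase-difference signal is
    used.  In live view, the imaging-plane phase-difference method is
    used unless evaluation of the subject image favours contrast AF.
    A manual user setting overrides the automatic choice.
    """
    if manual_choice is not None:
        return manual_choice
    if mirror_down:
        return "sensor-phase-difference"
    return "contrast" if contrast_better else "imaging-plane-phase-difference"
```

Whichever signal is chosen is then handed to the camera-side CPU for the focus detection calculation.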

After the AF method is determined, the camera-side CPU 166 acquires the focus detection signal from the AF method selection unit 165, performs the focus detection calculation, and computes the defocus amount. The focus drive amount determination unit 167 calculates the movement direction and drive amount of the focus lens for focusing control based on the defocus amount from the camera-side CPU 166. This drive control information is transmitted from the camera-side transmission unit 168 to the lens-side CPU 104 via the lens-side transmission unit 103, and a drive control command to the focus drive unit 105 drives the focus lens to perform focus adjustment. The present embodiment provides a configuration that secures a wide focus-detectable area without degrading high-speed continuous shooting performance during crop shooting.

Next, the configuration and light-receiving characteristics of the focus detection pixels provided in the image sensor 158 are described. FIG. 2 shows an example of the pixel array structure of the image sensor 158. The vertical direction in FIG. 2 is defined as the Y direction and the horizontal direction as the X direction.
The pixel group 200 in FIG. 2 consists of photoelectric conversion units that form the captured image. The pixel groups 201 to 204 are focus detection pixel groups whose photoelectric conversion units have light-shielding structures within the pixels (see Patent Document 3). The photoelectric conversion signals output by the pixel groups 201 and 202, arranged in a line along the Y direction, are used as a pair of correlation calculation signals for phase-difference detection, performing focus detection on subjects with horizontal stripe patterns. Similarly, for subjects with vertical stripe patterns, the pixel groups 203 and 204, arranged in a line along the X direction, are used: their output signals are acquired and a correlation calculation is performed.

FIG. 3 illustrates the pixel array structure of an image sensor in which two photoelectric conversion units are arranged under each microlens. The vertical direction in FIG. 3 is defined as the Y direction and the horizontal direction as the X direction.
The pixel group 300 in FIG. 3 is used for focus detection on subjects with stripe patterns in the X direction; the signals output by the pixels 302 and 303, aligned in the Y direction, serve as a pair of correlation calculation signals. The pixel group 301 is used for focus detection on subjects with stripe patterns in the Y direction; the signals output by the pixels 304 and 305, aligned in the X direction, serve as a pair of correlation calculation signals.
For the captured image signal, the sum of the signals from pixels 302 and 303, and the sum of the signals from pixels 304 and 305, are used.

FIG. 4 illustrates the pixel array structure of an image sensor in which four photoelectric conversion units are arranged under each microlens. By changing how the electrical signals of the four photoelectric conversion units are added, the pixel characteristics described with reference to FIG. 3 can be obtained. The vertical direction in FIG. 4 is defined as the Y direction and the horizontal direction as the X direction.

In FIG. 4, when the pixel group 400 performs focus detection on a subject with a stripe pattern in the Y direction, the outputs of the pixels 401 and 402 aligned in the X direction are added, as are the outputs of the pixels 403 and 404. The resulting two rows of pixel signals are used as a pair of electrical signals for correlation calculation.
When performing focus detection on a subject with a stripe pattern in the X direction, the outputs of the pixels 401 and 403 aligned in the Y direction are added, as are the outputs of the pixels 402 and 404. The resulting two columns of pixel signals are used as a pair of electrical signals for correlation calculation.
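The two addition schemes can be illustrated with a small NumPy sketch in which each pixel stores its four photodiode outputs as a 2x2 block. The array layout and function names are assumptions for illustration:

```python
import numpy as np

def correlation_pair(quad, stripe_direction):
    """Form the pair of correlation signals from a 4-way-split pixel.

    quad[..., 0, 0] is photodiode 401, [..., 0, 1] is 402,
    [..., 1, 0] is 403, [..., 1, 1] is 404 (numbering as in FIG. 4;
    the layout itself is an assumption for this sketch).
    """
    if stripe_direction == "Y":
        # Y-direction stripe pattern: add the X-aligned outputs,
        # 401 + 402 and 403 + 404, giving a pair of row signals.
        return (quad[..., 0, 0] + quad[..., 0, 1],
                quad[..., 1, 0] + quad[..., 1, 1])
    # X-direction stripe pattern: add the Y-aligned outputs,
    # 401 + 403 and 402 + 404, giving a pair of column signals.
    return (quad[..., 0, 0] + quad[..., 1, 0],
            quad[..., 0, 1] + quad[..., 1, 1])

def image_signal(quad):
    """The captured-image value is the sum of all four photodiodes."""
    return quad.sum(axis=(-2, -1))

# A single pixel whose photodiodes 401..404 read 1, 2, 3, 4:
quad = np.array([[1.0, 2.0], [3.0, 4.0]])
```

Both focus detection pairs and the captured-image value are derived from the same four outputs, which is why switching the addition scheme per block, as described next, costs nothing in image quality.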

The two addition schemes for focus detection described above may be varied by dividing the image sensor into a plurality of blocks and changing the addition method per block. Alternating the addition method in a staggered (checkerboard) arrangement realizes a pixel array structure equivalent to that of FIG. 3. Since subjects with vertical and horizontal stripe patterns can then be evaluated simultaneously, the dependence of focus detection on the direction of the subject pattern can be eliminated.

The addition method applied to all pixels may also be switched according to the shooting state, or over time. Because the focus detection pixels associated with a given pattern direction then become densely distributed, this avoids the problem that a subject containing thin line segments falling within a sparse range of focus detection pixels cannot be detected near the in-focus position.
The signal for the captured image can be obtained as the sum of the signals of the pixels 401 to 404.

With such an image sensor structure, there is no need to split off part of the subject image passing through the imaging optical system into a dedicated focus detection optical system, as in conventional phase-difference focus detection. Live view shooting can therefore be performed while monitoring the subject image as the image sensor receives light and records images in real time, and phase-difference focus detection becomes possible in movie shooting, which conventionally could not be done without a mechanism for splitting the subject light.

In the present embodiment, when a CMOS sensor is used as the image sensor, signals can be read partially from the imaging pixels and focus detection pixels, so a focus detection signal can be acquired from a subset of the focus detection pixels.

Next, the structure for generating phase-difference focus detection signals with the image sensor configured as above is described with reference to FIGS. 5 and 6. The focus detection pixels are assumed to have, for example, the structure of FIG. 2.
The cross section 500 in FIG. 5 shows a focus detection pixel with the structure shown in FIG. 2, comprising a microlens 501 and a light-shielding member 502. The photoelectric conversion units 503 and 504 belong to focus detection pixels that respectively output the pair of electrical signals used for correlation calculation (hereinafter, the A image signal and the B image signal). The focus detection pixels are arranged along one line, as described with reference to FIG. 2. If necessary, the photoelectric conversion signal of each focus detection pixel is linearly interpolated using the output signals of adjacent pixels. From the outputs of the focus detection pixels, the A and B image signals for the correlation calculation are generated.

Within the exit pupil shape EP0 of the imaging optical system, the regions EPa and EPb denote the separated pupil regions whose light fluxes enter the photoelectric conversion units 503 and 504, that is, the focus detection pixels for the A and B image signals, respectively. By thus dividing the entrance pupil of the imaging optical system into a plurality of pupil regions and capturing the rays separately, focus detection can be performed based on how the A and B image signals change with the focus state.

FIG. 6 illustrates the output waveforms of the focus detection pixel groups for the A and B image signals when the focus detection pixels shown in FIG. 5 are arranged along one line.
The waveforms AI0 and BI0 in FIG. 6 are obtained by interpolating and combining the output signals of the focus detection pixel groups for the A and B images. In phase-difference focus detection, for example, the relative positions of the A and B waveforms are shifted and the waveforms are overlaid; the state in which the area of their difference is smallest is judged to be the state of highest correlation. In the correlation calculation, the relative shift amount (image shift amount) between the A and B waveforms, indicated by L0 in FIG. 6, is computed and then converted into a defocus amount.
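The shift-and-compare search described here can be sketched with a sum-of-absolute-differences measure, where the shift minimising the difference area between the overlaid waveforms is taken as the image shift amount. This is an illustrative sketch only; the conversion of the shift to a defocus amount via the pupil-division geometry is omitted.

```python
import numpy as np

def image_shift(a, b, max_shift):
    """Return the integer shift s (|s| <= max_shift) for which a[i]
    best matches b[i - s], i.e. the relative displacement of the A and
    B waveforms at which the area of their difference is smallest.
    """
    n = len(a)
    best_s, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)   # valid overlap of a[i], b[i-s]
        err = np.abs(a[lo:hi] - b[lo - s:hi - s]).sum()
        if err < best_err:
            best_s, best_err = s, err
    return best_s
```

Note how the overlap window shrinks as the trial shift grows: this is exactly why a sufficient detection length matters, as the next paragraph explains, since a truncated waveform can make a wrong shift score as well as the true one.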

Therefore, in order to perform an accurate correlation calculation using the pair of focus detection signal waveforms during focus detection, the acquired waveforms should have sufficient length. Suppose, for example, that the end of one of the A image and B image waveforms in FIG. 6 were cut off. In that case, the correlation calculation would yield an image shift amount different from the one that actually gives the maximum degree of correlation. In other words, if a detection length appropriate to the imaging state cannot be secured, the focus detection accuracy deteriorates.

FIGS. 7 and 8 show a subject image formed on the imaging surface that contains the focus detection pixels, in an in-focus state and in a heavily blurred, out-of-focus state, respectively. The following is a conceptual explanation of the difference in the detection length required in each state.

FIGS. 7 and 8 show a subject image of a cross-shaped subject 701 formed on the imaging surface. As indicated by light rays 703a to 703c, light from the subject 701 passes through the imaging optical systems 700a to 700c and forms an image on the imaging surface 702 of an image sensor having focus detection pixels. FIG. 7(A) and FIGS. 8(A) and 8(B) show schematic optical path diagrams, while FIGS. 7(B) and 8(C) show the relationship between the focus detection pixels and the subject image in simplified form.

FIG. 7 shows the state of a subject image 710a that is almost in focus.
The focus detection pixel group 711 is arranged in the horizontal direction and detects the in-focus state of the vertical component of the subject; the focus detection pixel group 712 is arranged in the vertical direction and detects the in-focus state of the horizontal component. Signal waveforms 720a and 721a are the output signals of the A image and the B image obtained by the focus detection pixel group 711, and signal waveforms 730a and 731b are those obtained by the focus detection pixel group 712. The correlation calculation is performed using these signal waveforms. In an almost in-focus state, the image shift amount between the A image signal and the B image signal is small, so accurate focus detection is possible without using all the focus detection signals over the full detection lengths of the focus detection pixel groups 711 and 712.
A method that performs focus detection simultaneously on the horizontal and vertical components of the subject image in this way is referred to as a cross focus detection method.

FIG. 8 shows a heavily out-of-focus state in which a blurred subject image 710b is formed on the imaging surface 702.
FIG. 8(A) shows a so-called rear-focus state in which the focal point lies behind (to the right of) the imaging surface. Here the subject 701 is at the closest distance within the focusing range of the imaging optical system 700b, while the imaging optical system 700b is set to the focus position for an infinite object distance. FIG. 8(B) shows a so-called front-focus state in which the focal point lies in front of (to the left of) the imaging surface. Here the subject 701 is at infinity, while the imaging optical system 700c is set to the focus position for the closest object distance.
Of the two states shown in FIGS. 8(A) and 8(B), the amount of blur when the degree of blur is at its maximum is called the maximum blur amount of the imaging optical system. Normally, the maximum blur amount is the blur amount when the aperture of the imaging optical system is wide open. For convenience of explanation, the blur amounts in the states of FIGS. 8(A) and 8(B) are assumed to be equal.

When the focus detection area is to change dynamically with the aperture value, it is set so that the maximum blur amount corresponding to each aperture value can be derived. For this purpose, the numerical values representing the maximum blur amount are stored in memory in the form of a reference table. Alternatively, coefficient values with the aperture value as a variable may be stored in memory and the maximum blur amount calculated from an arithmetic expression. When the imaging optical system is a zoom lens, it is preferable to divide the focal length range into segments and hold a maximum blur amount for each segment.
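The reference-table idea can be sketched as follows; the table contents, the focal-length segment boundaries, and the 1/N fallback scaling are all illustrative assumptions, not values from the patent.

```python
# Hypothetical reference table: for each focal-length segment of a zoom
# lens, the maximum blur amount (in pixels) per aperture value (F-number).
# All numbers are illustrative placeholders.
MAX_BLUR_TABLE = {
    (24, 50):  {1.8: 120, 2.8: 80, 4.0: 56, 5.6: 40},
    (50, 105): {2.8: 96, 4.0: 68, 5.6: 48},
}

def max_blur_amount(focal_length_mm, f_number):
    """Look up the maximum blur amount for the current zoom and aperture.

    If the exact F-number is not tabulated, fall back to an assumed
    blur ~ 1/N scaling anchored at the widest tabulated aperture of
    the segment (the "arithmetic expression" alternative in the text).
    """
    for (lo, hi), row in MAX_BLUR_TABLE.items():
        if lo <= focal_length_mm < hi:
            if f_number in row:
                return row[f_number]
            n0 = min(row)  # widest tabulated aperture of this segment
            return row[n0] * n0 / f_number
    raise ValueError("focal length outside tabulated range")
```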

Signal waveforms 720b and 721b in FIG. 8(C) show the A image signal and the B image signal produced by the focus detection pixel group 711 when the blur amount is large, and signal waveforms 730b and 731b show those produced by the focus detection pixel group 712. The detection length 740 is the detection length needed to perform focus detection on the blurred image for the vertical component of the light flux from the subject, and the detection length 741 is that needed for the horizontal component.
By setting the detection length in each direction, the entire spread waveform of both the vertical component (see signal waveforms 720b and 721b) and the horizontal component (signal waveforms 730b and 731b) can be captured even when the degree of blur is large.
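As a rough illustration of how the required detection length grows with blur (detection lengths 740 and 741), one can assume the blurred image spreads a nominal in-focus window by the maximum blur amount on each side; this simple additive model is an assumption for illustration, not a formula given in the patent.

```python
def required_detection_length(base_window_px, max_blur_px):
    """Detection length needed so the whole blurred waveform fits.

    Assumed model: the in-focus signal occupies base_window_px, and a
    maximally blurred image spreads it by max_blur_px on each side, as
    with detection lengths 740 and 741 in FIG. 8(C).
    """
    return base_window_px + 2 * max_blur_px
```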

Next, with reference to FIGS. 9 and 10, the relationship between the crop photography area on the image sensor having focus detection pixels, the effective image area, and the focus-detectable area derived from the positions of the focus detection pixels will be described.
FIG. 9 shows the focus-detectable area when focus detection uses only the focus detection pixels inside the designated crop photography area. FIG. 10 shows the focus-detectable area in the present embodiment, where a setting is made to read out the subset of focus detection pixel signals needed for focus detection outside the crop photography area.
The area 900 in FIGS. 9 and 10 is the effective imaging area of the image sensor. The information on the effective imaging area 900 is updated by the camera-side CPU 166 when the zoom position or the focus position of the imaging optical system changes. If updating this information would burden the imaging system, the narrowest of the effective imaging areas that the photographing lens mounted on the camera body can dynamically take may be used as a representative value.
The first area 901 is the crop photography area, which limits the area from which the image signal is acquired. The effective image area 902 has a smaller image circle diameter than the effective imaging area 900; that is, the figure shows an example in which an imaging optical system (photographing lens) with a narrow shootable area is mounted on the camera body.

The horizontally long rectangle of the cross shape is the detection length region 904, and the vertically long rectangle is the detection length region 905. As explained with reference to FIG. 8, the detection length regions 904 and 905 correspond to the detection lengths required for focus detection, which are obtained from the maximum blur amount of the photographing lens mounted on the camera body. Also shown are the focus detection center point 903 for the subject and its allowable range (see the rectangular frame 906).

As a premise, it is assumed that cross focus detection is possible with the detection length regions 904 and 905, and that the focus detection pixel groups are arranged throughout the crop photography area 901. The area in which focus detection remains possible even in the maximum blur state is limited so that the vertical and horizontal detection length regions 904 and 905 fit inside the crop photography area 901. As a result, the area indicated by the rectangular frame 906 in FIG. 9 (the first focus detection area) is the limit of where the focus detection center point can be placed.
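The restriction of the focus detection center point to frame 906 can be sketched as insetting the crop rectangle by half of each detection length; the `Rect` type and the half-length margins are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def center_point_region(crop: Rect, det_len_h: float, det_len_v: float) -> Rect:
    """Shrink the crop area so that a horizontal detection length region
    (det_len_h wide) and a vertical one (det_len_v tall), both centered
    on the focus detection center point, still fit inside the crop.
    This corresponds to the rectangular frame 906 in FIG. 9.
    """
    return Rect(
        crop.left + det_len_h / 2,
        crop.top + det_len_v / 2,
        crop.right - det_len_h / 2,
        crop.bottom - det_len_v / 2,
    )
```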

FIG. 10 shows a second area 1009 in which the focus detection signal can be captured over the required detection length outside the crop photography area 901. Some of the focus detection pixels can output a focus detection signal over the detection length required for focus detection within the second area, where the effective image area 902 and the effective imaging area 900 overlap.
For example, the focus detection center point 1000 lies at the intersection of the detection length regions 1001 and 1002. When focus detection is performed on the subject corresponding to this point and the signals of focus detection pixels outside the crop photography area 901 are read, the maximum blur information described above is used. This information is obtained from the focal length, the aperture diameter, and the focusable range of the imaging optical system.
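How the maximum blur follows from the focal length, aperture diameter, and focusable range is not spelled out in the text; a standard geometric-optics approximation (blur-circle diameter c ≈ defocus / N, since the aperture diameter is A = f/N) would look like this. The relation is a textbook thin-lens approximation, not taken from the patent.

```python
def blur_diameter_px(defocus_mm, f_number, pixel_pitch_mm):
    """Geometric blur-circle diameter on the sensor, in pixels, for an
    image-side defocus of defocus_mm, using c = |defocus| / N.
    """
    return abs(defocus_mm) / f_number / pixel_pitch_mm

def max_blur_px(max_defocus_mm, f_number, pixel_pitch_mm):
    """Maximum blur over the focusable range: evaluate the blur at the
    largest image-side defocus the focusing mechanism allows."""
    return blur_diameter_px(max_defocus_mm, f_number, pixel_pitch_mm)
```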

As for the detection length that allows focus detection in the maximum blur state, focus detection of the vertical component requires the detection length region 1001, and focus detection of the horizontal component requires the detection length region 1002. Similarly, for the subject corresponding to the focus detection center point 1003, in the maximum blur state the focus detection pixel signals are acquired over the required detection lengths represented by the detection length regions 1004 and 1005.

In this way, even outside the crop photography area 901, for the focus detection pixels that lie inside both the effective imaging area 900 and the effective image area 902, only the very limited set of focus detection pixel signals required for focus detection is acquired. These signals are read in addition to the image signal from the imaging pixels inside the crop photography area 901. Because the additional read-out time for the focus detection pixels, on top of the image signal for recording, is very small, the impact on processing time is minor.
The focus-detectable area shown in FIG. 10 is thus expanded from the area of the rectangular frame 906 to the area 1006. That is, when only the focus detection signals inside the crop photography area 901 are captured, the area indicated by the rectangular frame 906 (the first focus detection area) is used. In the present embodiment, by contrast, the area can be expanded to the area 1006 (the second focus detection area) so as to include focus detection center points such as 1000 and 1003. The area 1006 itself lies within the first area (the crop photography area 901), but the detection length regions used for focus detection extend into the second area 1009.

In high-speed continuous shooting, where only a slight increase in signal read-out time is tolerable, the signals of the relevant focus detection pixels outside the crop photography area 901 are read first. The camera-side CPU 166 performs the processing that sets the focus-detectable area. The information used here is the area information of the focus detection pixels, the information on the effective image area of the imaging optical system, and the information on the crop photography area, from which it is judged whether the signals of focus detection pixels outside the crop photography area are to be used. If the judgment is to use them, the focus-detectable area is enlarged, as described in the comparison of FIGS. 9 and 10. If, on the other hand, the focus detection center point lies inside the area indicated by the rectangular frame 906 in FIG. 9, there is no need to capture the signals of focus detection pixels outside the crop photography area 901. For this purpose, information describing the area of the rectangular frame 906 is stored in memory. Referring to this information, the camera-side CPU 166 can efficiently determine, according to the selected focus detection point (focus detection position information), whether the signals of focus detection pixels outside the crop photography area 901 need to be read.
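The efficiency check described here, comparing the selected focus detection point against the stored frame-906 region, amounts to a point-in-rectangle test; the argument types below are assumptions for illustration.

```python
def needs_outside_crop_read(focus_point, inner_region):
    """Decide whether focus detection pixel signals outside the crop
    photography area must be read for the selected focus detection point.

    inner_region is the stored rectangle corresponding to frame 906
    (the first focus detection area) as (left, top, right, bottom);
    focus_point is an (x, y) pair. A sketch of the CPU-side check.
    """
    x, y = focus_point
    left, top, right, bottom = inner_region
    inside = left <= x <= right and top <= y <= bottom
    return not inside  # outside frame 906 -> read beyond the crop area
```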

FIG. 11 is a flowchart explaining the focus detection processing and focusing control operation during shooting in the present embodiment. It is assumed that the aperture of the imaging optical system is operated so as to obtain proper exposure during live view, and that the AF-capable area is changed when a change in the aperture value is detected. The following processing is realized by the camera-side CPU 166 reading the control program from memory and executing it.
When the image pickup apparatus is powered on, the area information of the focus detection pixels is set in S100 and passed to the focus detection area setting unit 163 in FIG. 1. In the next step S101, the mounting state of the photographing lens on the camera body is detected. In S102, the camera-side CPU 166 acquires the effective image area information from the photographing lens, and this information is passed to the imaging area setting unit 162 in FIG. 1.

In the next step S103, the camera-side CPU 166 acquires information on the characteristics of the photographing lens (imaging optical system). This is a group of items related to the detection length required for focus detection, such as the maximum blur amount of the imaging optical system mounted on the camera body and the blur amount for each aperture value. The maximum blur amount is the blur amount when a subject at the closest distance is photographed with the focus lens at the infinity position, or conversely, when a subject at infinity is photographed with the focus lens at the closest position. In S104, a specific AF method is selected from the plurality of focus detection methods (AF methods) available in the imaging apparatus. In selecting the AF method, the camera-side CPU 166 determines, for example, whether the camera is in live view mode, in which light from the subject falls directly on the image sensor. For this purpose, it is detected whether the movable reflecting member 155 shown in FIG. 1 has been flipped up and retracted from the photographing optical path.

S105 is a determination step that branches the subsequent processing according to the AF method selected in S104: it determines whether the method is the imaging-plane phase difference detection method or another method. Other AF methods include TTL phase difference detection using the focus detection signal output by the focus detection unit 157, methods using a focus detection unit that does not go through the imaging optical system, a contrast method based on the subject, and so on. If an AF method other than the imaging-plane phase difference detection method is selected in S105, the flow proceeds to S106, where focus detection processing by the other AF method is executed.

After S106, the camera-side CPU 166 calculates the defocus amount in S114 and judges whether focus has been achieved. If it is judged to be in focus within the set conditions, the series of processing ends. In practice, the in-focus judgment in S114 may differ depending on the focus detection method; in S114, the camera-side CPU 166 is assumed to make an overall in-focus judgment across the focus detection means of the imaging apparatus.
If it is judged in S114 that focus has not been achieved within the set conditions, the flow moves to S115, where the focus drive amount determination unit 167 determines the drive direction and drive amount of the focus lens from the calculated defocus amount. The flow then proceeds to S116, where the focus lens is driven as required for the focusing operation, and returns to S105, repeating the focus detection processing in response to changes in the subject.
If, on the other hand, the AF method is judged in S105 to be the imaging-plane phase difference detection method, the flow moves to S107, where a focus detection point is selected to choose the location on the subject to be brought into focus.

Next, in S108, the aperture data of the imaging optical system is acquired. In S109, using the aperture value information (F-number information) obtained in S108, the area setting unit 163 calculates the detection length required for focus detection, as explained with reference to FIG. 8. S110 determines whether the area having the detection length calculated in S109 (see FIGS. 9 and 10) extends beyond the crop photography area. This determination is made by the area setting unit 163; if it is judged that focus detection pixel signals from beyond the crop photography area must be read, the flow proceeds to S111, and if the focus detection pixel signals are to be read from within the crop photography area, it moves to S112.
In S111, in accordance with the read-out determination, the read-out of the focus detection signals required outside the crop photography area is set up.

In S112, the signal extraction unit 164 acquires the focus detection signals for the focus detection pixels set in steps S109 to S111. In S113, the focus detection calculation is performed using the focus detection signals acquired in S112, and the defocus amount is calculated. The flow then moves to S114, where, as described above, the in-focus judgment is made based on the defocus amount.
In the description of FIG. 11, the required detection length is calculated according to the currently set aperture value data, and the area setting unit 163 enlarges the focus detection area as much as possible. If the calculation is to be simplified, however, fixed detection length information, for example the detection length at full aperture, may be used instead.

As described above, in an imaging apparatus that has imaging-plane phase difference focus detection means and uses an image sensor capable of partial pixel signal read-out, the present embodiment enlarges the focus-detectable area during crop photography without sacrificing continuous shooting performance. The present embodiment therefore achieves both high-speed crop photography and a wide focus detection area.

DESCRIPTION OF SYMBOLS
100: Lens unit
101: Imaging optical system lens group
150: Camera body
158: Image sensor
163: Focus detection area setting unit
164: Focus detection signal extraction unit
166: Camera-side CPU

Claims (8)

An imaging apparatus comprising an image sensor having focus detection pixels that output focus detection signals by phase difference detection, the apparatus acquiring signals from imaging pixels in a first area during crop photography and outputting a captured image, the apparatus comprising:
control means for acquiring signals from the focus detection pixels in the first area and from the focus detection pixels in a second area that includes the first area, and performing a focus detection calculation; and
focus adjustment means for adjusting the focus of an imaging optical system in accordance with the result of the focus detection calculation performed by the control means.
The imaging apparatus according to claim 1, wherein the control means comprises:
area setting means for acquiring information on an effective image area of the imaging optical system, area information of the focus detection pixels, and information on the first area, calculating the detection length required for the focus detection calculation, and setting a focus detection area by defining the position information of the focus detection pixels used for focus detection; and
signal extraction means for acquiring the position information of the focus detection pixels from the area setting means and extracting, from the output signal of the image sensor, the signals of the focus detection pixels in the focus detection area set by the area setting means.
3. The imaging apparatus according to claim 2, wherein the area setting means sets the focus detection area by judging whether or not to use the signals of the focus detection pixels located outside the first area.
4. The imaging apparatus according to claim 2 or 3, wherein the control means updates the information on the effective image area of the imaging optical system when the zoom position or the focus position of the imaging optical system changes.
5. The imaging apparatus according to claim 2 or 3, wherein, when reading the signals of the focus detection pixels located outside the first area, the area setting means sets the focus detection area using information on the focal length, the aperture diameter, and the focusable range of the imaging optical system.
6. The imaging apparatus according to any one of claims 1 to 5, wherein the control means performs the focus detection calculation after changing from a first focus detection area, used when focus detection is performed by acquiring the signals of the focus detection pixels in the first area, to a second focus detection area, used when the signals of the focus detection pixels in the second area are additionally acquired for focus detection.
7. The imaging apparatus according to claim 6, wherein the first focus detection area and the second focus detection area, which is wider than the first focus detection area, are areas in which the focus detection center point for cross focus detection is located, and lie within the first area.
A control method executed by an imaging apparatus comprising an image sensor having focus detection pixels that output focus detection signals by phase difference detection, the apparatus acquiring signals from imaging pixels in a first area during crop photography and outputting a captured image, the method comprising:
a step of acquiring signals from the focus detection pixels in the first area and from the focus detection pixels in a second area that includes the first area, and performing a focus detection calculation by control means; and
a step of adjusting, by focus adjustment means, the focus of an imaging optical system in accordance with the result of the focus detection calculation performed by the control means.
JP2013174507A 2013-08-26 2013-08-26 Imaging apparatus, control method therefor, and defocus amount calculation method Active JP6271911B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013174507A JP6271911B2 (en) 2013-08-26 2013-08-26 Imaging apparatus, control method therefor, and defocus amount calculation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013174507A JP6271911B2 (en) 2013-08-26 2013-08-26 Imaging apparatus, control method therefor, and defocus amount calculation method

Publications (2)

Publication Number Publication Date
JP2015043026A true JP2015043026A (en) 2015-03-05
JP6271911B2 JP6271911B2 (en) 2018-01-31

Family

ID=52696580

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013174507A Active JP6271911B2 (en) 2013-08-26 2013-08-26 Imaging apparatus, control method therefor, and defocus amount calculation method

Country Status (1)

Country Link
JP (1) JP6271911B2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018004663A (en) * 2016-06-27 2018-01-11 株式会社シグマ Imaging apparatus
WO2019031000A1 (en) * 2017-08-09 2019-02-14 ソニー株式会社 Signal processing device, image capturing device, signal processing method, and program
CN110024307A (en) * 2016-11-29 2019-07-16 昕诺飞控股有限公司 Visible light communication detection and/or decoding
CN111885308A (en) * 2015-09-24 2020-11-03 高通股份有限公司 Phase detection autofocus noise reduction

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11194261A (en) * 1997-12-26 1999-07-21 Canon Inc Camera system and exchange lens device
JPH11281884A (en) * 1998-03-30 1999-10-15 Minolta Co Ltd Focus position detecting device
JP2004212892A (en) * 2003-01-08 2004-07-29 Canon Inc Photographing device
JP2006133476A (en) * 2004-11-05 2006-05-25 Canon Inc Camera system
JP2009092824A (en) * 2007-10-05 2009-04-30 Sony Corp Imaging device
JP2010028397A (en) * 2008-07-17 2010-02-04 Canon Inc Imaging device, and its controlling method and program
JP2011166378A (en) * 2010-02-08 2011-08-25 Canon Inc Imaging device and control method of the same
JP2012133151A (en) * 2010-12-22 2012-07-12 Nikon Corp Imaging device
JP2012211945A (en) * 2011-03-30 2012-11-01 Canon Inc Imaging apparatus

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111885308A (en) * 2015-09-24 2020-11-03 高通股份有限公司 Phase detection autofocus noise reduction
JP2018004663A (en) * 2016-06-27 2018-01-11 株式会社シグマ Imaging apparatus
CN110024307A (en) * 2016-11-29 2019-07-16 昕诺飞控股有限公司 Visible light communication detection and/or decoding
CN110024307B (en) * 2016-11-29 2023-06-27 昕诺飞控股有限公司 Visible light communication detection and/or decoding
WO2019031000A1 (en) * 2017-08-09 2019-02-14 ソニー株式会社 Signal processing device, image capturing device, signal processing method, and program
CN110945399A (en) * 2017-08-09 2020-03-31 索尼公司 Signal processing device, imaging device, signal processing method, and program
JPWO2019031000A1 (en) * 2017-08-09 2020-07-09 ソニー株式会社 Signal processing device, imaging device, signal processing method, and program
US11394866B2 (en) 2017-08-09 2022-07-19 Sony Group Corporation Signal processing device, imaging device, signal processing method and program
CN110945399B (en) * 2017-08-09 2022-09-20 索尼公司 Signal processing apparatus, imaging apparatus, signal processing method, and memory
JP7230807B2 (en) 2017-08-09 2023-03-01 ソニーグループ株式会社 SIGNAL PROCESSING DEVICE, IMAGING DEVICE, SIGNAL PROCESSING METHOD AND PROGRAM

Also Published As

Publication number Publication date
JP6271911B2 (en) 2018-01-31

Similar Documents

Publication Publication Date Title
JP5388544B2 (en) Imaging apparatus and focus control method thereof
JP6584149B2 (en) Imaging device
US10264173B2 (en) Image capturing apparatus and control method thereof, and storage medium
JP2009175528A (en) Focus-adjusting apparatus and imaging apparatus
JP2014153509A (en) Imaging device and imaging method
JP2012234152A (en) Imaging apparatus and control method thereof
JP4995002B2 (en) Imaging device, focusing device, imaging method, and focusing method
JP5963552B2 (en) Imaging device
JP6271911B2 (en) Imaging apparatus, control method therefor, and defocus amount calculation method
JP5402298B2 (en) Focus detection device and camera
JP6952222B2 (en) Imaging device
JP2019041178A (en) Image sensor and imaging apparatus using the same
JP2010128205A (en) Imaging apparatus
JP2013113857A (en) Imaging device, and control method therefor
JP2017118212A (en) Imaging apparatus
JP5796388B2 (en) Focus detection apparatus and imaging apparatus
JP5240591B2 (en) Imaging device, focusing device, imaging method, and focusing method
JP2014142497A (en) Imaging apparatus and method for controlling the same
JP5998820B2 (en) Imaging device, focus state display method, focus state display program
JP5973784B2 (en) Imaging apparatus, control method therefor, program, and storage medium
JP6530610B2 (en) Focusing device, imaging device, control method of focusing device, and program
JP6341668B2 (en) Imaging device
JP2014211589A (en) Focus adjustment device and imaging device
JP7005209B2 (en) Image pickup device and its control method
JP2009031562A (en) Light receiving element, light receiver, focus detecting device, camera

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20160810

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20170426

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20170509

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20170619

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20171205

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20171228

R151 Written notification of patent or utility model registration

Ref document number: 6271911

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151