JP2007121896A - Focus detector and optical system - Google Patents

Focus detector and optical system

Info

Publication number
JP2007121896A
Authority
JP
Japan
Prior art keywords
focus detection
optical system
information
imaging optical
pair
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2005316892A
Other languages
Japanese (ja)
Other versions
JP4984491B2 (en)
Inventor
Yosuke Kusaka
洋介 日下
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp
Priority to JP2005316892A
Publication of JP2007121896A
Application granted
Publication of JP4984491B2
Legal status: Expired - Fee Related
Anticipated expiration

Landscapes

  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a focus detection device and an optical system capable of improving the focus detection accuracy of an imaging optical system.
SOLUTION: The focus detection device detects the focus adjustment state on the planned imaging plane of the imaging optical system based on the shift between a pair of images formed by a pair of light beams that pass through different positions of the pupil of the imaging optical system. It calculates a conversion coefficient for converting the shift between the pair of images into a defocus amount of the imaging optical system based on information indicating the distribution of the light beams and on aperture information of the imaging optical system.
COPYRIGHT: (C)2007, JPO&INPIT

Description

The present invention relates to a focus detection device that detects the focus adjustment state of an imaging optical system, and to an optical system equipped with such a focus detection device.

A so-called pupil-division focus detection method is known as a TTL focus detection method that detects the focus adjustment state of an imaging optical system using light beams that have passed through the imaging optical system. In this pupil-division method, the relative positional shift between a pair of images formed on the planned imaging plane by light beams that have passed through different regions of the exit pupil of the imaging optical system is detected, and the image shift amount is multiplied by a predetermined conversion coefficient to obtain a defocus amount along the optical axis. For this reason it is also called a phase-difference detection method or an image-shift detection method.
A focus detection device is known that, when converting the image shift amount into a defocus amount, uses a conversion coefficient corresponding to the brightness (F-number) of the imaging optical system (see, for example, Patent Document 1). According to this device, even when vignetting of the focus detection light beams occurs because of the brightness of the imaging optical system, high-precision focus detection is possible by using an appropriate conversion coefficient.

Prior art documents related to the invention of this application include the following:
Japanese Examined Patent Publication No. 07-062732 (Patent Document 1)

However, when the focus detection position lies in the periphery of the planned imaging plane, away from the optical axis of the imaging optical system, vignetting of the focus detection light beams is also caused by lens openings other than the aperture stop, so an accurate conversion coefficient cannot be obtained from brightness information alone.

A focus detection device detects the focus adjustment state on the planned imaging plane of an imaging optical system based on the shift amount between a pair of images formed by a pair of light beams that have passed through different positions of the pupil of the imaging optical system. The device calculates a conversion coefficient for converting the shift amount between the pair of images into a defocus amount of the imaging optical system based on information indicating the distribution of the light beams and on aperture information of the imaging optical system.

According to the present invention, in a pupil-division focus detection method, an accurate conversion coefficient for converting the shift amount between a pair of images into a defocus amount of the imaging optical system can be obtained, and the focus detection accuracy of the imaging optical system can be improved.

A so-called pupil-division focus detection method is known as a TTL focus detection method that detects the focus adjustment state of an imaging optical system using light beams that have passed through the imaging optical system. In this method, the relative positional shift between a pair of images formed on the planned imaging plane by light beams that have passed through different regions of the exit pupil of the imaging optical system is detected, and the positional shift amount is multiplied by a predetermined conversion coefficient to obtain a defocus amount along the optical axis. For this reason it is also called a phase-difference detection method or an image-shift detection method.

In the pupil-division method, the defocus amount (the direction and magnitude of the displacement of the current imaging plane from the planned imaging plane along the optical axis) is calculated accurately as the focus detection result. By displacing the imaging optical system according to this defocus amount, the imaging optical system can be brought into focus more quickly than with other focus detection methods such as contrast detection.

Pupil-division focus detection methods include a re-imaging method and a microlens method. First, the re-imaging method is described. The re-imaging method uses a condenser lens placed near the planned imaging plane of the imaging optical system, an image sensor, a pair of re-imaging lenses, and an aperture mask with a pair of openings placed near the re-imaging lenses. The image formed on the planned focal plane is re-imaged by the re-imaging lenses as a pair of images on the image sensor, and the relative positional shift between this pair of images is detected. In this method, the shapes of the pair of aperture-mask openings are projected by the condenser lens onto the vicinity of the exit pupil plane of the imaging optical system, and the pair of images is re-imaged by the pair of light beams that pass through the projected pair of aperture-opening regions.

FIG. 1 shows the configuration of a re-imaging focus detection device. A distance measuring unit 18 consists of a condenser lens 10 placed near the planned imaging plane of the imaging optical system, an image sensor 16 placed behind it, a pair of re-imaging lenses 14 and 15 placed between the condenser lens 10 and the image sensor 16 that re-image onto the image sensor 16 the primary image formed near the planned imaging plane, and an aperture mask 11 with a pair of aperture openings 12 and 13 placed near the re-imaging lenses (in front of them in the figure).

Information corresponding to the intensity distributions of the pair of images re-imaged on the image sensor 16 is output from the image sensor 16, and well-known image-shift detection processing (correlation processing, phase-difference detection processing) is applied to this information to detect the image shift amount between the pair of images. The deviation of the current imaging plane from the planned imaging plane (the defocus amount) is then calculated by multiplying the image shift amount by a predetermined conversion coefficient.
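As an illustration of this processing chain, the sketch below (not part of the patent; the sum-of-absolute-differences correlation, the function names, and the numeric values are assumptions) estimates the shift between two one-dimensional image signals over integer pixel offsets and converts it to a defocus amount with a given conversion coefficient.

```python
import numpy as np

def image_shift(a, b, max_shift=20):
    """Estimate the relative shift (in pixels) between two 1-D image signals
    by minimizing the mean absolute difference over integer shifts."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    n = len(a)
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        err = np.abs(a[lo:hi] - b[lo - s:hi - s]).mean()
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def defocus_from_shift(shift_px, pixel_pitch, k):
    """Convert an image shift (in pixels) into a defocus amount using a
    conversion coefficient k assumed known for the focus detection position."""
    return shift_px * pixel_pitch * k

# illustrative signals: b is a copy of a shifted by three pixels
a = np.sin(np.linspace(0.0, 6.0, 128))
b = np.roll(a, 3)
print(defocus_from_shift(image_shift(a, b), pixel_pitch=0.005, k=2.8))
```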

The condenser lens 10 is placed on the optical axis 4 and projects the aperture openings 12 and 13 of the aperture mask 11 onto the exit pupil 1 as regions 2 and 3 in the direction of the optical axis 4. These regions 2 and 3 are called distance measuring pupils. That is, the pair of images re-imaged on the image sensor 16 is formed by light beams that pass through the pair of distance measuring pupils 2 and 3 on the exit pupil 1. The light beams 32 and 33 that pass through the pair of distance measuring pupils 2 and 3 on the exit pupil 1 are called focus detection light beams.

A distance measuring unit 28 consists of a condenser lens 20 placed near the planned imaging plane of the imaging optical system, an image sensor 26 placed behind it, a pair of re-imaging lenses 24 and 25 placed between the condenser lens 20 and the image sensor 26 that re-image onto the image sensor 26 the primary image formed near the planned imaging plane, and an aperture mask 21 with a pair of aperture openings 22 and 23 placed near the re-imaging lenses (in front of them in the figure).

Information corresponding to the intensity distributions of the pair of images re-imaged on the image sensor 26 is output from the image sensor 26, and well-known image-shift detection processing (correlation processing, phase-difference detection processing) is applied to this information to detect the image shift amount between the pair of images. The deviation of the current imaging plane from the planned imaging plane (the defocus amount) is calculated by multiplying this image shift amount by a predetermined conversion coefficient.

The condenser lens 20 is placed at a position away from the optical axis 4 and projects the aperture openings 22 and 23 of the aperture mask 21 onto the exit pupil 1 as distance measuring pupils 2 and 3 in the direction of a projection axis 27 toward the exit pupil 1. That is, the pair of images re-imaged on the image sensor 26 is formed by light beams that pass through the pair of distance measuring pupils 2 and 3 on the exit pupil 1. The light beams 42 and 43 that pass through the pair of distance measuring pupils 2 and 3 on the exit pupil 1 are called focus detection light beams.

Next, microlens-type pupil-division focus detection is described. In the microlens method, a microlens array is placed near the planned imaging plane, and a pair of light receiving elements of an image sensor placed behind each microlens receives the light beams that pass through that microlens. The shift amount between the output signal sequence of one element of each pair and the output signal sequence of the other element is detected.

In this method, each microlens projects the aperture shapes of the pair of light receiving elements located behind it onto the vicinity of the exit pupil plane of the imaging optical system, and the light amounts of the pair of images formed on the microlens array by the pair of light beams that pass through the projected pair of aperture regions are detected by the pair of light receiving elements.

FIG. 2 shows the configuration of a microlens-type focus detection device. In FIG. 2, elements identical to those in FIG. 1 carry the same reference numerals, and the description focuses on the differences. Microlenses 50 and 60 are placed near the planned imaging plane of the imaging optical system. The microlens 50, placed on the optical axis 4, projects the shapes of the light receiving portions of a pair of light receiving elements 52 and 53 behind it onto the distance measuring pupils 2 and 3 of the exit pupil 1 in the direction of the optical axis 4. The microlens 60, placed away from the optical axis 4, projects the shapes of the light receiving portions of a pair of light receiving elements 62 and 63 behind it onto the distance measuring pupils 2 and 3 of the exit pupil 1 in the direction of a projection axis 67.

The light receiving element 52 outputs information corresponding to the intensity of the image formed on the microlens 50 by the focus detection light beam 72 that has passed through the distance measuring pupil 2, and the light receiving element 53 outputs information corresponding to the intensity of the image formed on the microlens 50 by the focus detection light beam 73 that has passed through the distance measuring pupil 3. Likewise, the light receiving element 62 outputs information corresponding to the intensity of the image formed on the microlens 60 by the focus detection light beam 82 that has passed through the distance measuring pupil 2, and the light receiving element 63 outputs information corresponding to the intensity of the image formed on the microlens 60 by the focus detection light beam 83 that has passed through the distance measuring pupil 3.

By arranging a large number of such microlenses in an array and collecting the outputs of the pairs of light receiving elements behind them, information on the intensity distributions of the pair of images that the focus detection light beams passing through the distance measuring pupil 2 and the distance measuring pupil 3 form on the microlens array is obtained. Applying well-known image-shift detection processing (correlation processing, phase-difference detection processing) to this information yields the image shift amount between the pair of images, and multiplying this image shift amount by a predetermined conversion coefficient yields the deviation of the current imaging plane from the planned imaging plane of the imaging optical system (the defocus amount). In the microlens method, as the microlens becomes smaller its aperture diameter also becomes smaller, and diffraction blurs the projected image of the light receiving elements on the exit pupil plane.
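A minimal sketch of how the photodiode pairs behind a row of microlenses could be collected into the two image signals used for image-shift detection; the (N, 2) array layout and the names are illustrative assumptions, not the sensor's actual readout format.

```python
import numpy as np

def build_image_pair(pixel_pairs):
    """Collect the outputs of the photodiode pairs behind a row of microlenses
    into two 1-D image signals, one per distance measuring pupil.
    pixel_pairs is an (N, 2) array: column 0 = element behind pupil 2,
    column 1 = element behind pupil 3 (illustrative layout)."""
    pixel_pairs = np.asarray(pixel_pairs, dtype=float)
    signal_a = pixel_pairs[:, 0]  # image formed by beams through pupil 2
    signal_b = pixel_pairs[:, 1]  # image formed by beams through pupil 3
    return signal_a, signal_b

# illustrative readout of eight microlenses
print(build_image_pair([[10, 9], [12, 11], [15, 16], [14, 15],
                        [11, 12], [9, 10], [8, 8], [7, 7]]))
```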

FIG. 3 illustrates the diffraction phenomenon in the microlens method, showing an example of the projected image of a light receiving element (one side) on the exit pupil in the ideal case without diffraction. Reference numeral 930 denotes the circle corresponding to an aperture diameter of F2.8 on an exit pupil set 100 mm in front of the microlens, and 940 denotes the rectangular region (X = 2 to 12 mm, Y = -10 to 10 mm) that should be projected roughly inscribed in the F2.8 circle 930.

FIG. 4 shows the diffraction image intensity distribution pattern corresponding to FIG. 3. It was obtained by simulating, at a wavelength of 500 nm, the diffraction image intensity distribution on the exit pupil of the rectangular region 940 to be projected, with the microlens aperture taken as a circle of radius 2 µm.

FIG. 5 is a cross section of the diffraction pattern of FIG. 4 at Y = 0. As is clear from FIGS. 4 and 5, in the diffraction image the edges of the projected image are smeared and a considerable part of the projected image spills outside the F2.8 circle (only about 81% lies inside the F2.8 circle). In practice, when the radius of the microlens aperture circle falls to several tens of micrometers or less, the focus detection region on the exit pupil begins to lose its shape because of diffraction. Because diffraction deforms the distance measuring pupil and makes it spill outside the intended region, the light in the spilled portion can be vignetted by the aperture stop of the imaging optical system or by lens edges other than the stop. Moreover, as FIG. 5 also shows, the distribution of the focus detection light beam within the distance measuring pupil is not uniform.

FIG. 6 illustrates spherical aberration in the microlens method. When the distance measuring pupil is formed on the exit pupil 1 by the paraxial rays of the microlens 50, a paraxial ray 90 emitted from, for example, the center point 95 of the pair of light receiving elements converges on the intersection 92 of the exit pupil 1 and the optical axis 4, whereas rays passing through the periphery of the microlens 50 reach points 93 and 94 on the exit pupil 1 that are away from the point 92. This phenomenon, called spherical aberration, is unavoidable when manufacturing a minute microlens 50. Spherical aberration smears the edges of the projected image so that a considerable part of it spills outside the ideal distance measuring pupil, producing a result similar to diffraction. Aberrations other than spherical aberration (chromatic aberration, coma, astigmatism, distortion, field curvature) likewise disturb the edges of the projected image, and the distribution of the focus detection light beam becomes non-uniform.

FIG. 7 explains the vignetting of the focus detection light beams. Reference numeral 5 denotes the planned imaging plane; 6 and 7 the focus detection light beams when vignetting occurs; 56 and 57 the focus detection light beams without vignetting; 8 and 9 the centroid positions of the unvignetted focus detection light beams 56 and 57; 78 and 79 the centroid positions of the focus detection light beams 6 and 7; 45 the intersection of the planned imaging plane 5 and the optical axis 4; 36 the opening angle subtended by the centroid positions 8 and 9 as seen from the point 45; 35 the opening angle subtended by the centroid positions 78 and 79 as seen from the point 45; 46 the distance from the planned imaging plane 5 to the exit pupil plane 1; 40 the exit pupil of the aperture stop of the imaging optical system coinciding with the exit pupil plane 1; and 41 and 44 exit pupils of the stop of the imaging optical system located at positions other than the exit pupil plane 1.

FIG. 7 shows the vignetting of the focus detection light beams when the focus detection position is on or near the optical axis. The distance measuring pupils are formed on the exit pupil plane 1 set at the distance 46 from the planned imaging plane 5, and when the aperture stop of the imaging optical system is sufficiently large, no vignetting occurs in the focus detection light beams 56 and 57. If the opening angle 36 subtended at the point 45 by the centroid positions 8 and 9 of the focus detection light beams 56 and 57 is Θa, the image shift amount is H, the conversion coefficient is ka, and the defocus amount is D, then D is given by
D = H × ka = H / (2 × Tan(Θa/2))   (1)
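A small worked example of equation (1), assuming the opening angle is given in radians; the numeric values are illustrative only.

```python
import math

def conversion_coefficient(theta):
    """Conversion coefficient k = 1 / (2 * tan(theta / 2)) for a centroid
    opening angle theta in radians, following equation (1)."""
    return 1.0 / (2.0 * math.tan(theta / 2.0))

def defocus(image_shift, theta):
    """Defocus amount D = H * k for image shift H and opening angle theta."""
    return image_shift * conversion_coefficient(theta)

# illustrative numbers only: 0.05 mm image shift, 10-degree opening angle
print(defocus(0.05, math.radians(10.0)))
```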

When the exit pupil 40 of the imaging optical system is located at a position coinciding with the exit pupil plane 1 and its brightness decreases, part of the focus detection light beams 56 and 57 is blocked, producing the phenomenon known as vignetting, and the light beams actually usable for focus detection become the focus detection light beams 6 and 7. If the opening angle 35 subtended at the point 45 by the centroid positions 78 and 79 of the vignetted focus detection light beams 6 and 7 is Θb, the image shift amount is H, the conversion coefficient is kb, and the defocus amount is D, then D is given by
D = H × kb = H / (2 × Tan(Θb/2))   (2)
When the focus detection position is on or near the optical axis, the vignetting of the focus detection light beams does not change greatly even if the exit pupils 41 and 44 of the stop of the imaging optical system are located at positions other than the exit pupil plane 1, provided the brightness is the same.

FIG. 8 explains how the centroid of a focus detection light beam is obtained. In FIG. 8, the exit pupil 40 of the imaging optical system blocks part of the focus detection light beam 57, and the light beam actually usable for focus detection is the focus detection light beam 7 inside the exit pupil 40. The centroid position 9 of the focus detection light beam 57 is the distribution centroid of the light beam 57 (corresponding to the distance measuring pupil on the exit pupil plane 1). Here the centroid means the centroid along the direction in which the distance measuring pupils are arranged (the x direction in the figure). The centroid position 79 of the focus detection light beam 7 vignetted by the exit pupil 40 of the imaging optical system is the distribution centroid of the light beam 7 (corresponding to the vignetted distance measuring pupil on the exit pupil plane 1).

The distribution of the focus detection light beam 57 without vignetting is obtained in advance by measurement or calculation as distance measuring pupil information and stored, for example, as two-dimensional information. Information on the exit pupil of the imaging optical system (brightness information, that is, the diameter and position of the exit pupil) is likewise obtained in advance by measurement or calculation as aperture information and stored, for example, as two-dimensional information. The two-dimensional distance measuring pupil information and the aperture information are superimposed, and the centroid position of the focus detection light beam that lies inside the exit pupil of the imaging optical system is obtained by calculation. Once the centroid positions 78 and 79 of the pair of focus detection light beams 6 and 7 are obtained, the opening angle 35 at the point 45 can be obtained using the distance 46 between the planned imaging plane 5 and the exit pupil plane 1.

For example, if the distance between the centroid positions is S, the distance 46 is M, and the opening angle is Θb, then the opening angle is given by
Θb = 2 × ArcTan(S / (2 × M))   (3)
When the focus detection position is on or near the optical axis, the vignetting of the focus detection light beams does not change greatly even if the exit pupils 41 and 44 of the stop of the imaging optical system are located at positions other than the exit pupil plane 1, provided the brightness is the same, so only the brightness of the exit pupil corresponding to the optical stop of the imaging optical system needs to be considered for the vignetting of the focus detection light beams 56 and 57. By superimposing the two-dimensional distance measuring pupil information and the aperture information, the total amount of the focus detection light beam that lies inside the exit pupil of the imaging optical system (corresponding to the light quantity) is also obtained by calculation.
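The following sketch illustrates, under stated assumptions, the superposition described above: the stored distance measuring pupil distributions are represented on a sampled grid (here as smooth illustrative distributions), the aperture information as a circular mask, and the centroid opening angle, conversion coefficient (equations (2) and (3)), and total light quantity are computed from the masked distributions. The grid, the example distributions, and the numbers are not from the patent.

```python
import numpy as np

def vignetted_stats(pupil_a, pupil_b, aperture_mask, x, pupil_distance):
    """Superimpose stored distance measuring pupil distributions with an
    aperture mask on the exit pupil plane, then return the centroid opening
    angle, the conversion coefficient k = 1/(2*tan(theta/2)), and the total
    light quantity of each vignetted beam. All arrays share one (y, x) grid."""
    a = pupil_a * aperture_mask          # beam actually usable through pupil A
    b = pupil_b * aperture_mask          # beam actually usable through pupil B
    # centroids along the pupil separation direction (x)
    cx_a = (a.sum(axis=0) * x).sum() / a.sum()
    cx_b = (b.sum(axis=0) * x).sum() / b.sum()
    separation = abs(cx_a - cx_b)                                  # S
    theta = 2.0 * np.arctan(separation / (2.0 * pupil_distance))   # eq. (3)
    k = 1.0 / (2.0 * np.tan(theta / 2.0))                          # eq. (2)
    return theta, k, a.sum(), b.sum()

# illustrative grid, smeared pupil distributions, and a circular aperture stop
x = np.linspace(-15.0, 15.0, 301)                    # mm on the exit pupil plane
y = np.linspace(-15.0, 15.0, 301)
X, Y = np.meshgrid(x, y)
pupil_a = np.exp(-((X - 5.0) ** 2 + Y ** 2) / 20.0)  # assumed pupil A distribution
pupil_b = np.exp(-((X + 5.0) ** 2 + Y ** 2) / 20.0)  # assumed pupil B distribution
stop = (X ** 2 + Y ** 2 <= 9.0 ** 2).astype(float)   # aperture info: 9 mm radius
theta, k, qa, qb = vignetted_stats(pupil_a, pupil_b, stop, x, pupil_distance=100.0)
print(theta, k, qa, qb)
```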

FIG. 9 explains the vignetting of the focus detection light beams at an off-axis focus detection position. Reference numeral 47 denotes a point on the planned imaging plane 5 separated from its intersection 45 with the optical axis 4; 36 the opening angle subtended by the centroid positions 8 and 9 as seen from the point 47; 35 the opening angle subtended by the centroid positions 78 and 79 as seen from the point 47; 46 the distance from the planned imaging plane 5 to the exit pupil plane 1; 48 the exit pupil of an opening other than the stop of the imaging optical system located farther than the exit pupil plane 1; 49 the exit pupil of an opening other than the stop of the imaging optical system located nearer than the exit pupil plane 1; and 58 the projection axis of the distance measuring pupils.

FIG. 9 shows the vignetting of the focus detection light beams when the focus detection position is away from the optical axis. The distance measuring pupils are formed on the exit pupil plane 1 set at the distance 46 from the planned imaging plane 5, and when the aperture stop of the imaging optical system is sufficiently large, no vignetting occurs in the focus detection light beams 56 and 57. If the opening angle 36 subtended at the point 47 by the centroid positions 8 and 9 of the focus detection light beams 56 and 57 is Θc, the image shift amount is H, the conversion coefficient is kc, and the defocus amount is D, then D is given by
D = H × kc = H / (2 × Tan(Θc/2))   (4)
When the exit pupil 40 of the imaging optical system is located at a position coinciding with the exit pupil plane 1 and its brightness decreases, part of the focus detection light beams 56 and 57 is blocked, producing the phenomenon known as vignetting.

Furthermore, when the focus detection position is not near the optical axis, vignetting of the focus detection light beams is also caused by the exit pupils 48 and 49 of lens edges other than the stop of the imaging optical system (even when their F-numbers are brighter than that of the exit pupil of the stop), and the light beams actually usable for focus detection become the focus detection light beams 6 and 7. If the opening angle 35 subtended at the point 47 by the centroid positions 78 and 79 of the vignetted focus detection light beams 6 and 7 is Θd, the image shift amount is H, the conversion coefficient is kd, and the defocus amount is D, then D is given by
D = H × kd = H / (2 × Tan(Θd/2))   (5)
In other words, when the focus detection position is not near the optical axis, vignetting of the focus detection light beams is caused by the exit pupil of the stop located at a position other than the exit pupil plane 1 and also by the exit pupils 48 and 49 corresponding to lens edges other than the stop of the imaging optical system, even when their F-numbers are brighter than that of the exit pupil of the stop.

FIG. 10 explains how the centroid of a focus detection light beam is obtained in this case. On the exit pupil plane 1, the focus detection light beam 57 is partially blocked by the exit pupils 48 and 49 as shown in FIG. 9. That is, for the point 47 at the focus detection position, the exit pupil of the imaging optical system has a shape whose left side is defined by the exit pupil 48 and whose right side is defined by the exit pupil 49, as shown in FIG. 10. Calling an exit pupil of this shape a peripheral exit pupil, the light beam actually usable for focus detection is the focus detection light beam 7 inside the peripheral exit pupil. The centroid position 9 of the focus detection light beam 57 is the distribution centroid of the light beam 57 (corresponding to the distance measuring pupil on the exit pupil plane 1). Here the centroid means the centroid along the direction in which the distance measuring pupils are arranged (the x direction in the figure). The centroid position 79 of the focus detection light beam 7 vignetted by the peripheral exit pupil of the imaging optical system is the distribution centroid of the light beam 7 (corresponding to the vignetted distance measuring pupil on the exit pupil plane 1).

The distribution of the focus detection light beam 57 without vignetting is obtained in advance by measurement or calculation as distance measuring pupil information and stored, for example, as two-dimensional information. Information on the exit pupils corresponding to the stop of the imaging optical system and to the lens edges other than the stop (brightness information, that is, the diameters and positions of the exit pupils) is obtained in advance by measurement or calculation as aperture information and stored, for example, as two-dimensional information. The shape of the peripheral exit pupil as seen from the focus detection position toward the exit pupil plane is obtained from the aperture information of the imaging optical system. The two-dimensional distance measuring pupil information and the peripheral exit pupil shape are superimposed, and the centroid position of the focus detection light beam that lies inside the peripheral exit pupil is obtained by calculation.

Once the centroid positions 78 and 79 of the pair of focus detection light beams 6 and 7 are obtained, the opening angle 35 at the point 47 can be obtained using the distance 46 between the planned imaging plane 5 and the exit pupil plane 1. For example, if the distance between the centroid positions is S, the distance 46 is M, and the opening angle is Θd, then the opening angle is given by
Θd = 2 × ArcTan(S / (2 × M))   (6)
By superimposing the two-dimensional distance measuring pupil information and the peripheral exit pupil shape, the total amount of the focus detection light beam that lies inside the peripheral exit pupil (corresponding to the light quantity) is also obtained by calculation.
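A sketch of how the peripheral exit pupil could be built from aperture information when several openings at different distances limit the beam, assuming a simple central projection of each circular opening onto the exit pupil plane as seen from the off-axis focus detection point; the geometry, names, and numbers are illustrative assumptions, and the resulting mask can be fed to a centroid calculation like the one sketched after equation (3).

```python
import numpy as np

def peripheral_pupil_mask(X, Y, apertures, field_point, pupil_distance):
    """Project each circular opening (radius, distance from the planned imaging
    plane) onto the exit pupil plane as seen from an off-axis focus detection
    point, and intersect the projections to form the peripheral exit pupil.
    field_point is (x, y) of the focus detection position on the imaging plane.
    The central projection used here is an assumption, not the patent's exact
    procedure."""
    mask = np.ones_like(X)
    fx, fy = field_point
    for radius, dist in apertures:
        scale = pupil_distance / dist          # magnification onto plane 1
        cx = fx * (1.0 - scale)                # projected center shift
        cy = fy * (1.0 - scale)
        r = radius * scale                     # projected radius
        mask *= ((X - cx) ** 2 + (Y - cy) ** 2 <= r ** 2).astype(float)
    return mask

# illustrative grid and openings: aperture stop at 100 mm plus two lens rims
x = np.linspace(-15.0, 15.0, 301)
X, Y = np.meshgrid(x, x)
apertures = [(9.0, 100.0), (14.0, 160.0), (7.0, 60.0)]   # (radius mm, distance mm)
mask = peripheral_pupil_mask(X, Y, apertures, field_point=(10.0, 0.0),
                             pupil_distance=100.0)
print(mask.sum() * (x[1] - x[0]) ** 2)   # area of the peripheral exit pupil
```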

FIG. 11 shows the configuration of an optical system. A digital still camera 201 consists of a camera body 203 and an interchangeable lens 202, which are coupled by a mount unit 204. The interchangeable lens 202 consists of a lens 209 for forming a subject image, a focusing lens 210, an aperture 211, and a lens CPU 206 that controls the driving of the focusing lens 210 and of the aperture 211.

The camera body 203 includes an image sensor 212 placed at the planned imaging plane of the interchangeable lens 202; a body CPU 214 that reads out image signals from the image sensor 212 and controls the operation of the entire digital still camera; a focus detection unit 213 that receives part of the image signals from the body CPU 214 and detects the focus adjustment state of the interchangeable lens 202; a liquid crystal display element 216 of a liquid crystal viewfinder (EVF, electronic viewfinder); an eyepiece lens 217 for observing the liquid crystal display element 216; and a liquid crystal display element drive circuit 215 that drives the liquid crystal display element 216 of the liquid crystal viewfinder under the control of the body CPU 214. The focus detection unit 213 and the lens CPU 206 exchange various information (aperture information needed to calculate the conversion coefficient, the defocus amount for lens driving, and so on) through an electrical contact unit 218 provided in the mount unit 204.

Image sensors for microlens-type focus detection are built into the image sensor 212 at a plurality of portions corresponding to a plurality of focus detection positions. The subject image formed on the image sensor 212 through the interchangeable lens 202 is photoelectrically converted by the image sensor 212, and its output is sent to the body CPU 214, while the output of the microlens-type focus detection image sensors is sent to the focus detection unit 213. The focus detection unit 213 communicates with the lens CPU 206 to read out the aperture information of the mounted lens and, based on this aperture information, the distance measuring pupil information held by the focus detection unit 213, and the information on the plurality of focus detection positions, calculates a conversion coefficient and distance measuring light quantity information for each of the plurality of focus detection positions.

The lens CPU 206 changes the aperture information according to the focusing state, the zooming state, and the aperture setting. Specifically, the lens CPU 206 monitors the positions of the lenses 209 and 210 and the position of the aperture 211, and either computes the aperture information from the monitored values or selects aperture information corresponding to the monitored values from a lookup table prepared in advance. The focus detection unit 213 corrects the pair of image signals for each focus detection position according to the distance measuring light quantity information and then applies well-known focus detection processing to calculate the image shift amount between the pair of images for each focus detection position.

The focus detection unit 213 multiplies the image shift amount calculated for each focus detection position by the conversion coefficient obtained for that focus detection position to calculate the defocus amount at each focus detection position, and determines a final defocus amount from the plurality of defocus amounts. For example, the defocus amount indicating the closest subject among the plurality of defocus amounts is taken as the final defocus amount, or the average of the plurality of defocus amounts is taken as the final defocus amount. When the focus detection unit 213 determines from the final defocus amount that lens driving is needed (that the system is out of focus), it sends the final defocus amount to the lens CPU. The lens CPU 206 calculates a lens drive amount from the received defocus amount and drives the focusing lens 210 toward the in-focus position based on that drive amount. The body CPU 214 generates a display image signal from the output of the image sensor 212 and displays it on the liquid crystal display element 216 through the liquid crystal display element drive circuit 215.
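A minimal sketch of combining the per-position results into a final defocus amount, reflecting the "closest" and "average" strategies mentioned above; the sign convention and the numbers are assumptions.

```python
def final_defocus(shifts, coefficients, mode="closest"):
    """Combine per-focus-detection-position results into one defocus amount.
    shifts[i] is the image shift at position i, coefficients[i] the conversion
    coefficient computed for that position. The sign convention assumed here
    (more negative = closer subject) is illustrative only."""
    defocus = [h * k for h, k in zip(shifts, coefficients)]
    if mode == "closest":
        return min(defocus)          # assumed: smaller value = nearer subject
    return sum(defocus) / len(defocus)

# illustrative values for the five focus detection positions of FIG. 12
print(final_defocus([0.02, -0.10, 0.01, 0.03, -0.04],
                    [2.8, 3.1, 3.1, 3.0, 3.0]))
```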

FIG. 12 shows a specific example of the focus detection positions in the digital still camera of FIG. 11. As shown in FIG. 12, the plurality of focus detection positions described with reference to FIG. 11 are arranged on the photographic screen 300 with the focus detection position 301 at the center of the screen, the focus detection positions 302 and 303 at the top and bottom of the screen, and the focus detection positions 304 and 305 at the left and right of the screen.

FIG. 13 is a detailed configuration diagram of the image sensor 212 corresponding to FIGS. 11 and 12. In the image sensor 212, imaging pixels 310 are arranged two-dimensionally, and focus detection pixels 311 (microlens type) are arranged as shown in the figure at the portions corresponding to the five focus detection positions of FIG. 12.

FIG. 14 is a flowchart showing the operation of the digital still camera (the optical system of FIG. 11), that is, the operation of the body CPU 214 and the focus detection unit 213. When the power is turned on in step 100, processing starts and proceeds to step 110. In step 110 the aperture information is received from the lens CPU. In step 120, the conversion coefficient and the distance measuring light quantity information are computed for each focus detection position based on the aperture information, the distance measuring pupil information, and the focus detection position information. In step 130, a pair of image signals is read from the focus detection pixels of the image sensor for each focus detection position and corrected with the distance measuring light quantity information. Details of this correction are described later.

In step 140, the image shift amount between the pair of corrected image signals is computed for each focus detection position. In step 150, the defocus amount is calculated for each focus detection position by multiplying the image shift amount by the conversion coefficient, and the final defocus amount is determined from the plurality of defocus amounts. In step 160, it is determined from the final defocus amount whether the imaging optical system is in focus. If it is determined in step 160 that the system is not in focus, the flow proceeds to step 170, where the defocus amount is sent to the lens CPU and the imaging optical system is driven to the in-focus position, and then returns to step 110 to repeat the above operation.

If it is determined in step 160 that the system is in focus, the flow proceeds to step 180, where it is determined whether the shutter has been released. If the shutter has not been released, the flow returns to step 110 and the above operation is repeated. If it is determined that the shutter has been released, the flow proceeds to step 190, the photographing operation is executed, and the flow then returns to step 110 to repeat the above operation.
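The control flow of steps 110 to 190 can be sketched as a loop over injected callables; every parameter name here is a hypothetical placeholder rather than an API defined by the patent.

```python
def camera_loop(get_aperture_info, compute_coefficients, read_signals,
                measure_shift, to_defocus, drive_lens, shutter_released,
                capture, focus_tolerance=0.01, max_cycles=100):
    """Sketch of the FIG. 14 flow using injected callables (all hypothetical).
    max_cycles bounds the loop so the sketch terminates in a demonstration."""
    for _ in range(max_cycles):
        aperture = get_aperture_info()                       # step 110
        coeffs, light_info = compute_coefficients(aperture)  # step 120
        pairs = read_signals(light_info)                     # step 130
        shifts = [measure_shift(a, b) for a, b in pairs]     # step 140
        defocus = to_defocus(shifts, coeffs)                 # step 150
        if abs(defocus) > focus_tolerance:                   # steps 160/170
            drive_lens(defocus)
            continue
        if shutter_released():                               # steps 180/190
            capture()
            return
```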

FIG. 15 explains the correction of image signals using the distance measuring light quantity information. In FIG. 15, the vertical axis is the intensity distribution (light quantity) of the image signal at one focus detection position, and the horizontal axis is the positional deviation within the focus detection position. Here, the positional deviation within the focus detection position corresponds, for example, to the positions of the plurality of focus detection pixels belonging to one focus detection position on the image sensor shown in FIG. 13. When the focus detection light beams are not vignetted, the pair of image signals 400 and 401 is simply the same image signal function shifted laterally, as shown in FIG. 15(a). By applying well-known image-shift detection processing to such a pair of image signals, the image shift amount between the pair of image signals 400 and 401 can be calculated accurately.

When the focus detection light beams are vignetted, the amount of the focus detection light beam passing through the distance measuring pupil varies with the focus detection position and with the positional deviation within the focus detection position, so even when an object of uniform brightness is imaged, the signals are not uniform, as the pair of image signals 402 and 403 in FIG. 15(b) shows. As explained with reference to FIGS. 7 to 10, the pair of image signals 402 and 403 for a uniformly bright object can be computed from the aperture information, the distance measuring pupil information, and the focus detection position and the positional deviation within it, as the amount of focus detection light actually usable for focus detection, that is, as the distance measuring light quantity information.

When the focus detection light beams are vignetted, the pair of image signals 404 and 405 becomes as shown in FIG. 15(c); they are no longer relative shifts of the same signal, so applying image-shift detection to them as they are produces an error in the image shift amount. The pair of image signals 404 and 405 is the pair of image signals 400 and 401 of FIG. 15(a), obtained without vignetting, multiplied by the pair of image signals (distance measuring light quantity information) 402 and 403 of FIG. 15(b). Conversely, when the vignetted pair of image signals 404 and 405 is obtained, the distance measuring light quantity information 402 and 403 can be computed and the pair of image signals 404 and 405 divided by it to recover the pair of image signals 400 and 401 that would be obtained without vignetting.

Correcting the vignetted signals with the distance measuring light quantity information in this way increases the degree of coincidence between the pair of images and makes it possible to calculate the image shift amount accurately.
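A sketch of this correction, assuming the pair of vignetted signals and the computed light quantity profiles are available as arrays; the epsilon guard is an implementation detail added here, not part of the patent.

```python
import numpy as np

def correct_for_vignetting(signal_a, signal_b, light_a, light_b, eps=1e-6):
    """Divide each vignetted image signal (FIG. 15(c)) by the computed
    distance measuring light quantity profile for a uniformly bright object
    (FIG. 15(b)) to recover signals comparable to the unvignetted case
    (FIG. 15(a)). eps guards against division by very small light values."""
    a = np.asarray(signal_a, float) / np.maximum(np.asarray(light_a, float), eps)
    b = np.asarray(signal_b, float) / np.maximum(np.asarray(light_b, float), eps)
    return a, b

# illustrative example: a flat object signal shaded by a vignetting profile
light = np.linspace(1.0, 0.4, 8)
print(correct_for_vignetting(10 * light, 10 * light[::-1], light, light[::-1]))
```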

FIG. 16 shows another embodiment of the optical system. While the optical system of FIG. 11 embeds focus detection pixels in the image sensor, the optical system of FIG. 16 performs focus detection with a focus detection unit (focus detection device) separate from the image sensor. In FIG. 16, the description of components identical to those in FIG. 11 is omitted. Reference numeral 320 denotes a focus detection unit (re-imaging type), 321 a main mirror (half mirror), 322 a sub mirror, and 323 a folding mirror. The image sensor 212 here is dedicated to imaging.

The light beam entering the body 203 from the interchangeable lens 202 is split by the main mirror 321: one part is reflected toward the folding mirror 323, reflected by the folding mirror 323 toward the eyepiece lens 217, and observed as a viewfinder image. The light beam that passes through the main mirror 321 is reflected by the sub mirror 322 and received by the re-imaging focus detection unit 320, which outputs a pair of image signals. The focus detection unit is configured as shown in FIG. 1, with a plurality of re-imaging optical systems and image sensors arranged corresponding to a plurality of focus detection positions on the screen.

During photographing, the main mirror 321 and the sub mirror 322 are retracted from the photographic optical path, and the image is captured by the image sensor 212. The pair of image signals output from the focus detection unit 320 is processed by the focus detection unit 213. The focus detection unit 213 and the lens CPU 206 exchange various information (aperture information needed to calculate the conversion coefficient, the defocus amount for lens driving, and so on) through the electrical contact unit 218 provided in the mount unit 204. The focus detection unit 213 communicates with the lens CPU 206 to read out the aperture information of the mounted lens and, based on this aperture information, the distance measuring pupil information held by the focus detection unit 213, and the information on the plurality of focus detection positions, calculates a conversion coefficient and distance measuring light quantity information for each of the plurality of focus detection positions.

The focus detection unit 213 corrects the pair of image signals for each focus detection position according to the distance measuring light quantity information and then applies well-known focus detection processing to calculate the image shift amount between the pair of images for each focus detection position. The focus detection unit 213 multiplies the image shift amount calculated for each focus detection position by the conversion coefficient obtained for that position to calculate the defocus amount at each focus detection position, and determines the final defocus amount from the plurality of defocus amounts. When the focus detection unit 213 determines from the final defocus amount that lens driving is needed (that the system is out of focus), it sends the final defocus amount to the lens CPU.

The lens CPU 206 calculates a lens drive amount from the received defocus amount and drives the focusing lens 210 toward the in-focus position based on that drive amount.

The optical system is not limited to the digital still camera described above; it can also be applied to a small camera module built into a mobile phone or the like. In the description above, the re-imaging method and the microlens method were described as examples of pupil-division focus detection, but the invention is not limited to these methods: any method may be used that divides the exit pupil in some way and detects the image shift amount between a pair of images formed by light beams passing through the divided pupils.

Thus, according to one embodiment, when the focus adjustment state on the planned imaging plane of the imaging optical system is detected on the basis of the shift amount of a pair of images formed by a pair of focus detection light beams that have passed through the imaging optical system, the conversion coefficient for converting the shift amount of the pair of images into the defocus amount of the imaging optical system is calculated on the basis of the distance-measuring pupil information indicating the distribution of the focus detection light beams used for focus detection and the aperture information defining the light beam emitted from the imaging optical system toward the focus detection device. In a pupil-division focus detection method, an accurate conversion coefficient for converting the image shift amount of the pair of images into the defocus amount of the imaging optical system can therefore be obtained, and the focus detection accuracy of the imaging optical system can be improved.

Also, according to one embodiment, on the basis of the above-described distance-measuring pupil information and aperture information, the light beams actually usable for focus detection are extracted from the pair of focus detection light beams, the opening angle between the centroids of the two extracted light beam distributions is calculated, and the conversion coefficient is calculated on the basis of this opening angle of the centroids. Even when vignetting occurs in the focus detection light beams, and even when the way of vignetting changes due to exchange of the imaging optical system, focusing, or zooming, an accurate conversion coefficient can be obtained quickly, and the accuracy and responsiveness of focus detection can be improved.

According to one embodiment, the two light beams usable for focus detection are extracted from the pair of focus detection light beams on the exit pupil plane of the imaging optical system, the centroid interval of the two extracted light beam distributions is calculated, and the opening angle of the centroids is calculated on the basis of this centroid interval and the distance between the planned imaging plane of the imaging optical system and the exit pupil plane. The opening angle of the centroids can therefore be obtained accurately and quickly, and the accuracy and responsiveness of focus detection can be further improved.
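The geometric relation behind these quantities can be written compactly under a small-angle assumption (an illustrative simplification, not a formula quoted from this embodiment). With $g$ the centroid interval on the exit pupil plane, $L$ the distance from the planned imaging plane to the exit pupil plane, $\theta$ the opening angle of the centroids, $\delta$ the image shift amount, and $d$ the defocus amount:

\[
\tan\theta \approx \frac{g}{L}, \qquad
\delta \approx d\,\tan\theta, \qquad
d = K\,\delta \quad\text{with}\quad K \approx \frac{1}{\tan\theta} = \frac{L}{g}.
\]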

Furthermore, according to one embodiment, the distance-measuring pupil information includes information on the direction of the focus detection light beams corresponding to the focus detection position, so that an accurate conversion coefficient can be obtained quickly even when the way of vignetting of the focus detection light beams differs depending on the focus detection position, and the accuracy and responsiveness of focus detection can be improved.

According to one embodiment, the aperture information is information corresponding to the focus detection position, so that an accurate conversion coefficient can be obtained quickly even when the way of vignetting of the focus detection light beams differs depending on the focus detection position, and the accuracy and responsiveness of focus detection can be improved.

According to one embodiment, the conversion coefficient is calculated on the basis of the focus detection position information, the distance-measuring pupil information, and the aperture information, so that an accurate conversion coefficient can be obtained quickly even when the way of vignetting of the focus detection light beams differs depending on the focus detection position, and the accuracy and responsiveness of focus detection can be improved.

Furthermore, the distance-measuring light amount information concerning the light amounts of the pair of focus detection light beams usable for focus detection is calculated on the basis of the focus detection position information, the distance-measuring pupil information, and the aperture information, and the pair of images formed by the pair of focus detection light beams is corrected using this light amount information. The image shift amount can therefore be detected with high accuracy, which in turn enables highly accurate focus detection.

According to one embodiment, the optical system is composed of the camera body 203 including the focus detection device of the embodiment described above and the interchangeable lens 202 including an imaging optical system that can be attached to and detached from the camera body 203. The CPU 214 that computes the conversion coefficient is installed on the camera body 203 side and stores the distance-measuring pupil information in its memory, while the aperture information is stored in the memory of the lens CPU 206 of the interchangeable lens 202. The CPU 214 installed in the camera body 203 reads the aperture information from the interchangeable lens 202 and calculates the conversion coefficient on the basis of the aperture information and the distance-measuring pupil information. Accordingly, even in an optical system composed of a body including the focus detection device described above and a lens assembly that can be attached to and detached from the body, an accurate conversion coefficient can be obtained quickly even when vignetting occurs in the focus detection light beams and the way of vignetting changes due to exchange of the imaging optical system, focusing, or zooming, and the accuracy and responsiveness of focus detection can be improved.

《Modification of the calculation of the centroid positions of the focus detection light beams》
In FIG. 8 and FIG. 9 described above, the same distribution information of the focus detection light beams is used when obtaining the centroids of the focus detection light beams both at the center of the planned imaging plane (the intersection of the planned imaging plane and the optical axis) and at its periphery. When, however, the distance-measuring pupil information indicating the distribution of the focus detection light beams differs depending on the position on the planned imaging plane, the distance-measuring pupil information corresponding to each position may be stored, and the distance-measuring pupil information corresponding to the position on the planned imaging plane (the focus detection position) may then be used when the centroids of the focus detection light beams are obtained.

When there are many focus detection positions and an enormous storage capacity would be required to store distance-measuring pupil information for every focus detection position, distance-measuring pupil information corresponding to a small number of specific positions is stored as shown in FIG. 17, and the distance-measuring pupil information at an arbitrary focus detection position is obtained by interpolation from the distance-measuring pupil information at the specific positions near that focus detection position. In this way, the storage capacity required to store the distance-measuring pupil information can be reduced.

In FIG. 17(a), it is assumed that distance-measuring pupil information indicating the distribution of the focus detection light beams (see FIGS. 17(b) and 17(d)) is stored for specific positions 401 and 402 on the planned imaging plane 400. To obtain the distance-measuring pupil information at an arbitrary focus detection position 403, the distance-measuring pupil information at position 403 (see FIG. 17(c)) is obtained by interpolation using the distance-measuring pupil information of specific positions located near position 403 for which distance-measuring pupil information has been stored in advance, in this case positions 401 and 402. In this interpolation, the distance-measuring pupil information of specific position 401 and that of specific position 402 are apportioned according to the distance between focus detection position 403 and specific position 401 and the distance between focus detection position 403 and specific position 402.
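A minimal sketch of this distance-apportioned interpolation is shown below; the function and parameter names are hypothetical, and the stored pupil information is assumed to be sampled on a common grid so the arrays can be blended directly.

```python
import numpy as np

def interpolate_pupil_info(pos, pos1, pupil1, pos2, pupil2):
    """Distance-apportioned interpolation of distance-measuring pupil
    information between two stored specific positions (hypothetical names).

    pos            : focus detection position (e.g. image height in mm)
    pos1, pos2     : the two nearby specific positions with stored data
    pupil1, pupil2 : arrays holding the stored pupil distributions
    """
    d1 = abs(pos - pos1)
    d2 = abs(pos - pos2)
    # Apportion inversely to distance: the closer stored position
    # contributes more (the weights sum to 1).
    w1 = d2 / (d1 + d2)
    w2 = d1 / (d1 + d2)
    return w1 * np.asarray(pupil1) + w2 * np.asarray(pupil2)
```

The same apportioning applies unchanged to the axial case of FIG. 18 below, with the stored image-plane positions replaced by the stored exit pupil distances.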

《Another modification of the calculation of the centroid positions of the focus detection light beams》
In FIG. 8 and FIG. 9 described above, the distribution information of the focus detection light beams on the exit pupil plane 1 is used when obtaining the centroids of the focus detection light beams. When, however, the distance-measuring pupil information indicating the distribution of the focus detection light beams differs depending on the distance from the planned imaging plane in the optical axis direction, distance-measuring pupil information corresponding to the distance from the planned imaging plane in the optical axis direction may be stored, and the distance-measuring pupil information corresponding to the exit pupil position of the optical system (its distance from the planned imaging plane in the optical axis direction) may then be used when the centroids of the focus detection light beams are obtained.

When an enormous storage capacity would be required to store distance-measuring pupil information for every distance from the planned imaging plane in the optical axis direction, distance-measuring pupil information corresponding to a small number of specific distances is stored as shown in FIG. 18, and the distance-measuring pupil information at an arbitrary distance is obtained by interpolation from the distance-measuring pupil information at nearby specific distances. In this way, the storage capacity required to store the distance-measuring pupil information can be reduced.

In FIG. 18, it is assumed that distance-measuring pupil information indicating the distribution of the focus detection light beams (see FIGS. 18(b) and 18(d)) is stored for positions 411 and 412 separated from the planned imaging plane 400 by specific distances. To obtain the distance-measuring pupil information at the exit pupil position 413 of the imaging optical system, the distance-measuring pupil information at position 413 (see FIG. 18(c)) is obtained by interpolation using the distance-measuring pupil information of specific positions located near position 413 for which distance-measuring pupil information has been stored in advance, in this case positions 411 and 412. In this interpolation, the distance-measuring pupil information of specific position 411 and that of specific position 412 are apportioned according to the distance between position 413 and specific position 411 and the distance between position 413 and specific position 412.

In the description above, the distribution information of the focus detection light beams (the distance-measuring pupil information) used for obtaining the centroids of the focus detection light beams is stored in advance. When the focus detection device is equipped with a microcomputer capable of high-speed computation, however, the distance-measuring pupil information corresponding to an arbitrary position on the planned imaging plane, or to an arbitrary distance from the planned imaging plane in the optical axis direction, may instead be obtained directly from the structural parameters of the focus detection system (the size of the photoelectric conversion portions, the microlens curvature, the microlens refractive index, the distance between the microlenses and the photoelectric conversion portions, the microlens aperture diameter, and so on) by performing ray tracing or diffraction calculations.

When the ray tracing or diffraction calculations take too much time, point spread functions corresponding to arbitrary positions on the planned imaging plane, or to arbitrary distances from the planned imaging plane in the optical axis direction, are calculated and stored in advance, together with the design distance-measuring pupil information for the case in which there is no diffraction or aberration. When the centroids of the focus detection light beams are to be obtained, the point spread function and the design distance-measuring pupil information corresponding to the focus detection position and to the exit pupil position of the imaging optical system are read out, and the distance-measuring pupil information corresponding to the focus detection position and the exit pupil position of the imaging optical system is obtained by a convolution of this point spread function with the design distance-measuring pupil information.
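A minimal sketch of this convolution step is given below, assuming both quantities are sampled on a common 2-D grid; the function name and the choice of SciPy's fftconvolve are illustrative assumptions rather than part of this embodiment.

```python
import numpy as np
from scipy.signal import fftconvolve

def measured_pupil_from_psf(design_pupil, psf):
    """Blur the design (diffraction- and aberration-free) distance-measuring
    pupil distribution with a stored point spread function to approximate
    the distance-measuring pupil information for the relevant focus
    detection position / exit pupil distance (hypothetical names).
    """
    psf = np.asarray(psf, dtype=float)
    psf /= psf.sum()  # normalize so the total light amount is preserved
    design = np.asarray(design_pupil, dtype=float)
    return fftconvolve(design, psf, mode="same")
```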

When an enormous storage capacity would be required to store point spread functions for every position on the planned imaging plane, or for every distance from the planned imaging plane in the optical axis direction, point spread functions corresponding to specific positions on the planned imaging plane or to specific distances from the planned imaging plane in the optical axis direction are stored, and the point spread function corresponding to the focus detection position or to the exit pupil position of the imaging optical system is obtained by interpolation from the point spread functions corresponding to specific positions near the focus detection position or to specific distances near the exit pupil distance of the imaging optical system. In this way, the storage capacity required to store the point spread functions can be reduced.

FIG. 1 is a diagram showing the configuration of a re-imaging type focus detection device.
FIG. 2 is a diagram showing the configuration of a microlens type focus detection device.
FIG. 3 is a diagram explaining the diffraction phenomenon in the microlens type.
FIG. 4 is a diagram showing the diffraction image intensity distribution pattern corresponding to FIG. 3.
FIG. 5 is a cross-sectional view at Y=0 of the diffraction image intensity distribution pattern of FIG. 4.
FIG. 6 is a diagram explaining spherical aberration in the microlens type.
FIG. 7 is a diagram explaining vignetting of the focus detection light beams.
FIG. 8 is a diagram explaining how the centroids of the focus detection light beams are obtained.
FIG. 9 is a diagram explaining vignetting of the focus detection light beams.
FIG. 10 is a diagram explaining how the centroids of the focus detection light beams are obtained.
FIG. 11 is a configuration diagram of an optical system.
FIG. 12 is a diagram showing a specific example of focus detection positions in the digital still camera shown in FIG. 11.
FIG. 13 is a detailed configuration diagram of the image sensor corresponding to FIGS. 11 and 12.
FIG. 14 is a flowchart showing the operation of the digital still camera.
FIG. 15 is a diagram explaining correction of the image signals using the distance-measuring light amount information.
FIG. 16 is a diagram showing another embodiment of the optical system.
FIG. 17 is a diagram explaining a modification of the calculation of the centroid positions of the focus detection light beams.
FIG. 18 is a diagram explaining another modification of the calculation of the centroid positions of the focus detection light beams.

Explanation of symbols

1: exit pupil
2, 3: distance-measuring pupils
4: optical axis
6, 7: focus detection light beams
8, 9: centroid positions
12, 13, 22, 23: aperture openings
16, 26: image sensors
18, 28: distance-measuring units
32, 33, 42, 43, 56, 57, 72, 73, 82, 83: focus detection light beams
41, 44: exit pupils of the aperture stop of the imaging optical system
50, 60: microlenses
52, 53, 62, 63: light-receiving elements
67: projection axis
78, 79: centroid positions
201: digital still camera
203: camera body
206: lens CPU
210: focusing lens
211: aperture
212: image sensor
213: focus detection unit
214: body CPU
310: imaging pixel
311: focus detection pixel (microlens type)

Claims (11)

1. A focus detection device that detects a focus adjustment state on a planned imaging plane of an imaging optical system on the basis of a shift amount of a pair of images formed by a pair of light beams that have passed through different positions of the pupil of the imaging optical system, the focus detection device comprising:
conversion coefficient calculation means for calculating a conversion coefficient for converting the shift amount of the pair of images into a defocus amount of the imaging optical system on the basis of information indicating the distribution of the light beams and aperture information of the imaging optical system.
2. The focus detection device according to claim 1, wherein the conversion coefficient calculation means calculates, on the basis of the information indicating the distribution of the light beams and the aperture information, an opening angle of the centroids of two light beam distributions usable for focus detection within the pair of light beams, and calculates the conversion coefficient on the basis of the opening angle of the centroids.
3. The focus detection device according to claim 2, wherein the conversion coefficient calculation means calculates a centroid interval, on an exit pupil plane of the imaging optical system, of the two light beam distributions usable for focus detection within the pair of light beams, and calculates the opening angle of the centroids on the basis of the centroid interval and the distance between the planned imaging plane of the imaging optical system and the exit pupil plane.
4. The focus detection device according to any one of claims 1 to 3, wherein the focus adjustment state of the imaging optical system is detected at a predetermined focus detection position on the planned imaging plane of the imaging optical system, and the information indicating the distribution of the light beams includes information on the direction of the focus detection light beams corresponding to the focus detection position.
5. The focus detection device according to any one of claims 1 to 4, wherein the focus adjustment state of the imaging optical system is detected at a predetermined focus detection position on the planned imaging plane of the imaging optical system, and the aperture information is information corresponding to the focus detection position.
6. The focus detection device according to any one of claims 1 to 3, wherein the focus adjustment state of the imaging optical system is detected at a predetermined focus detection position on the planned imaging plane of the imaging optical system, and the conversion coefficient calculation means calculates the conversion coefficient on the basis of information on the focus detection position, the information indicating the distribution of the light beams, and the aperture information.
7. The focus detection device according to any one of claims 1 to 3, wherein the focus adjustment state of the imaging optical system is detected at a predetermined focus detection position on the planned imaging plane of the imaging optical system, the focus detection device further comprising:
distance-measuring light amount information calculation means for calculating, on the basis of the focus detection position information, the information indicating the distribution of the light beams, and the aperture information, information on the light amounts of the pair of focus detection light beams usable for focus detection; and
image correction means for correcting the pair of images formed by the pair of focus detection light beams using the information on the light amounts.
8. The focus detection device according to any one of claims 1 to 3, further comprising:
pupil information storage means for storing a plurality of pieces of information indicating the distributions of the light beams corresponding to a plurality of positions on the planned imaging plane; and
interpolation calculation means for obtaining, by interpolation, the information indicating the distribution of the light beams corresponding to a focus detection position on the planned imaging plane on the basis of the plurality of pieces of information indicating the distributions of the light beams stored in the pupil information storage means.
9. The focus detection device according to any one of claims 1 to 3, further comprising:
pupil information storage means for storing a plurality of pieces of information indicating the distributions of the light beams corresponding to a plurality of distances from the planned imaging plane in the optical axis direction; and
interpolation calculation means for obtaining, by interpolation, the information indicating the distribution of the light beams corresponding to the distance of the exit pupil of the imaging optical system from the planned imaging plane on the basis of the plurality of pieces of information indicating the distributions of the light beams stored in the storage means.
10. An optical system comprising the focus detection device according to any one of claims 1 to 9 and an imaging optical system.
11. The optical system according to claim 10, wherein the optical system is composed of a body including the focus detection device and a lens assembly including the imaging optical system that can be attached to and detached from the body, the conversion coefficient calculation means is installed on the body side and the information indicating the distribution of the light beams is held on the body side, the aperture information is held on the lens assembly side, and the conversion coefficient calculation means installed on the body side reads the aperture information from the lens assembly and calculates the conversion coefficient on the basis of the aperture information and the information indicating the distribution of the light beams.
JP2005316892A 2005-10-31 2005-10-31 Focus detection apparatus and optical system Expired - Fee Related JP4984491B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005316892A JP4984491B2 (en) 2005-10-31 2005-10-31 Focus detection apparatus and optical system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005316892A JP4984491B2 (en) 2005-10-31 2005-10-31 Focus detection apparatus and optical system

Publications (2)

Publication Number Publication Date
JP2007121896A true JP2007121896A (en) 2007-05-17
JP4984491B2 JP4984491B2 (en) 2012-07-25

Family

ID=38145781

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005316892A Expired - Fee Related JP4984491B2 (en) 2005-10-31 2005-10-31 Focus detection apparatus and optical system

Country Status (1)

Country Link
JP (1) JP4984491B2 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0762732B2 (en) * 1984-07-04 1995-07-05 株式会社ニコン Focus detection device
JPH03172811A (en) * 1989-12-01 1991-07-26 Nikon Corp Focus detector
JPH03214133A (en) * 1990-01-18 1991-09-19 Nikon Corp Focus detector
JPH05127073A (en) * 1991-11-01 1993-05-25 Canon Inc Focus detection device
JP2004012493A (en) * 2002-06-03 2004-01-15 Canon Inc Focus detector

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8254773B2 (en) 2007-05-28 2012-08-28 Nikon Corporation Image tracking apparatus and tracking evaluation method
JP2008292894A (en) * 2007-05-28 2008-12-04 Nikon Corp Image tracking device, image tracking method and imaging apparatus
JP2009042370A (en) * 2007-08-07 2009-02-26 Canon Inc Focus detecting device and its control method
JP2009075407A (en) * 2007-09-21 2009-04-09 Nikon Corp Imaging apparatus
JP2009105681A (en) * 2007-10-23 2009-05-14 Canon Inc Image processing apparatus and image processing program
JP2009288042A (en) * 2008-05-29 2009-12-10 Nikon Corp Distance measuring device
JP2010025997A (en) * 2008-07-15 2010-02-04 Canon Inc Focusing device, imaging apparatus, interchangeable lens, conversion factor calibration method, and conversion factor calibration program
CN102089696B (en) * 2008-07-15 2013-07-24 佳能株式会社 Focal point adjusting apparatus, image-taking apparatus, interchangeable lens, conversion coefficient calibrating method
US8619180B2 (en) 2008-07-15 2013-12-31 Canon Kabushiki Kaisha Focal point adjusting apparatus, image-taking apparatus, interchangeable lens, conversion coefficient calibrating method, and conversion coefficient calibrating program
CN102089696A (en) * 2008-07-15 2011-06-08 佳能株式会社 Focal point adjusting apparatus, image-taking apparatus, interchangeable lens, conversion coefficient calibrating method, and conversion coefficient calibrating program
WO2010007772A1 (en) 2008-07-15 2010-01-21 Canon Kabushiki Kaisha Focal point adjusting apparatus, image-taking apparatus, interchangeable lens, conversion coefficient calibrating method, and conversion coefficient calibrating program
WO2010021195A1 (en) * 2008-08-20 2010-02-25 株式会社ニコン Focal point detecting device
JP2010048933A (en) * 2008-08-20 2010-03-04 Nikon Corp Focus detector and imaging apparatus
US8526807B2 (en) 2008-08-20 2013-09-03 Nikon Corporation Focus detecting apparatus
JP2010049209A (en) * 2008-08-25 2010-03-04 Canon Inc Imaging sensing apparatus, image sensing system, and focus detection method
WO2010050403A1 (en) * 2008-10-30 2010-05-06 Canon Kabushiki Kaisha Image capturing apparatus
US8675121B2 (en) 2008-10-30 2014-03-18 Canon Kabushiki Kaisha Camera and camera system
JP2010107770A (en) * 2008-10-30 2010-05-13 Canon Inc Imaging apparatus
US8477233B2 (en) 2008-10-30 2013-07-02 Canon Kabushiki Kaisha Image capturing apparatus
US8633992B2 (en) 2009-09-09 2014-01-21 Nikon Corporation Focus detection device, photographic lens unit, image-capturing apparatus and camera system
WO2011045850A1 (en) * 2009-10-13 2011-04-21 キヤノン株式会社 Focusing device and focusing method
US8488956B2 (en) 2009-10-13 2013-07-16 Canon Kabushiki Kaisha Focus adjusting apparatus and focus adjusting method
JP2011114553A (en) * 2009-11-26 2011-06-09 Nikon Corp Imaging device
JP2012058379A (en) * 2010-09-07 2012-03-22 Toshiba Corp Solid-state image pickup device
WO2012105222A1 (en) * 2011-01-31 2012-08-09 パナソニック株式会社 Image restoration device, imaging device, and image restoration method
CN102804751B (en) * 2011-01-31 2016-08-03 松下电器产业株式会社 Image recovery device, camera head and image recovery method
CN102804751A (en) * 2011-01-31 2012-11-28 松下电器产业株式会社 Image restoration device, imaging device, and image restoration method
US8767092B2 (en) 2011-01-31 2014-07-01 Panasonic Corporation Image restoration device, imaging apparatus, and image restoration method
JP2012226213A (en) * 2011-04-21 2012-11-15 Canon Inc Imaging apparatus and control method therefor
JP2013037296A (en) * 2011-08-10 2013-02-21 Olympus Imaging Corp Image pickup apparatus and image pickup device
JP2013037295A (en) * 2011-08-10 2013-02-21 Olympus Imaging Corp Image pickup apparatus and image pickup device
WO2013133115A1 (en) * 2012-03-06 2013-09-12 株式会社ニコン Defocus amount detection device and camera
JP2017138621A (en) * 2012-03-06 2017-08-10 株式会社ニコン Defocus amount detection device and camera
JP2013214048A (en) * 2012-03-06 2013-10-17 Nikon Corp Defocus amount detection device and camera
US10514248B2 (en) 2012-07-31 2019-12-24 Canon Kabushiki Kaisha Distance detecting apparatus
JP2014029393A (en) * 2012-07-31 2014-02-13 Canon Inc Distance detection device
WO2014021147A1 (en) * 2012-07-31 2014-02-06 Canon Kabushiki Kaisha Distance detecting apparatus
US9215364B2 (en) 2012-10-26 2015-12-15 Canon Kabushiki Kaisha Focus detection apparatus, image pickup apparatus, image pickup system and focus detection method
JP2014174357A (en) * 2013-03-11 2014-09-22 Canon Inc Imaging apparatus, imaging system, signal processor, program, and storage medium
JP2015011283A (en) * 2013-07-01 2015-01-19 キヤノン株式会社 Image capturing device, method of controlling the same, and program
WO2015046246A1 (en) * 2013-09-30 2015-04-02 オリンパス株式会社 Camera system and focal point detection pixel correction method
JP2015069180A (en) * 2013-09-30 2015-04-13 オリンパス株式会社 Camera system and correction method of focus detection pixels
US9509899B2 (en) 2013-09-30 2016-11-29 Olympus Corporation Camera system and method for correcting focus detection pixel
CN105593739A (en) * 2013-09-30 2016-05-18 奥林巴斯株式会社 Camera system and focal point detection pixel correction method
JP2014032414A (en) * 2013-10-02 2014-02-20 Nikon Corp Imaging device
JP2015078855A (en) * 2013-10-15 2015-04-23 キヤノン株式会社 Distance detector, image capturing device, and distance detection method
JP2016090649A (en) * 2014-10-30 2016-05-23 オリンパス株式会社 Focus adjustment device, camera system and focus adjustment method
WO2016067648A1 (en) * 2014-10-30 2016-05-06 オリンパス株式会社 Focal point adjustment device, camera system, and focal point adjustment method
US10021289B2 (en) 2014-12-26 2018-07-10 Canon Kabushiki Kaisha Image pickup apparatus and image pickup system with point image intensity distribution calculation
US10321044B2 (en) 2014-12-26 2019-06-11 Canon Kabushiki Kaisha Image pickup apparatus and image pickup system with point image intensity distribution calculation
JP2018013624A (en) * 2016-07-21 2018-01-25 リコーイメージング株式会社 Focus detection device, focus detection method and imaging device
JP2018165826A (en) * 2018-06-15 2018-10-25 キヤノン株式会社 Imaging device and lens device
JP2019219577A (en) * 2018-06-21 2019-12-26 キヤノン株式会社 Detection device and detection method
JP2019219576A (en) * 2018-06-21 2019-12-26 キヤノン株式会社 Focus detection method
JP7146477B2 (en) 2018-06-21 2022-10-04 キヤノン株式会社 Detection device and detection method
JP7237476B2 (en) 2018-06-21 2023-03-13 キヤノン株式会社 Focus detection method

Also Published As

Publication number Publication date
JP4984491B2 (en) 2012-07-25

Similar Documents

Publication Publication Date Title
JP4984491B2 (en) Focus detection apparatus and optical system
JP4946059B2 (en) Imaging device
JP5169499B2 (en) Imaging device and imaging apparatus
US10491799B2 (en) Focus detection apparatus, focus control apparatus, image capturing apparatus, focus detection method, and storage medium
JP5168798B2 (en) Focus adjustment device and imaging device
US8098984B2 (en) Focus detecting apparatus and an imaging apparatus
JP4972960B2 (en) Focus adjustment device and imaging device
JP5168797B2 (en) Imaging device
JP2008268403A (en) Focus detection device, focus detection method, and imaging apparatus
CN103837959A (en) Focus detection apparatus, focus detection method, and image capturing apparatus
JP2012234152A (en) Imaging apparatus and control method thereof
JP6854619B2 (en) Focus detection device and method, imaging device, lens unit and imaging system
JP5157073B2 (en) Focus adjustment device and imaging device
JP2006071950A (en) Optical equipment
JP2006215398A (en) Image pickup device
JP2006065080A (en) Imaging device
JP2014194502A (en) Imaging apparatus and imaging system
JP2017223879A (en) Focus detector, focus control device, imaging apparatus, focus detection method, and focus detection program
JP2018180135A (en) Imaging device
JP2017219791A (en) Control device, imaging device, control method, program, and storage medium
JP4938922B2 (en) Camera system
JP2009128843A (en) Focus detecting device and image pickup apparatus having the same
JP2020113948A (en) Imaging element, imaging apparatus, control method, and program
JP2014010284A (en) Focus detector, imaging apparatus and camera system
JP2013140380A (en) Imaging apparatus

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080804

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110125

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110322

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110523

RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20110523

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120403

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20120416

R150 Certificate of patent or registration of utility model

Ref document number: 4984491

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20150511

Year of fee payment: 3


R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees