JP2008299184A - Imaging apparatus and focus detecting device - Google Patents

Imaging apparatus and focus detecting device Download PDF

Info

Publication number
JP2008299184A
Authority
JP
Japan
Prior art keywords
pixel
light receiving
receiving unit
output
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2007146819A
Other languages
Japanese (ja)
Inventor
Takeshi Utagawa
健 歌川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to JP2007146819A priority Critical patent/JP2008299184A/en
Publication of JP2008299184A publication Critical patent/JP2008299184A/en
Pending legal-status Critical Current

Landscapes

  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)

Abstract

PROBLEM TO BE SOLVED: To improve the blurred image produced by a photographic optical system when it is out of focus.

SOLUTION: The imaging apparatus includes: an imaging device 4 that receives an image formed by the photographic optical system, in which are arrayed first pixels 41, each receiving one of a pair of light beams passing through different pupil areas of the photographic optical system at a first light receiving part 41a through a microlens 41b, and second pixels 42, each receiving the other light beam at a second light receiving part 42a through a microlens 42b; and an arithmetic means that adds the output of the first pixel 41 to the output of the second pixel 42.

COPYRIGHT: (C)2009,JPO&INPIT

Description

The present invention relates to an imaging apparatus and a focus detection device.

An imaging apparatus is known in which an image sensor, having pixels each composed of a microlens and a light receiving unit arranged two-dimensionally, receives light from a subject transmitted through the photographing optical system, generates an image from the pixel signals of the image sensor, and also detects the focus adjustment state of the photographing optical system (see, for example, Patent Document 1).
In this apparatus, the optical axis of the microlens and the center of the light receiving unit are alternately shifted, which enables focus detection by the pupil-division phase-difference detection method.

Prior art documents related to the invention of this application include the following:
Japanese Patent No. 2959142

However, in the conventional imaging apparatus described above, the optical axis of the microlens and the center of the light receiving unit are alternately shifted. Consequently, when an image is generated from the output signals of the individual pixels of the image sensor, the blurred image produced when the photographing optical system is out of focus becomes mottled and grainy, resulting in an unnatural picture.

(1) The invention of claim 1 comprises: an image sensor in which are arrayed first pixels, each receiving one of a pair of light beams that have passed through different pupil regions of a photographing optical system at a first light receiving unit via a microlens, and second pixels, each receiving the other light beam at a second light receiving unit via a microlens, the image sensor receiving an image formed by the photographing optical system; and an arithmetic means that adds the output of a first pixel to the output of a second pixel.
(2) In the imaging apparatus of claim 2, the arithmetic means adds the outputs of a first pixel and a second pixel whose first and second light receiving units are offset symmetrically with respect to the optical axes of their microlenses.
(3) In the imaging apparatus of claim 3, the arithmetic means synthesizes an image based on two sums: the sum of the outputs of the pairs, within the array of first and second pixels, whose first and second light receiving units lie on the mutually inner sides of their microlens optical axes, and the sum of the outputs of the pairs whose first and second light receiving units lie on the mutually outer sides.
(4) The invention of claim 4 comprises: an image sensor in which are arrayed first pixels having a first light receiving unit that receives one of a pair of light beams that have passed through different pupil regions of a photographing optical system and second pixels having a second light receiving unit that receives the other light beam, the image sensor receiving an image formed by the photographing optical system; and an arithmetic means that obtains, from the output of the second light receiving unit of a second pixel, the output corresponding to the other light beam incident on a first pixel, and obtains the output of the first pixel from the output corresponding to the other light beam and the output of the first light receiving unit.
(5) In the imaging apparatus of claim 5, the arithmetic means obtains the output corresponding to the other light beam from the output of the second light receiving unit of a second pixel located near the first pixel.
(6) In the image sensor of the imaging apparatus of claim 6, the first pixels and the second pixels are arranged alternately.
(7) In the imaging apparatus of claim 7, the shapes of the first and second light receiving units are determined so that, when a first pixel and a second pixel are superimposed, the combined outline of the first and second light receiving units is substantially circular.
(8) In the image sensor of the imaging apparatus of claim 8, the first pixels and the second pixels are arranged two-dimensionally.
(9) In the imaging apparatus of claim 9, the arithmetic means adds the outputs of first and second pixels arrayed along the pupil-division direction of the photographing optical system.
(10) In the image sensor of the imaging apparatus of claim 10, the first and second pixels are arranged so as to have no translational symmetry in the direction orthogonal to the pupil-division direction.
(11) The focus detection device of claim 11 comprises the imaging apparatus of any one of claims 1 to 10 and a focus detection means that detects the focus adjustment state of the photographing optical system by pupil-division phase-difference detection based on the outputs of the first and second pixels.

According to the present invention, the blurred image produced when the photographing optical system is out of focus is prevented from becoming mottled and grainy, and a smooth blurred image can be obtained.

FIG. 1 shows the main configuration of an imaging apparatus 1 according to an embodiment. Light from the subject transmitted through the photographing lens 2 passes through the condenser lens 3 and enters the image sensor 4. In the image sensor 4, a microlens array 4m, in which a plurality of microlenses are arranged two-dimensionally, is formed integrally with a light receiving unit array 4j, in which a plurality of photoelectric conversion elements (hereinafter, light receiving units) are arranged two-dimensionally; one light receiving unit is placed under each microlens to constitute one pixel. The condenser lens 3 makes the light incident on each microlens parallel to the optical axis of the photographing lens 2.

The imaging apparatus 1 also includes a control device 5, a driving device 6, a memory 7, and so on. The control device 5 comprises a microcomputer, a memory, an A/D converter, an image sensor driving circuit, and the like (none shown), and has a focus detection unit 5a and an image synthesis unit 5b implemented in the microcomputer's software.

The focus detection unit 5a detects, from the output signals of the image sensor 4, a defocus amount indicating the focus adjustment state of the photographing lens 2, converts it into a lens driving amount, and outputs it to the driving device 6. The driving device 6 includes a lens driving motor (not shown) and drives the focusing lens (not shown) of the photographing lens 2 according to the lens driving amount to adjust the focus. The image synthesis unit 5b synthesizes the output signals of the image sensor 4, applies various kinds of processing to the synthesized image, generates a subject image, and records it in the memory 7.

FIG. 2 is a front view of the image sensor 4 seen from the condenser lens 3 side, with part of its upper-left corner shown enlarged. In the image sensor 4, two types of pixels 41 and 42, which receive a pair of light beams that have passed through different pupil regions of the photographing lens 2, are arranged alternately and two-dimensionally along the pupil-division direction (the column direction in FIG. 2). The first pixel 41 consists of a light receiving unit 41a, which receives the light beam that has passed through one of the pair of pupil regions of the photographing lens 2, and a microlens 41b. The second pixel 42 consists of a light receiving unit 42a, which receives the light beam that has passed through the other pupil region, and a microlens 42b. The microlenses 41b and 42b are identical, but the light receiving units 41a and 42a differ from each other. A pixel 41 having one light receiving unit 41a and a pixel 42 having the other light receiving unit 42a together form a pixel pair 41, 42.

Thus, in the image sensor 4, pixel pairs 41 and 42, in which the optical axis of the microlens and the center of the light receiving unit are alternately shifted, are arranged alternately along the pupil-division direction of the photographing lens 2 (the column direction in FIG. 2). In this embodiment, the shapes of the paired light receiving units 41a and 42a are further determined so that, when the pixel pair 41 and 42 is superimposed, the combined outline of the light receiving units 41a and 42a is circular. In other words, the shape obtained by superimposing the pair of light receiving units 41a and 42a is determined so that the cross section of their combined received light flux is substantially circular.

By determining the light receiving unit shapes so that the outline obtained by superimposing the light receiving units 41a and 42a of the pixel pair 41, 42 is substantially circular, the blurred image produced when the photographing lens 2 is out of focus becomes a smooth circle and does not look unnatural to the viewer. By contrast, if the paired light receiving units are made rectangular as shown in FIG. 3, the blurred image also becomes rectangular and looks unnatural. In FIG. 3, the outline obtained by superimposing the light receiving units 51a and 52a of the pixel pair 51, 52 is rectangular, so the blurred image produced when the photographing lens is out of focus is also rectangular.

Next, the method of combining the outputs of the light receiving units in this embodiment is described. FIG. 4 shows the imaging relationship between points 0 to 9 on an object-side plane O conjugate to the photographing lens 2 and their corresponding points (0) to (9) on the image plane Q, and shows through which of the pair of pupil regions A and B of the photographing lens 2 light from a point P at a defocus position reaches the image plane Q. FIGS. 5 to 9 illustrate the state of blurring of the image of the point P on the image plane Q: FIGS. 5 and 7 show the pixels at the corresponding points (0) to (9) on the image plane Q, while FIGS. 6, 8, and 9 show the signal output states of the pixels at those points. The black dots in FIGS. 5 and 7 indicate the positions at which light emitted from the point P is received, and in FIGS. 6, 8, and 9 the pixels that receive light and output signals are hatched.

As shown in FIG. 5, when the light receiving unit 53a of each microlens 53 is circular and of uniform size, the pixels at the corresponding points (0) to (6) each receive light emitted from the point P. Therefore, as shown in FIG. 6, the pixels at the corresponding points (0) to (6) output signals according to the received light intensity. The pixels at the corresponding points (7) to (9), on the other hand, receive no light from the point P and output no light reception signal. As a result, a blurred image of the point P bounded between the pixels at the corresponding points (6) and (7) is obtained.

In the pixel pair 41, 42 of this embodiment (see FIG. 2), by contrast, the pair of semicircular light receiving units 41a and 42a alternate in position. Consequently, as shown in FIG. 7, the pixels 42, 41, 42, ..., 41 at the corresponding points (0) to (9) sometimes receive the light emitted from the point P and sometimes do not. In FIG. 7, the pixel 42 at the corresponding point (0) receives light from the point P at its light receiving unit 42a, but the pixel 41 at the corresponding point (1) does not receive light from the point P at its light receiving unit 41a.

As a result, as shown in FIG. 8, the pixels that receive light from the point P (0, 2, 3, 5) output signals according to the received light intensity, while the pixels that do not (1, 4, 6) output no light reception signal. Within the range (0) to (6), where a blurred image bounded by the corresponding points (6) and (7) should be formed, pixels with no light reception output therefore appear in places. Consequently, an image formed from all the pixel outputs of the image sensor 4 becomes a mottled, grainy blurred image that leaves a poor impression.

To solve this problem, in this embodiment the image synthesis unit 5b of the control device 5 adds and combines the outputs of the light receiving units 41a and 42a of the pixel pair 41, 42. That is, the outputs of a pixel pair in which the offsets of the light receiving unit centers from the microlens optical axes are symmetric, in other words, a pixel pair whose superimposed light receiving units form a substantially circular outline, are added and combined. As a result, each combined pixel uses the light from the entire pupil of the photographing lens 2.

Specifically, if the pixel outputs at the corresponding points (0) and (1), (1) and (2), (2) and (3), ... shown in FIG. 7 are added, and each sum is treated as the output of one pixel, then each combined pixel receives both the light transmitted through pupil region A of the photographing lens 2 and the light transmitted through pupil region B, and the output states of the combined pixels become as shown in FIG. 9. Pixels with no light reception signal therefore disappear, and a mottled, grainy blurred image is prevented.
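The pairwise addition of adjacent pixel outputs described above can be sketched in a few lines. This is an illustrative sketch, not code from the patent; the function name and the toy output values are hypothetical.

```python
def combine_pair_outputs(outputs):
    """Add adjacent pixel outputs along the pupil-division direction:
    (0)+(1), (1)+(2), (2)+(3), ...  Each sum stands in for one combined
    pixel that has received light from both pupil regions A and B."""
    return [outputs[i] + outputs[i + 1] for i in range(len(outputs) - 1)]

# Toy outputs at corresponding points (0)..(9), as in FIG. 8: pixels
# (0), (2), (3), (5) receive light from point P, the others do not.
raw = [1, 0, 1, 1, 0, 1, 0, 0, 0, 0]
combined = combine_pair_outputs(raw)
print(combined)  # [1, 1, 2, 1, 1, 1, 0, 0, 0]
```

Every combined value inside the original blur range (0) to (6) is now nonzero, matching the behavior illustrated in FIG. 9, though the sums differ in level; the next paragraphs discuss that unevenness.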

Further, in this embodiment the pixels 41 and 42 are arranged so that their arrangement has no translational symmetry in the direction orthogonal to the pupil-division direction described above. In the pixel arrangement example of FIG. 2, pixel pairs 41 and 42, whose superimposed light receiving units 41a and 42a form a substantially circular outline, are arrayed along the pupil-division direction (the column direction), while in the direction orthogonal to the pupil-division direction (the row direction of the image sensor 4 in FIG. 2) the arrangement is made to lack translational symmetry. Specifically, horizontally adjacent pixels in the first and second columns of FIG. 2 do not have translational symmetry, and the same holds for the second and third columns, the third and fourth columns, and every other pair of adjacent pixel columns.

This pixel arrangement suppresses the mottled, grainy blurred image that the pixel-pitch light and dark variations arising from the pixel combination described above would otherwise produce. For example, in the combined pixel outputs of FIG. 9, the pixels at the corresponding points (2) and (3) of FIG. 8 both receive light from the point P and output signals, so the output of the combined point (2+3) is at the level of two pixels' signals added together and gives a dark (strong) value. The pixel at the corresponding point (1) of FIG. 8, on the other hand, receives no light while the pixel at (2) does, so the output of the combined point (1+2) is at the level of a single pixel's signal and gives a light (weak) value. The combined image thus contains light and dark variations at the pixel pitch; but because the pixel pairs 41, 42 are arranged without translational symmetry in the direction orthogonal to the pupil-division direction, these variations do not appear as pixel-pitch stripes on the screen. Instead they are scattered appropriately over the whole screen, and a mottled, grainy blurred image is suppressed.

In the embodiment above, the image signal is synthesized by adding the outputs of the light receiving units of a pixel pair. Next, an example is described in which the image signal is synthesized by estimating the virtual output of a light receiving unit that is not present in a given pixel.

As described above, the image sensor shown in FIG. 2 has first pixels 41, each having a light receiving unit 41a that receives one of a pair of light beams that have passed through different pupil regions of the photographing lens 2, and second pixels 42, each having a second light receiving unit 42a that receives the other of the pair, arranged alternately and two-dimensionally. In each first pixel 41, the other light beam of the pair should be incident at the position within the pixel corresponding to the light receiving unit 42a of a second pixel 42, but since no light receiving unit exists there, no output can be obtained. Likewise, a second pixel 42 cannot obtain the output of the light beam incident at the position corresponding to the light receiving unit 41a of a first pixel 41.

Therefore, the output due to the "other light beam" that would be obtained if the first pixel 41 had a "virtual light receiving unit" is found by substituting the output of the light receiving unit 42a of a second pixel 42. The output of the light receiving unit 42a may be used directly as the output of the virtual light receiving unit, or the virtual output may be obtained by interpolation from it. The sum of the output of the light receiving unit 41a of the first pixel and the output of the virtual light receiving unit is then taken as the output of the first pixel, and the outputs of the other pixels, including the second pixels 42, are obtained in the same way.

Around the first pixel 41 there exists, besides the second pixel 42, a third pixel 43 that, like the second pixel, receives the "other light beam". The output of the "virtual light receiving unit" of the first pixel 41 may therefore be obtained by averaging the output of the light receiving unit 42a of the second pixel 42 and the output of the light receiving unit 43a of the third pixel 43. In this case, averaging the output of the light receiving unit 42a with the output of a pixel of the same kind as the second pixel 42 located on the opposite side of the first pixel gives a better estimate of the virtual light receiving unit's output.

Further, when the distance between the first pixel 41 and the second pixel 42 differs from the distance between the first pixel 41 and the third pixel 43, a weighted average should be taken, with a larger weight placed on the output of the nearer pixel.
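The virtual-output estimation described above, using a distance-weighted average of opposite-type neighbors, can be sketched as follows. The function names, the inverse-distance weighting rule, and the sample values are illustrative assumptions, not taken from the patent.

```python
def virtual_output(neighbor_outputs, neighbor_distances):
    """Estimate the output the missing ('virtual') light receiving unit of
    a pixel would have produced, from opposite-type neighboring pixels.
    Nearer neighbors get larger weights; inverse-distance weighting is one
    plausible realization of 'larger weight on the nearer pixel'."""
    weights = [1.0 / d for d in neighbor_distances]
    weighted_sum = sum(w * o for w, o in zip(weights, neighbor_outputs))
    return weighted_sum / sum(weights)

def pixel_output(own_output, neighbor_outputs, neighbor_distances):
    """Full output of a pixel: its own light receiving unit's output plus
    the estimated output of the virtual (missing) unit."""
    return own_output + virtual_output(neighbor_outputs, neighbor_distances)

# A first pixel with output 0.40 and two opposite-type neighbors at
# distances 1 and 2 with outputs 0.60 and 0.30: the nearer neighbor
# dominates the estimate, giving 0.40 + (2*0.60 + 1*0.30)/3, roughly 0.90.
print(pixel_output(0.40, [0.60, 0.30], [1.0, 2.0]))
```

With equal distances this reduces to the plain average of the two neighbors, matching the simpler averaging variant described in the preceding paragraph.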

《Modification of the pixel arrangement》
FIG. 10 shows a modification of the pixel arrangement. In the figure, 40 denotes a normal imaging pixel, which has a circular light receiving unit under its microlens. In this image sensor 4A, imaging pixels 40 are placed between the pixel pairs 41 and 42. With this pixel arrangement, adding and combining the outputs of each pixel pair 41 and 42 yields an image free of mottling and graininess.

《Another modification of the pixel arrangement》
FIG. 11 shows another modification of the pixel arrangement. In this image sensor 4B, pixels 43, 44, 45, and 46, each having a quarter-circle light receiving unit 43a, 44a, 45a, or 46a under its microlens, are arrayed. The four pixels 43, 44, 45, and 46 form one set, and adding and combining the outputs of each set of four pixels yields an image free of mottling and graininess.

Next, the focus detection method of this embodiment is described. The focus detection unit 5a of the control device 5 detects the focus adjustment state of the photographing lens 2 by the pupil-division phase-difference detection method in a preset focus detection area within the photographing frame of the photographing lens 2. For example, the fourth column of the image sensor 4 shown in FIG. 2 corresponds to a focus detection area. When focus detection is performed in this area, a first signal sequence (first image) is generated by lining up only the output signals of the pixels 41 among the pixels 41 and 42 belonging to the fourth column, and a second signal sequence (second image) is generated by lining up only the output signals of the pixels 42; a well-known phase-difference focus detection computation is then performed to calculate the image shift amount, which is converted into the defocus amount of the photographing lens 2.
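The core of the phase-difference computation, searching for the shift that best aligns the two signal sequences, can be sketched as below. This is a simplified illustration on assumed toy data; a real implementation would use sub-pixel interpolation and a calibrated conversion from image shift to defocus amount.

```python
def image_shift(seq_a, seq_b, max_shift):
    """Return the integer shift of seq_b relative to seq_a that minimizes
    the mean absolute difference over the overlapping samples (a basic
    correlation search over candidate shifts)."""
    best_shift, best_cost = 0, float('inf')
    n = len(seq_a)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(seq_a[i], seq_b[i + s]) for i in range(n) if 0 <= i + s < n]
        cost = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift

# Toy first and second signal sequences along the focus detection column;
# here seq_b is seq_a displaced by 2 pixels.
seq_a = [0, 1, 3, 7, 3, 1, 0, 0, 0, 0]
seq_b = [0, 0, 0, 1, 3, 7, 3, 1, 0, 0]
print(image_shift(seq_a, seq_b, 4))  # 2
```

The returned integer shift would then be scaled by a conversion factor determined by the pupil geometry to obtain the defocus amount; that conversion is not modeled here.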

For the image sensor 4 shown in FIG. 2, focus detection of the photographing lens 2 is not limited to the fourth column; any other pixel column along the pupil-division direction of the pixel pairs 41, 42 can be used. For the image sensor 4B shown in FIG. 11, the defocus amount of the photographing lens 2 in the 45-degree directions within the photographing frame can be detected based on the outputs of the pixel pair (43, 46) arrayed along the 45-degree diagonal rising to the left and the pixel pair (44, 45) arrayed along the 45-degree diagonal rising to the right.

The focus detection unit 5a of the control device 5 converts the defocus amount into a driving amount for the focusing lens (not shown) and outputs it to the driving device 6. The driving device 6 drives the focusing lens (not shown) of the photographing lens 2 according to the lens driving amount to adjust the focus of the photographing lens 2.

The embodiment and its modifications described above use image sensors in which the pixels used for focus detection are laid out two-dimensionally. The present invention, however, can also be applied to focus detection and imaging with an image sensor in which imaging-only pixels such as the pixel 40 of FIG. 10 are laid out two-dimensionally and part of the array consists of pixels used for focus detection.

Brief description of the drawings:
Diagram showing the main configuration of the imaging apparatus of an embodiment
Front view of the image sensor seen from the condenser lens side
Diagram of the case where the image sensor is configured with rectangular light receiving units
Diagram showing the imaging relationship between points 0 to 9 on the object-side plane O, conjugate with respect to the photographic lens, and their corresponding points (0) to (9) on the image plane Q, and showing through which of regions A and B of the photographic lens's pupil a point P at a defocus position passes to reach the image plane Q
Diagram for explaining the blurred state of the image of point P on the image plane Q
Diagram for explaining the blurred state of the image of point P on the image plane Q
Diagram for explaining the blurred state of the image of point P on the image plane Q
Diagram for explaining the blurred state of the image of point P on the image plane Q
Diagram for explaining the blurred state of the image of point P on the image plane Q
Diagram showing an image sensor of a modification
Diagram showing an image sensor of another modification

Explanation of symbols

1 Imaging apparatus
2 Photographic lens
4, 4A, 4B Image sensor
5 Control device
5a Focus detection unit
5b Image composition unit
41-46 Pixels
41a-46a Light receiving units (photoelectric conversion elements)
41b, 42b Microlenses

Claims (11)

An imaging apparatus comprising:
an image sensor that receives an image formed by a photographic optical system and in which are arrayed first pixels, each of which receives one of a pair of light beams having passed through different pupil regions of the photographic optical system with a first light receiving unit via a microlens, and second pixels, each of which receives the other light beam with a second light receiving unit via a microlens; and
an arithmetic means that adds the output of the first pixel and the output of the second pixel.
The imaging apparatus according to claim 1, wherein
the arithmetic means adds the outputs of a first pixel and a second pixel whose respective displacements of the first light receiving unit and the second light receiving unit from the optical axes of their microlenses are symmetrical.
The imaging apparatus according to claim 2, wherein
the arithmetic means synthesizes an image based on an addition result obtained by adding the outputs of those pairs, within the array of the first pixels and the second pixels, in which the first light receiving unit and the second light receiving unit lie on the inner sides relative to the optical axes of their microlenses, and an addition result obtained by adding the outputs of those pairs in which the first light receiving unit and the second light receiving unit lie on the outer sides relative to the optical axes of their microlenses.
An imaging apparatus comprising:
an image sensor that receives an image formed by a photographic optical system and in which are arrayed first pixels having a first light receiving unit that receives one of a pair of light beams having passed through different pupil regions of the photographic optical system, and second pixels having a second light receiving unit that receives the other of the pair of light beams; and
an arithmetic means that obtains an output corresponding to the other light beam incident on the first pixel based on an output obtained by the second light receiving unit of the second pixel, and obtains the output of the first pixel based on the output corresponding to the other light beam and the output obtained by the first light receiving unit.
The imaging apparatus according to claim 4, wherein
the arithmetic means obtains the output corresponding to the other light beam based on an output obtained by the second light receiving unit of a second pixel located in the vicinity of the first pixel.
The imaging apparatus according to any one of claims 1 to 5, wherein
in the image sensor, the first pixels and the second pixels are arranged alternately.
The imaging apparatus according to any one of claims 1 to 6, wherein
the shapes of the first light receiving unit and the second light receiving unit are determined such that, when the first pixel and the second pixel are superimposed, the combined outer periphery of the first light receiving unit and the second light receiving unit is substantially circular.
The imaging apparatus according to any one of claims 1 to 7, wherein
in the image sensor, the first pixels and the second pixels are arranged two-dimensionally.
The imaging apparatus according to claim 8, wherein
the arithmetic means adds the outputs of first pixels and second pixels arranged along the dividing direction of the pupil regions of the photographic optical system.
The imaging apparatus according to claim 8 or claim 9, wherein
in the image sensor, the first pixels and the second pixels are arranged so as not to have translational symmetry in the direction orthogonal to the dividing direction of the pupil regions.
A focus detection device comprising:
the imaging apparatus according to any one of claims 1 to 10; and
a focus detection means that detects the focus adjustment state of the photographic optical system by pupil-division phase-difference detection based on the outputs of the first pixels and the second pixels.
JP2007146819A 2007-06-01 2007-06-01 Imaging apparatus and focus detecting device Pending JP2008299184A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007146819A JP2008299184A (en) 2007-06-01 2007-06-01 Imaging apparatus and focus detecting device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007146819A JP2008299184A (en) 2007-06-01 2007-06-01 Imaging apparatus and focus detecting device

Publications (1)

Publication Number Publication Date
JP2008299184A true JP2008299184A (en) 2008-12-11

Family

ID=40172738

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007146819A Pending JP2008299184A (en) 2007-06-01 2007-06-01 Imaging apparatus and focus detecting device

Country Status (1)

Country Link
JP (1) JP2008299184A (en)


Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010160313A (en) * 2009-01-08 2010-07-22 Sony Corp Imaging element and imaging apparatus
CN103039066B (en) * 2010-06-30 2016-01-27 富士胶片株式会社 Imaging device, image processing apparatus and image processing method
US8902294B2 (en) 2010-06-30 2014-12-02 Fujifilm Corporation Image capturing device and image capturing method
US8933995B2 (en) 2010-06-30 2015-01-13 Fujifilm Corporation Image capturing device and image capturing method
WO2012002297A1 (en) 2010-06-30 2012-01-05 Fujifilm Corporation Imaging device and imaging method
CN104977777A (en) * 2010-06-30 2015-10-14 富士胶片株式会社 Imaging device and imaging method
CN103039066A (en) * 2010-06-30 2013-04-10 富士胶片株式会社 Imaging device, image processing device, and image processing method
US9319659B2 (en) 2010-06-30 2016-04-19 Fujifilm Corporation Image capturing device and image capturing method
WO2012002298A1 (en) 2010-06-30 2012-01-05 Fujifilm Corporation Imaging apparatus and imaging method
JP5451894B2 (en) * 2010-09-22 2014-03-26 富士フイルム株式会社 Stereo imaging device and shading correction method
JP2014098912A (en) * 2010-09-22 2014-05-29 Fujifilm Corp Imaging apparatus and shading correction method
JP2012159533A (en) * 2011-01-28 2012-08-23 Olympus Corp Imaging device and control method of imaging device
US9215389B2 (en) 2011-05-16 2015-12-15 Samsung Electronics Co., Ltd. Image pickup device, digital photographing apparatus using the image pickup device, auto-focusing method, and computer-readable medium for performing the auto-focusing method
CN103748873A (en) * 2011-08-19 2014-04-23 富士胶片株式会社 Image pickup apparatus and shading correction method
JP5499224B2 (en) * 2011-08-19 2014-05-21 Fujifilm Corporation Imaging apparatus and shading correction method
US9549166B2 (en) 2011-08-19 2017-01-17 Fujifilm Corporation Imaging device and shading correction method
WO2013027513A1 (en) 2011-08-19 2013-02-28 Fujifilm Corporation Image pickup apparatus and shading correction method
EP2755086A4 (en) * 2011-09-09 2015-06-03 Fujifilm Corp Stereoscopic image capture device and method
WO2013047160A1 (en) * 2011-09-28 2013-04-04 富士フイルム株式会社 Solid-state image capture element, image capture device, and focus control method
US9154688B2 (en) 2011-09-28 2015-10-06 Fujifilm Corporation Solid-state image capture element, image capture device, and focus control method
JP5613843B2 (en) * 2011-09-28 2014-10-29 Fujifilm Corporation Solid-state imaging device, imaging apparatus, and focusing control method
JPWO2013047160A1 (en) * 2011-09-28 2015-03-26 富士フイルム株式会社 Solid-state imaging device, imaging apparatus, and focusing control method
JP2013081105A (en) * 2011-10-04 2013-05-02 Canon Inc Imaging device and control method thereof
US9521395B2 (en) 2011-10-04 2016-12-13 Canon Kabushiki Kaisha Imaging apparatus and method for controlling same
US9924155B2 (en) 2011-10-04 2018-03-20 Canon Kabushiki Kaisha Imaging apparatus and method for controlling same
US10587860B2 (en) 2011-10-04 2020-03-10 Canon Kabushiki Kaisha Imaging apparatus and method for controlling same
JP2013118459A (en) * 2011-12-02 2013-06-13 Canon Inc Imaging apparatus and control method thereof
US9332205B2 (en) 2014-03-24 2016-05-03 Canon Kabushiki Kaisha Image pickup element, image pickup apparatus, and image processing method
US10084978B2 (en) 2015-02-27 2018-09-25 Canon Kabushiki Kaisha Image capturing apparatus and image processing apparatus
JP2017229075A (en) * 2017-07-13 2017-12-28 キヤノン株式会社 Image processing apparatus and control method therefor, and imaging device

Similar Documents

Publication Publication Date Title
JP2008299184A (en) Imaging apparatus and focus detecting device
JP5264131B2 (en) Imaging device
JP5212396B2 (en) Focus detection device
JP5398346B2 (en) Imaging apparatus and signal processing apparatus
JP5914192B2 (en) Imaging apparatus and control method thereof
JP2009290157A (en) Imaging element, and imaging device
JP4823167B2 (en) Imaging device
WO2016158618A1 (en) Imaging device
US20130250167A1 (en) Image capturing apparatus, image processing method, and storage medium
JP6397281B2 (en) Imaging apparatus, control method thereof, and program
JP5475384B2 (en) Imaging apparatus and imaging method
JP4823168B2 (en) Imaging device
JP6329557B2 (en) Imaging apparatus and electronic apparatus
JP2013021615A (en) Image pickup apparatus
JP4823169B2 (en) Imaging device
JP6190119B2 (en) Image processing apparatus, imaging apparatus, control method, and program
JP2008241872A (en) Light detecting device, focus detecting device, and imaging apparatus
JP6364259B2 (en) Imaging apparatus, image processing method, and image processing program
WO2017170392A1 (en) Optical device
JP2012129804A (en) Imaging apparatus and control method of imaging apparatus
JP5589799B2 (en) Imaging device
JP2017085337A (en) Image processing device, imaging device and image processing program
JP6532411B2 (en) IMAGE PROCESSING DEVICE, IMAGING DEVICE, AND IMAGE PROCESSING PROGRAM
JP2016057402A (en) Imaging device and method for controlling the same
JP5691440B2 (en) Imaging device