JP2010258669A - Omnidirectional imaging apparatus - Google Patents

Omnidirectional imaging apparatus

Info

Publication number
JP2010258669A
Authority
JP
Japan
Prior art keywords
imaging
imaging system
omnidirectional
unit
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2009105136A
Other languages
Japanese (ja)
Inventor
Katsuo Kawamura
佳津男 河村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Priority to JP2009105136A priority Critical patent/JP2010258669A/en
Publication of JP2010258669A publication Critical patent/JP2010258669A/en
Pending legal-status Critical Current

Abstract

PROBLEM TO BE SOLVED: To obtain a good panoramic image from an omnidirectional imaging apparatus in which a plurality of cameras are arranged radially to capture a full 360-degree panorama, even when characteristic differences develop among the cameras over time.

SOLUTION: The omnidirectional imaging apparatus includes: a first imaging system A composed of a plurality of imaging units A1 to A4 arranged so that the edges of the shooting angles of view of adjacent units overlap, which together image a full 360-degree subject; a second imaging system B composed of a plurality of imaging units B1 to B4 each provided adjacent to a respective imaging unit A1 to A4 of the first imaging system A; and driving means for moving the second imaging system B relative to the first imaging system A to change which imaging units of the first imaging system A and of the second imaging system B are paired as adjacent units.

COPYRIGHT: (C)2011,JPO&INPIT

Description

The present invention relates to an omnidirectional imaging apparatus in which a plurality of imaging units are arranged to capture a 360-degree image of the surroundings.

Patent Documents 1 and 2 listed below describe omnidirectional imaging apparatuses in which a plurality of cameras (imaging units) are arranged in a ring to capture a 360-degree panoramic image of the surroundings. Because such an apparatus stitches the images captured by the individual cameras into a panorama, it is preferable to use cameras whose characteristics differ as little as possible, that is, cameras with small individual differences.

However, when an omnidirectional imaging apparatus is used for a long period, the characteristics of the cameras drift apart over time, and in the composited panorama one image region may appear overly bright while another suffers from severe shading.

If the characteristic differences between the cameras could be corrected after a period of use without disassembling the apparatus, images of uniform characteristics could be obtained from all cameras. For example, in the compound-eye camera described in Patent Document 3 below, individual-difference information is read out of the compound-eye camera and used to control it.

That compound-eye camera, however, can determine the individual difference between its two cameras only because they image the same subject. In an omnidirectional imaging apparatus, a camera facing into the sunlight (backlit) and a camera facing away from it (front-lit) capture entirely different subjects at entirely different exposures, so the captured images differ too much for the characteristic difference between the cameras to be determined.

Patent Document 1: JP-A-11-98342
Patent Document 2: JP 2004-61808 A
Patent Document 3: JP-A-11-146425

An object of the present invention is to provide an omnidirectional imaging apparatus in which a plurality of cameras are arranged to capture a 360-degree image of the surroundings and which can obtain omnidirectional images of uniform characteristics even as the cameras age.

The omnidirectional imaging apparatus of the present invention comprises: a first imaging system composed of a plurality of imaging units arranged so that the edges of the shooting angles of view of adjacent imaging units overlap, which together capture a 360-degree subject image; a second imaging system composed of a plurality of imaging units each provided adjacent to a respective imaging unit of the first imaging system; and driving means for moving the second imaging system relative to the first imaging system so as to change the pairing between adjacent imaging units of the first imaging system and the second imaging system.

According to the present invention, even when characteristic differences develop among the imaging units over time, the relative characteristic differences between the units can easily be detected; image correction that absorbs those differences can therefore be performed, and omnidirectional images of consistently good quality can be obtained.

FIG. 1 shows a side view (a) and a plan view (b) of an omnidirectional imaging apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram of the camera shown in FIG. 1.
FIG. 3 is a functional block diagram of the control unit shown in FIG. 2.
FIG. 4 is a flowchart showing the processing procedure of the omnidirectional imaging apparatus of FIG. 1 when detecting individual differences.
FIG. 5 is an explanatory diagram of an omnidirectional imaging apparatus according to another embodiment of the present invention.

An embodiment of the present invention will now be described with reference to the drawings.

FIG. 1(a) is a side view of an omnidirectional imaging apparatus according to an embodiment of the present invention, and FIG. 1(b) is a top view thereof. The omnidirectional imaging apparatus 1 of this embodiment comprises an imaging system A and an imaging system B; the imaging system A has four cameras (imaging units) A1 to A4, and the imaging system B likewise has four cameras (imaging units) B1 to B4.

The imaging system A has a cylindrical housing 2, on whose peripheral wall the cameras A1, A2, A3, and A4 are fixed at 90-degree intervals. The imaging system B likewise has a cylindrical housing 3, around whose periphery the cameras B1, B2, B3, and B4 are fixed at 90-degree intervals. The housing 2 is mounted on the housing 3 so that it can rotate about the shaft 4, as indicated by the double-headed arrow X.

The cameras A1 to A4 of the imaging system A, arranged in a ring radiating from the shaft 4, are installed so that their shooting angles of view overlap at the edges, allowing the four cameras A1 to A4 to photograph the full 360 degrees of subject around the shaft 4. Likewise, the cameras B1 to B4 of the imaging system B are installed so that their shooting angles of view overlap at the edges, allowing the four cameras B1 to B4 to photograph the full 360 degrees around the shaft 4.

In the example shown in FIG. 1(a), the imaging system B is installed on the ground side, and the upper (sky-side) camera A4 and the lower camera B4 form an adjacent pair acting as a compound-eye camera. The incident optical axes of the cameras A4 and B4 are parallel, and their fields of view are offset only by the positional difference between the two optical axes (the camera position difference). The same applies to the other compound-eye camera pairs (A1, B1), (A2, B2), and (A3, B3).

Because the imaging system A is mounted so that it can rotate relative to the imaging system B, rotating the imaging system A by 90 degrees changes the upper/lower camera pairs to (A1, B2), (A2, B3), (A3, B4), (A4, B1). A further 90-degree rotation gives the pairs (A1, B3), (A2, B4), (A3, B1), (A4, B2), and yet another 90-degree rotation gives (A1, B4), (A2, B1), (A3, B2), (A4, B3).

In other words, every combination of a camera A1 to A4 of the imaging system A with a camera B1 to B4 of the imaging system B can be realized. As noted above, the cameras of a pair have parallel incident optical axes and fields of view offset only by the camera position difference (the positional difference between the cameras A4 and B4 in FIG. 1(a)), so for the most part they photograph the same subject. The characteristic difference between the two cameras can therefore be determined from the difference between their captured images.
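
As a quick illustration of the pairing scheme described above, the following minimal Python sketch (not part of the patent) models the two four-camera rings as lists and shows how each 90-degree relative rotation shifts the pairing, so that four rotation steps cover every A-B combination; the list names and indexing are assumptions made for the example.

```python
# Cameras of the two imaging systems, indexed in ring order.
A = ["A1", "A2", "A3", "A4"]
B = ["B1", "B2", "B3", "B4"]

def pairs_at_rotation(step):
    """Return the (A, B) camera pairs after rotating system A by step * 90 degrees."""
    n = len(A)
    return [(A[i], B[(i + step) % n]) for i in range(n)]

for step in range(4):
    print(step * 90, pairs_at_rotation(step))
# 0   [('A1','B1'), ('A2','B2'), ('A3','B3'), ('A4','B4')]
# 90  [('A1','B2'), ('A2','B3'), ('A3','B4'), ('A4','B1')]
# 180 [('A1','B3'), ('A2','B4'), ('A3','B1'), ('A4','B2')]
# 270 [('A1','B4'), ('A2','B1'), ('A3','B2'), ('A4','B3')]
```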

The present invention is not limited to the embodiment in which 360-degree images are captured with the four cameras shown in FIG. 1(b); for example, eight cameras may be installed at 45-degree intervals, or twelve cameras at 30-degree intervals, but for ease of explanation this embodiment uses four. It is also possible to capture 360-degree images by placing two cameras with fisheye lenses back to back.

FIG. 2 is a configuration diagram of the camera A1 shown in FIG. 1; the other cameras A2 to A4 and B1 to B4 have the same configuration. The camera A1 includes a solid-state image sensor 11, in front of which a diaphragm 12 is placed, with a photographing lens 13 in front of that. The captured image signal output from the solid-state image sensor 11 passes through a CDS/VGA unit 14 (CDS: correlated double sampling, VGA: variable gain amplifier), an analog-to-digital (A/D) converter 15, and a white balance (WB) calculation unit 16, and is then input to the control unit 9. In this embodiment the control unit 9 is shared by the eight cameras A1 to A4 and B1 to B4, and this single control unit 9 controls the entire omnidirectional imaging apparatus 1.

The camera A1 further includes a timing generator (TG) 17 that generates various timing pulses based on instruction signals from the control unit 9, a diaphragm drive circuit 18 that adjusts the opening of the diaphragm 12 based on instruction signals from the control unit 9, a lens drive circuit 19 that adjusts the focal position of the photographing lens 13 and the like based on instruction signals from the control unit 9, and an image sensor drive circuit 20 that drives the image sensor 11 with drive pulses output from the timing generator 17. The CDS/VGA unit 14, the A/D converter 15, and the WB calculation unit 16 operate according to the timing pulses output from the timing generator 17.

FIG. 3 is a functional block diagram of the control unit 9. The control unit 9 comprises: a central processing unit (CPU) 9a that manages and controls the entire omnidirectional imaging apparatus 1; a rotation control unit 9b that, on instructions from the CPU 9a, rotates the imaging system A relative to the imaging system B in predetermined angular steps; a display unit 9c that displays the processed image data captured by each camera as well as menu screens, operating modes, and so on; a drive signal generation unit 9d that generates drive signals for the image sensor 11 and the like and outputs them to the timing generator 17 of each camera; an operation unit 9e through which an operator enters instructions; and a memory 9f used to store the adjustment parameters and captured image data of each camera and for image processing.

The control unit 9 further comprises: a white balance control unit 9g that controls the white balance across the images output by the eight cameras A1 to A4 and B1 to B4 as a whole; an exposure control unit 9h that determines the exposure value of each camera in a coordinated manner and outputs it to the diaphragm drive circuit 18; a focus control unit 9i that determines the focus position of each camera in a coordinated manner and outputs it to the lens drive circuit 19; a timer unit 9j that keeps time; and a digital signal processing unit 9k that applies well-known image processing such as gamma correction, offset correction, and RGB/YC conversion to the captured image signal output from each camera, synthesizes a 360-degree panoramic image, and compresses it into a JPEG or MPEG image or the like.

In the omnidirectional imaging apparatus 1 of this embodiment, the CDS/VGA unit 14, the A/D converter 15, ..., the lens drive circuit 19, and so on are provided inside each camera as shown in FIG. 2; however, if all cameras are to be controlled identically in order to reduce component cost, these circuits can instead be provided inside the control unit 9. The image sensor drive circuit 20, being susceptible to noise, is best placed near the image sensor 11 inside each camera.

When manufacturing the omnidirectional imaging apparatus 1, it is usual to select and mount eight image sensors 11 with identical characteristics for the eight cameras. The relevant characteristics include sensitivity, noise, luminance shading, and color shading.

In practice, however, the characteristics never match perfectly, and even if image sensors 11 with perfectly matched characteristics were used, long-term use of the omnidirectional imaging apparatus 1 as a fixed-point camera, surveillance camera, or the like causes the characteristics of the eight cameras to drift apart over time. For example, a camera that frequently receives direct sunlight tends to suffer more degradation of its image sensor 11 from ultraviolet light than a camera that does not.

Nevertheless, even when the characteristics of the eight image sensors 11 diverge and individual differences arise, applying an appropriate correction based on those individual differences to the captured image signal output from each image sensor 11 prevents the image from a degraded sensor from standing out as a degraded region in the 360-degree panoramic image.

For sensitivity, for example, correction is performed by multiplying the gain of the VGA unit 14 by a coefficient and adjusting that coefficient to cancel the sensitivity difference. Noise is corrected by adjusting the parameter that determines the strength of noise reduction in the image signal processing. Luminance shading and color shading are corrected by multiplying the output value at each prescribed position within the effective image area by a fixed value; the area is usually subdivided into m x n regions (m and n being integers), and the multiplier for each point is stored in memory as two-dimensional map data. The coefficients and parameters used in this image correction, the fixed multipliers, and so on are referred to below as adjustment parameters.
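
To make the adjustment parameters more concrete, here is a hedged sketch of the two correction styles the paragraph mentions: a single gain coefficient for a sensitivity difference and an m x n two-dimensional map for luminance shading. The function names, array shapes, and nearest-neighbour up-sampling are illustrative assumptions, not the patent's prescribed implementation.

```python
import numpy as np

def correct_sensitivity(image, gain_coeff):
    """Sensitivity correction: scale the whole frame by an adjustment coefficient."""
    return image * gain_coeff

def correct_shading(image, gain_map):
    """Shading correction: multiply each region by its entry in an m x n gain map.

    gain_map has shape (m, n); it is expanded to the image size so every pixel in
    a region is multiplied by that region's stored fixed value.
    """
    h, w = image.shape
    m, n = gain_map.shape
    rows = np.arange(h) * m // h          # nearest-neighbour expansion of the coarse map
    cols = np.arange(w) * n // w
    full_map = gain_map[np.ix_(rows, cols)]
    return image * full_map

# Example: a 4 x 4 map that brightens the corners to compensate vignetting.
img = np.full((480, 640), 100.0)
gmap = np.ones((4, 4))
gmap[0, 0] = gmap[0, -1] = gmap[-1, 0] = gmap[-1, -1] = 1.2
corrected = correct_shading(correct_sensitivity(img, 1.05), gmap)
```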

To perform image correction based on such individual differences, the individual differences of the image sensors 11 must be detected accurately. In the omnidirectional imaging apparatus 1 of this embodiment, the imaging systems A and B are arranged as a pair and can rotate relative to each other, so the combination of paired cameras, and hence of paired image sensors, can be changed; this makes it possible to detect even individual differences that develop between the image sensors 11 over time.

FIG. 4 is a flowchart showing the driving method used to detect characteristic differences (individual differences) between the image sensors and to perform image correction. When the system is started to operate the omnidirectional imaging apparatus 1, the individual differences of the initial camera pairs, here (A1, B1), (A2, B2), (A3, B3), (A4, B4), are first corrected (step S1), after which the apparatus proceeds to the imaging operation (step S2).

In step S1, the cameras of each pair, for example A1 and B1, image the same subject at the same brightness and the same size, so comparing the two captured images reveals the individual difference between the image sensor of the camera A1 and the image sensor of the camera B1, and a correction that absorbs this difference becomes possible. The same applies to the other camera pairs.
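
A minimal sketch of how such a relative difference might be estimated from one pair, assuming the two frames have already been cropped and aligned to their common field of view so that the remaining level difference can be attributed to the sensors themselves; the function name and the use of a simple mean ratio are assumptions for illustration.

```python
import numpy as np

def relative_gain(image_a, image_b, eps=1e-6):
    """Estimate the sensitivity ratio of camera A to camera B from one shared scene."""
    return float(np.mean(image_a)) / (float(np.mean(image_b)) + eps)

# If cameras A1 and B1 photograph the same subject and the ratio is about 1.1,
# the A1 sensor appears roughly 10% more sensitive; storing 1/1.1 as A1's
# adjustment parameter absorbs that individual difference.
```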

There are also individual differences among the cameras A1, A2, A3, A4 and among B1, B2, B3, B4, but because image sensors with small characteristic differences are selected and used at the start of operation, these differences are small. Even if such differences exist, executing the following steps S2 to S10 makes it possible to correct for them.

In the next step S3, it is determined, based on the time measured by the timer unit 9j, whether a predetermined time has elapsed since the omnidirectional imaging apparatus 1 started operating, or whether a change over time has been detected in the characteristic difference between the imaging systems. If the predetermined time has not elapsed and no change over time is found, the process returns to step S2 and imaging continues.

If the determination in step S3 finds that the predetermined time has elapsed since operation began, or that a change over time in the characteristic difference exceeding a threshold has occurred, the process advances to step S4, where the imaging system A is rotated 90 degrees relative to the imaging system B to change the camera pairs. Since the camera pairs in the initial state of step S1 (the upper/lower pairs shown in FIG. 1(a)) are (A1, B1), (A2, B2), (A3, B3), (A4, B4), step S4 changes them to (A1, B2), (A2, B3), (A3, B4), (A4, B1).

When the imaging operation is then performed in step S5, each camera pair images the same subject; that is, the paired cameras A1 and B2 capture images of the same subject at the same brightness and the same size, and likewise for the other pairs. The individual difference between the image sensor of the camera A1 and that of the camera B2 can thus be detected by comparing their captured images, and the individual difference between the image sensors of the cameras A2 and B3 can be detected in the same way. The same applies to the remaining pairs.

In the next step S6, the imaging system A is rotated a further 90 degrees relative to the imaging system B, making the camera pairs (A1, B3), (A2, B4), (A3, B1), (A4, B2). Performing the imaging operation in step S7 then allows the individual difference between the image sensors of the cameras A1 and B3, between those of the cameras A2 and B4, and so on, to be detected.

In the next step S8, the imaging system A is rotated a further 90 degrees relative to the imaging system B, making the camera pairs (A1, B4), (A2, B1), (A3, B2), (A4, B3); performing the imaging operation in step S9 allows the individual differences between the image sensors of the cameras A1 and B4, between those of the cameras A2 and B1, and so on, to be detected.

In the next step S10, the captured images of each pair obtained in steps S1, S5, S7, and S9 are compared, a correction that absorbs the individual differences of all the image sensors is performed, and the process returns to step S2. Step S10 is processed by the CPU 9a using the digital signal processing unit 9k under its control; the adjustment parameters obtained for this correction are stored in the memory 9f, or updated each time step S10 is executed, and are used when the system is restarted and on similar occasions.
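
The overall loop of steps S1 to S10 could be organized along the following lines. This is a speculative sketch only: the `device` object, and methods such as `rotate_system_a`, `capture_pairs`, and `update_adjustment_parameters`, are placeholders for whatever hardware interface an implementation would actually expose, and the recalibration period is an assumed value.

```python
import time

ROTATION_STEPS = 4                  # 90-degree steps for four cameras per system
RECALIBRATION_PERIOD = 24 * 3600    # assumed: recalibrate once a day

def calibration_cycle(device):
    """One pass over steps S4-S10: rotate, capture, compare, then correct."""
    observations = []
    for _ in range(ROTATION_STEPS - 1):              # S4/S6/S8: three further 90-degree turns
        device.rotate_system_a(degrees=90)
        observations.extend(device.capture_pairs())  # S5/S7/S9: each pair shares one subject
    device.update_adjustment_parameters(observations)  # S10: absorb individual differences
    device.rotate_system_a(degrees=90)               # assumed: return A to its home position

def run(device):
    device.calibrate_initial_pairs()                 # S1: (A1,B1)...(A4,B4)
    last = time.time()
    while True:
        device.capture_panorama()                    # S2: normal imaging
        drifted = device.pair_difference_exceeds_threshold()
        if time.time() - last > RECALIBRATION_PERIOD or drifted:  # S3
            calibration_cycle(device)
            last = time.time()
```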

For example, with the imaging system A alone, it is not easy to determine the individual difference between the image sensor 11 of the camera A1 and the image sensor 11 of the camera A3 located 180 degrees away, and the same is true of the conventional omnidirectional imaging apparatuses described in Patent Documents 1 and 2. In this embodiment, however, if the cameras A1 and B1 are paired to image the same subject and the cameras A3 and B1 are then paired to image the same subject, the individual difference between the cameras A1 and A3 can be detected.

Thus, because this embodiment is configured so that every combination of a camera of the imaging system A with a camera of the imaging system B can be realized, correction that absorbs the individual differences of the image sensors used in the cameras becomes possible.

The correction that absorbs the individual differences of the image sensors may, for example, take the characteristics of the image sensor mounted in the camera A1 as a reference and adjust the characteristics of the other cameras A2 to A4 and B1 to B4 to that reference, or it may take the average characteristics of the image sensors of the eight cameras as the reference and correct each camera toward that average.
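
Either baseline reduces to computing one correction gain per camera. A small sketch, assuming per-camera sensitivity estimates are already available from the pairwise comparisons; the function name and dictionary layout are illustrative.

```python
def normalization_gains(sensitivities, reference=None):
    """Compute per-camera correction gains.

    sensitivities: dict like {"A1": 1.00, "A2": 1.04, ..., "B4": 0.97}
    reference: camera name to use as the baseline; if None, the average of
               all cameras is used instead.
    """
    base = sensitivities[reference] if reference else (
        sum(sensitivities.values()) / len(sensitivities))
    return {cam: base / s for cam, s in sensitivities.items()}

# Match every camera to A1:        normalization_gains(est, reference="A1")
# Pull all cameras to the average: normalization_gains(est)
```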

In the omnidirectional imaging apparatus 1 of the embodiment shown in FIG. 1, the two cameras of a pair are arranged one above the other as a stereo camera, so the distance to the subject can be detected accurately from the parallax between the cameras. However, because the pair is oriented vertically, an operator viewing the pair's captured images cannot perceive a stereoscopic image of the subject, unlike with an ordinary stereo camera.
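
The distance estimate follows the usual parallel-axis stereo relation Z = f * b / d (focal length times baseline over disparity). This formula is standard stereo geometry rather than something stated in the patent, and the sketch below assumes the disparity has already been measured in pixels.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Distance to the subject for a parallel-axis stereo pair: Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Example: f = 1200 px, baseline 0.10 m, disparity 24 px -> 5.0 m to the subject.
print(depth_from_disparity(1200, 0.10, 24))
```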

FIG. 5 therefore shows an embodiment in which a stereoscopic image can be viewed. In the omnidirectional imaging apparatus of the embodiment shown in FIG. 5(a), the camera pairs are (A1, B1), (A2, B2), (A3, B3), (A4, B4), and the two cameras of each pair are arranged side by side with their incident optical axes parallel.

For example, the pair (A1, B1) faces east, the pair (A2, B2) faces south, the pair (A3, B3) faces west, and the pair (A4, B4) faces north, and the angle of view of each camera is set so that a 360-degree panoramic image can be synthesized from the four cameras A1 to A4 of the imaging system A alone, and likewise from the four cameras B1 to B4 of the imaging system B alone.

The four ring-shaped, radially arranged cameras B1 to B4 of the imaging system B are fixed to the installation base (ground side), and the four ring-shaped, radially arranged cameras A1 to A4 of the imaging system A are connected to it through a vertical movement/rotation mechanism 25. The imaging system A is raised as indicated by the arrow L in FIG. 5(b), then rotated 90 degrees as indicated by the arrow M, and lowered as indicated by the arrow N in FIG. 5(c), whereby the paired cameras are changed. In the same way as in the embodiment of FIG. 1, the relative individual differences of the image sensors mounted in the cameras can thus be detected easily during the imaging operation, without disassembling the omnidirectional imaging apparatus, and image correction that absorbs the individual differences can be performed, avoiding degradation of the quality of the images captured by the omnidirectional imaging apparatus.

As described above, the omnidirectional imaging apparatus of the embodiment comprises: a first imaging system composed of a plurality of imaging units arranged so that the edges of the shooting angles of view of adjacent imaging units overlap, which together capture a 360-degree subject image; a second imaging system composed of a plurality of imaging units each provided adjacent to a respective imaging unit of the first imaging system; and driving means for moving the second imaging system relative to the first imaging system so as to change the pairing between adjacent imaging units of the first imaging system and the second imaging system.

In the omnidirectional imaging apparatus of the embodiment, the driving means moves the position of a given imaging unit of the first imaging system relatively and in sequence so that it is paired in turn with every imaging unit of the second imaging system.

The omnidirectional imaging apparatus of the embodiment also comprises a control unit that changes the pair combinations to determine the relative characteristic differences between the imaging units and performs image correction that absorbs those differences.

The omnidirectional imaging apparatus of the embodiment also comprises storage means for saving the adjustment parameters obtained by the control unit for the image correction.

The omnidirectional imaging apparatus of the embodiment performs the processing for determining the characteristic differences at predetermined time intervals.

The omnidirectional imaging apparatus of the embodiment performs the image correction when it detects that a characteristic difference equal to or greater than a threshold has arisen between the imaging units constituting a pair.

In the omnidirectional imaging apparatus of the embodiment, the incident optical axis of the imaging unit of the first imaging system and the incident optical axis of the imaging unit of the second imaging system that constitute a pair are parallel.

In the omnidirectional imaging apparatus of the embodiment, the imaging unit of the first imaging system and the imaging unit of the second imaging system that constitute a pair capture a stereoscopic image.

In the omnidirectional imaging apparatus of the embodiment, the distance to a subject is determined from the image captured by the imaging unit of the first imaging system and the image captured by the imaging unit of the second imaging system that constitute a pair.

According to the embodiments described above, even when characteristic differences develop among a plurality of cameras (imaging units) over time, the relative characteristic differences between the cameras can easily be detected; image correction that absorbs those differences can therefore be performed, and omnidirectional images of consistently good quality can be obtained.

Because the omnidirectional imaging apparatus of the present invention can easily detect and correct characteristic differences that arise between the image sensors of the cameras it uses, it can always obtain omnidirectional images of good quality, and it is useful when applied to fixed-point cameras, surveillance cameras, and the like.

1, 23: omnidirectional imaging apparatus
2: movable-side cylindrical housing
3: fixed-side cylindrical housing
4: rotation shaft
9: control unit
9a: CPU
9b: rotation control unit
11: image sensor
13: photographing lens
25: vertical movement/rotation mechanism

Claims (9)

1. An omnidirectional imaging apparatus comprising: a first imaging system composed of a plurality of imaging units arranged so that the edges of the shooting angles of view of adjacent imaging units overlap, which together capture a 360-degree subject image; a second imaging system composed of a plurality of imaging units each provided adjacent to a respective imaging unit of the first imaging system; and driving means for moving the second imaging system relative to the first imaging system so as to change the pairing between adjacent imaging units of the first imaging system and the second imaging system.

2. The omnidirectional imaging apparatus according to claim 1, wherein the driving means moves the position of a given imaging unit of the first imaging system relatively and in sequence so that it is paired in turn with every imaging unit of the second imaging system.

3. The omnidirectional imaging apparatus according to claim 1 or 2, further comprising a control unit that changes the pair combinations to determine the relative characteristic differences between the imaging units and performs image correction that absorbs those differences.

4. The omnidirectional imaging apparatus according to claim 3, further comprising storage means for saving the adjustment parameters obtained by the control unit for the image correction.

5. The omnidirectional imaging apparatus according to claim 3 or 4, wherein the processing for determining the characteristic differences is performed at predetermined time intervals.

6. The omnidirectional imaging apparatus according to claim 3 or 4, wherein the image correction is performed when a characteristic difference equal to or greater than a threshold is detected between the imaging units constituting a pair.

7. The omnidirectional imaging apparatus according to any one of claims 1 to 6, wherein the incident optical axis of the imaging unit of the first imaging system and the incident optical axis of the imaging unit of the second imaging system constituting a pair are parallel.

8. The omnidirectional imaging apparatus according to claim 7, wherein the imaging unit of the first imaging system and the imaging unit of the second imaging system constituting a pair capture a stereoscopic image.

9. The omnidirectional imaging apparatus according to claim 7, wherein the distance to a subject is determined from the image captured by the imaging unit of the first imaging system and the image captured by the imaging unit of the second imaging system constituting a pair.
JP2009105136A 2009-04-23 2009-04-23 Omnidirectional imaging apparatus Pending JP2010258669A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009105136A JP2010258669A (en) 2009-04-23 2009-04-23 Omnidirectional imaging apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009105136A JP2010258669A (en) 2009-04-23 2009-04-23 Omnidirectional imaging apparatus

Publications (1)

Publication Number Publication Date
JP2010258669A true JP2010258669A (en) 2010-11-11

Family

ID=43319108

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009105136A Pending JP2010258669A (en) 2009-04-23 2009-04-23 Omnidirectional imaging apparatus

Country Status (1)

Country Link
JP (1) JP2010258669A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013165006A1 (en) * 2012-05-01 2013-11-07 セントラルエンジニアリング株式会社 Stereo camera and stereo camera system
JPWO2013165006A1 (en) * 2012-05-01 2015-12-24 セントラルエンジニアリング株式会社 Stereo camera and stereo camera system
US10594950B2 (en) 2014-06-23 2020-03-17 Sony Corporation Photographing apparatus
CN104506761A (en) * 2014-12-20 2015-04-08 中国地质大学(武汉) 360-degree panoramic stereoscopic camera
CN105739231A (en) * 2016-05-06 2016-07-06 中国科学技术大学 Multi-camera panorama stereo imaging device of planar distribution
CN106647149A (en) * 2016-10-20 2017-05-10 北京机灵科技有限公司 Compact and intensive lens group device
TWI608288B (en) * 2017-02-20 2017-12-11 奇鋐科技股份有限公司 Panoramic camera device
DE102017104777A1 (en) 2017-03-07 2018-09-13 Rheinmetall Defence Electronics Gmbh Camera wreath for creating a panoramic picture
US10523918B2 (en) 2017-03-24 2019-12-31 Samsung Electronics Co., Ltd. System and method for depth map

Similar Documents

Publication Publication Date Title
US8890972B2 (en) Image capturing apparatus and image processing method
JP2010258669A (en) Omnidirectional imaging apparatus
US20210092304A1 (en) Image pickup apparatus, an image processing method and a non-transitory computer-readable medium
JP2008241491A (en) Three-dimensional measurement instrument
JPWO2012111220A1 (en) Imaging apparatus, imaging apparatus main body, and shading correction method
JP2007097085A (en) Digital camera
US11010030B2 (en) Electronic apparatus capable of performing display control based on display mode, control method thereof, and non-transitory computer readable medium
JP2012226184A5 (en) Imaging apparatus and control method thereof
JP5586796B2 (en) Imaging device, control method thereof, interchangeable lens and interchangeable lens imaging device body
JP6539075B2 (en) Imaging device, control method therefor, program, storage medium
US8712231B2 (en) Camera body, and camera system
JP2013219675A (en) Imaging device and control method for imaging device
JP2004361611A (en) Solid-state image pickup element and photographing device
JP2011211329A (en) Imaging apparatus and control method thereof, image processing apparatus and control method thereof, and image processing program
JP2010135984A (en) Compound-eye imaging apparatus and imaging method
JP2010035131A (en) Imaging apparatus and imaging method
JP2012095116A (en) Imaging device and control method of the same
US9967452B2 (en) Imaging apparatus and imaging method for controlling auto-focus
JP2008283477A (en) Image processor, and image processing method
JP5597273B2 (en) Imaging apparatus and image processing method
JP5755311B2 (en) Information processing apparatus, photographing apparatus, and control method for information processing apparatus
JP6257316B2 (en) Imaging apparatus and control method thereof
JP6355324B2 (en) Imaging apparatus and control method thereof
JP6479149B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND IMAGE PROCESSING DEVICE
JP6902921B2 (en) Imaging equipment, control methods, and programs

Legal Events

Date Code Title Description
RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20111216