JPH05344422A - Compound eye image pickup device - Google Patents

Compound eye image pickup device

Info

Publication number
JPH05344422A
JPH05344422A JP4152214A JP15221492A
Authority
JP
Japan
Prior art keywords
image pickup
image
point
position information
object plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP4152214A
Other languages
Japanese (ja)
Other versions
JP2974500B2 (en)
Inventor
Tatsuji Katayama
達嗣 片山
Shigeyuki Suda
繁幸 須田
Yukichi Niwa
雄吉 丹羽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to JP4152214A priority Critical patent/JP2974500B2/en
Priority to AT93104677T priority patent/ATE158129T1/en
Priority to EP93104677A priority patent/EP0563737B1/en
Priority to DE69313694T priority patent/DE69313694T2/en
Priority to US08/036,079 priority patent/US5668595A/en
Publication of JPH05344422A publication Critical patent/JPH05344422A/en
Application granted granted Critical
Publication of JP2974500B2 publication Critical patent/JP2974500B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Abstract

PURPOSE: To obtain a high-precision image synthesized from plural original pictures, by providing the device with a means for combining the images output from the image pickup systems on the basis of the imaging conditions and position information. CONSTITUTION: A common object plane 1, first and second image pickup optical systems 102, 202 of equivalent specification (in general, zoom lenses), and image sensors 103, 203 of equivalent specification (an image pickup tube such as a Saticon, or a solid-state image pickup element such as a CCD) are arranged. The optical axes 101, 201 pass through and intersect at a point O on the object plane 1, each inclined at an angle θ symmetrically with respect to the normal O-O′ of the object plane 1. The angle 2θ is referred to as the convergence angle, and images are taken while changing this convergence angle in response to changes in the object distance S. The images output from the first and second image pickup systems 102, 202 are then combined on the basis of the imaging conditions and the position information.

Description

Detailed Description of the Invention

[0001]

[Industrial Field of Application] The present invention relates to an image pickup apparatus, and more particularly to an apparatus that provides a single high-definition image by electrically correcting and combining at least two images obtained through at least two sets of image pickup optical systems.

[0002]

[Prior Art] As a principle for obtaining a single high-definition image by combining two images obtained through two sets of image pickup optical systems, a method such as that disclosed in Preprint 90-03-04 (pp. 23-28) of the Institute of Image Electronics Engineers of Japan is known.

[0003] This method exploits the fact that, when the sampling points of the image sensors used in the two sets of image pickup optical systems are virtually projected onto a common subject, they differ from each other in spatial phase; by combining the two image signals, a single high-definition image is obtained. FIG. 9 shows a conceptual diagram of this principle. In FIG. 9, 811 and 821 are image pickup optical systems, and 812 and 822 are the images output by the respective systems. At 801 the image signals are combined to obtain a high-definition image 802.

[0004] However, since the conventional arrangement necessarily has a convergence angle, registration errors can occur between the images obtained from the image pickup optical systems. FIG. 10(A) outlines the registration error in a compound-eye image pickup system: 910 and 920 are image pickup optical systems (lenses), and 911 and 921 are image sensors. Let the optical axes of the image pickup optical systems 910 and 920 be 912 and 922, respectively. When the subject is imaged with the optical axes 912 and 922 each inclined by θ in the xz plane with respect to the central axis O-O′, take an arbitrary object point P on the subject plane. If the image points of P on the image sensors 911 and 921 are R′ and L′, respectively, then R′ ≠ L′, so a registration error occurs; as shown in FIG. 10(B), in the composite image 930 obtained by simple addition the image of the object point P is doubled, and a high-precision image can no longer be provided.

[0005]

[Problems to Be Solved by the Invention] An object of the present invention is to increase the definition of an image synthesized from a plurality of original images.

[0006]

[Means for Solving the Problems] According to the present invention, an apparatus that images a common subject using a plurality of image pickup systems comprises means for detecting the image pickup conditions of the image pickup systems, means for detecting position information of the subject from the image signals output from each image pickup system, and means for combining the images output from the image pickup systems on the basis of the image pickup conditions and the position information.

[0007]

[Embodiments] FIG. 1 shows the basic arrangement of a compound-eye image pickup system according to the present invention. In the figure, 1 is a common subject plane, and 102 and 202 are first and second image pickup optical systems of equivalent specification; in general, zoom lenses are used, as described later. 103 and 203 are image sensors, likewise of equivalent specification; an image pickup tube such as a Saticon or a solid-state image pickup element such as a CCD may be used. For simplicity, a single-plate (or single-tube) type is shown schematically here, but generality is not lost with a two-plate (two-tube) or three-plate (three-tube) type using a color separation optical system.

[0008] The optical axes 101 and 201 intersect at a point O on the subject plane 1; each passes through O, inclined by θ symmetrically with respect to the normal O-O′ of the subject plane 1. The angle 2θ is defined as the convergence angle, and images are taken while changing this convergence angle in accordance with changes in the subject distance S.

[0009] FIG. 2 shows a concrete structure of the image pickup optical systems 102 and 202, and FIG. 3 shows the functions of the members constituting them as a block diagram. 102a, 102b, 102c, 102d and 202a, 202b, 202c, 202d are the lens groups constituting the first and second image pickup optical systems 102 and 202; in particular, 102b and 202b are the zooming groups and 102d and 202d the focusing groups. 106 and 206 are drive systems (zoom motors) for driving the zooming groups 102b and 202b, and 107 and 207 are drive systems (focus motors) for driving the focusing groups 102d and 202d. In addition, 102 with 103 and 202 with 203 are each provided, as a unit, with a mechanism (not shown) that rotates in the plane containing the optical axes 101 and 201 and with drive systems (convergence angle motors) 104 and 204.

[0010] 105 and 205 are angle encoders that measure the rotation angles of the image pickup optical systems 102 and 202. 108 and 208 are zoom encoders that measure the movement of the zooming groups 102b and 202b to obtain the zoom ratio. 109 and 209 are focus encoders that measure the positions of the focusing groups. 110 and 210 are video signals output from the image sensors 103 and 203 and stored in the image memories 111 and 211.

[0011] The operation of the arithmetic control unit 12, the correlation calculation unit 13, and the correction calculation unit 14 will be described later.

[0012] Next, the detection of the position information of the subject when imaging in the arrangement shown in FIG. 4 is described.

[0013] As shown in FIG. 4, the x-, y-, and z-axes are defined with the aforementioned point O on the subject plane as the origin.

[0014] Let QR and QL be the intersections of the image pickup optical systems 102 and 202 with their respective optical axes 101 and 201, let S0 be the distance from the front principal points of the image pickup optical systems (lenses) 102 and 202 to the aforementioned point O, and let S0′ be the distance from their rear principal points to the respective image sensors 103 and 203. The derivation of the coordinates P2(x0, z0) of a point on the subject plane 1 within the xz plane shown in FIG. 5 is now briefly described.

[0015] Writing the intersections as QR(x1, −z1) and QL(−x1, −z1), their coordinates x1 and z1 can be expressed geometrically from the imaging condition S0 and the angle θ as

x1 = S0 sinθ   (1-a)
z1 = S0 cosθ   (1-b)

[0016] Likewise, for the intersections OR′(x2, −z2) and OL′(−x2, −z2) of the image sensors 103 and 203 with the optical axes 101 and 201:

x2 = (S0 + S0′) sinθ   (2-a)
z2 = (S0 + S0′) cosθ   (2-b)

[0017] As shown in FIG. 5, let the image points on the image sensors 103 and 203 for an object point P2(x0, z0) on the subject be PR′(xR, −zR) and PL′(−xL, −zL), and let the image heights on the image sensors 103 and 203 be xR′ and xL′, respectively. Geometrically,

xR = (S0 + S0′) sinθ − xR′ cosθ   (3-a)
zR = (S0 + S0′) cosθ + xR′ sinθ   (3-b)
xL = (S0 + S0′) sinθ + xL′ cosθ   (3-c)
zL = (S0 + S0′) cosθ − xL′ sinθ   (3-d)

[0018] Let fR(S0, S0′, θ, xR′) denote the straight line through the points PR′ and QR, and fL(S0, S0′, θ, xL′) the straight line through the points PL′ and QL. By definition, the point P2(x0, z0) on the subject plane is the intersection of these two lines.

[0019] As shown in FIG. 4, y0 can also be obtained, with respect to the image pickup optical system 102, as

y0 = yR′ SR′ / SR   (4)

where SR is the distance from the object point P2(x0, z0) in FIG. 5 to the front principal point of the image pickup optical system 102, and SR′ is the distance from the rear principal point of the image pickup optical system 102 to the image point PR′ on the image sensor 103.

[0020] The point P(x0, y0, z0) in FIG. 4 can then be expressed as

P = f(S0, S0′, θ, xR′, xL′, yR(L)′)   (5)

that is, as a function of the imaging conditions (S0, S0′, θ) and of the output images of the image sensors (xR′, xL′, yR(L)′); by detecting each of these parameters, the position information of the subject plane can be obtained.
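As a concrete illustration of equations (1-a) through (3-d) and the line intersection of paragraph [0018], the following sketch triangulates P2(x0, z0) in the xz plane. The function and variable names and the numeric values are our own illustrative choices; only the geometry is taken from the text.

```python
import math

def _intersect(p1, p2, p3, p4):
    """Intersection of the line through p1, p2 with the line through p3, p4."""
    d1 = (p2[0] - p1[0], p2[1] - p1[1])
    d2 = (p4[0] - p3[0], p4[1] - p3[1])
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def object_point_xz(S0, S0p, theta, xRp, xLp):
    """Recover P2(x0, z0) from the image heights xR', xL' via the
    intersection of the lines f_R and f_L of paragraph [0018]."""
    # front principal points Q_R(x1, -z1), Q_L(-x1, -z1)  -- eqs (1-a), (1-b)
    x1, z1 = S0 * math.sin(theta), S0 * math.cos(theta)
    # image points P_R'(xR, -zR), P_L'(-xL, -zL)          -- eqs (3-a)..(3-d)
    xR = (S0 + S0p) * math.sin(theta) - xRp * math.cos(theta)
    zR = (S0 + S0p) * math.cos(theta) + xRp * math.sin(theta)
    xL = (S0 + S0p) * math.sin(theta) + xLp * math.cos(theta)
    zL = (S0 + S0p) * math.cos(theta) - xLp * math.sin(theta)
    # P2 is the intersection of f_R (through P_R', Q_R) and f_L (P_L', Q_L)
    return _intersect((xR, -zR), (x1, -z1), (-xL, -zL), (-x1, -z1))
```

For zero image heights both lines pass through the axis crossing point O, so the result is the origin; for mirrored heights xR′ = −xL′ the two lines are mirror images of each other and the recovered point lies on the symmetry axis x = 0.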

[0021] How the above quantities are obtained is explained with reference to FIGS. 2 and 3. First, the convergence angle 2θ is detected by the rotation angle detecting means 105 and 205, such as rotary encoders. The encoders (zoom encoders) 108 and 208, provided to give the positions along the optical axis of the lens groups in the zooming groups 102b and 202b, are used to obtain the focal length f of the image pickup optical systems 102 and 202 from their signals. Similarly, 109 and 209 are encoders (focus encoders) for obtaining the positions along the optical axis of the lens groups in the focusing groups 102d and 202d; these may be external members such as potentiometers, or the drive system itself, such as a pulse motor, may supply the position of the lens along the optical axis from its drive method. The arithmetic control unit 12 then obtains the subject distance S0 of the image pickup optical systems 102 and 202 from the signals of the focus encoders 109 and 209 and, combined with the focal length f above, the lens back S0′ of the image pickup optical systems 102 and 202. By separately controlling the drive systems 106, 107, 206 and 207 based on the signals from the encoders 108, 109, 208 and 209, the focal lengths f and the lens backs S0′ of the two image pickup optical systems 102 and 202 are kept equal at all times.
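The text states only that the subject distance S0 follows from the focus encoder and that, together with the focal length f from the zoom encoder, the lens back S0′ is determined; the mapping itself is not given. A minimal sketch under an ideal thin-lens assumption (our simplification, not the patent's optics):

```python
def lens_back(f, S0):
    """Lens back S0' for focal length f and subject distance S0 under the
    thin-lens relation 1/S0 + 1/S0' = 1/f (an assumed model; a real zoom
    lens would use calibration tables indexed by the encoder readings)."""
    return 1.0 / (1.0 / f - 1.0 / S0)
```

For a distant subject the lens back approaches f, e.g. lens_back(50.0, 5000.0) stays close to 50.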

[0022] As described above, the arithmetic control unit 12 obtains, from the signals of the encoders 105, 108, 109, 205, 208 and 209 provided in the respective mechanisms, S0 (the distance from the front principal point of each image pickup optical system (lens) to the intersection of the optical axes), S0′ (the distance from the rear principal point of each image pickup optical system (lens) to the image plane), and θ. Meanwhile, 111 and 211 are image memories that temporarily store the video signals 110 and 210.

[0023] 13 is a correlation calculation unit that performs a correlation operation on the image data in the image memories 111 and 211; FIGS. 6(A) and 6(B) give conceptual diagrams of this processing. As shown in FIG. 6(B), the pixel data group 112 centered on the pixel data Rij at the point (xiR′, yjR′) in the image memory 111 is taken as one block, and the correlation between this block and the image in the image memory 211 is computed. FIG. 7 schematically shows the relationship between the correlation value and the x′ (y′) axis coordinate in the horizontal and vertical directions. The coordinate xL′ (yL′) at which the correlation value is maximal is obtained with a precision finer than the sampling pitch of the image sensor, for example by approximating the relationship near the correlation peak with a function. A block 112 is set up for each pixel data Rij of the image memory 111 and the same calculation is performed, yielding the xL′ and yL′ corresponding to each pixel data.

[0024] Using the image information (xiR′, xL′, yjR′) obtained by this series of processes together with the imaging conditions (S0, S0′, θ), the coordinates PR(x0, y0, z0) of the subject plane corresponding to each pixel data Rij of the image sensor 103 can be obtained from the relationship of equation (5) above.

[0025] The same calculation is applied to each pixel data Lij(xiL′, yjL′) of the image sensor 203 to obtain the xR′ and yR′ corresponding to each pixel, and from these the coordinates PL(x0, y0, z0) of the subject plane corresponding to each pixel data Lij of the image sensor 203. Note that the subject-plane coordinates PR and PL do not necessarily coincide.

[0026] The correction processing unit 14 operates on the coordinates PR and PL obtained by the above processing. First, a desired viewpoint position is input. When the viewpoint position is input, for example as shown in FIG. 8, the image point (x′, y′) for each coordinate value is obtained from the imaging conditions S0, S0′ and the subject-plane coordinates PR(L)(x0, y0, z0) as

x′ = x0 S0′/(S0 + z0)   (6-a)
y′ = y0 S0′/(S0 + z0)   (6-b)

Based on equations (6-a, b), the coordinates (xiR(L)′, yjR(L)′) of each image memory are coordinate-transformed by calculation and written into the image memory 15. The image in the image memory 15 is thus corrected for registration error, and its image data is ideally double that output by each of the image sensors 103 and 203; as a result, the output image is of higher definition.
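Equations (6-a) and (6-b) are a direct perspective mapping of each reconstructed subject coordinate into the common image plane of the chosen viewpoint; transcribed literally (function and variable names are ours):

```python
def reproject(points, S0, S0p):
    """Map subject coordinates P(x0, y0, z0) to image-plane coordinates
    (x', y') with x' = x0*S0'/(S0 + z0), y' = y0*S0'/(S0 + z0)."""
    return [(x0 * S0p / (S0 + z0), y0 * S0p / (S0 + z0))
            for x0, y0, z0 in points]
```

A point on the subject plane (z0 = 0) is simply scaled by the lateral factor S0′/S0, while points off the plane (z0 ≠ 0) land at a different scale; this depth dependence is exactly the registration error the correction removes.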

[0027]

[Effects of the Invention] As described above, the present invention makes it possible to obtain a high-definition image, and therefore has the effect of providing the high-definition information that will be increasingly demanded.

[Brief Description of the Drawings]

FIG. 1 is a diagram showing the basic arrangement of an embodiment of the present invention.

FIG. 2 is a perspective view showing the specific configuration of the embodiment.

FIG. 3 is a block diagram showing the relationship of the functions of the embodiment.

FIG. 4 is an explanatory diagram of the derivation of three-dimensional position information of the subject.

FIG. 5 is an explanatory diagram of the derivation of two-dimensional position information of the subject.

FIG. 6 is a schematic diagram of the correlation calculation processing.

FIG. 7 is a characteristic diagram showing correlation values.

FIG. 8 is an explanatory diagram of the image correction processing.

FIG. 9 is a schematic diagram for explaining the principle of definition enhancement.

FIG. 10 is an explanatory diagram of the registration error occurring in the conventional example.

[Explanation of Symbols]

1 Subject plane
102, 202 Image pickup optical systems
102a, 102b, 102c, 102d Lens groups of image pickup optical system 102
202a, 202b, 202c, 202d Lens groups of image pickup optical system 202
101, 201 Optical axes of image pickup optical systems 102, 202
103, 203 Image sensors
104, 204 Convergence angle motors
105, 205 Angle encoders
106, 206 Zoom motors
107, 207 Focus motors
108, 208 Zoom encoders
109, 209 Focus encoders
110, 210 Image signals
111, 211, 15 Image memories
12, 13, 14 Arithmetic control unit, correlation calculation unit, correction processing unit

Claims (2)

[Claims]

1. A compound-eye image pickup apparatus for imaging a common subject using a plurality of image pickup systems, characterized by comprising: means for detecting image pickup conditions of the image pickup systems; means for detecting position information of the subject from the image signals output from each image pickup system; and means for combining the images output from the image pickup systems on the basis of said image pickup conditions and said position information.

2. The compound-eye image pickup apparatus according to claim 1, characterized in that said combining means forms the composite image with reference to a predetermined viewpoint position.
JP4152214A 1992-03-23 1992-06-11 Compound eye imaging device Expired - Fee Related JP2974500B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP4152214A JP2974500B2 (en) 1992-06-11 1992-06-11 Compound eye imaging device
AT93104677T ATE158129T1 (en) 1992-03-23 1993-03-22 MULTIPLE LENS IMAGE RECORDING DEVICE AND MISREGISTRATION CORRECTION
EP93104677A EP0563737B1 (en) 1992-03-23 1993-03-22 Multilens imaging apparatus with correction of misregistration
DE69313694T DE69313694T2 (en) 1992-03-23 1993-03-22 Multi-lens imaging device and correction of misregistration
US08/036,079 US5668595A (en) 1992-03-23 1993-03-23 Multi-lens imaging apparatus having a mechanism for combining a plurality of images without displacement of registration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP4152214A JP2974500B2 (en) 1992-06-11 1992-06-11 Compound eye imaging device

Publications (2)

Publication Number Publication Date
JPH05344422A true JPH05344422A (en) 1993-12-24
JP2974500B2 JP2974500B2 (en) 1999-11-10

Family

ID=15535570

Family Applications (1)

Application Number Title Priority Date Filing Date
JP4152214A Expired - Fee Related JP2974500B2 (en) 1992-03-23 1992-06-11 Compound eye imaging device

Country Status (1)

Country Link
JP (1) JP2974500B2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5907434A (en) * 1995-03-20 1999-05-25 Canon Kabushiki Kaisha Image pickup apparatus
JP2009239392A (en) * 2008-03-26 2009-10-15 Fujifilm Corp Compound eye photographing apparatus, control method therefor, and program
US7689117B2 (en) 2006-01-11 2010-03-30 Panasonic Corporation Multi-module photography system
JP2012090288A (en) * 2011-11-18 2012-05-10 Fujifilm Corp Compound-eye photographing apparatus, control method thereof, and program

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003231457A1 (en) * 2003-04-23 2004-11-19 Seijiro Tomita Method and apparatus for measuring distance
JP4297111B2 (en) 2005-12-14 2009-07-15 ソニー株式会社 Imaging apparatus, image processing method and program thereof


Also Published As

Publication number Publication date
JP2974500B2 (en) 1999-11-10

Similar Documents

Publication Publication Date Title
US5668595A (en) Multi-lens imaging apparatus having a mechanism for combining a plurality of images without displacement of registration
US10863164B2 (en) Stereo camera and method of controlling stereo camera
US5646679A (en) Image combining method and apparatus
US5682198A (en) Double eye image pickup apparatus
JP2883265B2 (en) Image processing device
US6236748B1 (en) Compound eye image pickup device utilizing plural image sensors and plural lenses
US5699108A (en) Multi-eye image pickup apparatus with multi-function finder screen and display
US7139424B2 (en) Stereoscopic image characteristics examination system
US6304284B1 (en) Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera
US6839081B1 (en) Virtual image sensing and generating method and apparatus
JP2012215720A (en) Stereoscopic image pickup device and stereoscopic image pickup method
CN111854636B (en) Multi-camera array three-dimensional detection system and method
JP2002071309A (en) Three-dimensional image-detecting device
Gehrig Large-field-of-view stereo for automotive applications
JP2974500B2 (en) Compound eye imaging device
KR20140121345A (en) Surveillance Camera Unit And Method of Operating The Same
JPH07303207A (en) Image pickup device
JP2018134712A (en) Robot system and control method for robot system
TWM594322U (en) Camera configuration system with omnidirectional stereo vision
JPH10170227A (en) Display device
JP3093447B2 (en) Compound eye imaging device
JP3412945B2 (en) Image synthesis device and photographing device
JPH06195447A (en) Compound eye image pickup device
TWI725620B (en) Omnidirectional stereo vision camera configuration system and camera configuration method
JPH0715749A (en) Compound-eye image pickup device

Legal Events

Date Code Title Description
FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20080903

Year of fee payment: 9

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090903

Year of fee payment: 10

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100903

Year of fee payment: 11

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110903

Year of fee payment: 12

LAPS Cancellation because of no payment of annual fees