JPH06167564A - Position measuring system and device using fish eye lens - Google Patents

Position measuring system and device using fish eye lens

Info

Publication number
JPH06167564A
Authority
JP
Japan
Prior art keywords
target
fisheye
image
coordinate system
position coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP17739492A
Other languages
Japanese (ja)
Other versions
JP2611173B2 (en)
Inventor
Kakuichi Shiomi
塩見格一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ship Research Institute
Original Assignee
Ship Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ship Research Institute filed Critical Ship Research Institute
Priority to JP4177394A priority Critical patent/JP2611173B2/en
Publication of JPH06167564A publication Critical patent/JPH06167564A/en
Application granted granted Critical
Publication of JP2611173B2 publication Critical patent/JP2611173B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Landscapes

  • Measurement Of Optical Distance (AREA)

Abstract

PURPOSE: To measure positions by obtaining, for each target, the intersection point of the group of linear equations corresponding to that target, determining the three-dimensional position coordinates of each target, and thereby tracking a plurality of targets simultaneously. CONSTITUTION: Image pickup devices 3 using fisheye lenses 1 are installed, which determines their optical axes, and a system coordinate system (X, Y, Z) is then set arbitrarily. System coordinates are thus assigned to the fisheye images picked up by the three image pickup devices 3a, 3b, and 3c. The targets T1, T2, T3, ... in each image are detected by an image processing unit 6, and a computation unit 7 then obtains three linear equations based on the optical axes of the lenses 1 in the coordinate system (X, Y, Z). For example, three linear equations passing through the target T1 and the centers of the field-of-view planes of the devices 3a, 3b, and 3c are obtained. That is, for each of the targets T1, T2, T3, ..., a group of linear equations is obtained whose number corresponds to the number of lenses 1, namely one set of three linear equations. The three-dimensional position coordinates are calculated from the intersection points of these groups of linear equations.

Description

Detailed Description of the Invention

[0001]

[Field of Industrial Application] The present invention relates to a positioning system, and an apparatus therefor, capable of obtaining three-dimensional positioning information on a plurality of targets by using fisheye lenses.

[0002]

[Prior Art] In current positioning apparatus based on image information, a target is ordinarily tracked by two cameras, each having a telephoto lens. In this method, the direction of each camera (the optical-axis direction of its lens) is adjusted so that the target is captured at the center of each camera's field of view, and the position of the target is calculated by triangulation from the installation positions of the two cameras and the angle information of their optical axes. An adjusting device, including a servo system, for accurately pointing the two cameras at the target and an angle-measuring device for accurately measuring the optical-axis direction of each camera are therefore required.

[0003]

[Problems to Be Solved by the Invention] As described above, when a target is tracked with ordinary cameras, both the servo system and the angle-measuring system must be highly responsive. When the target moves fast, however, it is quite difficult with current technology to track it adequately and measure its position at the same time. Moreover, with such a method one apparatus can track only one target; to track two or more targets, the system needs as many apparatuses as there are targets. A single apparatus therefore could not track a plurality of targets simultaneously.

[0004]

[Means for Solving the Problems] According to the present invention, a system coordinate system is given arbitrarily and the optical axis of each fisheye lens is determined with respect to this system coordinate system. From each of the fisheye images provided by a plurality of image pickup devices using these fisheye lenses, the targets present on that fisheye image are detected. For each detected target, linear equations passing through the target and the center of the field-of-view plane of each image pickup device in the system coordinate system are obtained, one per image pickup device, and this plurality of linear equations forms one set per target. The intersection point of the set of linear equations corresponding to each target is then obtained for every target, and the three-dimensional position coordinates of each target in the system coordinate system are determined, so that a plurality of targets can be tracked and positioned simultaneously.

[0005]

[Operation] For each target captured in the fisheye images, a group of linear equations is obtained, one set containing as many linear equations as there are image pickup devices using fisheye lenses, with as many groups as there are targets. The intersection point of each group of linear equations is obtained for each target, and the three-dimensional position coordinates of that intersection point give the positioning information of the target.

[0006]

[Embodiments of the Invention] An embodiment of the present invention will be described in detail with reference to FIGS. 1 to 7. FIG. 1 is a block diagram of the essential parts of the invention, FIG. 2 is a fisheye image formed by a fisheye lens with a viewing angle of 180°, FIG. 3 shows the fisheye images formed by three fisheye lenses together with positioning information, FIG. 4 is an explanatory diagram of the positioning, and FIGS. 5 to 7 are explanatory diagrams showing arrangement examples of the fisheye lenses. In FIGS. 1 to 3, reference numeral 1 denotes a fisheye lens, which is combined with a photoelectric conversion element 2 such as a CCD to form an image pickup device 3; in this embodiment, three image pickup devices 3a, 3b, and 3c are used. Installing the image pickup devices 3 using the fisheye lenses 1 determines the optical axes L (La, Lb, Lc), as shown in FIGS. 2 and 3. The system coordinate system (X, Y, Z) is determined arbitrarily when the fisheye lenses 1 are installed. Accordingly, as shown in FIG. 3, the system coordinates of the fisheye images 4a, 4b, and 4c picked up by the three image pickup devices 3a, 3b, and 3c are given by (Xa, Ya, Za), (Xb, Yb, Zb), and (Xc, Yc, Zc), respectively.

[0007] Reference numeral 5 denotes an interface, which digitally converts the fisheye images 4 picked up by the image pickup devices 3 for image processing. Reference numeral 6 denotes an image processing unit, in which the targets T1, T2, T3, ... present in the fisheye images 4 are detected by image processing. Reference numeral 7 denotes a computation unit, in which three linear equations are obtained from the optical axes L of the fisheye lenses 1 in the system coordinate system (X, Y, Z); for example, three linear equations passing through the target T1 and the centers of the field-of-view planes of the image pickup devices 3a, 3b, and 3c are obtained. That is, for each of the targets T1, T2, T3, ..., a group of linear equations is obtained whose members number as many as the fisheye lenses 1 (in this embodiment, one set of three linear equations), and the three-dimensional position coordinates are calculated from the intersection point of each group. Reference numeral 8 denotes a memory, and 9 a tracking processing unit, which sequentially performs tracking processing on the fisheye images 4 to determine the tracks of the targets T1, T2, T3, .... Reference numeral 10 denotes a data processing device for image output, which performs the image data processing needed to convert the tracks of the targets T1, T2, T3, ... into video signals and display them three-dimensionally on the monitor screen of a display device 11. Reference numeral 12 denotes a recording unit, in which the fisheye images 4 picked up by the image pickup devices 3 are temporarily recorded.

[0008] Next, the positioning principle will be described. As shown in FIG. 2, the viewing angle of the fisheye lens 1 is generally 180°, and the optical axis L is the perpendicular passing through the center of the image plane; its linear equation in the system coordinate system is determined by the installation location of the fisheye lens 1 (here standing for the image pickup device 3 using the fisheye lens 1). The fisheye image 4 formed by a fisheye lens 1 with a viewing angle of 180° is circular. When the targets T1, T2, T3, ... in front of the fisheye lens 1 are viewed, the center O of the fisheye image 4 corresponds to the point directly in front of the fisheye lens 1 (the optical-axis direction L), the horizontal points A and B correspond to 90° to the left and right of the fisheye lens 1, and the vertical points C and D correspond to directly above and directly below the fisheye lens 1. The system coordinate system (X, Y, Z) is determined arbitrarily when the fisheye lens 1 is installed. When the targets T1, T2, T3, ... are viewed through the fisheye lens 1, their shapes become increasingly distorted along the circumference of the fisheye image 4 as they move away from the center O, so it is difficult to recognize the targets T1, T2, T3, ... by their shapes. The inventor therefore chose, as the means of recognizing targets T1, T2, T3, ... such as aircraft, the blinking light spot (strobe) on the tail of an aircraft as the target T1, T2, T3, .... When the targets T1, T2, T3, ... are point-like, a target T is not distorted even when it lies toward the circumference of the fisheye image 4.
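The relation just described, with the image centre O on the optical axis and the rim at 90° off-axis, can be turned into a viewing direction for each detected light spot. The following sketch is not taken from the patent: it assumes an equidistant (equiangular) fisheye projection and Python with NumPy, and all function and parameter names are hypothetical.

```python
import numpy as np

def fisheye_pixel_to_direction(u, v, cx, cy, r_pix):
    """Unit viewing direction in the camera frame for a pixel (u, v).

    (cx, cy) is the image centre O (the optical axis L), r_pix the radius of
    the circular fisheye image, corresponding to 90 degrees off-axis as for
    points A, B, C, D above.  +z is taken along the optical axis, +x to the
    image right, +y to the image top (an assumed convention).
    """
    dx, dy = u - cx, cy - v                  # image rows grow downwards
    r = np.hypot(dx, dy)
    theta = (r / r_pix) * (np.pi / 2)        # off-axis angle under an equidistant model
    phi = np.arctan2(dy, dx)                 # bearing around the optical axis
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

# Example: a light spot halfway between the centre and the rim, to the right,
# comes out roughly 45 degrees off the optical axis.
print(fisheye_pixel_to_direction(768, 512, 512, 512, 512))
```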

[0009] The shape of the targets T1, T2, T3, ... is therefore taken to be a point (light spot), and in FIG. 3 two point-like targets, T1 and T2, are assumed to be present. Accordingly, the two targets T1 and T2 appear in the fisheye images 4 (4a, 4b, 4c) provided by the plurality of image pickup devices 3 (3a, 3b, 3c, ...) each using a fisheye lens 1. As positioning information common to the targets T1 and T2, the installation coordinates (Xa, Ya, Za), (Xb, Yb, Zb), (Xc, Yc, Zc) of the three image pickup devices 3a, 3b, 3c and the three optical-axis directions La(θa, ψa), Lb(θb, ψb), Lc(θc, ψc) are determined by the installation locations of the image pickup devices 3.

[0010] Next, as positioning information on the target T1, the displacement angle (θa1, ψa1) of the target T1 from the optical-axis direction La(θa, ψa) as seen from the first image pickup device 3a, the displacement angle (θb1, ψb1) from the optical-axis direction Lb(θb, ψb) as seen from the second image pickup device 3b, and the displacement angle (θc1, ψc1) from the optical-axis direction Lc(θc, ψc) as seen from the third image pickup device 3c are obtained. Similarly, as positioning information on the target T2, the displacement angle (θa2, ψa2) from the optical-axis direction La(θa, ψa) as seen from the first image pickup device 3a, the displacement angle (θb2, ψb2) from the optical-axis direction Lb(θb, ψb) as seen from the second image pickup device 3b, and the displacement angle (θc2, ψc2) from the optical-axis direction Lc(θc, ψc) as seen from the third image pickup device 3c are obtained.
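How these displacement angles relate to a direction in the system coordinate system depends on the angle convention, which the text does not spell out. The sketch below assumes one plausible convention: θ as the azimuth of the optical axis in the X-Y plane, ψ as its elevation, and the camera roll fixed so that the image x axis stays horizontal. It rotates a camera-frame direction (such as the one returned by the previous sketch) into the system frame; the names are hypothetical.

```python
import numpy as np

def optical_axis(theta, psi):
    """Unit vector of the optical axis L: azimuth theta in the X-Y plane
    (measured from +X), elevation psi above the X-Y plane (assumed convention)."""
    return np.array([np.cos(psi) * np.cos(theta),
                     np.cos(psi) * np.sin(theta),
                     np.sin(psi)])

def camera_to_system(direction_cam, theta, psi):
    """Map a camera-frame direction (+z along the optical axis) into the
    system coordinate system (X, Y, Z)."""
    z = optical_axis(theta, psi)
    x = np.array([-np.sin(theta), np.cos(theta), 0.0])  # horizontal image x axis (assumed roll)
    y = np.cross(z, x)
    basis = np.column_stack([x, y, z])   # columns: camera axes expressed in the system frame
    return basis @ direction_cam

def ray_to_target(camera_pos, direction_sys):
    """The line through the camera position with the measured direction:
    P(t) = camera_pos + t * direction_sys."""
    d = direction_sys / np.linalg.norm(direction_sys)
    return np.asarray(camera_pos, dtype=float), d
```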

[0011] In this way, in the system coordinate system, for the fisheye image 4 obtained by one fisheye lens 1, the linear equation of the optical axis L of that fisheye lens 1 (given by its installation position and optical-axis direction L) and the position data of the targets T1, T2 identified on the fisheye image 4 yield linear equations in three-dimensional space connecting every target T1, T2, ... on that fisheye image 4 with the image pickup device 3. As shown in FIGS. 1 and 4, when two targets T1 and T2 appear in the three fisheye images 4 (4a, 4b, 4c) obtained by the three image pickup devices 3 (3a, 3b, 3c) installed at three mutually different locations, reading the positions of the targets T (T1, T2) from the image information yields three linear equations f1a, f1b, f1c, each connecting the single target T1 with the center of the field-of-view plane of one of the three image pickup devices 3a, 3b, 3c. Likewise, for the target T2, three linear equations f2a, f2b, f2c connecting the target T2 with the centers of the field-of-view planes of the three image pickup devices 3a, 3b, 3c are obtained. Thus a group of linear equations, one set whose size corresponds to the number of image pickup devices 3, is obtained for each of the targets T1 and T2, i.e., two sets are obtained. Since a target T is located at the intersection point where the lines given by one such set of linear equations (as many lines as there are image pickup devices 3) meet at a single point, calculating this intersection gives the position coordinates of the target T in three-dimensional space.
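In practice the three measured lines f1a, f1b, f1c will not meet in a mathematically exact point, so one natural way to realise the "intersection" of a group of linear equations is a least-squares closest point. The following is a sketch under that assumption (the patent itself speaks of an exact intersection); the function name is hypothetical and Python with NumPy is assumed.

```python
import numpy as np

def closest_point_to_lines(origins, directions):
    """Point minimising the summed squared distance to a set of 3-D lines.

    Line i passes through origins[i] with direction directions[i]; the normal
    equations are  sum_i (I - d_i d_i^T) p = sum_i (I - d_i d_i^T) o_i.
    With exact, noise-free lines through one target this returns their
    common intersection point.
    """
    A, b = np.zeros((3, 3)), np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += P
        b += P @ np.asarray(o, dtype=float)
    return np.linalg.solve(A, b)

# Example: three lines from different camera positions, all aimed at (0, 0, 10).
origins = [np.array([0., 0., 0.]), np.array([5., 0., 0.]), np.array([0., 5., 0.])]
directions = [np.array([0., 0., 10.]) - o for o in origins]
print(closest_point_to_lines(origins, directions))   # ~[0, 0, 10]
```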

[0012] Next, the operation of the positioning apparatus for actually obtaining the positioning information of a target T on the basis of the above positioning principle will be described with reference to FIGS. 1 and 5. First, when a plurality of targets T exist, each target T must be captured in the fields of view of at least three image pickup devices 3 in order to eliminate false images completely. When positioning targets T within the forward 180° field of view of each of the three image pickup devices 3 (3a, 3b, 3c), the three image pickup devices 3 (3a, 3b, 3c) are, in this embodiment, installed on the same plane with their optical axes L (La, Lb, Lc) parallel, as shown in FIG. 5. Note, however, that the angular resolution of the fisheye lens 1 is highest in the optical-axis direction L and decreases away from it. Also, when a target T lies on the straight line connecting two of the image pickup devices 3 on their installation plane, false images may still occur.

[0013] Therefore, when positioning a target T, the arrangement of the image pickup devices 3 must be optimized to suit the purpose, that is, which region is to be measured and with what accuracy. For example, when a 180° viewing angle is not required, installing the three image pickup devices 3 so that their optical axes La, Lb, Lc intersect in front of them, as shown in FIG. 6, lowers the positioning accuracy directly in front of the second image pickup device 3b but improves the positioning accuracy in the horizontal direction over a forward region wider than in the case shown in FIG. 5. As shown in FIG. 7, if image pickup devices 3b, 3c, 3d, and 3e are arranged three-dimensionally above, below, to the left of, and to the right of the image pickup device 3a, for a total of five devices, the positioning accuracy can likewise be improved over a wider region in the vertical direction as well. Furthermore, when a viewing angle of 180° or more is required, this can be achieved by arranging the image pickup devices 3 three-dimensionally, for example on a regular hexahedron or regular dodecahedron, and the positioning accuracy can likewise be improved by using a plurality of such combined groups of image pickup devices. In other words, the positioning region is determined by the positional relationship and the number of the fisheye lenses 1 (image pickup devices 3).

[0014] When the field-of-view image formed by the fisheye lens 1 is sufficiently large, a photoelectric conversion element 2 (hereinafter referred to as the CCD 2) with 400,000 elements, or sometimes 1,000,000 elements or more, can be used. Increasing the number of elements of the CCD 2 improves the resolution, so the positioning accuracy can also be improved while reducing the number of image pickup devices used. These requirements are examined, and the number and arrangement of the image pickup devices 3 using the fisheye lenses 1 are determined from the positioning viewing angle, the positioning region, the positioning accuracy, and so on. Once the arrangement of the image pickup devices 3 is determined, the optical axis L of each fisheye lens 1 is determined; that is, the normal direction of the fisheye lens 1 is the optical axis L. At the same time, it is described as a three-dimensional linear equation in the system coordinate system.

[0015] Next, the three image pickup devices 3 (3a, 3b, 3c) provide three fisheye images 4 (4a, 4b, 4c), as shown in FIG. 3. Each fisheye image 4 is photoelectrically converted by its CCD 2 into a video signal. When the positioning information of the targets T does not need to be obtained in real time, for example when tracks are to be analyzed later, the video signals are temporarily stored in the recording unit 12 and processed by the image processing unit 6 afterwards. The video signal of each fisheye image 4 from the CCD 2 is converted into a digital signal via the interface 5 and input to the image processing unit 6. In the image processing unit 6, image processing for detecting the targets T is performed: the targets T captured in each fisheye image 4 are detected, and the displacement of each target T from the optical axis L, that is, the vertical and horizontal angles, is determined. The resulting image information is expressed two-dimensionally as (Xa1, Ya1), (Xa2, Ya2), ..., (Xb1, Yb1), (Xb2, Yb2), ..., (Xc1, Yc1), (Xc2, Yc2), ..., and input to the computation unit 7.
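Because the targets are defined above as blinking light spots, the detection step can be as simple as picking isolated bright maxima in each grey-level fisheye image. The sketch below shows one such approach, assuming a NumPy image array and an arbitrary brightness threshold; it is not the patent's image-processing method.

```python
import numpy as np

def detect_point_targets(image, threshold=200):
    """Return (row, col) pixel coordinates of bright, point-like targets.

    A pixel is reported when it exceeds `threshold` (an assumed grey level)
    and is the maximum of its 3x3 neighbourhood, which is sufficient for
    isolated strobe-like light spots.
    """
    h, w = image.shape
    padded = np.pad(image, 1, mode="constant", constant_values=0)
    # Neighbourhood maximum over the 3x3 window around every pixel.
    neigh = np.max(np.stack([padded[r:r + h, c:c + w]
                             for r in range(3) for c in range(3)]), axis=0)
    mask = (image >= threshold) & (image == neigh)
    return np.argwhere(mask)

# Example: a synthetic 9x9 image with two bright spots.
img = np.zeros((9, 9))
img[2, 3] = 255
img[6, 7] = 230
print(detect_point_targets(img))   # [[2 3] [6 7]]
```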

[0016] In the computation unit 7, the equation of each optical axis L is determined with respect to the system coordinate system, and the straight line is obtained that passes through the center point O (X0, Y0, Z0) of the fisheye lens 1, which lies on this optical axis L and represents the center of the field-of-view plane of the image pickup device 3, and intersects the target T. The equations of these straight lines are

(x − x_0a)/α_a = (y − y_0a)/β_a = (z − z_0a)/γ_a
(x − x_0b)/α_b = (y − y_0b)/β_b = (z − z_0b)/γ_b
(x − x_0c)/α_c = (y − y_0c)/β_c = (z − z_0c)/γ_c

and such a group of linear equations, each set consisting of three linear equations, is obtained for each target T, i.e., as many groups as there are targets T. In this way, groups of linear equations, each set containing as many linear equations as there are fisheye lenses 1, are obtained: two sets, one for each of the targets T (T1, T2). Next, the intersection points of these two groups of linear equations are calculated by the computation unit 7, and a list of intersection points is obtained. From this list, only the points at which three identical outputs occur (points at which the number of identical outputs equals the number of fisheye lenses 1) are extracted; such a point is the intersection of three linear equations, that is, of a group of linear equations, and it gives the position data of the target T. When the number of fisheye lenses 1 is three, a false image arises as the intersection of only two linear equations and never at the intersection of three linear equations. Therefore, the larger the number of fisheye lenses 1, the smaller the probability that false images occur.
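The extraction step described here, keeping only points where as many lines as there are cameras coincide and discarding two-line coincidences as false images, can be sketched as follows. Measurement noise means "coincide" has to be taken as "within a tolerance"; the tolerance value, the function names, and the use of a least-squares point are all assumptions of this sketch, not specified by the patent.

```python
import itertools
import numpy as np

def _closest_point(origins, directions):
    # Least-squares point nearest to all of the given lines (see the earlier sketch).
    A, b = np.zeros((3, 3)), np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

def _distance_to_line(p, origin, direction):
    d = direction / np.linalg.norm(direction)
    v = p - origin
    return np.linalg.norm(v - np.dot(v, d) * d)

def find_targets(rays_per_camera, tol=1.0):
    """rays_per_camera: one list per camera of (origin, direction) pairs,
    one ray per target detected in that camera's fisheye image.

    Every combination of one ray per camera is tried; a candidate point is
    kept only if a ray from every camera passes within `tol` of it, so
    coincidences of fewer lines (false images) are rejected.
    """
    targets = []
    for combo in itertools.product(*rays_per_camera):
        origins = [np.asarray(o, dtype=float) for o, _ in combo]
        dirs = [np.asarray(d, dtype=float) for _, d in combo]
        p = _closest_point(origins, dirs)
        if all(_distance_to_line(p, o, d) <= tol for o, d in zip(origins, dirs)):
            targets.append(p)
    return targets
```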

[0017] In this way, the position of the intersection point at which the lines of a group of linear equations (as many lines as there are fisheye lenses 1 making up the group) meet at a single point is calculated as the three-dimensional spatial coordinates (Xn, Yn, Zn, tm) of a target in the system coordinate system; this value gives the three-dimensional position coordinates, where t denotes time. That is, when the number of targets T is N, N groups of linear equations are obtained, each group consisting of the three linear equations obtained from the three image pickup devices 3, and these N groups of linear equations intersect one another at up to (3N−1)! points, but the intersections at which three straight lines meet are only the N points corresponding to the number of targets T. For example, if the number of targets T to be positioned is 10, the linear equations obtained from the three image pickup devices 3 form intersections at 29! points, of which the intersection points where three linear equations meet (where a target T is located) are only the 10 points corresponding to the number of targets T. The three-dimensional position coordinates thus indicated are obtained for each of the times t1, t2, t3, ..., and these three-dimensional position coordinates are stored in the memory 8.

[0018] The data stored in the memory 8 are subjected to tracking processing in the tracking processing unit 9 over the times t1, t2, t3, ..., that is, the trajectory of each set of coordinates is obtained, and the track of each target T is obtained within the image viewed from above. Since such a track consists of three-dimensionally positioned coordinate data at successive times, it can be visualized as a three-dimensional trajectory with respect to an arbitrary spatial coordinate system. The data processing device 10 processes it for three-dimensional display on the monitor screen, converts it into a video signal, and displays it as a continuous image on the display device 11, where it is subjected to knowledge processing by air traffic controllers or the like.
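The tracking step, which turns the per-time position lists into per-target tracks, is not detailed in the text; a simple greedy nearest-neighbour association between successive measurement times, sketched below under assumed parameter values and names, is one way it could be done.

```python
import numpy as np

def update_tracks(tracks, positions, t, max_jump=50.0):
    """Greedy nearest-neighbour association of new 3-D positions to tracks.

    tracks:     list of tracks, each a list of (t, x, y, z) samples
    positions:  (M, 3) array of positions measured at time t
    max_jump:   assumed gating distance; a measurement farther than this
                from every existing track starts a new track.
    """
    positions = np.asarray(positions, dtype=float)
    unused = list(range(len(positions)))
    for track in tracks:
        if not unused:
            break
        last = np.array(track[-1][1:])                 # last known (x, y, z)
        dists = [np.linalg.norm(positions[i] - last) for i in unused]
        j = int(np.argmin(dists))
        if dists[j] <= max_jump:
            i = unused.pop(j)
            track.append((t, *positions[i]))
    for i in unused:                                   # unmatched points start new tracks
        tracks.append([(t, *positions[i])])
    return tracks
```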

[0019]

[Effects of the Invention] According to the present invention, a system coordinate system is given arbitrarily; the optical axis of each fisheye lens is determined with respect to this system coordinate system; from each of the fisheye images provided by a plurality of image pickup devices using these fisheye lenses, the targets present on that fisheye image are detected; linear equations passing through a target and the center of the field-of-view plane of an image pickup device in the system coordinate system are obtained, one for each image pickup device; a group of linear equations consisting of this plurality of linear equations as one set is obtained for each detected target; the intersection of the group of linear equations corresponding to each target is obtained for every target; and the three-dimensional position coordinates of each target in the system coordinate system are determined. A single system can therefore position and track a plurality of targets simultaneously. Moreover, if the image information of the targets is stored, the tracks of the targets can be calculated and analyzed.

[Brief Description of the Drawings]

FIG. 1 is a configuration diagram showing an embodiment of the present invention.

FIG. 2 is a fisheye image provided by the fisheye lens 1.

FIG. 3 shows an embodiment of the present invention: the fisheye images provided by three fisheye lenses 1.

FIG. 4 is an explanatory diagram showing an embodiment of the present invention.

FIG. 5 shows an embodiment of the present invention and is a diagram showing an arrangement example of the image pickup devices.

FIG. 6 shows an embodiment of the present invention and is a diagram showing an arrangement example of the image pickup devices.

FIG. 7 shows an embodiment of the present invention and is a diagram showing an arrangement example of the image pickup devices.

[Explanation of Symbols]

1 ... Fisheye lens  3 ... Image pickup device  4 ... Fisheye image  5 ... Interface  6 ... Image processing unit  7 ... Computation unit  8 ... Memory  9 ... Tracking processing unit  10 ... Data processing device  11 ... Display device  12 ... Recording unit  T ... Target

Claims (5)

[Claims]

1. A positioning method using fisheye lenses, characterized in that: a system coordinate system is given arbitrarily; the optical axis of each fisheye lens is determined with respect to this system coordinate system; from each of the fisheye images provided by a plurality of image pickup devices using said fisheye lenses, the targets present on that fisheye image are detected; linear equations passing through a target and the center of the field-of-view plane of an image pickup device in said system coordinate system are obtained, one for each of said image pickup devices; a group of linear equations consisting of this plurality of linear equations as one set is obtained for each of the detected targets; the intersection of the group of linear equations corresponding to each of said targets is obtained for every target; and the three-dimensional position coordinates of each of said targets in said system coordinate system are determined.
2. The positioning method using fisheye lenses according to claim 1, characterized in that the position data of each target at each time is stored in a memory together with its measurement time, and the position data stored in this memory are subjected to tracking processing for each target to obtain the track of each target.
3. A positioning apparatus using fisheye lenses, characterized by comprising: a plurality of fisheye lenses for capturing targets; image pickup devices for picking up the fisheye images obtained by these fisheye lenses; an image processing unit for detecting said targets by image-processing said fisheye images picked up by these image pickup devices; a computation unit for calculating the three-dimensional position coordinates of said targets detected by said image processing unit, based on the optical axis of each fisheye lens and the system coordinate system; a memory for storing the three-dimensional position coordinates of each target calculated by this computation unit; a data processing device for three-dimensionally displaying, on a monitor screen, the three-dimensional position coordinates of each target stored in this memory; and a display device for displaying the three-dimensional position coordinates of said targets.
4. The positioning apparatus using fisheye lenses according to claim 3, characterized by comprising: a tracking processing unit for tracking the three-dimensional position coordinates of each target stored in said memory and determining the track of each target; a data processing device for three-dimensionally displaying the track of each target on a monitor screen; and a display device for displaying the tracks of said targets.
5. The positioning apparatus using fisheye lenses according to claim 3 or claim 4, characterized in that at least three of said fisheye lenses are used.
JP4177394A 1992-06-11 1992-06-11 Positioning method and device using fisheye lens Expired - Lifetime JP2611173B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP4177394A JP2611173B2 (en) 1992-06-11 1992-06-11 Positioning method and device using fisheye lens

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP4177394A JP2611173B2 (en) 1992-06-11 1992-06-11 Positioning method and device using fisheye lens

Publications (2)

Publication Number Publication Date
JPH06167564A true JPH06167564A (en) 1994-06-14
JP2611173B2 JP2611173B2 (en) 1997-05-21

Family

ID=16030169

Family Applications (1)

Application Number Title Priority Date Filing Date
JP4177394A Expired - Lifetime JP2611173B2 (en) 1992-06-11 1992-06-11 Positioning method and device using fisheye lens

Country Status (1)

Country Link
JP (1) JP2611173B2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001141423A (en) * 1999-11-11 2001-05-25 Fuji Photo Film Co Ltd Image pickup device and image processor
JP2007024647A (en) * 2005-07-14 2007-02-01 Iwate Univ Distance calculating apparatus, distance calculating method, structure analyzing apparatus and structure analyzing method
JP2008096162A (en) * 2006-10-06 2008-04-24 Iwate Univ Three-dimensional distance measuring sensor and three-dimensional distance measuring method
US7403836B2 (en) 2003-02-25 2008-07-22 Honda Motor Co., Ltd. Automatic work apparatus and automatic work control program
EP2884460A1 (en) 2013-12-13 2015-06-17 Panasonic Intellectual Property Management Co., Ltd. Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
EP3220638A1 (en) 2016-02-29 2017-09-20 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus, image processing apparatus, imaging system, imaging method, image processing method, and recording medium
JP2019074374A (en) * 2017-10-13 2019-05-16 三菱重工業株式会社 Position orientating system and position orientating method
US10484665B2 (en) 2017-04-20 2019-11-19 Panasonic Intellectual Property Management Co., Ltd. Camera parameter set calculation method, recording medium, and camera parameter set calculation apparatus
US10602125B2 (en) 2016-09-08 2020-03-24 Panasonic Intellectual Property Management Co., Ltd. Camera-parameter-set calculation apparatus, camera-parameter-set calculation method, and recording medium
US10757395B2 (en) 2017-04-28 2020-08-25 Panasonic Intellectual Property Management Co., Ltd. Camera parameter set calculation method, recording medium, and camera parameter set calculation apparatus
EP2950142B1 (en) * 2014-04-30 2021-06-09 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus and distance measuring apparatus using the same
JP2022505772A (en) * 2018-11-01 2022-01-14 ウェイモ エルエルシー Time-of-flight sensor with structured light illumination
US11941829B2 (en) 2019-10-01 2024-03-26 Fujifilm Business Innovation Corp. Information processing apparatus, light emitting device, and non-transitory computer readable medium storing program

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1185112B1 (en) 2000-08-25 2005-12-14 Fuji Photo Film Co., Ltd. Apparatus for parallax image capturing and parallax image processing
JP4425495B2 (en) 2001-06-08 2010-03-03 富士重工業株式会社 Outside monitoring device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05346950A (en) * 1991-12-19 1993-12-27 Eastman Kodak Co Method for sensing three-dimensional scene and device for the same

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05346950A (en) * 1991-12-19 1993-12-27 Eastman Kodak Co Method for sensing three-dimensional scene and device for the same

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001141423A (en) * 1999-11-11 2001-05-25 Fuji Photo Film Co Ltd Image pickup device and image processor
US6876762B1 (en) * 1999-11-11 2005-04-05 Fuji Photo Film Co., Ltd. Apparatus for imaging and image processing and method thereof
US7403836B2 (en) 2003-02-25 2008-07-22 Honda Motor Co., Ltd. Automatic work apparatus and automatic work control program
JP2007024647A (en) * 2005-07-14 2007-02-01 Iwate Univ Distance calculating apparatus, distance calculating method, structure analyzing apparatus and structure analyzing method
JP2008096162A (en) * 2006-10-06 2008-04-24 Iwate Univ Three-dimensional distance measuring sensor and three-dimensional distance measuring method
US10839213B2 (en) 2013-12-13 2020-11-17 Panasonic Intellectual Property Management Co., Ltd. Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
US10157315B2 (en) 2013-12-13 2018-12-18 Panasonic Intellectual Property Management Co., Ltd. Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
US11354891B2 (en) 2013-12-13 2022-06-07 Panasonic Intellectual Property Management Co., Ltd. Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
US10565449B2 (en) 2013-12-13 2020-02-18 Panasonic Intellectual Property Management Co., Ltd. Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
EP3654286A2 (en) 2013-12-13 2020-05-20 Panasonic Intellectual Property Management Co., Ltd. Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
EP2884460A1 (en) 2013-12-13 2015-06-17 Panasonic Intellectual Property Management Co., Ltd. Image capturing apparatus, monitoring system, image processing apparatus, image capturing method, and non-transitory computer readable recording medium
EP2950142B1 (en) * 2014-04-30 2021-06-09 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus and distance measuring apparatus using the same
US10484668B2 (en) 2016-02-29 2019-11-19 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus, image processing apparatus, imaging system, imaging method, image processing method, and recording medium
EP3220638A1 (en) 2016-02-29 2017-09-20 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus, image processing apparatus, imaging system, imaging method, image processing method, and recording medium
US11272166B2 (en) 2016-02-29 2022-03-08 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus, image processing apparatus, imaging system, imaging method, image processing method, and recording medium
US11233983B2 (en) 2016-09-08 2022-01-25 Panasonic Intellectual Property Management Co., Ltd. Camera-parameter-set calculation apparatus, camera-parameter-set calculation method, and recording medium
US10602125B2 (en) 2016-09-08 2020-03-24 Panasonic Intellectual Property Management Co., Ltd. Camera-parameter-set calculation apparatus, camera-parameter-set calculation method, and recording medium
US10484665B2 (en) 2017-04-20 2019-11-19 Panasonic Intellectual Property Management Co., Ltd. Camera parameter set calculation method, recording medium, and camera parameter set calculation apparatus
US10757395B2 (en) 2017-04-28 2020-08-25 Panasonic Intellectual Property Management Co., Ltd. Camera parameter set calculation method, recording medium, and camera parameter set calculation apparatus
JP2019074374A (en) * 2017-10-13 2019-05-16 三菱重工業株式会社 Position orientating system and position orientating method
JP2022505772A (en) * 2018-11-01 2022-01-14 ウェイモ エルエルシー Time-of-flight sensor with structured light illumination
US11941829B2 (en) 2019-10-01 2024-03-26 Fujifilm Business Innovation Corp. Information processing apparatus, light emitting device, and non-transitory computer readable medium storing program

Also Published As

Publication number Publication date
JP2611173B2 (en) 1997-05-21

Similar Documents

Publication Publication Date Title
JP2611173B2 (en) Positioning method and device using fisheye lens
JP6518952B2 (en) Position adjustment method of display device for vehicle
US20020024599A1 (en) Moving object tracking apparatus
JP4559874B2 (en) Motion tracking device
JP2002366937A (en) Monitor outside vehicle
JP2004354257A (en) Calibration slippage correction device, and stereo camera and stereo camera system equipped with the device
JPH11118425A (en) Calibration method and device and calibration data production
JP3994217B2 (en) Abnormal point detection system by image processing
JPH04113213A (en) Vehicle distance detector
JP4044454B2 (en) Position measurement using zoom
JP2000028332A (en) Three-dimensional measuring device and method therefor
JPH0827188B2 (en) Inter-vehicle distance detector
Lu et al. Image-based system for measuring objects on an oblique plane and its applications in 2-D localization
JP2001338280A (en) Three-dimensional space information input device
JPH05122606A (en) Method and device for synthesizing image
JP2511082B2 (en) Three-dimensional position measuring device for moving objects
JP2966683B2 (en) Obstacle detection device for vehicles
JPH06226561A (en) Circular position recognizing device
JP2010281685A (en) System and method for measurement of position
JPH02151828A (en) All-azimuth observation device
KR102044639B1 (en) Method and apparatus for aligning stereo cameras
JP2006293629A (en) Method and device for detecting height of mobile object, and method for determining object shape
JPH08171627A (en) Centroid detecting method for calibration pattern
JPH0933247A (en) Image object detecting apparatus
JPS6247512A (en) Three dimensional position recognizing device

Legal Events

Date Code Title Description
EXPY Cancellation because of completion of term