JPH03200007A - Stereoscopic measuring instrument - Google Patents

Stereoscopic measuring instrument

Info

Publication number
JPH03200007A
Authority
JP
Japan
Prior art keywords
screen
picture
conversion processing
lenses
pictures
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP33874689A
Other languages
Japanese (ja)
Inventor
Satoshi Shimada
聡 嶌田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to JP33874689A priority Critical patent/JPH03200007A/en
Publication of JPH03200007A publication Critical patent/JPH03200007A/en
Pending legal-status Critical Current

Landscapes

  • Measurement Of Optical Distance (AREA)

Abstract

PURPOSE: To easily establish correspondence between plural pictures and to simplify the measuring system by executing picture conversion processing that enlarges or reduces a picture using the ratio of the focal distances of the lenses of the right and left image pickup devices as the magnification. CONSTITUTION: The image pickup devices 101 and 102 output pictures obtained by picking up a subject image. The pictures obtained by the devices 101 and 102 are designated pictures L and R, respectively, and picture R is output to a correspondence processing part 104. With the focal distances of the lenses in the devices 101 and 102 defined as fL and fR (fL&lt;fR), a picture conversion processing part 103 enlarges or reduces picture L output from the device 101 using the ratio fR/fL of the focal distances as the magnification, and outputs the resulting picture L' to the processing part 104. The processing part 104 associates picture elements of the two pictures L' and R that are identical in real space, and outputs the coordinate values of the corresponding picture elements in pictures L' and R to a depth information extracting part 106.

Description

【発明の詳細な説明】 〈産業上の利用分野〉 本発明は、ステレオ画像により、カメラから被写体の各
部までの距離である奥行き情報を抽出するステレオ計測
装置に関するものである。
DETAILED DESCRIPTION OF THE INVENTION <Industrial Application Field> The present invention relates to a stereo measurement device that extracts depth information, which is the distance from a camera to each part of a subject, using stereo images.

〈従来の技術〉 従来のステレオ計測装置は、設置場所の異なる2台の同
じカメラで同一のシーンを撮影したり、1台のカメラで
カメラの場所を変えて2回撮影したりして得られる2つ
の画面において、実空間で同一であるものを対応付け、
三角測量の原理に基づいて奥行き情報を抽出するもので
ある。
<Conventional Technology> A conventional stereo measurement device photographs the same scene either with two identical cameras installed at different locations, or twice with a single camera moved between shots. It then matches, across the two resulting screens, points that are identical in real space, and extracts depth information based on the principle of triangulation.

〈発明が解決しようとする課題〉 従来の計測装置において、画像の量子化誤差の影響を受けずに正しく奥行き情報を抽出するためには、2台のカメラを十分離す必要がある。従って、計測系が大掛かりになったり、2つのカメラで撮影して得られる画面が大きく異なるため画面間の対応付けにおいて誤処理が生じたり対応付けができないことがあるという問題点があった。
<Problems to be Solved by the Invention> In a conventional measuring device, the two cameras must be placed sufficiently far apart in order to extract depth information correctly without being affected by image quantization errors. Consequently, the measurement system becomes large-scale, and because the screens obtained by the two cameras differ greatly, matching between the screens may produce erroneous correspondences or fail altogether.

本発明の目的は、このような従来の問題点を解決するた
めに、画面間の対応付けが容易で観測系が簡易なステレ
オ計測装置を提供することにある。
SUMMARY OF THE INVENTION In order to solve these conventional problems, it is an object of the present invention to provide a stereo measurement device that allows easy correspondence between screens and has a simple observation system.

く課題を解決するための手段〉 上記課題を解決する本発明の構成は、 使用するレンズの焦点距離が互いに異なっており、同一
の被写体を同時に撮影して被写体像を含む画面を出力す
る2台の並置した撮像装置と、 この2台の撮像装置から出力された2つの画面のうち一
方の画面に、双方の撮像装置のレンズの焦点距離の比を
倍率として拡大または縮小する画面変換処理を施し、画
面変換処理した変換処理画面を出力する画面変換処理部
と、 前記変換処理画面の各画素と、画面変換処理されていな
い他方の画面の各画素とを、実空間で同一であるものど
うしを対応付ける対応付は処理部と、 対応付けられた各画素の座標を基に、撮像装置から被写
体の各部までの距離である奥行き情報を抽出する奥行き
情報抽出部と、を有することを特徴とする。
Means for Solving the Problems> The configuration of the present invention to solve the above problems is as follows: The lenses used have different focal lengths, and two cameras are used to simultaneously photograph the same subject and output a screen containing the subject image. image pickup devices arranged side by side, and one of the two screens output from these two image pickup devices is subjected to screen conversion processing that enlarges or reduces the ratio of the focal lengths of the lenses of both image pickup devices as a magnification factor. , a screen conversion processing unit that outputs a converted screen that has undergone screen conversion processing, and a screen conversion processing unit that outputs a converted screen that has undergone screen conversion processing, and a screen conversion processing unit that outputs a converted screen that has undergone screen conversion processing, and a screen conversion processing unit that connects each pixel of the conversion processed screen and each pixel of the other screen that has not undergone screen conversion processing, so that they are the same in real space. The association is characterized by having a processing unit and a depth information extraction unit that extracts depth information that is the distance from the imaging device to each part of the subject based on the coordinates of each associated pixel.

〈作用〉 本発明では、左右の撮像装置に焦点距離の異なるレンズを用いて、同一のシーンに対して左右2つの画面を同時に取り込み、取り込んだ2つの画面のどちらか一方の画面に、左右の撮像装置のレンズの焦点距離の比を倍率として拡大または縮小する画面変換処理を施し、画面変換処理を施していないもう一方の画面と、画面変換処理を施した変換処理画面の2つの画面から奥行き情報を抽出する。
<Operation> In the present invention, lenses with different focal lengths are used in the left and right imaging devices to capture left and right screens of the same scene simultaneously. One of the two captured screens is enlarged or reduced by a screen conversion process whose magnification is the ratio of the focal lengths of the lenses of the left and right imaging devices, and depth information is extracted from two screens: the other screen, which has not undergone the conversion, and the converted screen, which has.

〈実施例〉 以下、本発明の実施例を図面に基づいて詳細に説明する。左側の撮像装置から得られる画面に画面変換処理を施す場合について、本発明の実施例の構成図を第1図に示す。同図において、101は左側の撮像装置、102は右側の撮像装置、103は画面変換処理部、104は対応付け処理部、105は観測系のパラメータ記憶部、106は奥行き情報抽出部である。
<Embodiments> Embodiments of the present invention will now be described in detail with reference to the drawings. FIG. 1 shows the configuration of an embodiment in which the screen conversion process is applied to the screen obtained from the left imaging device. In the figure, 101 is the left imaging device, 102 is the right imaging device, 103 is the screen conversion processing unit, 104 is the correspondence processing unit, 105 is the observation-system parameter storage unit, and 106 is the depth information extraction unit.

撮像装置101と撮像装置102は、双方のレンズの光軸が平行となるように左右に並んで配置されており、同一の被写体（シーン）を同時に撮影する。この場合、撮像装置101、102は、使用するレンズの焦点距離が互いに異なっている。そして撮像装置101、102は、被写体像を写し込んだ画面を出力する。
The imaging devices 101 and 102 are arranged side by side so that the optical axes of their lenses are parallel, and they photograph the same subject (scene) simultaneously. The lenses used by the imaging devices 101 and 102 have mutually different focal lengths. Each imaging device then outputs a screen onto which the subject image has been projected.

撮像装置101が取り込んだ画面を画面Lとし、この画面Lを画面変換処理部103に出力する。撮像装置102が取り込んだ画面を画面Rとし、この画面Rを対応付け処理部104に出力する。また、撮像装置101、102におけるレンズの焦点距離をそれぞれfL、fR（fL&lt;fR）とする。
The screen captured by the imaging device 101 is designated screen L, and this screen L is output to the screen conversion processing unit 103. The screen captured by the imaging device 102 is designated screen R, and this screen R is output to the correspondence processing unit 104. The focal lengths of the lenses of the imaging devices 101 and 102 are denoted fL and fR, respectively (fL &lt; fR).

画面変換処理部103は、撮像装置101より出力される画面Lに、撮像装置101、102のレンズの焦点距離の比であるfR/fLを倍率として拡大または縮小する画面変換処理を施し、画面Lを画面変換処理した画面L′を対応付け処理部104に出力する。
The screen conversion processing unit 103 applies to screen L, output from the imaging device 101, a screen conversion process that enlarges or reduces it using fR/fL, the ratio of the focal lengths of the lenses of the imaging devices 101 and 102, as the magnification, and outputs the converted screen L' to the correspondence processing unit 104.
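As a concrete illustration of this conversion step, the sketch below rescales screen L by the focal-length ratio fR/fL using nearest-neighbour resampling. This is a minimal stand-in, not the patent's implementation: the patent leaves the choice of scaling method open, and the function name, NumPy representation, and nearest-neighbour choice are all assumptions for illustration.

```python
import numpy as np

def convert_screen(picture_l: np.ndarray, f_l: float, f_r: float) -> np.ndarray:
    """Enlarge (or reduce) picture L by the focal-length ratio fR/fL so
    that objects appear at the same scale as in picture R.
    Nearest-neighbour resampling is used for simplicity."""
    scale = f_r / f_l
    h, w = picture_l.shape[:2]
    new_h, new_w = int(round(h * scale)), int(round(w * scale))
    # Map each destination pixel back to its source pixel.
    rows = np.minimum((np.arange(new_h) / scale).astype(int), h - 1)
    cols = np.minimum((np.arange(new_w) / scale).astype(int), w - 1)
    return picture_l[rows[:, None], cols[None, :]]
```

With fL = 1 and fR = 2 a 4x4 screen L becomes an 8x8 screen L', each source pixel covering a 2x2 block, so that the two screens can be compared pixel by pixel.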

画面を拡大したり縮小したりする画面変換処理方式とし
ては、編集機能付インテリジェントコピー等に用いられ
ている拡大縮小方式、例えば、SPC方式、論理和法、
投影法、9分割法、高速投影法など、各方式を採用する
ことができる。
As the screen conversion processing method for enlarging or reducing a screen, any of the scaling methods used in intelligent copiers with editing functions can be adopted, for example the SPC method, the logical-sum method, the projection method, the 9-division method, or the high-speed projection method.

対応付け処理部104は、画面変換処理部103より出力される画面L′と、右側の撮像装置102より出力される画面Rの2つの画面上の各画素において、実空間で同一であるものを対応付け、対応付けられた画面L′上と画面R上の各画素の座標値を奥行き情報抽出部106に出力する。2つの画面間の対応付け処理は、例えば「コンピュータビジョン」（白井良明、昭晃堂、pp.65-71）に示された方式で行なうことにより実現できる。
The correspondence processing unit 104 matches pixels that are identical in real space between the two screens: screen L' output from the screen conversion processing unit 103 and screen R output from the right imaging device 102. It then outputs the coordinate values of each matched pixel on screens L' and R to the depth information extraction unit 106. The matching between the two screens can be realized, for example, by the method described in "Computer Vision" (Yoshiaki Shirai, Shokodo, pp. 65-71).

ここで、この対応付け処理の手法を、第4図を参照して概説しておく。例えば左画面の中の点PLに対応する右画面の中の点PRを見つけるには、まず点PLを中心に含むウィンドWLを設定する。次に、右画面の中に、点PLの対応点となる複数の候補を決めておき、各候補点を中心とした各ウィンドWRとウィンドWLを比較し、ウィンドWLに最も近いウィンドWRの中にある点を対応点PRとする。図においてWLとWRの画像は似ているが、WLとW′Rは異なっているので、W′R中の点P′Rは対応点でないことがわかる。ウィンド同士の類否は、ウィンドの明るさと明るさの分散を評価することにより判定する。
Here, the correspondence processing method is outlined with reference to FIG. 4. For example, to find the point PR on the right screen corresponding to a point PL on the left screen, a window WL containing PL at its center is first set. Next, several candidate points for the correspondence of PL are chosen on the right screen, the window WR centered on each candidate is compared with WL, and the point at the center of the window WR closest to WL is taken as the corresponding point PR. In the figure, the images in WL and WR are similar, while WL and W'R differ, so the point P'R in W'R is seen not to be a corresponding point. The similarity between windows is judged by evaluating the brightness of the windows and the variance of the brightness.
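The window-comparison step can be sketched as follows. The similarity criterion used here, comparing window mean brightness and brightness variance, is one plausible reading of the description above; practical systems often use SSD or normalized cross-correlation instead. All names and the window size are illustrative assumptions.

```python
import numpy as np

def find_corresponding_point(left, right, p_l, candidates, half=2):
    """Among candidate points on the right screen, pick the one whose
    surrounding window most resembles the window around p_l on the left
    screen, judged by mean brightness and brightness variance."""
    def window(img, p):
        r, c = p
        return img[r - half:r + half + 1, c - half:c + half + 1].astype(float)

    w_l = window(left, p_l)
    stats_l = np.array([w_l.mean(), w_l.var()])
    best, best_d = None, float("inf")
    for p in candidates:
        w_r = window(right, p)
        # Squared distance between (mean, variance) feature vectors.
        d = np.sum((np.array([w_r.mean(), w_r.var()]) - stats_l) ** 2)
        if d < best_d:
            best, best_d = p, d
    return best
```

For a bright square shifted two pixels to the right between the screens, the candidate centered on the shifted square wins over a decoy in an empty region, mirroring the WR-versus-W'R comparison of FIG. 4.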

一方、第1図における観測系のパラメータ記憶部105は、撮像装置101、102におけるレンズの焦点距離fLとfR、及び、予め実測しておいた撮像装置101、102の撮像面間の距離ZRLを記憶しておき、記憶しておいたfL、fRとZRLの値を奥行き情報抽出部106に出力する。
Meanwhile, the observation-system parameter storage unit 105 in FIG. 1 stores the focal lengths fL and fR of the lenses of the imaging devices 101 and 102, together with the previously measured distance ZRL between the imaging planes of the imaging devices 101 and 102, and outputs the stored values of fL, fR, and ZRL to the depth information extraction unit 106.

奥行き情報抽出部106は、対応付け処理部104より出力される実空間で同一である点の画面L′と画面Rにおける座標値と、観測系のパラメータ記憶部105より出力される観測系のパラメータから奥行き情報を抽出する。
The depth information extraction unit 106 extracts depth information from the coordinate values, on screens L' and R, of the points matched as identical in real space by the correspondence processing unit 104, and from the observation-system parameters output from the parameter storage unit 105.

奥行き情報抽出部106の実施例について以下に説明す
る。
An example of the depth information extraction unit 106 will be described below.

本実施例における座標系を第2図に示す。The coordinate system in this example is shown in FIG.

左右の撮像装置101、102の位置が第2図と異なる場合には、左右の撮像装置101、102から得られる画面Lと画面Rに回転移動及び平行移動させるアフィン変換を施すことにより、第2図のように左右の撮像装置101、102を配置したときに得られる画面を求めることができる。左右の撮像装置101、102の位置が第2図のとき、シーンにおける点Pの実空間での位置と画面L、画面R上での位置の関係は、第3図に示すようになる。
If the positions of the left and right imaging devices 101 and 102 differ from those in FIG. 2, the screens that would be obtained with the devices arranged as in FIG. 2 can be derived by applying an affine transformation (rotation and translation) to the screens L and R obtained from them. With the devices positioned as in FIG. 2, the relationship between the real-space position of a point P in the scene and its positions on screens L and R is as shown in FIG. 3.
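The coordinate part of that affine rectification can be sketched as below, assuming a pure in-plane rotation plus translation; a complete implementation would also resample the pixel grid. The function name and parameterization are illustrative, not taken from the patent.

```python
import numpy as np

def rectify_points(points, angle_rad, t):
    """Rotate and translate screen coordinates (N x 2 array) so that a
    screen taken with an arbitrarily posed camera lines up with the
    canonical camera layout of FIG. 2."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rot = np.array([[c, -s], [s, c]])  # 2-D rotation matrix
    return points @ rot.T + np.asarray(t)
```

A 90-degree rotation maps the point (1, 0) to (0, 1), as expected of a counter-clockwise rotation about the origin.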

したがって、点Pの実空間での座標値をP(X, Y, Z)、点Pの画面L上での座標値を(UL, VL)、画面R上での座標値を(UR, VR)とすると、点Pの空間的位置のX座標値、Y座標値は、次式で表せる。
Therefore, letting the real-space coordinates of point P be P(X, Y, Z), its coordinates on screen L be (UL, VL), and its coordinates on screen R be (UR, VR), the X and Y coordinate values of the spatial position of point P can be expressed by the following equations.

画面Lについて。ここで、YRLは左右の撮像装置101、102におけるレンズの光軸間の距離、ZRLは左右の撮像装置の撮像面間の距離である。
For screen L. Here, YRL is the distance between the optical axes of the lenses of the left and right imaging devices 101 and 102, and ZRL is the distance between the imaging planes of the left and right imaging devices.

式(1)、式(3)よりXを消去すると、次式が得られる。Eliminating X from equations (1) and (3) yields the following equation.

また、画面変換処理部103において、左側の撮像装置101より出力される画面Lに、左右の撮像装置におけるレンズの焦点距離の比であるfR/fLを倍率として画面変換処理を施しているから、画面L上の(UL, VL)と、その点の画面変換後の画面L′上における座標値(U′L, V′L)とのあいだには、式(6)が成り立つ。式(6)を式(5)に代入すると、式(7)が得られる。式(7)から点Pの奥行き情報が求められる。
Furthermore, since the screen conversion processing unit 103 applies to screen L, output from the left imaging device 101, a screen conversion whose magnification is fR/fL, the ratio of the focal lengths of the lenses of the left and right imaging devices, equation (6) holds between (UL, VL) on screen L and the coordinates (U'L, V'L) of that point on the converted screen L'. Substituting equation (6) into equation (5) yields equation (7), from which the depth information of point P is obtained.

すなわち、奥行き情報抽出部106は、第1図に示す対応付け処理部104から出力される、実空間で同一であると対応付けられた画面L′上と画面R上の各画素の座標値のなかのX成分U′L、URと、第1図に示すパラメータ記憶部105から出力される観測系のパラメータfL、fR、ZRLを用いて、式(7)より各点P(X, Y, Z)のZ座標値を算出することにより奥行き情報を抽出することができる。
That is, the depth information extraction unit 106 can extract depth information by calculating, from equation (7), the Z coordinate of each point P(X, Y, Z), using the X components U'L and UR of the coordinates of the pixels on screens L' and R matched as identical in real space by the correspondence processing unit 104 shown in FIG. 1, together with the observation-system parameters fL, fR, and ZRL output from the parameter storage unit 105 shown in FIG. 1.
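Equations (1)-(7) themselves are not reproduced in this text (they appeared as images in the original), so the sketch below substitutes the standard parallel-axis triangulation relation Z = f * baseline / disparity, which applies once screen L has been scaled to the common focal length fR. It is an illustrative stand-in for equation (7), not the patent's exact formula; the imaging-plane offset ZRL and sign conventions are omitted for simplicity.

```python
def extract_depth(u_l_conv: float, u_r: float, f_r: float, baseline_y_rl: float) -> float:
    """Depth of a matched point from its x-coordinates on the converted
    screen L' and on screen R.  After conversion the two screens behave
    like a pair of cameras sharing the focal length fR, so the classical
    relation Z = f * B / disparity applies (illustrative stand-in for
    the patent's equation (7))."""
    disparity = u_l_conv - u_r
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity or bad match")
    return f_r * baseline_y_rl / disparity
```

For example, with fR = 8, a baseline YRL = 0.5, and matched x-coordinates 12.0 and 10.0, the disparity is 2.0 and the computed depth is 2.0 in the same units as the baseline.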

さらに、対応付け処理部104から出力される座標値のなかで実空間における各点Pの画面R上の座標値(UR, VR)と、パラメータ記憶部105から出力される観測系のパラメータと、式(7)より算出した点Pの奥行き情報を用いて、式(3)、式(4)から点Pの空間的位置を求めることもできる。
Furthermore, the spatial position of point P can also be obtained from equations (3) and (4), using the coordinates (UR, VR) of each point P on screen R among the coordinate values output from the correspondence processing unit 104, the observation-system parameters output from the parameter storage unit 105, and the depth information of point P calculated from equation (7).

〈発明の効果〉 以上説明したように、本発明によれば、左右の撮像装置
に焦点距離の異なるレンズを用いて、同一のシーンに対
して取り込んだ左右2つの画面のどちらか一方の画面に
、左右の撮像装置のレンズの焦点距離の比を倍率として
拡大または縮小する画面変換処理を施し、画面変換処理
を施していないもう一方の画面と、画面変換処理を施し
た変換処理画面の2つの画面から奥行き情報を抽出でき
るので、左右の撮像装置を隣接させることができるため
、観測系が簡易で、対応付けの誤処理から生じる奥行き
情報抽出誤差の少ないステレオ計測装置が実現できる。
<Effects of the Invention> As explained above, according to the present invention, lenses with different focal lengths are used in the left and right imaging devices; one of the two left and right screens captured of the same scene is enlarged or reduced by a screen conversion process whose magnification is the ratio of the focal lengths of the two lenses, and depth information is extracted from the converted screen together with the other, unconverted screen. Since the left and right imaging devices can therefore be placed adjacent to each other, the observation system is simple, and a stereo measurement device with few depth-extraction errors caused by erroneous correspondences can be realized.

換言すると、本発明のステレオ計測装置は、左右の撮像装置間の距離に関係なく奥行き情報が抽出できるため、左右の撮像装置を隣接させることができる。
In other words, since the stereo measurement device of the present invention can extract depth information regardless of the distance between the left and right imaging devices,
The left and right imaging devices can be placed adjacent to each other.

左右の撮像装置を隣接させたときには、画面変換手段から出力される2つの画面の類似度が高いので、画面間の対応付けが容易になることと観測系が簡易になることの特長を持つ。
When the left and right imaging devices are placed adjacent to each other, the similarity between the two screens output from the screen converting means is high, making it easy to correlate the screens and simplifying the observation system.

【図面の簡単な説明】[Brief explanation of drawings]

第1図は本発明の一実施例を示す構成図、第2図は実施例における座標系を示す説明図、第3図は空間的位置と画面上の位置の関係を示す説明図、第4図は対応付け手法を示す原理図である。図面中、101は左側の撮像装置、102は右側の撮像装置、103は画面変換処理部、104は対応付け処理部、105は観測系のパラメータ記憶部、106は奥行き情報抽出部である。特許出願人 日本電信電話株式会社
FIG. 1 is a configuration diagram showing an embodiment of the present invention; FIG. 2 is an explanatory diagram showing the coordinate system in the embodiment; FIG. 3 is an explanatory diagram showing the relationship between spatial position and position on the screen; and FIG. 4 is a principle diagram showing the correspondence method. In the drawings, 101 is the left imaging device, 102 is the right imaging device, 103 is the screen conversion processing unit, 104 is the correspondence processing unit, 105 is the observation-system parameter storage unit, and 106 is the depth information extraction unit. Patent applicant: Nippon Telegraph and Telephone Corporation.

Claims (1)

【特許請求の範囲】 使用するレンズの焦点距離が互いに異なっており、同一
の被写体を同時に撮影して被写体像を含む画面を出力す
る2台の並置した撮像装置と、 この2台の撮像装置から出力された2つの画面のうち一
方の画面に、双方の撮像装置のレンズの焦点距離の比を
倍率として拡大または縮小する画面変換処理を施し、画
面変換処理した変換処理画面を出力する画面変換処理部
と、 前記変換処理画面の各画素と、画面変換処理されていな
い他方の画面の各画素とを、実空間で同一であるものど
うしを対応付ける対応付け処理部と、 対応付けられた各画素の座標を基に、撮像装置から被写
体の各部までの距離である奥行き情報を抽出する奥行き
情報抽出部と、 を有することを特徴とするステレオ計測装置。
[Claims] A stereo measurement device comprising: two juxtaposed imaging devices whose lenses have mutually different focal lengths and which simultaneously photograph the same subject, each outputting a screen containing the subject image; a screen conversion processing unit that applies, to one of the two screens output from the two imaging devices, a screen conversion process that enlarges or reduces it using the ratio of the focal lengths of the lenses of the two imaging devices as the magnification, and outputs the resulting converted screen; a correspondence processing unit that associates pixels of the converted screen with pixels of the other, unconverted screen that are identical in real space; and a depth information extraction unit that extracts depth information, i.e., the distance from the imaging devices to each part of the subject, based on the coordinates of the associated pixels.
JP33874689A 1989-12-28 1989-12-28 Stereoscopic measuring instrument Pending JPH03200007A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP33874689A JPH03200007A (en) 1989-12-28 1989-12-28 Stereoscopic measuring instrument

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP33874689A JPH03200007A (en) 1989-12-28 1989-12-28 Stereoscopic measuring instrument

Publications (1)

Publication Number Publication Date
JPH03200007A true JPH03200007A (en) 1991-09-02

Family

ID=18321071

Family Applications (1)

Application Number Title Priority Date Filing Date
JP33874689A Pending JPH03200007A (en) 1989-12-28 1989-12-28 Stereoscopic measuring instrument

Country Status (1)

Country Link
JP (1) JPH03200007A (en)

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05157528A (en) * 1991-12-03 1993-06-22 Nippon Steel Corp Three-dimensional analyzing method for shape of corrosion
US7103212B2 (en) 2002-11-22 2006-09-05 Strider Labs, Inc. Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
JP2006093858A (en) * 2004-09-21 2006-04-06 Olympus Corp Camera mounted with twin lens image pick-up system
US11525676B2 (en) 2017-08-09 2022-12-13 Mitsumi Electric Co., Ltd. Distance measuring camera
JP2019032295A (en) * 2017-08-09 2019-02-28 ミツミ電機株式会社 Distance measuring camera
CN110998228A (en) * 2017-08-09 2020-04-10 三美电机株式会社 Distance measuring camera
JP2022128517A (en) * 2017-08-09 2022-09-01 ミツミ電機株式会社 ranging camera
US11499824B2 (en) * 2017-12-18 2022-11-15 Mitsumi Electric Co., Ltd. Distance measuring camera
WO2019124040A1 (en) * 2017-12-18 2019-06-27 ミツミ電機株式会社 Distance measuring camera
EP3730898A4 (en) * 2017-12-18 2021-08-25 Mitsumi Electric Co., Ltd. Distance measuring camera
CN111492201A (en) * 2017-12-18 2020-08-04 三美电机株式会社 Distance measuring camera
JP2019109124A (en) * 2017-12-18 2019-07-04 ミツミ電機株式会社 Ranging camera
JP2022128516A (en) * 2017-12-18 2022-09-01 ミツミ電機株式会社 ranging camera
JP2019133526A (en) * 2018-02-01 2019-08-08 ミツミ電機株式会社 Authentication device
WO2019150807A1 (en) * 2018-02-01 2019-08-08 ミツミ電機株式会社 Authentication device
CN111670455A (en) * 2018-02-01 2020-09-15 三美电机株式会社 Authentication device
US11436746B2 (en) 2018-03-19 2022-09-06 Mitsumi Electric Co., Ltd. Distance measuring camera
CN111868474A (en) * 2018-03-19 2020-10-30 三美电机株式会社 Distance measuring camera
JP2022128518A (en) * 2018-03-19 2022-09-01 ミツミ電機株式会社 ranging camera
JP2019164011A (en) * 2018-03-19 2019-09-26 ミツミ電機株式会社 Range-finding camera
EP3770550A4 (en) * 2018-03-19 2022-01-05 Mitsumi Electric Co., Ltd. Distance measurement camera
US11410321B2 (en) 2018-07-06 2022-08-09 Mitsumi Electric Co., Ltd. Distance measuring camera
WO2020008832A1 (en) * 2018-07-06 2020-01-09 ミツミ電機株式会社 Distance measurement camera
JP2020008415A (en) * 2018-07-06 2020-01-16 ミツミ電機株式会社 Distance measuring camera
CN112368544A (en) * 2018-07-06 2021-02-12 三美电机株式会社 Distance measuring camera
JP2020020775A (en) * 2018-07-18 2020-02-06 ミツミ電機株式会社 Distance measuring camera
US11341668B2 (en) 2018-07-18 2022-05-24 Mitsumi Electric Co., Ltd. Distance measuring camera
WO2020017377A1 (en) * 2018-07-18 2020-01-23 ミツミ電機株式会社 Ranging camera
WO2020017209A1 (en) * 2018-07-18 2020-01-23 ミツミ電機株式会社 Distance measurement camera
JP2020126371A (en) * 2019-02-01 2020-08-20 ミツミ電機株式会社 Authentication device
WO2020158158A1 (en) * 2019-02-01 2020-08-06 ミツミ電機株式会社 Authentication device
WO2020162003A1 (en) * 2019-02-06 2020-08-13 ミツミ電機株式会社 Distance measurement camera
CN113424020A (en) * 2019-02-06 2021-09-21 三美电机株式会社 Distance measuring camera
JP2020126029A (en) * 2019-02-06 2020-08-20 ミツミ電機株式会社 Distance measurement camera
US11842507B2 (en) 2019-02-06 2023-12-12 Mitsumi Electric Co., Ltd. Distance measuring camera
CN113424020B (en) * 2019-02-06 2024-04-09 三美电机株式会社 Distance measuring camera

Similar Documents

Publication Publication Date Title
JP3242529B2 (en) Stereo image matching method and stereo image parallax measurement method
JP4825980B2 (en) Calibration method for fisheye camera.
US5602584A (en) Apparatus for producing a panoramic image using a plurality of optical systems
JP2874710B2 (en) 3D position measuring device
US11606542B2 (en) Projection image automatic correction method and system based on binocular vision
JP4825971B2 (en) Distance calculation device, distance calculation method, structure analysis device, and structure analysis method.
JPH03200007A (en) Stereoscopic measuring instrument
CN108629756B (en) Kinectv2 depth image invalid point repairing method
WO2018235163A1 (en) Calibration device, calibration chart, chart pattern generation device, and calibration method
JPH0719832A (en) Extracting method for corresponding points of pulirity of images
CA3233222A1 (en) Method, apparatus and device for photogrammetry, and storage medium
Svoboda et al. Matching in catadioptric images with appropriate windows, and outliers removal
JP2996067B2 (en) 3D measuring device
JPH10320558A (en) Calibration method, corresponding point search method and device therefor, focus distance detection method and device therefor, three-dimensional position information detection method and device therefor, and recording medium
AU2013308155B2 (en) Method for description of object points of the object space and connection for its implementation
JP7489253B2 (en) Depth map generating device and program thereof, and depth map generating system
CN111489384B (en) Method, device, system and medium for evaluating shielding based on mutual viewing angle
JPH0252204A (en) Measuring instrument for three-dimensional coordinate
JPH055609A (en) Cubic recognition method of image
JP2002135807A (en) Method and device for calibration for three-dimensional entry
JP3340599B2 (en) Plane estimation method
JPH05122602A (en) Range moving image input processing method
Szpytko et al. Stereovision 3D type workspace mapping system architecture for transport devices
CN111080689B (en) Method and device for determining face depth map
JP2009237652A (en) Image processing apparatus and method, and program