JPH0392712A - Three-dimensional position recognition by use of image processing device and distance measuring sensor - Google Patents

Three-dimensional position recognition by use of image processing device and distance measuring sensor

Info

Publication number
JPH0392712A
JPH0392712A (application JP22826189A)
Authority
JP
Japan
Prior art keywords
image processing
distance
camera
points
dimensional position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP22826189A
Other languages
Japanese (ja)
Inventor
Nobutoshi Torii
信利 鳥居
Soji Kato
宗嗣 加藤
Yoshiharu Iwata
吉晴 岩田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fanuc Corp
Original Assignee
Fanuc Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fanuc Corp filed Critical Fanuc Corp
Priority to JP22826189A priority Critical patent/JPH0392712A/en
Publication of JPH0392712A publication Critical patent/JPH0392712A/en
Pending legal-status Critical Current


Abstract

PURPOSE: To recognize the three-dimensional position of an object accurately by detecting the positions of three points on the object on a reference plane orthogonal to the optical axis of a camera, and recognizing the three-dimensional positions of the three points from their parallax-corrected positions and their distances from a distance measuring sensor.

CONSTITUTION: For gauge holes 3 whose positions are known on, for example, a vehicle door 2, the robot 1 is taught a position at which the camera 22 directly faces each gauge hole 3, and the camera is calibrated while facing each hole. The distance from each gauge hole to the distance measuring sensor 24 (the position of the reference plane) and the position of the point measured by the sensor 24, expressed as a correction amount (offset) from the center of the gauge hole 3, are stored in the image processing device. The image processing device then recognizes the three-dimensional position of each gauge hole 3 from the stored distance and correction amounts, and the three-dimensional position of the object is recognized from the resulting three-dimensional positions of the three points.

Description

DETAILED DESCRIPTION OF THE INVENTION

Field of Industrial Application

The present invention relates to a method of recognizing the three-dimensional position of a workpiece, for use in assembly and conveyance work performed by automatic machines such as robots.

Prior Art

In automatic machines such as robots, the position of a workpiece must be detected before any operation can be performed on it. Normally the position of a standard workpiece is taught to the machine, but the position of each individual workpiece deviates from that standard position, so position correction is necessary. This correction is performed by photographing the object with the camera of an image processing device and detecting the object's position. However, because of the camera's characteristics, an image processing device can recognize the position of the target object only in a plane perpendicular to the camera's optical axis; it cannot detect the object's three-dimensional position. One known way of recognizing the three-dimensional position of an object is to recognize, with a stereo camera or the like, the three-dimensional positions of points whose locations on the object are known, for example holes (hereinafter called gauge holes), and then to recognize the object's three-dimensional position from the positions of three such gauge holes.

Problems to be Solved by the Invention

When a stereo camera or the like is used to recognize the three-dimensional position of a gauge hole, camera calibration becomes complicated: the calibration work needed to relate the camera coordinate axes to the coordinate axes of an automatic machine such as a robot is complex and requires considerable labor and time. Furthermore, because the gauge hole is viewed obliquely, it is imaged as an ellipse, and detection of the hole is unstable because of factors such as its depth.

The object of the present invention is therefore to provide a method of recognizing the three-dimensional position of an object in which calibration of the camera of the image processing device and image processing of the target object are simple, and in which the three-dimensional position of the object can be recognized accurately.

Means for Solving the Problems

According to the present invention, for three known points on an object, an image processing device detects the position of each point on a reference plane perpendicular to the optical axis of the device's camera; a distance measuring sensor measures the distance from the sensor to each of the three points; each detected position on the reference plane is corrected for parallax according to the corresponding measured distance; the three-dimensional position of each of the three points is recognized from its parallax-corrected position and its distance from the sensor; and the three-dimensional position of the object is recognized from the three-dimensional positions of the three points.

In addition, when the surface containing a point to be measured is not perpendicular to the measuring direction of the distance measuring sensor, several points around that point are measured with the sensor, the average of the measured values is taken as the point's distance from the sensor, and the object's three-dimensional position is recognized using that distance.
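As a minimal sketch of this averaging step (illustrative only; the function name is not from the patent):

```python
# Illustrative sketch (not from the patent): when the measured surface
# is tilted relative to the sensor's measuring direction, take several
# distance readings around the target point and use their mean as that
# point's distance from the sensor.

def distance_to_point(readings):
    """Average of the distance readings S1..Sn taken around the point (mm)."""
    if not readings:
        raise ValueError("at least one distance reading is required")
    return sum(readings) / len(readings)

# Four readings around a tilted gauge-hole surface:
print(distance_to_point([498.0, 502.0, 499.0, 501.0]))  # -> 500.0
```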

Operation

The camera of the image processing device is placed so that it directly faces a point to be detected on the object, and the position (x, y) of that point on a reference plane perpendicular to the camera's optical axis is determined. Next, the distance from the distance measuring sensor to the point is measured, and the position (x, y) on the reference plane is corrected for parallax using this distance and the position of the reference plane, yielding the point's three-dimensional position. When the surface containing the point is not perpendicular to the measuring direction, several points around it are measured with the sensor, and the average of the measured values is taken as the distance between the point and the sensor.

By detecting in this way the three-dimensional positions of three points whose locations on the object are known, the three-dimensional position of the object can be recognized.
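One common way to turn three such points into an object pose (the patent leaves this step to the robot controller, so the method below is an assumption, not the patent's own procedure) is to build an orthonormal frame from the three measured points and compare it with the frame built from their taught, nominal positions:

```python
# Assumed method (not spelled out in the patent): three non-collinear
# points define an orthonormal frame, so comparing the frame built from
# the measured gauge-hole positions with the frame built from their
# nominal (taught) positions yields the object's rigid transform.

import numpy as np

def frame_from_points(p0, p1, p2):
    """Orthonormal frame (3x3 rotation, columns x/y/z) and origin from 3 points."""
    p0, p1, p2 = map(np.asarray, (p0, p1, p2))
    x = p1 - p0
    x = x / np.linalg.norm(x)
    z = np.cross(x, p2 - p0)            # normal to the plane of the 3 points
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)                  # completes a right-handed frame
    return np.column_stack([x, y, z]), p0

def object_pose(measured, nominal):
    """Rigid transform (R, t) taking the nominal points onto the measured ones."""
    Rm, tm = frame_from_points(*measured)
    Rn, tn = frame_from_points(*nominal)
    R = Rm @ Rn.T
    t = tm - R @ tn
    return R, t

nominal  = [(0, 0, 0), (100, 0, 0), (0, 50, 0)]
measured = [(5, 3, 2), (105, 3, 2), (5, 53, 2)]   # pure translation by (5, 3, 2)
R, t = object_pose(measured, nominal)
print(np.round(t, 6))  # -> [5. 3. 2.]
```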

Embodiment

FIG. 3 shows the configuration of one embodiment of the present invention.

A camera 22 of an image processing device 10 and a distance measuring sensor 24, such as a laser sensor that measures distance by laser light, are attached to the wrist of the robot 1. They measure the three-dimensional position of a gauge hole 3, a hole at a known position on a target object 2 such as a vehicle door. Reference numeral 26, shown by a broken line, denotes a tool attached to the robot 1.

FIG. 2 is a block diagram of the main parts of the image processing device used in this embodiment.

In the figure, 10 denotes the image processing device. It has a main central processing unit (hereinafter, main CPU) 11, to which a camera interface 12, an image processing processor 13, a console interface 14, a communication interface 15, a sensor interface 16, a frame memory 17, a control software memory 18 formed of ROM, a program memory 19 formed of RAM or the like, and a data memory 20 formed of non-volatile RAM are connected by a bus 21.

A camera 22 for photographing a target object such as a car door is connected to the camera interface 12; an image captured in the field of view of the camera 22 is converted into a grayscale image and stored in the frame memory 17.

The image processing processor 13 processes the images stored in the frame memory 17 to identify the object and measure its position and orientation.

A console 23 is connected to the console interface 14. In addition to a liquid crystal display, the console 23 has various command keys and numeric keys for entering, editing, registering and executing application programs, and the liquid crystal display can show menus for setting various data, program lists, and the like.

The control software memory 18 stores the control program by which the main CPU 11 controls the image processing device, and the program memory 19 stores programs created by the user.

The control device of the robot 1 is connected to the communication interface 15, and the distance measuring sensor 24 is connected to the sensor interface 16 through an A/D converter 25.

Except for the addition of the sensor interface 16, the A/D converter 25 and the distance measuring sensor 24, this configuration is the same as that of a conventional visual sensor system.

Next, the operation of the present invention will be explained.

For points whose positions are known on the target object whose three-dimensional position is to be recognized (for example, a car door), here three gauge holes 3, the robot 1 is taught a position at which the camera 22 directly faces each gauge hole 3 (a position at which the optical axis of the camera 22 is perpendicular to the surface of each gauge hole 3), and the camera is calibrated while facing each hole. The distance from each gauge hole 3 to the distance measuring sensor 24 (camera 22) at that time (the position of the reference plane), and the position of the point to be measured by the sensor 24, expressed as a correction amount (offset) from the center of the gauge hole 3, are set in the data memory 20 of the image processing device 10.

The center position of a gauge hole 3 is found by photographing the hole with the camera 22 and detecting the hole's center position (x, y) on the reference plane; the hole's position in the direction (z) normal to the reference plane is then measured with the distance measuring sensor 24. If the reference plane is perpendicular to the camera's optical axis (the measuring direction of the sensor 24), or if the deviation angle is small, a single measurement point S measured by the sensor 24 suffices, as shown in FIG. 4. If the reference plane is tilted rather than exactly perpendicular, the positions of a plurality of measurement points S1 to S4 are set as correction amounts, as shown in FIG. 5, and the distances to the points S1 to S4 measured by the sensor 24 are averaged to give the distance of each gauge hole 3 from the sensor 24 (camera 22).

After the camera 22 has been calibrated and the various data have been set in this way, when a target object arrives and its three-dimensional position is to be recognized, the control device of the robot 1 first moves the camera 22 to the taught position where the first gauge hole can be photographed, and outputs a snap command to the image processing device 10. On receiving the snap command from the robot, the CPU 11 of the image processing device 10 starts the processing shown in the flowchart of FIG. 1.

The CPU 11 causes the camera 22 to photograph the gauge hole 3, has the image processing processor 13 perform grayscale gradation processing, and stores the resulting image in the frame memory 17 (Step 100).

Next, the CPU 11 outputs a gauge hole detection command to the image processing processor 13, which recognizes the shape contour of the object, that is, the gauge hole, by a conventional method and determines the hole's center position (xi, yi) (Step 101).

Here i is an index, initially set to "0"; i = 0 indicates the first gauge hole, i = 1 the second, and i = 2 the third.

Next, from the correction amounts of the measurement point set in the data memory 20 and the gauge hole position determined in Step 101, the CPU computes correction data, namely the position of the point to be measured by the distance measuring sensor 24, and sends it to the control device of the robot 1 through the communication interface 15 (Steps 102, 103).

The control device of the robot 1 receives this correction data, drives the robot 1 to move the distance measuring sensor 24 to the position directly facing the measurement point S, and, when the movement is complete, sends a completion signal to the image processing device 10. On receiving the completion signal (Step 104), the CPU 11 of the image processing device 10 measures the distance to the measurement point S with the sensor 24 and obtains measurement data zi

(Step 105). The CPU then sends a measurement completion signal to the control device of the robot 1 and performs parallax correction using the measured distance zi and the stored position of the reference plane for that gauge hole (the distance between the distance sensor and the gauge hole surface, set as the reference position). That is, as shown in FIG. 6, if the distance between the camera 22 (distance measuring sensor 24) and the reference plane is zs and the distance measured by the sensor 24 is z, an error a arises from parallax. This error a is applied as a correction to the center position (xi, yi) of the gauge hole 3 obtained in Step 101, and the position Pi(xi', yi', zi) of the gauge hole 3 is determined and stored (Steps 106 to 108). In this embodiment a single measurement point S is used; when a plurality of measurement points S1 to S4 are measured as shown in FIG. 5, Steps 102 to 105 are performed for each measurement point, the average of the measurement data obtained is taken as the gauge hole's distance from the sensor 24, and parallax correction is then performed.
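The correction formula itself is not written out in the text. Under a pinhole-camera assumption, a feature detected at (xi, yi) on the reference plane at distance zs, but actually lying at measured distance z, has its lateral coordinates scaled by z/zs, which reproduces the error a of FIG. 6. A hedged sketch of this assumed correction:

```python
# Hedged sketch of the parallax correction of FIG. 6. The patent gives no
# formula; a pinhole-camera model is assumed here. A feature detected at
# (x, y) on the reference plane at distance zs, but actually lying at
# measured distance z, has true lateral coordinates scaled by z / zs
# (the offset a = x * (z - zs) / zs in the x direction).

def parallax_correct(x, y, z, zs):
    """Return (x', y') corrected for parallax between depth z and reference zs."""
    scale = z / zs
    return x * scale, y * scale

xp, yp = parallax_correct(10.0, 5.0, z=550.0, zs=500.0)
print(xp, yp)  # -> 11.0 5.5
```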

Next, the index i is incremented by 1, and it is judged whether i has reached "2", that is, whether the positions of all three gauge holes have been detected (Steps 109, 110). If not, the processing ends.

Meanwhile, when the control device of the robot 1 receives the measurement completion signal sent in Step 106, it moves the robot to the position where the camera 22 directly faces the next gauge hole and sends a snap command to the image processing device 10. When the image processing device 10 receives the snap command, it starts the processing from Step 100 again. When the positions P0(x0', y0', z0), P1(x1', y1', z1) and P2(x2', y2', z2) of the three gauge holes have been detected and the index i is confirmed to have reached "2" (Step 110), the CPU 11 of the image processing device 10 sends the three gauge hole positions P0, P1, P2 to the control device of the robot 1, resets the index i to "0", and ends the three-dimensional position recognition processing for the target object. From the received positions of the three points, the control device of the robot 1 can recognize the three-dimensional position of the target object.
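The flow of Steps 100 through 108 for the three gauge holes can be sketched as a loop; every helper callable below is an invented stand-in for the camera, image processor, robot and sensor of the embodiment, not an API from the patent:

```python
# Sketch of the recognition loop of FIG. 1 (Steps 100-110). The helper
# callables (snap, find_center, measure, correct) are assumptions that
# stand in for the hardware described in the text.

def recognize_three_holes(holes, snap, find_center, measure, correct):
    """Return the parallax-corrected 3-D positions P0, P1, P2 of the gauge holes."""
    points = []
    for hole in holes:                       # index i = 0, 1, 2
        image = snap(hole)                   # Step 100: photograph the hole
        x, y = find_center(image)            # Step 101: hole center on reference plane
        z = measure(hole, (x, y))            # Steps 102-105: distance from the sensor
        xp, yp = correct(x, y, z)            # Steps 106-107: parallax correction
        points.append((xp, yp, z))           # Step 108: store Pi
    return points                            # sent to the robot controller
```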

Effects of the Invention

With the present invention, the three-dimensional position of an object can be recognized with a simple equipment configuration combining a single camera and a distance measuring sensor, and three-dimensional position correction becomes easy. Furthermore, since the image processing by the camera is performed on a plane, calibration of the camera is simple, and since the camera is made to face the detection points (gauge holes) almost directly, the detection points are detected easily, stably, and accurately in the image processing.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a flowchart of the processing in the image processing device in one embodiment of the present invention; FIG. 2 is a block diagram of the main parts of the image processing device in the same embodiment; FIG. 3 is a configuration diagram of the embodiment; FIGS. 4 and 5 are explanatory diagrams of the gauge hole and the measurement points; and FIG. 6 is an explanatory diagram of parallax correction.

1: robot; 2: target object; 3: gauge hole; 10: image processing device; 22: camera; 24: distance measuring sensor; S, S1 to S4: measurement points.

Claims (2)

[Claims]

(1) A method of recognizing the three-dimensional position of an object by means of an image processing device and a distance measuring sensor, in which, for three known points on the object, the image processing device detects the position of each of the three points on a reference plane perpendicular to the optical axis of the camera of the image processing device; the distance measuring sensor measures the distance from the sensor to each of the three points; each detected position on the reference plane is corrected for parallax according to the corresponding measured distance; the three-dimensional position of each of the three points is recognized from its parallax-corrected position and its distance from the sensor; and the three-dimensional position of the object is recognized from the three-dimensional positions of the three points.

(2) The method of recognizing the three-dimensional position of an object by means of an image processing device and a distance measuring sensor according to claim 1, wherein a plurality of points around each of the three points are measured with the distance measuring sensor, and the average of the distances measured for each point is taken as that point's distance from the sensor.
JP22826189A 1989-09-05 1989-09-05 Three-dimensional position recognition by use of image processing device and distance measuring sensor Pending JPH0392712A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP22826189A JPH0392712A (en) 1989-09-05 1989-09-05 Three-dimensional position recognition by use of image processing device and distance measuring sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP22826189A JPH0392712A (en) 1989-09-05 1989-09-05 Three-dimensional position recognition by use of image processing device and distance measuring sensor

Publications (1)

Publication Number Publication Date
JPH0392712A true JPH0392712A (en) 1991-04-17

Family

ID=16873702

Family Applications (1)

Application Number Title Priority Date Filing Date
JP22826189A Pending JPH0392712A (en) 1989-09-05 1989-09-05 Three-dimensional position recognition by use of image processing device and distance measuring sensor

Country Status (1)

Country Link
JP (1) JPH0392712A (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0611344A (en) * 1992-05-01 1994-01-21 Aoki Corp Measuring method of position and attitude of moving body
JPH07286820A (en) * 1994-04-20 1995-10-31 Fanuc Ltd Position measuring method using three-dimensional visual sensor, and positional deviation correcting method
JPH09178418A (en) * 1995-12-26 1997-07-11 Ricoh Co Ltd Three-dimensional position detector and transferring robot using it
JP2000046514A (en) * 1998-07-27 2000-02-18 Aisin Takaoka Ltd Method for detecting position of worked hole of casting
JP2002264059A (en) * 2001-03-13 2002-09-18 Think Laboratory Co Ltd Method of handling roll to be processed, and robot hand
US6542840B2 (en) 2000-01-27 2003-04-01 Matsushita Electric Industrial Co., Ltd. Calibration system, target apparatus and calibration method
KR100835906B1 (en) * 2007-05-03 2008-06-09 고려대학교 산학협력단 Method for estimating global position of robot and device for estimating global position of robot
JP2009172718A (en) * 2008-01-24 2009-08-06 Canon Inc Working device and calibration method of the same
JP2013134057A (en) * 2011-12-22 2013-07-08 Japan Siper Quarts Corp Method for determining three-dimensional distribution of bubble distribution of silica glass crucible, and method for manufacturing silicon monocrystal
JP2013134056A (en) * 2011-12-22 2013-07-08 Japan Siper Quarts Corp Method for determining three-dimensional distribution of infrared absorption spectrum of silica glass crucible, and method for manufacturing silicon monocrystal
JP2013528125A (en) * 2010-06-07 2013-07-08 ペックス・レント・ビー.ブイ. Gripping device
JP2015180881A (en) * 2015-05-13 2015-10-15 株式会社Sumco Method for determining three-dimensional distribution of infrared absorption spectrum of silica glass crucible, and method for producing silicon single crystal
US9402151B2 (en) 2011-08-27 2016-07-26 Korea University Research And Business Foundation Method for recognizing position of mobile robot by using features of arbitrary shapes on ceiling
JP2019098409A (en) * 2017-11-28 2019-06-24 東芝機械株式会社 Robot system and calibration method
US10573021B2 (en) 2016-09-28 2020-02-25 Honda Motor Co., Ltd. Position and attitude estimation method and position and attitude estimation system
JP2020197883A (en) * 2019-05-31 2020-12-10 ホームネットカーズ株式会社 Simplified estimating device, simplified estimating system, simplified estimating system control method, program, and recording medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6263801A (en) * 1985-09-13 1987-03-20 Agency Of Ind Science & Technol Distance measuring method utilized attitude measuring mark pattern
JPS6491005A (en) * 1987-10-01 1989-04-10 Atr Tsushin Syst Kenkyusho Three-dimensional position detecting method
JPH01109202A (en) * 1987-10-22 1989-04-26 Fanuc Ltd Visual sensor with parallax correction


Similar Documents

Publication Publication Date Title
JPH0392712A (en) Three-dimensional position recognition by use of image processing device and distance measuring sensor
US7200260B1 (en) Teaching model generating device
US7532949B2 (en) Measuring system
CN101239469B (en) Calibration device and method for robot mechanism
US6256560B1 (en) Method for correcting position of automated-guided vehicle and apparatus therefor
US7355725B2 (en) Measuring system
US20050273199A1 (en) Robot system
US20080013825A1 (en) Simulation device of robot system
US20060269123A1 (en) Method and system for three-dimensional measurement
US7502504B2 (en) Three-dimensional visual sensor
JP2003326486A (en) Work positioning device
KR20020097172A (en) Method for Measuring Three- dimensional Coordinate, Apparatus Thereof and Method for Building Large Construction Therewith
JPH08210816A (en) Coordinate system connection method for determining relationship between sensor coordinate system and robot tip part in robot-visual sensor system
JPH08272425A (en) Method to teach coordinate system to robot in non-contact
JP4303411B2 (en) Tracking method and tracking system
JPS6332306A (en) Non-contact three-dimensional automatic dimension measuring method
JPH0775982A (en) Automatic teaching device for laser robot
US20230278196A1 (en) Robot system
JPH1011146A (en) Device for correcting stop posture of mobile object
JPH01205994A (en) Visual recognition device for robot
JPH06214622A (en) Work position sensor
JPH02276901A (en) Position shift correcting method for visual sensor
JPH10318715A (en) Image measuring apparatus
JPH0731536B2 (en) Teaching data correction robot
JP3384617B2 (en) Object measuring apparatus and method