JPH08272451A - Calibration method in robot with visual sensor - Google Patents

Calibration method in robot with visual sensor

Info

Publication number
JPH08272451A
Authority
JP
Japan
Prior art keywords
position information
calibration
points
visual sensor
robot
Prior art date
Legal status
Granted
Application number
JP7010895A
Other languages
Japanese (ja)
Other versions
JP3541980B2 (en)
Inventor
Yoko Morita
Tetsuya Takahashi
Yasuhiro Koga
Masao Nakamura
Current Assignee
Yaskawa Electric Corp
Original Assignee
Yaskawa Electric Corp
Priority date
Filing date
Publication date
Application filed by Yaskawa Electric Corp filed Critical Yaskawa Electric Corp
Priority to JP07010895A priority Critical patent/JP3541980B2/en
Publication of JPH08272451A publication Critical patent/JPH08272451A/en
Application granted granted Critical
Publication of JP3541980B2 publication Critical patent/JP3541980B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current


Abstract

PURPOSE: To perform calibration accurately, on the basis of high-accuracy calibration data, in an articulated robot system with a visual sensor. CONSTITUTION: In the calibration method for the articulated robot system with visual sensor 1, position information is obtained for four or more points both in the work space and in the image coordinate system. From the work-space position information, the regression plane of these points in the work space is computed. Using the residual of each point's work-space position information with respect to this plane as the criterion, the position information of the number of points required for calibration is selected from the four or more points, and calibration is performed on the selected data.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a calibration method for an industrial robot having a visual sensor.

[0002]

2. Description of the Related Art
When calibration is performed for an industrial robot that has a visual sensor, position information for at least three points is required in both the image coordinate system and the work space. Conventionally, the visual sensor was used to recognize the position information of three points in the image coordinate system, the robot's control point was moved to those three points to obtain their work-space position information, and the calibration between the image coordinate system and the work space was performed from these data. However, an articulated robot has low rigidity and is prone to deflection, so a discrepancy easily arises between the robot's design dimensions and the dimensions realized when the actual robot moves. With the conventional method, therefore, the point reached in calculation and the point actually reached differ depending on each robot's working position and posture, and errors easily enter the work-space position information required for calibration. To address this problem, Japanese Patent Laid-Open No. 63-61903 describes a three-dimensional coordinate transformation device in which calibration patterns, each consisting of at least four small circles arranged according to a predetermined rule around a large circle having a concentric hollow portion and provided at three predetermined positions on the object whose position is to be detected, are each imaged by an individual camera; the lens center position of each camera is computed automatically, and the coordinate system is transformed by projective transformation.

[0003]

Problems to be Solved by the Invention
In this prior technique, however, the calibration plane is computed from four or more points. The computed result is therefore influenced by data containing comparatively large errors, and accuracy is reduced. The problem the present invention seeks to solve is to obtain higher-accuracy calibration data and, on that basis, to perform calibration with higher accuracy.

[0004]

Means for Solving the Problems
To solve the above problems, the present invention provides a calibration method in an articulated robot system with a visual sensor in which, after position information for four or more points is obtained both in the work space and in the image coordinate system, the regression plane of these points in the work space is computed from the work-space position information; the position information of the number of points required for calibration is then selected from the four or more points, using the residual of each point's work-space position information with respect to this plane as the criterion; and calibration is performed on the basis of the selected data.

[0005]

Operation
By the above means, from calibration data for four or more points, data is obtained for three points that do not lie on a single straight line and that have comparatively small residuals with respect to the calibration plane.
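Paragraph [0005] requires that the three retained points not lie on a single straight line. A minimal sketch of such a collinearity test follows; the function name `are_collinear` and the tolerance value are assumptions for illustration, not taken from the patent:

```python
def are_collinear(p1, p2, p3, tol=1e-9):
    """Return True if three work-space points lie (nearly) on one line,
    judged by the magnitude of the cross product of the two edge vectors."""
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    cx = uy * vz - uz * vy  # cross product components
    cy = uz * vx - ux * vz
    cz = ux * vy - uy * vx
    return (cx * cx + cy * cy + cz * cz) ** 0.5 <= tol

# three points on the line x = y = z are rejected; a proper triangle passes
print(are_collinear((0, 0, 0), (1, 1, 1), (2, 2, 2)))   # → True
print(are_collinear((0, 0, 0), (1, 0, 0), (0, 1, 0)))   # → False
```

In a selection loop, a candidate triple that fails this test would be skipped in favor of the next-smallest-residual combination.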

[0006]

Embodiments
An embodiment of the present invention will now be described with reference to the drawings. FIG. 1 is a flowchart showing the procedure of the method according to the present invention; FIG. 2 is a perspective view of the robot with visual sensor 1 attached, in which visual sensor 1 captures image-recognition pattern 3; FIG. 3 shows image pattern 5 obtained when visual sensor 1 captures image-recognition pattern 3; and FIG. 4 is a block diagram showing the configuration of an apparatus for carrying out the method of the present invention. In FIG. 4, 1 is the visual sensor; 10 is an image processing unit that processes the image signal captured by visual sensor 1; 11 is the robot; 12 is a processing unit that performs arithmetic processing based on the signal from image processing unit 10 and position information from robot 11; and 13 is an output unit that outputs the processing results from the processing unit to the control device (not shown) of robot 11.

[0007] Here, an example using five points is shown. First, the robot's control point 2 is moved to the centroid of each of the five circular patterns 4, and the work-space position information of the moved points,

Pr[i] = (rx[i], ry[i], rz[i])  (i = 1 to 5),

is stored (step 10). Next, image processing is performed on the five patterns in image 5, and the position information of each pattern's centroid in the image coordinate system,

Pv[i] = (vx[i], vy[i], vz[i])  (i = 1 to 5),

is obtained (step 20). From the work-space position information obtained in step 10, a regression plane is computed (step 30). Writing the regression plane as

z = a*x + b*y + c,

the coefficients a, b, c are determined from the work-space position information rx[i], ry[i], rz[i] as

a = (Syy*Sxz − Sxy*Syz) / |S|
b = (Sxx*Syz − Sxy*Sxz) / |S|
c = Az − a*Ax − b*Ay,

where ([Equation 1] in the original; the definitions below are reconstructed so as to be consistent with the coefficient formulas above) Ax, Ay, Az are the means of rx[i], ry[i], rz[i], the centered sums of products are

Sxx = Σ(rx[i] − Ax)²,  Syy = Σ(ry[i] − Ay)²,  Sxy = Σ(rx[i] − Ax)(ry[i] − Ay),
Sxz = Σ(rx[i] − Ax)(rz[i] − Az),  Syz = Σ(ry[i] − Ay)(rz[i] − Az),

and |S| = Sxx*Syy − Sxy². This determines the regression plane (step 30).
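The plane fit of step 30 can be sketched in a few lines of code. This is an illustration only; the function name `fit_regression_plane` and the sample coordinates are invented for the example, and the centered sums follow the least-squares normal equations:

```python
def fit_regression_plane(points):
    """Fit z = a*x + b*y + c to work-space points by least squares (step 30)."""
    n = len(points)
    ax = sum(p[0] for p in points) / n  # mean Ax
    ay = sum(p[1] for p in points) / n  # mean Ay
    az = sum(p[2] for p in points) / n  # mean Az
    # centered sums of products
    sxx = sum((p[0] - ax) ** 2 for p in points)
    syy = sum((p[1] - ay) ** 2 for p in points)
    sxy = sum((p[0] - ax) * (p[1] - ay) for p in points)
    sxz = sum((p[0] - ax) * (p[2] - az) for p in points)
    syz = sum((p[1] - ay) * (p[2] - az) for p in points)
    det = sxx * syy - sxy ** 2  # |S|
    a = (syy * sxz - sxy * syz) / det
    b = (sxx * syz - sxy * sxz) / det
    c = az - a * ax - b * ay
    return a, b, c

# five work-space points Pr[i] lying near the plane z = 0.5*x + 0.25*y + 2
pr = [(0, 0, 2.0), (10, 0, 7.0), (0, 10, 4.5), (10, 10, 9.5), (5, 5, 5.8)]
a, b, c = fit_regression_plane(pr)
```

With the sample data above, the fit recovers a slope close to (0.5, 0.25), since only the fifth point deviates slightly from the plane.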

[0008] The residual e[i] between each point's work-space position information and the regression plane is given by

e[i] = rz[i] − (a*rx[i] + b*ry[i] + c)  (i = 1 to 5)

(step 40). From the work-space position information obtained in step 10, the three points whose residuals obtained in step 40 are comparatively small are selected as the data for calibration (step 50). From the work-space position information of the three points obtained in step 50 and the corresponding image-coordinate-system position information of those three points obtained in step 20, the pixel size Xsize in the X direction and Ysize in the Y direction are determined (step 60). From the pixel sizes and the position information in the work space and the image coordinate system, a matrix that transforms image-coordinate-system position information into work-space position information is obtained (step 70). By thus selecting, from the position information of more than three points, the three points with small residuals from the regression plane and using them for calibration, the accuracy of the data can be improved.
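Steps 40 through 70 can be illustrated as follows. The residual-based selection mirrors paragraph [0008]; for the image-to-work-space mapping, the patent's exact Xsize/Ysize formulation is not reproduced here, so the sketch instead solves a general 2D affine transform from the three selected correspondences. The function names and sample data are assumptions:

```python
def select_points(pr, a, b, c, n_select=3):
    """Steps 40-50: rank points by |e[i]| = |rz - (a*rx + b*ry + c)|
    and keep the indices of the n_select smallest residuals."""
    e = [p[2] - (a * p[0] + b * p[1] + c) for p in pr]
    return sorted(range(len(pr)), key=lambda i: abs(e[i]))[:n_select]

def affine_from_3_points(img_pts, work_pts):
    """Stand-in for steps 60-70: solve the 2x3 matrix M with
    [x, y] = M @ [u, v, 1] from three non-collinear correspondences,
    using Cramer's rule on the 3x3 system per output coordinate."""
    (u1, v1), (u2, v2), (u3, v3) = img_pts
    det = u1 * (v2 - v3) - v1 * (u2 - u3) + (u2 * v3 - u3 * v2)
    rows = []
    for k in (0, 1):  # k = 0: x row, k = 1: y row
        w1, w2, w3 = (p[k] for p in work_pts)
        m1 = (w1 * (v2 - v3) - v1 * (w2 - w3) + (w2 * v3 - w3 * v2)) / det
        m2 = (u1 * (w2 - w3) - w1 * (u2 - u3) + (u2 * w3 - u3 * w2)) / det
        m3 = (u1 * (v2 * w3 - v3 * w2) - v1 * (u2 * w3 - u3 * w2)
              + w1 * (u2 * v3 - u3 * v2)) / det
        rows.append((m1, m2, m3))
    return rows

# work-space points near the plane z = 0.5x + 0.25y + 2; indices 1 and 4 are noisy
pr = [(0, 0, 2.0), (10, 0, 7.3), (0, 10, 4.5), (10, 10, 9.5), (5, 5, 6.4)]
idx = select_points(pr, 0.5, 0.25, 2.0)        # indices of the 3 best points
img = [(0, 0), (1, 0), (0, 1)]                 # hypothetical image centroids
work = [(pr[i][0], pr[i][1]) for i in idx]     # their work-space (x, y)
M = affine_from_3_points(img, work)
```

The selection drops the two noisy points, and the resulting M maps each hypothetical image centroid exactly onto its selected work-space position.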

[0009]

Effect of the Invention
As described above, according to the present invention, data having comparatively small residuals with respect to the calibration plane is obtained from calibration data for four or more points, and calibration is performed on the basis of this data, so calibration with higher accuracy becomes possible.

Brief Description of the Drawings

FIG. 1 is a schematic flowchart of an embodiment of the present invention.

FIG. 2 is an explanatory diagram of an embodiment of the present invention.

FIG. 3 is a perspective view showing an embodiment of the present invention.

FIG. 4 is a block diagram showing the configuration of an apparatus for carrying out the method of the present invention.

Explanation of Symbols

1: visual sensor; 2: robot control point; 3: image-recognition pattern; 4: circular pattern; 5: image-recognition pattern captured by the visual sensor; 10: image processing unit; 11: robot; 12: processing unit; 13: output unit

Continuation of front page: (72) Inventor: Masao Nakamura, 2-1 Kurosakishiroishi, Yahatanishi-ku, Kitakyushu, Fukuoka, c/o Yaskawa Electric Corporation

Claims (1)

[Claims]
Claim 1: A calibration method in an articulated robot system with a visual sensor, characterized in that: after position information for four or more points is obtained both in the work space and in the image coordinate system, the regression plane of these points in the work space is computed from the work-space position information; the position information of the number of points required for calibration is then selected from the four or more points, using the residual of each point's work-space position information with respect to this plane as the criterion; and calibration is performed on the basis of the selected data.
JP07010895A 1995-03-28 1995-03-28 Calibration method for robot with visual sensor Expired - Lifetime JP3541980B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP07010895A JP3541980B2 (en) 1995-03-28 1995-03-28 Calibration method for robot with visual sensor

Publications (2)

Publication Number Publication Date
JPH08272451A true JPH08272451A (en) 1996-10-18
JP3541980B2 JP3541980B2 (en) 2004-07-14

Family

ID=13422025

Family Applications (1)

Application Number Title Priority Date Filing Date
JP07010895A Expired - Lifetime JP3541980B2 (en) 1995-03-28 1995-03-28 Calibration method for robot with visual sensor

Country Status (1)

Country Link
JP (1) JP3541980B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010108085A (en) * 2008-10-28 2010-05-13 Makino Milling Mach Co Ltd Error correction method
JP2013215866A (en) * 2012-04-12 2013-10-24 Seiko Epson Corp Robot system, robot system calibration method, calibration device, and digital camera

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5371927B2 (en) * 2010-10-27 2013-12-18 三菱電機株式会社 Coordinate system calibration method and robot system

Also Published As

Publication number Publication date
JP3541980B2 (en) 2004-07-14

Similar Documents

Publication Publication Date Title
CN106767393B (en) Hand-eye calibration device and method for robot
EP1215017B1 (en) Robot teaching apparatus
JP4021413B2 (en) Measuring device
JP4167954B2 (en) Robot and robot moving method
JP4191080B2 (en) Measuring device
JP4681856B2 (en) Camera calibration method and camera calibration apparatus
JP3733364B2 (en) Teaching position correction method
JP3930482B2 (en) 3D visual sensor
JPH0435885A (en) Calibration method for visual sensor
US11403780B2 (en) Camera calibration device and camera calibration method
JP2008021092A (en) Simulation apparatus of robot system
JP2010112859A (en) Robot system, robot control device, and method for controlling robot
JP2017077614A (en) Teaching point correction method, program, recording medium, robot device, photographing point generation method, and photographing point generation device
CN109556510B (en) Position detection device and computer-readable storage medium
JP7047306B2 (en) Information processing equipment, information processing methods, and programs
JP4572497B2 (en) Robot controller
JP6410411B2 (en) Pattern matching apparatus and pattern matching method
JPH06249615A (en) Position detecting method
KR20130075712A (en) A laser-vision sensor and calibration method thereof
CN116276938B (en) Mechanical arm positioning error compensation method and device based on multi-zero visual guidance
JPH08272451A (en) Calibration method in robot with visual sensor
EP4129584A1 (en) Calibration system, information processing system, robot control system, calibration method, information processing method, robot control method, calibration program, information processing program, calibration device, information processing device, and robot control device
JPH0588721A (en) Controller for articulated robot
CN116133801A (en) Robot control system, robot control device, robot control method, and program
JP2001191285A (en) Robot system and its usage

Legal Events

Date Code Title Description
A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20031217

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20040312

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20040325

R150 Certificate of patent (=grant) or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (prs date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090409

Year of fee payment: 5

FPAY Renewal fee payment (prs date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100409

Year of fee payment: 6

FPAY Renewal fee payment (prs date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110409

Year of fee payment: 7

FPAY Renewal fee payment (prs date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120409

Year of fee payment: 8

FPAY Renewal fee payment (prs date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130409

Year of fee payment: 9

FPAY Renewal fee payment (prs date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140409

Year of fee payment: 10

EXPY Cancellation because of completion of term