JPH07107644B2 - Robot control method - Google Patents

Robot control method

Info

Publication number
JPH07107644B2
Authority
JP
Japan
Prior art keywords
robot
detected
coordinate system
visual sensor
distortion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
JP61041544A
Other languages
Japanese (ja)
Other versions
JPS62200403A (en)
Inventor
日出一 佐藤
基誓 木下
裕司 渡辺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Komatsu Ltd
Original Assignee
Komatsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Komatsu Ltd filed Critical Komatsu Ltd
Priority to JP61041544A priority Critical patent/JPH07107644B2/en
Publication of JPS62200403A publication Critical patent/JPS62200403A/en
Publication of JPH07107644B2 publication Critical patent/JPH07107644B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Landscapes

  • Numerical Control (AREA)
  • Manipulator (AREA)

Description

DETAILED DESCRIPTION OF THE INVENTION

Field of Industrial Application
The present invention relates to a method of controlling the movement of the support portion of a robot using a visual sensor.

Prior Art
In a robot controller that detects the X- and Y-direction position of a workpiece with a visual sensor and, based on the detected position, moves the robot in the X and Y directions so as to position the robot's support portion at the workpiece position, the coordinate system of the visual sensor and the coordinate system of the robot do not coincide, so the support portion of the robot cannot be positioned accurately at the workpiece position.

That is, the coordinate system of the visual sensor is distorted by lens aberration, distortion of the image sensor, and the like, so the actual position of the workpiece does not match the detected position. The robot's coordinate system, being obtained by converting the joint angles into orthogonal coordinates, is likewise distorted by play and drive errors at each joint; consequently, when the position of the support portion is entered in the robot's coordinate system to control its movement, the position in that coordinate system and the actual position do not coincide.

Problems to Be Solved by the Invention
For this reason, accurately controlling the workpiece operation with the robot requires that the robot's coordinate system be matched to the visual sensor's coordinate system, but this task is extremely tedious and difficult and requires a great deal of time and labor.

Means and Action for Solving the Problems
This is a method of controlling the movement of a robot in which a support portion 7 is movable to an arbitrary position with respect to a base 2. A visual sensor 10 attached to the base 2 detects a large number of marks arranged in advance at predetermined distances, and the distortion of the visual sensor's coordinate system is detected from the difference between the detected distances between the marks and the actual distances between the marks. The support portion 7 of the robot is then moved to a plurality of predetermined target positions; each position of the support portion 7 is detected by the visual sensor 10, and from the detected positions corrected by that distortion, namely the actual positions, together with the target positions of the support portion 7, the conversion parameters from the visual sensor's coordinate system to the robot's coordinate system and the distortion of the robot's coordinate system are detected. When the support portion 7 of the robot is subsequently moved to a target position based on input data, the movement is corrected using the conversion parameters and the distortion of the robot's coordinate system.
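
Stated compactly (the notation below is introduced here for explanation and is not taken from the patent): let p_s be a position reported by the visual sensor, D(·) the sensor-distortion correction estimated from the mark grid, and (A, b) the conversion parameters from the sensor's coordinate system to the robot's. The actual position in robot coordinates is then taken as

    p_r ≈ A · D(p_s) + b

and the distortion of the robot's coordinate system is the residual e = p_target − p_r observed at the predetermined target positions; an input target is later corrected by this residual before being executed.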

In this way, correcting the distortion of the coordinate system of the visual sensor 10 and the distortion of the coordinate system of the robot 1 brings the two coordinate systems into agreement, and the support portion 7 moves correctly to the target position based on the input data.

Embodiment
In the robot 1, a body 3 is pivotally mounted on a base 2, a column 4 is attached to the body 3 so as to swing up and down, a head 5 is attached to the column 4 so as to swing up and down, an arm 6 is rotatably attached to the head 5, and a support portion 7 for a hand or suction tool is mounted on the arm 6 so as to be able to rotate and swivel.

Each actuator (not shown) is driven to move the support portion 7 in the horizontal X and Y directions and in the vertical direction.

The visual sensor 10 is mounted on a frame 11 above the robot 1, and a pair of light sources 12 are arranged on both sides of the frame 11.

The visual sensor 10 is a combination of an ITV camera with a CCD sensor, or of an ITV camera with an image pickup tube; its detection signal is input to a control unit 13 as a video signal, and each actuator is driven by operation signals from a controller 14.

As shown in FIG. 1, a plate 21 carrying a large number of marks, for example black dots 20 spaced at equal intervals in the X and Y directions, is placed below the visual sensor 10; the black dots 20 are detected by the visual sensor 10 and output from the control unit 13 to a calculation unit 22 as a video signal.

The calculation unit 22 computes the position of each black dot 20 as an X-Y location and sends it to a comparison calculation unit 23, where it is compared with reference data, entered in advance from a setting device 24, giving the correct position of each black dot 20. The distortion of the visual sensor's coordinate system is detected from the differences in the distances between adjacent black dots, and this distortion is fed back to the control unit 13 so that subsequently detected video signals are corrected.
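
As an illustration only, a minimal sketch of this grid-based distortion estimation follows, in Python. The patent states only that the distortion is obtained from the differences between detected and actual mark spacings; the quadratic polynomial model and the names fit_sensor_distortion and correct are assumptions of this sketch, not part of the disclosure, and the correction is applied here to detected positions rather than to the video signal itself.

    import numpy as np

    def fit_sensor_distortion(detected, actual):
        # detected, actual: (N, 2) arrays of black-dot centres, matched by index
        # (sensor coordinates vs. true plate coordinates of the dots 20).
        x, y = detected[:, 0], detected[:, 1]
        # Quadratic terms are enough to model a smooth lens/image-sensor warp
        # (this model choice is an assumption, not specified by the patent).
        G = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
        coeff, *_ = np.linalg.lstsq(G, actual, rcond=None)   # shape (6, 2)

        def correct(points):
            # Map detected sensor positions to distortion-corrected positions.
            p = np.atleast_2d(points)
            px, py = p[:, 0], p[:, 1]
            Gp = np.column_stack([np.ones_like(px), px, py,
                                  px * py, px ** 2, py ** 2])
            return Gp @ coeff

        return correct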

Next, as shown in FIG. 2, input data are entered into the controller 14 so that the support portion 7 of the robot 1 moves to first through fifth positions I, II, III, IV and V, located at the corners and the center of a rectangle, and the support portion 7 is moved to the first through fifth positions I to V. That is, the controller 14 receives input data for moving the support portion 7 to the first through fifth positions, and the parts of the robot move accordingly to bring the support portion 7 to the first through fifth positions I to V. A plate 26 carrying a black dot 25 is then placed on a base plate 27 at a position facing each position of the support portion 7. At this time, because of the distortion of the robot coordinate system described above, the position reached by the movement based on the input data, that is, the commanded coordinate position, differs from the actual position.

After this, the support portion 7 of the robot 1 is withdrawn from under the visual sensor 10, and the visual sensor 10 detects the position of each black dot 25, which the calculation unit 22 determines as an X-Y location. The detected positions are corrected by a correction circuit 28 using the result computed by the comparison calculation unit 23, that is, the distortion of the visual sensor's coordinate system, and are sent to a calculation circuit 29 as the actual positions of the support portion 7. The calculation circuit 29 compares these actual positions of the support portion 7 with the positions of the support portion 7 corresponding to the input data, calculates the conversion parameters from the visual sensor's coordinate system to the robot's coordinate system and the distortion of the robot's coordinate system, and stores them in a storage unit 30.
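
Continuing the sketch above, the comparison carried out in the calculation circuit 29 can be illustrated as an affine least-squares fit from the distortion-corrected sensor positions of the dots 25 to the commanded positions I to V, with the residuals at those five points standing in for the robot-coordinate distortion. The affine model and the name calibrate_robot are assumptions; the patent does not name a particular fitting method.

    def calibrate_robot(sensor_pts, robot_targets, correct):
        # sensor_pts:    (5, 2) dot positions 25 as detected by visual sensor 10
        # robot_targets: (5, 2) commanded target positions I-V (the input data)
        # correct:       sensor-distortion correction from fit_sensor_distortion()
        actual = correct(sensor_pts)                        # positions actually reached
        A = np.column_stack([actual, np.ones(len(actual))])
        # Affine sensor-to-robot conversion: robot ~ [actual, 1] @ M, with M of shape (3, 2)
        M, *_ = np.linalg.lstsq(A, robot_targets, rcond=None)
        residuals = robot_targets - A @ M                   # commanded minus reached
        return M, residuals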

When the robot 1 is subsequently operated under control of input data, its motion is corrected using the values stored in the storage unit 30.
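
A corresponding sketch of that run-time correction, using the values produced above (the nearest-neighbour lookup of the residual is an assumption; in practice the five residuals would be interpolated over the working area):

    def sensor_to_robot(p_sensor, correct, M):
        # Convert a sensor detection into robot coordinates with the stored parameters.
        p = correct(p_sensor)
        return np.column_stack([p, np.ones(len(p))]) @ M

    def corrected_command(target, ref_targets, residuals):
        # residuals[i] = commanded - reached at reference position i, so (to first
        # order) a target is reached by commanding target + residual.
        i = int(np.argmin(np.linalg.norm(ref_targets - target, axis=1)))
        return target + residuals[i]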

Alternatively, the correction circuit 28 may be omitted: the video signal is corrected using the previously computed distortion of the visual sensor's coordinate system, and the corrected video signal is sent to the calculation unit 22 to detect the actual position of the support portion 7.

Effect of the Invention
By correcting the distortion of the coordinate system of the visual sensor 10 and the distortion of the coordinate system of the robot 1, the two coordinate systems are brought into agreement, and the support portion can be moved correctly to the target position based on the input data.

Brief Description of the Drawings

The drawings show an embodiment of the present invention: FIGS. 1 and 2 are perspective views illustrating the operation, and FIG. 3 is an overall perspective view. 1 denotes the robot, 7 the support portion, and 10 the visual sensor.

Claims (1)

[Claims]
[Claim 1] A method of controlling the movement of a robot in which a support portion 7 is movable to an arbitrary position with respect to a base 2, wherein: a visual sensor 10 attached to the base 2 detects a large number of marks arranged in advance at predetermined distances, and the distortion of the coordinate system of the visual sensor is detected from the difference between the detected distances between the marks and the actual distances between the marks; the support portion 7 of the robot is moved to a plurality of predetermined target positions, each position of the support portion 7 being detected by the visual sensor 10, and the conversion parameters from the coordinate system of the visual sensor to the coordinate system of the robot and the distortion of the coordinate system of the robot are detected from the detected positions corrected by said distortion, that is, the actual positions, and the target positions of the support portion 7; and when the support portion 7 of the robot is moved to a target position based on input data, the movement is corrected based on said conversion parameters and the distortion of the coordinate system of the robot.
JP61041544A 1986-02-28 1986-02-28 Robot control method Expired - Lifetime JPH07107644B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP61041544A JPH07107644B2 (en) 1986-02-28 1986-02-28 Robot control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP61041544A JPH07107644B2 (en) 1986-02-28 1986-02-28 Robot control method

Publications (2)

Publication Number Publication Date
JPS62200403A JPS62200403A (en) 1987-09-04
JPH07107644B2 true JPH07107644B2 (en) 1995-11-15

Family

ID=12611361

Family Applications (1)

Application Number Title Priority Date Filing Date
JP61041544A Expired - Lifetime JPH07107644B2 (en) 1986-02-28 1986-02-28 Robot control method

Country Status (1)

Country Link
JP (1) JPH07107644B2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02154852A (en) * 1988-12-05 1990-06-14 Alps Giken:Kk Friction driving device
JPH03251378A (en) * 1990-02-28 1991-11-08 Fanuc Ltd Calibration system for robot
JP2009106992A (en) * 2007-10-31 2009-05-21 Sankyo Mfg Co Ltd Sheet material feeding apparatus
KR102385611B1 (en) * 2019-12-06 2022-04-12 엘지전자 주식회사 Moving robot system and method for generating boundary information of the same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6295403A (en) * 1985-10-22 1987-05-01 Nec Corp Coordinate system calibrating device
JPH069795B2 (en) * 1985-10-29 1994-02-09 日産自動車株式会社 Robot wrist positioning method

Also Published As

Publication number Publication date
JPS62200403A (en) 1987-09-04

Similar Documents

Publication Publication Date Title
WO2020121396A1 (en) Robot calibration system and robot calibration method
US20110029131A1 (en) Apparatus and method for measuring tool center point position of robot
JP2019076972A (en) Automation apparatus
JP4287788B2 (en) Self-propelled robotic hand
JPH0580842A (en) Control method for moving robot
US20100017032A1 (en) Device for controlling a robot
JPH11156764A (en) Locomotive robot device
JPH07325611A (en) Automatic correcting method for off-line teaching data
JP2679614B2 (en) Mobile robot hand positioning device
JPS62191904A (en) Position correction method for robot loaded on unmanned carrier
JPH07107644B2 (en) Robot control method
JPS63120088A (en) Method of correcting position of unmanned cart loading robot
JPH0511822A (en) Cooperative operation system for robot
JPH11320465A (en) Control method for robot arm
JPH0755439A (en) Three-dimensional shape measuring equipment
JPH03213244A (en) Positioning device for flat plate workpiece work machine
CN113905859B (en) Robot control system and robot control method
JPS6218316B2 (en)
JPH03281182A (en) Coordinate correcting method for moving robot
JP2005271103A (en) Working robot and calibration method thereof
JPS6111815A (en) Compensating system of positional shift of robot
JPH07205014A (en) Grinding robot
JP7242856B2 (en) Robot control system and robot control method
JPH01247285A (en) Method for calibration of work locating device
JPH069795B2 (en) Robot wrist positioning method