JP3198467B2 - Sensor position and attitude path tracking control method - Google Patents

Sensor position and attitude path tracking control method

Info

Publication number
JP3198467B2
JP3198467B2 (application JP18815092A)
Authority
JP
Japan
Prior art keywords
path
sensor
end effector
orientation
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP18815092A
Other languages
Japanese (ja)
Other versions
JPH0635524A (en)
Inventor
潔 野中
透 金子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to JP18815092A priority Critical patent/JP3198467B2/en
Publication of JPH0635524A publication Critical patent/JPH0635524A/en
Application granted granted Critical
Publication of JP3198467B2 publication Critical patent/JP3198467B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a sensor position and orientation path tracking control method used, in the assembly and machining of mechanical parts, to control with a sensor the movement path and orientation of a robot arm and its end effector relative to the workpiece.

[0002]

2. Description of the Related Art

In a conventional robot sensor position and orientation path tracking control system, a sensor is mounted on the robot hand together with the end effector and, while moving with the end effector, collects position and orientation information about the path ahead of the end effector. To collect this information stably, the path must be captured reliably within the sensor's field of view, which in turn requires that the position and orientation of the sensor be controlled accurately so as to follow the target path. This is especially important when the sensor's field of view is narrow.

[0003] One prior-art approach is described with reference to the drawings. FIG. 2 shows examples of conventional methods that determine the position and orientation of the sensor from past sensor data, where (a) uses a one-point approximation and (b) a two-point difference approximation. In the figure, 1 is the target path, 2 is the end effector center axis, 3 is the sensor field-of-view center axis, 4 is the sensor field-of-view range, and D1 to Dn are sensor data on the target path 1 detected by the sensor in the past.

[0004] It is assumed here that the sensor has a mechanism that can rotate about the end effector center axis 2. As shown in FIG. 2(a), the path is assumed to continue along the straight line L1 joining the position indicated by the latest sensor datum Dn and the current position of the end effector center axis 2, and the sensor is rotated about the end effector center axis 2 by this one-point approximation so that the sensor field-of-view center axis 3 lies on the line L1.
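The one-point approximation above amounts to steering the field-of-view axis along the line L1 from the current end effector position to the latest sensor datum Dn. A minimal planar sketch of that rule (the patent gives no formulas; the function name and the 2-D reduction are illustrative assumptions):

```python
import math

def one_point_approximation(effector_xy, dn_xy):
    """Direction of the line L1 joining the current end effector centre
    to the latest sensor datum Dn; the sensor is rotated about the end
    effector axis so its field-of-view axis lies along this angle."""
    dx = dn_xy[0] - effector_xy[0]
    dy = dn_xy[1] - effector_xy[1]
    return math.atan2(dy, dx)

angle = one_point_approximation((0.0, 0.0), (1.0, 1.0))
```

Because only the single latest datum is used, any curvature of the path between the effector and Dn is ignored entirely.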

[0005] As shown in FIG. 2(b), the path is assumed to continue along the straight line L2 joining the positions indicated by the latest sensor data Dn and Dn-1, and the sensor is rotated about the end effector center axis 2 by this two-point difference approximation so that the sensor field-of-view center axis 3 lies on the line L2. In this example the target path 1 does fall within the sensor field-of-view range 4, but with considerable error.
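The two-point difference variant replaces the effector position with the second-most-recent datum Dn-1, giving a first-order difference estimate of the continuation direction. A sketch under the same illustrative 2-D assumptions as before:

```python
import math

def two_point_difference(dn_minus_1_xy, dn_xy):
    """Direction of the line L2 through the two most recent sensor data
    Dn-1 and Dn: a first-order difference estimate of where the path
    continues beyond the latest datum."""
    dx = dn_xy[0] - dn_minus_1_xy[0]
    dy = dn_xy[1] - dn_minus_1_xy[1]
    return math.atan2(dy, dx)

angle = two_point_difference((0.0, 0.0), (1.0, 1.0))
```

Both variants are straight-line extrapolations, which is exactly the limitation the invention addresses for tightly curved paths.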

[0006]

[Problems to be Solved by the Invention] In the control methods described above, the continuation direction of the path is estimated by a straight-line approximation using the past sensor data themselves or their differences. Consequently, when the sensor's field of view is narrow, or when the radius of curvature of the path is small and the estimation error is therefore likely to be large, it becomes difficult to keep the target path within the sensor's field of view. In view of these problems of the prior art, the present invention provides a sensor position and orientation path tracking control method that controls the position and orientation of the sensor from a more refined path estimate, so that the sensor captures the path reliably within its field of view.

[0007] The above object is attained by the present invention adopting the following novel characteristic configuration. Namely, the invention is a position and orientation path tracking control method in which: a sensor, mounted on the robot hand together with the end effector, predictively detects and outputs the position and orientation of the end effector target path on the workpiece ahead of the end effector; from the path position and orientation data output by the sensor and from the position and orientation of the sensor expressed in a reference coordinate system, a path function representing the position and orientation of the end effector target path in those reference coordinates is determined and output; and the position and orientation of the end effector of the robot are controlled on the basis of that path function. In this method, an equation for the tracking behavior of the sensor field-of-view range center axis is calculated from the current position on the target path drawn by the end effector's path function and from the distance between the end effector and the sensor; the position and orientation on the path that the sensor should take are then obtained by setting up and solving the simultaneous equations formed by this equation and the path function, so that the sensor reliably captures the end effector target path within its field-of-view range.

[0008]

[Operation] By the technique described above, the present invention estimates the position and orientation of the path at the moment a new sensing is to be performed, using a path function generated sequentially from the discrete path information supplied by the sensor, controls the position and orientation of the sensor from that estimate, and expresses the tracking behavior of the center axis of the sensor field-of-view range as an equation.

[0009] From the solution obtained when this equation is combined with the latest path function, the position and orientation that the center axis of the sensor field-of-view range should take are determined, and from this result the angle of the sensor about the end effector center axis and the attitude of the sensor are controlled. By using the refined path estimate supplied by the path function, the center axis of the sensor field-of-view range is steered accurately onto the target path, and the end effector path is captured reliably.

[0010]

[Embodiment] An embodiment of the present invention is described in detail with reference to the drawings. FIG. 1 is an explanatory diagram of the sensor position and orientation path tracking control method of this embodiment. In the figure, 5 is the path drawn by the latest path function obtained from the sensor data D1 to Dn, 6 is the path drawn by the path function of the preceding interval D1 to Dn-1, and 7 is a three-dimensional coordinate system of three axes XE, YE, ZE fixed to the end effector. Elements identical to those of the conventional example of FIG. 2 carry the same reference numerals. Here, for simplicity of explanation, the sensor is assumed to have only the freedom to rotate about the end effector center axis 2, its attitude remaining unchanged.

[0011] The concrete means (none of them shown) realizing this embodiment constitute an integrated control system employing a path tracking control device comprising: an end effector attached to the robot hand; a sensor that detects and predicts the position and orientation of the end effector path on the workpiece (not shown) ahead of the end effector; path function determining means that obtains the path function from these detection data and from the position and orientation of the sensor expressed in the reference coordinate system; robot control means that drives the end effector on input of this path function; and the robot driven by the robot control means.

[0012] The execution procedure of this embodiment is described with reference to the drawings. With the path tracking control device in operation, as shown in FIG. 1, the path function determining means (not shown) sequentially generates the path function while the sensor collects path information on the workpiece, and the position and orientation of the end effector are controlled on the basis of that path function so as to follow the target path 1.

[0013] At this point, the path function Pi-1(S) representing the preceding path 6 of the (i-1)-th interval has already been generated; then, when the latest sensor datum Dn is obtained, the path function Pi(S) representing the latest path 5 is generated. On the basis of this path function Pi(S) and under command of the robot control means, the end effector advances further along the target path 1 while changing its position and orientation, and the next sensing is performed.
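The patent does not specify the functional form of Pi(S). As one plausible realisation, the sketch below fits quadratic coordinate functions to the sensor data by least squares, taking the sample index as the parameter S; the quadratic degree, the parameterisation, and all names here are assumptions for illustration only:

```python
def _solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    m = [row[:] + [bi] for row, bi in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def fit_path_function(sensor_data):
    """Fit quadratic coordinate functions x(S), y(S) to the sensor data
    D1..Dn (n >= 3) by least squares, with S taken as the sample index."""
    n = len(sensor_data)
    s = list(range(n))
    p = [sum(si ** k for si in s) for k in range(5)]       # power sums
    a = [[p[i + j] for j in range(3)] for i in range(3)]    # normal matrix
    coef = []
    for axis in range(2):
        b = [sum((si ** i) * d[axis] for si, d in zip(s, sensor_data))
             for i in range(3)]
        coef.append(_solve3(a, b))
    cx, cy = coef
    return lambda si: (cx[0] + cx[1] * si + cx[2] * si * si,
                       cy[0] + cy[1] * si + cy[2] * si * si)

pi = fit_path_function([(0, 0), (1, 1), (2, 2), (3, 3)])
```

Evaluating `pi` beyond the last sample index extrapolates the estimated path ahead of the end effector, which is what the next sensing step requires.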

[0014] Since the center axis 3 of the sensor field-of-view range 4 can take any position on a circle centered on the end effector center axis 2, its tracking behavior is expressed as an equation, whose locus is taken as the tracking behavior direction line L3 of the center axis 3 of the sensor field-of-view range 4. As the estimate of the path 5, the path function Pi(S) is obtained from the path information. The tracking behavior equation for the center axis 3 of the sensor field-of-view range 4 and the path function Pi(S) are then solved simultaneously for their intersections, and of the two solutions, the one on the forward side of the end effector is selected, giving the predicted position to which the sensor should be controlled.
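The simultaneous-equation step of paragraph [0014] can be realised numerically: sample the path function, find where it crosses the circle of radius d (the effector-sensor distance) about the end effector center axis, and keep the forward crossing. A sketch under the same planar assumptions as above (the grid search with linear refinement stands in for a closed-form solution, which the patent does not spell out):

```python
import math

def forward_intersection(path_fn, effector_xy, distance, s_max, steps=1000):
    """Intersect the circle of radius `distance` about the end effector
    centre axis (the locus available to the sensor field-of-view axis)
    with the path Pi(S).  Scanning forward in S from the effector means
    the first crossing found is the one ahead of the end effector."""
    prev_s, prev_g = 0.0, None
    for k in range(steps + 1):
        s = s_max * k / steps
        x, y = path_fn(s)
        g = math.hypot(x - effector_xy[0], y - effector_xy[1]) - distance
        if prev_g is not None and prev_g < 0.0 <= g:
            # sign change: refine by linear interpolation between samples
            t = prev_g / (prev_g - g)
            s_hit = prev_s + t * (s - prev_s)
            return path_fn(s_hit)
        prev_s, prev_g = s, g
    return None  # path never reaches the sensor circle within [0, s_max]

pt = forward_intersection(lambda s: (s, 0.0), (0.0, 0.0), 2.0, 5.0)
```

For the straight test path above the forward crossing sits at distance 2 ahead of the effector, as expected.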

[0015] From this position datum, the angle of the sensor about the end effector center axis 2 is then calculated, and from this result the robot control means issues commands to the end effector to perform position and orientation path tracking control. This computation can be carried out simply if the tracking behavior direction line L3 of the center axis 3 of the sensor field-of-view range 4 and the path function Pi(S) are expressed, as the sensor outputs its path information, in the coordinate system 7 fixed to the end effector.
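Paragraph [0015] notes that the angle command is simple once everything is expressed in the effector-fixed frame 7. A planar sketch of that final step (the heading parameter and the 2-D reduction are illustrative assumptions):

```python
import math

def sensor_angle_about_effector_axis(target_xy, effector_xy, effector_heading):
    """Rotate the world-frame offset to the chosen intersection point into
    the effector-fixed frame, then read the commanded sensor rotation
    about the end effector centre axis off as the polar angle."""
    dx = target_xy[0] - effector_xy[0]
    dy = target_xy[1] - effector_xy[1]
    c, s = math.cos(-effector_heading), math.sin(-effector_heading)
    xe = c * dx - s * dy          # offset component along XE
    ye = s * dx + c * dy          # offset component along YE
    return math.atan2(ye, xe)

theta = sensor_angle_about_effector_axis((1.0, 1.0), (0.0, 0.0),
                                         0.7853981633974483)
```

When the target already lies straight ahead of the effector, as in the example call, the commanded rotation is zero.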

[0016] Furthermore, if the sensor has a degree of freedom for changing its attitude, the attitude of the sensor can be controlled, taking into account the normal direction of the work surface given by the path function, so that the center axis 3 of the sensor field-of-view range 4 coincides with that normal direction. The center axis 3 of the sensor field-of-view range 4 and the field-of-view range 4 itself then overlap the target path 1 accurately; compared with the illustrations of the conventional methods in FIGS. 2(a) and 2(b), the superiority of the position and orientation path tracking control of this embodiment is evident at a glance.
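For a sensor with attitude freedom, paragraph [0016] aligns the field-of-view axis with the work-surface normal supplied by the path function. In the plane this reduces to rotating the path tangent by 90 degrees; the central-difference tangent below is an illustrative stand-in for whatever derivative the actual path function provides:

```python
import math

def surface_normal_heading(path_fn, s, ds=1e-4):
    """Heading of the in-plane normal to the path at parameter S, from a
    central-difference estimate of the path tangent."""
    x0, y0 = path_fn(s - ds)
    x1, y1 = path_fn(s + ds)
    tangent = math.atan2(y1 - y0, x1 - x0)
    return tangent + math.pi / 2.0   # normal = tangent rotated 90 degrees

n_heading = surface_normal_heading(lambda s: (s, 0.0), 1.0)
```

Commanding the sensor attitude to this heading keeps the field-of-view axis perpendicular to the path, which maximises the overlap between the field-of-view range and the target path.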

[0017]

[Effects of the Invention] Thus, according to the present invention, by estimating the position and orientation of the path at the moment a new sensing is performed, which the prior art could not do, the position and orientation of the sensor can be controlled from a more refined path estimate, enabling the sensor to capture the path reliably within its field of view and providing excellent utility and usefulness.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is an explanatory diagram of the sensor position and orientation path tracking control method of an embodiment of the present invention.

FIG. 2(a) is an explanatory diagram of a conventional sensor position and orientation path tracking control method using a one-point approximation, and FIG. 2(b) is an explanatory diagram of the same using a two-point difference approximation.

EXPLANATION OF REFERENCE NUMERALS

1: target path
2: end effector center axis
3: sensor field-of-view range center axis
4: sensor field-of-view range
D1 to Dn: sensor data
5: path drawn by the latest path function obtained from sensor data D1 to Dn
6: path drawn by the path function of the preceding interval D1 to Dn-1
7: coordinate system

Continuation of the front page: (56) References cited: JP-A-2-3805; JP-A-2-7108; JP-A-64-42706; JP-A-59-223817. (58) Fields searched (Int. Cl. 7, DB name): G05B 19/19; B25J 9/10; B25J 13/08; G05B 19/4155; G05D 3/12.

Claims (1)

(57) [Claims]

[Claim 1] A sensor position and orientation path tracking control method in which: a sensor, attached to the hand of a robot together with an end effector, predictively detects and outputs the position and orientation of the end effector target path on a workpiece ahead of the end effector; a path function representing the position and orientation of the end effector target path in a reference coordinate system is determined and output from the path position and orientation data output by the sensor and from the position and orientation of the sensor expressed in that reference coordinate system; and the position and orientation of the end effector of the robot are controlled on the basis of the path function; characterized in that an equation for the tracking behavior of the sensor field-of-view range center axis is calculated from the current position on the target path drawn by the path function of the end effector and from the distance between the end effector and the sensor, and the position and orientation on the path that the sensor should take are then obtained by setting up and solving the simultaneous equations formed by this equation and the path function, so that the sensor reliably captures the end effector target path within its field-of-view range.
JP18815092A 1992-07-15 1992-07-15 Sensor position and attitude path tracking control method Expired - Fee Related JP3198467B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP18815092A JP3198467B2 (en) 1992-07-15 1992-07-15 Sensor position and attitude path tracking control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP18815092A JP3198467B2 (en) 1992-07-15 1992-07-15 Sensor position and attitude path tracking control method

Publications (2)

Publication Number Publication Date
JPH0635524A JPH0635524A (en) 1994-02-10
JP3198467B2 true JP3198467B2 (en) 2001-08-13

Family

ID=16218629

Family Applications (1)

Application Number Title Priority Date Filing Date
JP18815092A Expired - Fee Related JP3198467B2 (en) 1992-07-15 1992-07-15 Sensor position and attitude path tracking control method

Country Status (1)

Country Link
JP (1) JP3198467B2 (en)

Also Published As

Publication number Publication date
JPH0635524A (en) 1994-02-10

Similar Documents

Publication Publication Date Title
EP1215017B1 (en) Robot teaching apparatus
US4688184A (en) System for measuring three-dimensional coordinates
JP2005106825A (en) Method and apparatus for determining position and orientation of image receiving device
SE449313B (en) MANIPULATOR WELDING AND MANUFACTURING MANUAL
EP1795315A1 (en) Hand-held control device for an industrial robot
JP2787891B2 (en) Automatic teaching device for laser robot
US6192298B1 (en) Method of correcting shift of working position in robot manipulation system
JP2786874B2 (en) Movable position control device
JP3198467B2 (en) Sensor position and attitude path tracking control method
JP2556830B2 (en) Automatic teaching method in robot
JPH0679671A (en) Method and device for guiding gripping means of robot
JPH05303425A (en) Direct teaching type robot
JPH0727408B2 (en) Robot handling device with fixed 3D vision
JP7138041B2 (en) moving body
JP2001051713A (en) Method and device for teaching work to robot
JP3404681B2 (en) Sensor pointing direction sequential control method
JPS62199383A (en) Control system of robot
JP3078884B2 (en) Copying control device
JP3206775B2 (en) Copying control method and copying control device for machining / assembly device
JP2000094370A (en) Inclination measuring method of work surface of robot and measuring device thereof
JP2793695B2 (en) Object recognition control method
JPS63105893A (en) Automatic teaching method of robot
JPH10264066A (en) Robot controller
JP3218553B2 (en) Robot system control method and device
JP3396072B2 (en) Three-dimensional position measuring method and device

Legal Events

Date Code Title Description
LAPS Cancellation because of no payment of annual fees