JPH0635524A - Sensor position/attitude and route following control method - Google Patents

Sensor position/attitude and route following control method

Info

Publication number
JPH0635524A
Authority
JP
Japan
Prior art keywords
sensor
route
end effector
path
orientation
Prior art date
Legal status
Granted
Application number
JP18815092A
Other languages
Japanese (ja)
Other versions
JP3198467B2 (en)
Inventor
Kiyoshi Nonaka (野中 潔)
Toru Kaneko (金子 透)
Current Assignee
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to JP18815092A
Publication of JPH0635524A
Application granted
Publication of JP3198467B2
Anticipated expiration
Status: Expired - Fee Related

Landscapes

  • Numerical Control (AREA)
  • Manipulator (AREA)
  • Control Of Position Or Direction (AREA)

Abstract

PURPOSE: To reliably capture the route within the sensor's field of view by estimating the position and attitude of the route at a new sensing time point, on the basis of path functions generated sequentially from the discrete information given by the sensor. CONSTITUTION: A path tracking controller is operated to collect route information on a workpiece through a sensor and to sequentially generate path functions with respect to the sensor's reference coordinates. On the basis of these path functions, the position and attitude of an end effector are controlled so that the effector follows a target path 1. A path function Pi-1(S) representing the path 6 of the preceding (i-1)-th section has already been generated; when the latest sensor data Dn is obtained, a path function Pi(S) representing the latest path 5 is generated. On the basis of this function Pi(S) and a command from the robot control means, the end effector advances along the path 1, changing its position and attitude, and performs the next sensing operation.

Description

[Detailed Description of the Invention]

[0001]

[Field of Industrial Application] The present invention relates to a sensor position/attitude path tracking control method used, in the assembly and machining of machine parts, to control the movement path and attitude of a robot arm or end effector with respect to the workpiece by means of a sensor.

[0002]

[Prior Art] In a conventional robot sensor position/attitude path tracking control system, a sensor is mounted on the robot hand together with the end effector and, while moving with the end effector, collects position and attitude information on the path ahead of the end effector. To collect this information stably, the path must be captured reliably within the sensor's field of view, which in turn requires that the sensor's position and attitude be controlled accurately so as to follow the target path. This is particularly important when the sensor's field of view is narrow.

[0003] One such conventional technique is described with reference to the drawings. FIG. 2 shows examples of a conventional method of determining the sensor's position and attitude from past sensor data: (a) uses a one-point approximation and (b) uses a two-point difference approximation. In the figures, 1 is the target path, 2 is the end effector center axis, 3 is the sensor field-of-view center axis, 4 is the sensor field-of-view range, and D1 to Dn are sensor data on the target path 1 detected by the sensor in the past.

[0004] It is assumed here that the sensor has a mechanism that allows it to rotate about the end effector center axis 2. As shown in FIG. 2(a), the path is estimated to continue along the straight line L1 connecting the position indicated by the latest sensor data Dn and the current position of the end effector center axis 2, and the sensor is controlled about the end effector center axis 2 by the one-point approximation so that the sensor field-of-view center axis 3 lies on this line L1.

[0005] As shown in FIG. 2(b), the path is estimated to continue along the straight line L2 connecting the positions indicated by the latest sensor data Dn and Dn-1, and the sensor is controlled about the end effector center axis 2 by the two-point difference approximation so that the sensor field-of-view center axis 3 lies on this difference line L2. In this example the target path 1 does fall within the sensor field-of-view range 4, but with considerable error.
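As a concrete reference, the following sketch contrasts the two prior-art direction estimates in 2-D. The function names and the NumPy formulation are illustrative, not from the patent.

    import numpy as np

    def one_point_direction(effector_pos, d_latest):
        # One-point approximation of FIG. 2(a): the path is assumed to
        # continue along the line L1 from the current end effector center
        # axis position to the latest sensor data point Dn.
        v = np.asarray(d_latest, float) - np.asarray(effector_pos, float)
        return v / np.linalg.norm(v)

    def two_point_direction(d_prev, d_latest):
        # Two-point difference approximation of FIG. 2(b): the path is
        # assumed to continue along the line L2 through the last two
        # sensor data points Dn-1 and Dn.
        v = np.asarray(d_latest, float) - np.asarray(d_prev, float)
        return v / np.linalg.norm(v)

Both estimates are straight-line extrapolations, which is exactly why their error grows on paths with a small radius of curvature, as the next paragraph notes.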

[0006]

[Problems to Be Solved by the Invention] In the control methods described above, the continuing direction of the path is estimated by a straight-line approximation using the past sensor data values themselves or their difference. Consequently, when the sensor's field of view is narrow, or when the radius of curvature of the path is small and the estimation error therefore tends to be large, it becomes difficult to keep the target path within the sensor's field of view. In view of these problems of the prior art, the present invention provides a sensor position/attitude path tracking control method in which the sensor's position and attitude are controlled on the basis of a more sophisticated path estimate, so that the sensor reliably captures the path in its field of view.

[0007] The above problems are solved by the novel characteristic method adopted by the present invention. Specifically, the invention is a sensor position/attitude path tracking control method in which: from the path position and attitude data output by a sensor that is mounted on the robot hand together with the end effector and that predictively detects the position and attitude of the end effector target path on the workpiece ahead of the end effector, and from the position and attitude of the sensor with respect to its reference coordinates, a path function representing the position and attitude of the end effector target path in those reference coordinates is determined and output; in controlling the position and attitude of the robot's end effector on the basis of this path function, an equation describing the tracking behavior of the sensor field-of-view range center axis is calculated from the current position on the target path drawn by the end effector's path function and from the distance between the end effector and the sensor; the position and attitude the sensor should take on the path are then obtained by setting up simultaneous equations from that equation and the path function and calculating their solution, so that the sensor reliably captures the end effector target path within its field-of-view range.

[0008]

[Operation] By the method described above, the present invention estimates the position and attitude of the path at the time of the next sensing using a path function generated sequentially from the discrete path information supplied by the sensor, controls the sensor's position and attitude on the basis of that estimate, and expresses the tracking behavior of the center axis of the sensor field-of-view range as an equation.

[0009] From the solution obtained by combining that equation with the latest path function, the position and attitude that the center axis of the sensor field-of-view range should take are determined, and on the basis of this result the sensor's angle about the end effector center axis and the sensor's attitude are controlled. By using the more sophisticated path estimate obtained from the path function, the center axis of the sensor field-of-view range is controlled accurately onto the target path, and the end effector path is captured reliably.

[0010]

[Embodiment] An embodiment of the present invention is described in detail with reference to the drawings. FIG. 1 is an explanatory diagram of the sensor position/attitude path tracking control method of this embodiment. In the figure, 5 is the path drawn by the latest path function obtained from the sensor data D1 to Dn, 6 is the path drawn by the path function of the preceding section D1 to Dn-1, and 7 is a three-dimensional coordinate system with the three axes XE, YE, ZE fixed to the end effector. Elements identical to those of the conventional example of FIG. 2 carry the same reference numerals. For simplicity of explanation, the sensor is assumed here to have only the degree of freedom of rotation about the end effector center axis 2; its attitude is not changed.

[0011] The concrete means for realizing this embodiment, none of which are shown, consist of an integrated control system employing a path tracking control device that comprises: an end effector attached to the hand of the robot; a sensor that detects and predicts the position and attitude of the end effector path on the workpiece (not shown) ahead of the end effector; path function determination means that obtains the path function from this detection data and from the sensor's position with respect to its reference coordinates; robot control means that drives the end effector on input of this path function; and the robot driven by the robot control means.
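The following skeleton shows one plausible arrangement of these components; all class and method names are hypothetical, since the patent describes the means only functionally.

    class PathTrackingController:
        # Minimal sketch of the integrated control system of [0011]:
        # sensor -> path function determination -> robot control.
        def __init__(self, sensor, robot, fit_fn):
            self.sensor = sensor    # detects path points ahead of the effector
            self.robot = robot      # drives the end effector
            self.fit_fn = fit_fn    # e.g. fit_path_function above
            self.data = []          # accumulated sensor data D1..Dn

        def step(self):
            self.data.append(self.sensor.read())   # latest data point Dn
            path_fn = self.fit_fn(self.data)       # latest path function Pi(S)
            self.robot.follow(path_fn)             # move effector along the path
            return path_fn                         # reused for sensor aiming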

[0012] The execution procedure of this embodiment is described with reference to the drawings. The path tracking control device is operated so that, as shown in FIG. 1, while the sensor collects path information on the workpiece, the path function determination means (not shown) sequentially generates path functions with respect to the sensor's reference coordinates; the position and attitude of the end effector are controlled on the basis of those path functions so as to follow the target path 1.

[0013] At this point, the path function Pi-1(S) representing the preceding path 6 of the (i-1)-th section has already been generated; then, when the latest sensor data Dn is obtained, the path function Pi(S) representing the latest path 5 is generated. On the basis of this path function Pi(S) and under command of the robot control means, the end effector advances along the target path 1, changing its position and attitude, and the next sensing is performed.

[0014] Since the center axis 3 of the sensor field-of-view range 4 can take positions on a circle centered on the end effector center axis 2, its tracking behavior direction is expressed as an equation, and the curve it draws is taken as the tracking behavior direction line L3 of the center axis 3 of the sensor field-of-view range 4. For the estimate of the path 5, the path function Pi(S) is obtained from the path information described above. The tracking behavior direction equation of the center axis 3 of the sensor field-of-view range 4 and the path function Pi(S) are then solved simultaneously for their intersections, and of the two solutions, the one on the forward side of the end effector is chosen; this gives the predicted position to which the sensor should be controlled.

[0015] Accordingly, the sensor's angle about the end effector center axis 2 is calculated from this position data, and with this result the robot control means issues commands to the end effector to carry out position/attitude path tracking control. This computation can be performed easily if the tracking behavior direction line L3 of the center axis 3 of the sensor field-of-view range 4 and the path function Pi(S) are expressed in the coordinate system 7 fixed to the end effector, in which the sensor outputs the path information.
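A minimal sketch of this intersection-and-angle step, assuming a planar case in the effector-fixed frame 7 with the end effector center axis 2 at the origin and the sensor offset by a fixed radius. The root-scanning strategy and the SciPy solver are implementation choices, not from the patent.

    import numpy as np
    from scipy.optimize import brentq

    def sensor_angle_command(path_fn, s_max, radius):
        # Intersect the locus of the sensor field-of-view center axis
        # (a circle of the given radius about the origin) with the
        # latest path function Pi(S); s_max may extend past the fitted
        # data so the forward, extrapolated root can be found.
        g = lambda s: np.linalg.norm(path_fn(s)) - radius
        # Scan from the far end so the first sign change found is the
        # root on the forward side of the end effector.
        samples = np.linspace(s_max, 0.0, 200)
        for a, b in zip(samples[:-1], samples[1:]):
            if g(a) * g(b) < 0.0:
                s_star = brentq(g, b, a)
                x, y = path_fn(s_star)
                return np.arctan2(y, x)  # angle command about axis 2
        raise ValueError("path does not intersect the sensor circle")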

[0016] Furthermore, when the sensor has a degree of freedom for changing its attitude, the sensor's attitude can be controlled, taking into account the normal direction of the work surface given by the path function, so that the center axis 3 of the sensor field-of-view range 4 coincides with that normal direction. The overlap of the center axis 3 of the sensor field-of-view range 4, the field-of-view range 4 itself, and the target path 1 is accurate, and comparison with the explanatory diagrams of the conventional method in FIGS. 2(a) and 2(b) makes it obvious at a glance that the position/attitude path tracking control of this embodiment is superior.
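A sketch of that attitude correction, assuming unit vectors expressed in the effector-fixed frame 7 and an axis-angle command to the wrist; the degenerate-case handling is an implementation choice.

    import numpy as np

    def align_axis_to_normal(view_axis, surface_normal):
        # Return the axis-angle rotation that brings the sensor
        # field-of-view center axis onto the work surface normal.
        v = np.asarray(view_axis, float)
        n = np.asarray(surface_normal, float)
        v = v / np.linalg.norm(v)
        n = n / np.linalg.norm(n)
        axis = np.cross(v, n)
        s = np.linalg.norm(axis)
        c = float(np.clip(np.dot(v, n), -1.0, 1.0))
        if s < 1e-12:
            # Already aligned (c > 0) or anti-parallel: any
            # perpendicular axis works for the half-turn.
            return np.array([1.0, 0.0, 0.0]), (0.0 if c > 0 else np.pi)
        return axis / s, np.arctan2(s, c)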

[0017]

[Effects of the Invention] Thus, according to the present invention, by estimating the position and attitude of the path at the time of the next sensing, which the prior art could not do, the sensor's position and attitude can be controlled on the basis of a more sophisticated path estimate, enabling the sensor to capture the path reliably within its field of view; the invention thereby offers excellent utility and usefulness.

[Brief Description of the Drawings]

[FIG. 1] Explanatory diagram of the sensor position/attitude path tracking control method according to an embodiment of the present invention.

[FIG. 2] (a) Explanatory diagram of a conventional sensor position/attitude path tracking control method using the one-point approximation; (b) the corresponding diagram using the two-point difference approximation.

[Explanation of Symbols]

1 … target path
2 … end effector center axis
3 … sensor field-of-view range center axis
4 … sensor field-of-view range
D1 to Dn … sensor data
5 … path drawn by the latest path function obtained from sensor data D1 to Dn
6 … path drawn by the path function of the preceding section D1 to Dn-1
7 … coordinate system

Claims (1)

Claim 1: A sensor position/attitude path tracking control method characterized in that: from the path position and attitude data output by a sensor that is mounted on the hand of a robot together with an end effector and predictively detects the position and attitude of an end effector target path on a workpiece ahead of the end effector, and from the position and attitude of the sensor with respect to its reference coordinates, a path function representing the position and attitude of the end effector target path in the reference coordinates is determined and output; in controlling the position and attitude of the end effector of the robot on the basis of the path function, an equation of the tracking behavior of the sensor field-of-view range center axis is calculated from the current position on the target path drawn by the path function of the end effector and the distance between the end effector and the sensor; and the position and attitude that the sensor should take on the path are then obtained by setting up simultaneous equations on the basis of that equation and the path function and calculating their solution, so that the sensor reliably captures the end effector target path within its field-of-view range.
JP18815092A 1992-07-15 1992-07-15 Sensor position and attitude path tracking control method Expired - Fee Related JP3198467B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP18815092A JP3198467B2 (en) 1992-07-15 1992-07-15 Sensor position and attitude path tracking control method

Publications (2)

Publication Number Publication Date
JPH0635524A 1994-02-10
JP3198467B2 JP3198467B2 (en) 2001-08-13

Family

ID=16218629

Family Applications (1)

Application Number Title Priority Date Filing Date
JP18815092A Expired - Fee Related JP3198467B2 (en) 1992-07-15 1992-07-15 Sensor position and attitude path tracking control method

Country Status (1)

Country Link
JP (1) JP3198467B2 (en)


Legal Events

Date Code Title Description
LAPS Cancellation because of no payment of annual fees