JPS60136806A - Device for instructing work of robot - Google Patents

Device for instructing work of robot

Info

Publication number
JPS60136806A
Authority
JP
Japan
Prior art keywords
light emitting
robot
emitting elements
work
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP25182283A
Other languages
Japanese (ja)
Other versions
JPH0310125B2 (en)
Inventor
Masaru Ishii
優 石井
Shigeyuki Sakane
坂根 茂幸
Masayoshi Kakikura
柿倉 正義
Yoshio Mikami
三上 芳夫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Institute of Advanced Industrial Science and Technology AIST
Original Assignee
Agency of Industrial Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agency of Industrial Science and Technology filed Critical Agency of Industrial Science and Technology
Priority to JP25182283A priority Critical patent/JPS60136806A/en
Publication of JPS60136806A publication Critical patent/JPS60136806A/en
Publication of JPH0310125B2 publication Critical patent/JPH0310125B2/ja
Granted legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/42Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • G05B19/423Teaching successive positions by walk-through, i.e. the tool head or end effector being grasped and guided directly, with or without servo-assistance, to follow a path
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36451Handheld toollike probe, work instructor, lightweigted, connected to recorder
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36452Touch points with handheld probe, camera detects position and orientation probe
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37572Camera, tv, vision

Abstract

PURPOSE:To enable an operator to teach a robot while holding a work instructor in his hand, by providing the work instructor of a robot work teaching device with plural light emitting elements, detecting the light emitting elements with an image sensor, and calculating the three-dimensional positions of the light emitting elements. CONSTITUTION:The plural light emitting elements 4 fitted to the upper surface of the work instructor 3 are lit successively through a control computer 2 on the basis of instructions from a computer 1; the image of each spot light is detected by the image sensor 5, its X and Y coordinate outputs are A/D converted, and the digital values are read into the control computer 2. From these input values and the geometric positional relationship among the light emitting elements 4, the three-dimensional positions of the plural light emitting elements 4 are obtained, and from them the three-dimensional position and posture of the tip of the pointing section of the work instructor 3 are derived. The operator thus teaches the robot's working path directly with the work instructor, and the three-dimensional information obtained is transferred to the computer 1 and used for the robot's work.

Description

DETAILED DESCRIPTION OF THE INVENTION  The present invention relates to a robot work teaching device capable of obtaining three-dimensional information, such as an arbitrary three-dimensional position and posture indicated by an operator, in the three-dimensional space in which a robot works.

Conventionally, robot work has often been taught by having a person directly hold the robot's hand, or move the robot with a teaching box, record the work procedure, and then play it back. With this method, however, abnormal situations can occur owing to robot malfunctions or human teaching errors, and as the number of robots increases, the time and labor required for teaching become considerable and the work becomes dangerous and burdensome for the operator. The present invention was made to address these problems, and its object is to provide a robot work teaching device that is safe and efficient.

The present invention will now be described.

FIG. 1 shows a conceptual diagram of the system configuration of the present invention, whose outline is described below.

First, in response to commands from the computer 1, the plural light emitting elements 4 mounted on the upper surface of the work instructor 3 are lit, through the control computer 2, in a specified order.

Next, the image of each spot light is detected by the image sensor 5, and its X and Y coordinate outputs are A/D converted and read into the control computer 2. From these values and the geometric positional relationship among the light emitting elements 4, the three-dimensional positions of the plural light emitting elements 4 are obtained, and from them the three-dimensional position and posture of the tip of the pointing section of the work instructor 3 are derived. The operator uses the work instructor 3 provided with the plural light emitting elements 4 to teach the robot's work path directly, and the three-dimensional information obtained is transferred to the computer 1 and used for the robot's work.
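The teaching step described above amounts to a simple acquisition loop: light one element at a time, read its image coordinates, and hand the set of points to the control computer. The sketch below illustrates this loop only; the device and sensor interfaces (flash_led, read_spot_centroid) are assumed names, since the patent does not define a software interface.

```python
# Hypothetical acquisition loop for one teaching sample: flash each light emitting
# element in turn and record the A/D-converted X, Y image coordinates of its spot.
def acquire_image_points(instructor, sensor, num_leds=4):
    points = []
    for i in range(num_leds):
        instructor.flash_led(i)               # lit in the order commanded by computer 1
        xi, yi = sensor.read_spot_centroid()  # X, Y coordinate outputs of image sensor 5
        points.append((xi, yi))
    return points                             # processed by the control computer 2
```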

FIG. 2 shows the coordinate systems used for measuring the three-dimensional information at the tip of the pointing section of the work instructor 3 provided with the plural light emitting elements 4. Let (Xr, Yr, Zr) be the robot coordinate system with origin Or, and (Xs, Ys, Zs) the coordinate system of the image sensor 5. Let Os = (Xos, Yos, Zos) be the origin of the image sensor 5 in the robot coordinate system, and let α, β and γ be the rotation angles about the respective axes of the image sensor coordinate system. The relationship between the two coordinate systems is then

(Xs, Ys, Zs)^T = R (Xr - Xos, Yr - Yos, Zr - Zos)^T   (1)

where R is the 3 x 3 rotation matrix determined by the rotation angles α, β and γ.
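One common explicit form for such a rotation matrix, with rotations applied about the X, Y and Z axes in that order (the particular axis order here is an assumption for illustration), is:

$$
R = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha) =
\begin{pmatrix} \cos\gamma & -\sin\gamma & 0 \\ \sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} \cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{pmatrix}
\begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & -\sin\alpha \\ 0 & \sin\alpha & \cos\alpha \end{pmatrix}
$$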

Further, in the coordinate system of the image sensor 5, let (Xi, Yi) be the X and Y coordinate outputs of the image sensor 5 for the spot light, and let F be the focal length of the lens. The following two equations are then obtained.

Xi = -F·Xs/Zs + Sx   (2)
Yi = -F·Ys/Zs + Sy   (3)

Using equations (1) to (3), calibration is carried out on five or more sets of data consisting of the known three-dimensional positions (Xh, Yh, Zh) of the plural light emitting elements 4 and the corresponding output values (Xi, Yi) of the image sensor 5, and the nine unknown parameters (Xos, Yos, Zos, α, β, γ, F, Sx, Sy) are thereby obtained.
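As an illustration of this calibration step, the sketch below fits the nine parameters by nonlinear least squares. It is a minimal sketch under stated assumptions, not the patent's procedure: the Euler-angle convention, the function names and the use of NumPy/SciPy are assumptions.

```python
# Hypothetical calibration of the nine unknowns (Xos, Yos, Zos, alpha, beta, gamma,
# F, Sx, Sy) from known LED positions in robot coordinates and the corresponding
# image-sensor readings, following equations (1)-(3).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(params, points_robot):
    """Map robot-frame points (N, 3) to image coordinates (N, 2)."""
    Xos, Yos, Zos, alpha, beta, gamma, F, Sx, Sy = params
    R = Rotation.from_euler("xyz", [alpha, beta, gamma]).as_matrix()  # assumed order
    q = (points_robot - np.array([Xos, Yos, Zos])) @ R.T              # eq. (1)
    Xi = -F * q[:, 0] / q[:, 2] + Sx                                  # eq. (2)
    Yi = -F * q[:, 1] / q[:, 2] + Sy                                  # eq. (3)
    return np.column_stack([Xi, Yi])

def calibrate(points_robot, image_points, params0):
    """Least-squares fit of the nine parameters to >= 5 measured correspondences."""
    residual = lambda p: (project(p, points_robot) - image_points).ravel()
    return least_squares(residual, params0).x
```

A reasonable starting guess params0 can be taken from the nominal mounting position of the camera and its nominal focal length.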

Next, when four points lying on a common plane in three-dimensional space have a known geometric positional relationship, and the points on the image sensor 5 corresponding to these four points are obtained, the three-dimensional positions of the four points are uniquely determined by the inverse transformation of the projection. The work instructor here is therefore provided with four light emitting elements 4. Let qi be the vector to the i-th light emitting element 4 and pi the vector to its corresponding point on the image sensor 5 (i = 1, ..., 4), both in the image sensor coordinate system. By the perspective transformation,

qi = si·pi,  si > 0  (i = 1, ..., 4)   (4)

where qi = (Xs, Ys, Zs)^T and pi = (Xi, Yi, -F)^T.

Since the four light emitting elements 4 lie at the vertices of a rectangle in a common plane, the following four equations are obtained:

q1 + q3 = q2 + q4   (5)
|q2 - q1| = |q3 - q4| = D   (6)
|q4 - q1| = |q3 - q2| = d   (7)
(q2 - q1)·(q4 - q1) = 0   (8)

Equation (5) expresses that the midpoints of the two diagonals coincide, and equations (6) to (8) express that the two sides have lengths D and d and are mutually orthogonal. From these relations and equation (4), the scale factors si, and hence qi = (Xs, Ys, Zs), are obtained.
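One possible way to solve equations (4) to (8) numerically is sketched below. The patent does not prescribe a particular solver, so the least-squares formulation, the initial guess and the names here are assumptions for illustration.

```python
# Hypothetical recovery of the scale factors s_i of eq. (4) from the image rays
# p_i = (Xi, Yi, -F) and the known rectangle dimensions D x d, using the
# constraints (5)-(8); the LED positions are then q_i = s_i * p_i.
import numpy as np
from scipy.optimize import least_squares

def recover_leds(p, D, d, s0=None):
    """p: (4, 3) array of rays; returns (4, 3) LED positions in the sensor frame."""
    def residual(s):
        q = s[:, None] * p                                  # eq. (4)
        return np.concatenate([
            q[0] + q[2] - q[1] - q[3],                      # eq. (5): diagonals bisect
            [np.linalg.norm(q[1] - q[0]) - D,               # eq. (6): sides of length D
             np.linalg.norm(q[2] - q[3]) - D,
             np.linalg.norm(q[3] - q[0]) - d,               # eq. (7): sides of length d
             np.linalg.norm(q[2] - q[1]) - d,
             np.dot(q[1] - q[0], q[3] - q[0])],             # eq. (8): orthogonality
        ])
    s0 = np.ones(4) if s0 is None else s0                   # crude initial guess
    s = least_squares(residual, s0).x
    return s[:, None] * p
```

In practice the initial guess would be scaled to a plausible working distance (for example, the previous frame's solution), since a poor start can slow the iteration.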

As shown in FIG. 3, let H be the three-dimensional position of the tip of the pointing section of the work instructor 3 provided with the four light emitting elements 4, let L1, L2, L3 and L4 be the three-dimensional positions of the four light emitting elements 4, and let the posture matrix be P = (n, o, a) (where n, o and a are unit vectors of length 1).

For the body of the work instructor 3 provided with the four light emitting elements 4, four equations, (9) to (12), are obtained, each expressing one of L1 to L4 as the sum of H and fixed offsets along the unit vectors n, o and a that are determined by the known dimensions of the instructor (the side lengths D and d and the offsets of the light emitting elements 4 from the tip of the pointing section).

Here, the posture matrix P is obtained from the following three equations:

a = n × o   (13)
n = (L2 - L1)/D   (14)
o = (L4 - L1)/d   (15)

The three-dimensional position H of the tip of the pointing section of the work instructor 3 provided with the four light emitting elements 4 is then obtained from equations (9) to (12).

From these, an arbitrary three-dimensional position and posture in the robot work space are obtained.
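The last two steps, building the posture matrix from the recovered LED positions and placing the tip at a fixed offset, might look like the sketch below. The LED indexing follows the rectangle labeling used above, and the tip offset parameter is an illustrative assumption (in practice it would be measured on the instructor).

```python
# Hypothetical computation of the posture matrix P = (n, o, a) and the tip
# position H from the LED positions L1..L4, following equations (13)-(15).
import numpy as np

def tip_pose(L, D, d, tip_offset=(0.0, 0.0, 0.0)):
    """L: (4, 3) LED positions; tip_offset: tip location in the (n, o, a) frame."""
    n = (L[1] - L[0]) / D                    # eq. (14)
    o = (L[3] - L[0]) / d                    # eq. (15)
    a = np.cross(n, o)                       # eq. (13)
    P = np.column_stack([n, o, a])           # posture matrix
    H = L[0] + P @ np.asarray(tip_offset)    # tip at a known offset from L1
    return H, P
```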

In the example of the work instructor 3 described above, the four light emitting elements 4 are attached only to the upper surface of the device, but similar elements can also be attached to the front, rear, left and right side surfaces to widen the range of measurable positions, and more than four light emitting elements can be attached to raise reliability. When plural image sensors are used, two image sensors can form a stereo vision system to determine the three-dimensional positions and posture of the light emitting elements of the instructor while the algorithm described above is applied in each image sensor at the same time, so that reliability is raised further.

As described above, the robot work teaching device of the present invention allows an operator, holding the work instructor in his hand, to teach human-like work motions to a robot safely, and also shortens the teaching time. Furthermore, the three-dimensional information obtained with the robot work teaching device of the present invention represents the position and posture of the robot hand; by storing it in a database and combining it with a robot language or with off-line programming, a variety of effective uses become possible.

4. Brief Description of the Drawings  FIG. 1 is a conceptual diagram of the system configuration of the robot work teaching device according to the present invention; FIG. 2 is a diagram for explaining the geometric relationship of the work instructor 3 provided with the plural light emitting elements 4 in the robot coordinate system and the coordinate system of the image sensor 5; and FIG. 3 is a diagram for explaining the geometric relationship between the three-dimensional position and the posture of the tip of the pointing section of the work instructor 3 provided with the plural light emitting elements 4.

In the figures, 1 denotes a computer, 2 a control computer, 3 a work instructor, 4 light emitting elements, and 5 an image sensor.

FIG. 1   FIG. 2   FIG. 3

Claims (1)

[Claims] A robot work teaching device which obtains three-dimensional information, such as a three-dimensional position and posture in the work space of a robot, and uses it to teach the robot its work, the device comprising: a work instructor provided with a plurality of light emitting elements; at least one image sensor, such as a television camera, for detecting the spot light of the light emitting elements; and a control computer which calculates the three-dimensional position and posture indicated by an operator by measuring the three-dimensional position of each of the plurality of light emitting elements.
JP25182283A 1983-12-26 1983-12-26 Device for instructing work of robot Granted JPS60136806A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP25182283A JPS60136806A (en) 1983-12-26 1983-12-26 Device for instructing work of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP25182283A JPS60136806A (en) 1983-12-26 1983-12-26 Device for instructing work of robot

Publications (2)

Publication Number Publication Date
JPS60136806A true JPS60136806A (en) 1985-07-20
JPH0310125B2 JPH0310125B2 (en) 1991-02-13

Family

ID=17228432

Family Applications (1)

Application Number Title Priority Date Filing Date
JP25182283A Granted JPS60136806A (en) 1983-12-26 1983-12-26 Device for instructing work of robot

Country Status (1)

Country Link
JP (1) JPS60136806A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61243507A (en) * 1985-04-22 1986-10-29 Riyouki Eng Kk Teaching device for industrial robot
JPS62254206A (en) * 1986-04-28 1987-11-06 Fuji Electric Co Ltd Deciding device for plane direction
US5495090A (en) * 1992-06-29 1996-02-27 Matsushita Electric Industrial Co., Ltd. Welding robot
FR2737024A1 (en) * 1995-07-20 1997-01-24 Patenotre Laurent Process for training industrial robots and object digitisation by following geometric form and contours - involves utilising sensor sending position and orientation data as sensor unit is guided over chosen parts of object, and using computer to translate data to three-dimensional data
WO1998000766A1 (en) * 1996-07-02 1998-01-08 Kuka Schweissanlagen Gmbh Process and device for teaching a program-controlled robot
WO1999038656A1 (en) * 1998-01-29 1999-08-05 Armstrong Healthcare Ltd. A robot control system
EP1074934A2 (en) * 1999-08-02 2001-02-07 Lucent Technologies Inc. Computer input device having six degrees of freedom for controlling movement of a three-dimensional object
KR100420696B1 (en) * 1994-09-28 2004-07-27 얀 게. 페게르 Control device with mobility control member
EP1738881A2 (en) * 2005-06-30 2007-01-03 Shibuya Kogyo Co., Ltd. Robot control system
JP2014136275A (en) * 2013-01-16 2014-07-28 Yaskawa Electric Corp Robot teaching system, robot teaching program generation method, and teaching tool
IT202100017033A1 (en) * 2021-06-29 2022-12-29 Comau Spa "Processing equipment and associated marking device for generating process trajectories"

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5449758A (en) * 1977-09-26 1979-04-19 Agency Of Ind Science & Technol Teaching method for controlling robots
JPS58203513A (en) * 1982-05-21 1983-11-28 Hitachi Ltd Device for instructing operation trace of robot
JPS5988298A (en) * 1982-11-12 1984-05-22 川崎重工業株式会社 Method of guiding body

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5449758A (en) * 1977-09-26 1979-04-19 Agency Of Ind Science & Technol Teaching method for controlling robots
JPS58203513A (en) * 1982-05-21 1983-11-28 Hitachi Ltd Device for instructing operation trace of robot
JPS5988298A (en) * 1982-11-12 1984-05-22 川崎重工業株式会社 Method of guiding body

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61243507A (en) * 1985-04-22 1986-10-29 Riyouki Eng Kk Teaching device for industrial robot
JPS62254206A (en) * 1986-04-28 1987-11-06 Fuji Electric Co Ltd Deciding device for plane direction
US5495090A (en) * 1992-06-29 1996-02-27 Matsushita Electric Industrial Co., Ltd. Welding robot
KR100420696B1 (en) * 1994-09-28 2004-07-27 얀 게. 페게르 Control device with mobility control member
FR2737024A1 (en) * 1995-07-20 1997-01-24 Patenotre Laurent Process for training industrial robots and object digitisation by following geometric form and contours - involves utilising sensor sending position and orientation data as sensor unit is guided over chosen parts of object, and using computer to translate data to three-dimensional data
DE19626459C2 (en) * 1996-07-02 1999-09-02 Kuka Schweissanlagen Gmbh Method and device for teaching a program-controlled robot
WO1998000766A1 (en) * 1996-07-02 1998-01-08 Kuka Schweissanlagen Gmbh Process and device for teaching a program-controlled robot
WO1999038656A1 (en) * 1998-01-29 1999-08-05 Armstrong Healthcare Ltd. A robot control system
EP1074934A2 (en) * 1999-08-02 2001-02-07 Lucent Technologies Inc. Computer input device having six degrees of freedom for controlling movement of a three-dimensional object
EP1074934A3 (en) * 1999-08-02 2002-07-03 Lucent Technologies Inc. Computer input device having six degrees of freedom for controlling movement of a three-dimensional object
EP1738881A2 (en) * 2005-06-30 2007-01-03 Shibuya Kogyo Co., Ltd. Robot control system
EP1738881A3 (en) * 2005-06-30 2009-04-08 Shibuya Kogyo Co., Ltd. Robot control system
JP2014136275A (en) * 2013-01-16 2014-07-28 Yaskawa Electric Corp Robot teaching system, robot teaching program generation method, and teaching tool
IT202100017033A1 (en) * 2021-06-29 2022-12-29 Comau Spa "Processing equipment and associated marking device for generating process trajectories"
WO2023275701A1 (en) * 2021-06-29 2023-01-05 Comau S.P.A. A processing apparatus and relative marking device to generate process trajectories

Also Published As

Publication number Publication date
JPH0310125B2 (en) 1991-02-13

Similar Documents

Publication Publication Date Title
KR101988937B1 (en) Method and apparatus for calibration of a robot hand and a camera
US9073211B2 (en) Control system and teaching method for seven-axis articulated robot
JP5022868B2 (en) Information processing apparatus and information processing method
Sharma et al. Motion perceptibility and its application to active vision-based servo control
KR20160070006A (en) Collision avoidance method, control device, and program
JPS60136806A (en) Device for instructing work of robot
KR101736752B1 (en) Method, system, and device for ergocentric tele-operation
Park et al. Dual-arm coordinated-motion task specification and performance evaluation
JPS60263681A (en) Instruction method of robot
Das et al. Kinematic control and visual display of redundant teleoperators
Shimizu Analytical inverse kinematics for 5-DOF humanoid manipulator under arbitrarily specified unconstrained orientation of end-effector
Ranjbaran et al. On the kinematic conditioning of robotic manipulators
US20220168902A1 (en) Method And Control Arrangement For Determining A Relation Between A Robot Coordinate System And A Movable Apparatus Coordinate System
Lathuiliere et al. Visual hand posture tracking in a gripper guiding application
WO2023013740A1 (en) Robot control device, robot control system, and robot control method
JPS61274852A (en) Non-contact curved surface copying sensor
Garcia et al. Autonomous 4DOF robotic manipulator prototype for industrial environment and human cooperation
CN114972498A (en) Apparatus and method for determining a pose of an object
JP3021202B2 (en) Robot position and orientation guidance method
JPS6052703A (en) Detection of three-dimensional position and attitude
JPH07129231A (en) Noncontact point teaching device
Hart et al. Natural task decomposition with intrinsic potential fields
Bon et al. Real-time model-based obstacle detection for the NASA ranger telerobot
JPH0430981A (en) Control unit for television camera of remote control type robot
JPH09323280A (en) Control method and system of manupulator

Legal Events

Date Code Title Description
EXPY Cancellation because of completion of term