JPS62165213A - Work environment teaching device - Google Patents

Work environment teaching device

Info

Publication number
JPS62165213A
JPS62165213A
Authority
JP
Japan
Prior art keywords
robot
work
data
worked
shape
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP631486A
Other languages
Japanese (ja)
Inventor
Masami Morita
森田 昌美
Fujio Nakajima
中島 不二雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Institute of Advanced Industrial Science and Technology AIST
Original Assignee
Agency of Industrial Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agency of Industrial Science and Technology filed Critical Agency of Industrial Science and Technology
Priority to JP631486A priority Critical patent/JPS62165213A/en
Publication of JPS62165213A publication Critical patent/JPS62165213A/en
Pending legal-status Critical Current

Links

Landscapes

  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

PURPOSE: To improve the efficiency of work programming by providing a device which displays the arrangement of a robot, work objects, etc. on the basis of connection relations produced by a connection relation generation device, and which generates teaching data for the robot's work environment. CONSTITUTION: The shapes and dimensions of the robot, the work objects, etc. are entered into a shape data generation part 5 with a keyboard or the like, and this part generates shape data 11 while displaying it on a graphic display device. To place the obtained shape data at its work environment position, position and attitude information is entered into a position and attitude data generation part 6 with a light pen and a joystick, and this part converts the input into array-type data to obtain position and attitude data 12. Using the position and attitude data 12, the robot, the work objects, etc. are displayed at their new positions on the graphic display device. A work environment data creation part 10 edits the data produced by each process to generate work environment data 15. An accurate robot work program can thus be created easily.

Description

DETAILED DESCRIPTION OF THE INVENTION

Field of the Invention: The present invention relates to a work environment teaching device for teaching the work environment of an industrial robot.

Prior Art: A conventional way of describing a work environment is as follows. A robot language provides various data types with which the shape, position, and orientation information of the robot and the work objects is described.

Three-dimensional data such as the grip points, approach points, and retreat points of the robot with respect to each work object are written precisely into the work program, as are the exact connection relations between the work objects.

Problems to Be Solved by the Invention: In the conventional method, teaching the work environment requires describing precisely, within the work program, the shape data and the position and orientation data of the robot and the work objects, together with the data for the robot's grip points, approach points, and retreat points with respect to each work object.

However, it is extremely difficult to create such three-dimensional data accurately at a desk: the more complex the shapes of the robot and the work objects and their arrangement, the less accurately a person can form the three-dimensional image in his or her head. The connection relations between work objects are likewise difficult to specify, because their relative positions cannot be grasped accurately on paper. Furthermore, because the part of a work program that describes the work environment is not separated from the part that describes the operations, modifying only the work environment still means modifying the work program itself, so work programs cannot be developed efficiently.

Means for Solving the Problems: The present invention solves the above problems by constructing a novel work environment teaching device comprising: a device that generates the shapes of the robot, the work objects, and the like; an input device for entering the position and orientation of the robot and the work objects within the work space, together with the grip points, approach points, retreat points, and the like of the robot with respect to the work objects; a contact detection device that, on the basis of the entered position and orientation information, detects the contact state between the moved robot and a work object, between robots, or between work objects; a connection relation generation device that generates the connection relations between the robot and the work objects from the information detected by the contact detection device; a device for correcting the entered position and orientation information and the grip, approach, and retreat points; and a device that displays the arrangement of the robot, the work objects, and the like on the basis of the connection relations generated by the connection relation generation device.

Operation: With the configuration described above, the invention generates accurate shapes of the robot and the work objects, takes as input their positions and orientations together with the robot's grip, approach, and retreat points for each work object, moves the robot and the work objects according to this input, detects the contact state between the moved robot and the work objects and between the work objects themselves, generates the connection relations between work objects from the detected information, and displays the arrangement of the robot and the work objects on the basis of those connection relations. The work environment can thus be grasped visually, which makes it easy to teach. The entered position and orientation data of the robot and the work objects, and the grip, approach, and retreat points, can also be corrected and changed easily.

Embodiment: An embodiment of the present invention is described in detail below.

Fig. 1 is a block diagram showing one embodiment of the present invention. The work environment teaching device mainly comprises a computer 4 that performs the various processes, a graphic display 1 that shows the shape data 11 of the work objects 16 and 17 and the robot 18 generated beforehand in the computer 4, and a joystick 2 and a light pen 3 for entering three-dimensional position information.

Each process within the computer is described next. The shape data generation unit 5 accepts the shapes and dimensions of the robot, the work objects, and the like, entered with a keyboard or similar device, and generates the shape data 11. This is done while the shapes are displayed on the graphic display.

To place the generated shapes at their positions in the work environment, the position/orientation data generation unit 6 accepts position and orientation information entered with the light pen and the joystick, converts the input into array-type data, and stores it as the position/orientation data 12. Using this data, the robot, the work objects, and the like are displayed at their new positions on the graphic display.
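The patent does not specify the layout of the array-type position/orientation data. A minimal sketch, assuming the common convention of packing the entered position and the three joystick-rotated axis vectors into a 4x4 homogeneous transform (function names are hypothetical):

```python
def make_pose(position, x_axis, y_axis, z_axis):
    """Pack an entered position and the rotated axis vectors into a
    4x4 homogeneous transform stored as plain nested lists."""
    return [
        [x_axis[0], y_axis[0], z_axis[0], position[0]],
        [x_axis[1], y_axis[1], z_axis[1], position[1]],
        [x_axis[2], y_axis[2], z_axis[2], position[2]],
        [0.0, 0.0, 0.0, 1.0],
    ]

def transform_point(pose, point):
    """Move one shape vertex to its work environment position."""
    x, y, z = point
    return tuple(
        pose[i][0] * x + pose[i][1] * y + pose[i][2] * z + pose[i][3]
        for i in range(3)
    )
```

Moving a displayed shape then amounts to applying `transform_point` to each of its vertices before redrawing.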

To enter a position in the work environment, the window of the graphic display 1 is divided into four parts as shown in Fig. 2: (1) a perspective view of the whole work space, (2) its projection onto the ZX plane, (3) its projection onto the XY plane, and (4) its projection onto the YZ plane. Position data are then entered with the light pen 3.
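The four-way view can be sketched by dropping one coordinate of each vertex per view; a hypothetical helper matching the three plane projections of Fig. 2:

```python
def project(point, plane):
    """Project a 3-D vertex onto the named coordinate plane
    ("ZX", "XY", or "YZ") by dropping the remaining coordinate."""
    x, y, z = point
    return {"ZX": (z, x), "XY": (x, y), "YZ": (y, z)}[plane]
```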

Here, the projections of the work object 17 are shown on the assumption that the work object 16 is to be placed on top of it; the asterisks mark positions entered with the light pen.

To enter orientation information, an absolute coordinate frame a is displayed on the graphic display 1 as shown in Fig. 3 and is rotated with the joystick 2. The vectors of the coordinate axes after the motion are taken as the orientation information.

As shown in Fig. 4, the grip/approach/retreat point data generation unit 9 accepts, with the light pen and joystick as before, the position and orientation of the robot hand 20 when it grips the work object 16 (coordinate frame b), when it approaches the work object 16 (coordinate frame c), and when it retreats from the work object 16 (coordinate frame d), converts the input into array-type data, and stores it as the grip/approach/retreat point data 14.
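The three frames support an approach, grip, retreat motion when the hand picks up an object. A hypothetical sketch of how a later motion step might consume the data 14 (the names and the frame representation are assumptions, not taken from the patent):

```python
def pick_sequence(points):
    """Order the taught frames into the hand motion for picking up
    an object: move to the approach frame, close the hand at the
    grip frame, then leave via the retreat frame."""
    return [points["approach"], points["grip"], points["retreat"]]

# the frames of Fig. 4 for work object 16, keyed by role
hand_points_16 = {"grip": "frame b", "approach": "frame c", "retreat": "frame d"}
```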

The contact state detection unit 7 performs the following processing. Among the work objects already displayed on the graphic display, those whose contact state is to be checked are designated. Using the shape data of the designated objects, one of three states is detected (collision, contact, or no contact) and the state is shown on the graphic display. In the case of collision or no contact, the unit also outputs how far the designated objects penetrate each other or how far apart they are; the operator enters corrections from the keyboard or similar device, and the process is repeated until contact is reached. When the objects are in contact, processing is handed over to the connection relation generation unit 8 to generate the connection relations between the designated objects.
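The patent does not give the detection algorithm itself. A minimal sketch that classifies the three states for axis-aligned boxes and reports the separation gap or penetration depth that the operator corrects (the box representation is an assumption; the real shape data 11 would be richer):

```python
def contact_state(box_a, box_b, eps=1e-6):
    """Classify two axis-aligned boxes, each given as (min, max) per axis,
    as 'collision', 'contact', or 'no contact'. The returned distance is
    positive for a separation gap and negative for a penetration depth."""
    dists = [max(lo_b - hi_a, lo_a - hi_b)
             for (lo_a, hi_a), (lo_b, hi_b) in zip(box_a, box_b)]
    d = max(dists)  # boxes overlap only if they overlap on every axis
    if d > eps:
        return "no contact", d
    if d < -eps:
        return "collision", d
    return "contact", 0.0
```

The operator would read the reported distance, adjust the entered position by that amount, and repeat until the state becomes contact.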

The connection relation generation unit 8 is invoked when the contact state detection unit 7 reports a contact state, and generates the connection data 13 between the objects in contact. Fig. 5 shows the work object 16 touching the work object 17 over the hatched area; from this positional relationship it can be seen that the work object 16 rests on the work object 17.

The generated connection data describes relations such as that the work object 16 rests on the work object 17. These connection relations change every time a work object is moved while the work environment is being taught, and whenever a contact state is detected a new connection relation is generated. Once the connection relations have been recorded, moving a work object triggers a check of whether other work objects rest on it; if they do, those objects are moved as well, and the result is shown on the graphic display.
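The resting-on relations and the propagated motion described above can be sketched as a small graph in which moving an object also moves everything recorded as resting on it (class and method names are hypothetical):

```python
class ConnectionGraph:
    """Records relations of the form 'B rests on A' and propagates
    motion through them."""
    def __init__(self):
        self.on_top = {}  # supporter -> set of objects resting on it

    def connect(self, below, above):
        """Record that `above` rests on `below`."""
        self.on_top.setdefault(below, set()).add(above)

    def objects_moved_with(self, obj):
        """All objects that must move when `obj` moves, itself included."""
        moved, stack = set(), [obj]
        while stack:
            o = stack.pop()
            if o not in moved:
                moved.add(o)
                stack.extend(self.on_top.get(o, ()))
        return moved
```

With the relation of Fig. 5 recorded as `connect(17, 16)`, moving the work object 17 would also move the work object 16.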

The work environment data creation unit 10 edits the data generated by the processes described above to create the work environment data 15.

Effects of the Invention: As described above, according to the present invention the work environment data is created separately from the work program, so an accurate robot work program can be created easily, and displaying the environment on a graphic display makes it possible to grasp it visually, so the work environment is easy to teach. With the three-dimensional input devices, three-dimensional position data can also be corrected and changed easily. Furthermore, because the shape data of the robot and the work objects are available, contact states can be detected, and the connection relations between work objects are generated automatically from the detected states, so dynamically changing connection relations can be entered efficiently. When a work object is moved, the work objects related to it are moved with it on the basis of these connection relations, so movement information can also be entered efficiently.

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a block diagram of a work environment teaching device according to one embodiment of the present invention; Figs. 2 and 3 illustrate its position and orientation input means; Fig. 4 illustrates the grip, approach, and retreat points of the robot with respect to a work object; and Fig. 5 illustrates the arrangement relation between work objects.

1: graphic display; 2: joystick; 3: light pen; 4: computer; 5: shape data generation unit; 6: position/orientation data generation unit; 7: contact state detection unit; 8: connection relation generation unit; 9: grip/approach/retreat point data generation unit; 10: work environment data creation unit; 11: shape data; 12: position/orientation data; 13: connection data; 14: grip/approach/retreat point data; 15: work environment data.

Patent applicant: Director-General of the Agency of Industrial Science and Technology.

Claims (1)

[Claims] A work environment teaching device characterized in that it comprises: a device that generates the shapes of a robot, work objects, and the like; an input device for entering the position and orientation of the robot and the work objects within a work space, together with the grip points, approach points, retreat points, and the like of the robot with respect to the work objects; a contact detection device that, on the basis of the position and orientation information entered with the input device, detects the contact state between the moved robot and a work object, between robots, or between work objects; a connection relation generation device that generates the connection relations between the robot and the work objects on the basis of the information detected by the contact detection device; a device for correcting the position and orientation information and the grip, approach, and retreat points entered with the input device; and a device that displays the arrangement of the robot, the work objects, and the like on the basis of the connection relations generated by the connection relation generation device, thereby forming teaching data for the robot's work environment.
JP631486A 1986-01-17 1986-01-17 Work environment teaching device Pending JPS62165213A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP631486A JPS62165213A (en) 1986-01-17 1986-01-17 Work environment teaching device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP631486A JPS62165213A (en) 1986-01-17 1986-01-17 Work environment teaching device

Publications (1)

Publication Number Publication Date
JPS62165213A true JPS62165213A (en) 1987-07-21

Family

ID=11634908

Family Applications (1)

Application Number Title Priority Date Filing Date
JP631486A Pending JPS62165213A (en) 1986-01-17 1986-01-17 Work environment teaching device

Country Status (1)

Country Link
JP (1) JPS62165213A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62274308A (en) * 1986-05-22 1987-11-28 Kobe Steel Ltd Off-line teaching method for robot
JPH03288210A (en) * 1990-04-03 1991-12-18 Kobe Steel Ltd Off-line teaching system for handling robot
JPH03288211A (en) * 1990-04-03 1991-12-18 Kobe Steel Ltd Off-line teaching system for handling robot
JPH03288209A (en) * 1990-04-03 1991-12-18 Kobe Steel Ltd Off-line teaching system for handling robot
WO1998003314A1 (en) * 1996-07-24 1998-01-29 Fanuc Ltd Jog feeding method for robots
DE102005048812B4 (en) * 2005-10-10 2011-02-10 Universität Stuttgart Control of workpiece-processing machines
JP2018144162A (en) * 2017-03-03 2018-09-20 株式会社キーエンス Robot setting device, robot setting method, robot setting program, computer-readable recording medium, and recorded device
JP2018144164A (en) * 2017-03-03 2018-09-20 株式会社キーエンス Robot setting device, robot setting method, robot setting program, computer-readable recording medium, and recorded device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59183408A (en) * 1983-04-01 1984-10-18 Mitsubishi Electric Corp Robot control system
JPS59229619A (en) * 1983-06-10 1984-12-24 Hitachi Ltd Work instructing system of robot and its using
JPS6097409A (en) * 1983-11-02 1985-05-31 Hitachi Ltd Operation teaching method of robot
JPS60195615A (en) * 1984-03-16 1985-10-04 Hitachi Ltd Method for teaching attitude of multi-joint robot
JPS60195613A (en) * 1984-03-16 1985-10-04 Hitachi Ltd Robot teaching device with verifying function
JPS60205720A (en) * 1984-03-30 1985-10-17 Matsushita Electric Ind Co Ltd Robot operation teaching device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59183408A (en) * 1983-04-01 1984-10-18 Mitsubishi Electric Corp Robot control system
JPS59229619A (en) * 1983-06-10 1984-12-24 Hitachi Ltd Work instructing system of robot and its using
JPS6097409A (en) * 1983-11-02 1985-05-31 Hitachi Ltd Operation teaching method of robot
JPS60195615A (en) * 1984-03-16 1985-10-04 Hitachi Ltd Method for teaching attitude of multi-joint robot
JPS60195613A (en) * 1984-03-16 1985-10-04 Hitachi Ltd Robot teaching device with verifying function
JPS60205720A (en) * 1984-03-30 1985-10-17 Matsushita Electric Ind Co Ltd Robot operation teaching device

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62274308A (en) * 1986-05-22 1987-11-28 Kobe Steel Ltd Off-line teaching method for robot
JPH03288210A (en) * 1990-04-03 1991-12-18 Kobe Steel Ltd Off-line teaching system for handling robot
JPH03288211A (en) * 1990-04-03 1991-12-18 Kobe Steel Ltd Off-line teaching system for handling robot
JPH03288209A (en) * 1990-04-03 1991-12-18 Kobe Steel Ltd Off-line teaching system for handling robot
WO1998003314A1 (en) * 1996-07-24 1998-01-29 Fanuc Ltd Jog feeding method for robots
US6088628A (en) * 1996-07-24 2000-07-11 Fanuc, Ltd. Jog feeding method for robots
JP3841439B2 (en) * 1996-07-24 2006-11-01 ファナック株式会社 Robot jog feed method
DE102005048812B4 (en) * 2005-10-10 2011-02-10 Universität Stuttgart Control of workpiece-processing machines
JP2018144162A (en) * 2017-03-03 2018-09-20 株式会社キーエンス Robot setting device, robot setting method, robot setting program, computer-readable recording medium, and recorded device
JP2018144164A (en) * 2017-03-03 2018-09-20 株式会社キーエンス Robot setting device, robot setting method, robot setting program, computer-readable recording medium, and recorded device

Similar Documents

Publication Publication Date Title
Ong et al. Augmented reality-assisted robot programming system for industrial applications
JP6469159B2 (en) Offline programming apparatus and method with work position detection program generation function by contact sensor
JP5108032B2 (en) Multi-joint structure teaching device
Weber et al. Visual, vibrotactile, and force feedback of collisions in virtual environments: effects on performance, mental workload and spatial orientation
US20190022864A1 (en) Robot control device, robot system, and simulation device
JPS62165213A (en) Work environment teaching device
Ponomareva et al. Grasplook: a vr-based telemanipulation system with r-cnn-driven augmentation of virtual environment
JPH06110543A (en) Direct teaching device
Williams A framework for robot-generated mixed-reality deixis
JPS6179589A (en) Operating device for robot
JPS62274404A (en) Off-line teaching system for robot
JPH06134684A (en) Teaching method of robot track
JPS6097409A (en) Operation teaching method of robot
JPH1177568A (en) Teaching assisting method and device
JPH06102919A (en) Method for teaching robot orbit
JPWO2021260898A5 (en)
JPH01131904A (en) Robot operation supporting system
Cheng et al. A study of using 2D vision system for enhanced industrial robot intelligence
JP2868343B2 (en) Off-line teaching method of 3D laser beam machine
WO2023203747A1 (en) Robot teaching method and device
JPH0716900B2 (en) Automatic teaching method for robot
JPH01112309A (en) Robot supporting system
Sachs et al. 3-Draw: a three dimensional computer aided design tool
JP2022047503A (en) Detection method for both hand in teaching by demonstration
Miyagawa et al. Task assistance with human-augmented hand and its performance analysis