JPH04100118A - Position detection for robot with visual sense - Google Patents

Position detection for robot with visual sense

Info

Publication number
JPH04100118A
Authority
JP
Japan
Prior art keywords
workpiece
camera
feature point
work
feature points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP17245690A
Other languages
Japanese (ja)
Inventor
Akihiko Ezaki
江崎 昭彦
Katsuhiro Takamori
高森 克廣
Noribumi Yoshida
則文 吉田
Katsuaki Hara
勝明 原
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yaskawa Electric Corp
Original Assignee
Yaskawa Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yaskawa Electric Corp filed Critical Yaskawa Electric Corp
Priority to JP17245690A priority Critical patent/JPH04100118A/en
Publication of JPH04100118A publication Critical patent/JPH04100118A/en
Pending legal-status Critical Current

Landscapes

  • Numerical Control (AREA)
  • Manipulator (AREA)
  • Control Of Position Or Direction (AREA)

Abstract

PURPOSE: To control a robot to handle many kinds of workpieces with high precision by indicating in advance, with a reference tool, the feature points of a workpiece to be detected, obtaining the size of the workpiece from the absolute coordinates of these feature points, and detecting the feature points of the workpiece with a camera. CONSTITUTION: The feature points to be detected on a reference workpiece W are indicated in advance with a reference tool 4 attached to the tip of a robot arm 1, and the absolute coordinates of these feature points are taught to a controller. Each feature point of the reference workpiece W is brought into the field of view of a camera 3 and its position within the field of view is stored. The relative deviation is then detected from the position of the image of the corresponding feature point of an actual workpiece W' captured in the field of view of the camera 3 during actual operation and the previously stored position of the reference workpiece's feature point within the camera's field of view. The robot is thus controlled to handle many kinds of workpieces with high precision.

Description

[Detailed Description of the Invention] [Industrial Application Field] The present invention relates to a method for detecting the position of a workpiece handled by an assembly robot or the like, and in particular to a position detection method that corrects the positional deviation of a target workpiece supplied without being accurately positioned.

[Prior Art] In general, image processing for detecting the positional deviation of a workpiece with a robot equipped with vision such as a camera is performed by one of two methods: the hand-eye method, in which a camera is mounted on the tip of the robot arm, and the fixed-camera method, in which a camera is installed at a location separate from the robot. In both methods, the position of the target workpiece is detected from an image captured by the camera (see, for example, Japanese Patent Application Laid-Open No. 62-271641).

[Problems to Be Solved by the Invention] However, when a large object must be detected with high accuracy, both methods face a trade-off between the size of the camera's field of view and the measurement resolution. The fixed-camera method in particular must sacrifice one or the other, and therefore has the drawback of being limited to workpieces of a specific size.

With the hand-eye method, the camera can be moved so that multiple feature points of the target workpiece are detected with high accuracy, and the deviation from the set position can be calculated from their geometric relationship; however, this requires the dimensional data of the target workpiece to be registered in advance. In addition, when a feature point in the camera image is aligned with the camera's reference position, the alignment takes time because the image cannot be observed while the robot arm is moving. Moreover, when the workpiece is far larger than the camera's field of view, or when a workpiece of different dimensions is supplied by mistake, detecting the position of a feature point by bringing only its vicinity into the field of view is not enough: unless the system also has a function for checking the absolute distances to the other feature points, the position of the workpiece cannot be detected correctly.

An object of the present invention is to accurately detect, by the hand-eye method, the amount of deviation of a target workpiece from its set position, and at the same time to obtain shape information indicating its absolute dimensions.

[Means for Solving the Problems] The present invention is a position detection method in which a camera is provided at the tip of a robot arm to detect the position of a workpiece. A reference tool is attached to the tip of the robot arm, the positions of a plurality of feature points on a reference workpiece having the same shape as the actual workpiece are indicated in advance with the reference tool, and the coordinate information of the indicated feature points is stored. The robot arm is then moved and the position of each feature point of the reference workpiece within the camera's field of view is stored by the camera. The position, within the camera's field of view, of each feature point of the actual workpiece corresponding to a feature point of the reference workpiece is stored, the deviation between the feature points of the reference workpiece and those of the actual workpiece is obtained from the two stored in-view positions, and the deviation of the actual workpiece is obtained from this deviation and the coordinate information of the feature points of the reference workpiece.

Further, the coordinate information of the feature points of the actual workpiece is obtained from the coordinate information of the feature points of the reference workpiece and the deviation of the actual workpiece, whereby shape information of the reference workpiece and of the actual workpiece is obtained.

[Action] The feature points to be detected on the reference workpiece are indicated in advance with the reference tool attached to the tip of the robot arm, and the absolute coordinates of these feature points are taught to the controller; in addition, each feature point of the reference workpiece is brought into the camera's field of view and its position within the field of view is stored. In actual operation, the relative deviation is detected from the position of the image of the corresponding feature point of the actual workpiece captured in the camera's field of view and the previously stored position of the reference workpiece's feature point within that field of view. From the absolute coordinate values of the reference workpiece's feature points taught at the beginning and this relative deviation, the absolute positional relationship between the feature points of the reference workpiece and those of the actual workpiece can be obtained, and shape information such as the absolute dimensions of the workpiece can be calculated.
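The flow above can be illustrated with a minimal Python sketch that separates the teaching phase (reference tool plus camera on the reference workpiece) from the detection phase on the actual workpiece. The data structures, function names, numeric values, and the assumption that in-view positions are already expressed in robot-coordinate units are illustrative only and do not come from the specification.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

XY = Tuple[float, float]

@dataclass
class TaughtFeature:
    abs_xy: XY      # absolute robot coordinates taught with the reference tool
    in_view_xy: XY  # stored position of the feature point within the camera's field of view

def teach_reference(features_abs: Dict[str, XY],
                    features_in_view: Dict[str, XY]) -> Dict[str, TaughtFeature]:
    """Teaching phase: pair the reference-tool coordinates with the camera
    in-view positions for each feature point of the reference workpiece."""
    return {name: TaughtFeature(features_abs[name], features_in_view[name])
            for name in features_abs}

def relative_deviation(taught: Dict[str, TaughtFeature],
                       actual_in_view: Dict[str, XY]) -> Dict[str, XY]:
    """Detection phase: the relative deviation of each actual feature point is the
    difference between its in-view position and the stored reference in-view
    position (both assumed here to be in robot-coordinate units)."""
    return {name: (actual_in_view[name][0] - f.in_view_xy[0],
                   actual_in_view[name][1] - f.in_view_xy[1])
            for name, f in taught.items()}

# Usage: teach P1/P2 on the reference workpiece, then detect the actual workpiece W'
taught = teach_reference({"P1": (100.0, 200.0), "P2": (400.0, 200.0)},
                         {"P1": (0.0, 0.0), "P2": (0.0, 0.0)})
deviations = relative_deviation(taught, {"P1": (2.0, -1.5), "P2": (1.0, 3.0)})
```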

[Embodiment] An embodiment of the present invention will be described with reference to the drawings.

FIG. 1 is a perspective view showing an embodiment of the present invention, in which a workpiece gripping portion 2 and a camera 3 are provided at the tip of a robot arm 1. A detachable reference tool 4 is also provided so as to indicate the position of the control point, set near the workpiece gripping portion 2, that represents the current position of the robot.

Next, the operation of detecting the position of an actual workpiece with the apparatus configured as described above will be explained, using a reference workpiece having the same shape as the actual workpiece to be detected in actual operation.

First, the tip of the reference tool 4 attached at the robot's control point is moved to the positions of the feature points P1 and P2, for example bolt-tightening holes, which serve as reference positions on the reference workpiece W placed at the teaching position for the robot operation (step 1).

Next, as shown in FIG. 2, the absolute coordinates (X1, Y1) and (X2, Y2) of the feature points P1 and P2 are stored as reference points to be recognized as absolute values in the robot coordinate system (step 2).

Next, the robot arm 1 is moved so that the feature point P1 enters the camera field of view A of the camera 3, and the position of the feature point P1 within the field of view is stored (step 3).

Next, in actual operation, an actual workpiece W′ is conveyed in the direction of the arrow by, for example, the conveyor C shown in FIG. 1, and is placed roughly, without accurate positioning, at a position that nearly coincides with the reference workpiece W at the teaching position, as shown in FIG. 2 (in the figure the deviation is exaggerated for clarity). With the camera at the position of step 3, the position of the feature point P1′ corresponding to the feature point P1 is captured by the camera 3 as image data; the camera 3 is then moved by the relative displacement from the feature point P1 to the feature point P2, and the position of the feature point P2′ corresponding to the feature point P2 is captured as image data (step 4). Then, from the position image data of each feature point, the deviations (Δx1, Δy1) and (Δx2, Δy2) of the feature points P1′ and P2′ from the reference positions (X1, Y1) and (X2, Y2) of the feature points P1 and P2 are obtained by image processing (step 5).
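A short sketch of step 5, assuming the in-view positions are measured in pixels and converted to millimetres with a single calibration constant; the patent does not describe the image-to-robot scaling, so mm_per_px and all numeric values here are illustrative.

```python
def pixel_deviation(ref_px, actual_px, mm_per_px):
    """Deviation (Δx, Δy) of an actual feature point from the stored reference
    position in the camera's field of view, converted from pixels to millimetres.
    mm_per_px is an assumed calibration constant."""
    return ((actual_px[0] - ref_px[0]) * mm_per_px,
            (actual_px[1] - ref_px[1]) * mm_per_px)

# Step 5, illustrative values: deviations of P1' and P2' from the stored positions
d1 = pixel_deviation((320.0, 240.0), (345.0, 221.0), mm_per_px=0.08)  # (Δx1, Δy1)
d2 = pixel_deviation((320.0, 240.0), (333.0, 277.0), mm_per_px=0.08)  # (Δx2, Δy2)
```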

Next, from the deviations (Δx1, Δy1) and (Δx2, Δy2), the absolute coordinates (X1′, Y1′) and (X2′, Y2′) of the feature points P1′ and P2′ are obtained from the following equations (step 6):

X1′ = X1 + Δx1, Y1′ = Y1 + Δy1
X2′ = X2 + Δx2, Y2′ = Y2 + Δy2

At the same time, the spacing L1 between the feature points P1 and P2 and the spacing L2 between the feature points P1′ and P2′ are obtained by the following calculation (step 7):

L1 = √((X2 − X1)² + (Y2 − Y1)²)
L2 = √((X2′ − X1′)² + (Y2′ − Y1′)²)
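Steps 6 and 7 reduce to a coordinate offset and two Euclidean distances; the following sketch uses illustrative values for the taught coordinates and the step-5 deviations.

```python
import math

def absolute_coords(ref_abs, dev):
    """Step 6: X' = X + Δx, Y' = Y + Δy."""
    return ref_abs[0] + dev[0], ref_abs[1] + dev[1]

def distance(p, q):
    """Step 7: Euclidean distance between two feature points."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

# Illustrative values for the taught coordinates and the step-5 deviations
P1, P2 = (100.0, 200.0), (400.0, 200.0)
d1, d2 = (2.0, -1.5), (1.0, 3.0)

P1a = absolute_coords(P1, d1)   # P1' = (X1', Y1')
P2a = absolute_coords(P2, d2)   # P2' = (X2', Y2')
L1 = distance(P1, P2)           # spacing on the reference workpiece
L2 = distance(P1a, P2a)         # spacing on the actual workpiece
```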

The deviation correction amounts for the workpiece W are the coordinate deviations ΔX and ΔY when the workpiece center O shifts to the position O′, together with the tilt angle θ, and are obtained from the following equations (step 8):

ΔX = (Δx1 + Δx2) / 2
ΔY = (Δy1 + Δy2) / 2

The tilt angle θ is obtained using L3, where L3 is the spacing between the feature point P1′ and the virtual feature point P1″ (X1″, Y1″) obtained when the workpiece center O is translated so as to coincide with O′; from X1″ = X1 + ΔX and Y1″ = Y1 + ΔY, it is obtained as L3 = √((X1′ − X1″)² + (Y1′ − Y1″)²).
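A sketch of step 8. ΔX and ΔY follow the averaging formulas above; because the published equation for θ is illegible, the tilt is computed here as the angle between the reference segment P1→P2 and the actual segment P1′→P2′, which is an assumed, geometrically equivalent formulation rather than the patent's own expression.

```python
import math

def correction(P1, P2, P1a, P2a):
    """Step 8: translation (ΔX, ΔY) of the workpiece centre and tilt angle θ.
    ΔX and ΔY follow the patent's formulas; θ is computed as the angle between
    the segments P1-P2 and P1'-P2' (assumed formulation)."""
    dX = ((P1a[0] - P1[0]) + (P2a[0] - P2[0])) / 2.0   # ΔX = (Δx1 + Δx2) / 2
    dY = ((P1a[1] - P1[1]) + (P2a[1] - P2[1])) / 2.0   # ΔY = (Δy1 + Δy2) / 2
    theta = (math.atan2(P2a[1] - P1a[1], P2a[0] - P1a[0])
             - math.atan2(P2[1] - P1[1], P2[0] - P1[0]))
    return dX, dY, theta

dX, dY, theta = correction((100.0, 200.0), (400.0, 200.0),
                           (102.0, 198.5), (401.0, 203.0))
```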

FIG. 3 is a flowchart of the operation for determining the deviation of the workpiece.

A correction amount is obtained from the coordinate information determined above, such as the absolute coordinates of the feature points, the spacing between them, and the deviation of the workpiece center coordinates, and is given to the robot controller to correct the workpiece position.

In this way, the absolute positional relationship between the feature points of the workpiece can be obtained from the absolute coordinate values of the feature points taught at the beginning and the relative deviations, and shape information such as the absolute dimensions of the workpiece can be calculated, making it possible to verify differences in workpiece dimensions and even differences in model.
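One way to use the absolute spacing for the verification described above is to compare L2 against the taught L1 within a tolerance; the tolerance value and function name below are assumptions for illustration.

```python
import math

def same_model(P1, P2, P1a, P2a, tol_mm=1.0):
    """Compare the absolute spacing of the feature points on the actual workpiece
    with the taught reference spacing; a large difference indicates a workpiece
    of the wrong dimensions or a different model. tol_mm is an assumed tolerance."""
    L1 = math.hypot(P2[0] - P1[0], P2[1] - P1[1])
    L2 = math.hypot(P2a[0] - P1a[0], P2a[1] - P1a[1])
    return abs(L2 - L1) <= tol_mm

ok = same_model((100.0, 200.0), (400.0, 200.0), (102.0, 198.5), (401.0, 203.0))
```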

Furthermore, when an image of a feature point of the workpiece is captured by the camera during actual operation, there is no need, as in conventional methods, to carefully position the feature point at the reference point of the camera's field of view; it is sufficient simply to bring the feature point into the field of view.

In addition, when the camera is moved to the vicinity of the respective feature points, the move can be performed easily by inputting the coordinate information already obtained from the reference workpiece into the robot controller, which greatly reduces the effort of setting the camera position.

[Effects of the Invention] As described above, according to the present invention, the feature points to be detected on the workpiece are indicated in advance with the reference tool, the size of the workpiece is obtained from the absolute coordinates of these feature points, and only the feature points of the workpiece are then detected by the camera. The camera's field of view can therefore be made small and its resolution increased, so that the deviation can be detected with high accuracy, while the absolute dimensions can still be obtained even for relatively large workpieces. This makes it possible to control a robot that handles many kinds of workpieces with high precision.

[Brief Description of the Drawings]

FIG. 1 is a perspective view showing an embodiment of the present invention, FIG. 2 is a plane coordinate diagram illustrating the deviation of the workpiece, and FIG. 3 is a flowchart. 1: robot arm; 2: workpiece gripping portion; 3: camera; 4: reference tool.

(A procedural amendment filed by the applicant, Yaskawa Electric Corp, for Japanese Patent Application No. 172456/1990 corrects FIG. 1 and FIG. 2 as shown in the attached sheets.)

Claims (1)

[Claims]

1. A position detection method in which a camera is provided at the tip of a robot arm to detect the position of a workpiece, comprising: attaching a reference tool to the tip of the robot arm; indicating in advance, with the reference tool, the positions of a plurality of feature points on a reference workpiece having the same shape as an actual workpiece, and storing coordinate information of the indicated feature points; moving the robot arm and storing, with the camera, the position of each feature point of the reference workpiece within the camera's field of view; storing the position, within the camera's field of view, of each feature point of the actual workpiece corresponding to a feature point of the reference workpiece; obtaining the deviation between the feature points of the reference workpiece and the feature points of the actual workpiece from the stored positions of the two within the camera's field of view; and obtaining the deviation of the actual workpiece from said deviation and the coordinate information of the feature points of the reference workpiece.

2. The position detection method for a robot with vision according to claim 1, wherein coordinate information of the feature points of the actual workpiece is obtained from the coordinate information of the feature points of the reference workpiece and the deviation of the actual workpiece, and shape information of the reference workpiece and of the actual workpiece is thereby obtained.
JP17245690A 1990-06-28 1990-06-28 Position detection for robot with visual sense Pending JPH04100118A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP17245690A JPH04100118A (en) 1990-06-28 1990-06-28 Position detection for robot with visual sense

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP17245690A JPH04100118A (en) 1990-06-28 1990-06-28 Position detection for robot with visual sense

Publications (1)

Publication Number Publication Date
JPH04100118A true JPH04100118A (en) 1992-04-02

Family

ID=15942331

Family Applications (1)

Application Number Title Priority Date Filing Date
JP17245690A Pending JPH04100118A (en) 1990-06-28 1990-06-28 Position detection for robot with visual sense

Country Status (1)

Country Link
JP (1) JPH04100118A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008080466A (en) * 2006-09-28 2008-04-10 Daihen Corp Teaching method of carrier robot
JP2012122786A (en) * 2010-12-07 2012-06-28 Honda Motor Co Ltd Method and device for alignment of works


Similar Documents

Publication Publication Date Title
US11046530B2 (en) Article transfer apparatus, robot system, and article transfer method
EP1215017B1 (en) Robot teaching apparatus
CN108965690B (en) Image processing system, image processing apparatus, and computer-readable storage medium
EP1607194B1 (en) Robot system comprising a plurality of robots provided with means for calibrating their relative position
JP5815761B2 (en) Visual sensor data creation system and detection simulation system
JP4004899B2 (en) Article position / orientation detection apparatus and article removal apparatus
JP2004508954A (en) Positioning device and system
US11565422B2 (en) Controller of robot apparatus for adjusting position of member supported by robot
JP2011031346A (en) Apparatus and method for measuring position of tool end point of robot
EP1574300A2 (en) Measuring system
US20180354137A1 (en) Robot System Calibration
US11826919B2 (en) Work coordinate generation device
CN112419406B (en) Object detection device and computer program product for object detection
JPH1063324A (en) Picture input-type robot system
JP7384653B2 (en) Control device for robot equipment that controls the position of the robot
JPH04100118A (en) Position detection for robot with visual sense
JP6750406B2 (en) Failure factor identification method in loading/unloading work of loading/unloading device
JPH04269194A (en) Plane measuring method
JP2003121112A (en) Location detecting apparatus
JP2007025991A (en) Automatic working method for work and automatic working system for work
JP3562096B2 (en) Position detection method
JPH07136959A (en) Robot positioning method
JPH0731536B2 (en) Teaching data correction robot
JPH0477806A (en) Detecting device for position error of robot
JP2601176Y2 (en) Work handling equipment