JPS60151712A - Calibration system for robot visual coordinate system - Google Patents

Calibration system for robot visual coordinate system

Info

Publication number
JPS60151712A
JPS60151712A (application JP636584A; other version JPH0727408B2)
Authority
JP
Japan
Prior art keywords
robot
coordinate system
fixed
visual
error
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP636584A
Other languages
Japanese (ja)
Other versions
JPH0727408B2 (en)
Inventor
Takushi Okada
岡田 拓史
Koichi Sugimoto
浩一 杉本
Muneyuki Sakagami
坂上 志之
Seiji Hata
清治 秦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP59006365A priority Critical patent/JPH0727408B2/en
Priority to EP85100512A priority patent/EP0151417A1/en
Publication of JPS60151712A publication Critical patent/JPS60151712A/en
Publication of JPH0727408B2 publication Critical patent/JPH0727408B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/33Director till display
    • G05B2219/33263Conversion, transformation of coordinates, cartesian or polar
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37572Camera, tv, vision

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

PURPOSE: To calibrate the deviation between a fixed visual coordinate system and a robot coordinate system, and thereby ensure accurate robot work, by calculating the spatial error produced when a fixed visual sensor is mounted with respect to the robot coordinate system. CONSTITUTION: A robot 1 and a visual sensor fixture 7 are attached to a robot mounting stage 2, and a fixed visual sensor 3 is attached to the fixture 7. The robot 1 is moved according to the position/orientation command values given by a robot controller 6. An image of the robot 1 after the movement is captured by the sensor 3 and sent to a picture processor 4, which measures the position and orientation of the robot 1. Meanwhile, an error calculator 5 computes the position and orientation of the robot 1 in the fixed visual coordinate system from the command values given by the controller 6. From this computed result and the measured value, the mounting error of the sensor 3 is calculated, and the deviation between the robot coordinate system and the fixed visual coordinate system is calibrated according to this error.

Description

[Detailed Description of the Invention]

[Field of Application]

The present invention relates to the calibration of the spatial deviation between a visual coordinate system and a robot coordinate system when a robot performs work using a fixed visual device, and in particular to a robot visual coordinate system calibration method suitable for accurately handling, by means of vision, objects whose position and orientation are arbitrary.

[Background of the Invention]

Conventionally, when a robot performs work using vision, the visual system recognizes a planar binary or multi-valued image and handling is carried out on that basis. Handling has therefore been limited to planar objects whose posture is fixed. In that case, only translation on the plane and rotation within the plane need be considered as the error between the robot coordinate system and the visual coordinate system, so calibration of the coordinate systems is comparatively easy, and the planar vision systems now in practical use perform this kind of calibration. However, as handling work by robots has increased and the need to grasp complex objects has arisen, it has become necessary to recognize not only planar objects but also three-dimensional objects whose position and orientation are not fixed. Three-dimensional object recognition methods using binocular vision or range images are known for this purpose. With such three-dimensional vision, the spatial error between the robot coordinate system and the visual coordinate system must be calibrated, but no such technique has yet been established.

[Object of the Invention]

The object of the present invention is to provide a robot visual coordinate system calibration device that calculates the spatial error arising when a fixed visual sensor is mounted with respect to the robot coordinate system, calibrates the deviation between the fixed visual coordinate system and the robot coordinate system, and thereby enables accurate work when the robot operates using fixed vision.

[Summary of the Invention]

The robot visual coordinate system calibration system according to the present invention comprises a robot, a fixed visual sensor mounted near it, an image processing device that processes the sensor's image data, an error calculation device, and a robot control device that controls and processes the above components. Under the control of the robot control device, the robot is moved to a commanded position; the fixed visual sensor and the image processing device then measure the robot's position and orientation after the movement in the fixed visual coordinate system. The error calculation device further computes the position and orientation of the movement command in the fixed visual coordinate system, and from this computed result and the measured result calculates the mounting error of the fixed visual sensor, so that the deviation between the robot coordinate system and the fixed visual coordinate system is calibrated.

The following supplements this.

To have a robot handle an object of arbitrary position and orientation, the position and orientation of the object must be measured and the robot hand directed along the object's orientation. For this, the visual coordinate system in which position and orientation are measured must be accurately calibrated against the robot coordinate system in which the robot moves. If there is a deviation between these coordinate systems, then when the robot is moved according to commanded position/orientation values and its resulting position and orientation are measured visually, an error appears between the two. The invention exploits this fact in reverse: from the error between the commanded position and orientation of the robot and the position and orientation measured visually, the mounting error incurred when fixing the visual sensor with respect to the robot coordinate system can be obtained. Using this error, the robot coordinate system and the visual coordinate system can be calibrated against each other.

[Embodiments of the Invention]

Embodiments of the present invention will be described below with reference to the drawings.

FIG. 1 is an explanatory diagram of one embodiment of the robot visual coordinate system calibration device according to the present invention; FIG. 2 is a conceptual diagram of the relative relationship between the robot coordinate system and the fixed visual coordinate system; FIG. 3 is a conceptual diagram illustrating the mounting error that arises when the fixed visual sensor is fixed with respect to the robot coordinate system; and FIG. 4 is a functional block diagram of the present system.

Here, 1 is the robot, 2 is the robot mounting stage, 3 is the fixed visual sensor, 4 is the image processing device, 5 is the error calculation device, 6 is the robot control device, and 7 is the visual sensor fixture.

As shown in FIG. 2, the relationship between the robot coordinate system and the fixed visual coordinate system can be expressed as follows. Let the origin of the robot coordinate system be OR with axes xR, yR, zR, and the origin of the fixed visual coordinate system be OV with axes xV, yV, zV. Then

OV = OR + Lx·xR + Ly·yR + Lz·zR ...(1)
(xV, yV, zV) = (M·xR, M·yR, M·zR) ...(2)

Here, L = (Lx, Ly, Lz) is the vector representing the displacement between the two origins, and M is the matrix representing the three rotational displacements between the two coordinate systems. For example, taking the rotation angles α, β, γ as shown in FIG. 3,

M = Rz(γ)·Ry(β)·Rx(α) ...(3)

where Rx, Ry, Rz denote rotations about the respective axes.

Now suppose the robot is commanded to move, in the robot coordinate system, to position X = (x, y, z) with orientation f = (fx, fy, fz). If the parameters Lx, Ly, Lz, α, β, γ agree with their design values and contain no error, then

X = x·xR + y·yR + z·zR + OR ...(4)
f = fx·xR + fy·yR + fz·zR ...(5)

If this position and orientation, measured by the fixed visual sensor 3, are X' = (x', y', z') and f' = (fx', fy', fz'), then

X' = x'·xV + y'·yV + z'·zV + OV ...(6)
f' = fx'·xV + fy'·yV + fz'·zV ...(7)

The relation between x, y, z and x', y', z', and between fx, fy, fz and fx', fy', fz', can be obtained using equations (1) to (3).

However, if an error arises when mounting the fixed visual sensor 3, the errors in Lx, Ly, Lz, α, β, γ are expressed as ΔLx, ΔLy, ΔLz, Δα, Δβ, Δγ.

In this case each parameter in equations (1) to (3) becomes (Lx + ΔLx), (α + Δα), and so on. Consequently, even if the robot is commanded to move to X = (x, y, z), f = (fx, fy, fz), and the result is measured with the fixed visual sensor 3, the values obtained in the fixed visual coordinate system are not X' = (x', y', z'), f' = (fx', fy', fz') but the deviated values X'' = (x'', y'', z''), f'' = (fx'', fy'', fz''). Therefore, once x', y', z' and x'', y'', z'', and fx', fy', fz' and fx'', fy'', fz'' have been determined, ΔLx, ΔLy, ΔLz, Δα, Δβ, Δγ can be obtained by solving equations (1) to (3).

The configuration and operation of the embodiment will now be described with reference to FIGS. 1 through 3, following FIG. 4.

The robot 1 moves according to the position/orientation command values X, f (equations (4), (5)) from the robot control device 6, and the fixed visual sensor 3 captures an image of the robot 1 after the movement and sends it to the image processing device 4. The image processing device 4 measures the position and orientation X'', f'' of the robot 1 as seen from the fixed visual sensor 3. Meanwhile, the error calculation device 5 computes, from the command values X, f given by the robot control device 6, the position and orientation X', f' in the fixed visual coordinate system using equations (1), (2) and (3). The computed X', f' assume no parameter error, whereas the values X'', f'' measured by the fixed visual sensor 3 contain the parameter errors. The mounting error can therefore be obtained by comparing X', f' with X'', f''.

[Effects of the Invention]

As described above, according to the present invention the three-dimensional mounting error that arises when a fixed visual sensor is fixed with respect to the robot coordinate system can be obtained simply by moving the robot. Consequently, (1) the three-dimensional mounting error can be obtained and corrected, so that not only planar objects but also three-dimensional objects of arbitrary position and orientation can be handled accurately using fixed vision; and (2) since the mounting error is obtained merely by moving the robot and measuring, the calibration work is far simpler than attaching measuring instruments and measuring the error directly, yielding marked improvements in the accuracy and efficiency of robot work.

[Brief Description of the Drawings]

FIG. 1 is an explanatory diagram of one embodiment of the robot visual coordinate system calibration system according to the present invention; FIG. 2 is a conceptual diagram of the relative relationship between the robot coordinate system and the fixed visual coordinate system; FIG. 3 is a conceptual diagram illustrating the mounting error that arises when the fixed visual sensor is fixed with respect to the robot coordinate system; and FIG. 4 is a functional block diagram of the present system. 1: robot; 2: robot mounting stage; 3: fixed visual sensor; 4: image processing device; 5: error calculation device.

Claims (1)

[Claims] 1. A robot visual coordinate system calibration system comprising: a robot; a fixed visual sensor fixed in the vicinity of the robot; an image processing device that processes image data from the fixed visual sensor; an error calculation device; and a robot control device that controls and processes the above components; wherein, under the control of the robot control device, the robot is moved to a commanded position; the position and orientation of the robot after the movement are measured in a fixed visual coordinate system by the fixed visual sensor and the image processing device; the error calculation device computes the position and orientation of the movement command for the robot in the fixed visual coordinate system; the mounting error of the fixed visual sensor is calculated from the computed result and the measured result; and the deviation between the robot coordinate system and the fixed visual coordinate system is thereby calibrated.
JP59006365A 1984-01-19 1984-01-19 Robot handling device with fixed 3D vision Expired - Lifetime JPH0727408B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP59006365A JPH0727408B2 (en) 1984-01-19 1984-01-19 Robot handling device with fixed 3D vision
EP85100512A EP0151417A1 (en) 1984-01-19 1985-01-18 Method for correcting systems of coordinates in a robot having visual sensor device and apparatus therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP59006365A JPH0727408B2 (en) 1984-01-19 1984-01-19 Robot handling device with fixed 3D vision

Publications (2)

Publication Number Publication Date
JPS60151712A true JPS60151712A (en) 1985-08-09
JPH0727408B2 JPH0727408B2 (en) 1995-03-29

Family

ID=11636331

Family Applications (1)

Application Number Title Priority Date Filing Date
JP59006365A Expired - Lifetime JPH0727408B2 (en) 1984-01-19 1984-01-19 Robot handling device with fixed 3D vision

Country Status (1)

Country Link
JP (1) JPH0727408B2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62156701A (en) * 1985-12-27 1987-07-11 Yaskawa Electric Mfg Co Ltd Coordinate calibration system for robot system
JPS63221403A (en) * 1987-03-11 1988-09-14 Mitsubishi Electric Corp Industrial robot device
JPH058186A (en) * 1991-07-04 1993-01-19 Fanuc Ltd Automatic calibration method
CN103878770A (en) * 2014-04-08 2014-06-25 哈尔滨工业大学 Space robot visual delay error compensation method based on speed estimation
CN107309884A (en) * 2016-04-27 2017-11-03 上海福赛特机器人有限公司 Robot calibration system and method
CN110465944A (en) * 2019-08-09 2019-11-19 琦星智能科技股份有限公司 Calculation method based on the industrial robot coordinate under plane visual
CN110664484A (en) * 2019-09-27 2020-01-10 江苏工大博实医用机器人研究发展有限公司 Space registration method and system for robot and image equipment
CN111823230A (en) * 2020-06-19 2020-10-27 山东科技大学 Non-fixed hand-eye relationship calibration method based on Scara robot

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5398670A (en) * 1977-02-07 1978-08-29 Hitachi Ltd Robot
JPS57122384U (en) * 1981-01-20 1982-07-30
JPS57182205A (en) * 1981-03-26 1982-11-10 Yaskawa Electric Mfg Co Ltd Controlling system of robot's locus
JPS58169987A (en) * 1982-03-31 1983-10-06 Tech Res & Dev Inst Of Japan Def Agency Infrared dye laser oscillating device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5398670A (en) * 1977-02-07 1978-08-29 Hitachi Ltd Robot
JPS57122384U (en) * 1981-01-20 1982-07-30
JPS57182205A (en) * 1981-03-26 1982-11-10 Yaskawa Electric Mfg Co Ltd Controlling system of robot's locus
JPS58169987A (en) * 1982-03-31 1983-10-06 Tech Res & Dev Inst Of Japan Def Agency Infrared dye laser oscillating device

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62156701A (en) * 1985-12-27 1987-07-11 Yaskawa Electric Mfg Co Ltd Coordinate calibration system for robot system
JPS63221403A (en) * 1987-03-11 1988-09-14 Mitsubishi Electric Corp Industrial robot device
JPH058186A (en) * 1991-07-04 1993-01-19 Fanuc Ltd Automatic calibration method
CN103878770A (en) * 2014-04-08 2014-06-25 哈尔滨工业大学 Space robot visual delay error compensation method based on speed estimation
CN107309884A (en) * 2016-04-27 2017-11-03 上海福赛特机器人有限公司 Robot calibration system and method
CN110465944A (en) * 2019-08-09 2019-11-19 琦星智能科技股份有限公司 Calculation method based on the industrial robot coordinate under plane visual
CN110465944B (en) * 2019-08-09 2021-03-16 琦星智能科技股份有限公司 Method for calculating coordinates of industrial robot based on plane vision
CN110664484A (en) * 2019-09-27 2020-01-10 江苏工大博实医用机器人研究发展有限公司 Space registration method and system for robot and image equipment
CN111823230A (en) * 2020-06-19 2020-10-27 山东科技大学 Non-fixed hand-eye relationship calibration method based on Scara robot
CN111823230B (en) * 2020-06-19 2022-01-07 山东科技大学 Non-fixed hand-eye relationship calibration method based on Scara robot

Also Published As

Publication number Publication date
JPH0727408B2 (en) 1995-03-29

Similar Documents

Publication Publication Date Title
CN106056664B (en) A kind of real-time three-dimensional scene reconstruction system and method based on inertia and deep vision
CN102825602B (en) PSD (Position Sensitive Detector)-based industrial robot self-calibration method and device
CN110842901B (en) Robot hand-eye calibration method and device based on novel three-dimensional calibration block
CN107152911A (en) Based on the PSD dot laser sensors fed back and the scaling method of robot relative position
CN105091744A (en) Pose detection apparatus and method based on visual sensor and laser range finder
EP2972080B1 (en) Device for three-dimensional scanning, and method thereof
WO2014204548A1 (en) Systems and methods for tracking location of movable target object
JP2014013147A (en) Three-dimensional measuring instrument and robot device
CN109544630A (en) Posture information determines method and apparatus, vision point cloud construction method and device
CN104802173A (en) Data generation device for vision sensor and detection simulation system
CN107229043B (en) A kind of range sensor external parameters calibration method and system
Kim et al. On-line initialization and extrinsic calibration of an inertial navigation system with a relative preintegration method on manifold
JPH0328714A (en) Measuring and control system for sensor scanning
Gratal et al. Visual servoing on unknown objects
CN108648242A (en) Two camera scaling methods and device without public view field are assisted based on laser range finder
JPS60151712A (en) Calibration system for robot visual coordinate system
CN105157691B (en) A kind of determination method and device in compass orientation
CN107121128A (en) A kind of measuring method and system of legged type robot terrain parameter
CN117853441A (en) Visual touch sensor detection method and device, visual touch sensor and electronic equipment
CN109282774A (en) A kind of device and method solving ball-joint Three Degree Of Freedom posture based on range measurement
Cui et al. Novel method of rocket nozzle motion parameters non-contact consistency measurement based on stereo vision
KR20130032764A (en) Apparatus and method for generating base view image
Xu et al. A new active visual system for humanoid robots
TWI764393B (en) Manufacturing method of pressure garment
Handel Compensation of thermal errors in vision based measurement systems using a system identification approach