EP0208406B1 - Method of detecting and controlling the work start point of a robot - Google Patents

Method of detecting and controlling the work start point of a robot

Info

Publication number
EP0208406B1
Authority
EP
European Patent Office
Prior art keywords
work
edge
tool
image
correcting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
EP86304120A
Other languages
German (de)
English (en)
Other versions
EP0208406A2 (fr)
EP0208406A3 (en)
Inventor
Satoru Nio
Hitoshi Wakisako
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yaskawa Electric Corp
Original Assignee
Yaskawa Electric Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yaskawa Electric Manufacturing Co Ltd filed Critical Yaskawa Electric Manufacturing Co Ltd
Publication of EP0208406A2 publication Critical patent/EP0208406A2/fr
Publication of EP0208406A3 publication Critical patent/EP0208406A3/en
Application granted granted Critical
Publication of EP0208406B1 publication Critical patent/EP0208406B1/fr
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1684Tracking a line or surface by means of sensors

Definitions

  • the present invention relates to a method of detecting and controlling a work start point of a robot having an interpolating function and a response function to a position sensor.
  • In teaching playback type robots, a workpiece must be positioned with a high degree of accuracy so that the locus of the edge of the robot end effector, as programmed, coincides with the work line of the workpiece. When there is a variation in the dimensional accuracy of the workpiece itself, or when the workpiece is not properly positioned, the robot cannot work in accordance with the programmed locus data.
  • robots equipped with a position error detecting sensor have recently been studied and developed, and some of them have already been put into practical use ("Werkstattstechnik - Zeitschrift für industrielle Fertigung", vol. 74, no. 12, December 1984, pages 721-724, article by R. Strauch).
  • the mechanical type has various drawbacks which are caused because of contact with the workpiece.
  • the magnetic type also has various drawbacks in that the sensor must be arranged near the robot end effector in terms of detection sensitivity.
  • the arc sensor type has the drawback that high precision cannot be obtained due to variations in welding condition and characteristic of the welding arc itself.
  • Fig. 2 is a diagram when the welding torch is located at the taught welding start point 3.
  • 10 denotes a welding torch; 11 a wire electrode; 12 the wrist of the robot; 13 the upper arm of the robot; 14 a two-dimensional camera; 15 a cable through which a video signal (i.e., image pickup signal) of the camera is transmitted; 16 an illumination source (e.g., halogen lamp); 17 the shadow of the wire electrode 11; 18 a camera controller; 20 a monitor television; and 21 an image processor.
  • the illumination source 16 is provided to allow the shadow 17 of the wire electrode 11 to be invariably produced without being influenced by the ambient light.
  • the two-dimensional camera 14 serves to recognize the welding line and the relation of the relative position between the wire electrode 11 and the shadow 17 thereof.
  • Fig. 3B shows the state in which the welding start point 3' is displaced to the left side and, at the same time, the edge of the wire electrode 11 comes into contact with the upper surface of the lower plate 5' of the work piece.
  • Fig. 3D shows the state in which the welding start point 3' is displaced to the right and at the same time, the edge of the wire electrode 11 comes into contact with the upper surface of the upper plate 6' of the work piece.
  • Fig. 3E shows the state in which the welding start point 3' is located below the welding start point 3, although there is no displacement in the horizontal direction between the edge of the wire electrode 11 and the welding line 4 in the image detected by the image sensor.
  • Fig. 3G shows the state in which there is no displacement in the horizontal direction between the edge of the wire electrode 11 and the welding line 4' and the edge of the wire electrode 11 coincides with the edge of the shadow thereof and, at the same time, the welding start point 3 coincides with the welding starting point 3'.
  • Fig. 4 is a diagram explaining the fundamental concept of the image processing method in the image processor 21.
  • 40 denotes a frame memory of the binary video signal;
  • 41 is a window to recognize the image of a wire electrode signal 11'';
  • 42 a window to recognize a shadow signal 17'';
  • 43 a window to recognize a welding line signal 4';
  • 44 an axial line of the wire electrode signal 11'';
  • 44'' an edge of the axial line 44;
  • 45 an axial line of the shadow signal 17''; and 45'' an edge of the axial line 45.
  • the windows 41, 42, and 43 are set by the operator while observing the TV monitor 20 when the locus data is taught (the welding start point 3 and welding line 4 at the work position 2 in Fig. 1 are taught) based on variations in operating workpieces, i.e., variations in welding line 4'.
  • the windows 41 to 43 are set such that even if there are variations in workpieces as well, the workpieces exist in the windows 41, 42 and 43.
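As an illustration only, the window-based recognition described above, locating the axial line of a binary blob inside an operator-taught window and taking its lowest point as the edge, might be sketched in Python as follows. The function name, the window tuple layout, and the list-of-rows frame format are assumptions for the sketch, not the patent's implementation:

```python
def axial_line_edge(frame, window):
    """Find the axial (vertical center) line of a binary blob inside a
    rectangular window of the frame memory 40 and return its lowest
    point, i.e. the 'edge' of the axial line (e.g. 44'' for the wire
    electrode signal 11'').

    frame  -- 2-D binary image as a list of rows (0/1 values)
    window -- (row0, row1, col0, col1) bounds, as taught by the operator
    """
    r0, r1, c0, c1 = window
    hits = [(r, c) for r in range(r0, r1) for c in range(c0, c1) if frame[r][c]]
    if not hits:
        return None                                       # blob missed the window
    axial_col = round(sum(c for _, c in hits) / len(hits))  # axial line 44
    edge_row = max(r for r, _ in hits)                    # lowest pixel = edge 44''
    return (edge_row, axial_col)

# toy frame: a vertical "wire electrode" stripe in column 4, rows 0-5
frame = [[0] * 10 for _ in range(10)]
for r in range(6):
    frame[r][4] = 1
print(axial_line_edge(frame, (0, 10, 0, 10)))  # -> (5, 4)
```

The same helper would be applied once per window (41, 42, 43) to obtain the electrode edge, the shadow edge, and a point on the welding line signal.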
  • the differences between the analog speed command signals as outputs of these D/A converters and the speed feedback signals from the tachogenerators 615, 625, 635, 645, and 655 are compensated and amplified by the servo amplifiers 613, 623, 633, 643, 653, thereby driving and controlling the motors 614, 624, 634, 644, and 654.
  • the servo control is performed in a conventional manner. Due to this position servo control, the edge of the wire electrode 11 attached to the edge of the welding torch 10 attached to the robot wrist 12 is controlled so as to follow in accordance with the command pulse.
  • the robot control unit 51 has the well-known linear interpolating function, disclosed in EP-A-60563 entitled “Control Unit of Industrial Robot of the Joint Type” filed on March 18, 1981 by the same applicant as in the present invention.
  • the robot control unit 51 also has the moving function set forth in EP-A-76498 entitled “Control System of Welding Robot” filed on October 7, 1981 by the same applicant as the present invention.
  • the moving function, called the shifting function, moves the robot in a predetermined second correcting direction, e.g., the wire electrode direction (vertical direction), and in a predetermined first correcting direction, e.g., the (horizontal) direction perpendicular to the wire electrode direction, by controlling the three fundamental axes of the robot.
  • the edge of the wire electrode 11 can move at a desired speed on the straight line connecting the welding start point 3 and the welding end point 8 shown in Fig. 1.
  • the robot control unit 51 further has the function that the edge of the wire electrode 11 can be shifted in accordance with the commands of the sensor by controlling the three fundamental axes of the robot in response to the sensor commands in the vertical and horizontal directions in the vertical cross sectional area in the welding direction shown in Fig. 6.
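As a rough, hypothetical illustration of the linear interpolating function (not the control unit 51's actual code), sampling intermediate points on the straight line connecting the welding start and end points at a roughly fixed step could look like this; `interpolate_line` and its `step` parameter are assumed names:

```python
import math

def interpolate_line(p0, p1, step):
    """Sample intermediate points on the straight line p0 -> p1 at a
    roughly fixed step length, as a stand-in for the linear
    interpolating function used to move the wire electrode edge from
    the welding start point to the welding end point."""
    n = max(1, round(math.dist(p0, p1) / step))   # number of segments
    return [tuple(a + (b - a) * t / n for a, b in zip(p0, p1))
            for t in range(n + 1)]

pts = interpolate_line((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), 2.0)
print(len(pts), pts[0], pts[-1])  # -> 6 (0.0, 0.0, 0.0) (10.0, 0.0, 0.0)
```

In the robot itself the interpolated points would be converted to joint commands for the three fundamental axes; that conversion is outside this sketch.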
  • When the edge 44'' of the axial line of the wire electrode signal described in Fig. 4 does not coincide with the edge 45'' of the axial line of the shadow signal 17'', the image processor 21 generates, for example, a "down" direction signal as the welding start point search command signal 50 in the case of Figs. 3A, 3C, 3E, and 4. When the edge 44'' of the axial line is not located on the welding line signal 4'', the image processor 21 generates either a "left" or a "right" direction signal as the search command signal 50: the "left" signal in the case of Figs. 3A, 3B, and 4, and the "right" signal in the case of Figs. 3C and 3D.
  • the image processor 21 sequentially receives the video signal from the camera 14, executes the image processing, and sequentially transmits the "left", "right", "up", and "down" signals to the robot control unit 51.
  • the control unit 51 drives the three fundamental axes of the robot to perform the shifting operation each time it receives the signal 50. Due to the above controls, the states shown in Figs. 3A to 3E become the state of Fig. 3F.
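A minimal sketch of this search behavior, assuming image (row, col) coordinates with rows growing downward, a hypothetical `search_command` helper, and an arbitrary priority between the horizontal and vertical checks (the text does not fix one):

```python
def search_command(wire_edge, shadow_edge, line_col):
    """Generate the next welding start point search command signal 50
    ('left', 'right', 'down', or None once the state of Fig. 3F is
    reached) from the recognized image features."""
    if wire_edge[1] != line_col:                 # edge 44'' off welding line 4''
        return 'left' if wire_edge[1] > line_col else 'right'
    if wire_edge[0] != shadow_edge[0]:           # edge 44'' not on shadow edge 45''
        return 'down'
    return None

# drive shift commands until coincidence (states of Figs. 3A-3E -> 3F)
wire, shadow_edge, line_col = [10, 20], (16, 14), 14
while (cmd := search_command(wire, shadow_edge, line_col)) is not None:
    if cmd == 'left':
        wire[1] -= 1
    elif cmd == 'right':
        wire[1] += 1
    elif cmd == 'down':
        wire[0] += 1
print(wire)  # -> [16, 14]
```

In reality each command would trigger one shifting operation of the three fundamental axes and a fresh image would then be processed; the loop above only mimics that feedback cycle on fixed toy coordinates.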
  • the edge of the wire electrode 11 may be controlled in the following manner.
  • the movement of the wire electrode edge in the second correcting direction is stopped and the edge is moved in the first correcting direction until the edge of the wire electrode 11 coincides with the welding line 4' on the image.
  • the wire electrode edge is then moved in the second correcting direction until it coincides with the edge of the shadow of the wire electrode 11 on the image.
  • the position of the edge of the wire electrode 11 is corrected along the first correcting direction in order to reduce the displacement between the welding line on the image and the edge of the wire electrode 11.
  • the elongated line 3''' - 8''' of the straight line connecting the point 8''' and 9' is set to a reference line when moving from the point 9' to the point 3' (from the state of Fig. 3F to the state of Fig. 3G).
  • the wire electrode 11 is moved along the straight line 8''' - 3'''. This movement is carried out according to a conventional linear interpolating method in the manner described in, e.g., Japanese Patent Application No. 38872/1981.
  • In Fig. 7B, when the edge of the wire electrode 11 advances to the point 9'', this edge does not exist on the welding line 4'.
  • the signal 50, which is produced by image processing sampled on an intermittent basis, is generated in order to represent any one of the states of Figs. 3A to 3E.
  • the robot control unit 51 receives the signal 50 and controls the three fundamental axes of the robot in the manner described above, thereby shifting the edge of the wire electrode 11 onto a point 9'''. Thereafter, the control unit 51 controls the three fundamental axes of the robot so that the edge of the wire electrode 11 moves in a straight line which is parallel to the straight line 8''' 3''' until the next signal 50 is given.
  • the control in this parallel moving mode has been disclosed in detail as the well-known technology in Japanese Patent Application No. 158627/1981.
  • Figs. 8A and 8B are diagrams explaining a method whereby the image at the welding start point 3' is recognized from a binary image 3'' at the welding start point 3'.
  • In association with the movement of the edge of the wire electrode 11 from the point 9' to the point 3' described with reference to Figs. 7A and 7B, the camera 14 also moves along the welding line 4', since the camera 14 is fixed to the robot wrist 12 so as to have a predetermined geometrical relation with regard to the wire electrode 11 as shown in Fig. 2. Therefore, the window 43, which has already been described in Fig. 4, also moves along the welding line 4', i.e., from 4'' to 3'' on the screen.
  • When the window 43 moves from the state of Fig. 8A to that of Fig. 8B, the point at which the image line connecting with the point 4'' suddenly changes can be recognized as the welding start point 3'', i.e., point 3'.
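One way to sketch this "sudden change" detection, assuming the welding line has been traced into an ordered list of (row, col) image points and using a hypothetical turn-angle threshold:

```python
import math

def find_sudden_change(points, angle_thresh_deg=30.0):
    """Return the index of the point where the polyline formed by
    successive image-line points suddenly changes direction, i.e. the
    recognized welding start point 3''. Returns None on a straight line."""
    def heading(a, b):
        return math.atan2(b[0] - a[0], b[1] - a[1])
    for i in range(1, len(points) - 1):
        turn = abs(heading(points[i - 1], points[i]) -
                   heading(points[i], points[i + 1]))
        turn = min(turn, 2 * math.pi - turn)   # wrap the angle into [0, pi]
        if math.degrees(turn) > angle_thresh_deg:
            return i
    return None

# a horizontal image line that bends sharply downward at index 4
line = [(0, c) for c in range(5)] + [(r, 4) for r in range(1, 4)]
print(find_sudden_change(line))  # -> 4
```

The threshold would in practice be chosen from the taught joint geometry; 30 degrees here is purely illustrative.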
  • the edge of the wire electrode 11 is positioned at the welding start point 3' derived in this manner and thereafter actual welding is started along the welding line (3' 8'). In execution of actual welding along a locus, two methods can be considered.
  • a reverse straight line locus from 9' to 3' is stored in advance in a memory (not shown) and the welding locus is obtained according to the linear interpolating method on the assumption that the welding end point 8' exists on the elongated line of the welding line 3' 9'.
  • Copy welding is executed from the welding start point 3' by use of a sensor.
  • It should be noted that the camera 14 is obliquely arranged in front of the wire electrode 11 and the welding line 4' and that, strictly speaking, it generates none of the "up", "down", "left", and "right" signals in the cross-sectional area which is perpendicular to the welding line direction 7' described in Fig. 6.
  • the video signals 4'', 11'', and 17'' correspond to the images projected onto the camera surface of the actual welding line 4', actual wire electrode 11, and actual shadow 17, so that these video signals are reduced by factors equal to the sine or cosine of the angles between the camera surface and the actual welding line 4', actual wire electrode 11, and actual shadow 17, respectively.
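The foreshortening mentioned above can be made concrete with a one-line illustration: a feature tilted away from the image plane appears shortened by the cosine of the tilt angle. This is a geometric aside, not code from the patent:

```python
import math

def projected_length(actual_length, tilt_deg):
    """Length of a feature as it appears on the camera surface when the
    feature is tilted by tilt_deg away from the image plane: the image
    is foreshortened by cos(tilt)."""
    return actual_length * math.cos(math.radians(tilt_deg))

print(projected_length(10.0, 60.0))  # -> ~5.0 (a 10 mm feature at 60 deg)
```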
  • Fig. 9 shows an example of a workpiece set at an instructed work position 101.
  • 103A and 103B denote welding start points; 104A and 104B welding lines; 105 a lower plate on which the welding lines are formed; 106A and 106B upper plates or upper base materials; 107A and 107B arrows indicating the welding directions; 108A and 108B welding end points; and 110 a rotating motor and position detector (hereinafter referred to as the R axis or R axis data) to rotate the camera 14 around the robot wrist as a rotational center.
  • the welding directions 107A and 107B are substantially opposite directions.
  • the camera 14 observes the welding lines 104A and 104B, wire electrode 11, and shadow 17 thereof from the oblique forward direction of the welding line direction.
  • the camera 14 is rotated by almost 180° by the R axis motor 110 from the welding start point 103A to the welding start point 103B.
  • the welding line 104A is constituted by an arc-like curve
  • the positioning operations of the wire electrode 11 to the welding start points 103A and 103B can be executed in substantially the same manner as described above except that the parallel moving mode during locus motion which is equivalent to the locus motion from the point 9' to the point 3' described in Figs. 7A and 7B is carried out on the basis of the arc interpolation instead of the straight line.
  • the operator determines the operations to set the angle of rotation of the R axis and to set the windows based on variations in the work piece while observing the TV monitor 20.
  • the video signals 4'', 11'', and 17'' in Fig. 4 rotate.
  • the windows 41 to 43 are decided based on variations of the video signals 4'', 11'', and 17'' after rotation.
  • the taught values of the windows 41, 42, and 43 shown in Fig. 4 near the welding start points 103A and 103B are stored into a RAM memory of the image processor 21 as the coordinate values based on the camera coordinate system (not shown in this specification). Upon playback, these coordinate data are read out of the RAM memory corresponding to the taught welding start points 103A and 103B.
  • the actual video signals 4'', 44'', and 45'' in the windows 41 to 43, respectively, are image processed and recognized.
  • Since the illumination source 16 is directly coupled with the R axis as shown in Fig. 9, the illuminating direction does not change according to the rotation of the R axis. Therefore, it is necessary to calculate the positions of the video signals 4'', 11'', and 17'' in the frame memory 40 due to the driving of the R axis on the basis of the geometrical relation among the optical axis of the camera coupled to the R axis, the direction of the torch axial line coupled to the robot wrist 12 (i.e., the direction of the wire electrode 11), and the work surface onto which the shadow of the wire electrode is produced, and to calculate the windows 41 to 43 having allowable variation widths. Since these calculations can be performed by use of conventional mathematical techniques, only the idea thereof is mentioned here and its detailed description is omitted in this specification.
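The core of such a calculation is an in-plane rotation of the taught window coordinates by the R-axis angle. A minimal sketch, assuming camera-plane (x, y) coordinates, rotation about the image center, and a hypothetical `rotate_window` helper (the full geometric relation involving the torch axis and work surface is omitted here, as in the specification):

```python
import math

def rotate_window(corners, angle_deg, center):
    """Rotate taught window corner coordinates (camera coordinate
    system) by the R-axis angle about the image center, giving where
    the video signals 4'', 11'', and 17'' land in the frame memory 40
    after the rotation."""
    a = math.radians(angle_deg)
    cx, cy = center
    out = []
    for x, y in corners:
        dx, dy = x - cx, y - cy                 # offset from rotation center
        out.append((cx + dx * math.cos(a) - dy * math.sin(a),
                    cy + dx * math.sin(a) + dy * math.cos(a)))
    return out

# a corner 2 units right of center, after the ~180 deg turn from 103A to 103B
print(rotate_window([(6, 4)], 180.0, (4, 4)))  # -> [(2.0, ~4.0)]
```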
  • the wire electrode can be led to the welding start point by use of well-known technology (e.g., Japanese Patent Laid-open Publication No. 22488/1980) in which the control of the "left" and "right" directions (i.e., position matching control between 44'' and 4'' in Fig. 4) is carried out in a manner similar to the above and the control in the "up" and "down" directions is executed by means of the contact between the wire electrode 11 and the base material.
  • the present invention has been described above with respect to the camera as an example. However, the invention can obviously also be applied to a system in which a small-size and light-weight visual sensor such as a fiberscope is attached to the robot wrist and an image pickup signal at the edge of the fiberscope is transmitted to a camera arranged at a remote position (e.g., on the upper arm of the robot).

Claims (6)

  1. A method of detecting a work start point of a robot of the teaching playback type, having an interpolating function, in which a tool is attached to one end of an arm and an illumination source and a visual sensor are associated with the wrist portion of the arm, whereby a set of drive axes constituting the robot are controlled in dependence on position correction signals from the visual sensor so as to adjust the position of the tool, the method being characterized by the following steps:
    (1) producing a shadow of the tool by means of the illumination source;
    (2) performing image recognition of a work line of a workpiece, of the tool, and of the shadow of the tool by means of the visual sensor;
    (3) generating position correction signals in first and second directions from the visual sensor on the basis of the image recognition performed by the visual sensor;
    (4) moving the edge of the tool in the first correcting direction, in dependence on the position correction signal, so as to reduce the distance between the work line and the edge of the tool on the recognized image;
    (5) moving the edge of the tool in the second correcting direction, in dependence on the position correction signal, so as to reduce the distance between the edge of the tool and the edge of the shadow of the tool on the recognized image; and
    (6) repeating steps (3), (4), and (5) until the work line, the edge of the tool, and the edge of the shadow of the tool all coincide on the recognized image.
  2. A method according to claim 1, characterized by the following additional step:
    (7) repeating steps (3), (4), and (5), while following, in the direction opposite to the taught work direction, an extension of the straight line connecting the coincidence point obtained in step (6) with a point to which the taught work end point is displaced by the vector between that coincidence point and the taught work start point, until the visual sensor detects a sudden change of direction at a point of the work line on the recognized image.
  3. A method of detecting a work start point of a robot of the teaching playback type, having an interpolating function, in which a tool is attached to one end of an arm and an illumination source and a visual sensor are associated with the wrist portion of the arm, whereby a set of drive axes constituting the robot are controlled in dependence on position correction signals from the visual sensor so as to adjust the position of the tool, the method being characterized by the following steps:
    (1) producing a shadow of the tool by means of the illumination source;
    (2) performing image recognition of a work line of a workpiece, of the tool, and of the shadow of the tool by means of the visual sensor;
    (3) generating the position correction signals in first and second correcting directions from the visual sensor on the basis of the image recognized by the visual sensor;
    (4) moving the edge of the tool in the first correcting direction, in dependence on the position correction signal, until the work line and the edge of the tool coincide on the recognized image; and
    (5) moving the edge of the tool in the second correcting direction, in dependence on the position correction signal, until the edge of the tool and the edge of the shadow of the tool coincide on the recognized image and, simultaneously, correcting the position of the edge of the tool in the first correcting direction so as to reduce the displacement between the work line and the edge of the tool on the image.
  4. A method according to claim 3, characterized by the following additional step:
    (6) repeating steps (3), (4), and (5), while following, in the direction opposite to the taught work direction, an extension of the straight line connecting the coincidence point obtained in step (5) with a point to which the taught work end point is displaced by the deviation vector between that coincidence point and the taught work start point, until the visual sensor detects a sudden change of direction at a point of the work line on the recognized image.
  5. A method of detecting a work start point of a robot according to claim 1, in which, owing to obstruction problems between the camera and the workpiece, the shadow signal (17'') is not produced at the correct location, the method using a contact sensor to produce a contact signal when the edge of the tool comes into contact with the object to be worked, and being characterized by replacing steps (1) to (6) with the following steps:
    (1') performing image recognition of a work line of a worked object and of the tool by means of the visual sensor;
    (2') generating a position correction signal in a first direction from the visual sensor on the basis of the image recognized by the visual sensor;
    (3') moving the edge of the tool in the first correcting direction, in dependence on the position correction signal, until the work line and the edge of the tool coincide on the recognized image;
    (4') moving the edge of the tool by a predetermined amount in the axial direction of the tool, in a direction such that the contact sensor produces the contact signal; and
    (5') repeating steps (2'), (3'), and (4') until the work line and the edge of the tool coincide on the recognized image and a contact signal is produced.
  6. A method of detecting a work start point of a robot according to claim 3, in which, owing to obstruction problems between the camera and the workpiece, the shadow signal (17'') is not produced at the correct location, the method using a contact sensor to produce a contact signal when the edge of the tool comes into contact with an object to be worked, and being characterized by replacing steps (1) to (6) with the following steps:
    (1'') performing image recognition of a work line of the worked object and of the tool by means of the visual sensor;
    (2'') generating the position correction signal in a first correcting direction from the visual sensor on the basis of the image recognized by the visual sensor;
    (3'') moving the edge of the tool in the first correcting direction, in dependence on the position correction signal, until the work line and the edge of the tool coincide on the recognized image; and
    (4'') moving the edge of the tool in the axial direction of the tool until the contact sensor produces the contact signal and, simultaneously, correcting the position of the edge of the tool in the first correcting direction so as to reduce the displacement between the work line and the edge of the tool on the image.
EP86304120A 1985-06-01 1986-05-30 Method of detecting and controlling the work start point of a robot Expired - Lifetime EP0208406B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP117711/85 1985-06-01
JP60117711A JPS61279481A (ja) 1985-06-01 1985-06-01 ロボツトの作業開始点検出制御方法

Publications (3)

Publication Number Publication Date
EP0208406A2 EP0208406A2 (fr) 1987-01-14
EP0208406A3 EP0208406A3 (en) 1989-02-01
EP0208406B1 true EP0208406B1 (fr) 1993-04-07

Family

ID=14718413

Family Applications (1)

Application Number Title Priority Date Filing Date
EP86304120A Expired - Lifetime EP0208406B1 (fr) Method of detecting and controlling the work start point of a robot

Country Status (4)

Country Link
US (1) US4761596A (fr)
EP (1) EP0208406B1 (fr)
JP (1) JPS61279481A (fr)
DE (1) DE3688221T2 (fr)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61279491A (ja) * 1985-05-31 1986-12-10 株式会社安川電機 視覚機器付産業用ロボット
EP0275322B1 (fr) * 1986-07-15 1993-07-07 Kabushiki Kaisha Yaskawa Denki Seisakusho Procede de detection de donnees positionnelles lors du soudage a l'arc
JP2713899B2 (ja) * 1987-03-30 1998-02-16 株式会社日立製作所 ロボツト装置
US4864777A (en) * 1987-04-13 1989-09-12 General Electric Company Method of automated grinding
US4907169A (en) * 1987-09-30 1990-03-06 International Technical Associates Adaptive tracking vision and guidance system
DE3741632A1 (de) * 1987-12-05 1989-06-22 Noell Gmbh Verfahren und vorrichtung zum erkennen und ansteuern eines raumzieles
US4969108A (en) * 1988-04-08 1990-11-06 Cincinnati Milacron Inc. Vision seam tracking method and apparatus for a manipulator
US5276777A (en) * 1988-04-27 1994-01-04 Fanuc Ltd. Locus correcting method for industrial robots
US4974210A (en) * 1989-05-01 1990-11-27 General Electric Company Multiple arm robot with force control and inter-arm position accommodation
US5083073A (en) * 1990-09-20 1992-01-21 Mazada Motor Manufacturing U.S.A. Corp. Method and apparatus for calibrating a vision guided robot
US6535794B1 (en) 1993-02-23 2003-03-18 Faro Technologoies Inc. Method of generating an error map for calibration of a robot or multi-axis machining center
JPH06328385A (ja) * 1993-05-20 1994-11-29 Fanuc Ltd 産業用ロボットの視覚センサの姿勢制御方法
US5495410A (en) * 1994-08-12 1996-02-27 Minnesota Mining And Manufacturing Company Lead-through robot programming system
KR100237302B1 (ko) * 1997-05-13 2000-01-15 윤종용 로봇의 초기 용접위치 검출방법
US5959425A (en) * 1998-10-15 1999-09-28 Fanuc Robotics North America, Inc. Vision guided automatic robotic path teaching method
KR100621100B1 (ko) * 2000-02-11 2006-09-07 삼성전자주식회사 용접로봇 교시위치 보정방법 및 용접로봇시스템
KR20020044499A (ko) * 2000-12-06 2002-06-15 윤종용 로봇 제어시스템 및 그 제어방법
US9110456B2 (en) * 2004-09-08 2015-08-18 Abb Research Ltd. Robotic machining with a flexible manipulator
US20060067830A1 (en) * 2004-09-29 2006-03-30 Wen Guo Method to restore an airfoil leading edge
US7549204B1 (en) * 2005-11-30 2009-06-23 Western Digital Technologies, Inc. Methods for picking and placing workpieces into small form factor hard disk drives
JP2008296310A (ja) * 2007-05-30 2008-12-11 Fanuc Ltd 加工ロボットの制御装置
JP5314962B2 (ja) * 2008-08-06 2013-10-16 住友重機械プロセス機器株式会社 コークス炉押出機
US8144193B2 (en) * 2009-02-09 2012-03-27 Recognition Robotics, Inc. Work piece tracking system and method
JP2011062763A (ja) * 2009-09-16 2011-03-31 Daihen Corp ロボット制御装置
US8600552B2 (en) * 2009-10-30 2013-12-03 Honda Motor Co., Ltd. Information processing method, apparatus, and computer readable medium
US8842191B2 (en) 2010-06-03 2014-09-23 Recognition Robotics, Inc. System and method for visual recognition
US9527153B2 (en) 2013-03-14 2016-12-27 Lincoln Global, Inc. Camera and wire feed solution for orbital welder system
US9238274B2 (en) 2013-06-21 2016-01-19 Lincoln Global, Inc. System and method for hot wire TIG positioned heat control
US9770775B2 (en) 2013-11-11 2017-09-26 Lincoln Global, Inc. Orbital welding torch systems and methods with lead/lag angle stop
US9517524B2 (en) 2013-11-12 2016-12-13 Lincoln Global, Inc. Welding wire spool support
US9731385B2 (en) 2013-11-12 2017-08-15 Lincoln Global, Inc. Orbital welder with wire height adjustment assembly
FR3043004B1 (fr) * 2015-10-29 2017-12-22 Airbus Group Sas Procede d'orientation d'un effecteur portant un outil d'assemblage par rapport a une surface
JP7323993B2 (ja) * 2017-10-19 2023-08-09 キヤノン株式会社 制御装置、ロボットシステム、制御装置の動作方法及びプログラム
JP2020001100A (ja) * 2018-06-25 2020-01-09 キョーラク株式会社 バリ除去装置のティーチングの方法およびバリ除去装置
JP2021003794A (ja) 2019-06-27 2021-01-14 ファナック株式会社 ツールの作業位置のずれ量を取得する装置、及び方法
CN110315539B (zh) * 2019-07-12 2021-02-02 广汽乘用车(杭州)有限公司 精定位台车工件层数识别方法
CN112558600A (zh) * 2020-11-09 2021-03-26 福建汉特云智能科技有限公司 一种用于机器人移动校正的方法及机器人

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1449044A (en) * 1972-11-14 1976-09-08 Kongsberg Vapenfab As Procedures and apparatuses for determining the shapes of surfaces
US3888362A (en) * 1973-05-31 1975-06-10 Nasa Cooperative multiaxis sensor for teleoperation of article manipulating apparatus
US4305130A (en) * 1979-05-29 1981-12-08 University Of Rhode Island Apparatus and method to enable a robot with vision to acquire, orient and transport workpieces
US4402053A (en) * 1980-09-25 1983-08-30 Board Of Regents For Education For The State Of Rhode Island Estimating workpiece pose using the feature points method
JPS5789583A (en) * 1980-11-21 1982-06-03 Tokico Ltd Industrial robot
JPS5822689A (ja) * 1981-07-24 1983-02-10 三菱重工業株式会社 ロボット用センサ−装置
US4412121A (en) * 1981-08-28 1983-10-25 S R I International Implement positioning apparatus and process
JPS5877775A (ja) * 1981-10-07 1983-05-11 Yaskawa Electric Mfg Co Ltd 溶接ロボツトの制御方式
WO1984001731A1 (fr) * 1982-11-01 1984-05-10 Nat Res Dev Soudage automatique
US4567348A (en) * 1983-01-25 1986-01-28 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Automated weld torch guidance control system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JOURNAL OF PHYSICS E. SCIENTIFIC INSTRUMENTS, volume 16, no. 11, November 1983, Dorking, Great Britain, pages 1081-1085; J.L. Falkowski et al.: "Vision sensing for arc welding robots - a new approach", chapters 2,3; figures 1-4 *
PATENT ABSTRACTS OF JAPAN, volume 9, no. 34 (M-357) (1757) 14th February 1985 *
WERKSTATTSTECHNIK ZEITSCHRIFT FÜR INDUSTRIELLE FERTIGUNG, volume 74, no. 12, December 1984, Würzburg, pages 721-724; R. Strauch: "Sensorgeführte Industrierobotersysteme" [Sensor-guided industrial robot systems], chapter 2 "Nahtsuchsensor für Lichtbogenschweissroboter" [Seam-finding sensor for arc-welding robots] *

Also Published As

Publication number Publication date
EP0208406A2 (fr) 1987-01-14
JPH0431836B2 (fr) 1992-05-27
DE3688221D1 (de) 1993-05-13
JPS61279481A (ja) 1986-12-10
DE3688221T2 (de) 1993-07-22
EP0208406A3 (en) 1989-02-01
US4761596A (en) 1988-08-02

Similar Documents

Publication Publication Date Title
EP0208406B1 (fr) Method of detecting and controlling the work start point of a robot
KR100311663B1 (ko) 여유축을이용하여물체의외형을추적하는장치및방법
US4575304A (en) Robot system for recognizing three dimensional shapes
US4969108A (en) Vision seam tracking method and apparatus for a manipulator
US4907169A (en) Adaptive tracking vision and guidance system
JP2610276B2 (ja) 産業用ロボット装置
US5014183A (en) Method and means for path offsets memorization and recall in a manipulator
GB2254171A (en) Welding robot
SE449313B (sv) Manipulator welding apparatus and method of operating such an apparatus
EP0601206B1 (fr) Procédé pour commander l'opération d'un bras de robot
JP2728399B2 (ja) Robot control method
JPH1158273A (ja) Mobile robot apparatus
JP3424130B2 (ja) Laser beam machine
JP3543329B2 (ja) Robot teaching device
JPS6218316B2 (fr)
JP2770570B2 (ja) Welding robot
JPS6111815A (ja) Positional deviation correcting system for robot
JP3175623B2 (ja) Robot control device
JPH0813433B2 (ja) Automatic machining device
JPS5946758B2 (ja) Operation control method for automatic working machine
JP2802117B2 (ja) Machining device provided with teaching function
JPH07122823B2 (ja) Teaching and control method for robot/automated machine with visual sensor at hand
JPS58100972A (ja) Method and device for controlling welding robot
JPS6054275A (ja) Drive control method of welding torch
JPS63102881A (ja) Teaching system of automatic working device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB

RIN1 Information on inventor provided before grant (corrected)

Inventor name: WAKISAKO, HITOSHI

Inventor name: NIO, SATORU

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): DE FR GB

17P Request for examination filed

Effective date: 19890721

17Q First examination report despatched

Effective date: 19890918

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 19930510

Year of fee payment: 8

REF Corresponds to:

Ref document number: 3688221

Country of ref document: DE

Date of ref document: 19930513

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 19930518

Year of fee payment: 8

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 19930602

Year of fee payment: 8

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Effective date: 19940530

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 19940530

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Effective date: 19950131

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Effective date: 19950201

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST