EP1675709A2 - Method for imparting movement to a handling device, and image-processing device - Google Patents

Method for imparting movement to a handling device, and image-processing device

Info

Publication number
EP1675709A2
Authority
EP
European Patent Office
Prior art keywords
movement
handling device
image processing
control
sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP04790671A
Other languages
German (de)
English (en)
Inventor
Georg Lambert
Enis Ersü
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Isra Vision Systems AG
Original Assignee
Isra Vision Systems AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Isra Vision Systems AG filed Critical Isra Vision Systems AG
Publication of EP1675709A2

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36412Fine, autonomous movement of end effector by using camera
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37555Camera detects orientation, position workpiece, points of workpiece
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39387Reflex control, follow movement, track face, work, hand, visual servoing
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39391Visual servoing, track end effector with camera image feedback
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40546Motion of object
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40555Orientation and distance
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40604Two camera, global vision camera, end effector neighbourhood vision camera

Definitions

  • the invention relates to a method for setting up a movement of a handling device having, in particular, a plurality of movable axes and a control unit, wherein position, time and speed can be specified for each axis. Freedom of movement about at least three axes is advantageously possible in order to enable free positioning in space. If only movement in one plane is required, adjustment options about two axes are sufficient. Depending on the task of the handling device, however, more axes can also be provided, each adjustable by corresponding actuators.
  • the present invention further relates to a corresponding image-processing device.
  • the handling device can, for example, be a robot, with a robot generally being understood to be a device which can automatically carry out movement and / or work processes.
  • the robot has a control system, which issues actuating commands to actuators of the robot so that they execute the movements predefined for them.
  • the object of the present invention is therefore to propose a simple way of setting up the movement of a handling device, with which the movement sequence of the handling device can be flexibly adapted, for example, to the movement of an object to be processed, or changed automatically, i.e. without external intervention.
  • This object is achieved by a method for setting up the movement of a handling device, for example a robot, with at least one actuator that can be moved about one or more axes by means of a control, in which a) the control of the handling device or the image processing is given an optically recognizable object and a movement sequence related to the object; b) the movement and/or working area of the handling device is recorded with at least one camera; c) the recorded image is evaluated with image processing in such a way that the predetermined object is recognized and its position and/or state of motion is determined, in particular relative to the handling device; d) the control or the image processing calculates a control command for one or more actuators of the handling device from the position and/or state of motion of the recognized object and the movement sequence related to the object; e) the control issues an actuating command, in accordance with the control command, to each actuator to be moved; and f) steps b) to e) are repeated.
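The loop of steps b) to e), with the repetition of step f), can be sketched as a minimal visual-servoing cycle. Everything below is an illustrative assumption, not from the patent: the object detector is stubbed out, and a simple proportional controller stands in for step d).

```python
import math

def recognize_object(image):
    # Stand-in for step c): real image processing would search the image
    # for the predefined optical features; here the "image" is a dict.
    return image.get("object_position")

def compute_control_command(object_pos, robot_pos, gain=0.5):
    # Stand-in for step d): a simple proportional controller that moves
    # the handling device a fraction of the way toward the object.
    return tuple(gain * (o - r) for o, r in zip(object_pos, robot_pos))

def servo_loop(capture_image, robot_pos, steps=50, tolerance=1e-3):
    """Steps b)-f): capture, recognize, compute, actuate, repeat."""
    for _ in range(steps):
        image = capture_image(robot_pos)              # b) record the work area
        object_pos = recognize_object(image)          # c) recognize the object
        if object_pos is None:
            break                                     # object lost: stop the sequence
        command = compute_control_command(object_pos, robot_pos)       # d)
        robot_pos = tuple(r + c for r, c in zip(robot_pos, command))   # e) actuate
        if math.dist(object_pos, robot_pos) < tolerance:
            break                                     # close enough; f) otherwise repeat
    return robot_pos

# Toy run: a stationary object at (1.0, 2.0); the "camera" reports it directly.
target = (1.0, 2.0)
final = servo_loop(lambda pos: {"object_position": target}, (0.0, 0.0))
```

With the assumed gain of 0.5 the remaining deviation halves on every cycle, so the device converges on the object without the path ever having been programmed.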
  • For an optically recognizable object, it is thus possible to predefine a specific movement sequence, in particular relative to the object, which is then automatically processed by the control of the handling device, in particular a computer.
  • the optically recognizable object is defined by a constellation of optically recognizable features that can be identified by image processing, for example geometric arrangements, specific contrasts and/or other features suitable for recognition. This makes it possible to close the control loop between the, in particular moving, object and the handling device visually, i.e. by means of appropriate image processing, and to follow the moving object with the handling device without the motion sequence having to be known beforehand and programmed into the control of the handling device.
  • a certain movement sequence can thus be specified in an abstract manner.
  • the specification of the movement sequence can consist, for example, of tracking, by the movement of the handling device, a specific object which is moved in a defined or unpredictable (chaotic) manner. It is also possible, for example, to recognize an edge or joint by specifying a specific contrast value and to guide a robot along this joint or edge.
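A much-simplified illustration of recognizing an edge from a specified contrast value (a hypothetical one-dimensional stand-in, not the patent's actual image processing): the edge is taken to be the first pixel position at which the intensity jump between neighbouring pixels reaches the given contrast.

```python
def find_edge(intensity_row, contrast_threshold):
    # Scan one image row and return the first index at which the
    # neighbouring-pixel contrast reaches the specified value.
    for i in range(len(intensity_row) - 1):
        if abs(intensity_row[i + 1] - intensity_row[i]) >= contrast_threshold:
            return i
    return None  # no sufficiently strong edge in this row

# A row that is dark up to pixel 3 and bright afterwards:
row = [20, 22, 21, 20, 200, 201, 199]
edge = find_edge(row, contrast_threshold=100)
```

Repeating this per image row would yield a sequence of edge points for the robot to follow.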
  • the control or image processing calculates a control command for one or more actuators of the handling device, so that the abstract movement command can actually be implemented as a movement of the handling device by corresponding actuating commands to each actuator.
  • the actuating command leads to a movement of the handling device, by means of which either the relative position between the object and the handling device is changed, or the handling device follows the moving object in a constant relative position. A new relative position, for example due to a movement of the object, is detected again in accordance with the method steps described above and converted into a new control command.
  • This type of setting up a movement of a handling device is particularly simple for the user, because he does not have to deal with the control program of the handling device or the specification of specific positions to be approached. He can use the handling device simply by specifying an object recognizable by image processing and a movement that is abstractly defined in relation to this object. This enables the robot, for example, to automatically track a slot of any length and shape without the position of this slot having to be entered or known. It also leads to a high degree of flexibility in the movement sequence, since the robot can also independently follow new forms of an object, for example an unforeseen deviation in the course of the groove or the like, or an unforeseeable movement of the object itself.
  • a simple implementation of the method according to the invention provides image processing which, in addition to recognizing the object, also calculates the relative positions and / or movement between the object and the handling device and forwards corresponding information as control commands to the control of the handling device.
  • a conventional control for handling devices or robots can then be used, which does not need to be adapted for the use of the method according to the invention. In this case, the control loop is closed visually by the image processing itself.
  • This is particularly advantageous when the object itself moves, the location and speed of the object being determined when its movement state is determined.
  • it makes sense to determine the location and speed of the object relative to the handling device, so that this relative movement can be taken into account particularly easily in the movement sequence to be carried out, which is given abstractly in relation to the object, for example, in its rest coordinate system.
  • the object movement is thus determined and overlaid with a known movement of the handling device, or a movement that is determined on the basis of the image processing.
  • This also makes it possible for the handling device to carry out work on the moving object, the movement of the object and / or the movement of the handling device not having to be predetermined beforehand.
  • it is also possible for the movement sequence of the handling device to be predetermined, for example relative to the object, in a control program.
  • the method can also be used for simple programming of a movement of the handling device or robot, in particular if the handling device is to carry out the same movements again and again.
  • the movement sequence is stored, in particular with corresponding time information, as a sequence of control commands determined during the execution of the movement. The movement of the handling device can then take place particularly easily on the basis of this stored sequence of control commands, in the desired order and at the predetermined time.
  • the storage of the control commands, in particular in their chronological order, corresponds to the creation of a program for a handling device for controlling its movement, but is much easier to handle than the specification of specific positions or the import of CAD data on the basis of which the movement is then calculated.
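The storage of control commands with their time information, and their later replay as a simple program, could look like the following sketch (class and method names are illustrative assumptions, not from the patent):

```python
class CommandRecorder:
    """Stores control commands together with the time at which each was
    issued, so that a once-executed movement can be replayed later."""

    def __init__(self):
        self.sequence = []  # list of (timestamp, command) pairs

    def record(self, timestamp, command):
        self.sequence.append((timestamp, command))

    def replay(self, issue):
        # Re-issue the stored commands in their chronological order;
        # `issue` is the controller callback forwarding each command
        # (with its recorded time) to the actuators.
        for timestamp, command in self.sequence:
            issue(timestamp, command)

recorder = CommandRecorder()
recorder.record(0.0, ("move", 0.1, 0.0))
recorder.record(0.5, ("move", 0.1, 0.1))

replayed = []
recorder.replay(lambda t, cmd: replayed.append((t, cmd)))
```

Replaying the stored sequence reproduces the demonstrated movement without any positions having been specified or CAD data imported.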
  • control command or a sequence of control commands can also depend on the type, position and / or the state of motion of the detected object. This feature can be used, for example, to determine the end of a movement sequence when a certain constellation of optical features reveals a certain object. In addition, this makes it possible, for example in the case of a quality control, to have various handling sequences of a handling device carried out automatically, depending on a known error, in order to make error-dependent corrections.
  • the movement of the handling device is checked on the basis of the recorded images.
  • this can be used to easily check whether the conditions for carrying out the stored sequence of control commands still exist, for example whether the moving object has been followed correctly. If this is not the case, the movement sequence can be stopped immediately, for example in order to avoid damage to the object.
  • the tasks to be performed by the handling device during the movement can be any activity that can be performed by a tool controlled by the handling device. This can include welding, sealing a joint, tracking moving objects or other tasks.
  • the tasks can be carried out both during the processing of a stored sequence of control commands as part of a program-controlled handling device movement and during the movement of a handling device based on the currently recognized image data.
  • the image can be recorded by a stationary camera or a camera which is moved along with the handling device.
  • the stationary camera unit has the entire working and movement area of the handling device in view and can therefore also capture unforeseen events particularly well, e.g. chaotic movements of the object to be tracked.
  • the camera unit carried along with the movement of the handling device can be focused on a special work area and offers a higher optical resolution than the stationary camera unit.
  • the combination of a stationary and a moving camera is therefore also particularly advantageous, wherein the image processing, for example, allows two or more images to be evaluated simultaneously, in particular also in real time, in order to calculate a control command.
  • two, three or more stationary and / or moving cameras can also be provided.
  • the method can also be used if objects move or fall out of the field of view of the moving camera in an unexpected way. These objects can then be captured with the stationary camera, and the handling device can be guided so that it continues to track the object.
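The cooperation of the stationary overview camera and the moving detail camera amounts to coarse-to-fine localization. A hedged sketch, with both cameras modelled as plain callables (nothing here comes from the patent itself):

```python
def locate_object(overview_camera, detail_camera):
    # The stationary camera finds the object anywhere in the work area;
    # the camera carried on the handling device then refines the estimate
    # at its higher optical resolution.
    coarse = overview_camera()
    if coarse is None:
        return None              # object not in the work area at all
    fine = detail_camera(coarse)
    return fine if fine is not None else coarse  # fall back to the overview

# Toy cameras: the overview reports a rough position, the detail camera
# corrects it by a fixed offset.
position = locate_object(lambda: (1.0, 2.0),
                         lambda guess: (guess[0] + 0.5, guess[1] - 0.5))
```

If the object leaves the moving camera's field of view, `detail_camera` returns `None` and the overview estimate still guides the handling device back to it.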
  • the method according to the invention for setting up the movement of handling devices considerably simplifies the handling of manipulation devices and their adaptation to specific tasks and activities, because the usually complex programming of a handling device program with one or more predetermined movement sequences is eliminated. This increases the flexibility when using handling devices, such as robots.
  • the present invention relates to image processing which is particularly suitable for carrying out the method for setting up a movement of a handling device.
  • In an image recorded by means of at least one camera, the predefined object is recognized by the image processing, the position of the object is determined spatially and temporally and/or its speed is determined, a relation of the position and/or speed of the object to the position and/or speed of a handling device is established, and this relation is passed on to the control of the handling device, for example in the form of a control command, in order to track the handling device to the object or to carry out certain tasks or manipulations on the object. This is done in real time if possible and enables the handling device to be controlled on the basis of the visual findings of the image processing.
  • the required relationship between the object and the handling device can be formed from the difference between the positions and / or speeds of the object and handling device, in particular in the form of a deviation vector and / or a relative speed vector, which is then fed to the control.
  • the difference can be fed directly to the controller, which generates the corresponding control commands from the difference.
  • the image processing can convert the differences found into control commands which are fed to the control, which then only generates the specific actuating commands for the actuators of the handling device.
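Forming the relation as a deviation vector and a relative-velocity vector is plain componentwise subtraction. The sketch below uses illustrative names; only the idea of differencing positions and speeds comes from the text above.

```python
def relation_to_object(obj_pos, obj_vel, dev_pos, dev_vel):
    # Difference between object and handling-device positions (deviation
    # vector) and between their velocities (relative-velocity vector),
    # which is then fed to the control.
    deviation = tuple(o - d for o, d in zip(obj_pos, dev_pos))
    rel_velocity = tuple(ov - dv for ov, dv in zip(obj_vel, dev_vel))
    return deviation, rel_velocity

deviation, rel_velocity = relation_to_object(
    obj_pos=(3.0, 1.0), obj_vel=(0.2, 0.0),   # moving object
    dev_pos=(1.0, 1.0), dev_vel=(0.0, 0.0),   # handling device at rest
)
```

A conventional controller can consume these two vectors directly and derive the actuating commands itself.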
  • the one or more cameras can be positioned over the object and tracked when the object moves, the camera movement being recorded and this recording converted into movement information for the handling device.
  • a movement program for a handling device can be generated in a particularly simple manner by following an object captured by the image processing through its various positions as it is moved.
  • the movement information preferably contains temporal, spatial and / or speed information.
  • Fig. 1 shows schematically the implementation of the inventive method for setting up the movement of a handling device with a stationary object.
  • Fig. 2 shows schematically the implementation of the inventive method for setting up the movement of a handling device for a moving object.
  • FIG. 1 shows as a handling device a robot 1 with a plurality of actuators 2 which can be moved about different axes and on which a camera 3 is arranged as a moving sensor.
  • any tools can also be attached to the robot 1, but these are not shown in FIG. 1.
  • the image field 4 of the moving camera 3 is aligned with the object 5.
  • In a controller 6, which in the example shown is arranged directly on the robot 1 but can also easily be formed separately from it in a computing unit, and/or in an image processing stored in the same or a separate computing unit, recognition features for the object 5 and a movement sequence 7 related to the object 5 are predefined.
  • the robot 1 should follow the edge 8 of the object 5 in order to check the edge 8 for defects, for example, or to carry out work on the edge 8 by means of a tool (not shown).
  • the image processing of the camera 3 is given features for recognizing the edge, for example a typical contrast curve in the area of the edge 8.
  • the camera 3 records the movement or working area of the robot 1 and evaluates the recorded image with the image processing.
  • the object 5 is identified, its position is determined relative to the robot 1 and the edge 8 is also recognized, which the handling device is to follow on the basis of the abstractly specified movement sequence 7.
  • the controller 6 or the image processing can calculate a control command for the actuators 2 of the robot 1 and issue it accordingly as an actuating command to each actuator 2, so that the handling device 1 follows the edge 8 of the object 5 without the movement sequence having to be permanently programmed in the controller 6 by specifying coordinates.
  • the camera 3 takes an image of the object 5 again after each movement of the robot 1 and repeats the above-described method steps. This makes it possible to specify an abstract movement sequence 7, which is related to the object 5 or certain visual features of the object 5 and which the robot 1 automatically follows.
  • a stationary camera 9 can also be provided, which has a larger image field 4 than the moving camera 3 and serves to capture the object 5 in an overview.
  • the camera 9 is preferably also connected to the image processing of the camera 3 and / or the controller 6 of the robot 1.
  • Providing a stationary camera 9 is particularly useful if the object 5 itself is moved, as shown in FIG. 2.
  • the direction of movement 10 of the object 5 is indicated in FIG. 2 by arrows.
  • the stationary camera 9 serves for a first orientation of the object 5 relative to the robot 1. Because of the larger image field 4 of the stationary camera 9 in comparison to the camera 3 attached to the robot 1, it is easier to find and identify the object 5 and to detect unforeseen movement of the object 5, for example slipping on a conveyor belt, quickly and reliably. The precise identification of certain features of the object 5 can then take place with the camera 3, which is moved along as well.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The present invention relates to a method for imparting movement to a handling device comprising at least one actuator that can be moved about one or more axes by means of a control device. According to the method: a) an optically recognizable object and a movement sequence related to the object are supplied to the control device of the handling device or to an image-processing device; b) an image of the movement and/or working area of the handling device is recorded by means of a camera; c) the recorded image is evaluated by means of an image-processing device so that the predetermined object is recognized and its position and/or state of motion is determined, in particular relative to the handling device; d) the control device or the image-processing device calculates, from the position and/or state of motion of the recognized object and the movement sequence related to the object, a control command for one or more actuators of the handling device; e) the control device issues, in accordance with the control command, an actuating command for each actuator to be set in motion; and f) steps b) to e) are repeated. The invention also relates to a corresponding image-processing device.
EP04790671A 2003-10-20 2004-10-20 Procede pour imprimer un mouvement a un appareil de manutention et dispositif de traitement d'image Withdrawn EP1675709A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE10349221 2003-10-20
PCT/EP2004/011863 WO2005039836A2 (fr) 2003-10-20 2004-10-20 Procede pour imprimer un mouvement a un appareil de manutention et dispositif de traitement d'image

Publications (1)

Publication Number Publication Date
EP1675709A2 true EP1675709A2 (fr) 2006-07-05

Family

ID=34484924

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04790671A Withdrawn EP1675709A2 (fr) 2003-10-20 2004-10-20 Procede pour imprimer un mouvement a un appareil de manutention et dispositif de traitement d'image

Country Status (3)

Country Link
US (1) US20070216332A1 (fr)
EP (1) EP1675709A2 (fr)
WO (1) WO2005039836A2 (fr)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10345743A1 (de) * 2003-10-01 2005-05-04 Kuka Roboter Gmbh Verfahren und Vorrichtung zum Bestimmen von Position und Orientierung einer Bildempfangseinrichtung
DE102007008903A1 (de) * 2007-02-23 2008-08-28 Abb Technology Ag Einrichtung zum Steuern eines Roboters
EP2255930A1 (fr) * 2009-05-27 2010-12-01 Leica Geosystems AG Procédé et système destinés au positionnement très précis d'au moins un objet dans une position finale dans l' espace
DE102009058817A1 (de) 2009-12-18 2010-08-05 Daimler Ag Anlage und Verfahren zum maßhaltigen Rollfalzen eines Bauteils
JP5609760B2 (ja) * 2011-04-27 2014-10-22 トヨタ自動車株式会社 ロボット、ロボットの動作方法、及びプログラム
EP2602588A1 (fr) 2011-12-06 2013-06-12 Hexagon Technology Center GmbH Détermination de position et d'orientation dans 6-DOF
JP5922932B2 (ja) * 2012-01-18 2016-05-24 本田技研工業株式会社 ロボットティーチング方法
DE102015204867A1 (de) * 2015-03-18 2016-09-22 Kuka Roboter Gmbh Robotersystem und Verfahren zum Betrieb eines teleoperativen Prozesses
DE102015209896B3 (de) * 2015-05-29 2016-08-18 Kuka Roboter Gmbh Ermittlung der Roboterachswinkel und Auswahl eines Roboters mit Hilfe einer Kamera
WO2016203858A1 (fr) * 2015-06-18 2016-12-22 オリンパス株式会社 Système médical
US9926138B1 (en) * 2015-09-29 2018-03-27 Amazon Technologies, Inc. Determination of removal strategies
CN112534240A (zh) 2018-07-24 2021-03-19 玻璃技术公司 用于测量波形玻璃片的表面的系统及方法
JP7467041B2 (ja) * 2018-09-27 2024-04-15 キヤノン株式会社 情報処理装置、情報処理方法及びシステム
JP6898374B2 (ja) 2019-03-25 2021-07-07 ファナック株式会社 ロボット装置の動作を調整する動作調整装置およびロボット装置の動作を調整する動作調整方法
GB2589419A (en) * 2019-08-09 2021-06-02 Quantum Leap Tech Limited Fabric maintenance sensor system
US11867630B1 (en) 2022-08-09 2024-01-09 Glasstech, Inc. Fixture and method for optical alignment in a system for measuring a surface in contoured glass sheets

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4568816A (en) * 1983-04-19 1986-02-04 Unimation, Inc. Method and apparatus for manipulator welding apparatus with improved weld path definition
JP2786225B2 (ja) * 1989-02-01 1998-08-13 株式会社日立製作所 工業用ロボットの制御方法及び装置
WO2000045229A1 (fr) * 1999-01-29 2000-08-03 Georgia Tech Research Corporation Servo-regulateur dynamique non etalonne pour systeme mecanique
WO2003064116A2 (fr) * 2002-01-31 2003-08-07 Braintech Canada, Inc. Procede et appareil pour robotique guidee par vision 3d au moyen d'une camera unique
ATE531488T1 (de) * 2002-03-04 2011-11-15 Vmt Vision Machine Technic Bildverarbeitungssysteme Gmbh Verfahren zur bestimmung der lage eines objektes und eines werkstücks im raum zur automatischen montage des werkstücks am objekt

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2005039836A2 *

Also Published As

Publication number Publication date
US20070216332A1 (en) 2007-09-20
WO2005039836A2 (fr) 2005-05-06
WO2005039836A3 (fr) 2005-11-24

Similar Documents

Publication Publication Date Title
DE102018116053B4 (de) Robotersystem und Roboterlernverfahren
DE19930087B4 (de) Verfahren und Vorrichtung zur Regelung der Vorhalteposition eines Manipulators eines Handhabungsgeräts
DE102018001026B4 (de) Robotersystem mit einer lernenden Steuerungsfunktion und lernendes Steuerungsverfahren
WO2005039836A2 (fr) Procede pour imprimer un mouvement a un appareil de manutention et dispositif de traitement d'image
DE102009034529B4 (de) Automatisches Führungs- und Erkennungssystem sowie Verfahren für dieses
DE102014108956A1 (de) Vorrichtung zum Entgraten mit visuellem Sensor und Kraftsensor
DE112017002639T5 (de) Robotersteuerungsvorrichtung
DE102012104194A1 (de) Roboter und Punktschweißverfahren mit selbstlernender Steuerfunktion
EP1537009A2 (fr) Procede et dispositif pour monter plusieurs elements rapportes sur une piece
DE69837741T2 (de) Verfahren und system zur steuerung eines roboters
DE102014117346B4 (de) Roboter, Robotersteuerungsverfahren und Robotersteuerungsprogramm zur Werkstückkorrektur
DE3317263A1 (de) Manipulator mit adaptiver geschwindigkeitsgesteuerter bahnbewegung
EP3587044B1 (fr) Procédé de préhension d'objets dans une zone de recherche et système de positionnement
DE102018212531B4 (de) Artikeltransfervorrichtung
DE102010023736A1 (de) Robotersystem mit Problemerkennungsfunktion
DE102008062622A1 (de) Verfahren und Vorrichtung zur Befehlseingabe in eine Steuerung eines Manipulators
EP3725472A1 (fr) Procédé de détermination d'une trajectoire d'un robot
EP3974125B1 (fr) Procédé et dispositif destinés à la commande d'un robot
WO2015055320A1 (fr) Reconnaissance de gestes d'un corps humain
DE102007029398A1 (de) Verfahren und Vorrichtung zum Programmieren eines Industrieroboters
DE102016012227A1 (de) Verfahren zur automatischen Lagekorrektur eines Roboterarms
DE102017005194B3 (de) Steuern einer Roboteranordnung
EP3771952A1 (fr) Procédé de déplacement automatique d'un appareil de travail ainsi qu'appareil de travail
DE102015209773B3 (de) Verfahren zur kontinuierlichen Synchronisation einer Pose eines Manipulators und einer Eingabevorrichtung
DE69207018T2 (de) Verfahren zur Führung eines Roboterarmes durch definieren von Ersatzstrecken

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060330

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20070612

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20071228