EP3959046A1 - Method and system for operating a robot - Google Patents

Method and system for operating a robot

Info

Publication number
EP3959046A1
EP3959046A1
Authority
EP
European Patent Office
Prior art keywords
robot
minimum distance
obstacle
maximum speed
minimum
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20715023.6A
Other languages
German (de)
English (en)
Inventor
Markus WUENSCH
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
KUKA Deutschland GmbH
Original Assignee
KUKA Deutschland GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by KUKA Deutschland GmbH filed Critical KUKA Deutschland GmbH
Publication of EP3959046A1 publication Critical patent/EP3959046A1/fr
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676Avoiding collision or forbidden zones
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/022Optical sensing devices using lasers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1615Programme controls characterised by special kind of manipulator, e.g. planar, scara, gantry, cantilever, space, closed chain, passive/active joints and tendon driven manipulators
    • B25J9/162Mobile manipulator, movable base with manipulator arm mounted on it
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1651Programme controls characterised by the control loop acceleration, rate control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666Avoiding collision or forbidden zones
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40203Detect position of operator, create non material barrier to protect operator
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40455Proximity of obstacles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40544Detect proximity of object

Definitions

  • the present invention relates to a method and a system for operating at least one robot and a computer program product for carrying out the method.
  • the object of the present invention is to improve the operation of robots.
  • Claims 9 and 10 provide a system and a computer program product, respectively, for carrying out the method.
  • a method for operating one or more robots has the following steps:
  • the operation of the robot or robots can be improved, in particular through a dynamic reduction in the maximum speed.
  • in one embodiment, the minimum distance is determined as the distance to that one of several, in particular unforeseen, obstacles whose (minimum) distance from the (respective) robot is smallest (the obstacle closest to the robot).
  • for this purpose, the minimum distances to several obstacles are determined and the smallest of these (per-obstacle minimum) distances is selected as the minimum distance between the robot and the obstacle closest to it.
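The selection described above — compute each obstacle's minimum distance to the robot and keep the smallest — can be sketched in Python; the function and data names are illustrative, not from the patent:

```python
import math

def min_distance_to_closest_obstacle(robot_pos, obstacles, known_obstacles=()):
    """Smallest Euclidean distance from the robot to any obstacle,
    excluding previously known (e.g. temporary) obstacles such as a
    registered transport vehicle."""
    candidates = [o for o in obstacles if o not in known_obstacles]
    return min(math.dist(robot_pos, o) for o in candidates)
```

Excluding registered, previously known obstacles simply shrinks the candidate set before the minimum is taken, as the embodiment describes.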
  • in one embodiment, one or more previously known, in particular temporary, obstacles are excluded or masked when determining the minimum distance.
  • for example, another mobile robot, an autonomous transport vehicle or the like can register itself and then be excluded as a previously known, temporary obstacle when determining the minimum distance.
  • in one embodiment, the method has the step: reducing a maximum speed of the robot if the minimum distance falls below a first limit value, and reducing it more strongly if the minimum distance falls below a second, smaller limit value.
  • in this way, the operation of the robot or robots can be further improved, in particular by a differentiated, staged reduction in the maximum speed; an (even) higher process speed can be achieved and/or human-robot cooperation can be (further) improved.
  • in one embodiment, the maximum speed is reduced to zero, or the (respective) robot is stopped, when the minimum distance falls below the second, third or, in particular, fourth limit value.
  • in one embodiment, when the minimum distance falls below the second, in particular third or fourth, limit value, the maximum speed is reduced to a human-robot cooperation speed specified for human-robot cooperation, in one embodiment (only) if the robot is set up to stop on contact with an obstacle.
  • the first, second, third and / or fourth minimum distance is greater than zero.
  • in one embodiment, the maximum speed is reduced (successively) as the obstacle and the robot approach each other, i.e. before contact between the obstacle and the robot.
  • collisions between robots and people can be avoided or their consequences can be reduced and / or a higher process speed can be achieved and / or human-robot cooperation can be (further) improved.
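The staged reduction with successively smaller limit values can be sketched as follows; the threshold values and speed factors are illustrative assumptions, not values from the patent:

```python
def max_speed(d, v_free=2.0, thresholds=(2.0, 1.5, 1.0, 0.5),
              factors=(0.75, 0.5, 0.25, 0.0)):
    """Reduce the permissible maximum speed in stages: each undershot
    limit value (first..fourth, in decreasing order) reduces the speed
    further; below the fourth limit the robot is stopped (factor 0)."""
    v = v_free
    for limit, factor in zip(thresholds, factors):
        if d < limit:
            v = v_free * factor  # tighter limit undershot -> stronger reduction
    return v
```

Because the limits are checked from largest to smallest, the last (tightest) undershot limit determines the resulting speed, which matches the "reduced, reduced more, reduced even more" scheme described above.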
  • in one embodiment, the robot or one or more of the robots (each) has a stationary (environmentally fixed) or mobile, in particular mobile, base and/or at least one robot arm, in particular arranged thereon, with at least three, in particular at least six, in one embodiment at least seven, joints and joint actuators or drives.
  • the present invention can be used with particular advantage in such robots.
  • in one embodiment, the minimum distance is a Cartesian distance, in one embodiment the length of a spatial or three-dimensional connecting line or of a two-dimensional projection thereof, in particular onto a horizontal plane.
  • in one embodiment, the maximum speed is a permissible or (maximum) permitted speed or a speed limit of the (respective) robot, in particular of a robot-fixed reference, in one embodiment of an end effector of the robot arm, of a mobile base or the like.
  • in one embodiment, a pose of the obstacle is determined, in one embodiment relative to an environmentally fixed reference, in particular (in) an environmentally fixed reference system.
  • in one embodiment, a pose of the obstacle comprises a one-, two- or three-dimensional position and/or a one-, two- or three-dimensional orientation of the obstacle.
  • in one embodiment, a pose of the robot is determined, in one embodiment relative to an environmentally fixed reference, in particular (in) an environmentally fixed reference system, in one embodiment relative to the same reference or the same reference system as the pose of the obstacle.
  • in one embodiment, a pose of the robot comprises a one-, two- or three-dimensional position and/or a one-, two- or three-dimensional orientation of the robot.
  • the minimum distance is determined (in each case) on the basis of this pose of the obstacle and / or on the basis of this pose of the robot.
  • in this way, the distance can in one embodiment be determined particularly advantageously, in particular more simply and/or more reliably.
  • in one embodiment, the determined pose of the robot comprises the position and/or orientation of several robot-fixed references, in particular limbs or link points of the robot; in one embodiment, the smallest of the minimum distances between these robot-fixed references and the obstacle is determined as the minimum distance of the robot to the obstacle.
  • the minimum distances between the obstacle and the various robot-fixed references are first determined and, among these, the smallest is selected as the minimum distance between the robot and the obstacle. This is based on the idea that the robot-fixed reference closest to the obstacle has the greatest collision probability, which can thus be reduced.
  • the minimum distances between the obstacle and reference are determined for several obstacles and several robot-fixed references, and from this the smallest distance is selected or determined as the minimum distance between the robot and an obstacle closest to the robot.
  • in one embodiment, the pose, in particular the position(s) and/or orientation(s) of one or more robot-fixed references, in particular limbs, of the robot is determined on the basis of a detected joint position of the robot, in one embodiment by means of a forward transformation based on a kinematic model of the robot.
  • in a further development, the pose of the robot is determined additionally on the basis of an, in particular detected (actual) or predetermined (planned), end effector and/or an, in particular detected (actual) or predetermined (planned), robot-guided payload, in particular a predetermined dimension of the end effector or the payload.
  • in this way, the distance can in each case, in particular in combination, be determined particularly advantageously, in particular more simply and/or more reliably.
  • in one embodiment, the pose of the obstacle and/or of the robot is determined with the help of one or more sensors, in one embodiment one or more environmentally fixed sensors and/or one or more robot-fixed sensors, in particular with the help of image processing, laser light, ultrasound, radar radiation, a light grid, a projection and/or capacitively.
  • environmentally fixed sensors can improve, in particular, the determination in a reference system, in particular a common reference system; image processing, in particular, the detection of obstacles; laser light, in particular, the precision; a light grid and a projection, in particular, the safety.
  • ultrasound and radar radiation can reduce disruption to robot operation.
  • the minimum distance is determined with the aid of, in particular in or through, at least one robot-external data processing device, in particular in a central data processing device for two or more robots, in one embodiment in a (security) cloud.
  • the invention is advantageously used for several robots at the same time and / or can be easily adapted to or used for individual robots.
  • in one embodiment, the minimum distance can also be determined in a robot controller, whereby its modules are advantageously used, for example for the forward transformation or the like.
  • in one embodiment, the minimum distance is determined as a minimum distance from a spatial area of a group which, in one embodiment, comprises several environmentally fixed, predetermined, discrete, in particular two- or three-dimensional, spatial areas.
  • in one embodiment, the minimum distance is determined as a minimum distance between a first spatial area of a group comprising several environmentally fixed, predetermined, discrete, in particular two- or three-dimensional, spatial areas and a second spatial area of this or another such group.
  • here, two-dimensional areas, in particular floor areas, are also referred to generally as (two-dimensional) spatial areas.
  • in one embodiment, a spatial area extends over the entire height of a detection or monitoring area.
  • in one embodiment, an, in particular two- or three-dimensional, monitoring area for obstacles and/or an, in particular two- or three-dimensional, monitoring area for the robot is (in each case) discretized into several predetermined spatial areas; as the minimum distance, the minimum distance to the spatial area (closest to the robot) which is (still) violated by the respective obstacle, i.e. in which the respective obstacle is at least partially located, or to the spatial area (closest to the obstacle) which is (still) violated by the robot, i.e. in which the robot is at least partially located, is then determined in each case.
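The discretized variant — the smallest distance between any spatial area violated by the robot and any spatial area violated by an obstacle — can be sketched as follows; the (row, column) cell indexing and unit cell size are assumptions for illustration:

```python
import math

def min_cell_distance(robot_cells, obstacle_cells, cell_size=1.0):
    """Smallest center-to-center distance between any robot-violated and
    any obstacle-violated spatial area of a discretized monitoring area.
    Cells are (row, column) index pairs on a regular grid."""
    return min(
        cell_size * math.dist(r, o)
        for r in robot_cells
        for o in obstacle_cells
    )
```

With the cells from the exemplary embodiment of FIG. 1 (robot areas F4,2, F5,3, F8,3; obstacle area F6,1), the minimum is attained for the pairs (4,2)-(6,1) and (5,3)-(6,1), both at a distance of sqrt(5) cell widths.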
  • in this way, the distance can in each case, in particular in combination, be determined particularly advantageously, in particular more simply and/or more reliably.
  • in one embodiment, the maximum speed is reduced continuously, at least in sections, between at least two minimum distances, in particular between the first and second minimum distance, between the second and third minimum distance and/or between the third and fourth minimum distance.
  • by means of a reduction that is continuous at least in sections, simpler and/or more reliable monitoring, in particular more differentiated or more sensitive monitoring, can advantageously be implemented and/or carried out.
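A continuous reduction between two minimum distances can be realized, for example, by linear interpolation; the distance and speed values below are illustrative assumptions:

```python
def interpolated_max_speed(d, d_outer=2.0, d_inner=0.5,
                           v_outer=2.0, v_inner=0.0):
    """Continuously (linearly) interpolate the maximum speed between two
    minimum distances: full speed at or beyond d_outer, v_inner at or
    below d_inner, continuous in between."""
    if d >= d_outer:
        return v_outer
    if d <= d_inner:
        return v_inner
    t = (d - d_inner) / (d_outer - d_inner)  # 0 at d_inner, 1 at d_outer
    return v_inner + t * (v_outer - v_inner)
```

Compared with the purely staged reduction, this sectionwise-continuous profile avoids speed jumps at the limit values, which is the "more differentiated, more sensitive" behavior mentioned above.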
  • in one embodiment, the maximum speed is reduced on the basis of a relative movement between the obstacle and the robot.
  • for example, the maximum speed is reduced if (it is detected that) the minimum distance is equal to the second minimum distance and the obstacle and robot are moving away from each other, and is reduced more if (it is detected that) the minimum distance is equal to the second minimum distance and the obstacle and robot are approaching each other.
  • the maximum speed is reduced in one embodiment on the basis of a planned movement of the robot.
  • for example, the maximum speed is reduced if (it is detected that) the minimum distance is equal to the second minimum distance and, based on the planned robot path, it is forecast that this minimum distance will (again) increase, and is reduced more if (it is detected that) the minimum distance is equal to the second minimum distance and, based on the planned robot path, it is forecast that this minimum distance will decrease (further).
  • the maximum speed is reduced in one embodiment depending on the robot's reach.
  • for example, the maximum speed is reduced if (it is detected that) the minimum distance is equal to the second minimum distance and the robot has a first reach, and is reduced more if (it is detected that) the minimum distance is equal to the second minimum distance and the robot has a larger second reach.
  • in one embodiment, the maximum speed is reduced depending on a robot-guided payload.
  • the maximum speed is reduced if (it is detected that) the minimum distance is equal to the second minimum distance and the robot is carrying a first payload, and on the other hand it is reduced more if (it is detected that) the minimum distance is equal to the second minimum distance and the robot carries a larger second payload.
  • the maximum speed is reduced in one embodiment depending on the current speed of the robot.
  • for example, the maximum speed is reduced by a first amount if (it is detected that) the minimum distance is equal to the second minimum distance and the robot has a first current speed, and is reduced by a larger second amount if (it is detected that) the minimum distance is equal to the second minimum distance and the robot has a larger second current speed.
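The dependence of the reduction amount on the robot's current speed and on the robot-guided payload can be expressed as a scaled base reduction; the scaling law and reference values here are illustrative assumptions, not from the patent:

```python
def reduction_amount(base_reduction, current_speed, payload_mass,
                     speed_ref=1.0, payload_ref=10.0):
    """Scale a base reduction of the maximum speed: a faster robot or a
    heavier robot-guided payload yields a larger reduction amount."""
    speed_factor = max(current_speed / speed_ref, 1.0)
    payload_factor = max(payload_mass / payload_ref, 1.0)
    return base_reduction * speed_factor * payload_factor
```

Clamping both factors to at least 1.0 ensures the reduction is never weakened below its base value, only strengthened for faster motion or heavier loads.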
  • in one embodiment, the first, second, third and/or fourth minimum distance and/or the, in particular associated, reduction of the maximum speed, in particular its amount, is parameterized by a user, in particular directly or indirectly, for example by a table, function or the like.
  • in one embodiment, the reduction of the maximum speed is parameterized in such a way that, for a minimum distance equal to the second minimum distance, the maximum speed is reduced if the signal transmission, the robot and/or the sensor (each) have a first configuration, and is reduced more for the same minimum distance equal to the second minimum distance if the signal transmission, the robot or the sensor has a different second configuration, in particular if the signal transmission has longer communication times, the sensor has longer response times or a coarser detection, or the like.
  • in one embodiment, a system, in particular in terms of hardware and/or software, in particular programming, is set up to carry out a method described here and/or has:
  • means for determining a minimum distance of the robot from an, in particular the robot-closest, obstacle, in particular with the exclusion of at least one previously known, in particular temporary, obstacle;
  • in one embodiment, the system or its means has: at least one robot-external data processing device for determining the minimum distance.
  • a means within the meaning of the present invention can be designed in terms of hardware and/or software, and can in particular comprise a processing unit, in particular a microprocessor unit (CPU) or graphics card (GPU) or the like, preferably connected by data or signals to a memory and/or bus system, in particular a digital processing unit, and/or one or more programs or program modules.
  • the processing unit can be designed to process commands that are implemented as a program stored in a storage system, to acquire input signals from a data bus and/or to output signals to a data bus.
  • a storage system can have one or more, in particular different, storage media, in particular optical, magnetic, solid-state and/or other non-volatile media.
  • a computer program product can comprise, in particular, a non-volatile storage medium for storing a program, or with a program stored thereon, wherein execution of this program causes a system or a controller, in particular a computer, to carry out a method described here or one or more of its steps.
  • one or more, in particular all, steps of the method are carried out completely or partially automatically, in particular by the system or its means.
  • the system has the robot or robots.
  • FIG. 1 shows two robots and a system for operating the robots according to an embodiment of the present invention
  • FIG. 1 shows an example of a stationary robot 2 and a mobile robot 30 with a robot arm 32 with an end effector in the form of a gripper 33 with a payload 34 and a system for operating these robots according to an embodiment of the present invention.
  • the system has a sensor 1 fixed to the environment, for example a camera with image processing, and sensors 31 fixed to the robot 30, for example laser scanners, arranged on the mobile robot 30.
  • a common monitoring space of the sensors 1, 31 is divided into predetermined, environmentally fixed spatial areas, as indicated by dashed lines in FIG. 1.
  • a data processing device 5 external to the robot receives joint positions of the robots 2, 30 and signals from the sensors 1, 31.
  • from the joint positions of the robot 2, the data processing device 5 determines in a step S10 (cf. FIG. 2) a current pose of this robot relative to an environmentally fixed reference system, which is indicated in FIG. 1 by (x, y, z). Analogously, the data processing device 5 determines a current pose of the robot 30 from the joint position of the robot 30, in particular an odometrically detected pose of its mobile base.
  • the data processing device 5 can also determine the current pose of the robot 2 and/or 30 on the basis of the sensor 1 in step S10, which can be particularly advantageous in the case of the mobile robot 30.
  • the data processing device 5 determines from these detected poses in each case those spatial areas of the monitoring space which are violated by the respective robot, i.e. in which the respective robot is at least partially located; these spatial areas are indicated by hatching in FIG. 1 (differently for the two robots 2, 30). On the basis of the sensor signals from the sensors 1, 31, the data processing device 5 also determines, in a step S20, a pose of unforeseen obstacles and those spatial areas of the monitoring space which are violated by the respective obstacle, i.e. in which the respective obstacle is at least partially located.
  • a person 4 is shown as an example in FIG. 1, and the monitoring space area F6,1 determined as violated by him is indicated by cross-hatching.
  • in a step S30, the data processing device 5 determines for each of the robots 2, 30 the minimum distances between all monitoring space areas violated by an unforeseen obstacle (in the exemplary embodiment, the single area F6,1) and the monitoring space areas violated by the respective robot, of which in FIG. 1 the areas F4,2, F5,3 and F8,3 are indicated by indices (the indices identify the row and column of the corresponding area).
  • in step S30, for each of the robots 2, 30, the smallest of these minimum distances is then determined, i.e. the smallest minimum distance between an area violated by the robot and an area violated by an obstacle. In the exemplary embodiment, the minimum distance between the obstacle 4 and the robot 2, or between the monitoring space areas violated by them that are closest to one another, is the distance 6,1a5,3 (between F6,1 and F5,3), since none of the other areas currently violated by the robot 2 has an even smaller distance to the area violated by the obstacle 4; the minimum distance between the obstacle 4 and the robot 30, or between the areas violated by them that are closest to one another, is the distance 6,1a4,2 (between F6,1 and F4,2).
  • the data processing device 5 then determines for each of the robots 2, 30 an amount by which the maximum speed permissible for this robot when moving freely is reduced if the minimum distance to the obstacle 4 falls below a first minimum distance A1; it is reduced more if the minimum distance (also) falls below a second minimum distance A2, which is smaller than the first minimum distance, reduced even more if the minimum distance (also) falls below a third minimum distance A3, which is smaller than the second minimum distance, and reduced still more if the minimum distance (even) falls below a fourth minimum distance A4, which is smaller than the third minimum distance.
  • in a modification, instead of the minimum distance between violated monitoring space areas, the minimum distance between the monitoring space area violated by an obstacle (closest to the robot) and one or more robot-fixed references, in particular the robot's elbow, end flange and/or tool, can also be determined directly.
  • the maximum speed can also be continuously reduced, at least in sections, with a decreasing distance.
  • FIG. 3 shows such a reduction in the maximum speed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The method according to the invention for operating at least one robot (2, 30) comprises the following steps: determining (S10-S30) a minimum distance of the robot from an obstacle (4), in particular in the close vicinity of the robot, in particular with the exclusion of at least one previously known, in particular temporary, obstacle; reducing (S40, S50) a maximum speed of the robot if this minimum distance falls below a first minimum distance; and reducing (S40, S50) this maximum speed of the robot more strongly if the minimum distance falls below a second minimum distance that is smaller than the first minimum distance.
EP20715023.6A 2019-04-26 2020-03-26 Procédé et système pour faire fonctionner un robot Pending EP3959046A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019206012.9A DE102019206012A1 (de) 2019-04-26 2019-04-26 Verfahren und System zum Betreiben eines Roboters
PCT/EP2020/058450 WO2020216569A1 (fr) 2019-04-26 2020-03-26 Procédé et système pour faire fonctionner un robot

Publications (1)

Publication Number Publication Date
EP3959046A1 true EP3959046A1 (fr) 2022-03-02

Family

ID=70050107

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20715023.6A Pending EP3959046A1 (fr) 2019-04-26 2020-03-26 Procédé et système pour faire fonctionner un robot

Country Status (6)

Country Link
US (1) US20220219323A1 (fr)
EP (1) EP3959046A1 (fr)
KR (1) KR20220002408A (fr)
CN (1) CN113966265B (fr)
DE (1) DE102019206012A1 (fr)
WO (1) WO2020216569A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024504020A (ja) * 2020-12-25 2024-01-30 優必康(青島)科技有限公司 衝突回避方法、移動機器、及び記憶媒体
US11797013B2 (en) * 2020-12-25 2023-10-24 Ubtech North America Research And Development Center Corp Collision avoidance method and mobile machine using the same
DE102021208576B3 (de) 2021-08-06 2022-10-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung eingetragener Verein Vorgeben einer zulässigen Maximalgeschwindigkeit eines robotischen Gerätes
DE102021130535B3 (de) 2021-11-22 2023-05-17 Helmut Gutzmann System und Verfahren zur Positionierung einer bewegbaren Manipulatoreinheit

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10361132B4 (de) * 2003-06-18 2013-02-28 Elan Schaltelemente Gmbh & Co. Kg Verfahren zur Überwachung der Bewegung eines sich in mehreren Freiheitsgraden bewegenden Gefahr bringenden Objektes eines Handhabungsgerätes, wie Handhabungsmasse und/oder bewegliche Masse
DE102004048563A1 (de) * 2004-10-04 2006-04-13 Benteler Automobiltechnik Gmbh Überwachungseinrichtung für Roboter
WO2007085330A1 (fr) * 2006-01-30 2007-08-02 Abb Ab Procédé et système permettant la supervision d'une zone de travail comportant un robot industriel
JP4512200B2 (ja) * 2007-03-15 2010-07-28 株式会社日立製作所 ロボット
EP1972415B1 (fr) * 2007-03-23 2019-01-02 Honda Research Institute Europe GmbH Robots avec fonction d'évitement de collision
DE102008013400B4 (de) * 2008-03-06 2016-03-10 Voith Engineering Services Gmbh Verfahren zur Ermittlung von Verriegelungsbereichen wenigstens eines im Raum bewegbaren ersten Objekts
EP2353799B8 (fr) * 2010-02-05 2018-08-08 KUKA Deutschland GmbH Procédé et dispositif de surveillance d'une chambre de manipulateur
EP2952301B1 (fr) * 2014-06-05 2019-12-25 Softbank Robotics Europe Robot humanoïde avec des capacités d'évitement de collisions et de récupération d'une trajectoire
DE102015225587A1 (de) * 2015-12-17 2017-06-22 Volkswagen Aktiengesellschaft Interaktionssystem und Verfahren zur Interaktion zwischen einer Person und mindestens einer Robotereinheit
DE102016007520A1 (de) * 2016-06-20 2017-12-21 Kuka Roboter Gmbh Überwachung einer Roboteranordnung
DE102016007601A1 (de) * 2016-06-21 2017-12-21 Deutsches Zentrum für Luft- und Raumfahrt e.V. Konfigurieren und/oder Steuern einer Roboteranordnung
CN107225570A (zh) * 2017-04-20 2017-10-03 深圳前海勇艺达机器人有限公司 智能机器人的避障方法及装置
EP3421189B1 (fr) * 2017-06-28 2019-05-22 Sick AG Procédé de surveillance d'une machine
JP7052308B2 (ja) * 2017-11-15 2022-04-12 セイコーエプソン株式会社 センサー、およびロボット

Also Published As

Publication number Publication date
DE102019206012A1 (de) 2020-10-29
US20220219323A1 (en) 2022-07-14
KR20220002408A (ko) 2022-01-06
CN113966265B (zh) 2024-08-20
WO2020216569A1 (fr) 2020-10-29
CN113966265A (zh) 2022-01-21

Similar Documents

Publication Publication Date Title
EP3959046A1 (fr) Procédé et système pour faire fonctionner un robot
EP2838698B2 (fr) Agencement d'un robot et procédé de commande d'un robot
EP2977149B1 (fr) Procede et moyen de conception et/ou de fonctionnement d'un robot
DE102016216441B3 (de) Verfahren und Vorrichtung zur Mensch-Roboter Kooperation
EP3725472A1 (fr) Procédé de détermination d'une trajectoire d'un robot
DE102019205651B3 (de) Verfahren und System zum Ausführen von Roboterapplikationen
EP3471926B1 (fr) Surveillance d'un ensemble robotique
DE102018117829B4 (de) Steuereinheit für Gelenkroboter
DE102010048369A1 (de) Verfahren und Vorrichtung zur Sicherheitsüberwachung eines Manipulators
EP3974125B1 (fr) Procédé et dispositif destinés à la commande d'un robot
EP3415286A1 (fr) Surveillance d'un robot
DE102020130520A1 (de) Verfahren zum steuern eines roboters in gegenwart menschlicher bediener
EP3408062A1 (fr) Commande d'une association de robots
DE102019211770B3 (de) Verfahren zur rechnergestützten Erfassung und Auswertung eines Arbeitsablaufs, bei dem ein menschlicher Werker und ein robotisches System wechselwirken
EP3959048A1 (fr) Procédé et système de manipulation pour permettre à un objet d'être manipulé par un robot
DE102020209866B3 (de) Verfahren und System zum Betreiben eines Roboters
DE102010007027A1 (de) Verfahren und Vorrichtung zur Überwachung eines Manipulatorraumes
DE102022130178A1 (de) Integritäts- und sicherheitsprüfung für roboter
EP2353800B1 (fr) Procédé et dispositif de surveillance d'une chambre de manipulateur
DE102019219930B3 (de) Verfahren und System zum Steuern eines Roboters
EP3810377B1 (fr) Procédé et système de transfert d'un effecteur terminal d'un robot entre une position d'effecteur terminal et une autre position d'effecteur terminal
EP3955077A1 (fr) Dispositif de commande d'une installation
EP3740358B1 (fr) Surveillance d'une zone de travail d'un ensemble robotique
DE102022112439B3 (de) Sicherer Roboter
DE112018007703C5 (de) Robotersteuerung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20211112

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230528

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240417