WO2020216570A1 - Method and system for monitoring a robot arrangement
- Publication number
- WO2020216570A1 (application PCT/EP2020/058451)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- arrangement
- signal sources
- optical signals
- sensor
- Prior art date: 2019-04-26
Classifications
- B25J9/1666—Programme-controlled manipulators: motion, path, trajectory planning; avoiding collision or forbidden zones
- B25J9/1697—Programme-controlled manipulators: vision controlled systems
- B25J13/087—Controls for manipulators by means of sensing devices, for sensing other physical parameters, e.g. electrical or chemical properties
- B25J19/021—Accessories fitted to manipulators: optical sensing devices
- B25J19/06—Accessories fitted to manipulators: safety devices
- B25J9/163—Programme controls characterised by the control loop: learning, adaptive, model based, rule based expert control
- B25J9/1653—Programme controls characterised by the control loop: parameters identification, estimation, stiffness, accuracy, error analysis
- B25J9/1676—Programme controls characterised by safety, monitoring, diagnostic: avoiding collision or forbidden zones
- F16P3/142—Safety devices sensitive without mechanical contact, using image capturing devices
- F16P3/144—Safety devices sensitive without mechanical contact, using light grids
- G05B19/4061—Numerical control [NC] characterised by monitoring or safety: avoiding collision or forbidden zones
- G05B2219/37631—Measurements: means detecting object in forbidden zone
- G05B2219/40202—Robotics: human robot coexistence
Definitions
- The present invention relates to a method and a system for monitoring a robot arrangement which has at least one robot, as well as a computer program product for carrying out the method.
- The object of the present invention is to improve the monitoring of a robot arrangement.
- Claims 9 and 10 provide a system and a computer program product, respectively, for carrying out a method described here.
- According to one embodiment, the method comprises: triggering a monitoring reaction if a discrepancy between an actual arrangement of the detected optical signals and a (predicted) target arrangement of these signals exceeds a one- or multi-dimensional limit value, in particular if at least a predetermined minimum number of signals of the target arrangement is not present, or is determined not to be present, in the actual arrangement of the detected optical signals; in one embodiment this minimum number equals one, in another embodiment it is greater than one and/or less than ten.
- In other words, the target arrangement of the signals, which is predicted or assumed for a state without unexpected obstacles, is compared with a detected actual arrangement. If one, or at least the specified minimum number, of the optical signals provided or predicted in the target arrangement is not present, this may be because an unexpected obstacle interrupts the optical path (on which the target arrangement is based) between the signal source(s) and sensor(s).
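The comparison described above can be pictured with a minimal sketch (Python; the set representation, names and values are illustrative assumptions, not taken from the patent): the target arrangement is the set of signals expected on undisturbed optical paths, and the monitoring reaction is triggered once at least the predetermined minimum number of them is absent from the actual arrangement.

```python
def check_arrangement(target_ids, detected_ids, min_missing=1):
    """Compare the predicted (target) arrangement with the detected (actual) one.

    target_ids:   identifiers of signals expected on undisturbed optical paths
    detected_ids: identifiers actually recovered from the sensor data
    min_missing:  predetermined minimum number of absent signals that
                  triggers the monitoring reaction (one in one embodiment)
    """
    missing = set(target_ids) - set(detected_ids)
    return len(missing) >= min_missing, missing


trigger, missing = check_arrangement({"base", "arm", "effector"}, {"base", "arm"})
if trigger:
    print(f"monitoring reaction: signals {missing} absent from actual arrangement")
```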
- The robot arrangement has one or more robots, in particular one or more stationary and/or mobile robots, which in one embodiment (each) have a robot arm with at least three, in particular at least six, in one embodiment at least seven, joints, in particular swivel joints; the robot arrangement can in particular consist of such robots. Because of their working spaces and processes, the present invention can be used with particular advantage for monitoring such robot arrangements.
- In one embodiment, one or more of the signal sources are active (emitting) signal sources which (actively) emit the optical signals as light, in one embodiment as laser light and/or as visible (laser) light, infrared (laser) light and/or UV (laser) light.
- In this way, the monitoring can advantageously be carried out (more) reliably, in particular even in poor(er) lighting conditions.
- In one embodiment, one or more of the signal sources are passive (or merely reflecting) signal sources, which are (each) illuminated by one or more light sources, in one embodiment in a targeted or directed manner and/or (only) at predetermined times, with light, in one embodiment laser light and/or visible (laser) light, infrared (laser) light and/or UV (laser) light, and reflect this light as an optical signal.
- In this way, these signal sources can be made smaller in one embodiment, and/or an energy supply for signal sources positioned on the robot arrangement can be dispensed with.
- For a more compact presentation, both (active) emission and (passive) reflection of the optical signals by the (corresponding) signal source(s) are referred to herein generally as emitting.
- One or more of the signal sources (each) have one or more LEDs and/or laser pointers. In this way, particularly advantageous optical signals can be used in one embodiment.
- In one embodiment, one or more (optical) deflection means are or will be arranged, in each case in a planned or targeted manner, in an optical path between one or more of the signal sources and one or more of the sensors, and redirect the light between signal source and sensor or are set up to do so. In this way, the sensors can also see otherwise concealed areas, as is familiar, for example, from ceiling mirrors.
- By positioning the signal sources and/or the sensor(s) on the robot arrangement, in particular its robot(s), the robot arrangement can advantageously be monitored in different poses.
- In one embodiment, signal sources are or will be positioned on, in particular directly on, the robot arrangement, in particular its robot(s).
- The sensor(s), or one or more of them, can then be positioned at a distance from the robot arrangement or in its environment, in particular fixed in place, and thus, so to speak, observe the robot arrangement from the outside.
- Alternatively or additionally, one or more of the sensors can in turn (likewise) be positioned on the robot arrangement, in particular its robot(s), so that the robot arrangement can, so to speak, observe itself.
- For example, a sensor which is arranged on one limb, for example a base, of a robot can detect optical signals from a signal source on another limb, for example an end effector, of this robot, and obstacles between the two limbs, or between base and end effector, can thus be recognized.
- In one embodiment, one or more of the sensors are or will be positioned on the robot arrangement, in particular its robot(s). Then, as described above, one or more of the signal sources can in turn (also) be positioned on the robot arrangement, in particular its robot(s), so that the robot arrangement can, so to speak, observe itself and react to obstacles between its limbs.
- Alternatively or additionally, one or more of the signal sources can then also be positioned at a distance from the robot arrangement or in its environment, in particular fixed in place, so that, conversely, the environment can be observed from the robot arrangement.
- In one embodiment, the robot arrangement is thereby not additionally loaded dynamically. By combining environment-side signal sources with sensors positioned on the robot arrangement, the monitoring according to the invention can advantageously be carried out by means of robot-guided sensors.
- The sensor(s), or one or more of them, can (each) have one or more cameras, in particular can be such cameras.
- The (target or actual) arrangement of the optical signals can in particular be or comprise one or more, in one embodiment two- or three-dimensional, (target or actual) images of the corresponding signal sources.
- In this way, unexpected obstacles can be recognized particularly advantageously, in particular (more) reliably and/or (more) simply, on the basis of images of signal sources that are concealed by these obstacles in such images, i.e. signals of the target arrangement that are not present in the actual arrangement of the captured optical signals.
- With a minimum number of one, the monitoring can be implemented very sensitively and/or with few(er) signal sources; with a minimum number of two or more signals of the target arrangement that are not present in the actual arrangement of the detected optical signals, the monitoring is less susceptible to interference, in particular when there are several or closely positioned signal sources.
- In one embodiment, one or more captured actual images of the signal sources are compared with one or more predicted target images of these signal sources, and a monitoring reaction is triggered if at least the minimum number of signal-source images that are present or visible in the target image(s) are missing from the actual image(s).
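In image space, this check amounts to matching predicted signal positions against detected blob centroids. A minimal sketch (Python/NumPy; the pixel tolerance and all names are assumptions for illustration) flags predicted signal images with no detection nearby:

```python
import numpy as np

def missing_signals(target_px, actual_px, tol=5.0):
    """Indices of predicted signal images with no detected blob within tol pixels.

    target_px: (N, 2) predicted pixel positions of the signal sources
    actual_px: (M, 2) blob centroids detected in the captured image
    """
    target = np.asarray(target_px, dtype=float)
    actual = np.asarray(actual_px, dtype=float)
    missing = []
    for i, p in enumerate(target):
        if actual.size == 0 or np.linalg.norm(actual - p, axis=1).min() > tol:
            missing.append(i)
    return missing

# Example: the third predicted signal has no detection nearby.
print(missing_signals([(100, 80), (220, 90), (340, 75)], [(101, 79), (219, 91)]))
```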
- In one embodiment, the target arrangement, in particular the target image(s), is or will be determined, in particular predicted, on the basis of, or as a function of, a determined position of the signal sources and of the sensor(s) relative to one another.
- Additionally or alternatively, the target arrangement, in particular the target image(s), is or will be determined on the basis of, or as a function of, a determined pose of the robot arrangement, in particular its robot(s), and/or a predetermined or determined position of the signal sources and of the sensor(s) relative to the robot arrangement.
- The pose of the robot arrangement can be determined on the basis of joint positions of the robot arrangement measured, in one embodiment, by means of joint sensors.
- From this pose of the robot arrangement, the position of signal sources and/or sensors positioned on the robot arrangement can be determined, in one embodiment on the basis of a known position of these signal sources or sensors relative to, in particular on, the robot arrangement; in one embodiment this position is or will be predetermined, in particular by the signal sources or sensors being specifically positioned at a predetermined location on the robot arrangement, in particular its robot(s).
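As a purely illustrative sketch of this step (Python; a planar two-link chain is assumed here for brevity, whereas a real system would use the robot's full 3D kinematic model, and all names are hypothetical), signal-source positions follow from measured joint angles, link lengths and known mounting offsets:

```python
import numpy as np

def source_positions(joint_angles, link_lengths, mount_offsets):
    """2D positions of signal sources mounted along a planar kinematic chain.

    Each source sits at a known offset along its link; a real robot would
    use its full 3D kinematic chain instead of this planar toy model.
    """
    x = y = theta = 0.0
    positions = []
    for q, l, off in zip(joint_angles, link_lengths, mount_offsets):
        theta += q                                   # accumulated joint rotation
        positions.append((x + off * np.cos(theta),   # source position on this link
                          y + off * np.sin(theta)))
        x += l * np.cos(theta)                       # advance to the next joint
        y += l * np.sin(theta)
    return positions

print(source_positions([np.pi / 2, -np.pi / 2], [0.5, 0.4], [0.25, 0.2]))
```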
- Additionally or alternatively, in one embodiment a position of one or more of the signal sources positioned at a distance from the robot arrangement or in its environment, and/or a position of one or more of the sensors positioned on the robot arrangement, at a distance from it or in its environment, is determined relative to the robot arrangement, in one embodiment by means of triangulation, in particular with the aid of the detected optical signals, or the like.
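One common way to obtain such a relative pose from the detected optical signals, sketched here under the assumption of a calibrated pinhole camera (the patent itself only speaks of triangulation or the like, and all numbers below are made-up placeholders), is a perspective-n-point solve, e.g. with OpenCV:

```python
import cv2
import numpy as np

# 3D positions of six signal sources in the robot base frame (e.g. from the
# kinematic model); all numbers here are made-up placeholders.
object_pts = np.array([[0.0, 0.0, 0.5], [0.3, 0.0, 0.8], [0.3, 0.2, 1.1],
                       [0.0, 0.2, 1.4], [-0.2, 0.1, 0.9], [0.1, -0.1, 1.2]],
                      dtype=np.float32)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])                      # assumed camera intrinsics

# Simulate detections with a known ground-truth camera pose ...
rvec_true = np.array([[0.1], [-0.2], [0.05]])
tvec_true = np.array([[0.2], [-0.1], [2.0]])
image_pts, _ = cv2.projectPoints(object_pts, rvec_true, tvec_true, K, None)

# ... and recover the camera pose relative to the robot from them.
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
```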
- Additionally or alternatively, in one embodiment the target arrangement of the optical signals, in particular the target image(s) of the signal sources, is determined on the basis of a kinematic and/or optical model which, in one embodiment, maps a relationship between positions of the signal sources and of the sensor(s) and the arrangement of the optical signals detected by the sensor(s).
- The model can map or take into account optical paths between the signal sources and the sensor(s) and/or the environment of the robot arrangement, in particular intended or known obstacles.
- The model can be determined, in particular parameterized, theoretically and/or empirically, in particular with the aid of learning runs of the robot arrangement.
- If, for example, a camera detects the optical signals of signal sources positioned on the robot, the position of this camera relative to the robot can be determined therefrom by means of triangulation. The same applies conversely to a robot-guided camera and environment-side signal sources. The image of the signal sources that the camera would have to capture at other robot poses, assuming optical paths undisturbed by unexpected obstacles, can then be predicted and compared with the actually captured image.
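Predicting such a target image from a known camera pose can then be sketched as a projection step (again Python/OpenCV under pinhole-camera assumptions; the function and variable names are illustrative, not from the patent):

```python
import cv2
import numpy as np

def predict_target_image(source_pts_3d, rvec, tvec, K):
    """Predicted pixel positions of the signal sources for a given camera pose.

    source_pts_3d: (N, 3) source positions from the kinematic model
    rvec, tvec:    camera pose relative to the robot base (e.g. from solvePnP)
    K:             3x3 camera intrinsics
    """
    pts, _ = cv2.projectPoints(np.asarray(source_pts_3d, np.float32),
                               rvec, tvec, K, None)
    return pts.reshape(-1, 2)   # the predicted (target) image of the sources
```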
- In this way, in one embodiment, in particular by combining at least two of the aforementioned features, the target arrangement of the optical signals can advantageously be determined precisely and/or dynamically, i.e. kept up to date.
- In one development, one or more of the signal sources are attached to a cover in a non-destructively releasable manner, in particular by form fit and/or friction, or in a manner that is not non-destructively releasable, in particular by material bond; the cover in turn, in a further development, is or will be attached to the robot arrangement, in particular its robot(s), in a non-destructively releasable manner, in particular by form fit and/or friction, or in a manner that is not non-destructively releasable, in particular by material bond.
- In this way, in one embodiment, the signal sources can advantageously be positioned simply and/or precisely on the robot arrangement, and/or used alternately for different robots or robot arrangements, in one embodiment, so to speak, as a kind of “safety vest”.
- In one embodiment, the actual arrangement is determined on the basis of a difference image between a captured image with emitted optical signals and a captured image without emitted optical signals; in particular, it can comprise or be such a difference image. The captured optical signals can then be isolated by the subtraction and thus compared particularly well with the corresponding target arrangement.
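A minimal sketch of this difference-image step (Python/OpenCV; the file names and the threshold value are placeholder assumptions): subtracting a frame captured without emission from one captured with emission isolates the signals, whose blob centroids then form the actual arrangement.

```python
import cv2

# Placeholder file names; in practice these would be two consecutive camera
# frames, captured with the signal sources switched on and off respectively.
img_on = cv2.imread("frame_sources_on.png", cv2.IMREAD_GRAYSCALE)
img_off = cv2.imread("frame_sources_off.png", cv2.IMREAD_GRAYSCALE)

diff = cv2.absdiff(img_on, img_off)                  # isolate the emitted signals
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)

# Blob centroids of the isolated signals form the actual arrangement.
n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
actual_px = centroids[1:]                            # component 0 is background
```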
- In one embodiment, two or more of the signal sources emit mutually different, in particular robot- or limb-specific, optical signals.
- Two or more of the signal sources can have different geometries, brightnesses and/or colors for this purpose.
- These signal sources can have different optical codes, in particular QR codes.
- Additionally or alternatively, these different optical signals can have mutually different predefined time patterns, in particular transmission times; in one embodiment, two or more of the signal sources emit their optical signals in a predefined sequence.
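Such time multiplexing can be decoded by matching the on/off pattern observed for each detected blob across frames against the predefined transmission signatures. A small sketch (Python; the signature encoding and all names are assumptions for illustration):

```python
def identify_sources(observed_patterns, signatures):
    """Match observed on/off blink patterns to signal-source identities.

    observed_patterns: dict blob_id -> tuple of per-frame on/off observations
    signatures:        dict source_id -> predefined transmission pattern
    """
    return {blob: source
            for blob, pattern in observed_patterns.items()
            for source, sig in signatures.items()
            if pattern == sig}

sigs = {"base": (1, 0, 1, 0), "arm": (1, 1, 0, 0), "effector": (0, 1, 0, 1)}
print(identify_sources({7: (1, 1, 0, 0), 9: (0, 1, 0, 1)}, sigs))
# {7: 'arm', 9: 'effector'}
```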
- In this way, errors and/or interference from the environment can be (further) reduced in one embodiment.
- For example, the probability can be reduced that, in place of the optical signal of a signal source which emits a specific optical signal but is concealed by an unforeseen obstacle, a different optical signal is erroneously detected as the signal expected in the target arrangement.
- In one embodiment, the monitoring reaction, in particular its type and/or its triggering, depends on a number and/or a location of the signals of the target arrangement that are not present in the actual arrangement. For example, the absence of a single signal, or of signals in a predetermined first range, can trigger a (first, in particular milder) monitoring reaction, in particular a speed reduction, while the absence of several signals, in particular of at least two signals, and/or the absence of signals in a predetermined other range, which is larger than the first range and/or can be spaced from it, triggers a (second, in particular stronger or larger) monitoring reaction, in particular a speed reduction to a greater extent, in particular a stopping of the robot arrangement.
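A minimal sketch of such a graduated reaction logic (Python; the zone names, thresholds and reaction labels are illustrative assumptions, not from the patent):

```python
def monitoring_reaction(missing, zone_of, critical_zone="critical"):
    """Choose a reaction from the number and location of absent signals.

    missing:  ids of target-arrangement signals absent from the actual one
    zone_of:  maps a signal id to the monitored range it belongs to
    """
    if not missing:
        return "continue"
    if len(missing) >= 2 or any(zone_of(s) == critical_zone for s in missing):
        return "stop"            # second, stronger monitoring reaction
    return "reduce_speed"        # first, milder monitoring reaction

print(monitoring_reaction({"arm"}, lambda s: "outer"))           # reduce_speed
print(monitoring_reaction({"arm", "base"}, lambda s: "outer"))   # stop
```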
- In one embodiment, the monitoring reaction depends on a thermal radiation detection by one or more of the sensor(s). In one embodiment, this advantageously allows a distinction to be made between light sources and thermally radiating bodies, in particular persons.
- The or a (type of) monitoring reaction can include, in particular be: the output of an in particular optical, acoustic and/or haptic warning, a change of a predetermined movement of the robot arrangement, in particular into an evasive movement and/or a speed reduction, and/or a stopping of the robot arrangement.
- In one embodiment, a robot-guided component interrupts, in particular in a planned manner, an optical path between at least one, in particular robot-mounted, signal source and at least one of the sensor(s); in this way, the presence or absence of the component can also be monitored.
- For example, a robot-guided component can be illuminated in this way, or a missing component or an empty gripper can be signaled by a corresponding light spot in the environment.
- According to one embodiment of the present invention, a system, in particular in terms of hardware and/or software, in particular programming, is set up to carry out a method described here and/or has: one or more sensors that are set up or are used to detect optical signals from several signal sources, the signal sources and/or one or more of the sensor(s) being positioned on the robot arrangement.
- In one embodiment, the system or its means has: means for determining the target arrangement on the basis of a determined position of the signal sources and of the sensor(s) relative to one another, a determined pose of the robot arrangement and/or a predetermined or determined position of the signal sources and of the sensor(s) relative to the robot arrangement, and/or on the basis of a kinematic and/or optical model.
- A means within the meaning of the present invention can be designed in hardware and/or software, and can in particular have a processing unit, in particular a digital microprocessor unit (CPU) or graphics processing unit (GPU), preferably data- or signal-connected to a memory and/or bus system, and/or one or more programs or program modules.
- The processing unit can be designed to execute commands that are implemented as a program stored in a storage system, to acquire input signals from a data bus and/or to output output signals to a data bus.
- A storage system can have one or more, in particular different, storage media, in particular optical, magnetic, solid-state and/or other non-volatile media.
- The program can be designed in such a way that it embodies or is able to execute the methods described here, so that the processing unit can carry out the steps of such methods and can thus, in particular, monitor the robot arrangement.
- A computer program product can have, in particular be, a, in particular non-volatile, storage medium for storing a program or having a program stored on it, execution of this program causing a system or a controller, in particular a computer, to carry out a method described here or one or more of its steps.
- In one embodiment, the method is carried out completely or partially automatically, in particular by the system or its means.
- In one embodiment, the system has the robot arrangement and/or the signal sources.
- The target and/or actual arrangement or image(s) can be two- or three-dimensional in one embodiment, the sensor(s) in particular having, or being, 3D camera (systems).
- In one embodiment, distance information is obtained or used by means of light time-of-flight measurements.
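As a worked example of such a light time-of-flight measurement (the numbers are illustrative, not from the patent): the distance follows from d = c·Δt/2, where c ≈ 3·10⁸ m/s is the speed of light and Δt the measured round-trip time of the light; a round-trip time of Δt = 20 ns thus corresponds to a distance of about 3 m.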
- In this way, the reliability of the monitoring can be (further) improved, for example in that reflections at an unintended obstacle are not erroneously detected as an optical signal from a signal source positioned on the robot arrangement. Additionally or alternatively, this can improve the determination of the position of the sensor or camera system relative to the robot arrangement, in particular by triangulation or the like. Additionally or alternatively, in one embodiment, in particular in the case of a mobile or moving sensor, its current position can advantageously be determined in each case.
- In one embodiment, the, in particular current, position of one or more of the, in particular mobile, sensors relative to the, in particular mobile, robot arrangement is determined on the basis of a distance, in particular of such distance information; the target arrangement, in particular the target image(s), of captured optical signals is then determined on the basis of this determined position of the sensor(s) relative to the robot arrangement.
- Fig. 1 shows a system for monitoring a robot arrangement according to an embodiment of the present invention;
- Fig. 2 shows a method for monitoring the robot arrangement according to an embodiment of the present invention.
- Fig. 1 shows a system for monitoring a robot arrangement according to an embodiment of the present invention.
- In the exemplary embodiment, the robot arrangement consists of a multi-jointed or multi-axis robot 10 with a stationary or mobile base 11, a carousel 12 rotatable on it about a vertical axis, a rocker 13, an arm 14 and a multi-axis robot hand with an end effector 15.
- Light sources in the form of laser pointers and/or LEDs 20, which have limb-specific colors and/or shapes and/or can be activated in specific time patterns, are positioned on limbs of the robot 10.
- Cameras 30, 31, 32 are positioned in the environment. These detect optical signals from the light sources 20 (Fig. 2: step S10), a deflection mirror 200 being indicated by way of example in Fig. 1 in order to (better) detect optical signals with the camera 30.
- The cameras 30-32 are signal-connected to a monitoring device 100, which can be integrated into a controller of the robot 10 and, for example, thereby receives a pose of the robot 10 or corresponding joint angles.
- Using a kinematic model, the monitoring device 100 determines therefrom the current positions of the light sources 20 positioned on the robot, their position relative to or on the robot being known.
- The positions of the individual cameras 30-32 are also known to the monitoring device 100. These can, for example, have been determined in advance by means of triangulation from known positions of the light sources.
- With the help of an optical model, the monitoring device 100 now predicts target images of the light sources 20 as they should be captured by the cameras 30-32, provided that no unexpected obstacles between the robot 10 and the cameras interrupt the optical paths from the light sources to the cameras. Known or permitted obstacles can be taken into account in the optical model.
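Taking known or permitted obstacles into account in such an optical model can be sketched as a simple line-of-sight test (Python; modeling obstacles as spheres is purely an illustrative assumption): a light source whose ray to the camera passes through a known obstacle is simply omitted from the predicted target image.

```python
import numpy as np

def occluded(cam, src, obstacles):
    """Line-of-sight test from camera to source against known obstacles,
    modeled here, purely illustratively, as (center, radius) spheres."""
    seg = src - cam
    length = np.linalg.norm(seg)
    direction = seg / length
    for center, radius in obstacles:
        t = np.clip(np.dot(center - cam, direction), 0.0, length)
        if np.linalg.norm(cam + t * direction - center) < radius:
            return True                # segment passes through the obstacle
    return False

cam = np.array([0.0, 0.0, 1.5])
src = np.array([2.0, 0.0, 1.0])
known = [(np.array([1.0, 0.0, 1.25]), 0.3)]
print(occluded(cam, src, known))       # True: omit this source from the target image
```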
- The monitoring device 100 compares these target images with the actual images actually captured by the cameras 30-32.
- In doing so, images with and without active light sources can be subtracted from one another, so that the target and actual images each contain only the images of the light sources themselves, isolated in this way.
- If at least a predetermined minimum number of the light-source images present in the target images is missing from the actual images (Fig. 2, step S40: “Y”), the monitoring device 100 triggers a monitoring reaction (S50), for example reducing the speed of the robot 10 or, if necessary, stopping it. Otherwise (S40: “N”), the method returns to step S10.
- Fig. 1 shows a person H who has unexpectedly stepped into the working space of the robot 10. It can be seen that this person interrupts an optical path between the camera 32 and the light source 20 horizontally opposite it in Fig. 1, so that the image of this light source, which is present in the corresponding target image of the camera 32, is missing from the actual image captured by that camera.
- Analogously, a component gripped by the end effector 15 would, in a planned manner, interrupt the optical path of a signal source 20, which would instead (only) illuminate the component. Conversely, a light spot from this signal source 20 in the environment therefore signals a component-free end effector.
- The exemplary embodiment makes it clear that, for example, using the actual image of the camera 32, it can be determined whether and where an obstacle, as in Fig. 1, interrupts an optical path; the monitoring reaction can advantageously be matched to this.
- In a modification, the target arrangement is predicted on the basis of a kinematic and optical model that includes known obstacles in the vicinity of the robot. If, for example, person H is planned to be at the position shown in Fig. 1, the corresponding model predicts a target image of camera 32 in which the opposite light source 20 is, as planned, not present, so that in this case no monitoring reaction is triggered.
- The signal sources 20 can also comprise passive signal sources, in particular reflectors, which in one embodiment are illuminated in a targeted manner.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217037909A KR20220002407A (ko) | 2019-04-26 | 2020-03-26 | 로봇 배열체를 모니터링하기 위한 방법 및 시스템 |
EP20716721.4A EP3959047A1 (de) | 2019-04-26 | 2020-03-26 | Verfahren und system zur überwachung einer roboteranordnung |
US17/606,284 US20220314454A1 (en) | 2019-04-26 | 2020-03-26 | Method and system for monitoring a robot arrangement |
CN202080040308.2A CN113905854B (zh) | 2019-04-26 | 2020-03-26 | 用于监视机器人设备的方法和系统 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019206010.2A DE102019206010A1 (de) | 2019-04-26 | 2019-04-26 | Verfahren und System zur Überwachung einer Roboteranordnung |
DE102019206010.2 | 2019-04-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020216570A1 true WO2020216570A1 (de) | 2020-10-29 |
Family
ID=70165982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2020/058451 WO2020216570A1 (de) | 2019-04-26 | 2020-03-26 | Verfahren und system zur überwachung einer roboteranordnung |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220314454A1 (de) |
EP (1) | EP3959047A1 (de) |
KR (1) | KR20220002407A (de) |
CN (1) | CN113905854B (de) |
DE (1) | DE102019206010A1 (de) |
WO (1) | WO2020216570A1 (de) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4766322A (en) * | 1984-09-14 | 1988-08-23 | Kabushiki Kaisha Toshiba | Robot hand including optical approach sensing apparatus |
US4804860A (en) * | 1985-05-02 | 1989-02-14 | Robotic Vision Systems, Inc. | Robot cell safety system |
US20130325181A1 (en) * | 2012-05-31 | 2013-12-05 | Toyota Motor Engineering & Manufacturing North America, Inc. | Non-contact optical distance and tactile sensing system and method |
DE102016114835A1 (de) * | 2016-08-10 | 2018-02-15 | Joanneum Research Forschungsgesellschaft Mbh | Robotervorrichtung |
WO2018109355A1 (fr) * | 2016-12-12 | 2018-06-21 | Irt Jules Verne | Procede et dispositif de detection d'une intrusion dans l'environnement d'un robot |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102010050547A1 (de) * | 2010-11-05 | 2012-05-10 | Kuka Laboratories Gmbh | Verfahren und Vorrichtung zur Sicherheitsüberwachung eines Roboters |
DE202014100411U1 (de) * | 2014-01-30 | 2015-05-05 | Kuka Systems Gmbh | Sicherheitseinrichtung |
DE102015000793A1 (de) * | 2015-01-23 | 2016-07-28 | Daimler Ag | Sensorvorrichtung für unterschiedliche Robotervarianten |
DE102015001575A1 (de) * | 2015-02-07 | 2016-08-11 | Audi Ag | Verfahren und Vorrichtung zur Visualisierung der Bewegung eines Roboters |
DE102015225587A1 (de) * | 2015-12-17 | 2017-06-22 | Volkswagen Aktiengesellschaft | Interaktionssystem und Verfahren zur Interaktion zwischen einer Person und mindestens einer Robotereinheit |
DE102016007520A1 (de) * | 2016-06-20 | 2017-12-21 | Kuka Roboter Gmbh | Überwachung einer Roboteranordnung |
DE102017005194C5 (de) * | 2017-05-31 | 2022-05-19 | Kuka Deutschland Gmbh | Steuern einer Roboteranordnung |
DE202017104603U1 (de) * | 2017-08-01 | 2018-11-06 | Sick Ag | System zum Absichern einer Maschine |
IT201800002494A1 (it) * | 2018-02-08 | 2019-08-08 | Omron Europe B V | Dispositivo di monitoraggio per monitorare un settore limite di una zona di sicurezza. |
-
2019
- 2019-04-26 DE DE102019206010.2A patent/DE102019206010A1/de active Pending
-
2020
- 2020-03-26 EP EP20716721.4A patent/EP3959047A1/de active Pending
- 2020-03-26 US US17/606,284 patent/US20220314454A1/en active Pending
- 2020-03-26 CN CN202080040308.2A patent/CN113905854B/zh active Active
- 2020-03-26 WO PCT/EP2020/058451 patent/WO2020216570A1/de unknown
- 2020-03-26 KR KR1020217037909A patent/KR20220002407A/ko unknown
Also Published As
Publication number | Publication date |
---|---|
CN113905854B (zh) | 2024-06-07 |
EP3959047A1 (de) | 2022-03-02 |
DE102019206010A1 (de) | 2020-10-29 |
CN113905854A (zh) | 2022-01-07 |
KR20220002407A (ko) | 2022-01-06 |
US20220314454A1 (en) | 2022-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2023160B1 (de) | Dreidimensionale Raumüberwachung mit Konfigurationsmodus zum Bestimmen der Schutzfelder | |
EP3701340B1 (de) | Überwachungsvorrichtung, industrieanlage, verfahren zur überwachung sowie computerprogramm | |
EP2053538B1 (de) | Absicherung eines Überwachungsbereichs und visuelle Unterstützung einer automatisierten Bearbeitung | |
DE102017128543B4 (de) | Störbereich-einstellvorrichtung für einen mobilen roboter | |
EP3383595B1 (de) | Darstellung variabler schutzfelder | |
DE102010023736B4 (de) | Robotersystem mit Problemerkennungsfunktion | |
DE102017008638A1 (de) | Berührungsfreier sicherheitssensor und arbeitsweise | |
DE102017111886B3 (de) | Bestimmen der Bewegung einer abzusichernden Maschine | |
EP3650740B1 (de) | Sicherheitssystem und verfahren zum überwachen einer maschine | |
DE102010050547A1 (de) | Verfahren und Vorrichtung zur Sicherheitsüberwachung eines Roboters | |
EP3401702B1 (de) | Sensorsystem | |
EP1980871B1 (de) | Prüfverfahren zur Prüfung der Funktionsfähigkeit eines Überwachungssensors, Überwachungsverfahren und Überwachungssensor | |
DE102018101162B4 (de) | Messsystem und Verfahren zur extrinsischen Kalibrierung | |
DE102008046346B4 (de) | Verfahren und Vorrichtung zum Überwachen eines räumlichen Bereichs, insbesondere des Umfelds eines bewegbaren medizinischen Geräts | |
EP3959046A1 (de) | Verfahren und system zum betreiben eines roboters | |
DE102019211770B3 (de) | Verfahren zur rechnergestützten Erfassung und Auswertung eines Arbeitsablaufs, bei dem ein menschlicher Werker und ein robotisches System wechselwirken | |
DE10215885A1 (de) | Automatische Prozesskontrolle | |
AT517784B1 (de) | Verfahren zur automatisierten Steuerung einer Maschinenkomponente | |
DE10026711B4 (de) | Positionsüberwachungsvorrichtung und -verfahren | |
DE102016110514B4 (de) | Vorrichtung und Verfahren zum Überwachen eines Raumbereichs, insbesondere zum Absichern eines Gefahrenbereichs einer automatisiert arbeitenden Anlage | |
EP3812863B1 (de) | Bewegbare maschine | |
EP2811318B1 (de) | Optoelektronischer Sensor | |
EP3959047A1 (de) | Verfahren und system zur überwachung einer roboteranordnung | |
DE202017104603U1 (de) | System zum Absichern einer Maschine | |
DE102016221861B4 (de) | Einrichtung und Verfahren zur Einwirkung auf Gegenstände |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20716721 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20217037909 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020716721 Country of ref document: EP Effective date: 20211126 |
|
ENP | Entry into the national phase |
Ref document number: 2020716721 Country of ref document: EP Effective date: 20211126 |