WO2009155948A1 - System for safety protection of human beings against hazardous incidents with robots - Google Patents


Info

Publication number
WO2009155948A1
WO2009155948A1 · PCT/EP2008/005208 (EP2008005208W)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
sensor
signals
human
acoustical
Prior art date
Application number
PCT/EP2008/005208
Other languages
French (fr)
Inventor
Björn MATTHIAS
Sönke KOCK
Roland Krieger
Original Assignee
Abb Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Abb Ag filed Critical Abb Ag
Priority to EP08773685A priority Critical patent/EP2288839A1/en
Priority to PCT/EP2008/005208 priority patent/WO2009155948A1/en
Priority to CN2008801301189A priority patent/CN102099614A/en
Publication of WO2009155948A1 publication Critical patent/WO2009155948A1/en

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16 ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16P SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
    • F16P3/00 Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
    • F16P3/12 Such safety devices with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
    • F16P3/14 Such safety devices, the means being photocells or other devices sensitive without mechanical contact
    • F16P3/141 Such safety devices using sound propagation, e.g. sonar
    • F16P3/142 Such safety devices using image capturing devices
    • F16P3/144 Such safety devices using light grids
    • F16P3/147 Such safety devices using electro-magnetic technology, e.g. tags or radar
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/26 Speech to text systems
    • G10L25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/78 Detection of presence or absence of voice signals

Definitions

  • the invention refers to a system for safety protection of human beings against hazardous incidents with robots in operation by means of a monitoring means comprising at least one sensor which is able to receive and/or recognize signals caused by the respective human being who might be in a dangerous situation within or near the working range of the robot.
  • Robots are very well known for example from industrial production.
  • a typical robot has for example 6 degrees of freedom in movement. It typically consists of a robot base, which can turn around a first axis, and an arm which is pivotably mounted on the robot base.
  • An arm may consist of 2 segments with one degree of freedom in movement each.
  • At the tip of the arm, a robot hand, typically with 2 or 3 degrees of freedom, is mounted.
  • a robot might have a working range of 1 m to 4 m around the robot base, but a radius of 10 m and more is also conceivable. Other machines with fewer or more degrees of freedom, or without a robot base, are also to be considered robots.
  • Safety systems enable human beings to operate potentially dangerous machines in a manner preventing injury hazards.
  • At the forefront of developments are systems that allow some degree of machine operation (typically mechanical motion) in the immediate presence of the human being, utilizing the possibilities for advantageous interaction between human being and machine, while still avoiding dangerous situations by use of more sophisticated safety supervision technology and methods.
  • US7308112B2 describes systems, methods, apparatuses, and computer readable media for human-machine interaction and more particularly for human-computer interaction ("HCI") based on computer vision recognition of human signs and gestures where digital images are received and analyzed.
  • HCI human-computer interaction
  • Signs can be used to interact with machines by providing user instructions or commands.
  • Embodiments of that invention include human detection, human body parts detection, hand shape analysis, trajectory analysis, orientation determination, gesture matching, and the like.
  • EP0849697B1 relates to a hand gesture recognition system and method in which streams of images are received in real time and processed to represent the hand region in each image as a vector, and the vectors are processed to recognise hand gestures.
  • An emotion estimating part classifies the user's present emotion into one of a predetermined number of emotion classes on the basis of these detection results.
  • A long-term storage part stores an emotional transition model of the user, built up by reinforcement learning from the emotion shown after the user exhibits a certain behaviour in a certain emotion, and from the reward given by the user for that transition.
  • A behaviour selecting part probabilistically selects, according to a policy [pi], the behaviour to be exhibited in response to the user's present emotion.
  • A behaviour output part performs the selected behaviour.
  • the invention is directed to a system as well as to a method by which the human being can cause the signalization of hazardous situations in order to trigger events, such as an emergency stop, in the machine controller without needing access to a regular input device such as a button.
  • the invention is characterized in that said monitoring means may be located within or on the robot, or remote from it, while the at least one sensor is located anywhere around the robot within or near its working range; said monitoring means comprises at least one processing means linked to the sensors, and said processing means evaluates the received signals with regard to their source in order to trigger the robot control system to execute a control action that increases the level of safety.
  • a control action could be a reduction of speed, a change of movement path, a reverse movement, or an emergency stop.
  • Processing means can be for example a processing unit such as a computer.
  • the sensors used for this purpose are based on acoustical and/or optical effects, but an embodiment likewise provides for the use of further physical principles, e.g. infrared detection and/or ultrasound detection as well as radar detection or the recognition of signs and gestures.
  • the at least one sensor is arranged at ground level, at medium level, or at both ground and medium level.
  • a ground level is characterized by a height of minus 1 to 1 meter from the ground.
  • a medium level is characterized by a height of typically 1 meter from the ground up to 3 meters above the highest reach of the robot, including end-of-arm tooling, in the vertical direction.
  • the system according to the invention is characterized in that, when at least one optical sensor is used, the optical sensors are sensitive to visible signals with wavelengths in a range of approximately 380 to 780 nm. Recognition of IR light alone might also be useful, since visible disturbances from variable light sources can be eliminated by using only IR light.
  • the acoustical sensors are provided for receiving acoustical signals with frequencies in a range of 150 Hz to 5000 Hz and, simultaneously, a sound intensity level of at least 40 dB; alternatively, said sensors are provided for receiving a combination of acoustical and optical signals as defined by the preceding features.
  • Receiving very low frequencies, such as 50 Hz and lower, is also useful, since, for example, a person falling down might cause noise in this frequency range.
  • the invention refers to a method of protecting human beings against hazardous incidents with robots in operation by means of any sensors which are able to receive and recognize signals caused by the respective human being who might be in a dangerous situation within or near the working range of the robot.
  • said sensors equally receive and measure signals given by a respective human being
  • every action and movement of any human being within the working range of the robot is analyzed by the at least one processing means
  • While the invention utilizes known technology for detecting the state of the human being in the environment of the machine, it combines the monitoring and evaluation of the detected information in a new way, so as to offer new, added safety functionality for the operation of machines.
  • An advantageous improvement of the claimed method is characterized in that the detection of any alerts is activated for the whole time of operation of the robot.
  • data on the actual status of the robot position, which are provided by an associated robot controller or robot controlling means, are used by the at least one processing means for the detection.
  • Having detailed information from the robot controller on the actual position of the robot within its trajectory, a dangerous situation for a human being who is close to the robot, or to a part of it, is easier to detect. Measurement faults concerning the position of the robot are excluded.
  • the actual status of the robot position is measured by at least one sensor means and used by the at least one processing means for the detection.
  • the signals of at least two sensors, which are based on different measurement principles, are analyzed by the at least one processing means.
  • This diversity enables the analysis of a broader basis of signals from a human being, for example acoustical and optical signals.
  • optical or acoustical signals might be more suitable for detecting, for example, a dangerous situation.
  • the analysis considers and uses a time coincidence of an irregular behaviour and/or status and/or a dangerous situation which has been detected based on the analyzed signals of each of the at least two sensors. This is described in more detail for Fig. 4.
  • the maximum time difference between two coincident signals depends on the kind of signals and may vary, for example, from 1 ms to 200 ms.
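As a minimal sketch of such a coincidence test, assuming events are reported as timestamps in seconds and taking the upper bound of the example window as default (function names are illustrative, not from the patent):

```python
# Coincidence test for abnormal-status events from two different
# sensors: two events count as coincident if their timestamps differ
# by no more than a signal-dependent window (e.g. 1 ms .. 200 ms).

def is_coincident(t_first, t_second, max_diff_s=0.200):
    """True if two event timestamps (in seconds) fall within the
    allowed coincidence window."""
    return abs(t_first - t_second) <= max_diff_s

def find_coincidences(acoustic_events, optical_events, max_diff_s=0.200):
    """Pair events from two timestamp lists that lie within the
    coincidence window of each other."""
    return [(ta, to)
            for ta in acoustic_events
            for to in optical_events
            if is_coincident(ta, to, max_diff_s)]
```

In practice the window would be chosen per sensor pair, since acoustic and optical processing introduce different latencies.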
  • the embodiment of the invention comprises a monitoring means.
  • the monitoring section may comprise an acoustical monitoring means and/or an optical monitoring section and/or sensing means for radar waves and/or force and/or infrared waves.
  • the optical monitoring section comprises a sensor interface, an image acquisition means, an image processing section, a gesture recognition section, and a reasoning section.
  • the sensor interface receives data from the optical sensor and makes it available to the image acquisition means or image acquisition unit.
  • the image acquisition means creates an image from the received data.
  • the image processing section applies various filtering techniques to enhance the quality of the image, and discriminates different objects in the work-cell of the robot, for example a human, a robot, a work object fixture, etc.
  • the gesture recognition section identifies different gestures or irregular behaviour of a detected human and hands it over to the reasoning section.
  • the reasoning section executes a reasoning and decision algorithm that judges whether a safety-critical situation is detected, and generates at least one signal based on the result of the analysis.
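The chain of optical monitoring stages described above can be sketched structurally as follows; all stage bodies, gesture labels and signal names are hypothetical placeholders (a real system would involve camera drivers and computer-vision models):

```python
def process_image(frame):
    # Image processing stage: filtering and discrimination of the
    # different objects in the work-cell (human, robot, fixture).
    # Stubbed here as a lookup of pre-labelled objects.
    return frame.get("objects", [])

def recognize_gesture(objects):
    # Gesture recognition stage: identify gestures or irregular
    # behaviour of a detected human.
    for obj in objects:
        if obj.get("type") == "human":
            return obj.get("gesture")
    return None

def reason(gesture):
    # Reasoning stage: judge whether a safety-critical situation is
    # present and generate a signal.
    if gesture in ("falling", "panic"):
        return "emergency_stop"
    if gesture == "stop_hand_sign":
        return "stop"
    return None

def optical_pipeline(frame):
    # Sensor interface and image acquisition are assumed to have
    # already produced `frame`; the remaining stages are chained.
    return reason(recognize_gesture(process_image(frame)))
```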
  • the monitoring means can comprise a separate processing means, be integrated at least in part into the robot control system, or be a software program running on one or more of the processors of the robot control system.
  • the robot control system comprises at least one multi-core processor, and the monitoring means runs as one instance on one of the cores.
  • the acoustical sensors are provided for receiving acoustical signals with frequencies in a range of 150 Hz to 5000 Hz and, simultaneously, a sound intensity level of at least 40 dB.
  • the acoustical section of the monitoring means includes a signal processing section that filters the acoustical signal to increase the sensitivity of the monitoring system to the human voice in the typical frequency band of 300-3000 Hz.
  • the acoustical section contains a threshold function for the sound level, which discriminates the normal operational noise of the robot and other equipment in the cell.
  • the acoustical section contains at least one matched notch filter that eliminates characteristic frequencies in the production cell, for example frequencies that are caused by fans, pumps, and other machines running at constant speed.
  • the acoustical section contains a learning function that is able to adjust the threshold level, the matched filter frequencies, and other relevant filter parameters automatically in a training run, to achieve best environmental noise suppression.
  • the acoustical section has a sound recognition means that can recognize and suppress typical non-periodic and non-constant sounds, like the sounds emitted by pneumatic valves, welding processes, roller doors opening and closing, etc., so that only unusual acoustical signals are identified.
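As an illustrative sketch of a matched notch filter of the kind described above, the following pure-Python biquad (coefficients in the well-known audio-EQ "cookbook" form; the sample rate, 50 Hz hum frequency and Q value are assumed example values, not taken from the patent) suppresses a constant machine hum while leaving a voice-band tone essentially untouched:

```python
import math

def notch_coeffs(f0, fs, q=2.0):
    """Biquad notch coefficients (RBJ cookbook form) for a matched
    notch at f0 Hz, sample rate fs Hz, normalized so a[0] == 1."""
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    cos_w0 = math.cos(w0)
    a0 = 1.0 + alpha
    b = [1.0 / a0, -2.0 * cos_w0 / a0, 1.0 / a0]
    a = [1.0, -2.0 * cos_w0 / a0, (1.0 - alpha) / a0]
    return b, a

def biquad_filter(x, b, a):
    """Direct-form-I biquad:
    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]."""
    y = []
    x1 = x2 = y1 = y2 = 0.0
    for xn in x:
        yn = b[0]*xn + b[1]*x1 + b[2]*x2 - a[1]*y1 - a[2]*y2
        x2, x1 = x1, xn
        y2, y1 = y1, yn
        y.append(yn)
    return y

def rms(x):
    return math.sqrt(sum(v*v for v in x) / len(x))

# Illustrative run: a 50 Hz machine hum is strongly attenuated while
# a 1000 Hz voice-band tone passes almost unchanged.
fs = 8000
b, a = notch_coeffs(50.0, fs)
hum  = [math.sin(2*math.pi*50.0*n/fs)   for n in range(fs)]
tone = [math.sin(2*math.pi*1000.0*n/fs) for n in range(fs)]
hum_out  = biquad_filter(hum, b, a)
tone_out = biquad_filter(tone, b, a)
```

In the training run mentioned above, the learning function could estimate f0 and q from the spectrum of the ambient noise; that estimation is not shown here.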
  • the acoustical section of the monitoring means contains a voice recognition section that is able to recognize basic spoken commands like "stop", "halt", "slow", "no", "help", and is able to create at least one signal upon detection of such a command.
  • different signals created by the monitoring section are mapped to different control actions in the robot. For example, a "stop" command could create a signal that generates an emergency stop; a "slow" command could trigger a speed reduction to a safe speed, typically below 250 mm/s at the tool tip of the robot, etc.
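Such a mapping from recognized commands to control actions can be sketched as a simple lookup; the command set is the one named above, while the action labels and the dictionary layout are assumptions for illustration:

```python
SAFE_SPEED_MM_S = 250  # typical safe tool-tip speed limit from the text

# Map recognized spoken commands to control actions; "slow" carries
# the speed limit as a parameter, the others trigger an emergency stop.
COMMAND_ACTIONS = {
    "stop": {"action": "emergency_stop"},
    "halt": {"action": "emergency_stop"},
    "no":   {"action": "emergency_stop"},
    "help": {"action": "emergency_stop"},
    "slow": {"action": "speed_reduction",
             "max_tool_speed_mm_s": SAFE_SPEED_MM_S},
}

def control_action_for(command):
    """Map a recognized spoken command to a control action; unknown
    commands produce no action."""
    return COMMAND_ACTIONS.get(command.lower())
```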
  • the invention is characterized in that said monitoring means is prepared to display received acoustical and/or optical alerts on a screen while the robot is stopped, and sends the alarms to superior control systems such as cell or line control, so that these control systems can take appropriate actions or issue notifications.
  • These alarms can be sent via digital outputs, field bus, network, wireless transmissions, optical or wave guided signals.
  • An advantageous embodiment of the monitoring means also comprises an electronic clock, a calendar function and a storage section, and is able to store the alarms with time and/or date in the storage section for later analysis.
  • the invention comprises a monitoring means which is prepared to transform received acoustical and/or optical alerts into a hooter or buzzer signal and a light signal while the robot is stopped.
  • the sensing means, as well as the aforesaid device, may be located at fixed positions, or they may be movable, i.e. they can be carried by the human being who is, routinely or by accident, within the operational range of the robot, i.e. its working range.
  • the monitoring means and/or sensing means can be either partly or totally integrated in the controller means or they are separate.
  • An important aspect of the invention is that the system and the method according to the invention do not imply the signaling and recognition of the presence of any human being as such within the range of operation of the robot, but only when it is accompanied by abnormal behaviour which reflects the presence of a hazardous situation.
  • the recognizing and/or signaling devices are not limited to any kind of sensor; the invention likewise provides for systems, methods and apparatuses based on image and video recognition which identify any hazardous situation with regard to human beings within reach of the robot, as well as to the robot itself, to other robots and/or machines, or to work pieces. In any such case the expenditure for such efforts and equipment is justified compared to the damage which can result from any malfunction.
  • a safe situation, as provided as a main objective of the invention, does not only mean moving from a non-safe situation to a safe one, but also remaining within a safe situation. That means that merely reacting to hazardous conditions is not sufficient; preventive activity may help to avoid an upcoming hazardous situation.
  • the robot or machine can be prepared to confirm the receipt, proper interpretation, and execution of a visual and/or audible signal.
  • Such confirmation may also take the form of visual and/or audible signals, which should be unique in the environment of the machine and unambiguous in meaning to machine operators and other workers.
  • Fig. 1 is a ground view of a working area of a robot provided with a safety system according to the invention
  • Fig. 2 is a side elevation of said working area according to Fig. 1
  • Fig. 3 is a bird's-eye view of said working area according to Fig. 1
  • Fig. 4 is a kind of flowchart for two different modules
  • Fig. 5 is a scheme for the architecture of the system
  • FIG. 1 a schematic ground view of a working area 10 of a robot 12 with a robot's arm 14 is shown in which working area 10 a number of sensors 16, 18 are arranged.
  • the robot is shown as a schematic drawing comprising a stand 12 which is holding the robot's arm 14. It is controlled by a controller (not shown in this view).
  • the sensors 16, 18 are able to receive and recognize any kind of signal caused by any human being who is within reach of the robot's working area and at risk of being injured by the robot 12.
  • the sensors 16, 18 are located at a reasonable distance from the robot 12.
  • the sensors 16, 18 are positioned both at ground level and at a higher level, e.g. at a height of 1-1.8 m above ground, for which purpose the sensors are integrated in poles which surround the robot.
  • Each pole may be provided with sensors 16, 18 both at ground level and at a higher level.
  • Fig. 2 exhibits the arrangement of Fig. 1 as a side elevation where the working area 10 of the robot 12 with its robot's arm 14 is shown. Further there are poles provided with acoustical sensors 16 and/or optical sensors 18 which are able to receive and recognize signals caused by any human being who might be in a dangerous situation.
  • Fig. 3 exhibits a bird's-eye view of the working area 10 already shown schematically in Fig. 1, with a robot 12 and a human being 20.
  • the working area 10 is steadily monitored by a microphone 16 as acoustical sensor and two video cameras 22 as optical monitoring means.
  • the cameras 22 as well as the microphone are connected to a processing means 24 for sensor data received from the sound and visual sensors 16, 22.
  • This processing means is linked to a machine controller 26, which processes the information received from the processing means 24 and, in case of any irregularity, initiates an immediate stop of the robot 12.
  • the Overall System refers to the safety system for human-machine interaction and comprises the following features:
  • Processing means 24 for analyzing the detected data on human being 20 in working area 10 to determine information describing the behaviour of human being 20
  • Processing means 24 for interpreting information on the detected behaviour of human being 20 as a signal or command to machine controller 26
  • Parts of behaviour of human being 20 relevant to sensor detection can have visual components and/or audible components
  • Sensor data on human being 20 includes human being 20 position, speed, posture, visible cues and gestures, speech, sounds, other audible cues, etc.
  • Processing means 24 executes algorithm to extract information on behaviour of human being 20 from sensor data based upon first principles analysis not requiring configuration
  • Processing means 24 executes algorithm to extract information on behaviour of human being 20 from sensor data based upon prepared configuration of signatures for behaviour patterns
  • Processing means 24 determines from behaviour data a well-defined signal or command, according to a prepared configuration of signal or command associations with behaviour patterns
  • Relevant behaviour of human being 20 may be, for example, visual, through a physical gesture by the worker, the posture of the worker's body, or another cue based on the physique and/or motion of human being 20
  • Relevant behaviour of human being 20 may be, for example, audible, through a cry, speech, whistle, or other acoustic signal caused by human being 20
  • Relevant behaviour of human being 20 can be voluntary, such as a particular gesture or word, or it can be involuntary, such as falling to the floor after tripping or other unintentional or unexpected incident
  • Commands determined from the behaviour of the human being 20 may be:
    o Related to exception conditions in the cooperative work cell,
    o Related to routine production tasks in the work cell,
    o Related to other causes or other requirements.
  • Fig. 4 exhibits a kind of flowchart for two different modules 30, 32 as for the processing of the signals of two different sensors, an optical sensor 18, 22 and an acoustical sensor 16.
  • the first module 30 provides a typical processing of a measured acoustic signal.
  • the incoming signals are pre-processed.
  • the incoming data are filtered and any abnormal status of the data, for example an increase of the noise level, is recognized and transmitted to a time-coincidence module 34.
  • a further analysis of the noise signal is done, for example the recognition of any spoken words, using also for example some filters to reduce the environmental noise level.
  • the second module 32 analyzes measured optical signals, for example from one or more cameras.
  • an abnormal status, for example an abnormally fast motion of a person, is recognized, and this information is also transmitted to the time-coincidence module 34.
  • a further analysis of the optical signal is done, for example the recognition of the position of the human and, based on it, of certain postures or gestures.
  • a confidence factor of the analyzed and interpreted optical signal is useful. If the confidence factor, for example that a person is in panic, is higher than 80%, a command for an emergency stop is given.
  • a comparison module 36 compares the interpreted signals of first 30 and second module 32. These modules can also be triggered by the time-coincidence module 34, which analyzes a time-coincidence of an abnormal status of the incoming acoustical and optical sensor data.
  • the comparison module 36 might give a command to the robot, for example if there is a time coincidence of two interpreted signals which both indicate a dangerous situation for a person, even where both interpreted signals have a confidence factor of, for example, only 50%. Expert systems are also suitable for making such a decision.
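The decision logic combining the two modules' interpreted signals can be sketched as follows: a single high-confidence detection triggers an emergency stop on its own, while two lower-confidence detections suffice when they are time-coincident. The thresholds are the example values from the text (80% single, 50% each when coincident); the function name and detection tuple format are assumptions:

```python
def decide(det_a, det_b, single_thresh=0.8, pair_thresh=0.5,
           max_diff_s=0.200):
    """Each detection is a (timestamp_s, confidence) tuple or None.
    Returns "emergency_stop" or None."""
    # A single interpreted signal with high confidence suffices.
    for det in (det_a, det_b):
        if det is not None and det[1] >= single_thresh:
            return "emergency_stop"
    # Two lower-confidence signals suffice if time-coincident.
    if det_a is not None and det_b is not None:
        coincident = abs(det_a[0] - det_b[0]) <= max_diff_s
        if coincident and det_a[1] >= pair_thresh and det_b[1] >= pair_thresh:
            return "emergency_stop"
    return None
```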
  • Fig. 5 exhibits one scheme for the architecture of the system with acoustical and optical sensors according to the invention i.e. the arrangement of these sensors with regard to the monitored machinery as well as the processing equipment. According to the invention the following principles are relevant.

Abstract

The invention refers to a system and a method for safety protection of human beings against hazardous incidents with robots in operation by means of a monitoring means comprising sensor means which are able to receive and measure signals caused by the respective human being who might be in a dangerous situation within or near the working range of the robot, where the acoustical and/or optical sensors are located close around the robot within or near its working range, where said monitoring means comprises a processing means linked to the acoustical and/or optical sensors, and where said processing means evaluates the received signals with regard to their source in order to trigger the robot controller to execute a control action leading to a safe situation.

Description

System for safety protection of human beings against hazardous incidents with robots
Description
The invention refers to a system for safety protection of human beings against hazardous incidents with robots in operation by means of a monitoring means comprising at least one sensor which is able to receive and/or recognize signals caused by the respective human being who might be in a dangerous situation within or near the working range of the robot.
Robots are very well known, for example from industrial production. A typical robot has, for example, 6 degrees of freedom in movement. It typically consists of a robot base, which can turn around a first axis, and an arm which is pivotably mounted on the robot base. An arm may consist of 2 segments with one degree of freedom in movement each. At the tip of the arm a robot hand, typically with 2 or 3 degrees of freedom, is mounted. A robot might have a working range of 1 m to 4 m around the robot base, but a radius of 10 m and more is also conceivable. Other machines with fewer or more degrees of freedom, or without a robot base, are also to be considered robots.
Safety systems enable human beings to operate potentially dangerous machines in a manner preventing injury hazards. At the forefront of developments are systems that allow some degree of machine operation (typically mechanical motion) in the immediate presence of the human being, utilizing the possibilities for advantageous interaction between human being and machine, while still avoiding dangerous situations by use of more sophisticated safety supervision technology and methods.
Machine malfunctions that can lead to hazardous situations are still possible, even if very unlikely. If a human being finds himself in a potentially dangerous situation, the usual way to avert the hazard is to press the emergency stop (E-stop) button, which is usually mounted within reach of his workplace. If, however, for some reason, the human being is unable to reach the E-stop button, other means of stopping the machine would improve the safety level of the workplace.
US7308112B2 describes systems, methods, apparatuses, and computer readable media for human-machine interaction and more particularly for human-computer interaction ("HCI") based on computer vision recognition of human signs and gestures where digital images are received and analyzed.
Signs can be used to interact with machines by providing user instructions or commands. Embodiments of that invention include human detection, human body part detection, hand shape analysis, trajectory analysis, orientation determination, gesture matching, and the like.
Many types of shapes and gestures are recognized in a non-intrusive manner based on computer vision. A number of applications become feasible through this sign-understanding technology, including remote control of home devices, mouse-less (and touch-less) operation of computer consoles, gaming, and man-robot communication to give instructions, among others. Active sensing hardware is used to capture a stream of depth images at a video rate, which is subsequently analyzed for information extraction.
A similar teaching is known from EP0849697B1, which relates to a hand gesture recognition system and method in which streams of images are received in real time and processed to represent the hand region in each image as a vector, and the vectors are processed to recognize hand gestures.
From JP2005238422 A a robot device, a method for constructing its state transition model, and a behaviour control method are known, based on a user voice recognizing part and a user image recognizing part which detect the emotion of the respective user on the basis of a voice signal and an image signal of the user.
An emotion estimating part classifies the present emotion of the user into one of a plurality of predefined emotions on the basis of these detection results. A long-term storage part stores the emotional transition model of the user, constructed by reinforcement learning on the basis of the emotion shown after the user performs a certain behaviour in a certain emotion and the reward given by the user for this transition. A behaviour selecting part probabilistically selects the behaviour to be shown in response to the present emotion of the user according to a policy [pi]. A behaviour output part performs the selected behaviour.
Though this state of the art relates to a combination of acoustical and/or optical sensors on the one hand and robots on the other hand, it relates only to robots as such, where these features have been realized apart from any safety purpose for the operational personnel of the robot.
Based on the aforementioned state of the art, it is a main object of this invention to develop a safety protection system and method for robots which provides a high gain in safety for the operational personnel of the robot, to be achieved with good efficiency and reasonable effort.
Reasons for not being able to reach the emergency stop button can range from the human being not standing exactly at the allotted working position - thus not having the emergency stop button in direct reach - to the human being having fallen to the floor. All these aspects shall be covered by the invention.
Therefore, the invention is directed to a system as well as to a method by which the human being can cause the signalization of hazardous situations in order to provoke events, such as an Emergency-stop, to the machine controller without needing access to a regular input device such as a button.
This involves the human being expressing a command by visual means (gesture, body posture) or by audible means (cry, voice) or by a combination of the two. Such commands can be detected by sensors connected to the machine controller and can lead to the execution of machine commands. Relevant machine commands can be Emergency-stop, operation halt, operation pause, operation resume/restart, special retreating or homing sequences or other commands or state changes.
The invention is characterized in that said monitoring means is located either within or on the robot or remote from the robot, while the at least one sensor is located anywhere around the robot within or near its working range, where said monitoring means comprises at least one processing means being linked to the sensors, and where said processing means evaluates the received signals as to their source in order to trigger the robot control system to execute a control action increasing the level of safety. A control action could be a reduction of speed, a change of movement path, a reverse movement, or an emergency stop. The processing means can be, for example, a processing unit such as a computer.
Preferably the sensors being used for this purpose are based on acoustical and/or optical effects, but according to an embodiment it is likewise provided to use further physical principles, e.g. infrared detection and/or ultrasound detection as well as radar detection or the recognition of signs and gestures.
According to a preferred embodiment the at least one sensor is arranged on ground level; alternatively the sensors are arranged on a medium level, or on both ground level and medium level. Ground level is characterized by a height of minus 1 to 1 meter from the ground. Medium level is characterized by a height of typically 1 meter from the ground up to 3 meters above the highest reach of the robot, including end-of-arm tooling, in vertical direction.
Advantageously the system according to the invention is characterized in that when using at least one optical sensor the optical sensors are sensitive to receiving visible signals with wave-lengths in a range of approximately 380 to 780 nm. But also the recognition of only IR light might be useful, since visible disturbances from variable light sources can be eliminated by using only IR light.
Alternatively, with the system according to the invention the acoustical sensors are provided for receiving acoustical signals of frequencies within a range of 150 Hz to 5000 Hz and coincidentally a sound intensity level of at least 40 dB, or said sensors are provided for receiving a combination of acoustical and optical signals as defined by the preceding features. The use of very low frequencies, such as 50 Hz and lower, is also useful, since the falling down of a person, for example, might cause a noise in this frequency range.
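The band and level criterion above can be sketched as a simple frame check. The following is a minimal illustration, assuming calibrated microphone samples in pascals and the standard 20 µPa reference for dB SPL; the naive DFT and all function names are our own choices for illustration, not part of the invention:

```python
import math

P_REF = 20e-6  # reference sound pressure for dB SPL (20 micropascals)

def sound_level_db(samples):
    """RMS sound pressure level in dB SPL for calibrated samples in pascals."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / P_REF)

def dominant_frequency(samples, rate):
    """Crude dominant-frequency estimate via a naive DFT (fine for short frames)."""
    n = len(samples)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip the DC bin
        re = sum(samples[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(samples[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * rate / n

def is_alert_candidate(samples, rate, f_lo=150.0, f_hi=5000.0, min_db=40.0):
    """True if the frame lies in the 150-5000 Hz band at a level of at least 40 dB."""
    f = dominant_frequency(samples, rate)
    return f_lo <= f <= f_hi and sound_level_db(samples) >= min_db
```

A production system would use an FFT and proper frame windowing; the sketch only shows how the two claimed criteria combine.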
Furthermore the invention refers to a method of protecting human beings against hazardous incidents with robots in operation by means of any sensors which are able to receive and recognize signals caused by the respective human being who might be in a dangerous situation within or near the working range of the robot.
This method is characterized by the following steps:
- said sensors equally receive and measure signals given by a respective human being,
- the measured signals are provided to at least one processing means,
- based on those signals, any action and movement of any human being within the working range of the robot is analyzed by the at least one processing means,
- any detection thereon of an irregular behaviour or status or a dangerous situation causes a protective robot control action and/or movement.
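The method steps above can be sketched as a single monitoring cycle. The classes and the controller interface below are hypothetical stand-ins for illustration only, not a definitive implementation:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    source: str        # e.g. "microphone" or "camera"
    irregular: bool    # result of the sensor-specific analysis
    description: str   # human-readable reason

class RobotController:
    """Minimal illustrative stand-in for the robot control system."""
    def __init__(self):
        self.stopped = False
        self.reason = None

    def emergency_stop(self, reason):
        # In a real system this would command the drives; here we only record it.
        self.stopped = True
        self.reason = reason

def monitoring_cycle(observations, controller):
    """One pass of the claimed method: check all analyzed sensor observations
    and trigger a protective control action on any detected irregularity."""
    for obs in observations:
        if obs.irregular:
            # Emergency stop chosen here; a speed reduction or path change
            # would be equally valid protective reactions per the text.
            controller.emergency_stop(reason=obs.description)
            return True
    return False
```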
It is clear from the invention that all its features are intended to gain additional safety for human beings in direct interaction between robot and human being, and therefore to instantly initiate an action to stop the robot and thus prevent injuries and damages.
Though the invention utilizes known technology for the detection of the state of the human being in the environment of the machine, the invention makes use of the combination of monitoring and evaluating the detected information in a new way so as to offer new, added safety functionality to the operation of machines.
An advantageous improvement of the claimed method is characterized in that the detection of any alerts is activated during the whole time of operation of the robot.
In another embodiment of the invention, data on the actual status of the robot position, which are provided by an associated robot controller, respectively robot controlling means, are used by the at least one processing means for the detection.
Having detailed information on the actual position of the robot within its trajectory from the robot controller, a dangerous situation for a human being who is close to the robot or to a part of the robot is easier to detect. Measurement faults concerning the position of the robot are excluded. In another embodiment of the invention the actual status of the robot position is measured by at least one sensor means and used by the at least one processing means for the detection.
In a further embodiment of the invention the signals of at least two sensors, which are based on a different measurement principle, are analyzed by the at least one processing means.
This diversity enables the analysis of a broader basis of signals from a human being, for example acoustical and optical signals. Depending on the current situation, either optical or acoustical signals might be more suitable to detect, for example, a dangerous situation.
In a very advantageous embodiment the analysis considers and uses a time-coincidence of an irregular behaviour and/or status and/or a dangerous situation, which has been detected based on the analyzed signals of each of the at least two sensors. This is described in more detail for Fig. 4. The maximum time difference between two coincident signals depends on the kind of signals and may vary, for example, from 1 ms to 200 ms.
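The time-coincidence criterion with signal-dependent windows might be sketched as follows; the concrete window table is an assumption for illustration, using the 1 ms to 200 ms range mentioned above:

```python
# Maximum allowed time difference in seconds, keyed by the sorted pair of
# sensor kinds. The values are illustrative assumptions within the stated
# 1 ms to 200 ms range, not values prescribed by the invention.
COINCIDENCE_WINDOWS = {
    ("acoustical", "acoustical"): 0.001,  # tight same-modality window
    ("acoustical", "optical"): 0.200,     # looser cross-modal pairing
}

def window_for(kind_a, kind_b):
    """Look up the window for a pair of sensor kinds (order-independent)."""
    key = tuple(sorted((kind_a, kind_b)))
    return COINCIDENCE_WINDOWS.get(key, 0.050)  # assumed 50 ms default

def are_coincident(det_a, det_b):
    """det_* = (sensor_kind, timestamp_in_seconds).
    True if the two detections fall within the allowed window."""
    (kind_a, t_a), (kind_b, t_b) = det_a, det_b
    return abs(t_a - t_b) <= window_for(kind_a, kind_b)
```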
The embodiment of the invention comprises a monitoring means. The monitoring means may comprise an acoustical monitoring section and/or an optical monitoring section and/or sensing means for radar waves and/or force and/or infrared waves.
In a preferred embodiment, the optical monitoring section comprises a sensor interface, an image acquisition means, an image processing section, a gesture recognition section, and a reasoning section. The sensor interface receives data from the optical sensor and makes it available to the image acquisition means, respectively an image acquisition unit. The image acquisition means creates an image from the received data.
The image processing section applies various filtering techniques to enhance the quality of the image and discriminates different objects in the work cell of the robot, for example a human, a robot, a work object fixture, etc. The gesture recognition section identifies different gestures or irregular behaviour of a detected human and hands this over to the reasoning section. The reasoning section executes a reasoning and decision algorithm that judges whether a safety-critical situation is detected, and generates at least one signal based on the result of the analysis.
The monitoring means can comprise a separate processing means, can be integrated at least in part into the robot control system, or can be a software program running on one or more of the processors of the robot control system.
In a preferred embodiment, the robot control system comprises at least one multi-core processor, and the monitoring means runs as one instance on one of the cores.
According to a further improvement the acoustical sensors are being provided for receiving acoustical signals of frequencies within a range of 150 Hz to 5000 Hz and coincidentally a sound intensity level of at least 40 dB.
In another embodiment, the acoustical section of the monitoring means includes a signal processing section that filters the acoustical signal to increase the sensitivity of the monitoring system to the human voice in the typical frequency band of 300-3000 Hz.
According to a preferred embodiment, the acoustical section contains a threshold function for the sound level, which discriminates the normal operational noise of the robot and other equipment in the cell.
In another embodiment, the acoustical section contains at least one matched notch filter that eliminates characteristic frequencies in the production cell, for example frequencies that are caused by fans, pumps, and other machines running at constant speed.
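A matched notch filter of this kind can be realized, for example, as a second-order IIR (biquad) section with zeros on the unit circle at the disturbing frequency; this is a textbook construction rather than a detail of the invention, and the pole radius r is an assumed tuning parameter controlling the notch width:

```python
import math

def notch_filter(samples, f0, fs, r=0.95):
    """Second-order IIR notch: zeros on the unit circle at +-f0 give exact
    cancellation of that frequency, poles at radius r just inside the circle
    keep the dip narrow. f0 = disturbing frequency, fs = sampling rate in Hz."""
    w0 = 2.0 * math.pi * f0 / fs
    b0, b1, b2 = 1.0, -2.0 * math.cos(w0), 1.0        # numerator (zeros)
    a1, a2 = -2.0 * r * math.cos(w0), r * r           # denominator (poles)
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out
```

Applied with f0 = 50 Hz, such a section would suppress mains-frequency hum from fans or pumps while leaving voice-band components essentially unchanged.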
In yet another embodiment, the acoustical section contains a learning function that is able to adjust the threshold level, the matched filter frequencies, and other relevant filter parameters automatically in a training run, to achieve best environmental noise suppression.
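The training run for threshold adjustment could, in its simplest form, record sound levels while only the normal production noise is present and place the alert threshold a margin above them; the 10 dB margin and the three-sigma rule below are assumptions for illustration:

```python
import statistics

def learn_threshold(training_levels_db, margin_db=10.0):
    """Training-run calibration: given sound levels (dB) recorded while only
    the normal cell noise is present, set the alert threshold a safety margin
    above the observed mean. Uses the larger of a fixed margin (assumed 10 dB)
    and three standard deviations of the training data."""
    mean = statistics.mean(training_levels_db)
    spread = statistics.pstdev(training_levels_db)
    return mean + max(margin_db, 3.0 * spread)
```

The same pattern extends to the matched-filter frequencies, e.g. by picking the strongest spectral peaks of the training recording as notch targets.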
In a further embodiment, the acoustical section has a sound recognition means that can recognize and suppress typical non-periodic sounds and non-constant sounds, like the sounds emitted by pneumatic valves, welding processes, roll door opening and closing, etc. so that only unusual acoustical signals are being identified.
In another embodiment, the acoustical section of the monitoring means contains a voice recognition section that is able to recognize basic spoken commands like "stop", "halt", "slow", "no", "help", and is able to create at least one signal upon detection of such a command.
In a preferred embodiment, different signals created by the monitoring section are mapped to different control actions in the robot. For example, a "stop" command could create a signal that generates an emergency stop; a "slow" command could trigger a speed reduction to a safe speed, typically below 250 mm/s at the tool tip of the robot, etc.
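Such a mapping of recognized commands to control actions can be sketched as a lookup table; the command words follow the examples given above, while the action names and the table structure are illustrative assumptions:

```python
SAFE_REDUCED_SPEED_MM_S = 250  # "safe reduced speed" at the tool tip, per the text

# (control_action, parameter) per recognized spoken command; assumed names.
COMMAND_ACTIONS = {
    "stop": ("emergency_stop", None),
    "halt": ("operation_halt", None),
    "slow": ("reduce_speed", SAFE_REDUCED_SPEED_MM_S),
    "no":   ("operation_pause", None),
    "help": ("emergency_stop", None),
}

def action_for_command(word):
    """Map a recognized spoken command to a (control_action, parameter) pair;
    unknown words map to a no-op so that misrecognitions stay harmless."""
    return COMMAND_ACTIONS.get(word.lower(), ("none", None))
```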
Advantageously the invention is characterized in that said monitoring means is prepared to display received acoustical and/or optical alerts by means of a screen while the robot is being stopped, and sends the alarms to superior control systems such as cell or line control, so that these control systems can take appropriate actions or issue notifications. These alarms can be sent via digital outputs, field bus, network, wireless transmission, or optical or wave-guided signals.
An advantageous embodiment of the monitoring means also comprises an electronic clock, a calendar function and a storage section, and is able to store the alarms with time and/or date in the storage section for later analysis.
According to a preferred embodiment, the invention comprises a monitoring means which is prepared to indicate received acoustical and/or optical alerts by means of a hooter or buzzer and a light signal while the robot is being stopped.
Generally the sensing means as well as the aforesaid devices may be located at fixed positions, or they are moveable, i.e. they can be borne by the human being who is routinely or by accident within the operational range of the robot, respectively its working range. Furthermore the monitoring means and/or sensing means can be either partly or totally integrated in the controller means, or they are separate.
An important aspect of the invention is that the system and the method according to the invention do not imply the signaling and recognition of the mere presence of a human being within the range of operation of the robot, but only of abnormal behaviour which reflects the presence of a hazardous situation.
Furthermore it is understood that the recognizing and/or signaling devices are not limited to any particular kind of sensors; the invention likewise provides systems, methods and apparatuses based on image and video recognition which identify any hazardous situation with regard to human beings within reach of the robot, as well as with regard to the robot itself, respectively to other robots and/or machines or to workpieces. In any such case the expenditure for such efforts and equipment is justified compared to the damages which can result from any malfunction.
Generally the creation of a safe situation, as provided as a main objective of the invention, does not only mean moving from a non-safe situation to a safe one, but also remaining within a safe situation. That means that merely reacting to hazardous conditions is not sufficient; a preventive activity may help to avoid an upcoming hazardous situation.
According to a further aspect of the invention the robot or machine can be prepared to confirm the receipt, proper interpretation, and execution of a visual and/or audible signal. Such confirmation may also take the form of visual and/or audible signals, which should be unique in the environment of the machine and unambiguous in meaning to machine operators and other workers.
According to the invention there are further principles to be observed, which shall be illustrated in the following. In this connection the effective, respectively appropriate, comprehension of the behaviour of the respective human being will be supported by active control, for example when using some safety-related commands. In the following some examples are given of such possible safety-related commands which have to be effected:
• To issue an emergency stop
• Start execution of specific error handler to uphold safety
• For example backstep, run program in reverse, reversal of movement path, change of movement path, other robot command
• Operating stop to maintain safety
• Reduction of speed to zero without turning off motors ("safe standstill")
• Reduce speed to maintain safety
• To "creep speed" e.g. 30 mm/s or similar
• To "safe reduced speed" e.g. 250 mm/s
• To other preconfigured speed value
• To value computed from characteristics of sensor data e.g. volume of acoustical cue, amplitude of gesture
• By increments (use repeated cues)
• Move safely to specific position
• Computed from cues
• Predefined
• Application-related cues to increase safety e.g. turn off laser
Likewise there are some possible commands to be effected for resuming work:
• Resume motion
• Increase speed
• To predefined value(s)
• To value computed from characteristics of sensor data
• By increments (use repeated cues)
• Other application-related cues
These features as well as further embodiments of the invention are subject matter of the claims. By means of some drawings which exhibit the invention, preferred embodiments of the invention and special advantages when using the invention shall be exemplified.
Hence the invention will be more clearly understood from the following description of some embodiments thereof given by way of example only with reference to the accompanying drawings in which :-
Fig. 1 is a ground view of a working area of a robot provided with a safety system according to the invention;
Fig. 2 is a side elevation of said working area according to Fig. 1
Fig. 3 is a bird's-eye view of said working area according to Fig. 1
Fig. 4 is a kind of flowchart for two different modules and
Fig. 5 is a scheme for the architecture of the system
In Fig. 1 a schematic ground view of a working area 10 of a robot 12 with a robot's arm 14 is shown in which working area 10 a number of sensors 16, 18 are arranged.
The robot is shown as a schematic drawing comprising a stand 12 which is holding the robot's arm 14. It is controlled by a controller (not shown in this view).
The sensors 16, 18 are able to receive and recognize any kind of signals caused by any human being who is within reach of the robot's working area and at risk of being injured by the robot 12.
In order to improve the detection of these signals the sensors 16, 18 are located at a reasonable distance from the robot 12. Preferably the sensors 16, 18 are positioned both on ground level and on a higher level, e.g. at a height of 1-1.8 m above ground, for which purpose the sensors are integrated in poles which surround the robot. Each pole may be provided with sensors 16, 18 both at ground level and on a higher level.
Fig. 2 exhibits the arrangement of Fig. 1 as a side elevation where the working area 10 of the robot 12 with its robot's arm 14 is shown. Further there are poles provided with acoustical sensors 16 and/or optical sensors 18 which are able to receive and recognize signals caused by any human being who might be in a dangerous situation.
Fig. 3 exhibits a bird's-eye view of the working area 10 already shown schematically in Fig. 1, with a robot 12 and a human being 20.
The working area 10 is steadily monitored by a microphone 16 as an acoustical sensor and two video cameras 22 as optical monitoring means.
The cameras 22 as well as the microphone 16 are connected to a processing means 24 for the sensor data received from the sound and visual sensors 16, 22. This processing means is linked to a machine controller 26 which processes the information received from the processing means 24 and, in case of any irregularity, initiates an immediate stop of the robot 12 as a result.
If there is a plurality of such machines which may induce a hazardous situation for the working personnel, the installations, or the respective workpiece, their machine controllers 26 have to be linked together in order to initiate an emergency stop immediately.
One can distinguish between three aspects: the Overall System, the Data Processing, and the Behaviour Cues.
The Overall System refers to the safety system for human-machine interaction and comprising the following features:
• a working area 10 with a machine 12 for the purpose of carrying out a routine production task, which production task involves interaction between the machine 12 and one or more human beings 20 in its vicinity;
• Equipped with one or more sensors 16, 18, 22 capable of detecting data associated with the human being 20 in the working area 10 environment;
• Equipped with processing means 24 for analyzing the detected data on the human being 20 in the working area 10 to determine information describing the human being's 20 behaviour;
• Equipped with processing means 24 for interpreting information on the detected human being's 20 behaviour as a signal or command to the machine controller 26;
• Equipped with control means 26 causing the machine 12 to react to the thus determined signal or command from the human being 20;
• Optionally equipped with visual and/or audible means 16, 18 for machine control to indicate receipt and/or completion of action request, which was communicated by way of human being's 20 behaviour.
For the safety system for human-machine interaction, the Data Processing of the received data, respectively their evaluation, is indispensable.
• Parts of behaviour of human being 20 relevant to sensor detection can have visual components and/or audible components
• Parts of behaviour of human being 20 relevant to sensor detection can be intentional or unintentional
• Sensor data on human being 20 includes human being 20 position, speed, posture, visible cues and gestures, speech, sounds, other audible cues, etc.
• Processing means 24 executes algorithm to extract information on behaviour of human being 20 from sensor data based upon first principles analysis not requiring configuration
• Processing means 24 executes algorithm to extract information on behaviour of human being 20 from sensor data based upon prepared configuration of signatures for behaviour patterns
• Processing means 24 determines from behaviour data a well-defined signal or command, according to a prepared configuration of signal or command associations with behaviour patterns
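The configured association of behaviour-pattern signatures with signals or commands, described in the bullets above, might look as follows; the feature names and example signatures are purely hypothetical:

```python
# Configured behaviour-pattern signatures and their associated commands.
# Each signature is a dict of feature predicates; the features ("posture",
# "speed", "gesture") and command names are illustrative assumptions.
BEHAVIOUR_SIGNATURES = [
    ({"posture": "lying", "speed": "zero"}, "emergency_stop"),  # fallen worker
    ({"gesture": "raised_hand"}, "operation_pause"),
    ({"gesture": "wave_on"}, "operation_resume"),
]

def command_from_behaviour(features):
    """Return the command whose configured signature is a subset of the
    observed feature dict, or None if no pattern matches (first match wins)."""
    for signature, command in BEHAVIOUR_SIGNATURES:
        if all(features.get(k) == v for k, v in signature.items()):
            return command
    return None
```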
Finally the safety system for human-machine interaction depends on Behaviour Cues
• Relevant behaviour of human being 20 may be, for example, visual, through physical gesture by worker, posture of worker's body, or other cue based on human being 20 physique and/or motion;
• Relevant behaviour of human being 20 may be, for example, audible, through issuing cry, speech, whistle, or other acoustic signal caused by human being 20;
• Relevant behaviour of human being 20 can be voluntary, such as a particular gesture or word, or it can be involuntary, such as falling to the floor after tripping or other unintentional or unexpected incident
• Commands determined from the behaviour of the human being 20 may be:
o Related to exception conditions in the cooperative work cell,
o Related to routine production tasks in the work cell,
o Related to other causes or other requirements.
• Relevant examples of commands are:
o To issue an emergency stop
o To stop/resume robot motion
o To reduce/increase robot speed
o To retreat safely to home position
o To retrieve next part from conveyor or similar production-related task
Fig. 4 exhibits a kind of flowchart for two different modules 30, 32 as for the processing of the signals of two different sensors, an optical sensor 18, 22 and an acoustical sensor 16. The first module 30 provides a typical processing of a measured acoustic signal.
The incoming signals are pre-processed. The incoming data are filtered and any abnormal status of the data, for example an increase of the noise level, is recognized and transmitted to a time-coincidence module 34. Then a further analysis of the noise signal is done, for example the recognition of any spoken words, using also for example some filters to reduce the environmental noise level.
It is advantageous to create a confidence factor for an analyzed and interpreted voice signal. If the confidence factor is, for example, higher than 90% (0% = no confidence at all, 100% = no doubt that the interpretation is correct), then this interpretation immediately leads to a trigger command for the robot, for example for an emergency stop.
In a similar way the second module 32 analyzes measured optical signals, for example from one or more cameras. Within a pre-processing of the signals an abnormal status, for example an abnormally fast motion of a person, is recognized, and this information is also transmitted to the time-coincidence module 34. Afterwards, a further analysis of the optical signal is done, for example the recognition of the position of the human and, based on it, of postures or gestures.
In analogy to the analysis of the acoustical signals, a confidence factor of the analyzed and interpreted optical signal is useful. If the confidence factor, for example that a person is in panic, is higher than 80%, a command for an emergency stop is given.
If neither the first module 30 nor the second module 32 has a sufficiently high confidence factor for the interpreted signals to give a command to the robot directly, a comparison module 36 compares the interpreted signals of the first module 30 and the second module 32. These modules can also be triggered by the time-coincidence module 34, which analyzes a time-coincidence of an abnormal status in the incoming acoustical and optical sensor data.
As a result the comparison module 36 might give a command to the robot, for example if there is a time-coincidence of two interpreted signals which both indicate a dangerous situation for a person, whereas both interpreted signals have a confidence factor of, for example, only 50%. Expert systems are also suitable for making such a decision.
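The combined decision logic of Fig. 4, direct triggering at high single-channel confidence and joint triggering of two time-coincident lower-confidence detections, can be sketched as follows; the threshold values repeat the examples from the text, and the function signature is our own assumption:

```python
def decide(acoustic_conf, optical_conf, coincident,
           acoustic_threshold=0.90, optical_threshold=0.80,
           joint_threshold=0.50):
    """Fusion decision as sketched for Fig. 4: either channel alone triggers
    when its confidence is high enough (90% acoustic, 80% optical per the
    examples in the text); otherwise two time-coincident detections that each
    reach the lower joint threshold (50%) trigger together."""
    if acoustic_conf >= acoustic_threshold or optical_conf >= optical_threshold:
        return "emergency_stop"
    if coincident and acoustic_conf >= joint_threshold and optical_conf >= joint_threshold:
        return "emergency_stop"
    return "no_action"
```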
The use of 3 or more modules for processing measured signals is also within the scope of the invention.
Finally Fig. 5 exhibits one scheme for the architecture of the system with acoustical and optical sensors according to the invention i.e. the arrangement of these sensors with regard to the monitored machinery as well as the processing equipment. According to the invention the following principles are relevant.
• Sensors are either acoustical or optical
• Filtering against other environmental "noise" is applied
• Sensor data is processed to isolate human voice and/or human images
• Processed data is interpreted to find cues from human(s)
• Cues are associated with specific safety-related robot commands or functions
• Data processing can be part of sensor
• Data interpretation can be part of data processing
• Data interpretation can be part of robot reasoning
• Reasoning can be part of robot controller
List of reference signs
10 working area
12 machine, robot
14 robot arm
16 acoustical sensor, microphone
18 optical sensor
20 human being
22 video camera
24 processing means
26 machine controller
30 module for processing an acoustical signal
32 module for processing an optical signal
34 time coincidence module
36 comparison module

Claims

1. System for safety protection of human beings against hazardous incidents with a machine with moving parts in operation, e.g. a robot, by means of a monitoring means comprising at least one sensor for receiving and/or recognizing signals given by the respective human being who might be in a dangerous situation within or near the working range of the robot, characterized in that the at least one sensor is located anywhere around the robot within or nearby its working range, where said monitoring means comprises at least one processing means being linked to the at least one sensor, and where said at least one processing means evaluates the received signals as for the source in order to trigger the robot controller to initiate a control action to create a safe situation.
2. System according to claim 1, characterized in that the control action is an emergency stop, a speed reduction to a safe level, a speed reduction to zero speed without turning off the motors, a change of movement path or a reverse movement.
3. System according to claim 1 or 2, characterized in that the at least one sensor for receiving and/or recognizing any signal caused by a human being is being provided as a means for detection of acoustical waves and/or optical waves and/or radar waves and/or force and/or infrared waves and/or ultrasound.
4. System according to one of the preceding claims, characterized in that at least one sensor is being arranged on ground level.
5. System according to one of the preceding claims, characterized in that at least one sensor is being arranged on a medium level.
6. System according to one of the preceding claims, characterized in that at least one optical sensor is sensitive for receiving visible signals with wave-lengths in a range of approximately 380 to 780 nm.
7. System according to one of the preceding claims, characterized in that at least one acoustical sensor is sensitive for receiving acoustical signals of frequencies within a range of 150 Hz to 5000 Hz and coincidentally a sound intensity level of at least 40 dB.
8. System according to one of the preceding claims, characterized in that said sensors are provided for receiving a combination of acoustical and optical signals as being defined in the preceding claims 6 and 7.
9. Method of protecting human beings against hazardous incidents with a machine with moving parts in operation, e.g. a robot, by means of at least one monitoring means comprising at least one sensor for receiving and/or measuring signals given by a respective human being who might be in a dangerous situation within or near the working range of the robot, characterized in that the measured signals are provided to at least one processing means, that any action and/or movement and/or behaviour of any human being within or near the working range of the robot is analyzed by the at least one processing means and that any detection of an irregular behaviour or status or dangerous situation is causing a protective robot control action and/or movement.
10. Method according to claim 9, characterized in that data of the actual status of the robot position, which are provided by the associated robot controller, are used by the at least one processing means for the detection.
11. Method according to claim 9 or 10, characterized in that data of the actual status of the robot position, which are measured by at least one sensor means, are used by the at least one processing means for the detection.
12. Method according to claim 9 or 11, characterized in that the robot control action is an emergency stop, a speed reduction to a safe level, a speed reduction to zero speed without turning off the motors, a change of the movement path or a reverse movement.
13. Method according to one of claims 9 to 12, characterized in that the at least one sensor receives and/or measures acoustical waves, optical waves, radar waves, infrared waves, force and/or ultrasound.
14. Method according to one of claims 9 to 13, characterized in that the detection of an irregular behaviour or status or dangerous situation is active for the whole time of operation of the robot.
15. Method according to one of claims 9 to 14, characterized in that the signals of at least two sensors, which are based on different measurement principles, are analyzed by the at least one processing means.
16. Method according to claim 15, characterized in that the analysis considers and uses a time-coincidence of an irregular behaviour and/or status and/or a dangerous situation, which has been detected based on the analyzed signals of each of the at least two sensors.
17. Method according to claim 15 or 16, characterized in that the signals of at least one optical and one acoustical sensor are analyzed.
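The detection method of claims 9 to 17 can be paraphrased in a short, non-authoritative sketch: sensor-specific checks (the acoustical frequency/level band of claim 7, the visible-light range of claim 6), a time-coincidence test over two sensors with different measurement principles (claims 15 and 16), and selection of a protective robot control action from the list in claim 12. All function names, the 0.5 s coincidence window, the chosen action, and the motion-detection placeholder are illustrative assumptions and are not part of the claimed subject matter.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    timestamp: float   # seconds since start of monitoring
    irregular: bool    # sensor-specific analysis flagged a dangerous situation


def acoustic_irregular(frequency_hz: float, level_db: float) -> bool:
    """Claim 7: acoustical signal within 150-5000 Hz and at least 40 dB."""
    return 150.0 <= frequency_hz <= 5000.0 and level_db >= 40.0


def optical_irregular(wavelength_nm: float, motion_detected: bool) -> bool:
    """Claim 6: visible-light sensor (approx. 380-780 nm); the motion flag
    stands in for the actual image analysis, which the claims do not specify."""
    return 380.0 <= wavelength_nm <= 780.0 and motion_detected


def coincident(a: Detection, b: Detection, window_s: float = 0.5) -> bool:
    """Claim 16: both sensors report an irregularity within one time window
    (the 0.5 s default is an assumption for illustration)."""
    return a.irregular and b.irregular and abs(a.timestamp - b.timestamp) <= window_s


def protective_action(danger: bool) -> str:
    """Claim 12 lists several admissible actions; one is picked here
    arbitrarily for the sketch."""
    return "SPEED_REDUCTION_TO_SAFE_LEVEL" if danger else "CONTINUE"


# Example run: a shout picked up by a microphone and, 0.29 s later,
# a camera-based detection near the robot's working range.
mic = Detection(timestamp=10.02, irregular=acoustic_irregular(900.0, 65.0))
cam = Detection(timestamp=10.31, irregular=optical_irregular(550.0, True))
print(protective_action(coincident(mic, cam)))  # → SPEED_REDUCTION_TO_SAFE_LEVEL
```

The coincidence requirement is what claims 15 and 16 add over a single-sensor trigger: a lone acoustic spike or a lone image event does not, by itself, cause the protective action.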
PCT/EP2008/005208 2008-06-26 2008-06-26 System for safety protection of human beings against hazardous incidents with robots WO2009155948A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP08773685A EP2288839A1 (en) 2008-06-26 2008-06-26 System for safety protection of human beings against hazardous incidents with robots
PCT/EP2008/005208 WO2009155948A1 (en) 2008-06-26 2008-06-26 System for safety protection of human beings against hazardous incidents with robots
CN2008801301189A CN102099614A (en) 2008-06-26 2008-06-26 System for safety protection of human beings against hazardous incidents with robots

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2008/005208 WO2009155948A1 (en) 2008-06-26 2008-06-26 System for safety protection of human beings against hazardous incidents with robots

Publications (1)

Publication Number Publication Date
WO2009155948A1 true WO2009155948A1 (en) 2009-12-30

Family

ID=40532547

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/005208 WO2009155948A1 (en) 2008-06-26 2008-06-26 System for safety protection of human beings against hazardous incidents with robots

Country Status (3)

Country Link
EP (1) EP2288839A1 (en)
CN (1) CN102099614A (en)
WO (1) WO2009155948A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101785998B1 (en) * 2017-06-26 2017-10-18 주식회사 썬에이치에스티 Human access detecting system for preventing safety accident
CN107813308A (en) * 2017-10-20 2018-03-20 高井云 A kind of human computer cooperation system of robot
CN109032052B (en) * 2018-06-26 2020-09-22 上海常仁信息科技有限公司 Emergency intelligent control system based on robot identity card
CN109352657B (en) * 2018-11-29 2020-11-17 深圳前海达闼云端智能科技有限公司 Control method, related device and storage medium
CN113096660A (en) * 2021-04-28 2021-07-09 三一汽车制造有限公司 Personnel safety protection method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19500323A1 (en) * 1995-01-07 1996-07-11 Edag Eng & Design Ag Protective apparatus switching off equipment
DE19631579A1 (en) * 1996-08-05 1998-02-12 Vhf Computer Gmbh Acoustic emergency shutdown apparatus for machines
WO2001033134A1 (en) * 1999-11-02 2001-05-10 Helmut Ehrlich Method and device for detecting dangerous situations

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0179252A3 (en) * 1984-09-14 1987-07-15 Siemens Aktiengesellschaft Method and apparatus for protecting people in the operating range of a movable part of a traversing or swiveling machine, particularly of an industrial robot
JP4764070B2 (en) * 2005-05-24 2011-08-31 本田技研工業株式会社 Work station safety system
JP4168072B2 (en) * 2006-12-21 2008-10-22 ファナック株式会社 Robot system

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120306615A1 (en) * 2011-05-30 2012-12-06 Hon Hai Precision Industry Co., Ltd. Safety system and method
US20140244003A1 (en) * 2013-02-27 2014-08-28 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with redundant system input support
US20140244004A1 (en) * 2013-02-27 2014-08-28 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with position and derivative decision reference
US9798302B2 (en) * 2013-02-27 2017-10-24 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with redundant system input support
US9804576B2 (en) * 2013-02-27 2017-10-31 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with position and derivative decision reference
WO2015185351A1 (en) * 2014-06-04 2015-12-10 Holzma Plattenaufteiltechnik Gmbh Method for operating a panel-processing installation, and panel-processing installation
US10065316B2 (en) 2016-02-05 2018-09-04 Rethink Robotics, Inc. Systems and methods for safe robot operation
US9868214B2 (en) 2016-06-20 2018-01-16 X Development Llc Localization of a mobile system
WO2019014940A1 (en) * 2017-07-21 2019-01-24 深圳市萨斯智能科技有限公司 Method of determining security level of robot and robot
US10800409B2 (en) 2018-09-04 2020-10-13 Caterpillar Paving Products Inc. Systems and methods for operating a mobile machine using detected sounds
EP3627033A1 (en) * 2018-09-24 2020-03-25 Leuze electronic GmbH + Co. KG Device for protecting a hazardous area of a system
US11537119B2 (en) 2019-04-11 2022-12-27 Bastian Solutions, Llc Voice controlled material handling mobile robotic system

Also Published As

Publication number Publication date
CN102099614A (en) 2011-06-15
EP2288839A1 (en) 2011-03-02

Similar Documents

Publication Publication Date Title
WO2009155948A1 (en) System for safety protection of human beings against hazardous incidents with robots
US20200086497A1 (en) Stopping Robot Motion Based On Sound Cues
US9043025B2 (en) Systems and methods for safe robot operation
US9731421B2 (en) Recognition-based industrial automation control with person and object discrimination
EP2772811B1 (en) Recognition-based industrial automation control with confidence-based decision support
US9489730B2 (en) Method and device for safeguarding a hazardous working area of an automated machine
US9804576B2 (en) Recognition-based industrial automation control with position and derivative decision reference
EP3017920A1 (en) An industrial robot and a method for controlling an industrial robot
KR101785998B1 (en) Human access detecting system for preventing safety accident
CN102778858A (en) Device for operating an automated machine for handling, assembling or machining workpieces
WO2009155947A1 (en) Control system and method for control
EP2772812B1 (en) Recognition-based industrial automation control with redundant system input support
CN111223261B (en) Composite intelligent production security system and security method thereof
CN109527728A (en) A kind of method and intelligent wearable device for realizing life security monitoring
CN108482381A (en) Vehicle control system, vehicle and control method for vehicle
CN109544872A (en) A kind of safe avoidance detection of manipulator and alarm system and its operating method
US20140077955A1 (en) Safety device for a technical installation or a technical process
JP4606891B2 (en) Platform door condition monitoring system
CN113075438B (en) Electricity stealing detection method and device, electricity stealing prevention device and readable storage medium
CN113096660A (en) Personnel safety protection method and device, electronic equipment and storage medium
KR20110138711A (en) Security robot and method for controlling security robot
CN107765574B (en) A kind of acoustic control safety protection switch system
US10824126B2 (en) Device and method for the gesture control of a screen in a control room
WO2020181553A1 (en) Method and device for identifying production equipment in abnormal state in factory
US20230072474A1 (en) Power Distribution Equipment Personnel Safety System

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200880130118.9

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08773685

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008773685

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE