IES20160070A2 - Monitoring system - Google Patents

Monitoring system

Info

Publication number
IES20160070A2
IES20160070A2
Authority
IE
Ireland
Prior art keywords
area
video cameras
images
head
machine
Prior art date
Application number
IES20160070A
Inventor
Terenzi Claudio
Bontempi Agostino
Original Assignee
Wide Automation S R L
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wide Automation S R L filed Critical Wide Automation S R L
Publication of IES20160070A2 publication Critical patent/IES20160070A2/en
Publication of IES86722B2 publication Critical patent/IES86722B2/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/24Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
    • B23Q17/2433Detection of presence or absence
    • B23Q17/2438Detection of presence or absence of an operator or a part thereof
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16PSAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
    • F16P3/00Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
    • F16P3/12Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
    • F16P3/14Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact
    • F16P3/142Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact using image capturing devices
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/406Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
    • G05B19/4061Avoiding collision or forbidden zones
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B9/00Safety arrangements
    • G05B9/02Safety arrangements electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/37Measurements
    • G05B2219/37572Camera, tv, vision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/49Nc machine tool, till multiple
    • G05B2219/49141Detect near collision and slow, stop, inhibit movement tool

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)
  • Iron Core Of Rotating Electric Machines (AREA)

Abstract

Described is a system for supervising an area (A) close to a processing machine (M), with the machine (M) having at least one movable operating head (2) in the area (A) or close to it, comprising, in combination: a first viewing head (TV1), equipped with a first video camera (T1), a second video camera (T2), and a third video camera (T3), each positioned at a predetermined distance from the others in such a way as to acquire images of the area (A); and a processing unit (3), connected to the video cameras (T1, T2, T3) to receive the images. <Figure 1>

Description

Introduction
This invention relates to a monitoring system, in particular for monitoring areas close to a machine tool.
In the field of machine tools, several systems are known for allowing the monitoring, that is, the supervision, of an area adjacent to a machine tool (which may be used by an operator).
The aim of these monitoring systems is to monitor a predetermined operating area, close to the machine tool, to identify the presence of a person inside the operating area and, if necessary, consequently stop the machine tool.
The above-mentioned supervision systems therefore allow operator safety to be increased, as the machine is stopped if an operator approaches it under conditions of potential danger.
However, there has been a long-felt need for a supervision system which is particularly reliable, that is to say, one in which the generation of undesired alarms is reduced and in which, at the same time, the risk of failing to identify an operator inside the supervised area is also reduced.
The aim of this invention is therefore to overcome the above-mentioned drawbacks by providing a monitoring system and a monitoring method which are efficient and reliable.
Summary of the Invention
According to the invention, this aim is achieved by a monitoring system comprising the technical features described in one or more of the appended claims.
Brief Description of the Drawings
The technical characteristics of the invention, with reference to the above aims, are clearly described in the claims below and its advantages are apparent from the detailed description which follows, with reference to the accompanying drawings which illustrate a preferred embodiment of the invention, provided merely by way of example without restricting the scope of the inventive concept, and in which:
- Figure 1 shows a schematic view of an application of the monitoring system according to the invention;
- Figure 2 shows a detail of the monitoring system according to the invention.
Detailed Description of the Preferred Embodiments
With reference to the accompanying drawings, the numeral 1 denotes a system for monitoring an area A close to a processing machine M.
The term “processing machine M” is used to mean any machine for carrying out processes or treatments on objects or products, such as, by way of example, a machine tool.
It should be noted that these machines generally have at least one movable element 2, which usually carries the processing tool.
This movable element 2 comprises an operating head 2 (these two terms are used below without distinction).
In the example shown in Figure 1 the label P denotes a part being processed and numeral 2 denotes a movable element forming part of the machine M.
It should be noted that the element 2 is movable along a direction X.
The system 1 comprises a first viewing head TV1, provided with a first video camera T1, a second video camera T2, and a third video camera T3, each positioned at a predetermined distance from each other in such a way as to acquire images of the area A.
Preferably, the video cameras (T1, T2, T3) of the first viewing head TV1 have parallel optical axes.
It should be noted that the video cameras (T1, T2, T3) are positioned in such a way that, in pairs, they allow the acquisition of three-dimensional images of the area A.
Preferably, as illustrated in Figure 1, the system also comprises a second viewing head TV2 positioned, in use, in a predetermined reciprocal position for acquiring images of the area A from a different angle.
The second viewing head TV2 has the same technical and functional features as the first viewing head TV1, to which reference will therefore be made below.
The video cameras (T4, T5, T6) of the second viewing head (TV2) are set at an angle relative to the video cameras (T1, T2, T3) of the first viewing head (TV1): in other words, the video cameras of the two viewing heads (TV1, TV2) take images of the area A on different optical paths.
Preferably, the video cameras (T4, T5, T6) of the second viewing head TV2 are positioned in front of (facing) the video cameras (T1, T2, T3) of the first viewing head TV1, as illustrated in Figure 1.
It should be noted that the presence of the second viewing head TV2 is optional and the advantages of using the second viewing head TV2 are described in more detail below.
It should be noted that the video cameras (T1, T2, T3) of the first viewing head TV1 are positioned inside a box-shaped container 5.
Preferably, the box-shaped container 5 comprises a support 6 designed for resting on the ground or for connecting to one or more elements of the machine M.
According to the invention, the system 1 comprises a processing unit 3, connected to the video cameras (T1, T2, T3) for receiving the images and configured for: a) analysing, in pairs, the images of the video cameras (T1, T2, T3) of the first viewing head TV1 for identifying the presence of an object in the area A; b) acquiring a position of the operating head 2 in the area A; c) providing a signal SA for stopping the machine M as a function of the relative position of the operating head 2 with respect to the object detected in the area A.
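The three operations a)–c) performed by the processing unit 3 can be summarised as a single monitoring cycle. The sketch below is illustrative only, not the patented implementation: the per-pair detection results, the danger threshold, and all names are assumptions introduced for clarity.

```python
# Illustrative sketch only (not the patented implementation) of the cycle
# a)-c) carried out by the processing unit 3. The per-pair detections, the
# danger threshold and all names here are assumptions.
from math import dist

def supervise_cycle(pair_detections, head_position, danger_distance=1.5):
    """pair_detections: one entry per camera pair, either None or the
    (x, y) position of a detected object; head_position: position of the
    operating head 2, read from the machine (operation b).
    Returns True when the stop signal SA must be emitted (operation c)."""
    # a) combine the pairwise analyses (here with the OR logic of the
    #    first embodiment: any pair seeing an object counts)
    object_position = next((p for p in pair_detections if p is not None), None)
    if object_position is None:
        return False
    # c) stop as a function of the head/object relative position
    return dist(head_position, object_position) < danger_distance

# A person detected 0.5 m from the operating head triggers the stop:
assert supervise_cycle([None, (1.0, 0.5), None], head_position=(1.0, 0.0)) is True
```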
Preferably, the processing unit 3 is integrated in the first viewing head TV1.
It should be noted that the operation b) for acquiring the position of the movable element 2 in the area A can comprise a reading of the position directly by the machine M or can comprise an analysis of the images acquired by the video cameras to obtain the position of the movable element 2 in the area A.
For this reason, the first viewing head TV1 may comprise an interface connected to the control unit of the machine, for acquiring the position of the movable element 2 in the area A or, alternatively, it may comprise a module (hardware or software) for processing images configured to derive the position of the movable element 2 in the area A, through the analysis of one or more pairs of the video cameras (T1, T2, T3) of the first viewing head TV1.
According to another aspect, the system 1 comprises a control unit for interfacing with the machine M connected to the processing unit 3 and to the machine M, for receiving the stop signal from the processing unit 3 and sending a control signal to the control unit (for example to the PLC) of the machine M.
Alternatively, the processing unit 3 may integrate directly inside it a communication module configured to interface with the machine M, for sending a control signal to the control unit (for example to the PLC) of the machine M.
In other words, the processing unit 3 is equipped with a communication interface (communication module) with the machine M.
Preferably, the communication interface is of the two-way type (to allow commands to be sent to the machine M and information to be received from the machine M, regarding or not the commands).
According to another aspect, the first viewing head TV1 comprises at least one ultrasonic sensor U1 (which integrates an ultrasound emitter and receiver).
The ultrasonic sensor U1 is associated with (fixed to) the viewing head TV1.
It should also be noted that the ultrasonic sensor U1 is connected to the processing unit 3.
The processing unit 3 is configured for processing the signal of the ultrasonic sensor U1, to identify the presence of an object in the area A and to compare the outcome of the processing of the signal of the ultrasonic sensor U1 with the outcome of the analysis, in pairs, of the images of the video cameras (T1, T2, T3).
Advantageously, the ultrasound sensor increases the reliability of the system in detecting objects inside the supervised area: in effect, false alarms are reduced, as are the cases of undetected intrusions into the area under surveillance.
It should be noted that, according to a first embodiment, the control unit 3 is configured to identify the presence of an object in the area A by analysing the images of all the pairs of video cameras (T1, T2, T3) of the first viewing head according to a processing logic of the OR type.
In other words, it is sufficient for the presence of an object to be detected with a single pair of video cameras for the processing unit to derive the presence of an object in the area A under surveillance.
It should be noted that, according to a second and different alternative embodiment, the control unit 3 is configured to identify the presence of an object in the area A by analysing the images of all the pairs of video cameras (T1, T2, T3) of the first viewing head according to a processing logic of the AND type.
In other words, it is necessary for the presence of an object to be detected with all the video cameras for the processing unit to derive the presence of an object in the area A under surveillance.
It should be noted that, irrespective of the processing logic adopted (OR/AND) for analysing the signal of the pairs of video cameras (T1, T2, T3), it is possible to analyse the signal of the ultrasonic sensor U1 in combination with the outcome of the analysis of the images according to different logics, as described in more detail below.
According to a first embodiment, the processing unit 3 is configured for identifying the presence of an object in the area A by combining, according to an AND type processing logic, the outcome of the processing of the pairs of video cameras (T1, T2, T3) with that of the ultrasound sensor.
In other words, in order for the presence of an object to be determined in the area A subject to monitoring, it is necessary that the presence of an object is identified by the video cameras and by the ultrasonic sensor U1.
According to a second embodiment (regardless of the processing logic adopted OR or AND for analysing the signal of the pairs of video cameras T1 ,T2,T3), the processing unit 3 is configured for identifying the presence of an object in the area A by combining, according to an OR type processing logic, the outcome of the processing of the pairs of video cameras (T1, T2, T3) with that of the ultrasound sensor.
In other words, in order for the presence of an object to be determined in the area A subject to monitoring, it is sufficient that the presence of an object is identified by the video cameras (T1, T2, T3) or by the ultrasonic sensor U1.
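The OR/AND combinations described above can be condensed into a few lines of illustrative Python. Boolean flags stand in for the real sensor processing, and the function names are hypothetical, not from the patent.

```python
# Illustrative sketch of the detection logics: `pair_logic` selects how the
# three camera-pair outcomes are combined (OR = first embodiment, AND =
# second embodiment), and `fusion` selects how that outcome is combined
# with the ultrasonic sensor U1.

def cameras_detect(pair_results, pair_logic="OR"):
    """Combine the outcomes of the camera pairs (T1,T2; T2,T3; T1,T3)."""
    return any(pair_results) if pair_logic == "OR" else all(pair_results)

def object_present(pair_results, ultrasonic_hit, pair_logic="OR", fusion="AND"):
    """Fuse the camera outcome with the ultrasonic sensor outcome."""
    cams = cameras_detect(pair_results, pair_logic)
    return (cams and ultrasonic_hit) if fusion == "AND" else (cams or ultrasonic_hit)

# With AND fusion, a camera alarm not confirmed by ultrasound is suppressed:
assert object_present([True, False, False], ultrasonic_hit=False) is False
```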
It should be noted that, advantageously, through the presence of the ultrasonic sensors the reliability of the system is further increased, since the condition of presence of an object is based on the processing of two different types of sensors (optical / ultrasound).
According to another aspect, the system 1 further comprises at least one lighting source, associated with (fixed to) the first viewing head TV1 and configured to illuminate the area under surveillance.
Preferably, the at least one lighting source comprises at least one LED (L1, L2, L3, L4).
Still more preferably, the first head TV1 comprises a plurality of LEDs (L1, L2, L3, L4).
In this way, advantageously, the area A subject to surveillance is illuminated (with a predetermined light pattern).
Preferably, the LED (L1, L2, L3, L4) is of the infrared type.
Further, according to another aspect, the LED (L1, L2, L3, L4) has a high luminous power.
According to this aspect, the system 1 comprises a control unit of the LED (L1, L2, L3, L4), configured for feeding pulses to the LED (L1, L2, L3, L4).
Preferably, the control unit is integrated in the viewing head TV1; still more preferably, the control unit is integrated in the processing unit 3.
According to another aspect, the control unit is configured to activate the LED (L1, L2, L3, L4) in a synchronous fashion with the acquisition of images of a pair of video cameras (T1, T2, T3).
In this way, advantageously, a very high quantity of light is emitted for short periods of time, so that the lighting does not create disturbance and the LED can therefore fall within the class of safety devices which are not hazardous to eyesight.
It should be noted, therefore, that this aspect allows an increase in the overall reliability of the detection.
Preferably, if the system is fitted with two measuring heads, a first and a second, the control unit of each measuring head is configured to measure the luminous radiation emitted by the LEDs of the other viewing head.
It should be noted that, advantageously, the control unit of each measuring head is configured to detect and monitor the position of the LEDs of the other viewing head (by analysing the image): this allows the system to constantly check that the mutual position of the heads is correct, that is to say, that for which the calibration has been performed.
In effect, if one of the two measurement heads is moved accidentally (impact, tampering etc.), the system would immediately detect a change of position of the lighting points of the LEDs of the other viewing head.
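This mutual check can be sketched as a comparison between the LED image positions stored at calibration time and those currently detected. The pixel threshold and function names below are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of the mutual-calibration check: each head compares
# the detected image positions of the other head's LEDs with those stored
# at calibration time (the 2-pixel threshold is an assumption).
from math import dist

def heads_still_aligned(calibrated_led_px, detected_led_px, max_shift_px=2.0):
    """True if every LED lighting point is still where calibration placed it."""
    return all(dist(a, b) <= max_shift_px
               for a, b in zip(calibrated_led_px, detected_led_px))

# An accidental impact that moves one head shifts its LEDs in the image:
assert heads_still_aligned([(100, 50), (140, 50)], [(130, 80), (170, 80)]) is False
```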
A preferred embodiment of the system 1 and of the supervision operations actuated by it is described below in detail.
It should be noted that the description of the system 1 and of the operations actuated by it must not be considered as limiting the invention but must be considered purely as an example.
The monitoring steps performed by a pair of video cameras (T1, T2; T2, T3; T1, T3) of the first viewing head TV1 will be described below: the same comments apply with regard to any other pair of video cameras.
The monitoring comprises acquiring a pair of images of the area A using two video cameras of the first viewing head (TV1).
The two images are, preferably, synchronised with each other, that is, acquired at the same moment in time.
Subsequently, an analysis is carried out on the two images acquired for identifying the presence of an object in the area A.
The analysis comprises the generation of a map of the distances.
The step of generating a map of the distances comprises a step of generating a map, that is, a matrix, on the basis of a comparison between the two images acquired.
The map, or matrix, comprises the position (in the world) of each pixel (or, more generally, portion of the image) with respect to a reference system shared by the video cameras (T1, T2, T3).
This map is generated on the basis of an analysis of the differences.
Preferably, a map is generated containing the coordinates, in the world (that is, with respect to a reference system shared by the two video cameras), of each pixel.
Preferably, the map contains the coordinates of each pixel in three orthogonal directions:
- a direction X substantially parallel to the optical axes of the video cameras;
- a direction Z substantially orthogonal to the ground in the area;
- a direction Y orthogonal to the previous directions.
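For a pair of cameras with parallel optical axes, the map of the distances can be sketched with a standard pinhole/disparity model. The focal length, baseline, and axis conventions below are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of the distance-map step for one camera pair with
# parallel optical axes. Focal length (pixels) and baseline (metres) are
# assumed values; disparity is the per-pixel shift between the two images.
import numpy as np

def distance_map(disparity, f_px=700.0, baseline_m=0.10):
    """Per-pixel coordinates (X depth, Y lateral, Z vertical) in the
    reference system shared by the two cameras."""
    h, w = disparity.shape
    u = np.arange(w) - w / 2.0            # horizontal offset from the centre
    v = np.arange(h)[:, None] - h / 2.0   # vertical offset (grows downwards)
    X = f_px * baseline_m / np.maximum(disparity, 1e-6)  # depth along the axis
    Y = X * u / f_px                      # lateral position
    Z = -X * v / f_px                     # height above the optical axis
    return np.stack([X, Y, Z], axis=-1)

# A uniform 7-pixel disparity maps to a constant-depth plane at 10 m:
m = distance_map(np.full((4, 4), 7.0))
assert np.allclose(m[..., 0], 700.0 * 0.10 / 7.0)
```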
Then there is an extraction and classification of the obstacles.
For this reason, the control unit 3 is configured to perform an extraction and classification of the obstacles.
More specifically, the control unit 3 is configured to extract and classify the obstacles present in the area A subjected to supervision.
The analysis of the map of the distances comprises a step of searching, in the map of the distances, for adjacent portions O1 having substantially the same distance value along the direction X moving away from the video cameras, that is, along the direction which is substantially parallel to the optical axes of the video cameras (T1, T2; T2, T3; T1, T3).
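A minimal sketch of this search, assuming the map is a grid of X distances: adjacent cells whose depth differs by less than a tolerance are grouped into one candidate obstacle. This is a plain 4-connected flood fill with illustrative thresholds, not the patented algorithm.

```python
# Illustrative sketch of the obstacle-extraction step: group adjacent map
# cells whose depth X is nearly equal (4-connected flood fill; the `tol`
# and `min_cells` thresholds are assumptions).

def extract_obstacles(depth, tol=0.05, min_cells=3):
    """depth: 2-D list of per-cell X distances. Returns one list of
    (row, col) cells per candidate obstacle."""
    h, w = len(depth), len(depth[0])
    seen, obstacles = set(), []
    for r0 in range(h):
        for c0 in range(w):
            if (r0, c0) in seen:
                continue
            stack, blob, ref = [(r0, c0)], [], depth[r0][c0]
            while stack:
                r, c = stack.pop()
                if (r, c) in seen or not (0 <= r < h and 0 <= c < w):
                    continue
                if abs(depth[r][c] - ref) > tol:
                    continue  # different distance: belongs to another portion
                seen.add((r, c))
                blob.append((r, c))
                stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
            if len(blob) >= min_cells:
                obstacles.append(blob)
    return obstacles

# Two flat regions at different depths yield two candidate obstacles:
depth = [[10, 10, 2], [10, 10, 2], [2, 2, 2]]
assert len(extract_obstacles(depth)) == 2
```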
Further, the supervision comprises analysing several images, acquired, respectively, with different pairs of video cameras of the first viewing head.
More precisely, since the viewing head consists of three video cameras (T1, T2, T3), it is possible to perform the analysis on three different pairs (T1, T2; T2, T3; T1, T3) of video cameras.
According to this aspect, there is a step for comparing the results of the analyses performed on each pair of video cameras to check for the presence of an object in the area A in pairs of different video cameras.
According to an embodiment of the system 1, the area A is subdivided into a plurality of regions (A1, A2, A3).
Each region (A1, A2, A3) corresponds to a portion of the area A subjected to monitoring: three different regions (A1, A2, A3) are shown in the example in Figure 1.
According to this embodiment, the processing unit 3 is configured for analysing the images of the video cameras (T1, T2) to identify in which region (A1, A2, A3) the object identified is positioned (the object is identified by the above-mentioned criteria).
Moreover, in the analysing step, the processing unit 3 is configured to associate the obstacle detected with the region (A1, A2, A3) identified.
It should be noted that, according to this embodiment, the signal SA for controlling (stopping) the machine M is provided by the processing unit 3 as a function of the relative position of the movable element 2 relative to the region (A1, A2, A3) identified.
In other words, according to this application example, each region can be:
- “enabled”, if a person and the operating head can be present simultaneously in the region without the stop signal being released;
- “prohibited”, if a person and the operating head cannot be present simultaneously in the region (this corresponds to a situation of potential danger) and the simultaneous presence of the head and the person in the region causes the release of the alarm signal.
It should be noted that the system 1, in particular the processing unit 3, can be configured in such a way that certain regions can be permanently prohibited or enabled or each region can be enabled or prohibited as a function of the position of the operating head: according to this latter possibility, the presence of the operating head in a region determines the prohibiting of the region whilst its absence determines the enabling of the region.
The following example, relative to Figure 1, will clarify the above.
Figure 1 shows the movable element 2 positioned in the region A1.
The entrance of a person in the region, in which the movable head or the operating head 2 is positioned, causes the machine to stop.
The region A1 is prohibited, as the operating head 2 is present there.
On the other hand, the entrance of a person in the region A3 does not cause the machine to stop as that condition is not the source of a potential danger, since the person in region A3 is quite far from the movable element 2.
It should be noted that the region A3, in the specific example, is enabled as the operating head 2 is not present there.
In this way the system 1 renders active a signal for stopping the machine as a function of the relative position of the movable element 2 and the object (or person) detected.
It should be noted that, generally, the processing unit 3 is configured to emit the stop signal if, on the basis of the analysis of the images, the movable element 2 is found to be near the person (object), as detected by analysing the images of the video camera, which corresponds to a situation of potential danger.
This condition of nearness, in the specific case, corresponds to the presence of the movable element 2 and the person (object) detected in the same region (A1, A2, A3), or in the immediately adjacent region (A1, A2, A3).
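The enable/prohibit rule of this example can be sketched as follows, assuming the three regions of Figure 1 sit side by side and "near" means the same region or the immediately adjacent one. The region layout and function names are assumptions for illustration.

```python
# Illustrative sketch of the region-based stop logic: the region holding
# the operating head 2 is prohibited, and a person in that region or an
# immediately adjacent one triggers the stop signal SA. A linear A1-A2-A3
# layout is assumed, matching the Figure 1 example.
REGIONS = ["A1", "A2", "A3"]

def stop_signal(head_region, person_region):
    """Emit SA when the detected person is in the head's region or in an
    immediately adjacent region; None means no person detected."""
    if person_region is None:
        return False
    gap = abs(REGIONS.index(head_region) - REGIONS.index(person_region))
    return gap <= 1

# Head in A1: a person entering A1 stops the machine, a person in A3 does not.
assert stop_signal("A1", "A1") is True
assert stop_signal("A1", "A3") is False
```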
Advantageously, the presence of two or more viewing heads makes the system particularly immune to false alarms, that is, more reliable.
These viewing heads (TV1, TV2) are, preferably, positioned one relative to the other in such a way that the respective video cameras take images from different angles.
It should be noted that, as shown in Figure 1, the two viewing heads (TV1, TV2) are preferably positioned facing each other.
Advantageously, the use of a second viewing head TV2 avoids potential shadow areas in the zone immediately in front of the video cameras of the first viewing head TV1.
The invention described above is susceptible of industrial application and may be modified and adapted in several ways without thereby departing from the scope of the inventive concept. Moreover, all the details of the invention may be substituted with technically equivalent elements.

Claims (6)

1. A system for monitoring an area (A) close to a processing machine (M), with the machine (M) having at least one movable operating head (2) in the area (A) or close to it, characterised in that it comprises, in combination: - a first viewing head (TV1), provided with a first video camera (T1), a second video camera (T2), and a third video camera (T3), each positioned at a predetermined distance from each other in such a way as to acquire images of the area (A); - a processing unit (3), connected to the video cameras (T1, T2, T3) for receiving the images and configured for: a) analysing, in pairs, the images of the video cameras (T1, T2, T3) for detecting the presence of an object in the area (A); b) acquiring a position of the operating head (2) in the area (A); c) providing a signal (SA) for stopping the machine (M) as a function of the relative position of the operating head (2) with respect to the object detected in the area (A).
2. The system according to claim 1, wherein the video cameras (T1, T2, T3) of the first viewing head (TV1) have parallel optical axes.
3. The system according to any one of the preceding claims, comprising at least one ultrasound sensor (U1), associated with the viewing head (TV1), and connected to the processing unit (3), the processing unit (3) being configured to process the signal of the at least one ultrasound sensor (U1), to detect the presence of an object in the area (A) on the basis of the signal and to compare the outcome of the processing of the signal of the ultrasound sensor (U1) with the outcome of the analysis, in pairs, of the images of the video cameras (T1, T2, T3) to derive a condition indicating the presence of an object in the area (A).
4. The system according to claim 3, wherein the control unit (3) is designed to derive a condition indicating the presence of an object in the area (A) if a presence of an object is detected both by analysing the images of the pairs of video cameras (T1, T2, T3) and by analysing the signal of the ultrasonic sensor (U1).
5. The system according to any one of the preceding claims, further comprising at least one lighting source, associated with the first viewing head (TV1) and configured to illuminate the area under surveillance.
IES20160070A 2015-03-05 2016-03-04 Monitoring system IES86722B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
ITBO2015U000018U ITBO20150018U1 (en) 2015-03-05 2015-03-05 MONITORING SYSTEM.

Publications (2)

Publication Number Publication Date
IES20160070A2 true IES20160070A2 (en) 2016-09-07
IES86722B2 IES86722B2 (en) 2016-11-02

Family

ID=55968046

Family Applications (1)

Application Number Title Priority Date Filing Date
IES20160070A IES86722B2 (en) 2015-03-05 2016-03-04 Monitoring system

Country Status (6)

Country Link
AT (1) AT15136U1 (en)
DE (1) DE202016101172U1 (en)
ES (1) ES1163783Y (en)
FR (1) FR3033419B3 (en)
IE (1) IES86722B2 (en)
IT (1) ITBO20150018U1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107263196B (en) * 2017-07-13 2020-03-24 台山市仁丰五金电器有限公司 Novel numerical control machine tool protection device and numerical control machine tool using same

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005028490C5 (en) * 2005-06-20 2016-07-14 Sick Ag camera system
DE102007014612A1 (en) * 2007-03-23 2008-09-25 TRüTZSCHLER GMBH & CO. KG Device for monitoring and securing hazardous areas on power-driven textile machines, in particular spinning preparation machines
BR112013002921A2 (en) * 2010-09-22 2016-05-31 Nippon Steel & Sumitomo Metal Corp restricted access intruder detection system, intruder detection device, restricted access intruder detection method and program thereof
US20120081537A1 (en) * 2010-10-04 2012-04-05 General Electric Company Camera protective system and method
ITRN20120036A1 (en) * 2012-07-09 2014-01-10 Wide Automation S R L SYSTEM AND SUPERVISORY PROCEDURE
DE102013104265A1 (en) * 2013-04-26 2014-10-30 Pilz Gmbh & Co. Kg Device and method for securing an automated machine
US10597053B2 (en) * 2013-05-08 2020-03-24 International Electronic Machines Corp. Operations monitoring in an area

Also Published As

Publication number Publication date
AT15136U1 (en) 2017-01-15
FR3033419B3 (en) 2017-06-09
ITBO20150018U1 (en) 2016-09-05
ES1163783Y (en) 2016-11-21
DE202016101172U1 (en) 2016-04-26
FR3033419A3 (en) 2016-09-09
IES86722B2 (en) 2016-11-02
ES1163783U (en) 2016-08-30

Similar Documents

Publication Publication Date Title
JP6866673B2 (en) Monitoring system, monitoring device, and monitoring method
US7995836B2 (en) Optoelectronic multiplane sensor and method for monitoring objects
CN105473927B (en) For the apparatus and method for the machine for ensureing automatically working
US8988527B2 (en) Method and apparatus for monitoring a three-dimensional spatial area
CN111226178B (en) Monitoring device, industrial system, method for monitoring and computer program
US20110050878A1 (en) Vision System for Monitoring Humans in Dynamic Environments
EP3279132B1 (en) System of monitoring handrail for a passenger conveyer device, a passenger conveyer device and monitoring method thereof
CN103257032B (en) For the system of the pixel performance in testing sensor array
US8107058B2 (en) Sensor device and system having a conveyor and a sensor device
US9933510B2 (en) Safety scanner and optical safety system
CN107662868A (en) Monitoring system, passenger transporter and its monitoring method of passenger transporter
WO2022142973A1 (en) Robot protection system and method
EP2685150B1 (en) Monitoring system and method
CN113739058A (en) Optoelectronic safety sensor and method for safeguarding a machine
US20240034605A1 (en) Safety device for self-propelled industrial vehicles
EP2466560A1 (en) Method and system for monitoring the accessibility of an emergency exit
IES20160070A2 (en) Monitoring system
KR100756008B1 (en) System for number recognition of container and a method thereof
US9083946B2 (en) System to detect failed pixels in a sensor array
CN110446944B (en) SPAD-based laser radar system
US11496215B2 (en) VLC in factories
KR102064712B1 (en) An equipment safety management system with a wearble device and a body detection module
CN113905854B (en) Method and system for monitoring a robotic device
US20230081003A1 (en) Sensor arrangement and method for safeguarding a monitored zone
CN116829986A (en) Hybrid depth imaging system