WO2005010840A1 - Procede de securisation automatique - Google Patents
Procede de securisation automatique
- Publication number
- WO2005010840A1 (application PCT/CH2004/000465)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- light
- sensor
- light source
- image area
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
- G08B13/19643—Multiple cameras having overlapping views on a single scene wherein the cameras play different roles, e.g. different resolution, different camera type, master-slave camera
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/1961—Movement detection not involving frame subtraction, e.g. motion detection on the basis of luminance changes in the image
Definitions
- the invention relates to a method and a device for automatic protection of systems.
- one possibility for protection is based on the light-barrier principle: a light source emits one or more light beams that strike one or more light receivers. If one or more of the beams is interrupted, an alarm is triggered.
- room surveillance, e.g. for robots or on autonomous vehicles, is a further application.
- a description of an alternative method for monitoring systems based on the known triangulation principle can be found in EP-A-1 300 691. According to this document, a pattern of light spots is projected onto a monitoring area. A receiving device determines whether light spots have moved away from a certain area, which indicates the presence of an object. Disadvantages are the great expenditure on optical elements for generating a pattern of light spots and the inability to make quantitative statements about the object.
- the method and the device should preferably also enable the quantitative determination of properties of the monitored or interfering objects.
- the method according to the invention uses the triangulation principle, known per se, which, simply put, is based on the fact that the position of a light reflection on an object depends on the object's distance from the viewer, provided the direction from which the light source shines differs from the viewing direction.
- the triangulation principle is used here in such a way that it can be implemented cost-effectively with components such as a laser light source and an area image sensor, and with image processing techniques running on corresponding image processing computers.
- light is always understood to mean both visible light and electromagnetic radiation in the non-visible range.
- image and image processing are likewise not limited to images taken in the visible range; they can also be infrared images, etc.
- the invention is also based on the idea that, when monitoring with the aid of image processing means, quantitative statements can be made about the object and its 3D properties. This rests on the fact that, for known optical characteristics of the camera - or characteristics determined by a one-time, camera-type-specific calibration - an image position corresponds to a certain angle of incidence.
- the height or distance of the object can thus be determined from the angle of incidence. According to one embodiment, the displacement of the light pattern in the image is evaluated for this purpose.
- according to one embodiment, the limitation of accuracy imposed by the pixel resolution of the camera is overcome by performing a sub-pixel approximation.
- the method according to the invention avoids disadvantages of existing systems. It is optically very simple. All that is required are means for generating at least one light line - for example a laser line projector ("light knife") - and an image area sensor, for example a camera. No complex optical adjustment is needed.
- the image processing that follows the image acquisition requires a certain amount of electronics and programming effort. However, this is not significant for large production volumes, since electronic components - in contrast to precision optical instruments - can be manufactured inexpensively in large quantities, and in particular since no complex setup at each individual location is necessary.
- the approach according to the invention enables not only the detection of objects but also the detection of their properties.
- the deviation of the light pattern from a reference (for example, corresponding to the image of the light pattern when there is no object in the detection area of the image area sensor) is recorded quantitatively, and height or distance information is thus obtained. Together with speed information - available, for example, if the monitored space lies above a conveyor belt - the volume of an object can thus be determined.
- the invention is therefore not limited to systems in which only disturbing objects are detected, the detection of which causes an alarm signal or a machine to be switched off. Rather, it also includes the possibility of performing complex monitoring or even sorting functions.
- Figure 1 of the drawing shows a schematic of the process.
- FIG. 2 shows very schematically the basic principle of triangulation.
- FIG. 3 shows an illustration of the principle of sub-pixel approximation.
- Figure 4 shows an illustration of the principle of the difference-image calculation with two cameras.
- FIG. 1 schematically shows an area 4 to be monitored.
- an object O is located in the space to be monitored, that is to say on the surface 4 to be monitored.
- a three-dimensional representation of this object is to be generated, which in the example described then triggers an alarm.
- the space to be monitored is illuminated with the aid of a laser line projector ("light knife") 1, which per se projects a straight line of light; the direction of illumination must be offset relative to the axis of the camera that records the scene.
- the light line essentially crosses the entire area to be monitored and is, for example, not interrupted, or interrupted at most by short sections, so that the interruptions together make up at most a small fraction (e.g. one tenth) of the total width. This ensures that narrow objects and objects at the margin are not overlooked.
- the "light meter” is now reflected from the empty space as a line. If there is an object 0 in the room, this line is changed.
- instead of a single light line, it is also possible to use a plurality of light lines, parallel to one another or in any non-parallel arrangement, which complement one another or form a pattern, for example by crossing or continuing one another.
- the light line(s), of course, need not necessarily be straight.
- Different light patterns can be projected at different times and / or from independent light sources.
- suitable light sources are lasers, for example diode lasers, but also other preferably monochrome light sources (i.e. light sources with a limited bandwidth compared with the spectrum of visible light), for example light-emitting diodes (LEDs).
- the light pattern on the surface 4 serves as a reference, i.e. the pattern that would arise if a flat object lay on the surface 4. In this embodiment it is also possible to detect objects located behind the surface.
- An image area sensor 3 records the scene.
- the light pattern 2 arising from the light line is also visible on this recording.
- the signal from the image area sensor 3 is converted to a digital signal in the analog / digital conversion unit 5 and is thus available to the subsequent evaluation units as a digital signal.
- the image evaluation 6 detects the course of the light line 2 in the image and creates a 3-D model from it in the manner set out above.
- the control computer 7 recognizes the changed situation and, on the basis of this change, can generate an alarm signal 8.
- the image recorded during the alarm situation - by the image area sensor 3 or possibly by another camera - can be saved for documentation purposes.
- the height or distance of the object can be determined quantitatively as a function of the lateral position. The principle of triangulation is shown very schematically in FIG. 2. Owing to the presence of object O, the point of impact of the light, as perceived by the image area sensor 3, appears shifted in comparison with the state without object O (dashed line). Using trigonometric formulas, the height or distance of the object is calculated from the angle of incidence α and the viewing angle β as well as the known distance of the camera from the monitored plane.
- the specific formulas to be used are known from classic triangulation and are not listed again here.
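Since the document deliberately does not reproduce the classic triangulation formulas, the following is only an illustrative sketch under an assumed simplified geometry: the camera looks straight down from height H onto the monitored plane, the light line is projected vertically at lateral baseline b from the camera axis, and the camera is a pinhole with focal length f expressed in pixels. The function name and parameters are hypothetical, not from the source.

```python
def object_height(shift_px, cam_height, baseline, focal_px):
    """Object height from the observed pixel shift of the light line.

    Assumed geometry (illustration only): with no object, the line images
    at u0 = f*b/H; a point at height h images at u = f*b/(H - h), so the
    pixel shift is du = f*b*h / (H*(H - h)). Solving for h gives the
    expression below. H and b share one length unit; the result is in
    that same unit.
    """
    H, b, f, du = cam_height, baseline, focal_px, shift_px
    return du * H * H / (f * b + du * H)
```

Under this geometry the shift grows monotonically with object height, which is the qualitative behavior the figure describes; a real system would use the calibrated camera model instead of this idealization.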
- the present invention makes use of the insight that, owing to the camera optics, the angle β is expressed as a positional shift of the light pattern in the image.
- the deviation of the light pattern in the image from the reference can be determined by image processing means, and from this the angle β or, as explained, the height of the object can be inferred.
- the positional shift of the light pattern on the image recorded by the camera can be determined, for example, by evaluating which image points (pixels) the light pattern is mapped to.
- the image of a thin line will generally trigger signals not only in a single row of pixels but also in neighboring pixels. This can be used to determine the position with a resolution finer than the "pitch", i.e. the size given by the pixel spacing.
- the positions of all pixels in which a signal is measured are averaged, the averaging being weighted according to signal strength.
- the method can be used to determine precise position information with sub-pixel accuracy, even if the pattern is not exactly mapped to the center of a row of pixels. This is illustrated in FIG. 3, where the intensities 22 are plotted as a function of the x position of the pixels 21. The actual position 23 determined with the sub-pixel approximation is not in the center of the pixel with the highest intensity value.
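The intensity-weighted averaging described above can be sketched as follows; this is a minimal illustration, and the function name is not from the source.

```python
def subpixel_position(positions, intensities, threshold=0.0):
    """Weighted-centroid sub-pixel estimate of the line position.

    Averages the positions of all pixels whose intensity exceeds the
    threshold, weighted by signal strength, so the estimate need not
    fall on a pixel center.
    """
    pairs = [(x, i) for x, i in zip(positions, intensities) if i > threshold]
    total = sum(i for _, i in pairs)
    return sum(x * i for x, i in pairs) / total
```

As in FIG. 3, an asymmetric intensity distribution pulls the estimate away from the center of the brightest pixel.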
- a change in the brightness of the light line in the image can also indicate a lack of functionality.
- a suitable warning signal and / or a check can be triggered.
- with the approach according to the invention, the presence of an object - as an alarm case - can thus be distinguished from a system failure - as a malfunction.
- the image evaluation 6 can be carried out by simply setting a threshold value: if a signal is greater than this threshold value, the pixel belongs to the light line 2; otherwise it does not.
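A minimal sketch of such a threshold-based extraction, combined with the weighted averaging described earlier, assuming the image is a NumPy array with the line running roughly horizontally; the function name and the per-column evaluation are illustrative assumptions.

```python
import numpy as np

def extract_line(image, threshold):
    """Per-column line position via simple thresholding.

    Pixels brighter than the threshold are taken to belong to the light
    line; within each column their row indices are averaged, weighted by
    intensity. Columns without any line pixel yield NaN.
    """
    img = image.astype(float)
    weights = np.where(img > threshold, img, 0.0)
    rows = np.arange(img.shape[0])[:, None]
    col_sum = weights.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        pos = (rows * weights).sum(axis=0) / col_sum
    return pos  # one (possibly NaN) row coordinate per column
```

The resulting profile of row coordinates is exactly the deformed line course that the image evaluation 6 compares against the reference.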
- the system must function under a wide variety of lighting conditions; it also has to work in direct, competing solar radiation. For that to be the case, the light output of the light source 1 would have to be increased so much that a person's eye could be damaged by direct radiation.
- various measures can be taken to reduce the required light output so far that it becomes harmless to humans without endangering the functioning of the system. These measures are explained in more detail below:
- laser light sources, such as are usefully employed as light source 1, are monochrome.
- with an optical filter 9 - for example a band-pass or low-pass filter in front of the camera, tuned to the frequency range of the laser - most of the sunlight or other lighting can be suppressed.
- the optical filter thus favors the recognizability of the monochrome light source in relation to the ambient light.
- lasers can be used both in the visible and in the invisible range. The selection depends on the application: should the endangered area be visibly marked (e.g. machine security) or should it be invisible (e.g. access security)?
- the light source 1 can usually be modulated or pulsed; for a short period it can deliver 100 or 1000 times the continuous output. At the same time, modern cameras have an electronic shutter, which can be synchronized with the light pulse.
- the control computer 7 controls the light source via line 10 and the shutter of the camera via line 11.
- to suppress ambient light, an image can first be recorded with the light source 1 switched off and temporarily stored in an image memory 12.
- an image is then taken with the light source switched on and compared by the comparator 13 with the stored image (for example by forming a difference).
- in this way, strong lights in the recording area 4 that change only slowly - for example slowly in comparison with the recording frequency of the camera - can be suppressed.
- the image with the light source switched off can be taken once at the beginning of a work cycle or continuously, in alternation with illuminated periods.
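The difference formation between the lit frame and the stored dark frame might look like this; the clipping at zero assumes 8-bit images and is an illustrative detail, not specified in the source.

```python
import numpy as np

def ambient_suppressed(lit_frame, dark_frame):
    """Difference image for ambient-light suppression.

    Subtracts the frame taken with the light source off from the frame
    taken with it on; slowly changing ambient light cancels out, so what
    remains is (ideally) only the projected light line. Widening to a
    signed type before subtracting avoids uint8 wrap-around.
    """
    diff = lit_frame.astype(np.int16) - dark_frame.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```

The threshold-based line extraction described earlier can then be run on this difference image instead of on the raw frame.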
- if the reference image and the illuminated image are to be acquired essentially simultaneously, a method according to FIG. 4 is suitable.
- two sensors 3, 3 ' are available.
- one sensor 3 triggers image acquisition synchronously with the pulsing of the light source, while the other sensor 3' takes an image immediately afterwards, the image acquisition time being approximately identical for both sensors.
- Both sensor signals are converted into digital signals in the analog / digital converters 5, 5 '.
- the transformation unit 20 'transforms the signal of the second sensor 3' geometrically and in terms of light intensity to the geometric position of the first sensor 3.
- the two images are compared (e.g. by forming a difference) and then further evaluated analogously as described above.
- in the case of large differences in brightness, one or more additional sensor systems (each comprising an image area sensor with A/D converter, possibly its own optics and possibly means for reference formation) can help: while one sensor system with a first image area sensor is set to be very light-sensitive and overdrives for large input signals, a second, less light-sensitive sensor system with a second image area sensor can take over that range. The sensor system that provides the better signal statistics is used for further processing. Alternatively or additionally, the images recorded by the two sensor systems can be combined, for example after correction or geometric transformation, e.g. by addition. The acquisition of the differently exposed images can, but need not, take place simultaneously.
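One way to realize the selection between the two sensitivities is an assumed per-pixel fallback: use the sensitive image where it is not overdriven, otherwise the less sensitive one rescaled. The gain ratio and saturation threshold below are illustrative assumptions, not values from the source.

```python
import numpy as np

def combine_sensitivities(high_gain, low_gain, gain_ratio, saturation=250):
    """Combine a very light-sensitive and a less sensitive sensor image.

    Where the high-gain image is near saturation (overdriven), fall back
    on the low-gain image scaled by the known gain ratio, so that both
    dark and very bright regions keep usable signal.
    """
    scaled_low = low_gain.astype(float) * gain_ratio
    return np.where(high_gain < saturation, high_gain.astype(float), scaled_low)
```

The images are assumed to be geometrically registered (e.g. by the transformation unit described above) before they are combined.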
- more than one sensor system can also be used in situations in which an object's shadow can lead to areas that cannot be monitored, or in order to detect simultaneously the height and another geometric dimension of the object.
- in such cases, at least two equivalent sensor systems are preferably used simultaneously.
- the device described here can be used to detect a change and thus for simple machine protection.
- the system can be calibrated; the height or distance of object O can thus be calculated directly from the displacement of the light line 2 in the image. If the geometric arrangement is known, the height or distance relative to the recording system can also be calculated. If object O moves at a known speed (e.g. on a conveyor belt for parts or people), the volume of object O can also be estimated with this method.
- the image processing device can also be provided with means for specifying an object speed in addition to the above.
- the object can be conveyed, for example, in a plane containing the surface 4 and in a direction approximately perpendicular to the direction of the light line.
- the volume can be determined in the image processing means itself or in a control and / or evaluation computer.
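The volume estimate from successive height profiles and a known conveyor speed can be sketched as follows; the function and parameter names are illustrative, and all units are the caller's (e.g. mm, frames/s and mm/s give mm³).

```python
import numpy as np

def object_volume(height_profiles, belt_speed, frame_rate, pixel_width):
    """Estimate an object's volume from successive height profiles.

    Each profile is the array of heights along the light line for one
    frame. Between frames the conveyor advances belt_speed / frame_rate,
    so each profile contributes a slice of that thickness whose
    cross-sectional area is the integrated height profile.
    """
    slice_thickness = belt_speed / frame_rate
    cross_sections = [np.sum(p) * pixel_width for p in height_profiles]
    return float(np.sum(cross_sections) * slice_thickness)
```

This is the "height profile times speed" idea stated above; a practical system would additionally segment the object from the background before summing.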
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CH12982003A CH697014A5 (de) | 2003-07-25 | 2003-07-25 | Absicherung und/oder Ueberwachung von Systemen. |
CH1298/03 | 2003-07-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005010840A1 true WO2005010840A1 (fr) | 2005-02-03 |
Family
ID=34085304
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CH2004/000465 WO2005010840A1 (fr) | 2003-07-25 | 2004-07-23 | Procede de securisation automatique |
Country Status (2)
Country | Link |
---|---|
CH (1) | CH697014A5 (fr) |
WO (1) | WO2005010840A1 (fr) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4908704A (en) * | 1987-12-11 | 1990-03-13 | Kabushiki Kaisha Toshiba | Method and apparatus for obtaining an object image and distance data of a moving object |
DE4405849A1 (de) * | 1993-02-26 | 1994-09-01 | Murata Machinery Ltd | Verfahren zum Aufnehmen drei-dimensionaler Bilder von Ladungen |
DE19938639A1 (de) * | 1999-08-14 | 2001-02-22 | Pilz Gmbh & Co | Vorrichtung zur Absicherung eines Gefahrenbereichs, insbesondere des Gefahrenbereichs einer automatisiert arbeitenden Maschine |
DE10049366A1 (de) * | 2000-10-05 | 2002-04-25 | Ind Technik Ips Gmbh | Verfahren zum Überwachen eines Sicherheitsbereichs und entsprechendes System |
- 2003
  - 2003-07-25 CH CH12982003A patent/CH697014A5/de not_active IP Right Cessation
- 2004
  - 2004-07-23 WO PCT/CH2004/000465 patent/WO2005010840A1/fr active Application Filing
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1722151A1 (fr) * | 2005-05-12 | 2006-11-15 | Sick Ag | Dispositif pour sécuriser une zone dangereuse |
US7563039B2 (en) | 2005-05-12 | 2009-07-21 | Sick Ag | Apparatus for securing a dangerous zone |
ITPD20080320A1 (it) * | 2008-11-05 | 2010-05-06 | Marss S R L | Sistema antifurto |
US9057779B2 (en) | 2011-04-01 | 2015-06-16 | Cedes Ag | Sensor device, safety device, door and method for monitoring the movement |
Also Published As
Publication number | Publication date |
---|---|
CH697014A5 (de) | 2008-03-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2558886B1 (fr) | Dispositif de surveillance d'au moins une zone de sécurité tridimensionnelle | |
DE102010037744B3 (de) | Optoelektronischer Sensor | |
EP0902402B1 (fr) | Procédé et appareil pour la surveillance optique d'un espace | |
DE102009031732B3 (de) | Entfernungsmessender optoelektronischer Sensor | |
DE10360174B4 (de) | Vorrichtung zur Überwachung eines Erfassungsbereichs an einem Arbeitsmittel | |
EP1300691B1 (fr) | Procédé de surveillance et capteur opto-électronique | |
EP1813961B1 (fr) | Dispositif destiné à la surveillance optoélectronique d'objets | |
DE4222920A1 (de) | Bildverarbeitung verwendendes ueberwachungsmonitorsystem | |
EP2306426B1 (fr) | Dispositif de détection de véhicules sur une voie de circulation | |
DE60006411T2 (de) | Zählvorrichtung | |
WO2019114889A1 (fr) | Détection d'environnement 3d par projection d'une série de motifs pseudo-aléatoires et au moyen de modules de caméra stéréo | |
WO1999043150A1 (fr) | Systeme de poursuite de camera pour studio de television ou de video virtuel | |
DE10033608A1 (de) | Verfahren und Vorrichtung zum Absichern eines Gefahrenbereichs, insbesondere des Gefahrenbereichs einer automatisiert arbeitenden Maschine | |
EP1065521A2 (fr) | Système de surveillance optoélectronique | |
DE10055689B4 (de) | Verfahren zum Betrieb eines optischen Triangulationslichtgitters | |
DE102016110514B4 (de) | Vorrichtung und Verfahren zum Überwachen eines Raumbereichs, insbesondere zum Absichern eines Gefahrenbereichs einer automatisiert arbeitenden Anlage | |
DE102013007961B4 (de) | Optisches Messsystem für ein Fahrzeug | |
WO2007036553A1 (fr) | Procede et dispositif de prise de vue a distance | |
EP0448803B1 (fr) | Système de commande d'affichage vidéo | |
WO2005010840A1 (fr) | Procede de securisation automatique | |
DE29520980U1 (de) | Vorrichtung zum Testen eines lichtempfindlichen Matrix-Sensors, insbesondere einer elektronischen Kamera mit CCD-Matrix-Sensor oder CMOS-Matrix-Sensor | |
EP1865755A2 (fr) | Dispositif destiné à la commande d'un éclairage | |
DE10113413A1 (de) | Verfahren zur optoelektronischen Überwachung von Gefahrbereichen vor führerlosen, automatischen Transportsystemen | |
DE102007036632B4 (de) | Optischer Sensor und Verfahren zum Nachweis von Objekten in einem Überwachungsbereich | |
DE102019117849B4 (de) | Erfassung eines Objekts in einem Überwachungsbereich |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
122 | Ep: pct application non-entry in european phase |