EP1061487A1 - Method and device for the automatic monitoring of a region of space - Google Patents

Method and device for the automatic monitoring of a region of space

Info

Publication number
EP1061487A1
Authority
EP
European Patent Office
Prior art keywords
region
volumetric map
volumetric
map
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP99830376A
Other languages
English (en)
French (fr)
Inventor
Marco Aste
Massimo Boninsegna
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Istituto Trentino di Cultura
Original Assignee
Istituto Trentino di Cultura
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Istituto Trentino di Cultura filed Critical Istituto Trentino di Cultura
Priority to EP99830376A
Publication of EP1061487A1
Current legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00: Burglar, theft or intruder alarms
    • G08B13/18: Actuation by interference with heat, light, or radiation of shorter wavelength; actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189: Actuation using passive radiation detection systems
    • G08B13/194: Actuation using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196: Actuation using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602: Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19639: Details of the system layout
    • G08B13/19641: Multiple cameras having overlapping views on a single scene
    • G08B13/19652: Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm

Definitions

  • the present invention relates in a general way to the automatic monitoring of a region of space, particularly as regards the detection and location of bodies present within this region.
  • the ability to detect and locate bodies present within a region of space is useful when it is necessary to monitor the presence of people or objects in particular regions or subregions.
  • the presence of people in defined areas may be judged to endanger the people themselves, or the objects present within that area.
  • a volume of space in the vicinity of a machine that may produce chips or liquids dangerous to man; or the volume represented by the radius of action of a mechanical arm; or regions of space close to equipment operating at high voltages; or more generally any region in which machines dangerous to man are operating may be considered dangerous.
  • hence the desirability of being able to monitor automatically and report the presence of people in a region of space regarded as dangerous for human activity.
  • Another example of an application is the protection of objects of artistic value in museum environments or in any situation in which the presence of people within a certain region can be a source of danger for the objects present in that area.
  • regions of space are monitored automatically by devices such as optical barriers of transmission and/or reflection type (typically based on infrared technology), physical barriers, pressure-sensitive mats, movement detectors based on microwaves, passive infrared or ultrasound, radar systems, and devices that use laser beams to detect the presence and position of objects.
  • the object of this invention is therefore to provide a solution for the automatic detection of bodies within a defined region that is simple, reliable, easily set up and capable of discriminating between objects for shape and volume while also considering the movement and relative path of the objects within the monitored area.
  • the solution according to the invention is capable of automatically detecting, locating and reporting the presence of bodies within a monitored region of space.
  • the invention is also capable of discriminating with a high degree of reliability between objects of different shapes and sizes and is therefore able to select, for monitoring purposes, only one particular type of body, e.g. only people.
  • the solution according to the invention is capable of detecting with a high degree of reliability the simultaneous presence of bodies, not necessarily of the same shape, in the monitored region, and selecting for monitoring purposes only those bodies that present certain distinctive features, for example only people or objects above a certain height.
  • the quantity of information obtainable with the solution according to the invention is very much greater than could be obtained by means of conventional techniques of automatic detection, and makes possible more reliable and robust location of bodies within the monitored area, overcoming the limitations from which known systems usually suffer and so enabling it to be adapted more successfully to different working conditions and hence giving it greater generality of use.
  • the solution according to the invention is able to carry out a volumetric analysis and so extract the characteristics of form, volume and dimensions which distinguish an object or person which it is wished to pick out from other artefacts or objects that should be inside the monitored region.
  • volumetric map means any representation of occupied volumes due to the presence of a body, in other words a representation of a three-dimensional map in which the regions of volumetric occupation introduced by the presence of bodies are indicated.
  • Such a map is obtained after image analysis procedures have been carried out using automatic methods known per se or according to the embodiments of the invention described below.
  • the reference S indicates a region of space in which it is wished to detect the presence of people A or objects B.
  • the region S may be bounded by physical walls , as for example in the case of a room or cage, or may consist simply of a portion of space bounded by an imaginary closed surface that separates a generic space in two regions, or it may be bounded partly by physical barriers, for example the floor, and partly by an imaginary surface.
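By way of illustration only (this sketch is not part of the patent text; the box-shaped imaginary surface, the coordinate convention and all names are assumptions), a membership test for a region bounded below by the physical floor and elsewhere by an imaginary closed surface might look like:

```python
def inside_region(point, x_range, y_range, z_max):
    """Membership test for a monitored region S bounded below by the
    physical floor (z = 0) and elsewhere by an imaginary box-shaped
    closed surface. Any closed surface could be substituted; a box
    keeps the test cheap. Units are assumed to be metres."""
    x, y, z = point
    return (x_range[0] <= x <= x_range[1]
            and y_range[0] <= y <= y_range[1]
            and 0.0 <= z <= z_max)
```

The same predicate can serve to define subregions of S that receive different treatment, in the spirit of classification G08B13/19652.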
  • the monitored region is, however, always a volume; it may be of any shape and can be defined simply and flexibly according to need.
  • the volumes occupied by the bodies (such as bodies A, B visible in Figures 1 and 3) that are present in the monitored region S are found by using two or more video cameras (acting as image signal generating means) installed in such a way that the region S is in the visual field of at least two video cameras 2, as shown for example in Figures 1 and 3 (the latter figure referring to a solution in which four video cameras 2 are used). It is advisable for the video cameras 2 to be so positioned as to avoid occlusions due to the movement of objects on the same plane; for example, for bodies of different heights standing on the floor and moving about, it is preferable to have views from above.
  • the signals (of analogue type or already directly converted into digital form) output by the video cameras 2 are sent to a processing unit 1 which may be a specialized processor or, in the currently preferred embodiment of the invention, a computer such as a programmed personal computer (known per se) in order to extract from the images the shapes of the bodies A and B present within the region S to be monitored.
  • the object here is to check for the possible presence of bodies not inherently belonging to the monitored region.
  • Figures a1 and b1 included in Figure 2 and Figures a1, b1, c1 and d1 included in Figure 4 show the images produced by the two video cameras depicted in Figure 1, on the one hand, and by the four video cameras depicted in Figure 3, on the other.
  • the unit 1 is able, in accordance with known principles (typically by using known image processing algorithms), to extract respective sets of data representing the abovementioned shapes of the bodies present within the region S.
  • Figures a2 and b2 of Figure 2 and Figures a2, b2, c2 and d2 of Figure 4 show the shapes, marked C, of the body A present within the region S corresponding to images a1 and b1, or a1, b1, c1 and d1, produced, from their respective points of observation, by the video cameras 2.
  • the abovementioned shapes correspond only to bodies possessed of movement in themselves, thus eliminating - for the purposes of the subsequent processing - information corresponding to fixed parts of the image, such as the outlines of the region S represented by broken lines in parts a2 and b2, and a2, b2, c2 and d2 in Figures 2 and 4, or to the object B.
  • the region S is a room in a museum
  • the body A is the body of a visitor moving about in the room
  • the body B is a bench situated - in a fixed position of course - in the centre of the room.
  • objects having their own movement can be distinguished from fixed objects/items in such a way as to avoid, on the one hand, deception relating to the presence of an intruder body attempting to evade detection by moving very slowly and, on the other hand, the generation of false objects connected with, for example, vibratory movements, draughts, etc.
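A minimal sketch of such motion-based discrimination, assuming simple frame differencing against a slowly updated reference image (the thresholds, the persistence counter and the update rate are illustrative choices, not taken from the patent):

```python
import numpy as np

def moving_foreground(frames, diff_thresh=25, persistence=3):
    """Label a pixel as 'moving' only if it has differed from a slowly
    updated reference in at least `persistence` consecutive frames.
    The persistence requirement suppresses one-off noise (vibrations,
    draughts), while the very slow reference update means that even a
    body moving very slowly keeps differing from the reference long
    enough to be detected rather than absorbed into the background."""
    reference = frames[0].astype(np.float64)
    hits = np.zeros(frames[0].shape, dtype=int)
    for frame in frames[1:]:
        diff = np.abs(frame.astype(np.float64) - reference) > diff_thresh
        hits = np.where(diff, hits + 1, 0)   # reset counter where static
        reference = 0.99 * reference + 0.01 * frame
    return hits >= persistence
```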
  • the processing unit 1 can be programmed to find matches on the images between projections of the same real-world points belonging to the detected bodies. In this way, information is extracted from each image frame about the positions of the projections of the same real-world points onto the different image planes corresponding to the different observation points identified by the video cameras 2 (or equivalent image signal generating means).
  • the availability of a larger number of observation points, and hence of a larger number of images, gives a certain degree of redundancy which can be used to increase the accuracy and reliability of the detection action.
  • Figure 5 shows the currently preferred method of producing the volumetric map of bodies in the monitored region S.
  • the processing unit 1 is programmed, for the purposes of processing the signals generated by the video cameras 2, to divide up the entire volume of the region S into volumetric cells of fixed dimensions (some of these are shown diagrammatically at D1 and D2 in Figure 5), each cell corresponding to a portion of the real space inside the region S.
  • this is not of course a limiting option as it has to do with the spatial resolution with which it is wished to detect and locate the objects: the smaller the cells, the greater the resolution and vice versa.
  • each three-dimensional cell is projected onto the image plane of each video camera.
  • to each cell there corresponds a certain area E on each image, as shown at the top of Figure 5; the bottom part of that figure shows the location of the cells D1 and D2 within the three-dimensional Cartesian reference system used for locating the cells in question within the region S. Cells corresponding to regions of volume that are not covered by at least two video cameras are ignored.
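The projection of a cell onto an image plane can be sketched with a standard pinhole-camera model (the intrinsic matrix K and the pose (R, t) are assumed calibration data; the patent does not prescribe this particular parametrisation, and all names here are illustrative):

```python
import numpy as np

def cell_corners(x, y, z, side):
    """The 8 corners of the axis-aligned cubic cell whose minimum
    corner is at (x, y, z), in world coordinates."""
    offs = np.array([[i, j, k] for i in (0, 1) for j in (0, 1) for k in (0, 1)])
    return np.array([x, y, z]) + side * offs

def project_cell(corners, K, R, t):
    """Project the 8 corners of a volumetric cell onto a camera image
    plane with intrinsics K and pose (R, t). The cell's area E on the
    image can then be taken, e.g., as the bounding box of the returned
    pixel coordinates."""
    cam = R @ corners.T + t.reshape(3, 1)   # world -> camera frame
    pix = K @ cam                           # camera -> homogeneous pixels
    return (pix[:2] / pix[2]).T             # perspective divide, shape (8, 2)
```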
  • the volumetric map F (see Figure 6) is obtained by checking each cell to see whether the areas corresponding to that cell's projection on the different images represent some portion of the objects present within the monitored region. If the outcome of the check is positive, that cell is judged to be occupied (for example, D1 is an example of this); otherwise it is judged to be unoccupied, as in the case of the cells marked D2. In this way the set of all occupied cells in each frame provides information about the volumetric occupation of the monitored region. From this information a volumetric map of occupation can be constructed almost immediately by checking which and how many of the cells into which the monitored space has been divided are occupied by objects. The aim of all this is to obtain, as a result, the representation seen in Figure 6.
  • the volumetric map approximates to the volume of actual occupation with a resolution based on the dimensions of the cells D1, D2 and on the spatial resolution of the means employed to generate the image signals (in particular, in the case of video cameras composed of a matrix of sensitive points, the resolution in a particular direction is given by the ratio of the dimension of the observed region to the number of sensitive elements in that direction).
  • the dimensions of the cells are defined having regard not only to the resolution but also to the processing capacity of the unit 1 and to the frequency with which the surveillance information is to be updated. Also borne in mind is the fact that the overall degree of approximation can be enhanced, as already stated, by using more video cameras in suitable positions.
  • the processing unit 1 can be programmed, again in a known manner, to carry out a volumetric analysis of bodies for the purpose of recognizing the distinctive characteristics of shape and volume of the objects analysed.
  • the programming can be done by conventional algorithmic approaches, thus coding ab initio the characteristics of shape and volume of the bodies to be detected into the processing system, or using statistical learning techniques such as for example neural networks.
  • in a step 101 the unit 1 examines the data set corresponding to the images generated by the video cameras 2 (optionally already processed to refer only to moving objects) and in a second step 102 commences an action of scanning the region S in such a way as to scan the cells D into which the region S has been theoretically divided up.
  • each of these cells will be identified by three co-ordinates x i , y i , z i within the system x, y and z to which the bottom part of Figure 5 refers.
  • the unit 1 is also capable of detecting, for example, the entry into the region S of a body not previously present, the aim being to extend the scanning action to those cells (previously not included in the scanning action) which the body subsequently occupies.
  • the steps marked 1031, 1041; 1032, 1042; ...; 103n, 104n indicate successive processing stages, here shown as carried out in parallel, though in fact they can be performed serially, and therefore sequentially in time.
  • n is equal to 2, and to 4, in the illustrative embodiments shown in Figures 1 and 3, respectively
  • the unit 1 checks to see whether the cells corresponding to their respective images generated by the video cameras 2 can be regarded as occupied or unoccupied.
  • the results of the comparisons carried out in steps 1041, 1042, ..., 104n are processed in order to decide whether, on the basis of the image data, the scanned cell is to be regarded as occupied or unoccupied for the purposes of constructing the map of volumetric occupation.
  • attribution of the "occupied" value may be based on a different number of decision processes relating to the individual images than the number of images taken into consideration in attributing the "occupied" logic value to other cells.
  • the criterion used in attributing the logic value in question may be of unanimous type (the cell is judged to be occupied for the purposes of the construction of the map of volumetric occupation if and only if all the video cameras 2 whose images are taken into account produce data corresponding to occupation in the relevant image), majority type (the cell is judged to be occupied if the majority of video cameras 2 give data indicating occupation in the respective images), or correlation with the values attributed to adjacent cells (so that uncertainty in the attribution of the "occupied" value to a cell is resolved on the basis of confident values attributed to spatially adjacent cells) or different again, according to well-known criteria in the image processing field.
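Two of the fusion criteria just mentioned (unanimous and majority voting) can be sketched as follows; the correlation-with-adjacent-cells variant is omitted, and the function name is a hypothetical one:

```python
def cell_occupied(votes, rule="majority"):
    """Fuse per-camera occupancy decisions for one cell.

    votes: list of booleans, one per video camera whose image actually
           covers the cell (so the number of voters may differ from
           cell to cell, as the description allows).
    'unanimous': occupied only if every considered camera agrees;
    'majority':  occupied if more than half of them agree."""
    if rule == "unanimous":
        return all(votes)
    if rule == "majority":
        return sum(votes) * 2 > len(votes)
    raise ValueError(f"unknown rule: {rule}")
```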
  • the step 106 in Figure 7 represents simply the selection step where it is decided whether or not the scan of the region S (or of the scanned subregion thereof) can be said to be complete.
  • if the result of the comparison in step 106 is positive, this indicates that the map of volumetric occupation is complete.
  • the map itself, which can be represented as illustrated diagrammatically in Figure 6 (which must of course be understood to be a perspective representation of a data set which in reality is three-dimensional), is subjected to a processing step 107 for the extraction of a set of parameters P which represent in compact form the shape, volume, position and/or dimensions of the detected bodies.
  • the map of volumetric occupation is thus the set of occupied cells and their positions in space.
  • the position of the centre of mass of each volumetric map can be used to represent the position of the body
  • the dimensions of width, length and height of the smallest parallelepiped which inscribes the occupied volume can be used as dimensional parameters.
  • Requicha A.G. "Representation of rigid solids: theory, methods and systems", Comp. Surveys, vol. 12, pp. 437-464, 1980
  • Requicha A.G. and Rossignac J.R. “Solid modeling and beyond”, IEEE Computer Graphics and Applications, vol. 12, pp. 31-44, 1992
  • Aggarwaal J.K. and Cai Q. "Human motion analysis: a review", Proceedings of IEEE Computer Society Workshop on Motion of Non-Rigid and Articulated Objects, pp. 90-102, 1997.
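The two compact parameters described above, the centre of mass of the occupied cells and the smallest enclosing parallelepiped, can be sketched as follows (the cell-index representation and the function name are assumptions for illustration):

```python
import numpy as np

def map_parameters(occupied_cells, cell_side):
    """Compact parameters P of a volumetric map.

    occupied_cells: array of integer (i, j, k) cell indices;
    cell_side:      edge length of a cell in world units.
    Returns the centre of mass of the occupied cells (usable as the
    body's position) and the (width, length, height) of the smallest
    axis-aligned parallelepiped enclosing the occupied volume."""
    cells = np.asarray(occupied_cells, dtype=float)
    centres = (cells + 0.5) * cell_side      # cell centres in world units
    centroid = centres.mean(axis=0)
    lo = cells.min(axis=0) * cell_side       # min corner of enclosing box
    hi = (cells.max(axis=0) + 1) * cell_side # max corner of enclosing box
    return centroid, hi - lo
```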
  • in a step 108, at least one of the parameters P obtained in this way is compared with a predetermined "model".
  • the purpose of this is to establish whether or not the map F, corresponding to the position, size and shape of a body such as the body C, is "compatible" with the criteria of monitoring or surveillance which the system according to the invention has to follow.
  • One possible model for comparison may correspond to a defined part of the region of space S in which the body C must come no closer than a limiting distance.
  • compatibility is checked by using, for example, the parameters P relating to the position and dimensions of the detected bodies.
  • the volume D corresponding to the region of space S which the body C must not enter may be an area that must be respected around a work of art exhibited in a museum (e.g. a picture hanging on a wall).
  • the volume D may be a zone that must be respected around a machine with moving parts or with exposed parts at a high temperature and/or voltage.
  • in step 108 the unit 1 checks (by applying known criteria) that, for example, none of the cells contained within the map of volumetric occupation F falls inside the volume D or lies at a distance less than a minimum safety distance from the volume D.
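A sketch of such a check, treating the volume D as an axis-aligned box and measuring the Euclidean distance of each occupied cell centre from it (the box representation, the names and the distance formulation are assumptions for illustration):

```python
import numpy as np

def violates_zone(occupied_centres, zone_lo, zone_hi, min_dist=0.0):
    """True if any occupied cell centre falls inside the forbidden
    axis-aligned volume D (corners zone_lo, zone_hi), or lies closer
    to it than min_dist. The distance to the box is computed as the
    per-axis gap (zero inside the box's slab on that axis), combined
    Euclidean-wise."""
    p = np.asarray(occupied_centres, dtype=float)
    lo = np.asarray(zone_lo, dtype=float)
    hi = np.asarray(zone_hi, dtype=float)
    gap = np.maximum(np.maximum(lo - p, p - hi), 0.0)
    dist = np.sqrt((gap ** 2).sum(axis=1))
    return bool((dist <= min_dist).any())
```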
  • if this check is satisfied, the unit 1 prepares to repeat the monitoring action with reference to the next set of images taken by the video cameras 2. The processing action thus returns upstream of step 101.
  • otherwise, the processing action moves on from step 108 to a new step 109 corresponding to the emission of a warning signal.
  • this warning signal may be, for example, an acoustic or visual alarm signal (optionally transmitted at a distance to a manned remote control station) emitted by a corresponding device 3.
  • the device 3 must be understood to be of known type, depending on the alarm signal which it is wished to produce: it may for example be a siren, an acoustic indicator, a remote warning system, etc., connected to the unit 1.
  • the volumetric map of each body detected inside the monitored region and/or the manner in which it changes while the bodies are present in the monitored region can be compared with models of volumetric maps for other bodies, encoded in a similar manner and previously stored in the processing unit (take for example the model marked D in Figure 6) in order to recognize those bodies which must be detected from among all the bodies present inside the monitored region.
  • the bodies may for example be people only.
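A deliberately simple sketch of such model-based selection, using only the vertical extent of each volumetric map as the distinguishing feature (the height threshold, the use of height alone, and all names are illustrative assumptions; a real model would encode shape and volume more fully):

```python
def select_people(maps, cell_side, min_height=1.2):
    """Keep only those volumetric maps whose vertical extent matches a
    simple 'person' model: taller than min_height (world units).
    Each map is a list of (i, j, k) occupied-cell indices; k is the
    vertical cell index and cell_side the cell edge length."""
    selected = []
    for occupied in maps:
        ks = [k for (_, _, k) in occupied]
        height = (max(ks) - min(ks) + 1) * cell_side
        if height >= min_height:
            selected.append(occupied)
    return selected
```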
  • the solution described is highly robust and overcomes the functional limitations of currently used systems. Thus, it is capable of detecting the presence and at the same time determining the position of people or objects within a defined region of space, of discriminating between objects and people, between objects or people close to each other, and between objects and people that move into the monitored region following different paths or, more generally, with behaviours which could easily deceive other types of sensor.
  • the method according to the invention can be carried out using, at least in part, a computer program capable of being run on a computer in such a way that the system comprising the program and the computer carries out the method according to the invention.
  • the invention therefore extends also to such a program capable of being loaded into a computer which has the means of or is capable of carrying out the method according to the invention, as well as to the corresponding information technology product comprising a means readable by a computer containing codes for a computer program which, when the program is loaded into the computer, cause the computer to carry out the method according to the invention.
EP99830376A 1999-06-17 1999-06-17 Method and device for the automatic monitoring of a region of space Withdrawn EP1061487A1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP99830376A EP1061487A1 (de) 1999-06-17 1999-06-17 Method and device for the automatic monitoring of a region of space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP99830376A EP1061487A1 (de) 1999-06-17 1999-06-17 Method and device for the automatic monitoring of a region of space

Publications (1)

Publication Number Publication Date
EP1061487A1 true EP1061487A1 (de) 2000-12-20

Family

ID=8243455

Family Applications (1)

Application Number Title Priority Date Filing Date
EP99830376A Withdrawn EP1061487A1 (de) 1999-06-17 1999-06-17 Method and device for the automatic monitoring of a region of space

Country Status (1)

Country Link
EP (1) EP1061487A1 (de)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003003310A1 (en) * 2001-06-29 2003-01-09 Honeywell International Inc. Moving object assessment system and method
WO2003003721A1 (en) * 2001-06-29 2003-01-09 Honeywell International, Inc. Surveillance system and methods regarding same
WO2003005315A1 (en) * 2001-07-02 2003-01-16 Koninklijke Philips Electronics N.V. Vision based method and apparatus for detecting an event requiring assistance or documentation
DE10138960A1 (de) * 2001-08-03 2003-02-27 Pilz Gmbh & Co Verfahren und Vorrichtung zum Beobachten, Vermessen oder Überwachen eines Raumbereichs
WO2003019490A1 (en) * 2001-08-22 2003-03-06 Koninklijke Philips Electronics N.V. Vision-based method and apparatus for detecting fraudulent events in a retail environment
WO2003036581A1 (en) * 2001-10-23 2003-05-01 Vistar Telecommunications Inc. Method of monitoring an enclosed space over a low data rate channel
WO2003042946A1 (en) * 2001-11-13 2003-05-22 Koninklijke Philips Electronics N.V. Vision-based method and apparatus for automatically activating a child safety feature
DE10245720A1 (de) * 2002-09-24 2004-04-01 Pilz Gmbh & Co. Verfahren un Vorrichtung zum Absichern eines Gefahrenbereichs
WO2004029502A1 (de) 2002-09-24 2004-04-08 Pilz Gmbh & Co. Kg Verfahren und vorrichtung zum absichern eines gefahrenbereichs
EP1422022A1 (de) * 2002-11-12 2004-05-26 Makita Corporation Kraftwerkzeuge
EP1482238A2 (de) 2003-05-29 2004-12-01 CASAGRANDE SpA Sicherheitsvorrichtung für Arbeitsmaschinen, insbesondere Bohrmaschinen oder dergleichen, und diese Vorrichtung verwendendes Verfahren zur Erfassung der Anwesenheit von Personen
BE1015588A3 (nl) * 2003-07-02 2005-06-07 Robosoft Nv Werkwijze voor het beveiligen van een machine of dergelijke en inrichting daarbij toegepast.
EP1586805A1 (de) * 2004-04-14 2005-10-19 Sick Ag Verfahren zur Überwachung eines Überwachungsbereichs
EP1635108A1 (de) * 2004-09-08 2006-03-15 Sick Ag Verfahren und Vorrichtung zum Erfassen eines Objekts
US7084779B2 (en) 2003-09-12 2006-08-01 Makita Corporation Power tool
WO2008028911A1 (de) * 2006-09-04 2008-03-13 Robert Bosch Gmbh Werkzeugmaschinenüberwachungsvorrichtung
WO2008028871A1 (de) * 2006-09-04 2008-03-13 Robert Bosch Gmbh Werkzeugmaschinenüberwachungsvorrichtung
WO2008014831A3 (de) * 2006-08-02 2008-04-03 Daimler Ag Verfahren zur beobachtung einer person in einem industriellen umfeld
WO2008098831A1 (de) * 2007-02-15 2008-08-21 Kuka Roboter Gmbh Verfahren und vorrichtung zum sichern eines arbeitsraums
WO2009062770A1 (de) * 2007-11-12 2009-05-22 Robert Bosch Gmbh Konfigurationsmodul für ein videoüberwachungssystem, überwachungssystem mit dem konfigurationsmodul, verfahren zur konfiguration eines videoüberwachungssystems sowie computerprogramm
WO2009083297A1 (de) * 2007-12-21 2009-07-09 Robert Bosch Gmbh Werkzeugmaschinenvorrichtung
US7684590B2 (en) 2004-04-19 2010-03-23 Ibeo Automobile Sensor Gmbh Method of recognizing and/or tracking objects
US7729511B2 (en) 2002-09-24 2010-06-01 Pilz Gmbh & Co. Kg Method and device for safeguarding a hazardous area
US7960676B2 (en) * 2007-09-28 2011-06-14 Casio Computer Co., Ltd. Image capture device to obtain an image with a bright subject, even when the background is dark, and recording medium
EP2650850A1 (de) * 2012-04-12 2013-10-16 Steinel GmbH Vorrichtung zur Steuerung eines Gebäudeaggregats
EP2685421A1 (de) 2012-07-13 2014-01-15 ABB Research Ltd. Ermittlung vorhandener Objekte in einem Prozesssteuerungssystem
DE102004044973B4 (de) * 2004-09-16 2014-12-04 Sick Ag Kontrolle eines Überwachungsbereiches
CN106163713A (zh) * 2014-02-14 2016-11-23 豪仕马裁板锯技术有限公司 用于对设备、尤其是板分割设备进行操作的方法
EP3112900A1 (de) * 2015-07-03 2017-01-04 Soilmec S.p.A. Sicherheiteinrichtung und verfahren zur überwachung einer region in der nähe eines apparates, z.b., einer bohrmaschine
IT201900021108A1 (it) * 2019-11-13 2021-05-13 Gamma System S R L Sistema di sicurezza per un macchinario industriale

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0356734A2 (de) * 1988-08-02 1990-03-07 Siemens Aktiengesellschaft Einbruchmeldeanlage für den Perimeterschutz mit Fernsehkameras


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CARLSON J ET AL: "Real-time 3D visualization of volumetric video motion sensor data", SURVEILLANCE AND ASSESSMENT TECHNOLOGIES FOR LAW ENFORCEMENT, BOSTON, MA, USA, 19-20 NOV. 1996, vol. 2935, Proceedings of the SPIE - The International Society for Optical Engineering, 1997, SPIE-Int. Soc. Opt. Eng, USA, pages 69 - 79, XP000863382, ISSN: 0277-786X *
TARBOX G H ET AL: "VOLUMETRIC BASED INSPECTION*", PROCEEDINGS OF THE IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS,US,NEW YORK, IEEE, vol. -, pages 1239-1246, XP000334081, ISBN: 0-7803-0738-0 *

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003003721A1 (en) * 2001-06-29 2003-01-09 Honeywell International, Inc. Surveillance system and methods regarding same
WO2003003310A1 (en) * 2001-06-29 2003-01-09 Honeywell International Inc. Moving object assessment system and method
WO2003005315A1 (en) * 2001-07-02 2003-01-16 Koninklijke Philips Electronics N.V. Vision based method and apparatus for detecting an event requiring assistance or documentation
DE10138960A1 (de) * 2001-08-03 2003-02-27 Pilz Gmbh & Co Method and device for observing, measuring or monitoring a spatial region
WO2003019490A1 (en) * 2001-08-22 2003-03-06 Koninklijke Philips Electronics N.V. Vision-based method and apparatus for detecting fraudulent events in a retail environment
WO2003036581A1 (en) * 2001-10-23 2003-05-01 Vistar Telecommunications Inc. Method of monitoring an enclosed space over a low data rate channel
US6720880B2 (en) 2001-11-13 2004-04-13 Koninklijke Philips Electronics N.V. Vision-based method and apparatus for automatically activating a child safety feature
WO2003042946A1 (en) * 2001-11-13 2003-05-22 Koninklijke Philips Electronics N.V. Vision-based method and apparatus for automatically activating a child safety feature
WO2004029502A1 (de) 2002-09-24 2004-04-08 Pilz Gmbh & Co. Kg Method and device for safeguarding a hazardous area
US7729511B2 (en) 2002-09-24 2010-06-01 Pilz Gmbh & Co. Kg Method and device for safeguarding a hazardous area
DE10245720A1 (de) * 2002-09-24 2004-04-01 Pilz Gmbh & Co. Method and device for safeguarding a hazardous area
CN100389939C (zh) * 2002-11-12 2008-05-28 株式会社牧田 Power tool
EP1422022A1 (de) * 2002-11-12 2004-05-26 Makita Corporation Power tools
US6959631B2 (en) 2002-11-12 2005-11-01 Makita Corporation Power tools
US7047854B2 (en) 2002-11-12 2006-05-23 Makita Corporation Power tools
EP1482238A2 (de) 2003-05-29 2004-12-01 CASAGRANDE SpA Safety device for operating machines, in particular drilling machines or the like, and method using this device for detecting the presence of persons
EP1482238A3 (de) * 2003-05-29 2005-08-17 CASAGRANDE SpA Safety device for operating machines, in particular drilling machines or the like, and method using this device for detecting the presence of persons
BE1015588A3 (nl) * 2003-07-02 2005-06-07 Robosoft Nv Method for securing a machine or the like, and device used therefor.
US7084779B2 (en) 2003-09-12 2006-08-01 Makita Corporation Power tool
US7505620B2 (en) 2004-04-14 2009-03-17 Sick Ag Method for the monitoring of a monitored zone
EP1586805A1 (de) * 2004-04-14 2005-10-19 Sick Ag Method for the monitoring of a monitored zone
US7684590B2 (en) 2004-04-19 2010-03-23 Ibeo Automobile Sensor Gmbh Method of recognizing and/or tracking objects
EP1635108A1 (de) * 2004-09-08 2006-03-15 Sick Ag Method and apparatus for detecting an object
US7652238B2 (en) 2004-09-08 2010-01-26 Sick Ag Method and apparatus for detecting an object through the use of multiple sensors
DE102004044973B4 (de) * 2004-09-16 2014-12-04 Sick Ag Control of a monitored zone
WO2008014831A3 (de) * 2006-08-02 2008-04-03 Daimler Ag Method for observation of a person in an industrial environment
CN101511550B (zh) * 2006-08-02 2013-12-18 皮尔茨公司 Method for observation of a person in an industrial environment
US8154590B2 (en) 2006-08-02 2012-04-10 Pilz Gmbh & Co. Kg Method for observation of a person in an industrial environment
US8311661B2 (en) 2006-09-04 2012-11-13 Robert Bosch Gmbh Machine tool use situation monitoring device using reflected signal
US8615320B2 (en) 2006-09-04 2013-12-24 Robert Bosch Gmbh Machine tool monitoring device
WO2008028911A1 (de) * 2006-09-04 2008-03-13 Robert Bosch Gmbh Machine tool monitoring device
WO2008028871A1 (de) * 2006-09-04 2008-03-13 Robert Bosch Gmbh Machine tool monitoring device
RU2459138C2 (ru) * 2006-09-04 2012-08-20 Роберт Бош Гмбх Device for monitoring a machine tool
WO2008098831A1 (de) * 2007-02-15 2008-08-21 Kuka Roboter Gmbh Method and device for securing a working space
US8227738B2 (en) 2007-09-28 2012-07-24 Casio Computer Co., Ltd. Image capture device for creating image data from a plurality of image capture data, and recording medium therefor
US7960676B2 (en) * 2007-09-28 2011-06-14 Casio Computer Co., Ltd. Image capture device to obtain an image with a bright subject, even when the background is dark, and recording medium
US9549155B2 (en) 2007-11-12 2017-01-17 Robert Bosch Gmbh Configuration module for a video surveillance system, surveillance system comprising the configuration module, method for configuring a video surveillance system, and computer program
WO2009062770A1 (de) * 2007-11-12 2009-05-22 Robert Bosch Gmbh Configuration module for a video surveillance system, surveillance system comprising the configuration module, method for configuring a video surveillance system, and computer program
CN101855906B (zh) * 2007-11-12 2017-11-24 罗伯特·博世有限公司 Configuration module for a video surveillance system, surveillance system comprising the configuration module, and method for configuring a video surveillance system
WO2009083297A1 (de) * 2007-12-21 2009-07-09 Robert Bosch Gmbh Machine tool device
US8948903B2 (en) 2007-12-21 2015-02-03 Robert Bosch Gmbh Machine tool device having a computing unit adapted to distinguish at least two motions
EP2650850A1 (de) * 2012-04-12 2013-10-16 Steinel GmbH Device for controlling a building unit
WO2014009087A1 (en) 2012-07-13 2014-01-16 Abb Research Ltd Presenting process data of a process control object on a mobile terminal
EP2685421A1 (de) 2012-07-13 2014-01-15 ABB Research Ltd. Determining objects present in a process control system
CN106163713A (zh) * 2014-02-14 2016-11-23 豪仕马裁板锯技术有限公司 Method for operating a machine, in particular a panel-dividing machine
CN106163713B (zh) * 2014-02-14 2018-04-13 豪迈裁板锯技术有限公司 Method for operating a machine, in particular a panel-dividing machine
EP3112900A1 (de) * 2015-07-03 2017-01-04 Soilmec S.p.A. Safety device and method for monitoring a region in the vicinity of an apparatus, e.g. a drilling machine
IT201900021108A1 (it) * 2019-11-13 2021-05-13 Gamma System S R L Safety system for industrial machinery

Similar Documents

Publication Publication Date Title
EP1061487A1 (de) Method and device for the automatic control of a region of space
Rozsa et al. Obstacle prediction for automated guided vehicles based on point clouds measured by a tilted LIDAR sensor
RU2251739C2 (ru) Object recognition and tracking system
CN108241870B (zh) Method for assigning specific classes of interest within measurement data
JP3785456B2 (ja) Safety monitoring device for station platforms
Zivkovic et al. Hierarchical map building using visual landmarks and geometric constraints
CN104935879B (zh) 用于活动顺序验证的基于视觉的监视系统
Anousaki et al. Simultaneous localization and map building for mobile robot navigation
Cameron et al. A Bayesian approach to optimal sensor placement
US5845048A (en) Applicable recognition system for estimating object conditions
US20050012817A1 (en) Selective surveillance system with active sensor management policies
KR101877294B1 (ko) Intelligent security CCTV system enabling complex situation definition and automatic situation recognition through the definition of multiple basic behavior patterns, based on the organic relationships among objects, regions, and events caused by objects
Arbuckle et al. Temporal occupancy grids: a method for classifying the spatio-temporal properties of the environment
EP3680813A1 (de) Verfahren und system zum erfassen von in einem gebäude installierten objekten
Chakravarty et al. Panoramic vision and laser range finder fusion for multiple person tracking
Kim et al. Robotic sensing and object recognition from thermal-mapped point clouds
McGreavy et al. Next best view planning for object recognition in mobile robotics
Arsic et al. Applying multi layer homography for multi camera person tracking
Apolloni et al. Machine learning and robot perception
Foresti et al. Event classification for automatic visual-based surveillance of parking lots
Nickerson et al. An autonomous mobile robot for known industrial environments
JP3630703B2 (ja) Object monitoring device
KR102597692B1 (ko) Apparatus, method, and computer program for measuring the volume of an object using images
Özbilge Experiments in online expectation-based novelty-detection using 3D shape and colour perceptions for mobile robot inspection
Beibei et al. Crowd analysis: a survey

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

AX Request for extension of the european patent

Free format text: AL;LT;LV;MK;RO;SI

AKX Designation fees paid
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20010621

REG Reference to a national code

Ref country code: DE

Ref legal event code: 8566