EP0559357A1 - Système de surveillance (Monitoring system) - Google Patents


Info

Publication number
EP0559357A1
EP0559357A1 (application EP93301284A)
Authority
EP
European Patent Office
Prior art keywords
beams
interruptions
operable
graph
zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP93301284A
Other languages
German (de)
English (en)
Inventor
Johan Willie Viljoen
Ralph Jurgen Matzner
Cornelius Johannes Englebrecht
Francois Petrus Jacobus Le Roux
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Council for Scientific and Industrial Research CSIR
Original Assignee
Council for Scientific and Industrial Research CSIR
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Council for Scientific and Industrial Research (CSIR)
Publication of EP0559357A1
Current legal status: Withdrawn

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 - Burglar, theft or intruder alarms
    • G08B 13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/181 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems
    • G08B 13/183 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems by interruption of a radiation beam or barrier
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 - Individual registration on entry or exit

Definitions

  • THIS INVENTION relates to a monitoring system. It relates in particular to a monitoring system intended for use with a security system, eg for monitoring or controlling admission into an area.
  • a method of monitoring a zone includes transmitting a plurality of beams between at least two locations defining extremities of the zone, detecting interruptions of the beams caused by an object moving between the two locations, generating a graph of the interruptions relative to time, and comparing the graph generated with a set of preselected graphs representative of known objects.
  • a monitoring apparatus for monitoring a zone and which includes storage means for storing a set of preselected graphic representations of known objects, sensing means for sensing interruptions of a plurality of beams; graph generation means responsive to the sensing means and operable to generate a graphical representation with respect to time of the interruptions sensed by the sensing means; and comparing means operable to compare the graphical representation generated by the graph generation means with the graphical representations stored in the storage means.
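As an illustrative sketch of the method summarised above (all names are hypothetical, not taken from the patent), the logged beam transitions can be turned into the binary beam-versus-time graph that is later classified:

```python
def build_time_graph(events, n_beams, t_start, t_end, dt):
    """Build a binary interruption graph (one row per beam, one column
    per time step) from a log of (time, beam_index, broken) transitions."""
    events = sorted(events)                       # chronological order
    n_steps = int(round((t_end - t_start) / dt))
    state = [False] * n_beams                     # current broken/clear state
    graph = [[0] * n_steps for _ in range(n_beams)]
    ev = 0
    for step in range(n_steps):
        t = t_start + step * dt
        # Apply every logged transition up to and including this instant.
        while ev < len(events) and events[ev][0] <= t:
            _, beam, broken = events[ev]
            state[beam] = broken
            ev += 1
        for b in range(n_beams):
            graph[b][step] = 1 if state[b] else 0
    return graph
```

Such a graph can then be compared against the stored representations of known objects, for instance via the moment features described later in the document.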
  • the graph generated may be compared with the preselected graphs using image recognition techniques.
  • Interruptions of the beams may be monitored continuously and each interruption, restoration or partial occlusion of any beam may be stored together with the time of occurrence.
  • the graph of interruptions generated in the form of a temporal profile may be translated into a physical outline profile.
  • the resolution of the profile would be dependent upon the number of beams and the spacing between the beams, as well as the speed of the object interrupting the beams.
  • the graph generated may be subjected to pre-processing and data forming the graph may then be submitted in a suitable format to a pattern recognition algorithm to permit classification of the object.
  • Pre-processing may include extraction of image parameters, eg. moments, Zernike moments or Fourier coefficients, which can then be fed to a neural network.
  • the cause of the interruption may be classified into various classes and an indicator representative of the probable cause in terms of the classification may be displayed by display means.
  • the preselected graphs may be stored in the neural network.
  • the neural network may comprise a plurality of interconnected neural nodes, and signals received from the various nodes may be biased with different interconnection weights thereby to vary the effect of signals received from the various nodes. This can minimise errors. Training of the neural network and setting up of the interconnection weights is a computationally intensive process normally done on a powerful computer or workstation.
  • the final network weights reached after training of the neural network may then be made available to an operational or embedded computer, which is then used to classify the object.
  • This is a relatively simple process, with one multiplication and one addition per interconnection between each neural node of the neural network.
  • the operational computer need not be a very powerful computer, provided the size of the network is not excessive.
  • the output of the network provides a real-time classification of the status of interruption of the beams.
  • the applicant has found that the most successful general pattern recognition (excluding that performed by the human brain) can be achieved by a neural network.
  • the graphic representations of known objects may be used to train the neural network to achieve optimal segmentation of N dimensional space defined by N image parameters.
  • a relatively small neural network may then be used for classification.
  • the sensing means may include an array of at least two optical beams transmitted in spaced relationship from one or more locations arranged at opposite extremities of the zone to be monitored, eg from a pair of poles or the like at extremities of the zone.
  • Detector elements responsive to the optical beams and operable to detect movement of objects through the zone by detecting interruptions of the beams may be housed in one or both of the pair of poles.
  • the optical beams may be transmitted in vertically spaced relationship but may also be transmitted in horizontally spaced relationship or at an angle between the vertical and the horizontal. Suitable arrangement of the spacing of the beams can facilitate detection of the direction of movement of the object.
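A minimal sketch of how suitably spaced beams can reveal the direction of movement (a hypothetical helper, assuming the beams are ordered by physical position and the first interruption time of each beam is logged):

```python
def movement_direction(first_break_times):
    """first_break_times[i]: time at which beam i (ordered by physical
    position) was first broken, or None if it was never broken.
    Returns +1 (towards higher positions), -1 (towards lower), 0 (unknown)."""
    hits = [(t, i) for i, t in enumerate(first_break_times) if t is not None]
    if len(hits) < 2:
        return 0            # a single broken beam cannot reveal direction
    hits.sort()             # order beams by when they were first broken
    first_pos, last_pos = hits[0][1], hits[-1][1]
    if last_pos > first_pos:
        return 1
    if last_pos < first_pos:
        return -1
    return 0
```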
  • the beams may be infra red beams, laser beams, or any other optical medium.
  • the apparatus may include alarm generating means operable to generate an alarm when a beam is interrupted.
  • inputs from the different zones may be fed via a multiplexer to a single processor.
  • the interconnection weights used in the neural network can be stored in a replaceable permanent memory, such as ROM, PROM, EPROM, EEPROM, and so on, for easy and quick upgrading of the algorithm if, based on experience in use of the apparatus in a particular location or generally, circumstances require this or if more accurate data become available.
  • a replaceable permanent memory such as ROM, PROM, EPROM, EEPROM, and so on
  • reference numeral 10 generally indicates a typical form of sensing means used to monitor activities occurring in a zone to be monitored.
  • the sensing means 10 includes a pair of poles 12.1 and 12.2 anchored in the ground at extremities of the zone.
  • the poles 12.1 and 12.2 can be mounted relatively close together, eg about 2 metres apart, or up to 1000 metres or more apart dependent upon the zone to be monitored.
  • four vertically spaced optical beams 14, which are preferably infra red beams, are transmitted between the poles 12.1 and 12.2 by suitable light emitters (not shown) contained within one or both of the poles 12.1 and 12.2.
  • Each beam 14 is received by a suitable light detector (not shown) so that whenever one of the beams is interrupted, this can be detected.
  • a monitoring apparatus 20 is shown connected to three of the sensing means 10.1, 10.2 and 10.3 of Figure 1. Obviously, more or fewer than three sensing means may be used.
  • the light detectors of the sensing means are scanned continuously so as to detect the presence or absence of light received from the light emitters at any instant of time.
  • Raw binary data received from the light detectors in each sensing means 10.1, 10.2 and 10.3, indicative of whether or not a particular beam is broken, are generated and fed via a multiplexer 22 to graphical generation means 24 which generates a graphical representation of interruptions of the light beams 14 with respect to time.
  • the raw data is also simultaneously stored in raw data storage means 25 so that the raw data can be made available at any later stage.
  • the graphical representation is then fed to a classifier 26.
  • the classifier 26 includes a pre-processor 28, a neural network 30 and comparing means 32.
  • the graphical representation is compared with preselected graphical representations stored in weighted storage means 34.
  • the output of the comparing means 32 is then fed to a controller 36 which may be in the form of a computer.
  • the controller 36 reacts to the output of the classifier 26 and in appropriate circumstances energises alarm means 38.
  • the controller also is connected to a printer and/or a display device 39 to enable the classified graphical representation to be printed and/or displayed.
  • a training workstation or computer 35 is used.
  • the neural network 30 is presented with seventeen inputs or features relating to any particular preselected object.
  • the seventeen inputs include a signal representative of the duration of any particular event and signals representative of the first sixteen two-dimensional moments of the object. Two-dimensional moments are defined as m_ij = Σ_x Σ_y x^i y^j f(x, y), where f(x, y) is the value of the interruption graph at position (x, y). In this case both i and j are in the range [0...3].
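Under this definition, the seventeen-element feature vector could be computed as follows (an illustrative sketch; the names and the axis convention are assumptions, with f(x, y) taken as 1 where beam x is broken at time step y and 0 otherwise):

```python
def moment_features(graph, duration_ms):
    """Return the 17 features: the event duration plus the sixteen
    two-dimensional moments m_ij = sum_x sum_y x^i y^j f(x, y),
    for i, j in 0..3, of a binary interruption graph."""
    feats = [float(duration_ms)]
    for i in range(4):
        for j in range(4):
            m = 0.0
            for x, row in enumerate(graph):       # x: beam index
                for y, v in enumerate(row):       # y: time step
                    if v:                          # f(x, y) is 0 or 1
                        m += (x ** i) * (y ** j)
            feats.append(m)
    return feats
```

Note that m_00 is simply the number of broken-beam cells, i.e. the area of the temporal profile.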
  • preselected network weights are transferred to the weight storage means 34 which feeds the neural network 30. Typical outputs from the neural network are shown in Figure 4 and can be presented to an operator via the display device and/or printer 39, combined with an audio warning signal from the alarm means 38.
  • Each change in beam status is logged with the time of occurrence, and the graphical representation is built up by normalizing the duration of the activity.
  • the duration can be displayed (in milliseconds) directly above the displayed event.
  • Figure 4 shows a typical output display from the neural network 30 after classification when presented with various inputs from various classes of objects.
  • Raw data of different persons walking through the sensing means 10 are shown by the graphical representations 40.1, 40.2 and 40.3; of a motor vehicle passing the sensing means 10 by the graphical representations 42.1 and 42.2; of a person crawling through the sensing means 10 by the graphical representations 44.1 and 44.2; and of a dog walking through the sensing means 10 by the graphical representations 46.1 and 46.2.
  • These graphical representations are compared with preselected graphs (not shown) stored in the storage means 34 of Figure 2.
  • an icon 48 denoting the class decided upon by the monitoring apparatus is shown so as to enable an operator easily to identify the object sensed.
  • a simplified form of neural network 30 is shown in Figure 3.
  • the network 30 has three input nodes or neurons 52.1, 52.2 and 52.3, an intermediate series of neurons 54.1, 54.2, 54.3 and 54.4, and two output neurons 56.1 and 56.2.
  • the neurons are interconnected by weighted lines 58 and 60 whereby different weights can be applied from a preceding neuron to influence the output of a subsequent neuron when data are presented to the input neurons 52.1, 52.2 and 52.3.
  • a neuron can thus be defined as a node with a number of weighted inputs which are summed together and passed through some form of non-linear process.
  • the most common non-linearity used is the sigmoid function σ(x) = 1 / (1 + e^(-x)), which maps the input to an output between 0 and 1.
  • the values of inputs to the nodes 52.1 to 52.3 are multiplied by appropriately selected weights on lines 58 and summed in the neurons 54.1 to 54.4.
  • the outputs of these neurons 54.1 to 54.4 are then passed in the same way to the next layer (in this instance the output neurons 56.1 and 56.2, although there can be more than one intermediate layer of neurons).
  • the classification is indicated by the output neuron 56.1 or 56.2 with the highest output value.
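The forward pass described above (weighted sums passed through the sigmoid non-linearity, layer by layer, with the class indicated by the highest output neuron) can be sketched as follows; the names are hypothetical, and the cost is indeed one multiplication and one addition per interconnection:

```python
import math

def sigmoid(x):
    # Maps any input to an output between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))

def forward(layers, inputs):
    """layers: one weight matrix per layer; layers[k][n][w] is the weight
    from input w to neuron n of layer k. Each interconnection costs one
    multiplication and one addition."""
    activations = list(inputs)
    for weights in layers:
        activations = [sigmoid(sum(w * a for w, a in zip(row, activations)))
                       for row in weights]
    return activations

def classify(layers, inputs):
    # The class is indicated by the output neuron with the highest value.
    out = forward(layers, inputs)
    return max(range(len(out)), key=out.__getitem__)
```

Because the per-interconnection cost is so small, such a pass fits comfortably on a modest embedded processor once the weights are fixed.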
  • the performance of the network hinges on the values of the interconnection weights on lines 58 and 60. Adjustment of these weights is an optimization process which can be done in several ways (mostly variants of gradient-search routines) of which the conjugate-gradient algorithm is one of the most efficient.
  • Table 1 below illustrates a matrix associated with a neural network trained by a conjugate gradient training algorithm to distinguish between walking humans (class 1), motor vehicles (class 2), crawling humans (class 3) and dogs (class 4).
  • the rows correspond to true classes and the columns to the classes assigned to sample objects by the neural network 30.
  • a network with seventeen input neurons, twelve intermediate neurons and four output neurons was used. The seventeen input neurons were chosen to correspond to the first sixteen two-dimensional moments of the raw input graphical representations plus the total duration of the event causing interruption of the beams 14.
  • Four output neurons were used to denote the four classes the network was trained to distinguish.
  • Table 2 below shows the network weights which can be used for a two-input, two-output neural network with three intermediate neurons, partly trained on an exclusive-or problem, ie similar to that shown in Figure 3 except that there are only three intermediate neurons 54.1 to 54.3 instead of the four neurons 54.1 to 54.4 shown in Figure 3.
  • This problem has two classes (1 and 2) and two inputs. Inputs (0,0) and (1,1) fall in class 1, while the other two, (0,1) and (1,0), constitute class 2. It will be seen that Table 2 shows the network as having three inputs. This is because one input neuron is added, with constant activation of 1, to enable a variable threshold to be realized during training.
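To illustrate the exclusive-or arrangement, here is a hand-weighted sketch. The weights are illustrative only (Table 2's partly-trained values are not reproduced here), only two hidden neurons are used instead of three, and a constant-1 activation is appended at every layer, a slight generalisation of the single added input neuron described above, to realise the variable thresholds:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(layers, inputs):
    acts = list(inputs)
    for weights in layers:
        acts = acts + [1.0]   # constant-1 activation realizes the threshold
        acts = [sigmoid(sum(w * a for w, a in zip(row, acts)))
                for row in weights]
    return acts

# Hand-picked illustrative weights that solve exclusive-or:
# inputs (0,0) and (1,1) fall in class 1; (0,1) and (1,0) in class 2.
layers = [
    [[20.0, 20.0, -10.0],    # hidden 1: roughly an OR gate
     [20.0, 20.0, -30.0]],   # hidden 2: roughly an AND gate
    [[-20.0, 20.0, 10.0],    # output 1: high for class 1 (OR and AND agree)
     [20.0, -20.0, -10.0]],  # output 2: high for class 2 (OR without AND)
]

def classify(x, y):
    out = forward(layers, (x, y))
    return 1 + max(range(len(out)), key=out.__getitem__)
```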
  • the invention illustrated provides a monitoring apparatus which is simple because the number of primary inputs provided by the sensing means 10 is small.
  • Two to eight beams 14 can be used in the sensing means of Figure 1.
  • the inputs are binary, ie the beam 14 is either broken or is not, or are quantised to at most a few possible values to provide for partial obscuration of the beams 14.
  • the number of outputs is limited to an alarm signal plus one of a few possible classifications. It will be appreciated that the response provided by the apparatus need not be instantaneous as a reaction time of a few hundred milliseconds to several seconds is acceptable in almost all cases.
  • the invention illustrated facilitates the optimal utilisation of data gathered by an optical beam fence in deciding on the issuing and classification of alarm messages. It furthermore allows the user to eliminate false alarms produced by spurious causes like power-line glitches, and enables a system based on the architecture described to degrade gracefully when the beam array is impaired by failure or obstruction.
  • Central processing of data by the controller 36 also eliminates duplication of logic services at each sensing means 10.1, 10.2 and 10.3 allowing a high level of system integrity to be maintained, since beam status information can be passed to the controller 36 together with data relating to interruption of the beams 14.
  • Classification is not done by a rule-based expert system and does not use human value judgements, since real-world data are used as the only inputs during training of the neural network 30. Although training is normally done on a personal computer or workstation, the classification can be implemented on a single-chip microprocessor.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)
EP93301284A 1992-03-04 1993-02-22 Système de surveillance (Monitoring system) Withdrawn EP0559357A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ZA921621 1992-03-04
ZA921621 1992-03-04

Publications (1)

Publication Number Publication Date
EP0559357A1 (fr) 1993-09-08

Family

ID=25581470

Family Applications (1)

Application Number Title Priority Date Filing Date
EP93301284A Withdrawn EP0559357A1 (fr) 1992-03-04 1993-02-22 Système de surveillance (Monitoring system)

Country Status (2)

Country Link
EP (1) EP0559357A1 (fr)
ZA (1) ZA929406B (fr)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3825916A (en) * 1972-10-20 1974-07-23 California Crime Technological Laser fence
US3898639A (en) * 1972-08-24 1975-08-05 Hrand M Muncheryan Security surveillance laser system
DE2940414A1 (de) * 1979-10-05 1981-04-09 Licentia Patent-Verwaltungs-Gmbh, 6000 Frankfurt Method for the identification of objects
EP0118182A2 (fr) * 1983-02-08 1984-09-12 Pattern Processing Technologies Inc. Method for processing patterns
WO1988000745A1 (fr) * 1986-07-24 1988-01-28 Keith Jeffrey Gate Detection system
FR2670404A1 (fr) * 1990-12-12 1992-06-19 Dassault Electronique Device and method for automatic on-the-fly classification of motor vehicles.


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994008258A1 (fr) * 1992-10-07 1994-04-14 Octrooibureau Kisch N.V. Apparatus for identifying objects passing through a predetermined location
US5519784A (en) * 1992-10-07 1996-05-21 Vermeulen; Pieter J. E. Apparatus for classifying movement of objects along a passage by type and direction employing time domain patterns
EP0875873A1 (fr) * 1997-04-30 1998-11-04 Sick Ag Optoelectronic sensor
US6023335A (en) * 1997-04-30 2000-02-08 Sick Ag Optoelectronic sensor
EP0892280A2 (fr) * 1997-07-15 1999-01-20 Sick AG Method of operating an optoelectronic sensor arrangement
EP0892280A3 (fr) * 1997-07-15 1999-11-03 Sick AG Method of operating an optoelectronic sensor arrangement
FR2826443A1 (fr) * 2001-06-21 2002-12-27 Gilles Cavallucci Method and device for optical detection of the position of an object
WO2003003580A1 (fr) * 2001-06-21 2003-01-09 H2I Technologies Method and device for optical detection of the position of an object
US7221462B2 (en) 2001-06-21 2007-05-22 H2I Technologies, Societe Anonyme a Directoire et Conseil de Surveillance Method and device for optical detection of the position of an object
FR2867864A1 (fr) * 2004-03-17 2005-09-23 Automatic Systems Method and installation for detecting passage associated with an access door
WO2005101062A1 (fr) * 2004-03-17 2005-10-27 Automatic Systems Method and installation for detecting passage associated with an access door
US11747513B2 (en) 2018-12-20 2023-09-05 Sick Ag Sensor apparatus
DE102021005129A1 (de) 2021-10-13 2023-04-13 vanory GmbH Method and device for controlling electronic devices, in particular lights

Also Published As

Publication number Publication date
ZA929406B (en) 1993-09-27

Similar Documents

Publication Publication Date Title
EP0664012B1 (fr) Method and apparatus for classifying movement of objects along a passage
US7170418B2 (en) Probabilistic neural network for multi-criteria event detector
US5101194A (en) Pattern-recognizing passive infrared radiation detection system
Yue et al. A bio-inspired visual collision detection mechanism for cars: Optimisation of a model of a locust neuron to a novel environment
EP0702800B1 (fr) Detector systems
CA2275893C (fr) Low false alarm rate video security system using object classification
CN107665326A (zh) Monitoring system for a passenger transport device, passenger transport device, and monitoring method therefor
CN106846729A (zh) Fall detection method and system based on a convolutional neural network
GB2251310A (en) Method for detecting and classifying features in sonar images
US20070035622A1 (en) Method and apparatus for video surveillance
GB2251309A (en) Method and apparatus for detecting targets in sonar images
JP2019079445A (ja) Fire monitoring system
JP2018073024A (ja) Monitoring system
KR20190046351A (ko) Intrusion detection method and apparatus therefor
CN113484858A (zh) Intrusion detection method and system
KR102360568B1 (ko) System and method for detecting unexpected incidents in a tunnel
CN109870250A (zh) Method and device for monitoring abnormal body temperature in an area, and computer-readable storage medium
EP0559357A1 (fr) Monitoring system
CN108319892A (zh) Vehicle safety early-warning method and system based on a genetic algorithm
FR2418505A1 (fr) System for monitoring locations
US20210312190A1 (en) Monitoring device, and method for monitoring a man overboard situation
Mantri et al. Analysis of feedforward-backpropagation neural networks used in vehicle detection
KR102556447B1 (ko) Situation assessment system through pattern analysis
KR102286229B1 (ko) Feature-vector-based fight event recognition method
CA2932851A1 (fr) System and method for pattern recognition

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI LU MC NL PT SE

17P Request for examination filed

Effective date: 19940301

17Q First examination report despatched

Effective date: 19961118

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Withdrawal date: 19970217