EP0559357A1 - Monitoring system - Google Patents

Monitoring system

Info

Publication number
EP0559357A1
Authority
EP
European Patent Office
Prior art keywords
beams
interruptions
operable
graph
zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP93301284A
Other languages
German (de)
French (fr)
Inventor
Johan Willie Viljoen
Ralph Jurgen Matzner
Cornelius Johannes Englebrecht
Francois Petrus Jacobus Le Roux
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Council for Scientific and Industrial Research CSIR
Original Assignee
Council for Scientific and Industrial Research CSIR
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Council for Scientific and Industrial Research CSIR

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/181 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems
    • G08B13/183 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using active radiation detection systems by interruption of a radiation beam or barrier
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit

Definitions

  • The invention illustrated facilitates the optimal utilisation of data gathered by an optical beam fence in deciding on the issuing and classification of alarm messages. It furthermore allows the user to eliminate false alarms produced by spurious causes like power-line glitches, and enables a system based on the architecture described to degrade gracefully when the beam array is impaired by failure or obstruction.
  • Central processing of data by the controller 36 also eliminates duplication of logic services at each sensing means 10.1, 10.2 and 10.3 allowing a high level of system integrity to be maintained, since beam status information can be passed to the controller 36 together with data relating to interruption of the beams 14.
  • Classification is not done by a rule-based expert system and does not use human value judgements, since real-world data are used as the only inputs during training of the neural network 30. Although training is normally done on a personal computer or workstation, the classification can be implemented on a single-chip microprocessor.

Abstract

A monitoring apparatus and a method of monitoring a zone are disclosed. A plurality of beams (14) are transmitted between at least two locations (12.1, 12.2) defining extremities of the zone. Interruptions of the beams (14) caused by an object moving between the two locations are detected by sensing means and a graph of the interruptions relative to time is generated. The graph generated is compared with a set of preselected graphs representative of known objects stored in storage means (34).

Description

  • THIS INVENTION relates to a monitoring system. It relates in particular to a monitoring system intended for use with a security system, eg for monitoring or controlling admission into an area.
  • In conventional security systems in which one or more beams, such as optical beams, are transmitted between two or more locations, eg spanning the perimeter of a security area, problems can arise as a result of false alarms caused by natural objects, such as birds, dogs, waving branches of trees, and the like, interrupting the beams. If the security area is large, monitoring of such false alarms can be time consuming in that it is normally necessary for a person monitoring the area physically to inspect the region in which the beam was interrupted unless expensive cameras are installed at various locations. It is an object of the invention to offer a solution to this problem.
  • In accordance with the invention a method of monitoring a zone includes transmitting a plurality of beams between at least two locations defining extremities of the zone, detecting interruptions of the beams caused by an object moving between the two locations, generating a graph of the interruptions relative to time, and comparing the graph generated with a set of preselected graphs representative of known objects.
  • Further according to the invention there is provided a monitoring apparatus for monitoring a zone and which includes
       storage means for storing a set of preselected graphic representations of known objects,
       sensing means for sensing interruptions of a plurality of beams;
       graph generation means responsive to the sensing means and operable to generate a graphical representation with respect to time of the interruptions sensed by the sensing means; and
       comparing means operable to compare the graphical representation generated by the graph generation means with the graphical representations stored in the storage means.
  • The graph generated may be compared with the preselected graphs using image recognition techniques.
  • Interruptions of the beams may be monitored continuously and each interruption, restoration or partial occlusion of any beam may be stored together with the time of occurrence.
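The continuous monitoring described above can be pictured as a simple timestamped event log. A minimal sketch follows; the `BeamEvent` structure and `log_change` helper are illustrative names, not part of the specification:

```python
# Hypothetical sketch of the event log described above: every change in the
# status of any beam (interruption, restoration or partial occlusion) is
# stored together with its time of occurrence.
from dataclasses import dataclass

@dataclass
class BeamEvent:
    time_ms: int      # time of occurrence, in milliseconds
    beam: int         # beam index (0 = lowest beam)
    status: str       # "broken", "restored" or "partial"

event_log: list[BeamEvent] = []

def log_change(time_ms: int, beam: int, status: str) -> None:
    """Record one change in beam status together with its timestamp."""
    event_log.append(BeamEvent(time_ms, beam, status))

# A walking person might break the lower beams first, then restore them:
log_change(0, 0, "broken")
log_change(40, 1, "broken")
log_change(350, 0, "restored")
log_change(390, 1, "restored")
```

The graph of interruptions relative to time is then simply a rendering of this log.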
  • The graph of interruptions generated in the form of a temporal profile may be translated into a physical outline profile. The resolution of the profile would be dependent upon the number of beams and the spacing between the beams, as well as the speed of the object interrupting the beams.
  • The graph generated may be subjected to pre-processing and data forming the graph may then be submitted in a suitable format to a pattern recognition algorithm to permit classification of the object. Pre-processing may include extraction of image parameters, eg moments, Zernike moments or Fourier coefficients, which can then be fed to a neural network. The cause of the interruption may be classified into various classes and an indicator representative of the probable cause in terms of the classification may be displayed by display means.
  • The preselected graphs may be stored in the neural network. The neural network may comprise a plurality of interconnected neural nodes, and signals received from the various nodes may be biased with different interconnection weights thereby to vary the effect of signals received from the various nodes. This can minimise errors. Training of the neural network and setting up of the interconnection weights is a computationally intensive process normally done on a powerful computer or workstation.
  • The final network weights reached after training of the neural network may then be made available to an operational or embedded computer, which is then used to classify the object. This is a relatively simple process, with one multiplication and one addition per interconnection between each neural node of the neural network. The operational computer need not be a very powerful computer, provided the size of the network is not extravagant. The output of the network provides a real-time classification of the status of interruption of the beams.
  • Although statistical methods can be used to ascertain the defining features of a specific object thereby to classify the object, the applicant has found that the most successful general pattern recognition algorithm (excluding the human brain) can be achieved by the neural network. In the neural network, the graphic representations of known objects may be used to train the neural network to achieve optimal segmentation of N dimensional space defined by N image parameters.
  • A relatively small neural network may then be used for classification.
  • The sensing means may include an array of at least two optical beams transmitted in spaced relationship from one or more locations arranged at opposite extremities of the zone to be monitored, eg from a pair of poles or the like at extremities of the zone. Detector elements responsive to the optical beams and operable to detect movement of objects through the zone by detecting interruptions of the beams may be housed in one or both of the pair of poles. The optical beams may be transmitted in vertically spaced relationship but may also be transmitted in horizontally spaced relationship or at an angle between the vertical and the horizontal. Suitable arrangement of the spacing of the beams can facilitate detection of the direction of movement of the object. The beams may be infra red beams, laser beams, or any other optical medium.
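One way a suitable spacing arrangement can reveal the direction of movement is by noting which of two horizontally offset beams is interrupted first. The sketch below assumes exactly that two-beam arrangement; the beam names are hypothetical:

```python
# Minimal sketch: infer direction of movement from the interruption times
# of two horizontally offset beams, here called "outer" and "inner"
# (hypothetical names, for illustration only).
def direction_of_movement(first_break_times: dict[str, float]) -> str:
    """The beam interrupted first indicates the side of approach."""
    if first_break_times["outer"] < first_break_times["inner"]:
        return "inward"
    if first_break_times["inner"] < first_break_times["outer"]:
        return "outward"
    return "undetermined"

# An object breaks the outer beam 120 ms before the inner one:
d = direction_of_movement({"outer": 0.00, "inner": 0.12})
```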
  • The apparatus may include alarm generating means operable to generate an alarm when a beam is interrupted.
  • If a plurality of different zones are to be monitored, inputs from the different zones may be fed via a multiplexer to a single processor.
  • In order to improve operation of the classifying means in its operational environment, as much pre-classified data as possible relating to beam interruptions by known objects is gathered, care being exercised that it represents as far as possible all manifestations of all types of activity expected to cause interruption of the beams. This data is then used to train the neural network. Selection of the criteria and capturing of known beam interruption causes may be done as an off-line process and need be done only once for a particular installation on a separate computer.
  • The interconnection weights used in the neural network can be stored in a replaceable permanent memory, such as ROM, PROM, EPROM, EEPROM, and so on, for easy and quick upgrading of the algorithm if, based on experience in use of the apparatus in a particular location or generally, circumstances require this or if more accurate data become available.
  • Since a correctly trained and a correctly sized network is capable of generalising, fundamental rules distinguishing different types of activity from each other are, in effect, discovered by the network itself during training. It is therefore important that the training data is representative.
  • An embodiment of the invention is now described, by way of example, with reference to the accompanying drawings, in which:
    • Figure 1 is a side view of one form of sensing means used with a monitoring apparatus in accordance with the invention;
    • Figure 2 is a schematic block diagram of the monitoring apparatus;
    • Figure 3 is a schematic diagram of a simple neural network which can be used in the monitoring apparatus; and
    • Figure 4 shows schematic diagrams of various outputs of the neural network resulting from interruptions of the sensing means of Figure 1.
  • Referring to Figure 1, reference numeral 10 generally indicates a typical form of sensing means used to monitor activities occurring in a zone to be monitored. The sensing means 10 includes a pair of poles 12.1 and 12.2 anchored in the ground at extremities of the zone. The poles 12.1 and 12.2 can be mounted relatively close together, eg about 2 metres apart, or up to 1000 metres or more apart dependent upon the zone to be monitored. In the embodiment illustrated four vertically spaced optical beams 14, which are preferably infra red beams, are transmitted between the poles 12.1 and 12.2 by suitable light emitters (not shown) contained within one or both of the poles 12.1 and 12.2. Each beam 14 is received by a suitable light detector (not shown) so that whenever one of the beams is interrupted, this can be detected.
  • Referring now to Figure 2, a monitoring apparatus 20 is shown connected to three of the sensing means 10.1, 10.2 and 10.3 of Figure 1. Obviously, more or fewer than three sensing means may be used. The light detectors of the sensing means are scanned continuously so as to detect the presence or absence of light received from the light emitters at any instant of time. Raw binary data received from the light detectors in each sensing means 10.1, 10.2 and 10.3, indicative of whether or not a particular beam is broken, are generated and fed via a multiplexer 22 to graphical generation means 24 which generates a graphical representation of interruptions of the light beams 14 with respect to time. The raw data are also simultaneously stored in raw data storage means 25 so that the raw data can be made available at any later stage.
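The scanning step can be sketched as polling each sensing means in turn through the multiplexer and packing the per-beam states into a small bitmask. This is only an illustration; `read_detector` is a hypothetical stand-in for the real hardware interface:

```python
# Sketch of the continuous scan: each sensing means contributes one bit per
# beam (1 = beam broken), and the multiplexer presents the sensing means to
# the processor one at a time. `read_detector` is a hypothetical stand-in.
def read_detector(sensing_means: int, beam: int) -> bool:
    # Stand-in: pretend beam 1 of sensing means 0 is currently broken.
    return sensing_means == 0 and beam == 1

def scan_all(num_sensing_means: int = 3, beams_per_fence: int = 4) -> list[int]:
    """Return one bitmask per sensing means, bit n set if beam n is broken."""
    masks = []
    for s in range(num_sensing_means):          # multiplexer selects fence s
        mask = 0
        for b in range(beams_per_fence):
            if read_detector(s, b):
                mask |= 1 << b
        masks.append(mask)
    return masks

masks = scan_all()
```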
  • The graphical representation is then fed to a classifier 26. The classifier 26 includes a pre-processor 28, a neural network 30 and comparing means 32. The graphical representation is compared with preselected graphical representations stored in weighted storage means 34. The output of the comparing means 32 is then fed to a controller 36 which may be in the form of a computer. The controller 36 reacts to the output of the classifier 26 and in appropriate circumstances energises alarm means 38. The controller also is connected to a printer and/or a display device 39 to enable the classified graphical representation to be printed and/or displayed.
  • In order to train the classifier 26 to detect and classify expected known objects, a training workstation or computer 35 is used.
  • During training, data relating to known objects are classified and pre-processed and stored in the weighted storage means 34 thereby to train the classifier 26 to distinguish between the preselected classes of expected objects. In a preferred embodiment, the neural network 30 is presented with seventeen inputs or features relating to any particular preselected object. The seventeen inputs include a signal representative of the duration of any particular event and signals representative of the first sixteen two-dimensional moments of the object. Two-dimensional moments are defined as:
    m_ij = Σ_x Σ_y x^i y^j f(x, y),
    where f(x, y) is the value of the graphical representation at position (x, y).

       In this case both i and j are in the range [0...3]. After training, preselected network weights are transferred to the weight storage means 34 which feeds the neural network 30. Typical outputs from the neural network are shown in Figure 4 and can be presented to an operator via the display device and/or printer 39, combined with an audio warning signal from the alarm means 38.
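The seventeen-element input vector (the event duration plus the sixteen moments with i and j each running over 0...3) can be sketched directly from that definition. The binary image layout assumed here (rows = beams, columns = time steps of the normalised graph) is an illustration, not taken from the patent:

```python
def moment(img: list[list[int]], i: int, j: int) -> float:
    """Two-dimensional moment m_ij = sum over x, y of x^i * y^j * f(x, y),
    where f(x, y) is 1 where the beam-interruption graph is set, else 0."""
    return float(sum((x ** i) * (y ** j) * px
                     for y, row in enumerate(img)
                     for x, px in enumerate(row)))

def features(img: list[list[int]], duration_ms: float) -> list[float]:
    """Seventeen inputs: the event duration plus the first sixteen
    two-dimensional moments (i and j each in the range 0..3)."""
    return [duration_ms] + [moment(img, i, j) for i in range(4) for j in range(4)]

# Tiny illustrative graph: 2 beams (rows) x 3 normalised time steps (columns).
img = [[1, 1, 0],
       [0, 1, 1]]
vec = features(img, duration_ms=350.0)
```

Note that m_00 is simply the number of set cells in the graph, so it acts as an area measure of the event.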
  • Each change in beam status is logged with the time of occurrence, and the graphical representation is built up by normalizing the duration of the activity. The duration can be displayed (in milliseconds) directly above the displayed event.
  • Figure 4 shows a typical output display from the neural network 30 after classification when presented with various inputs from various classes of objects. Raw data of different persons walking through the sensing means 10 is shown by the graphical representations 40.1, 40.2 and 40.3; of a motor vehicle passing the sensing means 10 by the graphical representations 42.1 and 42.2; of a person crawling through the sensing means 10 by the graphical representations 44.1 and 44.2; and of a dog walking through the sensing means 10 by the graphical representations 46.1 and 46.2. These graphical representations are compared with preselected graphs (not shown) stored in the storage means 34 of Figure 2. In each case, directly to the right of the graphical representations of the raw data, an icon 48 denoting the class decided upon by the monitoring apparatus is shown so as to enable an operator easily to identify the object sensed.
  • A simplified form of neural network 30 is shown in Figure 3. The network 30 has three input nodes or neurons 52.1, 52.2 and 52.3, an intermediate series of neurons 54.1, 54.2, 54.3 and 54.4, and two output neurons 56.1 and 56.2. The neurons are interconnected by weighted lines 58 and 60 whereby different weights can be applied from a preceding neuron to influence the output of a subsequent neuron when data are presented to the input neurons 52.1, 52.2 and 52.3.
  • Thus data received by say the input neuron 52.1 will be weighted by the weighted lines 58 and influence the output of the intermediate neurons 54.1 to 54.4 to present a signal which is itself weighted by the weighted lines 60 and which is then presented to the output neurons 56.1 and 56.2. A neuron can thus be defined as a node with a number of weighted inputs which are summed together and passed through some form of non-linear process. The most common non-linearity used is the sigmoid function:
    σ(x) = 1 / (1 + e^(-x))

    which maps the input to an output between 0 and 1. The values of inputs to the nodes 52.1 to 52.3 are multiplied by appropriate selected weights on lines 58 and summed in the neurons 54.1 to 54.4. The outputs of these neurons 54.1 to 54.4 are then passed in the same way to the next layer (in this instance the output neurons 56.1 and 56.2, although there can be more than one intermediate layer of neurons). The classification is indicated by the output neuron 56.1 or 56.2 with the highest output value. Obviously the performance of the network hinges on the values of the interconnection weights on lines 58 and 60. Adjustment of these weights is an optimization process which can be done in several ways (mostly variants of gradient-search routines) of which the conjugate-gradient algorithm is one of the most efficient.
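The forward pass just described (a weighted sum at each neuron, the sigmoid non-linearity, then picking the output neuron with the highest value) can be sketched for the 3-4-2 network of Figure 3. The weight values below are arbitrary placeholders, since the real values come only from training:

```python
import math

def sigmoid(x: float) -> float:
    """Maps any input to an output between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights):
    """One multiplication and one addition per interconnection,
    then the sigmoid non-linearity at each neuron."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs))) for ws in weights]

def classify(inputs, hidden_weights, output_weights) -> int:
    """Return the index of the output neuron with the highest value."""
    hidden = layer(inputs, hidden_weights)
    outputs = layer(hidden, output_weights)
    return max(range(len(outputs)), key=outputs.__getitem__)

# 3 inputs -> 4 intermediate neurons -> 2 outputs, as in Figure 3.
# Placeholder weights, not trained values:
W1 = [[0.5, -0.2, 0.1]] * 4          # lines 58
W2 = [[0.3, 0.3, 0.3, 0.3],          # lines 60
      [-0.3, -0.3, -0.3, -0.3]]
cls = classify([1.0, 0.0, 1.0], W1, W2)
```

Because classification is only multiplications, additions and table-driven non-linearities, it fits comfortably on a modest embedded processor, as the text notes.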
  • Table 1 below illustrates a matrix associated with a neural network trained by a conjugate gradient training algorithm to distinguish between walking humans (class 1), motor vehicles (class 2), crawling humans (class 3) and dogs (class 4). In Table 1, the rows correspond to true classes and the columns to the classes assigned to sample objects by the neural network 30. A network with seventeen input neurons, twelve intermediate neurons and four output neurons was used. The seventeen input neurons were chosen to correspond to the first sixteen two-dimensional moments of the raw input graphical representations plus the total duration of the event causing interruption of the beams 14. Four output neurons were used to denote the four classes the network was trained to distinguish.
    (Table 1: confusion matrix - rows are true classes, columns the classes assigned by the network; reproduced as image imgb0003 in the original)
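The seventeen-element input vector described above (the first sixteen two-dimensional moments plus the event duration) could be derived along these lines. This is an illustrative sketch; the moment orders (p, q in 0..3) and the binary image encoding are assumptions, as the patent does not spell them out.

```python
def beam_features(image):
    # image[t][b] is 1 when beam b was interrupted at time step t, else 0.
    # Sixteen two-dimensional moments m_pq = sum over t, b of t^p * b^q * image[t][b]
    # for p, q in 0..3, plus the event duration, give seventeen inputs.
    feats = []
    for p in range(4):
        for q in range(4):
            feats.append(sum((t ** p) * (b ** q) * v
                             for t, row in enumerate(image)
                             for b, v in enumerate(row)))
    feats.append(len(image))  # total duration of the event, in time steps
    return feats
```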
  • It can be seen in Table 1 that the network has no difficulty in distinguishing the first two classes, ie the walking humans and the motor vehicles, from the others and from each other. As can be expected it was somewhat more difficult to distinguish a crawling human (class 3) from a dog (class 4), with 2 out of 52 humans being classified as dogs and 4 out of 109 dogs being classified as humans. This nevertheless gives better than 96% accuracy on these two classes, with an overall accuracy of 97.68%.
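The per-class figures quoted above can be checked from the two rows of Table 1 that the text describes. This is a sketch: only the class 3 and class 4 rows are quoted, and the zero entries towards classes 1 and 2 are an assumption based on the statement that those classes were distinguished without difficulty.

```python
def class_accuracy(confusion, cls):
    # confusion[i][j]: number of true-class-i samples assigned to class j.
    return confusion[cls][cls] / sum(confusion[cls])

def overall_accuracy(confusion):
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    return correct / sum(sum(row) for row in confusion)

# Rows for class 3 (crawling humans) and class 4 (dogs) as quoted:
# 2 of 52 humans classified as dogs, 4 of 109 dogs classified as humans.
rows_3_and_4 = [[50, 2],
                [4, 105]]
```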
  • Table 2 below shows the network weights which can be used for a two-input, two-output neural network with three intermediate neurons, partly trained on an exclusive-or problem, ie similar to that shown in Figure 3 except that there are only three intermediate neurons 54.1 to 54.3 instead of the four neurons 54.1 to 54.4 shown in Figure 3. This problem has two classes (1 and 2) and two inputs. Inputs (0,0) and (1,1) fall in class 1, while the other two, (0,1) and (1,0), constitute class 2. It will be seen that Table 2 shows the network as having three inputs. This is because one input neuron is added, with constant activation of 1, to enable a variable threshold to be realized during training.
    (Table 2: interconnection weights of the partly trained exclusive-or network; reproduced as image imgb0004 in the original)
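The constant-activation bias input described above can be illustrated with a tiny exclusive-or network. The weights here are hand-picked for illustration and are not those of Table 2; two hidden units suffice for this sketch rather than the three of the trained network.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def xor_net(x1, x2):
    # A constant input of 1 is appended, so the last weight in each row acts
    # as a variable threshold, in the manner described for Table 2.
    inputs = [x1, x2, 1.0]
    w_hidden = [[20, 20, -10],     # roughly an OR gate
                [-20, -20, 30]]    # roughly a NAND gate
    hidden = [sigmoid(sum(w * v for w, v in zip(row, inputs)))
              for row in w_hidden]
    w_output = [20, 20, -30]       # roughly AND of the two hidden outputs
    return sigmoid(sum(w * h for w, h in zip(w_output, hidden + [1.0])))
```

Outputs above 0.5 indicate class 2 (inputs (0,1) and (1,0)); outputs below 0.5 indicate class 1 (inputs (0,0) and (1,1)).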
  • The invention illustrated provides a monitoring apparatus which is simple because the number of primary inputs provided by the sensing means 10 is small. Two to eight beams 14 can be used in the sensing means of Figure 1. Also, the inputs are binary, ie the beam 14 is either broken or is not, or are quantized to at most a few possible values to provide for partial obscuration of the beams 14. The number of outputs is limited to an alarm signal plus one of a few possible classifications. It will be appreciated that the response provided by the apparatus need not be instantaneous, as a reaction time of a few hundred milliseconds to several seconds is acceptable in almost all cases.
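The quantisation of a partially obscured beam into a few input values might look like this. The thresholds below are arbitrary assumptions for illustration, not values disclosed in the patent.

```python
def beam_state(intensity, thresholds=(0.25, 0.75)):
    # Maps a received beam intensity in [0, 1] to a small set of input values:
    # 0 = fully broken, 1 = partially obscured, 2 = unobstructed.
    state = 0
    for t in thresholds:
        if intensity >= t:
            state += 1
    return state
```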
  • The applicant believes that, because an array of active optical beam sensors is combined with a pattern recognition algorithm, the false alarm rate can be drastically reduced compared with that of a conventional rule-based beam security system.
  • The invention illustrated facilitates the optimal utilisation of data gathered by an optical beam fence in deciding on the issuing and classification of alarm messages. It furthermore allows the user to eliminate false alarms produced by spurious causes such as power-line glitches, and enables a system based on the architecture described to degrade gracefully when the beam array is impaired by failure or obstruction. Central processing of data by the controller 36 also eliminates duplication of logic services at each sensing means 10.1, 10.2 and 10.3, allowing a high level of system integrity to be maintained, since beam status information can be passed to the controller 36 together with data relating to interruption of the beams 14. Classification is not done by a rule-based expert system and does not use human value judgements, since real-world data are used as the only inputs during training of the neural network 30. Although training is normally done on a personal computer or workstation, the classification can be implemented on a single-chip microprocessor.

Claims (10)

  1. A method of monitoring a zone characterised in that it includes transmitting a plurality of beams (14) between at least two locations (12.1,12.2) defining extremities of the zone, detecting interruptions of the beams (14) caused by an object moving between the two locations (12.1,12.2), generating a graph (40.1,40.2; 42.1,42.2; 44.1,44.2; 46.1,46.2) of the interruptions relative to time, and comparing the graph generated with a set of preselected graphs representative of known objects.
  2. A method as claimed in claim 1, characterised in that the graph generated is compared with the preselected graphs using image recognition techniques.
  3. A method as claimed in claim 1 or claim 2, characterised in that interruptions of the beams (14) are monitored continuously and in that each interruption, restoration or partial occlusion of any beam is stored together with the time of occurrence and in that the graph of interruptions generated in the form of a temporal profile is translated into a physical outline profile.
  4. A method as claimed in any one of the preceding claims, characterised in that it includes pre-processing the graph generated and then submitting data forming the graph in a suitable format to a pattern recognition algorithm to permit classification of the cause of the interruption into various classes and displaying an indicator representative of the probable cause in terms of the classification.
  5. A method as claimed in any one of the preceding claims, characterised in that the preselected graphs are stored in a neural network (30) comprising a plurality of neural nodes (52.1,52.2,52.3; 54.1,54.2,54.3,54.4; 56.1,56.2) and in that signals received from different nodes are selectively biased by interconnection weights (58,60) thereby to vary the effect of signals received from different nodes.
  6. A monitoring apparatus for monitoring a zone, characterised in that it includes
       storage means (34) for storing a set of preselected graphic representations of known objects;
       sensing means for sensing interruptions of a plurality of beams (14);
       graph generation means (24) responsive to the sensing means and operable to generate a graphical representation (40.1,40.2; 42.1,42.2; 44.1,44.2; 46.1,46.2) with respect to time of the interruptions sensed by the sensing means; and
       comparing means (32) operable to compare the graphical representation generated by the graph generation means (24) with the graphical representations stored in the storage means (34).
  7. An apparatus as claimed in claim 6, characterised in that the sensing means includes an array of at least two optical beams (14) transmitted in spaced relationship from at least one location (12.1) at one extremity of the zone to be monitored, and in that another location (12.2) at an opposite extremity of the zone has detector elements responsive to the optical beams (14) and operable to detect movement of objects through the zone by detecting interruptions of the beams.
  8. An apparatus as claimed in claim 6 or claim 7, characterised in that it includes alarm generating means (38) operable to generate an alarm when a beam is interrupted and in that it includes a multiplexer (22) operable to pass input signals received from a plurality of different zones to be monitored to a single processor (26,36).
  9. An apparatus as claimed in any one of the preceding claims 6 to 8, characterised in that it includes a neural network (30) comprising a plurality of interconnected neural nodes (52.1,52.2,52.3; 54.1,54.2,54.3,54.4; 56.1,56.2), and in that it includes biassing means (58,60) operable to bias signals received from the nodes with different interconnection weights thereby to vary the effect of signals received from the various nodes.
  10. An apparatus as claimed in any one of the preceding claims 6 to 9, characterised in that it includes classification means (26) operable to classify interruptions of the beams into a plurality of classes representative of likely causes of the interruptions, and display means (39) operable to display an indication of the class derived by the classification means.
EP93301284A 1992-03-04 1993-02-22 Monitoring system Withdrawn EP0559357A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ZA921621 1992-03-04
ZA921621 1992-03-04

Publications (1)

Publication Number Publication Date
EP0559357A1 true EP0559357A1 (en) 1993-09-08

Family

ID=25581470

Family Applications (1)

Application Number Title Priority Date Filing Date
EP93301284A Withdrawn EP0559357A1 (en) 1992-03-04 1993-02-22 Monitoring system

Country Status (2)

Country Link
EP (1) EP0559357A1 (en)
ZA (1) ZA929406B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3825916A (en) * 1972-10-20 1974-07-23 California Crime Technological Laser fence
US3898639A (en) * 1972-08-24 1975-08-05 Hrand M Muncheryan Security surveillance laser system
DE2940414A1 (en) * 1979-10-05 1981-04-09 Licentia Patent-Verwaltungs-Gmbh, 6000 Frankfurt IR beams across waterway identify passing ships - by comparing hull and superstructure signatures with stored data from which irrelevant detail is suppressed
EP0118182A2 (en) * 1983-02-08 1984-09-12 Pattern Processing Technologies Inc. Pattern processing method
WO1988000745A1 (en) * 1986-07-24 1988-01-28 Keith Jeffrey Gate Detection system
FR2670404A1 (en) * 1990-12-12 1992-06-19 Dassault Electronique Device and method for automatic classification of independent vehicles on the move

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994008258A1 (en) * 1992-10-07 1994-04-14 Octrooibureau Kisch N.V. Apparatus and a method for classifying movement of objects along a passage
US5519784A (en) * 1992-10-07 1996-05-21 Vermeulen; Pieter J. E. Apparatus for classifying movement of objects along a passage by type and direction employing time domain patterns
EP0875873A1 (en) * 1997-04-30 1998-11-04 Sick Ag Opto-electronic sensor
US6023335A (en) * 1997-04-30 2000-02-08 Sick Ag Optoelectronic sensor
EP0892280A2 (en) * 1997-07-15 1999-01-20 Sick AG Method for operating an opto-electronic sensor device
EP0892280A3 (en) * 1997-07-15 1999-11-03 Sick AG Method for operating an opto-electronic sensor device
FR2826443A1 (en) * 2001-06-21 2002-12-27 Gilles Cavallucci METHOD AND DEVICE FOR OPTICALLY DETECTING THE POSITION OF AN OBJECT
WO2003003580A1 (en) * 2001-06-21 2003-01-09 H2I Technologies Method and device for optical detection of the position of an object
US7221462B2 (en) 2001-06-21 2007-05-22 H2I Technologies, Societe Anonyme a Directoire et Conseil de Surveillance Method and device for optical detection of the position of an object
FR2867864A1 (en) * 2004-03-17 2005-09-23 Automatic Systems METHOD AND INSTALLATION FOR DETECTING A PASSAGE ASSOCIATED WITH AN ACCESS DOOR
WO2005101062A1 (en) * 2004-03-17 2005-10-27 Automatic Systems Method and device for detecting a passage associated with an access door
US11747513B2 (en) 2018-12-20 2023-09-05 Sick Ag Sensor apparatus
DE102021005129A1 (en) 2021-10-13 2023-04-13 vanory GmbH Method and device for controlling electronic devices, in particular lights

Also Published As

Publication number Publication date
ZA929406B (en) 1993-09-27

Similar Documents

Publication Publication Date Title
EP0664012B1 (en) Method and apparatus for classifying movement of objects along a passage
US5214744A (en) Method and apparatus for automatically identifying targets in sonar images
US7170418B2 (en) Probabilistic neural network for multi-criteria event detector
US5101194A (en) Pattern-recognizing passive infrared radiation detection system
Yue et al. A bio-inspired visual collision detection mechanism for cars: Optimisation of a model of a locust neuron to a novel environment
EP0702800B1 (en) Detector systems
CA2275893C (en) Low false alarm rate video security system using object classification
CN107665326A (en) Monitoring system, passenger transporter and its monitoring method of passenger transporter
CN106846729A (en) A kind of fall detection method and system based on convolutional neural networks
GB2251310A (en) Method for detecting and classifying features in sonar images
JP2019079445A (en) Fire monitoring system
KR20190046351A (en) Method and Apparatus for Detecting Intruder
CN111686392A (en) Artificial intelligence fire extinguishing system is surveyed to full scene of vision condition
CN113484858A (en) Intrusion detection method and system
KR102360568B1 (en) Method and system for detecting incident in tunnel environment
CN109870250A (en) Region exception body temperature monitoring method, device and computer readable storage medium
EP0559357A1 (en) Monitoring system
CN108319892A (en) A kind of vehicle safety method for early warning and system based on genetic algorithm
FR2418505A1 (en) LOCATION MONITORING SYSTEM
US20210312190A1 (en) Monitoring device, and method for monitoring a man overboard situation
Mantri et al. Analysis of feedforward-backpropagation neural networks used in vehicle detection
KR102140195B1 (en) Method for detecting invasion of wild animal using radar and system thereof
KR20220113631A (en) Dangerous situation detection device and dangerous situation detection method
KR102556447B1 (en) A situation judgment system using pattern analysis
RU2665264C2 (en) Intelligent system of intruder detection

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LI LU MC NL PT SE

17P Request for examination filed

Effective date: 19940301

17Q First examination report despatched

Effective date: 19961118

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Withdrawal date: 19970217