GB2337146A - Detecting motion across a surveillance area - Google Patents

Detecting motion across a surveillance area

Info

Publication number
GB2337146A
Authority
GB
United Kingdom
Prior art keywords
zone
frame
moving
region
moving region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9809807A
Other versions
GB9809807D0 (en)
GB2337146B (en)
Inventor
Geoffrey Lawrence Thiel
Jonathan Hedley Fernyhough
Andrew Evripedes Kyri Georgiou
Peter Richard Seed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PRIMARY IMAGE Ltd
Original Assignee
PRIMARY IMAGE Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PRIMARY IMAGE Ltd filed Critical PRIMARY IMAGE Ltd
Priority to GB9809807A priority Critical patent/GB2337146B/en
Publication of GB9809807D0 publication Critical patent/GB9809807D0/en
Priority to PCT/GB1999/001457 priority patent/WO1999059116A1/en
Publication of GB2337146A publication Critical patent/GB2337146A/en
Application granted Critical
Publication of GB2337146B publication Critical patent/GB2337146B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/53 Recognition of crowd images, e.g. recognition of crowd congestion
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B 13/19604 Image analysis to detect motion of the intruder, e.g. by frame subtraction involving reference image or background adaptation with time to compensate for changing conditions, e.g. reference image update on detection of light level change
    • G08B 13/19608 Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • G08B 13/1961 Movement detection not involving frame subtraction, e.g. motion detection on the basis of luminance changes in the image
    • G08B 13/19639 Details of the system layout
    • G08B 13/19652 Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A camera captures, in successive frames, images of an object moving across a surveillance area. Successive frames are correlated to identify moving regions within each frame and to evaluate motion vectors for each region. Each moving region is thus tracked across the surveillance area. If no moving region is found in a frame to correspond to a moving region in a previous frame, a check is made for a corresponding region in a background mask; the background mask is updated frame-by-frame. If no corresponding region is found in the background mask, the moving region is considered lost and a preliminary alarm state is set. A full alarm state is then set if either a moving region is tracked traversing the surveillance area in an illegal direction or a moving region is tracked entering the surveillance area, is lost, and then another moving region (which may, or is expected to, correspond to the same object) is tracked leaving or about to leave the surveillance area in an illegal direction while a preliminary alarm state is set.

Description

Method and Apparatus for Detecting Motion Across a Surveillance Area

The invention relates to a method and apparatus for detecting motion of an object across a surveillance area, and in particular for raising an alarm in the event of detecting motion of an object across a surveillance area in a predetermined direction in the presence of objects traversing the area in the opposite direction.
The invention finds particular application, for example, in using a video camera to monitor people passing through a departure gate in an airport, processing images captured by the video camera to detect individuals passing through the gate in the wrong direction, and raising an alarm if such illegal motion occurs.
A number of image processing methods for detecting the motion of objects are known, but they give rise to disadvantages in applications such as that described above. Examples are as follows:
The simplest approach may be frame-to-frame differencing, whereby the differences between pixel intensities in adjacent frames can be used to identify objects moving through a scene. Typically, this technique highlights the leading and trailing edges of each moving object rather than the actual objects. The contrast polarity of the leading and trailing edges can then be used to obtain the direction of motion, assuming the background has a constant contrast and that the leading and trailing edges can be accurately grouped. However, it is not practical to assume that the background has a constant contrast and it is not a trivial matter to group leading and trailing edges correctly. Overlapping or closely-spaced objects cause additional complications when identifying matching edges. This technique is also sensitive to light conditions, contrast and camera motion/shaking. Should the scene be densely populated with objects moving close together and in various directions, it is likely that only a single moving region would be identified.
A similar approach uses background subtraction, followed by thresholding and segmentation to identify objects in the scene. This technique assumes that a good background image is available or can be generated. From frame to frame, each identified region can then be tracked and labelled through proximity measurements, and the direction of motion obtained by considering the location of the labelled regions in each frame. As above, this technique is sensitive to light conditions, contrast and camera motion/shake. Additionally, as thresholding is used prior to segmentation, overlapping or closely-spaced objects may be grouped together and identified as a single amalgamated region, including objects moving in different directions. With objects being lost in this way, it is not possible to accurately track objects moving in the wrong direction. Additionally, as with frame differencing, tracking objects moving in different directions is not possible.
Model-based tracking techniques, using a spatio-temporal filter (for example, a Kalman filter), could be used to accurately track individual objects through the scene.
Typically these techniques require a large amount of training footage to accurately model the typical kinds of objects which move through the scene. If unusually-shaped objects then travel through the scene, the tracker is likely to become confused and not track those objects successfully. Once trained, the application is highly specific to the direction of motion and the camera angle; if the camera is moved to a different location the application is likely to require further training data.
Typically, this kind of tracking application is highly processor-intensive and may require specialised hardware to perform at full frame rate.
Techniques based on a number of surveillance "gates" can detect motion based on the order, and time, in which the gates are activated. Motion within a gate can be detected using any of a number of simple techniques; for example, background subtraction and histogram analysis. These techniques are prone to aliasing problems, whereby a gate may be triggered by an object moving in the correct direction followed shortly afterwards by a second object, also moving in the correct direction, triggering an adjacent gate. By adding more gates, along with a combination of logical and timing parameters, more complex scenarios can be catered for, but this tends to lead to overly-complex configuration requirements.
The invention provides in its various aspects a motion detection method and apparatus and a method and apparatus for raising an alarm as defined in the appendant independent claims. Preferred or advantageous features of the invention are defined in dependent subclaims.
Circumstances where objects are not all moving in one direction, and may reverse their direction of motion, present difficulties for motion detection systems. These difficulties are increased where large numbers of non-identical objects may be moving close together. The invention seeks to solve these problems while requiring as little processing power as possible to carry out the necessary image processing.
The system of the invention advantageously does not attempt to reliably identify and track individual objects through a field of view. Rather, the invention is intended to identify the motion of individual or grouped objects and to track the motion in order to recognise illegal movement through the field of view or surveillance area.
The invention advantageously allows an alarm to be raised if an object traverses a surveillance area in an illegal direction, when many objects may simultaneously be traversing the surveillance area in the opposite, legal direction. The invention is therefore particularly applicable to monitoring gates through which people may only pass in one direction, such as gates at airports, ports or railway stations, or entrance or exit gates at sporting events or other public events.
A significant advantage of the invention is its capacity to recognise objects moving in different directions. It may therefore find application in, for example, counting objects traversing a surveillance area when those objects may not always move in one direction. An example would be monitoring motor traffic.
The invention provides a robust motion detector which can accommodate motion of objects in different directions, and discontinuous motion of objects, using an image processing technique which is preferably sufficiently compact to be executed on a suitably-programmed personal computer (PC), even at full video frame rate. This gives very significant advantages of cost and ease of manufacture and implementation.
The robust motion detection advantageously derives from the combination of several features as set out in the independent claims. Thus, the invention does not attempt to track individual objects, but looks for regions of similar motion and then tracks those regions through the scene. If a tracked region is lost, a fallback advantageously determines if the region is still valid from a background mask. If the region still cannot be identified, a preliminary alarm state may be set to identify the object as it proceeds across the surveillance area.
To summarise, the invention may advantageously provide the following benefits: robust motion detection which performs well with crowd scenes; a system having only a simple configuration and which may require no learning or training and so be able to cope with unexpected changes in data (for example, the system may track a person in a wheelchair even though this type of object may not have been seen before); a system which may advantageously not suffer from (aliasing) false alarms caused by several objects moving in a legal direction; a system having good rejection characteristics to lighting changes and camera movements; and a system which need not require specialised hardware but may function at full frame rate on currently available hardware (an off-the-shelf PC).
Advantageously, the system of the invention may alternatively be implemented on a dedicated microprocessor or similar circuitry for mass production to reduce costs.
Advantageously, apart from external configuration of the surveillance area within the field of view, the system of the invention may be self-configuring and may only require further external configuration for a delay period used in determining the Alarm State.
Detailed Description of Specific Embodiments

Specific embodiments of the invention will now be described by way of example, with reference to the drawings, in which:

Figure 1 shows a field of view of a video camera for capturing images for motion detection;
Figure 2 shows a correlator grid overlaid on a portion of a frame captured by the camera;
Figure 3 shows a search area and a correlator in both its correlator position and a correlator match position;
Figure 4 shows a flow chart of the overall operation of an embodiment of the invention;
Figure 5 shows a flow chart of the procedure for labelling regions in successive frames according to the embodiment; and
Figure 6 shows a flow chart of the alarm-state-determining routine of the embodiment.
A preferred embodiment of the invention provides a system for identifying objects that illegally enter and leave a surveillance area in which legal motion is limited to a single direction. The system uses live video images from a static camera, received at full frame rate (25fps with PAL and 30fps with NTSC). Motion through the area can be extremely dense such that automatically identifying individual objects may not be possible by simple segmentation techniques based on frame differences. Rather than attempting to track individual objects, the system therefore tracks regions of motion through the field of view using localised optical flow.
The only alarm situation is when an object illegally enters the surveillance area from the normal exit and leaves the surveillance area through the normal entrance. Objects entering the surveillance area correctly, reversing direction and then leaving through the entrance should not raise the alarm. Similarly, objects which illegally enter through the exit but reverse direction and leave through the exit should not raise the alarm.
The embodiment of the invention is implemented at, for example, a departure gate at an airport, where people are permitted to pass in one direction through the gate and may do so in large numbers, closely-spaced. It is important, however, that no-one passes through the gate in the wrong direction and the system of the embodiment is for raising an alarm if anyone does so.
In the embodiment, an overhead camera provides live data from the surveillance area (the departure gate) with an alarm state being activated when illegal motion is detected. At that time, relays may be activated to sound an alarm and/or to flash a warning lamp and/or a video printer may be triggered to print an image of the object performing the illegal motion. The video printer does not have to share the same view as the surveillance camera. Instead, a separate camera, which has a clearer view of the object illegally leaving the surveillance area, can be used.
The field of view of the camera is split into four zones or areas as shown in figure 1. These are an entrance zone 2, an exit zone 4, a travel zone 6, and the remaining area 8.
The entrance zone, exit zone and travel zone combine to form a surveillance area 10. The surveillance area must span the full width of the departure gate and is transversely divided by the entrance and exit zones, separated by the travel zone. The sizes of these different zones are completely configurable and may be predetermined for different applications. For maximum alarm protection the entrance zone and exit zone would meet, giving a travel zone of zero width. This means that any object just crossing the line between the zones will raise the alarm, which may increase the number of false alarms triggered. The entrance and exit zones do not have to be large; a reasonable width for each zone is about the distance travelled by a typical object in about half a second. The width of the travel zone is less significant and can be predetermined depending on the application of the system. An illustrative zone configuration is sketched below.
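For illustration only, such a zone layout might be held as bands of image rows spanning the full gate width. The figures below are assumptions for a 400-pixel-wide view of a 4-metre gate (1 cm per pixel), not values taken from the patent:

    # Illustrative zone configuration: bands of image rows across the gate.
    # At walking pace (~1.2 m/s) and 1 cm per pixel, 60 pixels is roughly
    # the distance covered in about half a second, as suggested above.
    ZONES = {
        "exit":     (0, 60),     # rows 0-59: legal direction of exit
        "travel":   (60, 340),   # rows 60-339
        "entrance": (340, 400),  # rows 340-399: legal direction of entry
    }

    def zone_of(row):
        """Return the name of the zone containing image row `row`,
        or None if the row lies outside the surveillance area."""
        for name, (lo, hi) in ZONES.items():
            if lo <= row < hi:
                return name
        return None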
For each frame captured by the camera, the system performs the following basic steps, as described in detail below and as illustrated in figure 4: update background mask (step 52), calculate motion vectors (step 54), identify moving regions (step 56), label regions (step 58), and determine alarm state (step 60).
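The loop can be pictured as the skeleton below. It is a sketch only: the class name and method signatures are illustrative rather than from the patent, and each step body is elided here and sketched in the sections that follow.

    class MotionDetector:
        """Per-frame processing loop of figure 4 (steps 52-60), in outline."""

        def __init__(self, first_frame):
            self.background = first_frame.copy()   # background image b
            self.previous_frame = first_frame
            self.regions = []                      # tracked moving regions

        def process(self, frame):
            self.update_background_mask(frame)               # step 52
            vectors = self.calculate_motion_vectors(frame)   # step 54
            regions = self.identify_moving_regions(vectors)  # step 56
            self.label_regions(regions)                      # step 58
            alarm = self.determine_alarm_state()             # step 60
            self.previous_frame = frame
            return alarm

        # Step bodies elided; sketches of each appear below.
        def update_background_mask(self, frame): ...
        def calculate_motion_vectors(self, frame): ...
        def identify_moving_regions(self, vectors): ...
        def label_regions(self, regions): ...
        def determine_alarm_state(self): return False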
Update Background Mask
When each new frame is captured by the camera (which captures monochrome frames, comprising pixels each having only a grey-scale intensity), a background image b is updated dynamically (step 52) by weighted averaging, with weight W, of the current frame f with the existing background image. A rounding operation (either floor or ceiling) is applied to reduce rounding errors in the calculation of the weighted average. So, at each point (x, y) within the surveillance area, the background is updated using:

b(x, y) = floor((f(x, y) - b(x, y)) / W + b(x, y))    for f(x, y) - b(x, y) > 0
b(x, y) = ceiling((f(x, y) - b(x, y)) / W + b(x, y))  otherwise

Without the floor and ceiling coefficients, high intensity pixels in the background image can build up over time and not be representative of the "true" background. The dynamic updating of the background image has the effect that any object which stops moving and remains stationary is slowly merged into the background over a number of frames.
A "differenced" image is then obtained by applying image subtraction between the updated background image and the current frame. Image subtraction is used to subtract the image intensities at each pixel in the background image from the current frame image. The resulting differenced image is then thresholded by flagging all pixels having image intensity greater than a predetermined threshold to obtain a background mask in the form of a binary image where the flagged pixels represent significant changes in intensity such as objects moving in the scene. A background mask corresponding to each successive frame is generated. Thus, the background mask provides a template highlighting all areas in the field of view that are not part of the perceived background - in particular, highlighting all objects that have "recently" entered the surveillance area or that are moving within the surveillance area.
This technique is known for use as a first step in many vision and tracking applications. The background mask is used to segment and identify moving objects in the scene by clustering connected groups of flagged pixels. However, if several moving objects overlap in the image (or are travelling too close together to be separately identified in the background mask) a single amalgamated region may be obtained rather than individual regions for each object. Additionally, the technique used by itself
may be disadvantageously sensitive to sudden changes in light conditions, contrast and camera movement.
The specific embodiment of the invention does not use this technique to initially identify objects. Rather, this technique is used later in the image processing procedure of the embodiment to locate objects from the previous frame which have not yet been identified in the current frame using the motion vector image processing method detailed below.
It should therefore be noted that the updating of the background mask need not be the first step of the image processing procedure. It only needs to be carried out before the background mask is used in the step of labelling moving regions, as described below. Updating the background mask has been described first herein only for convenience.
Calculate Motion Vectors

Motion vectors are calculated (step 54) in the embodiment by correlating portions of successive frames. The calculation is based on the frames as captured by the camera (full grey scale).
As illustrated in figure 2, a fixed two-dimensional grid overlay is defined covering the surveillance area. Each intersection 12 in the grid identifies the central point of an area to be correlated between successive frames, obtaining a localised optical flow or motion vector. Each such area is known as a correlator 14.
Correlators can be any size and may overlap other correlators. In the present embodiment, as shown in figure 2, the correlators match the grid spacing so that the correlators do not overlap. The grid spacing and correlator sizes are preferably small enough (for example, 16x16 pixels) such that a moving object will cover more than one correlator.
A single correlator h located at grid position (u, v) relative to an origin (i, j) in frame f, with dimensions w and h (measured in pixels), is defined as the block of pixels:

h(u, v) = { f(u + m, v + n) : m = -w/2, ..., w/2; n = -h/2, ..., h/2 }

As illustrated in figure 3, each frame is compared with the previous frame in order to assign a motion vector to each correlator, where possible. Thus, after each new frame is captured, a search area 16 is defined around each correlator position 12 and a search made within that area for a pattern of pixel intensities similar to the pattern of pixel intensities in the corresponding correlator 14 in the previous frame. Thus, in each new frame, the best match position 18 for each correlator in the previous frame is identified within a search area around the position of that correlator. The offset between the best match position and the correlator position is the motion vector 20 of that correlator.
Pixels outside the search area are not used in any correlation attempt. This limits the size of the maximum motion vector (dx_max, dy_max) to an offset within the search area as follows:
dx_max = (S_x - C_x) / 2
dy_max = (S_y - C_y) / 2

where S_x and S_y are the search area width and height and C_x and C_y are the width and height of each correlator. A correlation area 22 can therefore be defined as the area within the search area which contains pixels on which a correlator area can be centred and not extend outside the search area.
The size of the search area is predetermined depending on the maximum expected travel speed within the surveillance area. For example, if the width of the surveillance area is 4 metres, the image width is 400 pixels, the correlator mask size is 20x20 pixels and the search area is 40x40 pixels, the maximum detectable object travel speed in the image is 10 pixels/frame, which corresponds to a real object speed of about 2.5 metres per second (at 25 frames per second). Limiting the size of the search area advantageously limits the amount of calculation required to perform a correlation operation and, in practice, does not significantly restrict the motion detection capabilities of the system. This is because an object is unlikely to exceed the maximum expected travel speed and because, if an object is travelling very fast across the surveillance area, it is unlikely that a good correlator match will be identified. Real-time image processing is thus made easier without significant performance reduction.
With a correlator area centred at each pixel position (x, y) in the correlation area (i.e. each possible offset for the motion vector), a correlation value C is calculated from the sum of the absolute grey-scale differences between each pixel in the correlator h (from the previous frame) and the corresponding search pixels (in the present frame). As such, an exact match would give a correlation value of zero. However, an exact match is unlikely to occur even if there is no movement from one frame to the next: high-frequency noise from the camera and slight digitiser errors introduce differences such that a perfect match is unlikely. Hence, the lowest correlation value for each pixel position within the correlation area indicates the best match.
i.e. if f is the current frame image, and h(u, v) is the correlator mask from the previous frame at position (u, v), then:

C(x, y) = Σ_{i=-w/2..w/2} Σ_{j=-h/2..h/2} | f(x + i, y + j) - h(u + i, v + j) |

for x = u - dx_max, ..., u + dx_max and y = v - dy_max, ..., v + dy_max.

To improve performance, a threshold value is applied to the match, allowing an early exit from the correlation search as soon as a correlation value below the threshold is found. The threshold value for each correlator may advantageously be based on the match value C for that correlator in the previous frame. This dynamic update improves performance from one frame to the next when the image changes, for example, due to light changes or automatic gain.
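A direct (unoptimised) rendering of this search follows. It is a sketch under simplifying assumptions: the correlator has even dimensions, the search area lies wholly inside the frame, and seeding from the previous motion vector (described next) is omitted. The function name is illustrative.

    import numpy as np

    def match_correlator(frame, corr, u, v, dx_max, dy_max, threshold):
        """Find a motion vector for the correlator patch `corr` (taken from
        the previous frame around grid point (u, v)) by minimising the sum
        of absolute grey-scale differences C over offsets within the search
        area.  Returns ((dx, dy), C), exiting early when C drops below
        `threshold`, as described above; if no offset beats the threshold,
        the best offset examined is returned."""
        h, w = corr.shape
        corr = corr.astype(np.int32)
        best_vec, best_c = (0, 0), np.inf
        for dy in range(-dy_max, dy_max + 1):
            for dx in range(-dx_max, dx_max + 1):
                y, x = v + dy, u + dx          # candidate patch centre
                patch = frame[y - h // 2:y + h // 2,
                              x - w // 2:x + w // 2].astype(np.int32)
                c = np.abs(patch - corr).sum()
                if c < threshold:
                    return (dx, dy), c         # early exit on a good match
                if c < best_c:
                    best_vec, best_c = (dx, dy), c
        return best_vec, best_c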
As motion tends to continue in the same direction from one frame to the next, the initial correlator search position can be taken from the previous motion vector. If there was no motion then the initial search position will be the centre of the search area (the previous motion vector is of zero magnitude). When the threshold value technique is also applied, there is then a reasonable possibility that an immediate match will be found, thus further reducing the time required for the correlation and motion vector calculations.
A further performance improvement that may be applied to reduce the correlation search time is to check, early in the search, the same offset as the motion vectors calculated in adjacent correlators.
The motion vector (dx, dy) for correlator position (u, v) is then calculated from the point where, during the search procedure for that correlator, the correlation value C(x, y) first falls below the threshold value.
To obtain the most accurate possible motion vector, the search should be continued to find the point that has the minimum correlation value C_min(x, y), but the search would then require a correlation value to be calculated for all correlator positions within the search area. The image processing method of the embodiment avoids this requirement and thus advantageously enables an effective search procedure to be performed at full, real-time frame rate with a reduced demand for data processing power and therefore at reduced cost.
Identify Moving Regions

The step of calculating motion vectors (step 54) generates a 2-dimensional (2-D) grid of motion vectors, one for each correlator. By grouping together adjacent motion vectors that have similar properties (i.e. approximately the same direction and magnitude, as shown in figure 2), moving regions within the surveillance area can be identified.
For matching directions between two motion vectors, the range between the extremes should be no more than 30-45 degrees. It is difficult to give a single value for the magnitude comparison, which depends on the surveillance area and the speeds to be distinguished. For example, suppose it is desired to identify separate regions for objects travelling faster or slower than 10 metres per second across a surveillance area of 40 metres. Assuming a frame rate of 25 frames/second, an object moving at 10 m/s travels 40 cm each frame. If each pixel represents 10 cm, the magnitude of the motion vectors at 10 m/s is 4 pixels. Any larger magnitude indicates an object travelling faster than 10 m/s and anything smaller, slower.
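As a sketch, the grouping can be done with a flood fill over the correlator grid, joining 4-connected neighbours whose vectors agree in direction and magnitude. The thresholds below are illustrative, not values from the patent:

    import numpy as np

    def group_vectors(vectors, max_angle_deg=40.0, mag_ratio=0.5):
        """Group adjacent grid motion vectors with similar direction and
        magnitude into moving regions.  `vectors` is an (rows, cols, 2)
        array of (dx, dy); zero vectors are treated as background.
        Returns an integer label per grid cell (0 = no region)."""
        rows, cols, _ = vectors.shape
        labels = np.zeros((rows, cols), dtype=int)

        def similar(a, b):
            na, nb = np.linalg.norm(a), np.linalg.norm(b)
            if na == 0 or nb == 0:
                return False
            angle_ok = np.dot(a, b) / (na * nb) >= np.cos(np.radians(max_angle_deg))
            mag_ok = min(na, nb) / max(na, nb) >= mag_ratio
            return angle_ok and mag_ok

        next_label = 1
        for r in range(rows):
            for c in range(cols):
                if labels[r, c] or not vectors[r, c].any():
                    continue
                stack, labels[r, c] = [(r, c)], next_label
                while stack:                      # flood fill one region
                    y, x = stack.pop()
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not labels[ny, nx]
                                and vectors[ny, nx].any()
                                and similar(vectors[y, x], vectors[ny, nx])):
                            labels[ny, nx] = next_label
                            stack.append((ny, nx))
                next_label += 1
        return labels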
Label Regions

This procedure is illustrated in figure 5 as well as in part of figure 4.
The steps of identifying moving regions and then labelling those regions are used to "track" regions from one frame to the next. It is assumed that each region in the previous frame represents one or more objects moving through the surveillance area and that those objects, typically, follow a fixed course through that area. As mentioned above, a single object may be represented by a number of regions. This is partially dealt with when labelling regions in this step.
An attempt is made (step 62) to match each region (already labelled) in the previous frame (steps 64, 66) to a region identified in the current frame on the basis of close proximity of the regions. At full frame rate, it is assumed that the current spatial extent of a region normally overlaps the spatial extent of the region in the previous frame. However, if the size of a region is only a single correlator width or height, motion of a region may be perceived as a set of discrete jumps to adjacent correlators, providing no overlap between one frame and the next. To compensate for this perceived motion, for the purposes of matching regions in successive frames, the spatial extent of a region in the previous frame is extended by the width of a correlator in the perceived direction of motion.
If one region in the new frame matches one in the previous frame, the region in the new frame is given the same label as the region in the previous frame (steps 68, 70).
It is possible that more than one region in the current frame will match a region in the previous frame (for example, objects moving close together, or large objects "breaking up"). When this happens, those matching regions are combined into a single region and labelled with the same label accordingly (step 72).
When a region in the previous frame is lost (i.e. cannot be located in the current frame) (step 73) and has demonstrated sufficient sustained motion over a number of previous frames to be considered more than just "noise" (step 74), a second location process is attempted to determine if the object has stopped moving. If it is determined that a moving region is a noise artifact, its loss is ignored (step 75) and the region is no longer tracked. (Although there are different forms of "noise", for example, digitiser errors and line interference, these will typically only appear as motion artifacts for a frame or two. Additionally, movement of a person's limbs can show brief motion in directions other than that of the main body. Experimentation by the inventor has demonstrated that a sustained motion of at least five frames appears sufficient to remove most forms of "noise".)
When an object has stopped moving, the motion vectors which previously identified that object no longer exist. However the background mask (generated as described above), which shows the locations of all objects which
have recently entered the field of view, will still contain a mask of the now stationary object. By examining the location of the missing region in the background mask, it is usually possible to find and identify a stationary region corresponding to the lost moving region (step 76). The region is then labelled with the same label as the region in the previous frame (step 78). If the region remains stationary for several frames it can be "tracked" in the background mask.
In the case of a region corresponding to an object entering the travel zone via the exit zone, if the lost region can still not be identified after inspection of the background mask and the last known position was in the travel zone, a preliminary alarm state will be activated (step 80). An expected arrival time into the entrance zone is calculated from the known motion and distance travelled (step 82). This information is used when determining the Alarm State in the next step.
Finally, any region in the current frame that has not been labelled is assigned a new label (step 84).
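The matching of figure 5 can be sketched as follows. The region representation (dicts carrying a bounding box on the correlator grid, a mean motion vector and a label) and the helper names are assumptions for illustration, not structures defined by the patent:

    def overlaps(a, b):
        """True if bounding boxes a and b = (x0, y0, x1, y1) intersect."""
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

    def extend(box, vec):
        """Extend a box by one correlator cell in its perceived direction
        of motion, bridging single-cell jumps between frames (see above)."""
        sx = (vec[0] > 0) - (vec[0] < 0)
        sy = (vec[1] > 0) - (vec[1] < 0)
        x0, y0, x1, y1 = box
        return (min(x0, x0 + sx), min(y0, y0 + sy),
                max(x1, x1 + sx), max(y1, y1 + sy))

    def label_regions(previous, current, next_label):
        """Carry labels from the previous frame's regions to the current
        frame's (steps 62-72); merge multiple matches under one label and
        return the previous regions that could not be matched (step 73)."""
        lost = []
        for prev in previous:
            search = extend(prev["box"], prev["vector"])
            matches = [c for c in current
                       if c["label"] is None and overlaps(search, c["box"])]
            if matches:
                for c in matches:              # several matches combine
                    c["label"] = prev["label"] # under the previous label
            else:
                lost.append(prev)  # candidate for the background-mask
                                   # fallback described above
        for c in current:          # step 84: fresh labels for the rest
            if c["label"] is None:
                c["label"] = next_label
                next_label += 1
        return lost, next_label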
Determine Alarm State

The Alarm State is set for either of two reasons, as shown in figure 6, which expands step 60 of figure 4, starting at the step of entering the "determine-alarm-state" routine (step 100).
The first reason for an alarm state occurs (step 106) when a region is clearly tracked from the exit zone (step 102), through the travel zone and into the entrance zone (step 104). An alarm state may also be set after a region has been tracked into the travel zone from the exit zone. As discussed above, a preliminary alarm state is set when an object cannot be located from one frame to the next. When the preliminary alarm state has been set (step 108), a full alarm state will be set if a region is subsequently tracked from the travel zone into the entrance zone (step 104) at about the expected arrival time (ETA) of the first region in the exit zone, or within a pre-defined delay period. When the delay period has elapsed, or after a delay based on the ETA, the preliminary alarm state is reset.
The pre-defined delay period is operator-configurable and determines the time for which an object may be lost before the preliminary alarm status is cancelled. While the preliminary alarm status is in effect, false alarms may be raised due to an aliasing effect (for example, a person entering the surveillance area through the exit, reversing and then leaving). As such, the benefit of this delay period would be lost should its magnitude be of the order of minutes or hours rather than seconds.
When an alarm state is set (step 106), an alarm procedure is initiated (step 110) and the system then returns to capture and process frames from the video camera (step 112). When the alarm procedure is initiated, any appropriate action may be triggered automatically, such as sounding an alarm, triggering a recording camera (to record an image of a person passing the wrong way through an airport departure gate, for example), alerting security personnel or automatically closing a door to block the exit of a person who has passed the wrong way through the surveillance area (step 114).
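Figure 6's decision logic reduces to a small state machine. The sketch below assumes the caller reports zone transitions for each tracked region; the class and method names are illustrative:

    import time

    class AlarmState:
        """Minimal sketch of the determine-alarm-state routine (figure 6)."""

        def __init__(self, delay=5.0):
            self.delay = delay            # operator-configured delay (seconds)
            self.preliminary_until = 0.0  # when the preliminary state lapses

        def region_lost_in_travel_zone(self, eta=None):
            """A region that entered via the exit zone was lost in the
            travel zone (step 80); `eta` is its expected arrival time at
            the entrance zone, if one was calculated (step 82)."""
            base = eta if eta is not None else time.time()
            self.preliminary_until = base + self.delay

        def region_entered_entrance_zone(self, tracked_from_exit_zone):
            """A region left the travel zone into the entrance zone
            (step 104).  Returns True when a full alarm is set (step 106)."""
            if tracked_from_exit_zone:
                return True               # clearly tracked illegal traversal
            # Otherwise alarm only while a preliminary alarm state is active.
            return time.time() <= self.preliminary_until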
The system of the embodiment thus does not attempt to track individual objects, but looks for regions of similar motion and then tracks those regions through the scene.
If a tracked region is lost, a fallback will determine if the region is still valid from the background mask. If the region still cannot be identified, a preliminary alarm state is set to identify the object if it enters the next zone.

Claims (18)

1. A method for detecting motion of an object across a surveillance area, which is transversely divided into a first zone and a second zone separated by a travel zone, from the first zone to the second zone, comprising the steps of: capturing in successive video frames a series of images of the surveillance area; correlating successive frames to identify moving regions within each frame and to evaluate motion vectors describing the motion of each moving region; tracking moving regions in successive frames on the basis of their position in each frame and their motion to identify corresponding moving regions in successive frames; if no moving region is found in a frame corresponding to a moving region in the previous frame, looking for a corresponding region in a background mask, the background mask being updated frame-by-frame; if no corresponding region is found in the background mask, setting a preliminary alarm state; and setting an alarm state if: either a moving region is tracked entering the travel zone from the first zone and subsequently leaving the travel zone by entering the second zone; or a moving region is tracked entering the travel zone from the first zone, is lost while it is within the travel zone, and another moving region subsequently leaves the travel zone by entering the second zone while a preliminary alarm state is set.
2. A method according to claim 1, in which when an alarm state is set, an alarm is raised and/or an image of the surveillance area, or at least of the second zone, is
recorded and/or a door is closed to block an exit from the second zone.
3. A method according to claim 1 or 2, in which successive frames are correlated by defining a plurality of correlators within one frame, calculating a correlation value for a plurality of possible positions of each correlator within a respective search area in the next frame, and if a correlator match is found in the search area assigning a motion vector to the correlator based on the offset between the correlator position and the correlator match position.
4. A method according to claim 3, in which the size of each search area is predetermined depending on the expected maximum speed of objects moving across the surveillance area and/or on the reliability of correlator matching for fast-moving objects.
5. A method according to claim 3 or 4, in which the calculation of correlation values for a particular correlator at different positions within the respective search area is ended when a correlation value below a predetermined correlation value threshold is found.
6. A method according to claim 5, in which the correlation value threshold varies depending on the magnitude of a correlation value calculated for the same correlator in the previous frame.
7. A method according to any of claims 3 to 6, in which the calculation of correlation values for a correlator is started at or near a correlator position within the respective search area corresponding to the motion vector for the same correlator in the previous frame.
8. A method according to any preceding claim, in which, when a moving region is lost during the tracking of that moving region, no attempt is made to look for a corresponding region in the background mask if the moving region has not demonstrated sustained motion over a plurality of frames.
9. A method according to any preceding claim, in which a preliminary alarm state is cancelled after a predetermined time.
10. A method according to any of claims 1 to 8, in which the preliminary alarm state is cancelled after a time evaluated depending on the position and motion of the moving region which was lost and thus triggered the preliminary alarm state, by calculating the expected time of arrival of that moving region in the second zone.
11. A method according to any preceding claim, which can be implemented by a suitably-programmed personal computer (PC) to process in real time frames captured by a conventional monochrome video camera.
12. A method of raising an alarm when an object traverses a surveillance area in a predetermined direction, the surveillance area being transversely divided into first and second zones separated by a travel zone, the predetermined direction being from the first zone to the second zone, and the method comprising the steps of: capturing in successive video frames a series of images of the surveillance area; correlating successive frames to identify moving regions within each frame and to evaluate motion vectors describing the motion of each moving region; tracking moving regions in successive frames on the basis of their position in each frame and their motion to identify corresponding moving regions in successive frames; if no moving region is found in a frame corresponding to a moving region in the previous frame, looking for a corresponding region in a background mask, the background mask being updated frame-by-frame; if no corresponding region is found in the background mask, setting a preliminary alarm state; and setting an alarm state if: either a moving region is tracked entering the travel zone from the first zone and subsequently leaving the travel zone by entering the second zone; or a moving region is tracked entering the travel zone from the first zone, is lost while it is within the travel zone, and another moving region subsequently leaves the travel zone by entering the second zone while a preliminary alarm state is set.
13. An apparatus for detecting motion of an object across a surveillance area, which is transversely divided into a first zone and a second zone separated by a travel zone, from the first zone to the second zone, comprising: a video camera for capturing in successive video frames a series of images of the surveillance area; a correlator coupled to an output of the camera for correlating successive frames to identify moving regions within each frame and to evaluate motion vectors describing the motion of each moving region; a tracking means coupled to an output of the correlator for tracking moving regions in successive frames on the basis of their position in each frame and their motion to identify corresponding moving regions in successive frames; a background mask updating means coupled to an output of the camera for generating an updated background mask corresponding to each successive frame; a background mask inspection means for, if no moving region is found in a frame corresponding to a moving region in the previous frame, looking for a corresponding region in the background mask; a preliminary alarm state setting means for setting a preliminary alarm state if no corresponding region is found in the background mask by the background mask inspection means; and an alarm state setting means for setting an alarm if: either a moving region is tracked by the tracking means entering the travel zone from the first zone and subsequently leaving the travel zone by entering the second zone; or a moving region is tracked by the tracking means entering the travel zone from the first zone, is lost while it is within the travel zone, and another moving region subsequently leaves the travel zone by entering the second zone while a preliminary alarm state is set.
14. An apparatus according to claim 13, comprising an alarm means for, when an alarm state is set, raising an alarm, and/or recording an image of the surveillance area, or at least of the second zone, and/or closing a door to block an exit from the second zone.
15. An apparatus according to claim 13 or 14, in which the video camera is a conventional monochrome video camera and the correlator, tracking means, background mask updating means and inspection means, preliminary alarm state setting means and alarm state setting means are implemented on a suitably-programmed personal computer (PC) operating in real time.
16. An apparatus for raising an alarm when an object traverses a surveillance area in a predetermined direction, the surveillance area being transversely divided into first and second zones separated by a travel zone, the predetermined direction being from the first zone to the second zone, and the apparatus comprising: a video camera for capturing in successive video frames a series of images of the surveillance area; a correlator coupled to an output of the camera for correlating successive frames to identify moving regions within each frame and to evaluate motion vectors describing the motion of each moving region; a tracking means coupled to an output of the correlator for tracking moving regions in successive frames on the basis of their position in each frame and their motion to identify corresponding moving regions in successive frames; a background mask updating means coupled to an output of the camera for generating an updated background mask corresponding to each successive frame; a background mask inspection means for, if no moving region is found in a frame corresponding to a moving region in the previous frame, looking for a corresponding region in the background mask; a preliminary alarm state setting means for setting a preliminary alarm state if no corresponding region is found in the background mask by the background mask inspection means; and an alarm state setting means for setting an alarm if: either a moving region is tracked by the tracking means entering the travel zone from the first zone and subsequently leaving the travel zone by entering the second zone; or a moving region is tracked by the tracking means entering the travel zone from the first zone, is lost while it is within the travel zone, and another moving region subsequently leaves the travel zone by entering the second zone while a preliminary alarm state is set.
17. A method for detecting motion of an object and raising an alarm substantially as described herein, with reference to the drawings.
18. An apparatus for detecting motion of an object and raising an alarm substantially as described herein, with reference to the drawings.
GB9809807A 1998-05-08 1998-05-08 Method and apparatus for detecting motion across a surveillance area Expired - Fee Related GB2337146B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB9809807A GB2337146B (en) 1998-05-08 1998-05-08 Method and apparatus for detecting motion across a surveillance area
PCT/GB1999/001457 WO1999059116A1 (en) 1998-05-08 1999-05-10 Method and apparatus for detecting motion across a surveillance area

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9809807A GB2337146B (en) 1998-05-08 1998-05-08 Method and apparatus for detecting motion across a surveillance area

Publications (3)

Publication Number Publication Date
GB9809807D0 GB9809807D0 (en) 1998-07-08
GB2337146A true GB2337146A (en) 1999-11-10
GB2337146B GB2337146B (en) 2000-07-19

Family

ID=10831643

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9809807A Expired - Fee Related GB2337146B (en) 1998-05-08 1998-05-08 Method and apparatus for detecting motion across a surveillance area

Country Status (2)

Country Link
GB (1) GB2337146B (en)
WO (1) WO1999059116A1 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001050339A2 (en) 1999-12-30 2001-07-12 Koninklijke Philips Electronics N.V. Method and apparatus for detecting fast motion scenes
EP1189187A2 (en) * 2000-08-31 2002-03-20 Industrie Technik IPS GmbH Method and system for monitoring a designated area
WO2003044752A1 (en) * 2001-11-21 2003-05-30 Iomniscient Pty Ltd Non-motion detection
GB2392033A (en) * 2002-08-15 2004-02-18 Roke Manor Research Video motion anomaly detector
WO2004019272A2 (en) * 2002-08-23 2004-03-04 Scyron Limited Interactive apparatus and method for monitoring performance of a physical task
GB2401977A (en) * 2003-05-22 2004-11-24 Hewlett Packard Development Co Surveillance of an area
EP1542154A2 (en) * 2003-12-11 2005-06-15 Sony United Kingdom Limited Object detection
WO2005058668A2 (en) * 2003-12-19 2005-06-30 Core E & S. Co., Ltd Image processing alarm system and method for automatically sensing unexpected accident related to train
WO2006013347A1 (en) * 2004-08-06 2006-02-09 Quinetiq Limited Method of detecting a target
WO2006109256A2 (en) * 2005-04-12 2006-10-19 Koninklijke Philips Electronics, N.V. Pattern based occupancy sensing system and method
WO2010004514A1 (en) * 2008-07-08 2010-01-14 Nortech International (Pty) Limited Apparatus and method of classifying movement of objects in a monitoring zone
WO2010081577A1 (en) * 2009-01-13 2010-07-22 Robert Bosch Gmbh Device, method, and computer for image-based counting of objects passing through a counting section in a prescribed direction
US8457401B2 (en) 2001-03-23 2013-06-04 Objectvideo, Inc. Video segmentation using statistical pixel modeling
US8564661B2 (en) 2000-10-24 2013-10-22 Objectvideo, Inc. Video analytic rule detection system and method
US8711217B2 (en) 2000-10-24 2014-04-29 Objectvideo, Inc. Video surveillance system employing video primitives
US9020261B2 (en) 2001-03-23 2015-04-28 Avigilon Fortress Corporation Video segmentation using statistical pixel modeling
US9892606B2 (en) 2001-11-15 2018-02-13 Avigilon Fortress Corporation Video surveillance system employing video primitives

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007072370A2 (en) * 2005-12-20 2007-06-28 Koninklijke Philips Electronics, N.V. Method and apparatus for estimating object speed
US8212669B2 (en) 2007-06-08 2012-07-03 Bas Strategic Solutions, Inc. Remote area monitoring system
DE102007062996A1 (en) 2007-12-21 2009-06-25 Robert Bosch Gmbh Machine tool device
US10816658B2 (en) 2016-09-07 2020-10-27 OmniPreSense Corporation Radar enabled weapon detection system
US10789472B1 (en) * 2017-06-14 2020-09-29 Amazon Technologies, Inc. Multiple image processing and sensor targeting for object detection
CN109859149B (en) * 2019-01-25 2023-08-08 成都泰盟软件有限公司 Small animal motion tracking method for setting target searching area
WO2021015672A1 (en) * 2019-07-24 2021-01-28 Hitachi, Ltd. Surveillance system, object tracking system and method of operating the same
CN113379985B (en) * 2020-02-25 2022-09-27 北京君正集成电路股份有限公司 Nursing electronic fence alarm device
CN113379984B (en) * 2020-02-25 2022-09-23 北京君正集成电路股份有限公司 Electronic nursing fence system
CN113379987B (en) * 2020-02-25 2022-10-21 北京君正集成电路股份有限公司 Design method of nursing electronic fence module
CN112435440B (en) * 2020-10-30 2022-08-09 成都蓉众和智能科技有限公司 Non-contact type indoor personnel falling identification method based on Internet of things platform

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2239369A (en) * 1989-11-09 1991-06-26 Marconi Gec Ltd Image tracking
GB2277845A (en) * 1993-05-03 1994-11-09 Philips Electronics Nv Monitoring system
EP0700017A2 (en) * 1994-08-31 1996-03-06 Nippon Telegraph And Telephone Corporation Method and apparatus for directional counting of moving objects
US5666157A (en) * 1995-01-03 1997-09-09 Arc Incorporated Abnormality detection and surveillance system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA1116286A (en) * 1979-02-20 1982-01-12 Control Data Canada, Ltd. Perimeter surveillance system
DE59509343D1 (en) * 1994-08-24 2001-07-19 Seisma Ag Zug IMAGE EVALUATION SYSTEM AND METHOD

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2239369A (en) * 1989-11-09 1991-06-26 Marconi Gec Ltd Image tracking
GB2277845A (en) * 1993-05-03 1994-11-09 Philips Electronics Nv Monitoring system
EP0700017A2 (en) * 1994-08-31 1996-03-06 Nippon Telegraph And Telephone Corporation Method and apparatus for directional counting of moving objects
US5666157A (en) * 1995-01-03 1997-09-09 Arc Incorporated Abnormality detection and surveillance system

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1252586A2 (en) * 1999-12-30 2002-10-30 Koninklijke Philips Electronics N.V. Method and apparatus for detecting fast motion scenes
WO2001050339A2 (en) 1999-12-30 2001-07-12 Koninklijke Philips Electronics N.V. Method and apparatus for detecting fast motion scenes
EP1189187A2 (en) * 2000-08-31 2002-03-20 Industrie Technik IPS GmbH Method and system for monitoring a designated area
EP1189187A3 (en) * 2000-08-31 2009-05-27 Industrie Technik IPS GmbH Method and system for monitoring a designated area
US10026285B2 (en) 2000-10-24 2018-07-17 Avigilon Fortress Corporation Video surveillance system employing video primitives
US8564661B2 (en) 2000-10-24 2013-10-22 Objectvideo, Inc. Video analytic rule detection system and method
US8711217B2 (en) 2000-10-24 2014-04-29 Objectvideo, Inc. Video surveillance system employing video primitives
US10645350B2 (en) 2000-10-24 2020-05-05 Avigilon Fortress Corporation Video analytic rule detection system and method
US10347101B2 (en) 2000-10-24 2019-07-09 Avigilon Fortress Corporation Video surveillance system employing video primitives
US8457401B2 (en) 2001-03-23 2013-06-04 Objectvideo, Inc. Video segmentation using statistical pixel modeling
US9020261B2 (en) 2001-03-23 2015-04-28 Avigilon Fortress Corporation Video segmentation using statistical pixel modeling
US9892606B2 (en) 2001-11-15 2018-02-13 Avigilon Fortress Corporation Video surveillance system employing video primitives
US7688997B2 (en) 2001-11-21 2010-03-30 Iomniscient Pty Ltd Non-motion detection
WO2003044752A1 (en) * 2001-11-21 2003-05-30 Iomniscient Pty Ltd Non-motion detection
GB2392033B (en) * 2002-08-15 2004-10-20 Roke Manor Research Video motion anomaly detector
GB2392033A (en) * 2002-08-15 2004-02-18 Roke Manor Research Video motion anomaly detector
WO2004019272A3 (en) * 2002-08-23 2004-04-29 Scyron Ltd Interactive apparatus and method for monitoring performance of a physical task
WO2004019272A2 (en) * 2002-08-23 2004-03-04 Scyron Limited Interactive apparatus and method for monitoring performance of a physical task
GB2401977A (en) * 2003-05-22 2004-11-24 Hewlett Packard Development Co Surveillance of an area
GB2401977B (en) * 2003-05-22 2006-11-15 Hewlett Packard Development Co Systems,apparatus,and methods for surveillance of an area
EP1542154A3 (en) * 2003-12-11 2006-05-17 Sony United Kingdom Limited Object detection
EP1542154A2 (en) * 2003-12-11 2005-06-15 Sony United Kingdom Limited Object detection
WO2005058668A2 (en) * 2003-12-19 2005-06-30 Core E & S. Co., Ltd Image processing alarm system and method for automatically sensing unexpected accident related to train
WO2005058668A3 (en) * 2003-12-19 2006-03-16 Core E & S Co Ltd Image processing alarm system and method for automatically sensing unexpected accident related to train
US7646329B2 (en) 2004-08-06 2010-01-12 Qinetiq Limited Method for detecting a target
WO2006013347A1 (en) * 2004-08-06 2006-02-09 Quinetiq Limited Method of detecting a target
WO2006109256A2 (en) * 2005-04-12 2006-10-19 Koninklijke Philips Electronics, N.V. Pattern based occupancy sensing system and method
WO2006109256A3 (en) * 2005-04-12 2007-01-04 Koninkl Philips Electronics Nv Pattern based occupancy sensing system and method
WO2010004514A1 (en) * 2008-07-08 2010-01-14 Nortech International (Pty) Limited Apparatus and method of classifying movement of objects in a monitoring zone
US8957966B2 (en) 2008-07-08 2015-02-17 Nortech International (Proprietary) Limited Apparatus and method of classifying movement of objects in a monitoring zone
RU2509355C2 (en) * 2008-07-08 2014-03-10 Нортек Интернэшнл (Пти) Лимитед Apparatus and method of classifying movement of objects in monitoring zone
CN102089770B (en) * 2008-07-08 2013-05-01 诺泰克国际股份有限公司 Apparatus and method of classifying movement of objects in a monitoring zone
CN102089770A (en) * 2008-07-08 2011-06-08 诺泰克国际股份有限公司 Apparatus and method of classifying movement of objects in a monitoring zone
US9418300B2 (en) 2009-01-13 2016-08-16 Robert Bosch Gmbh Device, method, and computer for image-based counting of objects passing through a counting section in a specified direction
WO2010081577A1 (en) * 2009-01-13 2010-07-22 Robert Bosch Gmbh Device, method, and computer for image-based counting of objects passing through a counting section in a prescribed direction

Also Published As

Publication number Publication date
GB9809807D0 (en) 1998-07-08
WO1999059116A1 (en) 1999-11-18
GB2337146B (en) 2000-07-19

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20050508