WO2012141574A1 - Intrusion detection system for determining object position - Google Patents

Intrusion detection system for determining object position

Info

Publication number
WO2012141574A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
determining
monitoring area
detection system
intrusion detection
Prior art date
Application number
PCT/MY2012/000081
Other languages
French (fr)
Inventor
Kim Meng Liang
Sze Ling Tang
Zulaikha Kadim
Original Assignee
Mimos Berhad
Priority date
Filing date
Publication date
Application filed by Mimos Berhad filed Critical Mimos Berhad
Publication of WO2012141574A1 publication Critical patent/WO2012141574A1/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance



Abstract

An intrusion detection system (10) with a detection means (11) for determining object position in a monitoring area (17), comprises an initialization module (11a) having at least one capturing device (16) for capturing images, a process module (11b) which comprises a comparing means (12) for detecting an interest object, a tracking means (13) for tracking said detected interest object, a position determining means (14) for determining the exact position of said object by performing area analysis based on the overlapping area of said object with reference to the path direction and slope degree, and an analyzing means (15) for analyzing the determined position for event triggering, and a display module (11c) for displaying output images at the display devices.

Description

Intrusion Detection System for Determining Object Position
Field of Invention
The present invention relates generally to an intrusion detection system and the method thereof, and more particularly to a system which is able to determine the exact location of objects at a monitoring area accurately and efficiently.
Background of the Invention
An intrusion detection system is a system that detects the intrusion of any human body into a monitoring area by using various sensors. Intrusion sensors are widely used in security systems to monitor the boundaries of a monitored area in order to detect the presence, location and motion of people, objects and vehicles. Intrusion detection is an important feature of video surveillance and video analysis. The monitoring area may consist of at least a region of interest or a line of interest, which may be generated from a linear or non-linear line. The monitoring area could be banks, offices, buildings, homes or any other areas that are not open areas.
In certain fields, it may be desirable to know the location of objects and people as they traverse a monitored area. In a practical implementation of an automated video surveillance and video analysis system, an object that moves across the monitoring area must be detected accurately and in the shortest time. In order to fulfil the demanded accuracy and speed, the best configuration is to position the capturing device, such as a camera, for a top-down view. With the image captured from the top-down angle, the exact location of the objects at the monitoring area is accurately located. This allows the intrusion event to be detected with high accuracy and at high speed.
However, in practical implementations the camera location, camera angle and angle of the predefined line with reference to the camera location are unknown. With such a configuration, the location of the object relative to the monitoring area is ambiguous: whether it is located before or after the monitoring area is unknown. This is due to the high overlap of the object's motion area with the monitoring area and the large loss of spatial information when 3D information is transformed into 2D information. The subjectivity of the location of the object at the predefined line is illustrated in Figure 1. Figure 1a illustrates the information as perceived from the 2D image captured using a capturing device. Here, the image is captured from a camera at an arbitrary location and angle, and the object moves in various directions towards the monitoring area at an arbitrary angle. Figure 1b highlights the difficulty in determining the exact location of the object at the monitoring area, where the motion area of the object overlaps heavily with the monitoring area. In addition, noise that may appear in the motion blob of the object influences the determination of the exact location of the object at the monitoring area.
An object of the present invention is to provide an intrusion detection system uniquely designed and adapted to determine the exact location of the object at the monitoring area.
Other objects of this invention will become apparent upon reading this entire disclosure.
Summary of the Invention
In the present invention, a system and method to determine the exact location of the object at the monitoring area, regardless of camera location, camera angle and the angle of the monitoring area in relation to the camera, is described. The present invention analyses the motion blobs of the objects at the monitoring area and utilizes the path direction of the objects in order to determine the exact location of the objects. The determination of the objects' positions at the monitoring area allows the system to detect objects that cross the monitoring area accurately and efficiently.
Brief Description of the Drawings
Other objects, features, and advantages of the invention will be apparent from the following description when read with reference to the accompanying drawings. In the drawings, wherein like reference numerals denote corresponding parts throughout the several views:
Figure 1a shows a series of output images illustrating the information as perceived from the 2D image captured using a capturing device with a previous system and method;
Figure 1b shows a series of output images highlighting the difficulty in determining the exact location of the object at the monitoring area with a previous system and method;
Figure 2 is a flowchart illustrating an intrusion detection system in accordance with the present invention;
Figure 3 illustrates a flowchart showing the method of determining the object position at the monitoring area;
Figure 4 is a flowchart showing the method of performing area analysis of the object;
Figure 5 depicts a flowchart showing the method of determining the region label of the objects using blob analysis;
Figure 6 shows an illustration of the estimation of the slope degree of a non-linear line; and
Figure 7 is an example of the method to determine the area threshold from the slope degree and path direction of the objects.
Detailed Description of the Preferred Embodiments
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known methods, procedures and/or components have not been described in detail so as not to obscure the invention. Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. Figure 2 illustrates an intrusion detection system (10) of the present invention. The intrusion detection system (10) for determining object position at a monitoring area includes a detecting means (11) which involves an initialization module (11a), a process module (11b) and a display module (11c), wherein the process module (11b) includes a comparing means (12) for detecting an interest object, a tracking means (13) for tracking the detected interest object, a position determining means (14) for determining the exact position of the object and an analyzing means (15) for analyzing the determined position for event triggering. By determining the exact location of the objects at the monitoring area, an object crossing the monitoring area is detected accurately and efficiently.
The position determining means (14) includes an extracting means (51) for extracting motion blobs of the interest objects, and a region label forecasting means (52) for determining the object position using blob analysis by performing area analysis based on path direction and slope degree.
The initialization module (11a) of the detecting means (11) has at least one capturing device (16) for capturing an input image. The capturing device (16), ranging from a visible-light camera to a non-visible-light camera, is used in this embodiment of the invention for intrusion detection at a monitoring area (17). The initialization module (11a) only needs to be carried out at the start of system (10) operation. The monitoring area (17) is initialized by the user in order for an event to be detected if an object crosses the monitoring area (17). The monitoring area (17) includes at least a single or multiple linear and non-linear line.
The process module (11b) begins with the step of detecting the object (12a) by the comparing means (12) using a background estimation technique, where the intensity of the current image is compared with the intensity of the previous image. For each interest object detected in the background estimation process, the object is tracked (13a) by the tracking means (13) by giving a consistent label to the moving object in the scene. The position of the tracked object is then determined (14a) by the position determining means (14). Lastly, with the determined positions, the analysis is carried out (15a) by the analyzing means (15) for event triggering.
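The detection step above compares the intensity of the current image against that of the previous image. As an illustrative sketch only — the grayscale list representation and the threshold value of 25 are assumptions made here, not details from this document — simple frame differencing looks like this:

```python
def detect_motion_mask(prev_frame, curr_frame, diff_threshold=25):
    """Mark pixels whose intensity changed by more than diff_threshold.

    Frames are 2D lists of grayscale intensities (0-255); the result
    is a binary mask of the same shape marking candidate motion pixels.
    """
    mask = []
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        mask.append([1 if abs(c - p) > diff_threshold else 0
                     for p, c in zip(prev_row, curr_row)])
    return mask

prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 200, 10], [10, 10, 90]]
print(detect_motion_mask(prev, curr))  # [[0, 1, 0], [0, 0, 1]]
```

In a real pipeline the mask would then be denoised and grouped into connected components to form the motion blobs that the tracking means labels consistently across frames.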
In the display module (11c) of the detecting means (11) of the present invention, an alert text and a bounding box indicating the object crossing the monitoring area (17) are overlaid (19) on an output image (18), which is displayed at the display devices. The overlay is only carried out if an object crossing the predefined line is detected.
Figure 3 depicts a flowchart showing the method of determining the object position (14a) at the monitoring area (17) in the present invention. The method of determining the object position (14a) begins with a step of extracting motion blobs (21) of the interest objects. In this step, the area of the motion blobs in the scene is determined (22). If the motion blobs overlap with the line or region of interest, ROI (23), of the monitoring area (17), the step of performing area analysis (24) of the object to determine the object's position is executed. However, if the motion blobs do not overlap (25) with the monitoring area (17), the location with reference to the motion blobs is automatically extracted (26). Then, the extracted object locations for both steps (24, 26) are stored (27) into their respective object repository for further analysis. The detailed steps of performing area analysis (24) of the object to determine the object's position, where the motion blobs overlap with the monitoring area (17), are shown in Figure 4, which involves the region label forecasting means (52). The method begins with the step of determining the path direction (31) with reference to the capturing device, where the path direction includes the direction of the object moving either towards or away from the capturing device location. This path direction is determined based on the history of location information of the particular object that is stored from the previous frames. Then the method follows with the step of determining the area of the motion blobs (32) that overlap with the monitoring area (17). This determined area of the motion blob is then used for further analysis in the following step. Then, the overlapped motion blobs are divided (33) into two blobs, where the divider is the predefined line.
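The overlap test between a motion blob and the monitoring area (steps 22-23 above) can be sketched as a region intersection; representing both the blob and the region of interest as axis-aligned bounding boxes is a simplification assumed here for illustration, not a detail specified in this document:

```python
def boxes_overlap(box_a, box_b):
    """Axis-aligned bounding-box intersection test.

    Each box is (x_min, y_min, x_max, y_max); returns True when the
    motion blob's box and the ROI's box share any area."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

blob_box = (2, 2, 6, 6)
roi_box = (5, 5, 10, 10)
print(boxes_overlap(blob_box, roi_box))  # True
```

When the test is False, the blob's location is taken directly (step 26); when True, the area analysis of Figure 4 runs instead.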
After that, it follows with the step of extracting the area of the divided blob, MBfar (34), that is furthest away according to the earlier detected path direction. With the extracted furthest divided blob, MBfar, the final step is determining the region label of the objects (35) using blob analysis, and the detailed steps are shown in Figure 5.
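Steps (33) and (34) above — dividing the overlapped blob with the predefined line and keeping the sub-blob furthest along the path direction — might be sketched as follows; the 2D cross-product side test and the mean-y depth proxy are illustrative assumptions, not details given in this document:

```python
def split_blob_by_line(blob_pixels, line_p1, line_p2):
    """Divide a motion blob into two sub-blobs (step 33), using the
    predefined line through line_p1 and line_p2 as the divider: the
    sign of the 2D cross product says which side a pixel falls on."""
    (x1, y1), (x2, y2) = line_p1, line_p2
    side_a, side_b = [], []
    for (x, y) in blob_pixels:
        cross = (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1)
        (side_a if cross >= 0 else side_b).append((x, y))
    return side_a, side_b

def extract_mb_far(side_a, side_b, moving_towards_camera):
    """Pick the divided blob furthest away along the detected path
    direction (step 34). 'Further' is approximated by mean y, with
    the camera assumed to look along increasing y -- an assumption
    made here for the sketch."""
    mean_y = lambda pts: sum(y for _, y in pts) / len(pts)
    if moving_towards_camera:
        return min(side_a, side_b, key=mean_y)  # far blob sits deeper
    return max(side_a, side_b, key=mean_y)

# A horizontal divider y = 2 splits the blob into upper and lower parts.
blob = [(0, 0), (1, 1), (1, 3), (2, 4)]
upper, lower = split_blob_by_line(blob, (0, 2), (5, 2))
print(extract_mb_far(upper, lower, moving_towards_camera=True))
```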
The step of determining the region label of the objects (35) using blob analysis begins with the step of determining the slope degree (41) of the monitoring area, where the slope degree is estimated based on the angle formed between the horizontal line and the line created from the two furthest points of the predefined lines that overlap with the motion blobs. A similar technique is used to estimate the slope degree of a non-linear line, and the illustration of the estimation of the slope degree of a non-linear line is shown in Figure 6.
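The slope degree of step (41) — the angle between the horizontal and the line joining the two furthest overlapped points of the predefined lines — can be estimated as in this sketch; folding the angle into the 0-90 degree range via absolute values is an assumed convention:

```python
import math

def slope_degree(p_far1, p_far2):
    """Estimate the slope degree (step 41): the angle between the
    horizontal and the line joining the two furthest points of the
    predefined lines that overlap the motion blob."""
    dx = abs(p_far2[0] - p_far1[0])
    dy = abs(p_far2[1] - p_far1[1])
    return math.degrees(math.atan2(dy, dx))

print(slope_degree((0, 0), (10, 10)))  # approximately 45 degrees
```

For a non-linear line, the same computation would be applied to the two furthest overlapped endpoints, as Figure 6 illustrates.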
If the slope degree is less than the predefined threshold, T1 (42), the area thresholds for the different path directions with reference to the slope degree are then extracted (43). Otherwise (44), the area threshold is extracted automatically with a preset value (45). The extracted area of the motion blob, MBfar, is then compared with the extracted area threshold for both conditions (46). The area threshold is a threshold to filter the area of the motion blob, MBfar, that overlaps the monitoring area. If the extracted area of the motion blob, MBfar, is larger than the area threshold (47), the location with reference to the motion blob MBfar is extracted (48). Otherwise (49), the location with reference to the other divided motion blob of MBfar is extracted (50).
Figure 7 illustrates an example of the method to determine the area threshold adaptively from the slope degree and path direction of the objects (motion blobs). As shown in the figure, the predefined threshold T1 is defined as 60 degrees. For a slope degree above the T1 value, a predefined threshold value is extracted regardless of the path direction, and the area threshold value is extracted as 70%. For a slope degree equal to or below the T1 value, the area threshold value is determined adaptively based on the path direction. For determination of the adaptive area threshold value, two categories of slope degree are introduced, wherein a first category has a range of slope degree from 0 degrees to 30 degrees and a second category has a range of slope degree from 31 degrees to 60 degrees.
For each category, a different area threshold is extracted based on the path direction, either moving away from or towards the capturing device. In the preferred embodiment, for the first category, the area threshold is defined as 5% for a path direction moving towards the capturing device and 95% for a path direction moving away from the capturing device. For the second category, the area threshold is defined as 10% for a path direction moving towards the capturing device and 85% for a path direction moving away from the capturing device.
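The adaptive selection just described can be collected into a small lookup; the percentages and the 60-degree value of T1 come from this document's Figure 7 example, while expressing them as fractions of the blob area is an interpretation made for the sketch:

```python
T1 = 60.0  # predefined slope-degree threshold from the Figure 7 example

def area_threshold(slope_deg, moving_towards_camera):
    """Area threshold (as a fraction) per the Figure 7 example:
    a fixed 70% above T1; at or below T1, the value adapts to the
    slope category and the path direction."""
    if slope_deg > T1:
        return 0.70                      # fixed, direction-independent
    if slope_deg <= 30:                  # first category: 0-30 degrees
        return 0.05 if moving_towards_camera else 0.95
    return 0.10 if moving_towards_camera else 0.85  # second: 31-60 degrees

print(area_threshold(45, False))  # 0.85
```

MBfar's extracted area would then be compared against this fraction to decide which divided blob supplies the object's location.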
As will be readily apparent to those skilled in the art, the present invention may easily be produced in other specific forms without departing from its essential characteristics. The present embodiments are, therefore, to be considered as merely illustrative and not restrictive, the scope of the invention being indicated by the claims rather than the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims

Claims
1. An intrusion detection system (10) with a detection means (11) for determining object position in a monitoring area (17), comprising:
an initialization module (11a) having at least one capturing device (16) for capturing images for an event to be detected if an object crosses said monitoring area (17);
a process module (11b) comprising:
a comparing means (12) for detecting an interest object;
a tracking means (13) for tracking said detected interest object;
a position determining means (14) for determining the exact position of said object by performing area analysis based on the overlapping area of said object with reference to the path direction and slope degree; and
an analyzing means (15) for analyzing the determined position for event triggering; and
a display module (11c) for displaying output images at the display devices.
2. The intrusion detection system (10) with a detection means (11) for determining object position in a monitoring area (17) as claimed in claim 1, wherein said position determining means (14) includes an extracting means (51) for extracting motion blobs of the interest objects to determine if the motion blobs overlap (22) with the monitoring area (17) to perform the area analysis (24).
3. The intrusion detection system (10) with a detection means (11) for determining object position in a monitoring area (17) as claimed in claim 2, wherein said position determining means (14) further comprising a region label forecasting means (52) to determine the path direction with reference to the capturing device (16) and area of motion blobs which is then divided with a predefined line to define the area that is furthest from the determined path direction for determining region label of the objects (35) .
4. The intrusion detection system (10) with a detection means (11) for determining object position in a monitoring area (17) as claimed in claim 3, wherein said region label forecasting means (52) includes means for determining region label of the objects using blob analysis to determine the slope degree of said monitoring area (17) with reference to predefined threshold, T1, to extract the area threshold for comparing with the extracted area motion blob to obtain the location of the objects.
5. The intrusion detection system (10) with a detection means (11) for determining object position in a monitoring area (17) as claimed in claim 3, wherein said monitoring area (17) includes at least a single or multiple linear and non-linear line.
6. The intrusion detection system (10) with a detection means (11) for determining object position in a monitoring area (17) as claimed in claim 3, wherein said monitoring area (17) includes at least a region of interest that is built from a single or multiple linear and non-linear line.
7. A method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b), said method comprising the steps of:
detecting an interest object (12a) with a comparing means (12);
tracking said detected interest object (13a) with a tracking means (13);
determining the exact position of said interest object (14a) by performing area analysis based on the overlapping area of said object with reference to the path direction and slope degree of said monitoring area (17); and
analyzing the determined position (15a) for an event triggering with an analyzing means (15).
8. The method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b) as claimed in claim 7, wherein said step of determining object position includes the step of extracting motion blobs (21) of said interest object to determine if said motion blobs overlap (22) with said monitoring area (17) to perform the area analysis (24).
9. The method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b) as claimed in claim 8, wherein said step of performing area analysis includes the steps of:
determining the path direction (31) with reference to said capturing device (16);
determining the area of motion blobs (32) that are overlapped with said monitoring area (17);
dividing said overlapped motion blobs (33) with reference to a predefined line;
extracting the area (34) of said divided blob, MBfar, that is furthest away according to said determined path direction; and
determining the region label (35) of said object using blob analysis.
10. The method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b) as claimed in claim 9, wherein said step of determining the region label (35) of said object includes the steps of:
determining the slope degree (41) of the predefined line to compare with a predefined threshold value for extracting area threshold (43, 45);
comparing said extracted area (46) of motion blob, MBfar, with said related area threshold; and
extracting the location with reference to motion blob, MBfar (48), if said extracted area of motion blob, MBfar, is larger (47) than said related area threshold.
11. The method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b) as claimed in claim 9, wherein said slope degree is estimated based on the angle that is formed between the horizontal line and the line that is created from the two furthest points of the predefined lines that overlapped with the motion blobs.
12. The method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b) as claimed in claim 10, wherein said determining the slope degree (41) of the predefined line is followed by the step of extracting area thresholds for different path directions (43) with reference to the slope degree if said slope degree is lesser (42) than a predefined threshold value.
13. The method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b) as claimed in claim 10, wherein said determining the slope degree (41) of the predefined line is followed by the step of extracting a predefined area threshold (45) if said slope degree is larger (44) than a predefined threshold value.
14. The method for determining object position in a monitoring area (17) with an intrusion detection system (10) including a detection means (11) having a process module (11b) as claimed in claim 10, wherein said location with reference to the other divided motion blob of MBfar (50) is extracted if said extracted area of motion blob, MBfar, is lesser (49) than said related area threshold.
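The region-labelling decision described in claims 9 through 14 (divide the overlapped blob with the predefined line, pick an area threshold according to the slope degree, then compare the furthest divided blob MBfar against it) can be sketched as a short illustration in Python. This is not the patented implementation: the function name, parameter names, and default values are hypothetical, since the claims leave the slope threshold T1 and the area thresholds unspecified.

```python
# Hypothetical sketch of the region-labelling logic in claims 9-14.
# All names and default values below are assumptions; the claims do not
# fix the slope threshold (T1) or the area-threshold values.

def classify_blob_position(mb_far_area, other_area, slope_deg,
                           slope_threshold=45.0,
                           flat_fraction=0.5,
                           steep_fraction=0.5):
    """Label an object's position relative to the predefined line.

    mb_far_area -- area of the divided blob furthest from the path
                   direction (MBfar, claim 9)
    other_area  -- area of the remaining divided blob
    slope_deg   -- angle between the horizontal line and the line through
                   the two furthest overlapped points (claim 11)
    """
    total = mb_far_area + other_area
    if total == 0:
        return None  # blob does not overlap the monitoring area

    # Claims 12-13: choose the area threshold according to the slope degree.
    if slope_deg < slope_threshold:
        area_threshold = flat_fraction * total   # per-direction threshold
    else:
        area_threshold = steep_fraction * total  # single predefined threshold

    # Claims 10 and 14: compare MBfar against the chosen threshold.
    if mb_far_area > area_threshold:
        return "far_side"   # location taken with reference to MBfar
    return "near_side"      # location from the other divided blob
```

In practice the two blob areas would come from a blob (connected-component) analysis of the foreground mask, with the predefined line splitting each overlapped blob in two before the areas are measured.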
PCT/MY2012/000081 2011-04-14 2012-04-12 Intrusion detection system for determining object position WO2012141574A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI2011001676 2011-04-14
MYPI2011001676A MY167508A (en) 2011-04-14 2011-04-14 Intrusion detection system for determining object position

Publications (1)

Publication Number Publication Date
WO2012141574A1 true WO2012141574A1 (en) 2012-10-18

Family

ID=47009558

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2012/000081 WO2012141574A1 (en) 2011-04-14 2012-04-12 Intrusion detection system for determining object position

Country Status (2)

Country Link
MY (1) MY167508A (en)
WO (1) WO2012141574A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017051499A1 (en) * 2015-09-25 2017-03-30 Sony Corporation Representative image generation device, representative image generation method, and program
US10740623B2 (en) 2015-09-25 2020-08-11 Sony Corporation Representative image generation device and representative image generation method
WO2022159371A1 (en) * 2021-01-20 2022-07-28 Siemens Aktiengesellschaft Image processing system, method and device, and computer-readable medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060170769A1 (en) * 2005-01-31 2006-08-03 Jianpeng Zhou Human and object recognition in digital video
US7688999B2 (en) * 2004-12-08 2010-03-30 Electronics And Telecommunications Research Institute Target detecting system and method
US7801330B2 (en) * 2005-06-24 2010-09-21 Objectvideo, Inc. Target detection and tracking from video streams


Also Published As

Publication number Publication date
MY167508A (en) 2018-09-04

Similar Documents

Publication Publication Date Title
KR101593187B1 (en) Device and method surveiling innormal behavior using 3d image information
US10032283B2 (en) Modification of at least one parameter used by a video processing algorithm for monitoring of a scene
Regazzoni et al. Video analytics for surveillance: Theory and practice [from the guest editors]
CN102521578B (en) Method for detecting and identifying intrusion
US20130265423A1 (en) Video-based detector and notifier for short-term parking violation enforcement
Verstockt et al. A multi-modal video analysis approach for car park fire detection
US20150049906A1 (en) Human image tracking system, and human image detection and human image tracking methods thereof
KR101921610B1 (en) Method and Apparatus for Monitoring Objects from Video
EP2587462A1 (en) Image monitoring system and method with false alarm rate reduction
KR101840042B1 (en) Multi-Imaginary Fence Line Setting Method and Trespassing Sensing System
WO2012141574A1 (en) Intrusion detection system for determining object position
Chauhan et al. Study of moving object detection and tracking for video surveillance
KR101736431B1 (en) System and Method for Smart Traffic Monitoring Using Multiple Image
US9589191B2 (en) Method for evaluating a plurality of time-offset pictures, device for evaluating pictures, and monitoring system
KR101960741B1 (en) Method for Setting Event Rule According to Interested Boundary and Apparatus for Monitoring Event Using The Method
CN105793868B (en) Method for video surveillance
WO2012148258A1 (en) Abrupt object movement detection system and method thereof
CN111476142A (en) Video image detection method and device
Foresti et al. Vehicle detection and tracking for traffic monitoring
CN111063145A (en) Intelligent processor for electronic fence
KR101951900B1 (en) Method and Apparatus for Detecting Object in an Image
KR101684098B1 (en) Monitoring system with 3-dimensional sensor and image analysis integrated
CN114879177B (en) Target analysis method and device based on radar information
Kadim et al. Video analytics algorithm for detecting objects crossing lines in specific direction using blob-based analysis
WO2011005074A1 (en) Surveillance system and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12772044

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12772044

Country of ref document: EP

Kind code of ref document: A1