CA2814294A1 - Object detection - Google Patents

Object detection

Info

Publication number
CA2814294A1
Authority
CA
Canada
Prior art keywords
blob
camera
digital video
video processor
alert
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CA2814294A
Other languages
French (fr)
Other versions
CA2814294C (en)
Inventor
Wael Badawy
Choudhury A Rahman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intelliview Technologies Inc
Original Assignee
Intelliview Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intelliview Technologies Inc filed Critical Intelliview Technologies Inc
Priority to CA2814294A priority Critical patent/CA2814294C/en
Priority to CA3104723A priority patent/CA3104723C/en
Priority to PCT/CA2014/050407 priority patent/WO2014176693A1/en
Priority to EP14791064.0A priority patent/EP2992365B1/en
Publication of CA2814294A1 publication Critical patent/CA2814294A1/en
Application granted granted Critical
Publication of CA2814294C publication Critical patent/CA2814294C/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M3/00 Investigating fluid-tightness of structures
    • G01M3/002 Investigating fluid-tightness of structures by using thermal means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M3/00 Investigating fluid-tightness of structures
    • G01M3/38 Investigating fluid-tightness of structures by using light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/254 Analysis of motion involving subtraction of images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30241 Trajectory

Abstract

An object detection apparatus comprising a camera having video output comprising frames; and a digital video processor configured to receive the video output from the camera, detect and track a blob in the frames to determine a trajectory for the blob and trigger an alert message if the trajectory of the blob is characteristic of the object. The digital video processor may detect and classify the object as a leak, and provide an alert or alarm. The digital video processor may detect and classify the object as a bird, and provide a bird report. A weather station may be combined with the digital video processor to receive input from the weather station and take the input from the weather station into account in determining whether to trigger an alert.

Description

OBJECT DETECTION
TECHNICAL FIELD
[0001] Object detection.
BACKGROUND
[0002] A variety of leak detection methods are known, including those described in United States patent nos. 4,772,789; 4,963,742; 5,001,346; 5,210,526;
6,812,846 and 7,460,980.
SUMMARY
[0003] The inventors have disclosed a new apparatus for object detection, including leak and bird detection.
[0004] There is thus provided an object detection apparatus, comprising a camera having video output comprising frames; and a digital video processor configured to receive the video output from the camera, detect and track a blob in the frames to determine a trajectory for the blob and trigger an alert message if the trajectory of the blob is characteristic of the object. The digital video processor may detect and classify the object as a leak, and provide an alert or alarm. The digital video processor may detect and classify the object as a bird, and provide a bird report. There may also be provided a weather station and the digital video processor being configured to receive input from the weather station and take the input from the weather station into account in determining whether to trigger an alert. The digital video processor carries out the disclosed detection methods.
[0005] These and other aspects of the device and method are set out in the claims, which are incorporated here by reference.
BRIEF DESCRIPTION OF THE FIGURES
[0006] Embodiments will now be described with reference to the figures, in which like reference characters denote like elements, by way of example, and in which:
[0007] Fig. 1 is a schematic showing a leak detection system;
[0008] Fig. 2 is a high level flow diagram of a leak detection algorithm;
[0009] Fig. 3 is a flow diagram of frame capture and background modeling;
[0010] Fig. 4 is a flow diagram of object detection;
[0011] Fig. 5 is a flow diagram of object characterization and decision making;
[0012] Fig. 6 shows an exemplary display panel for a leak detection digital video recorder;
[0013] Fig. 7 shows an exemplary event notification panel for a leak detection digital video recorder;
[0014] Fig. 8 shows a vessel being monitored along with object areas drawn;
and
[0015] Fig. 9 shows an algorithm for configuring a digital video processor to detect a bird.
DETAILED DESCRIPTION
[0016] Immaterial modifications may be made to the embodiments described here without departing from what is covered by the claims. Referring to Fig. 1, there is shown a leak detection apparatus 10 comprising a digital video processor (DVP) 12, one or more video inputs 14 connected to output frames of video to the DVP 12, one or more accessory inputs 16 connected to output sensor signals to the DVP 12, and event notification outputs 18 that may be connected directly or indirectly, for example through a network 20 such as the internet, to receive alert messages from the DVP 12. The video inputs 14 may comprise one or more of a color camera 22 such as a Day/Night Color Camera IVT-LIT9OESHQ, a thermal camera 24 such as a Thermal Camera IVT-XWTA-19, a pan-tilt-zoom camera 26 such as a conventional security camera, and a hazardous location camera 28 such as a Day/Night Hazardous Location Camera IVT-C1D190ESHQ.
[0017] The accessory inputs 16 may comprise one or more of a radar detector 32, a weather station 34, a gas detector 36 and a flame detector 38. The inputs 14 and 16 may be conventional commercially available products. The DVP 12 may comprise a digital video recorder, with a built-in or added encoder, having digital video processing circuitry, or a digital video recorder combined with a separate encoder and processor. The DVP 12 may comprise conventional digital video recording and processing hardware, configured with software for carrying out the functions disclosed here to receive frames of video output from one or more of the video inputs 14, detect and track a blob in the frames to determine a trajectory for the blob and trigger an alert message if the trajectory of the blob is characteristic of a leak. For example, the DVP 12 may comprise a SmrtDVR IVT-DVR405-05 4Ch DVR.
[0018] The camera 22, 24, 26 and/or 28 may be connected via 75 ohm BNC (RG 59U/RG 6) cable or other suitable communication link to the DVP 12. An exemplary DVP 12 may be equipped with an H.264 hardware encoder, capable of encoding an analog video feed up to a maximum of 4CIF (704x480) resolution at 30 fps. The baseline profile is used with a quality setting of 30 and a GOV setting of 5. The leak analytic preferably uses two cameras: one thermal camera 24 and one color camera 22. The thermal camera 24 is preferably the one on which the leak detection algorithm runs. The color camera 22 or other cameras 26, 28 are preferably used for visual reference and verification. The analysis may, for example, be done on a raw frame buffer (YUV) of CIF (352x240) resolution at 10 fps. The video may, for example, be recorded in H.264 compressed format, with user-selectable resolution and frame rate. In the exemplary embodiment, supported resolutions and frame rates for recorded video are CIF/2CIF/4CIF and 5/10/15/30 fps, respectively, but this will change as standards and available equipment change.
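For reference, the exemplary settings above may be summarized as a configuration sketch; the dictionary layout and key names below are illustrative assumptions and not the DVR product's actual configuration format.

```python
# Illustrative summary of the exemplary settings described above. The layout and
# key names are assumptions for illustration only.
EXAMPLE_DVP_CONFIG = {
    "encoder": {
        "codec": "H.264",
        "profile": "baseline",
        "max_resolution": (704, 480),   # 4CIF
        "max_fps": 30,
        "quality": 30,
        "gov_length": 5,
    },
    "analysis": {
        "pixel_format": "YUV",
        "resolution": (352, 240),       # CIF
        "fps": 10,
    },
    "recording": {
        "resolutions": ["CIF", "2CIF", "4CIF"],   # user selectable
        "fps_options": [5, 10, 15, 30],
    },
}
```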
[0019] As illustrated in Fig. 2, following frame capture 40, the leak detection algorithm goes through a series of stages comprising background modeling 42, object detection 44, object characterization 45, decision making 46 and event notification 48.
Background modeling 42 is carried out as a precursor to the object identification process and includes a learning process when first initiated. The frame capture and background modeling stage is shown in Fig. 3 in the following steps: 50 a frame is captured at a time rate dependent on the hardware, 52 frame processing, again hardware dependent, but in this example to ensure the frame is downscaled to the Common Intermediate Format (CIF) if not already CIF, 54 check the camera illumination and, if this requires changing, reset the learning process if the change is above a user-selectable threshold, 56 generate or update an environmental filter map, if selected, 57 adjust or set auto contrast, and 58 generate or update the background model from time to time, such as every few seconds. The environmental filter map of step 56 may be obtained from data from an accessory input such as the weather station 34 to filter out effects of weather, such as glare, rain or snow.
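By way of illustration only, the background model may be maintained, for example, as a simple exponential running average with a reset on large illumination changes; the disclosure does not prescribe a particular model, and the learning rate, threshold and function names below are assumptions.

```python
import numpy as np

# Minimal sketch of the background modelling stage, assuming an exponential
# running average. The parameters and names are illustrative assumptions.

def update_background(background, frame, learning_rate=0.05):
    """Blend a new frame into the background model (arrays of pixel intensities)."""
    frame = frame.astype(np.float32)
    if background is None:                 # first frame: start the learning process
        return frame
    return (1.0 - learning_rate) * background + learning_rate * frame

def illumination_changed(background, frame, threshold=20.0):
    """Reset trigger (step 54): mean brightness shift above a user-selectable threshold."""
    if background is None:
        return False
    return abs(float(frame.mean()) - float(background.mean())) > threshold
```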
[0020] An exemplary object detection and characterization process is shown in Fig.
4, and includes the following steps: 60 a frame is captured at a time rate dependent on the hardware, 61 frame processing, again hardware dependent, but in this example to ensure the frame is downscaled to the Common Intermediate Format (CIF) if not already CIF, 62 a frame from the thermal camera 24, such as a YUV encoded frame (though other formats may be used), is compared with the background to generate a difference map, 64 blobs within the difference frame are detected by any one of a variety of conventional digital image processing methods, 65 merge blobs if the distance between blobs (x-y pixel distance, for example) satisfies a user-set distance threshold, 66 filter the blobs for environmental and/or size effects (as for the background, to remove for example weather effects), and 68 track the blobs to determine their trajectory and update a list of blobs with the tracking information, including an identification code for each blob and the trajectory information associated with the blob. Blob detection may use HASM as disclosed in "A low power VLSI
architecture for mesh-based video motion tracking" Badawy, W.; Bayoumi, M.A. IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, Volume 49, Issue 7, pp 488-504, July 2002; and also in "On Minimizing Hierarchical Mesh Coding Overhead:
(HASM) Hierarchical Adaptive Structured Mesh Approach", Badawy, W., and Bayoumi, M., Proc.
IEEE Int. Conf. Acoustics, Speech and Signal Processing, Istanbul, Turkey, June 2000, p.
1923-1926; and "Algorithm Based Low Power VLSI Architecture for 2-D mesh Video-Object Motion Tracking", IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, no. 4, April 2002 and United States patent no. 7,986,810.
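By way of illustration, the difference-map and blob-detection steps (62, 64, 66) may be sketched as follows using thresholded frame differencing with connected-component labelling, one of the conventional digital image processing methods contemplated above; it is not the HASM approach of the cited references, and the thresholds and names are assumptions.

```python
import numpy as np
from scipy import ndimage

# Sketch of steps 62 (difference map), 64 (blob detection) and 66 (size filter).
# Thresholds and function names are illustrative assumptions.

def detect_blobs(frame, background, diff_threshold=25, min_area=6):
    """Return bounding rectangles (x0, y0, x1, y1) of regions differing from the background."""
    diff = np.abs(frame.astype(np.float32) - background.astype(np.float32))
    mask = diff > diff_threshold
    labelled, _count = ndimage.label(mask)
    rects = []
    for sl in ndimage.find_objects(labelled):
        ys, xs = sl
        area = (ys.stop - ys.start) * (xs.stop - xs.start)
        if area >= min_area:               # size filter (step 66)
            rects.append((xs.start, ys.start, xs.stop, ys.stop))
    return rects
```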
[0021] In the detection and tracking process of Fig. 4, blobs are detected as each new frame is compared with the background model to find the difference. The blobs may be detected as blob rectangles (a set of x-y pixel coordinates forming a rectangle) or other polygons. These blob rectangles are further analyzed and combined to form an object, so that one physical object is represented by a single object rather than by multiple smaller objects.
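A minimal sketch of the blob-merging step (65), assuming a greedy pairwise merge of blob rectangles whose pixel gap is within the user-set distance threshold; the particular merge order is an illustrative choice and not required by the disclosure.

```python
# Rectangles whose gap is within the distance threshold are combined so that one
# physical object is represented by a single object.

def rect_gap(a, b):
    """Axis-aligned gap in pixels between rectangles (x0, y0, x1, y1); 0 if they overlap."""
    dx = max(b[0] - a[2], a[0] - b[2], 0)
    dy = max(b[1] - a[3], a[1] - b[3], 0)
    return max(dx, dy)

def merge_blobs(rects, distance_threshold=10):
    rects = list(rects)
    merged = True
    while merged:
        merged = False
        for i in range(len(rects)):
            for j in range(i + 1, len(rects)):
                if rect_gap(rects[i], rects[j]) <= distance_threshold:
                    a, b = rects[i], rects[j]
                    rects[j] = (min(a[0], b[0]), min(a[1], b[1]),
                                max(a[2], b[2]), max(a[3], b[3]))
                    del rects[i]
                    merged = True
                    break
            if merged:
                break
    return rects
```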
[0022] The object characterization depends on the analytic. For the leak analytic, the object characterization factors (leak criteria) are as follows. The object has to appear first inside the area of interest; it cannot appear outside the area of interest and move inside. The object can disappear in the next frame but can reappear in the following frame. The object must not move beyond a defined boundary from where it first appeared in the frame. The object has to be within the area of interest in the successive frames before the leak alarm is set off. Objects may be characterized as leaks, drips, sprays and pools.
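The leak criteria above may be checked, for example, as in the following sketch, which assumes a blob record that stores whether the blob first appeared inside the area of interest, how many consecutive frames it has been missing, and its trajectory of (x, y) positions; the field names and thresholds are illustrative assumptions.

```python
# Sketch of the leak criteria. Field names and thresholds are assumptions.

def point_in_rect(x, y, rect):
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1

def satisfies_leak_criteria(blob, area_of_interest, max_drift_px=20, max_missed_frames=1):
    """blob: dict with 'first_inside_aoi', 'missed_frames' and 'positions' (trajectory)."""
    if not blob["first_inside_aoi"]:
        return False                        # must appear first inside the area of interest
    if blob["missed_frames"] > max_missed_frames:
        return False                        # may vanish for a frame, but must reappear
    x0, y0 = blob["positions"][0]
    for x, y in blob["positions"]:
        if abs(x - x0) > max_drift_px or abs(y - y0) > max_drift_px:
            return False                    # must not move beyond a defined boundary
        if not point_in_rect(x, y, area_of_interest):
            return False                    # must stay within the area of interest
    return True
```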
[0023] The decision making algorithm depends on two user-defined parameters: Time to Alert and Time to Alarm. "Time to Alert" is the time the algorithm waits before it sets off an "Alert" to warn the user. The alert indicates that there is a potential leak. "Time to Alarm" is the time the algorithm waits before it turns the "Alert" into an "Alarm" state. The algorithm is shown in Fig. 5. The counter is linear when incrementing and non-linear when decrementing; its decay rate increases while decrementing. Thus, as shown in Fig. 5, the decision making algorithm follows these steps:
70 check if the object satisfies the leak criteria, 72 if no, capture the next frame and perform object detection and characterization, 74 if yes, increase the background update interval, 76 record the initial position of the blob (object) and the time when it first appeared, and initialize a counter to zero, 78 capture a next frame (it does not have to be the immediately following frame, depending on frame rate and desired sensitivity) and perform object detection and characterization as in Fig. 4, 80 check if an object is found and it satisfies the leak criteria, 82 if no object is found that satisfies the leak criteria, decrement the counter and check whether the counter is less than or equal to zero; if it is not, return to step 78, and if the counter is less than or equal to zero, then at step 84 reset the learning interval and return to step 72; if an object is found and it satisfies the leak criteria in step 80, then 86 calculate the elapsed time since the object first appeared, 88 if the elapsed time is greater than or equal to an alert time (first threshold) issue an alert message 90, otherwise return to step 78, and 92 if the elapsed time is greater than the alarm time (second threshold) issue an alarm message, otherwise return to step 78. A leak is therefore characterized, and an alert or alarm triggered, if it meets the criteria and persists after its initial appearance as a leak.
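By way of illustration, the persistence counter and the alert/alarm thresholds of Fig. 5 may be sketched as follows; the one-per-frame increment and the growing decrement are assumptions consistent with the statement that the counter increments linearly and decays faster while decrementing.

```python
import time

# Sketch of the Fig. 5 decision logic. Time to Alert and Time to Alarm are the
# two user-defined parameters; the counter update schedule is an assumption.

def decide(elapsed_s, time_to_alert_s, time_to_alarm_s):
    """Map how long a leak candidate has persisted to a notification level."""
    if elapsed_s > time_to_alarm_s:
        return "ALARM"
    if elapsed_s >= time_to_alert_s:
        return "ALERT"
    return None

class PersistenceCounter:
    def __init__(self):
        self.value = 0
        self.miss_streak = 0
        self.first_seen = None

    def hit(self):                          # a qualifying object was found this frame
        if self.first_seen is None:
            self.first_seen = time.time()   # step 76: record when it first appeared
        self.value += 1                     # linear increment
        self.miss_streak = 0

    def miss(self):                         # no qualifying object this frame (step 82)
        self.miss_streak += 1
        self.value -= self.miss_streak      # decay rate increases while decrementing
        if self.value <= 0:                 # candidate dropped: reset (step 84)
            self.value = 0
            self.miss_streak = 0
            self.first_seen = None

    def elapsed(self):
        return 0.0 if self.first_seen is None else time.time() - self.first_seen
```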
[0024] For event notification, in the event of an alert or alarm the system can be programmed to send the notification to the user in several ways. The system can send the notification to a central monitoring station 96, to email 98 or to a handheld smart phone 99 with a mobile app. The events can also be viewed through a web browser 97 (for example Internet Explorer/Chrome) by logging into the DVP 12. The notification contains the alert/alarm event message and may include snapshots from the thermal camera 24 and color camera 22 (if equipped) or other camera 26, 28. The system can also be programmed to set off a buzzer or siren through its digital I/O interface, or to send a notification to a SCADA system through an RTU/MODBUS interface.
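A minimal sketch of the notification fan-out, assuming each channel (monitoring station 96, email 98, mobile app 99, and so on) is represented by a callable; the dispatch structure is illustrative and not part of the DVR product's interface.

```python
# Illustrative fan-out of an alert/alarm event to several notification channels.

def notify(event, channels):
    """event: dict with 'level' ('ALERT' or 'ALARM'), 'message' and optional 'snapshots'."""
    for name, send in channels.items():
        try:
            send(event)
        except Exception as exc:            # one failing channel should not block the others
            print(f"notification via {name} failed: {exc}")

# Example wiring with stand-in senders:
channels = {
    "monitoring_station": lambda e: print("to station 96:", e["message"]),
    "email": lambda e: print("to email 98:", e["message"]),
    "mobile_app": lambda e: print("to smart phone 99:", e["message"]),
}
notify({"level": "ALERT", "message": "potential leak detected"}, channels)
```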
[0025] When an alarm occurs, or at other times, the system 10 may use the thermal camera 24 or one of the other video inputs 14 to capture and store an image of the field of view of the thermal camera 24. The system 10 is preferably self-contained and able to operate on localized power when available, or equipped with its own power source such as battery, diesel generator, solar, wind or fuel cell. In hazardous location applications, the system 10 should be installed in approved enclosures, such as Class 1 Division 1 and Division 2 enclosures, and built to withstand extreme temperatures. Video from any of the inputs 14 may be stored in the DVP 12 for one touch playback and instant video tagging.
The DVP 12 may be connected to monitor hundreds of cameras and devices, as well as providing leak detection, color matching, security analysis and more in a few rack mountable devices.
Camera coverage varies by type and application; a camera may, for example, be placed from 5 m to 150 m from the target area.
[0026] Current thermal camera technology requires a temperature difference of 5 degrees C for the camera to distinguish between objects. When external temperatures match the temperature of the product in a pipeline (within plus or minus 5 degrees), the detection system will not be able to detect leaks. To resolve this, the DVP 12 can accept a communications input from the operator of the equipment being monitored, such as a Modbus or other SCADA input, to indicate product temperature, and the weather station 34 can be used to indicate ambient temperature. When the DVP 12 sees an ambient thermal match, it can be programmed to send notifications and periodic images, using the color camera 22 for visual verification, to a monitoring station 96.
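The ambient thermal match check may be sketched, for example, as follows, using the 5 degree C figure stated above; the function and parameter names are illustrative assumptions.

```python
# Product temperature comes from the operator's Modbus/SCADA input; ambient
# temperature comes from the weather station 34.

def thermal_match(product_temp_c, ambient_temp_c, min_contrast_c=5.0):
    """True when the thermal camera cannot reliably distinguish a leak from the background."""
    return abs(product_temp_c - ambient_temp_c) < min_contrast_c

# When True, the DVP can instead send periodic color-camera images to the
# monitoring station 96 for visual verification.
```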
[0027] In a blizzard, the thermal camera 24 cannot "see" the target area.
Using the weather station 34 as an input to the DVP 12, the DVP 12 may send notifications and periodic images using the color camera 22 for visual verification to a monitoring station 96.
High winds can cause movement in masts and poles and cause false alarms; guy wires are recommended whenever possible to secure the supports for the inputs 14. The DVP 12, when operated with a thermal camera 24 as the primary source of leak detection information, may cause an alarm if personnel are within the field of view of the thermal camera 24. Alert and Alarm delays programmed into the solution are used to ignore personnel and vehicles passing through the area. However, at times, personnel will be required to stay within the area of interest and, due to this, will initiate an alarm. To resolve this, a switch may be installed that can disable the operation of the DVP 12 while personnel are onsite. Once finished, personnel can switch the DVP 12 back on, or the DVP 12 can be programmed to restart automatically after a certain time period.
[0028] In normal operation, the thermal sensor 24 is able to pick up the thermal signature of a fluid spray or flow emanating from an enclosed space that would typically have a different temperature than the outdoor conditions. During day time, the color camera 22 may optionally be used as a visual reference to detect the oil or liquid accumulation over a surface. The DVP 12 can also use the color camera 22 to take snapshots of the scene to periodically send to the user for visual verification of operation. The snapshots can be sent to a remote monitoring station 96, or as email attachments 98 via the user's own network, 3G cellular, or a satellite communication solution.
[0029] Depending on the type of pipeline or vessel being monitored and what is being transported through the pipe or stored in the vessel, the characteristics of a spill could vary significantly. A combination of thermal cameras 24 and video cameras 22 may be key to successfully identifying leaks. This way, temperature differences can be used to accurately identify leaks or spills through analysis, and color and size characteristics can be used for manual verification.
[0030] When using the thermal camera 24, heat is the major deciding factor for detecting a pipeline leak. For the system 10 to be the most effective in monitoring a pipeline, it should be set up along the critical points of the pipeline. Ideally, the setup is mounted 30 feet (10 meters) above the ground so that the cameras scan the area below, allowing the video analytics software to detect any scene anomalies that may result from liquid spills, sprays, leaks, or pools. Both the color camera 22 and the thermal camera 24 are used in conjunction to improve detection during day time and night time. The video image must have a clear picture with no noise interference for the camera or other input 14 to accurately monitor the area and detect pipeline leaks effectively. Preferably, the pipeline or other vessel should cover a minimum of 20% of the thermal camera's field of view. This means that the camera should be set up so that 20% of the image is taken up by the pipeline or other vessel. The camera can be up to 150 metres away from the furthest part of the pipeline and will effectively cover at least 30 metres of pipeline. To make an accurate detection from an image with a field of view of 240 x 320 pixels, 256 pixels must show a temperature difference. This means that a spray taking up 0.34% of the image can be detected.
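A worked check of the pixel figures above, assuming the 240 x 320 pixel thermal image stated in the paragraph.

```python
# 256 pixels out of a 320 x 240 (76,800 pixel) thermal image.
frame_pixels = 320 * 240                  # 76,800 pixels in the thermal image
min_detection_pixels = 256                # pixels that must show a temperature difference
fraction = min_detection_pixels / frame_pixels
print(f"{fraction:.2%} of the image")     # roughly a third of one percent of the frame
```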
[0031] Detection area depends on the equipment used, but the following chart illustrates, for example, the detection area at different distances with 19 mm and 60 mm lenses.
Distance from camera (m) | 19 mm lens coverage area (m2) | 19 mm lens minimum detection area (m2) | 60 mm lens coverage area (m2) | 60 mm lens minimum detection area (m2)
5 | 3.61 | 0.0030 | - | -
25 | 90.41 | 0.0753 | - | -
50 | 361.66 | 0.3014 | 33.29 | 0.0277
75 | 813.73 | 0.6781 | 74.91 | 0.0624
100 | 1446.64 | 1.21 | 133.18 | 0.1110
125 | 2260.38 | 1.8836 | 208.09 | 0.1734
150 | 3254.95 | 2.7125 | 299.65 | 0.2497
[0032] For example, if you are using a 19 mm thermal camera, you need to have the camera at a maximum distance of 80 metres to be able to detect a 1 m2 liquid spill. A larger spill can be detected at a further distance. In order to detect a spill, it should cover at least 6 pixels across its critical dimension, which is equivalent to 8 pixels per meter at an 80 metre distance. Follow the camera hardware mounting instructions, and minimize cabling (power and video) in order to prevent signal and power degradation.
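A worked check of the 19 mm example above, assuming the pixel scale falls off linearly with distance from the quoted 8 pixels per metre at 80 metres; the scaling law and names are assumptions for illustration.

```python
# Pixels spanned by a spill's critical dimension at a given distance.
PIXELS_PER_METRE_AT_80M = 8.0
MIN_PIXELS = 6                            # minimum pixels across the critical dimension

def pixels_across(distance_m, width_m=1.0):
    """Approximate pixels spanned by a feature width_m across at distance_m."""
    scale = PIXELS_PER_METRE_AT_80M * 80.0 / distance_m   # assumed 1/distance scaling
    return scale * width_m

print(pixels_across(80))    # 8.0 pixels for a ~1 m wide (1 m2) spill: detectable
print(pixels_across(120))   # ~5.3 pixels: below the 6-pixel minimum
```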
[0033] Wiring and cabling requirements follow the standards defined by each technology. The following table addresses common applications for each component.
Part Number | Usage | Cable Type | Max Length
SmrtDVR IVT-DVR405-05 4 ch DVR | LAN/WAN | CAT5e | 100 m
SmrtDVR IVT-DVR405-05 4 ch DVR | Digital I/O | UL Shielded | 1000 m
SmrtDVR IVT-DVR405-05 4 ch DVR | Serial I/O | UL Shielded | 3 m
SmrtDVR IVT-DVR405-05 4 ch DVR | RS485 | UL Shielded | 1000 m
SmrtDVR IVT-DVR405-05 4 ch DVR | DC Out 12v - 5v | - | 1A maximum
SmrtDVR IVT-DVR405-05 4 ch DVR | Audio In | UL Shielded | 3 m
Day/Night Camera IVT-C1D190ESHQ | Video | RG6U | 500 m
Day/Night Camera IVT-LIT9OESHQ | Video | RG6U | 500 m
Thermal Camera IVT-XWTA-19 | Video | RG6U | 500 m
All Cameras | Power | Varies | Varies
[0034] Note: actual cable lengths vary depending on gauge, type and connectors.
During setup, ensure all required components are connected. This should include the thermal camera connected to a CH, the color camera connected to a CH, the power connected to IGN, the monitor connected to VGA, a USB mouse connected to any USB port, and a keyboard connected to any USB port. In order for the device to connect externally and send alarms, a LAN or wireless device will also need to be connected and configured according to its requirements.
[0035] The DVP 12 may be set up to show operators a real-time live feed from connected cameras, allow operators to review recorded video, analyze video images to determine if preset user-defined rules have been broken and decide if real-time alerts need to be generated, send a real-time event notification when a rule has been breached, transmit live video data to monitoring stations, permit access to alarm notification, video data, and analytic setup via Internet Explorer or web access software, and read and write analog and digital inputs and outputs (for example to be used for interacting with local devices such as gate locks and PTZ cameras).
[0036] Referring to Fig. 6, the DVP 12 may include a video panel 120 for displaying one or more of a live camera feed (from connected inputs 14), playback of recorded video from the DVP 12 or stored archives, and an instant display of alarm video. The DVP 12 may be provided with a control panel 122 that provides administrative options which may be accessed using an administration button 136, a Layout button 126 for various layout options, tagging of video using a bookmark button 130, control of the PTZ camera functions, camera selection, disabling or enabling of analytics, and review of recorded video using a replay button 134.
The DVP 12 may be provided with an alarm control panel 124 including a list 140 of alarms and system events, an alarm quick preview screen 142 and information screens 144 such as the current date and time. Access to the administration interface allows access to the DVP 12 settings and alarm rule setup, as well as exiting the DVP 12.
[0037] Layout options are used to configure the way cameras are arranged on the screen. A PTZ setup button 128 may be used for configuration of PTZ camera 26 presets, auto-panning and camera patterns. A bookmark button 130 may be used for manually triggering an alarm. A draw objects button 132 and corresponding function may be used to track moving objects by showing a trail of the path (trajectory) as well as the outline of the analytic. A display 138 shows free disk space. PTZ Controls 156 may be used to operate a Pan-Tilt-Zoom (PTZ) camera 26 in case one is connected to the system. Presets for the PTZ
controls can also be set here. Camera controls buttons 146 on the control panel may be used to indicate which cameras are connected and operating. A replay control panel button 134 may be used to switch to the replay control panel for recorded video and to activate instant backwards playback of the current video.
[0038] Referring to Fig. 7, an event notification panel may be used, including an event notification list box 140 that contains all of the alarms and system events that have occurred since the software was started. These can be alarms that have been triggered by the analytics engine, or notifications intended to inform the operator about something specific such as the state of the software, for example an alarm notification triggered by a fluid leak or spill, or an event notification that the software started successfully. A preview screen 142 may be used to see a snapshot of the event that triggered the alarm, and may be activated by a click on the alarm entry in the grid. Alarm notifications can be sorted by time 150, camera number 152 or bookmark 154, allowing rapid sorting through footage and event isolation.
[0039] A Camera Controls panel 148, shown in Fig. 6, comprising the numbered buttons or icons 146 may be used. Each icon 146 represents a single camera, and the number inside is the number of the camera channel. If the icon has a green background with a red circle, the camera is connected to the video channel and the analytics for this camera are running. If the icon is just green with no red circle, the camera is on but no analytics are running. If the icon has a blue background, the video channel is available, but either no camera is connected to the channel or the camera has lost power. If the icon has a gray background, the video channel is not available for this system. It is also possible that the channel is not available because it has not been enabled by the licensing system.
[0040] If a camera is connected to the system and operating correctly, live feeds from the camera will be displayed in the appropriate camera display, along with the camera name, date, and other information. When an event alert has been triggered, a red border may be provided to flash on the related camera view window. Layouts, and the specification of which cameras to operate and view, may be controlled by clicks on appropriate portions of the control panels.
[0041] The Draw Objects button 132 may be used to control whether rule areas and objects detected by analytics are displayed on screen alongside the camera feed. If analytics are enabled for a specific camera, then objects associated with those analytics may be tagged and drawn on the screen. If the Objects option is turned off, this may be omitted from the screen even though analytics are running; what is then seen is just the raw feed from the camera. If, on the other hand, the Draw Objects option has been enabled, the rules area may be outlined 156 in red and distinct objects highlighted 158 on the screen in green. The objects are not randomly chosen; they are the objects identified by the analytics. If there are no analytics running for this camera then no objects will be highlighted. Objects will also not be highlighted if the analytics are running but nothing is triggering alarms, meaning the analytics are not identifying anything that fits their specified description. For the example in Fig. 8, the analytics were set to detect liquid sprays. The area of interest 156 was set up to look for liquids within that area. In this example, the analytic is running and has identified an object, which is indicated in green (rectangular box 158). If no analytics are running, then no analytic boundary will be indicated and no objects will be detected.
[0042] Software on the DVP 12 may run as an application on a conventional operating system, for example in the Microsoft Windows XP Embedded environment and can be exited if needed. Under normal working conditions this is not necessary and is not recommended. The application may be designed to run all the time and should not be closed even if the user session is over. However, exiting the software can be helpful when debugging the system.
[0043] Depending on the size of the site being monitored, an application of the system 10 may use one or more pairs of thermal camera 24 and color camera 22, preferably mounted alongside each other, for outdoor leak detection, one or more color cameras 22 for indoor leak detection, and one or more security cameras 26, for example for monitoring one or more access points such as a gate. For example, the outside edge of an evaluation site area of interest may include a gate that vehicles and personnel are required to stop at. As the leak detection solution is based on thermal signatures, personnel (on foot or in vehicles, which radiate heat) that remain in the area for longer than 30 seconds may be required to use a manual switch to temporarily disable the functionality of the DVP 12 in order to prevent a false alarm. This will require training and awareness by personnel on the test site due to the test environment defined. However, it is important to note that this additional requirement will not be applicable to the majority of actual site applications due to remote locations, limited areas of interest, and fewer site visits. Colour detection without thermal video may be used in indoor applications such as pump houses, where specific conditions concerning lighting, product colour, and collection areas can be controlled.
[0044] The fixed station 96 may be used for network management, configuring the security profiles on multiple DVPs 12, viewing live video from cameras (fixed and PTZ), managing system users, generating audio/visual alarm and event notifications for leaks or security, video data management, archiving and summarizing, and performing automated system diagnostics to ensure continual system operation.
[0045] The DVP 12 may also be used to identify a bird using the radar gun 32 as shown in Fig. 9. The radar gun 32 is connected via a conventional communication link to provide input signals to the DVP 12 that correspond to an object's speed. The DVP 12 is configured to use the input signals to classify the blob for example as disclosed in Fig. 9.
After the process is initiated at 100, an image is acquired at step 102 with the thermal camera 24 and a blob is located at 104 by a conventional detection method such as HASM and identified as an object and tracked. At 106, if the object does not cross a line or region within the field of view of the camera 24, then another image is acquired at step 102 and the process repeated. At 106, if the object crosses a region or line within the field of view of the camera, then the object is flagged as a potential bird. At the same time, at step 108, the DVP
12 acquires input signals from the radar gun 32 that are characteristic of the object's speed.
At 110, if the speed of the object does not exceed a predefined speed or bird detection threshold, another speed is obtained with the radar gun 32 that is characteristic of the speed of another or the same object at a later time. If the speed of an object is above the bird detection threshold, and the object has been flagged as a potential bird based on its location in the field of view of the thermal camera 24, then a bird is reported by the DVP
12.
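By way of illustration, the final bird classification of Fig. 9 may be sketched as follows; the threshold value and names used are assumptions, as the disclosure leaves the bird detection threshold to be predefined.

```python
# A tracked object that crossed the line or region of interest is reported as a
# bird when the radar gun 32 reports a speed above the bird detection threshold.
# The default threshold is a placeholder assumption.

def classify_bird(crossed_region, radar_speed_kmh, bird_speed_threshold_kmh=20.0):
    """Return True when a tracked object should be reported as a bird."""
    return crossed_region and radar_speed_kmh > bird_speed_threshold_kmh

# Example: a blob crossed the detection line while the radar gun read 35 km/h.
print(classify_bird(crossed_region=True, radar_speed_kmh=35.0))   # True -> bird report
```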
[0046] In the claims, the word "comprising" is used in its inclusive sense and does not exclude other elements being present. The indefinite articles "a" and "an"
before a claim feature do not exclude more than one of the feature being present. Each one of the individual features described here may be used in one or more embodiments and is not, by virtue only of being described here, to be construed as essential to all embodiments as defined by the claims.

Claims (7)

THE EMBODIMENTS OF THE INVENTION IN WHICH AN EXCLUSIVE PROPERTY
OR PRIVILEGE IS CLAIMED ARE DEFINED AS FOLLOWS:
1. An object detection apparatus, comprising:
a camera having video output comprising frames; and a digital video processor configured to receive the video output from the camera, detect and track a blob in the frames to determine a trajectory for the blob and trigger an alert message if the trajectory of the blob is characteristic of the object.
2. The object detection apparatus of claim 1 in which the digital video processor is configured to detect a blob and identify the blob as a leak if the blob matches leak criteria.
3. The object detection apparatus of claim 2 in which the digital video processor is configured to trigger an alert or alarm if the blob persists beyond a threshold.
4. The object detection apparatus of claim 2 in which the digital video processor is configured to trigger an alert if the blob persists beyond a first threshold and an alarm if the blob persists beyond a second threshold.
5. The leak detection apparatus of claim 1, 2, 3 or 4 further comprising a weather station and the digital video processor being configured to receive input from the weather station and take the input from the weather station into account in determining whether to trigger an alert.
6. The object detection apparatus of claim 1 further comprising a radar gun connected to provide input signals to the digital video processor that correspond to an object's speed and in which the digital video processor is configured to use the input signals to classify the blob.
7. The object detection apparatus of claim 6 in which the digital video processor is configured to identify the blob as a bird if the blob crosses a region or line within the field of view of the camera and the speed of the blob exceeds a bird detection threshold.
CA2814294A 2013-04-29 2013-04-29 Object detection Active CA2814294C (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CA2814294A CA2814294C (en) 2013-04-29 2013-04-29 Object detection
CA3104723A CA3104723C (en) 2013-04-29 2013-04-29 Object detection
PCT/CA2014/050407 WO2014176693A1 (en) 2013-04-29 2014-04-29 Object detection
EP14791064.0A EP2992365B1 (en) 2013-04-29 2014-04-29 Object detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CA2814294A CA2814294C (en) 2013-04-29 2013-04-29 Object detection

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CA3104723A Division CA3104723C (en) 2013-04-29 2013-04-29 Object detection

Publications (2)

Publication Number Publication Date
CA2814294A1 true CA2814294A1 (en) 2014-10-29
CA2814294C CA2814294C (en) 2021-02-16

Family

ID=51831029

Family Applications (2)

Application Number Title Priority Date Filing Date
CA2814294A Active CA2814294C (en) 2013-04-29 2013-04-29 Object detection
CA3104723A Active CA3104723C (en) 2013-04-29 2013-04-29 Object detection

Family Applications After (1)

Application Number Title Priority Date Filing Date
CA3104723A Active CA3104723C (en) 2013-04-29 2013-04-29 Object detection

Country Status (1)

Country Link
CA (2) CA2814294C (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9468850B2 (en) 2014-10-10 2016-10-18 Livebarn Inc. System and method for optical player tracking in sports venues
US20180174413A1 (en) * 2016-10-26 2018-06-21 Ring Inc. Customizable intrusion zones associated with security systems
CN110631771A (en) * 2018-06-22 2019-12-31 通用汽车环球科技运作有限责任公司 Method and apparatus for leak detection
CN112954179A (en) * 2021-03-19 2021-06-11 杭州伍沃电子商务有限公司 Device capable of collecting large bird information in special weather
CN113192277A (en) * 2021-04-29 2021-07-30 重庆天智慧启科技有限公司 Automatic alarm system and method for community security
US11545013B2 (en) 2016-10-26 2023-01-03 A9.Com, Inc. Customizable intrusion zones for audio/video recording and communication devices

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108153763B (en) * 2016-12-05 2023-11-24 佳能株式会社 Indexing device and method, object image retrieval device and monitoring system
CN111402284B (en) * 2020-03-17 2023-07-25 中国人民解放军国防科学技术大学 Image threshold value determination method and device based on three-dimensional connectivity

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9468850B2 (en) 2014-10-10 2016-10-18 Livebarn Inc. System and method for optical player tracking in sports venues
US9744457B2 (en) 2014-10-10 2017-08-29 Livebarn Inc. System and method for optical player tracking in sports venues
US20180174413A1 (en) * 2016-10-26 2018-06-21 Ring Inc. Customizable intrusion zones associated with security systems
US10891839B2 (en) * 2016-10-26 2021-01-12 Amazon Technologies, Inc. Customizable intrusion zones associated with security systems
US11545013B2 (en) 2016-10-26 2023-01-03 A9.Com, Inc. Customizable intrusion zones for audio/video recording and communication devices
CN110631771A (en) * 2018-06-22 2019-12-31 通用汽车环球科技运作有限责任公司 Method and apparatus for leak detection
CN112954179A (en) * 2021-03-19 2021-06-11 杭州伍沃电子商务有限公司 Device capable of collecting large bird information in special weather
CN113192277A (en) * 2021-04-29 2021-07-30 重庆天智慧启科技有限公司 Automatic alarm system and method for community security
CN113192277B (en) * 2021-04-29 2022-09-30 重庆天智慧启科技有限公司 Automatic alarm system and method for community security

Also Published As

Publication number Publication date
CA3104723A1 (en) 2014-10-29
CA3104723C (en) 2023-03-07
CA2814294C (en) 2021-02-16

Similar Documents

Publication Publication Date Title
US10373470B2 (en) Object detection
CA2814294C (en) Object detection
EP2992365B1 (en) Object detection
CA3000005C (en) Drone detection systems
CN103745579B (en) A kind of safety defense monitoring system
CN106600872A (en) Radar video linkage based intelligent boundary security system
CN106657921A (en) Portable radar perimeter security and protection system
US5966074A (en) Intruder alarm with trajectory display
US20130208123A1 (en) Method and System for Collecting Evidence in a Security System
US20030025599A1 (en) Method and apparatus for collecting, sending, archiving and retrieving motion video and still images and notification of detected events
KR101743386B1 (en) Video monitoring method, device and system
CN107483889A (en) The tunnel monitoring system of wisdom building site control platform
CN105574468B (en) Video flame detection method, device and system
CN112288984A (en) Three-dimensional visual unattended substation intelligent linkage system based on video fusion
CN103325209A (en) Intelligent security alarm system based on wireless
CN204129891U (en) A kind of high ferro anti-intrusion system along the line
US10706699B1 (en) Projector assisted monitoring system
Alshammari et al. Intelligent multi-camera video surveillance system for smart city applications
KR20110040699A (en) Forest fire monitiring system and control method thereof
CN112382033A (en) Regional intrusion alarm system
CN111696390B (en) Intelligent airport runway FOD device and working process thereof
CN112634579A (en) Transformer substation security early warning linkage system
WO2019099321A1 (en) Collaborative media collection analysis
CN114446004A (en) Security protection system
CN206132990U (en) Safety monitoring system based on phased array radar

Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20180430