WO2010116364A1 - Security management system and method - Google Patents

Security management system and method

Info

Publication number
WO2010116364A1
WO2010116364A1 (PCT/IL2010/000276)
Authority
WO
WIPO (PCT)
Prior art keywords
station
data
local
central station
video
Prior art date
Application number
PCT/IL2010/000276
Other languages
English (en)
Inventor
Gil Shavit
Eitan Sterner
Original Assignee
Tadsec Advanced Homeland Security Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tadsec Advanced Homeland Security Technologies Ltd filed Critical Tadsec Advanced Homeland Security Technologies Ltd
Publication of WO2010116364A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188 Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position

Definitions

  • The present invention relates to security systems and, in particular, to security systems which combine video data with other related data.
  • The present invention overcomes these deficiencies of the background art by providing a system and method for managing data related to events received from a plurality of resources.
  • The system is preferably composed of distributed stations arranged in a star topology, wherein all the stations are connected to a central station.
  • A star topology is a topology wherein all nodes are connected to a central node.
  • Each station is composed of one or more sensor devices, which can be carried by a person, installed in a vehicle or located in a certain area.
  • A station that can be mobile is termed herein a moving station, while a station which covers a specific area, such as a street, is termed herein a fixed station.
  • The sensor device may include at least one camera and/or any other sensor, and optionally a GPS and a communication device such as a telephone. Information from the sensor devices is sent to the local station in real time, for example by radio or by cellular telephone communication.
  • The local station is preferably able to filter and correlate events received from the sensors and transmit only suspicious events to the central station.
  • A central station is a station which monitors and controls all the local stations and enables the operator to detect suspicious events, decide about emergency situations and react to these situations.
  • The data received in the central station is preferably rendered to support a three-dimensional picture (which includes static elements overlaid with dynamic data); this data can be combined with other information.
  • Such information can be received, for example, by an SMS message, a telephone call or by GPS.
  • The central station is able to combine live video with a previously generated 3D (three-dimensional) map.
  • The system can also present a moving station having at least a camera and a GPS as an animated moving object.
  • The information presented on the video can be tilted, zoomed in or zoomed out.
  • The system is able to prevent false alarms by correlating and filtering the events at the local station.
  • The system provides full event management with a combination of 3D (three-dimensional) animation/modeling and an inserted video stream for video warping to form a composite image (one possible warping approach is sketched at the end of this section).
  • The system enables focusing on real alerts only, by analyzing the data to detect suspicious events at the local stations and transferring only data related to a suspicious event to the central station.
  • The camera or cameras at the local station are preferably constantly active and the output is preferably kept in a database. The duration of the clips that are kept depends on the capacity of the storage device and on configuration parameters.
  • When an event occurs, the local station preferably sends a video clip composed of the clips that contain information from before, during and after the event, enabling the actions to be reported aligned to the time at which they happened, for later investigation (a minimal clip-buffering sketch appears at the end of this section).
  • Focusing on real event data is also achieved by correlating the events received from the plurality of sensors. Correlation as well as filtering are done at each local station and preferably prevent false alarms and redundant alarms. Correlation is done, for example, by cross-checking alarms received from two or more sensors, so that an alarm is derived from a combination of alerts received from more than one sensor; for example, motion detection or noise detection alone optionally does not cause an alarm, but the combination of movement and noise optionally indicates an alarm (a minimal correlation sketch appears at the end of this section).
  • The filtering mechanism can be used for filtering redundant alarms; for example, light reflecting off a broken bottle can trigger a visual sensor while a microwave sensor would detect nothing, so the alert would be noted as a false alarm and would be filtered out.
  • The map presented at the central station focuses on the location of the event.
  • The fixed station can be either manned or unmanned.
  • An example of an unmanned local station is a street cabinet.
  • An event can also be detected by a human being, in addition to the automatic detection. Generating an event that was detected by a human being can preferably be done by operating a menu, by pressing one of the buttons, or by any other user interface means. Generating an event in an unmanned local station can be done, for example, by sending a message over the network (a hypothetical message format is sketched at the end of this section). Such a message preferably causes the local station to send data to the central station as if the event had been automatically detected by the local station.
  • Each local station features an automatic messaging mechanism, such as for example SMS, for enabling the operator to send information to the central station.
  • No video from any camera is shown at the central station if there is no event or other reason to show it to the operator.
  • The central station operator can initiate any video request at any time, on call. For example, if a civilian calls in an emergency, the operator can request the region of interest from the map and see all related video.
  • Real-time video or recorded clips can be sent wirelessly from any local station to the mobile system of the security patrol in order to more easily acquire the target.
  • PCT application WO 2008/026210 discloses a system and method for organization of security related data in order to at least assist in the determination of the relative priority of such data.
  • This application does not disclose any method or system for transferring relevant data from a plurality of local stations arranged in a star topology to a central station, nor does this application disclose any system and method for receiving and filtering data from local stations.
  • This application does not disclose a method and system for combining the transmission of data received by one or more sensors with GPS data, text data or data received by a voice call.
  • This application does not teach how to render the data received from a camera to support a three-dimensional picture. The application does not teach false alarm management.
  • One or more surveillance cameras capture video data.
  • One or more video analytics devices process the video data from one or more of the surveillance cameras and detect primitive video events in the video data.
  • A network management module monitors the network status of the surveillance cameras and the video analytics devices, and generates network events reflective of the network status of all subsystems.
  • A correlation engine correlates two or more primitive video events from the video analytics devices, weighted by the attribute data of the surveillance cameras used to capture the video data, and network events from the network management module, weighted by the attribute data of the device corresponding to the network event.
  • This application does not teach or suggest a network having a star topology wherein each local station filters the video events and transmits only relevant events to the central station, and wherein this central station is capable of presenting a three-dimensional display by rendering the received data and of combining the video data with other information, such as GPS data, received from the distributed local stations.
  • US application 2008/0048851, filed on September 27, 2007, teaches a system comprising secure, redundant, verifiable, computer-enabled, direct or networked facility emergency notification, rapid alert management and alarm systems installed in public, private and government buildings, and outdoor areas for which there is a need for rapid alerts to occupants or attendees of the occurrence of impending or in-progress dangerous or threatening events.
  • That application relates to highly secure, access-controllable, flexible, hierarchical, local, regional, national or international fast alert systems comprising computer-enabled and direct or network-linked apparatus, software, and methods enabling rapid dissemination from a central station, or decentralized or mobile location, of alerts of the occurrence of threatening or dangerous events in a series of hierarchical, increasing levels of directed action to be taken by the occupants.
  • This application, however, teaches only how to transmit alarms; it does not teach how to filter relevant data such as video and transmit such data to a central manager, wherein said central manager combines all data received from different resources.
  • US Patent Number 6,069,655, issued on May 30, 2000, teaches a video security system which monitors a premises to detect unwanted intrusions onto the premises.
  • A plurality of cameras located about the premises supply video images of scenes to a processor, which processes the images to detect motion in a scene and classify the source of the motion. Only if the source is determined to be one of a predetermined class of causes is an indication provided to an alarm unit.
  • The alarm unit, which is also connected to a plurality of conventional sensors, is responsive to the indication and causes the processor to transmit authenticated video images of the scene in which the motion is detected to a central station (CS).
  • A video server, in conjunction with an alarm computer, enables the images to be displayed at a selected workstation for viewing by an operator.
  • This patent does not teach or suggest sensors which can be carried by a person or installed in a vehicle; neither does it suggest criteria, unrelated to the movement of an object, for transferring the video to a central station after analysis by a local station.
  • This patent also does not suggest the display of the video in a three-dimensional rendering with both static and dynamic components.
  • Implementation of the method and system of the present invention involves performing or completing certain selected tasks or stages manually, automatically, or a combination thereof.
  • Several selected stages could be implemented by hardware, or by software on any operating system or firmware, or a combination thereof.
  • Selected stages of the invention could be implemented as a chip or a circuit.
  • Selected stages of the invention could be implemented as a plurality of software instructions executed by a computer using any suitable operating system.
  • Selected stages of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • Any device featuring a data processor and/or the ability to execute one or more instructions may be described as a computer, including but not limited to a PC (personal computer), a server, a minicomputer, a cellular telephone, a smart phone, a PDA (personal data assistant), a pager, a TV decoder, a game console, a digital music player, an ATM (machine for dispensing cash), a POS (point of sale) credit card terminal, or an electronic cash register. Any two or more of such devices in communication with each other, and/or any computer in communication with any other computer, may optionally comprise a "computer network".
  • FIG. 1 is a schematic drawing of the system.
  • FIG. 2 is a schematic exemplary drawing of a local station.
  • FIG. 3 is a schematic exemplary drawing of a central station.
  • FIG. 4 is a high-level flow diagram of a local station behavior when an event is detected.
  • FIG. 5 is a high-level flow diagram of a central station behavior when an event is received.
  • The present invention is of a system and method for managing data related to events received from a plurality of resources.
  • FIG. 1 is a schematic drawing of the system.
  • System 100 is composed of distributed stations arranged in a star topology, wherein all the stations are connected to a central station.
  • A star topology is a topology wherein all nodes are connected to a central node.
  • The local stations are divided into three types: manned moving local station 102, unmanned fixed local station 104 and manned fixed local station 105.
  • Moving local station 102 is a station that is portable and can be carried by a person, installed in a vehicle and the like. Moving local station 102 is manned, while a fixed local station can be manned (105) or unmanned (104).
  • Both manned and unmanned stations have the same functionality; however, in a manned station, detection of an event by a human being or any other human operation is performed by the station's operator, while for an unmanned station such operations are performed from the central station 101 by sending a message over the network.
  • Each local station is composed of one or more sensor devices (not shown). The local station is able to filter events received from a camera and transmit only suspicious events to the central station 101.
  • The data received in the central station 101 is preferably rendered to support a three-dimensional picture (which includes static elements overlaid with dynamic data); this data can be combined with other information received from one or more local stations. Such information can be received, for example, by an SMS message, a telephone call or by GPS.
  • The central station 101 is preferably connected to the moving stations, represented as 102, by a wireless network such as a cellular network or WiFi, or a combination thereof. It should be noted that there is no mesh network; there is only one central antenna at the central station 101, to which all stations are connected.
  • The central station 101 is preferably connected to the fixed stations, represented as 104 and 105, by a fixed network such as the internet, by a wireless network, or a combination thereof.
  • The central station 101 is described in greater detail in FIG. 3.
  • The local station, represented by 102, 104 and 105, is described in greater detail in FIG. 2.
  • FIG. 2 is a schematic exemplary drawing of a local station.
  • Local station 200 is responsible for collecting data from the area it is located in, and transferring the collected data to the central station (not shown) when a suspicious event is detected.
  • Local station 200 can be mobile (a moving station), for example a station that is carried by a person or a station that is located in a vehicle.
  • Local station 200 can also be a fixed station.
  • Local station 200 preferably comprises a local station cabinet 250 comprising a computer 251, a console 252, a keyboard 253, an antenna 255, a cellular modem 256 and a WiFi modem 257.
  • Local station 200 is also connected to one or more cameras, shown for the purpose of illustration as camera 210.
  • Camera 210, or a plurality of cameras, preferably covers an area of up to 360 degrees.
  • Camera 210 collects video data and periodically sends the data to local station cabinet 250.
  • Camera 210 is preferably located up to one thousand meters from the local station cabinet 250 and thus can transfer data at a high-definition bit rate over a wireless system.
  • The camera is optionally a PTZ (pan-tilt-zoom) camera (not shown), which is a standard type of video camera that can zoom in or out and which is connected to a servo motor, or any suitable type of motor and mount, to permit panning and tilting.
  • Another exemplary, non-limiting type of camera is a VMD (video motion detector) camera, although strictly speaking such a camera actually features an analysis system combined with a camera.
  • The analysis system activates collection of video data upon the detection of movement.
  • Video data may be collected through camera 210 regardless of whether movement is sensed.
  • A thermal camera, which detects heat energy as opposed to light, may also optionally be provided, again as a non-limiting illustrative example.
  • One or more other types of cameras may optionally be provided, additionally or alternatively, without any limitation.
  • These different cameras may optionally and preferably be connected to a DVR (digital video recorder) 260 for recording video data.
  • The cameras may be connected through DVR 260 to local station cabinet 250 as shown, or alternatively may optionally be connected directly to local station cabinet 250, which could then optionally and preferably feed video data to DVR 260 and/or to any other type of recording device.
  • The DVR preferably stores the video data.
  • Data is preferably stored as video clips having a configurable duration (for example ten seconds).
  • Local station cabinet 250 is optionally connected to a GPS 220. GPS 220 collects location data and transfers the data to local station cabinet 250. Transferring can be done periodically or upon demand.
  • Local station cabinet 250 is optionally connected to one or more motion sensors, shown for the purpose of illustration as motion sensor 230. Such a sensor can be, for example, a video motion detector with a camera. Motion sensor 230 preferably alerts the local station 200 when movement is detected.
  • Local station cabinet 250 is also optionally connected to one or more noise detection sensors, shown for the purpose of illustration as noise detection sensor 240. Such a sensor can be, for example, an acoustic detector of some type. Noise detection sensor 240 preferably alerts the local station 200 when a noise is detected.
  • Local station cabinet 250 may optionally be connected to other sensors, shown as other sensors 260 for the purpose of illustration only. GPS 220 and the sensors are preferably located within one hundred meters of the local station cabinet 250.
  • Software residing in the local station's computer 251 is responsible for receiving events and data from the plurality of sensors and for managing the received events. Managing the events is preferably done by correlating and filtering.
  • Software in computer 251 is also responsible for alerting the central station (not shown) and transferring data to the central station (not shown) when a suspicious event occurs. If the local station 200 is manned, the operator is able to view the data on the console 252. Data is available when a suspicious event is detected, or upon operator demand. Data is automatically transferred to the central station (not shown) when an alert is detected. Data can also be transferred upon demand from the central station (not shown), which is sent as a message over the network, or upon activation of a command by the operator of the local station 200. The operator can request to view data or to transfer data to the central station (not shown) by, for example, pressing a special button 254, using a menu, or any other user interface option.
  • FIG. 3 is a schematic exemplary drawing of a central station.
  • Central station 300 is composed of a computer 310, one or more display screens, shown as display screens 330 and 350, and an antenna 340.
  • Central station 300 receives alerts from all the local stations regarding suspicious events. The alerts are accompanied by relevant data such as video data, text data and GPS data. The station enables the operator to monitor the whole area that is covered by the local stations and to act upon an emergency situation.
  • Central station 300 features a lower touch screen 350 that pops out from the table (not shown) when an operator touches it and is otherwise flat with the table.
  • The station 300 also features one or more upper display screens 330.
  • The information displayed is identical on all screens.
  • Central station 300 features a map, preferably a three-dimensional map (not shown), which covers the whole area that is controlled by the local stations (not shown).
  • Computer 310 is able to combine the live video and data that arrive from a local station (not shown) upon an emergency with a previously generated 3D map.
  • The display can be tilted and zoomed in/out to enable better viewing of the scene.
  • Upon receiving an alert, the computer causes the map to move to that location and perspective.
  • The live video is adjusted to appear with the correct location, perspective and size parameters, according to the 3D (three-dimensional) modeling.
  • The station 300 can display animated objects that move on the displayed video according to the information received from the GPS of the local station. For example, if a local station is located in a car, it is presented as a moving car on the display.
  • Computer 310 is responsible for receiving events and data from the local stations (not shown) and for performing event management.
  • Event management includes, but is not limited to, activating alarms, saving the event in a history file and the like.
  • Alarms are activated by, for example, displaying the alarms on the screen, activating a siren and the like.
  • Computer 310 is also responsible for managing the map, the video data and all other data that is displayed on the screen.
  • The central station operator can optionally request additional information from one or more local stations. The operator can also request to view live video of a certain area.
  • Communication with the local stations (not shown) is done via a wireless network, through Wi-Fi modem 313 and/or cellular modem 312, and via the internet, through the internet connection 314 on the computer. The wireless connection is made via one central antenna 340.
  • FIG. 4 is a high-level flow diagram of a local station behavior when an event is detected.
  • In stage 1, an event is detected.
  • An event can be detected from a camera, GPS or any other sensor.
  • In stage 2, the event is saved.
  • In stage 3, the event is correlated with previously detected events.
  • The event is preferably correlated with events that have been detected in the recent past. Such correlation is done to avoid false alarms and to detect real alarms. For example, a noise detection correlated with a motion detection can cause an alarm.
  • In stage 4, the related data is retrieved. Such data can be, for example, video data from the time before, during and after the event, the last updated GPS data and the like.
  • In stage 5, the data retrieved in stage 4 is displayed on the screen and, optionally, an alert is performed.
  • An alert can be, for example, a message displayed on a screen and/or a siren.
  • In stage 6, the alert and the related data are sent to the central station.
  • FIG. 5 is a high-level flow diagram of a central station behavior when an event is received.
  • In stage 1, an event is received from one of the local stations.
  • In stage 2, the computer causes the map to move to that location and perspective.
  • The live video is adjusted to appear with the correct location, perspective and size parameters, according to the 3D (three-dimensional) modeling.
  • In stage 3, the computer combines the live video and data that arrive from a local station with the three-dimensional map.
  • Animated objects that move on the displayed video are optionally displayed, according to the information received from the GPS of the local station. For example, if a local station is located in a car, it is presented as a moving car on the display (a minimal GPS-to-map sketch appears at the end of this section).
  • Alarms are activated by, for example, a popup message on the screen, activating a siren and the like. While the invention has been described with respect to a limited number of embodiments, it will be appreciated that many variations, modifications and other applications of the invention may be made.
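The description above refers to warping an inserted live video stream onto the 3D-rendered map to form a composite image, without specifying how the warping is computed. The following is a minimal sketch of one standard approach, mapping the video frame onto a planar quadrilateral of the rendered map with a homography; the function name overlay_live_video, the corner ordering and the use of OpenCV are illustrative assumptions, not part of the disclosed system.

```python
import cv2
import numpy as np

def overlay_live_video(map_image: np.ndarray, frame: np.ndarray,
                       map_quad: list[tuple[float, float]]) -> np.ndarray:
    """Warp a live video frame onto the quadrilateral of the rendered map
    where that camera's footprint appears, forming a composite image.
    map_quad lists four (x, y) map-image corners in the same order as the
    frame corners: top-left, top-right, bottom-right, bottom-left."""
    h, w = frame.shape[:2]
    frame_quad = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    homography = cv2.getPerspectiveTransform(frame_quad, np.float32(map_quad))
    size = (map_image.shape[1], map_image.shape[0])
    warped = cv2.warpPerspective(frame, homography, size)
    # The mask marks where the warped frame has content on the map canvas.
    mask = cv2.warpPerspective(np.full((h, w), 255, dtype=np.uint8), homography, size)
    composite = map_image.copy()
    composite[mask > 0] = warped[mask > 0]
    return composite
```

In a full renderer the destination quadrilateral would be derived from the 3D model's camera footprint, so the video lands with the location, perspective and size dictated by the modeling.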
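The camera output is described above as being recorded continuously into clips of configurable duration (for example ten seconds), retained according to storage capacity and configuration, with the clips from before, during and after an event being sent to the central station. The sketch below shows one way such a rolling clip store could work; ClipBuffer, capacity_clips and the pre/post margins are hypothetical names and values, not taken from the application.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Clip:
    start: float      # clip start time, seconds since epoch
    duration: float   # configurable clip duration, e.g. ten seconds
    path: str         # location of the recorded file

class ClipBuffer:
    """Rolling store of fixed-duration clips; the oldest clips are dropped
    when the configured capacity (a stand-in for storage limits) is reached."""

    def __init__(self, capacity_clips: int = 360):
        self.clips: deque[Clip] = deque(maxlen=capacity_clips)

    def add_clip(self, clip: Clip) -> None:
        self.clips.append(clip)

    def clips_around_event(self, event_time: float,
                           pre_seconds: float = 30.0,
                           post_seconds: float = 30.0) -> list[Clip]:
        # Keep every stored clip that overlaps the window
        # [event_time - pre_seconds, event_time + post_seconds],
        # i.e. footage from before, during and after the event.
        start, end = event_time - pre_seconds, event_time + post_seconds
        return [c for c in self.clips if c.start < end and c.start + c.duration > start]
```

The local station could concatenate the returned clips into a single video clip and transmit it to the central station together with the alert.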
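The correlation and filtering performed at each local station are described above only in general terms: an alert from a single sensor may be filtered as a probable false alarm, while alerts from different sensors close together in time (for example motion plus noise) indicate a real alarm. The Python sketch below illustrates that idea under those assumptions; EventCorrelator and window_seconds are hypothetical names, and the two-sensor-type rule is one possible policy rather than the specific logic of the disclosed system.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class SensorEvent:
    sensor_type: str   # e.g. "motion", "noise", "video"
    timestamp: float   # seconds since epoch

class EventCorrelator:
    """Raise an alarm only when alerts from at least two different sensor
    types arrive within a configurable time window; a single-sensor alert
    on its own is treated as a probable false alarm and filtered out."""

    def __init__(self, window_seconds: float = 10.0):
        self.window_seconds = window_seconds
        self.recent: deque[SensorEvent] = deque()

    def add_event(self, event: SensorEvent) -> bool:
        # Drop stored events that fall outside the correlation window.
        while self.recent and event.timestamp - self.recent[0].timestamp > self.window_seconds:
            self.recent.popleft()
        self.recent.append(event)
        sensor_types = {e.sensor_type for e in self.recent}
        return len(sensor_types) >= 2

# Usage: motion alone is filtered, but motion followed by noise raises an alarm.
correlator = EventCorrelator(window_seconds=10.0)
assert not correlator.add_event(SensorEvent("motion", timestamp=100.0))
assert correlator.add_event(SensorEvent("noise", timestamp=104.0))
```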
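For an unmanned local station, an event can be generated by sending a message over the network, after which the station behaves as if it had detected the event automatically. The application does not define a message format, so the JSON structure and field names below are purely hypothetical and serve only to illustrate the idea.

```python
import json

# Hypothetical remote-trigger message from the central station to an
# unmanned local station; none of these field names are specified by
# the application.
remote_event = {
    "type": "remote_event",
    "station_id": "unmanned-104",
    "reason": "operator_request",
    "timestamp": 1270500000.0,
}

def process_detected_event(station_id: str, timestamp: float) -> None:
    # Stand-in for the normal event path: save the event, retrieve the
    # related clips and GPS data, and send the alert to the central station.
    print(f"event at {station_id} ({timestamp}): alerting central station")

def handle_message(raw: bytes) -> None:
    msg = json.loads(raw)
    if msg.get("type") == "remote_event":
        # Treat the injected event exactly like an automatically detected one.
        process_detected_event(msg["station_id"], msg["timestamp"])

handle_message(json.dumps(remote_event).encode())
```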
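The central station displays a moving local station (for example one installed in a car) as an animated object whose position follows the GPS data the station reports. One simple way to place such a marker is to project latitude and longitude into the local map's planar coordinates; the equirectangular approximation below is an illustrative assumption and not necessarily the projection used with the 3D map.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_to_map_xy(lat_deg: float, lon_deg: float,
                  origin_lat_deg: float, origin_lon_deg: float) -> tuple[float, float]:
    """Project a GPS fix to (x, y) metres relative to the map origin,
    using an equirectangular approximation (adequate over a few kilometres)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(origin_lat_deg), math.radians(origin_lon_deg)
    x = (lon - lon0) * math.cos((lat + lat0) / 2.0) * EARTH_RADIUS_M
    y = (lat - lat0) * EARTH_RADIUS_M
    return x, y

# Usage: each GPS update from a moving station moves its animated marker.
x, y = gps_to_map_xy(32.0860, 34.7820, origin_lat_deg=32.0853, origin_lon_deg=34.7818)
print(f"marker offset from map origin: {x:.1f} m east, {y:.1f} m north")
```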

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Alarm Systems (AREA)

Abstract

The invention relates to security management systems and, in particular, to security management systems which combine video data with other related data.
PCT/IL2010/000276 2009-04-05 2010-04-06 Security management system and method WO2010116364A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL19800209 2009-04-05
IL198002 2009-04-05

Publications (1)

Publication Number Publication Date
WO2010116364A1 (fr) 2010-10-14

Family

ID=42333429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2010/000276 WO2010116364A1 (fr) 2009-04-05 2010-04-06 Security management system and method

Country Status (1)

Country Link
WO (1) WO2010116364A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030085992A1 (en) * 2000-03-07 2003-05-08 Sarnoff Corporation Method and apparatus for providing immersive surveillance
EP1304672A1 (fr) * 2000-06-30 2003-04-23 Japan Network Service Co., Ltd Remote monitoring method and monitor control server
GB2389978A (en) * 2002-06-17 2003-12-24 Raymond Joseph Lambert Event-triggered security monitoring apparatus
EP1480178A2 (fr) * 2003-05-20 2004-11-24 Marconi Intellectual Property (Ringfence) Inc. Security system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2894613A1 (fr) * 2014-01-14 2015-07-15 Samsung Electronics Co., Ltd Security system and method of providing security service using the same

Similar Documents

Publication Publication Date Title
  • KR101321444B1 (ko) CCTV monitoring system
  • AU2009243916B2 (en) A system and method for electronic surveillance
  • KR101671783B1 (ko) Integrated security remote monitoring system and method thereof
  • US7231654B2 (en) Remote monitoring method and monitor control server
  • US20150145991A1 (en) System and method for shared surveillance
  • EP3033742B1 (fr) System and method for audio/video and event dispatch using a positioning system
  • CA2864890C (fr) Systems and methods for providing emergency resources
  • CN109361898B (zh) Abnormal event monitoring method and device
  • US20110109747A1 (en) System and method for annotating video with geospatially referenced data
  • CA2806786C (fr) System and method for on-demand video exchange between site operators and mobile operators
  • CN109274926B (zh) Image processing method, device and system
  • US20130262640A1 (en) Method and Apparatus for Interconnectivity between Legacy Security Systems and Networked Multimedia Security Surveillance System
  • JP2017538978A (ja) Alarm method and apparatus
  • JP2003216229A (ja) Remote control management system
  • TW202009860A (zh) Task processing method, device and system
  • CN101341753A (zh) Method and system for wide-area security monitoring, sensor management and situational awareness
  • KR101297237B1 (ko) Disaster monitoring system and method
  • KR101466004B1 (ko) Intelligent integrated triplex system for crime prevention, disaster prevention and post-processing, and control method thereof
  • WO2018116485A1 (ja) Video collection system, video collection server, video collection method and program
  • KR101005568B1 (ko) Intelligent crime prevention system
  • CN101272483A (zh) System and method for managing a moving surveillance camera
  • JP2016028466A (ja) Playback device for a sensor-based detection system
  • KR101964230B1 (ko) Data processing system
  • KR101250956B1 (ko) Automatic monitoring and control system
  • Keat et al. Smart indoor home surveillance monitoring system using Raspberry Pi

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10727143

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC

122 Ep: pct application non-entry in european phase

Ref document number: 10727143

Country of ref document: EP

Kind code of ref document: A1