WO2018091110A1 - Display device for a monitoring system of a surveillance area, monitoring system having the display device, method for monitoring a surveillance area with a monitoring system, and computer program for carrying out the method


Info

Publication number
WO2018091110A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
data
entities
sensor data
monitoring
Prior art date
Application number
PCT/EP2016/078231
Other languages
German (de)
English (en)
Inventor
Marcus NADENAU
Andre KAMP
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Priority to PCT/EP2016/078231 priority Critical patent/WO2018091110A1/fr
Priority to DE112016007457.3T priority patent/DE112016007457A5/de
Publication of WO2018091110A1 publication Critical patent/WO2018091110A1/fr

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00 - Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/14 - Central alarm receiver or annunciator arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • A display device for a monitoring system of a surveillance area, wherein the monitoring system has a plurality of sensors, wherein the sensors are designed to monitor at least a portion of the surveillance area as a scene with objects and to provide the scene as sensor data; with at least one input interface for accepting the sensor data; with an entity device, wherein the entity device is configured to assign entities to the objects of the scene based on the sensor data; with an attribution device, wherein the attribution device is provided with the entities and the sensor data and is designed to assign attributes to at least one of the entities and/or to store attributes for the entity; with a filter device, wherein the filter device is configured to filter the sensor data into output data based on at least one of the entities and/or the assigned attributes; and with a display unit for displaying the output data.
  • Surveillance areas are often monitored by sensors and/or by video cameras located in the surveillance area.
  • The monitoring material, in particular the sensor data and the image material, is then, for example, viewed and/or evaluated by security staff in a control room.
  • A display device for a monitoring system of a surveillance area with the features of claim 1 is proposed. Furthermore, a monitoring system with the display device with the features of claim 13, a method for monitoring a monitoring area with a monitoring system with the features of claim 14, and a computer program for carrying out the method with the features of claim 15 are proposed. Preferred or advantageous embodiments of the invention will become apparent from the dependent claims, the following description and the accompanying drawings.
  • a display device for a monitoring system of a surveillance area is proposed.
  • the display device is, for example, a computer workstation.
  • the display device is designed for a security guard and / or for security personnel, in particular, the display device is designed to display data for the security guard and security personnel and / or for interaction with the security guard and / or security personnel.
  • the display device is arranged centrally and / or stationary, such as in a central security monitoring center and / or a control room.
  • the display device is mobile, and designed, for example, as a tablet, as a phablet and / or as a smartphone.
  • the display device is designed so that a security guard and / or the security personnel can monitor the surveillance area by means of the display device and / or control actuators in the surveillance area by means of the display device.
  • the monitoring system has a plurality of sensors, wherein the sensors are arranged in the monitoring area.
  • the surveillance area is in particular a public and / or a non-public sector. Further, the surveillance area is an open space and / or a closed area such as a building. Preferably, the surveillance area is an airport facility, a train station, a prison, a public, such as an authority, a school and / or a university and / or a border station.
  • the sensors in the surveillance area are preferably video cameras, microphones, fire detectors and / or motion detectors.
  • the video cameras are in particular digital video cameras, for example CMOS or CCD cameras.
  • the video cameras are in particular fixed cameras, for example with a pivotable receiving area, for example a PTZ camera. Alternatively and / or additionally, the video cameras are mobile video cameras that are movable in the surveillance area.
  • Further examples of monitoring system sensors include intrusion sensors, access control sensors, personal tracing devices, flight information systems, gate
  • the monitoring system comprises more than 100 sensors, in particular more than 500 sensors and in particular more than 1000 sensors.
  • the sensors are designed to monitor at least a portion of the surveillance area as a scene with objects and to provide them as scene data.
  • the sections which are monitored by means of the sensors are overlapping, so that in particular complete monitoring of the monitored area is possible.
  • at least one section of a monitoring area with a plurality of sensors, in particular sensors of different types, is monitored by sensors.
  • For example, a section is monitored by video, acoustically and with fire detectors.
  • the scene comprises objects, the objects being, for example, persons in the surveillance area, objects, doors and / or events in the surveillance area.
  • the scene data includes information about the objects, such as the object type, the object position in the surveillance area, and / or the time of recording the scene data.
  • the display device comprises an input interface for accepting the sensor data.
  • For example, the input interface is designed as a radio interface, wherein the input interface is provided with the sensor data by means of a radio link.
  • Alternatively, the input interface is a cable interface, wherein the sensor data is provided via a cable connection.
  • the display device comprises a memory unit for storing the sensor data.
  • the display device includes an entity device, wherein the entity device is configured to associate entities with the scene based on the sensor data.
  • An entity is preferably understood as a representation of an object in the surveillance area and/or of a sensor in the surveillance area itself, the representation being, for example, a data-technical representation within the display device.
  • Entities are objects, persons and/or items in the surveillance area. Further, entities may be locations and/or coordinates in the surveillance area as well as routes and/or paths within the surveillance area. Examples of entities are:
  • entities are static objects, such as fixed property configurations and / or master data.
  • Entities may be dynamic objects, which can change their type; for example, a camera detects a moving object, additional sensors capture further information such as size and speed, and based on this the entity is set as human and/or animal.
  • The attribution device assigns attributes, details and/or additional information to one of the entities.
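The interplay of entity device and attribution device described above can be sketched in code. This is an illustrative model only; all class names, fields and the size threshold are assumptions, not part of the patent:

```python
from dataclasses import dataclass, field

# Hypothetical data model: an entity is a data-technical representation
# of a real object (person, door, sensor) in the surveillance area.
@dataclass
class Entity:
    entity_id: str
    entity_type: str               # e.g. "person", "door", "camera"
    attributes: dict = field(default_factory=dict)

class EntityDevice:
    """Assigns an entity to each object detected in the sensor data."""
    def __init__(self):
        self.entities = {}

    def assign(self, object_id, object_type):
        entity = Entity(entity_id=object_id, entity_type=object_type)
        self.entities[object_id] = entity
        return entity

class AttributionDevice:
    """Stores attributes (additional information) for entities."""
    def attribute(self, entity, key, value):
        entity.attributes[key] = value

# Example: a camera detects a moving object; additional sensors supply
# size and speed, and based on this the dynamic entity changes its type.
device = EntityDevice()
attribution = AttributionDevice()
obj = device.assign("obj-1", "moving_object")
attribution.attribute(obj, "size_m", 1.8)
attribution.attribute(obj, "speed_mps", 1.4)
if 1.0 < obj.attributes["size_m"] < 2.2:   # illustrative threshold
    obj.entity_type = "human"
```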
  • the display device comprises a filter device, wherein the filter device is designed to filter the sensor data based on at least one of the entities and / or the associated attributes of the at least one entity into output data.
  • The filter device is designed to filter the sensor data into output data in such a way that the output data include information, in particular only that information, which, based on the entities and the attributes, comprises and/or shows security events in the surveillance area.
  • The filter device is designed to provide a comprehensive and/or complete picture of the sensor data.
  • For example, an entity is an event in the surveillance area.
  • The filter device is in particular designed to filter the sensor data into output data in such a way that the output data comprise all and/or only data relating to the event in the monitoring area.
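A minimal sketch of this filtering step, assuming sensor records are simple dictionaries with an `entity` field; the record layout and function name are hypothetical:

```python
# Illustrative filter device: keep only those sensor records that
# relate to the entities of interest (record layout is an assumption).
def filter_sensor_data(sensor_data, entity_ids):
    """Filter raw sensor records into output data: only records whose
    'entity' field matches one of the given entities survive."""
    return [rec for rec in sensor_data if rec.get("entity") in entity_ids]

sensor_data = [
    {"sensor": "cam-1", "entity": "fire-7", "value": "smoke"},
    {"sensor": "door-3", "entity": "door-3", "value": "open"},
    {"sensor": "smoke-2", "entity": "fire-7", "value": "alarm"},
]

# Only data about the fire event reaches the display unit.
output_data = filter_sensor_data(sensor_data, {"fire-7"})
```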
  • The display device comprises a display unit for displaying the output data.
  • The display unit is designed to display the output data graphically in the form of images, videos, graphics and/or alphanumerically.
  • the display device comprises an output device for outputting the acoustic component, that is to say that acoustic signals and / or sensor data can also be reproduced by means of the display device.
  • The display unit is in particular a screen, a monitor and/or a touchscreen.
  • The display device is designed for interaction and/or for inputting data.
  • the monitoring personnel and / or a guard control the monitoring system by means of the display device.
  • the display device comprises, for example, input means, such as a keyboard, a computer mouse or a touchscreen unit.
  • the attribution device comprises at least one relationship between two and / or more entities.
  • The filter device is in particular designed to filter the sensor data and/or the input data into the output data based on at least one relationship.
  • Relations between two and / or more entities serve, in particular, for characterizing the entities and obtaining information and / or interpreting the scene and / or the scene data.
  • a relationship between two and / or more entities describes how two and / or more entities are related to each other and / or how they are linked.
  • Relationships between two and/or more entities may be permanent and/or static, such as two entities being fixedly located in a particular space.
  • relationships between two and / or more entities are dynamic and / or variable, such as a person owning and / or handing over something.
  • The filtering of the sensor data into the output data based on the relationships represents, in particular, a filtering based on context, relationships and/or interpretation.
  • the output data form a subset of the sensor data.
  • By context is understood, in particular, the filtered sensor data and/or the output data. It is a consideration of the invention to provide a display device that uses context and/or additional information between two entities to represent only the subset of the sensor data that characterizes the current scene and/or represents safety-relevant data. This makes it easier for the monitoring personnel to view and/or understand a scene in the surveillance area, as only the relevant data are displayed.
  • this filtering corresponds to an interpretation of the sensor data that is close to the human interpretation of sensor data and / or operations in the surveillance area.
  • the relationships between two entities and / or between multiple entities are spatial relationships.
  • Examples of spatial relationships are "something is in one place", such as an entity being in a particular location; this information can be used, for example, to find a location where another particular sensor is located. Another example is the relationship "something is close to something else", such as an entity being close to another entity; this information can be used, for example, to check the environment of the two entities.
  • Other examples of spatial and/or local relationships are "something looks at something", such as a camera focusing on a particular point, or "something is connected to something", such as two places connected by a lift and/or a door. It is a consideration of the invention to use spatial relationships between entities to filter the sensor data into output data, with the focus limited to spatial ranges.
  • the relationships between two and / or more entities include temporal relationships and / or logical relationships between two and / or more entities.
  • The temporal and/or logical relationships are used in particular for the characterization of entities and/or their relationships to each other.
  • the temporal, logical and / or local relationships can be combined, such as in temporally local, temporally logical and / or temporally locally logical relationships.
  • Temporal relationships include, for example, "something happens at the same time", "something happens in a common time window", "something happens before something else" and/or "something happens after something else".
  • Spatio-temporal and/or local-temporal relationships are "something can be reached within a time" and/or "something is reached at the time of something else".
  • Logical relationships and/or relationships of persons are, for example, "something is somewhere", "something has a property", "something owns something else", "something uses an object", "something is involved in a process". Relationships and/or properties of sensors may be, in particular, "a sensor is not detectable", "a sensor detects and/or reports something" and/or "a sensor detects that something is being blocked by something".
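The spatial, temporal and logical relationships listed above can be represented, for example, as labelled triples. The relation names mirror the wording of the examples; the triple representation itself is an illustrative assumption:

```python
# Relationships between entities as labelled triples
# (subject, relation, object).
relations = [
    ("camera-1", "looks_at", "entrance"),        # spatial
    ("room-A", "connected_to", "room-B"),        # spatial: door/lift
    ("person-X", "owns", "card-1234"),           # logical
    ("event-1", "before", "event-2"),            # temporal
]

def related(subject, relation):
    """Return all objects linked to `subject` by `relation`."""
    return [o for s, r, o in relations if s == subject and r == relation]
```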
  • the display device comprises an actuator control device.
  • the monitoring system comprises a plurality of actuators, wherein the actuators are arranged in the monitoring area.
  • the actuators are, for example, door openers, sprinkler systems and / or announcement loudspeakers.
  • the actuators are controllable by means of the display device.
  • the actuator control device is designed to provide control options for at least one of the actuators to a user of the display device based on at least one of the entities and / or the associated attributes of the entities.
  • the actuator control device is designed to provide certain actuators to a user of the display device only for certain events in the monitoring area, which are determined in particular by entities and their attributes.
  • For example, the actuator control device is configured to offer the sprinkler system for activation to a user of the display device in the case of a detected fire in the surveillance area and/or the scene.
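The event-dependent provision of control options might be sketched as a simple mapping from event type to the actuators offered; the event types and actuator names here are illustrative assumptions:

```python
# Event-dependent control options: only actuators relevant to the
# detected event are offered to the user (mapping is an assumption).
EVENT_ACTUATORS = {
    "fire": ["sprinkler", "announcement_speaker", "door_opener"],
    "burglary": ["door_opener", "announcement_speaker"],
}

def control_options(event_type):
    """Return the actuator control options offered for an event type."""
    return EVENT_ACTUATORS.get(event_type, [])
```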
  • the display device comprises an interpretation module.
  • the interpretation module is configured to interpret the scene on the basis of the sensor data, the attributes and / or the relationships and to provide interpretation data.
  • the interpretation module is designed to evaluate and / or interpret the sensor data, wherein the interpretation data can be displayed, for example, by the display device.
  • the interpretation can be close to the sensor data, which in particular represents a low-level interpretation and / or the interpretation may be close to human interpretation, which corresponds to a high-level interpretation.
  • a low-level interpretation is the simple representation and / or detection of a sensor signal of one of the sensors in the surveillance area.
  • For example, the detection of a MAC address by a WiFi sensor in the surveillance area is an interpretation, in particular a low-level interpretation.
  • A slightly higher low-level interpretation, for example, is already the assignment that the detected MAC address belongs to a smartphone.
  • a high-level interpretation uses the context, the attributes and / or additional information about entities and / or the correlation between two entities. For example, it is possible for the interpretation module to map two sensor signals received from different sensors to a common event and interpret that to be the same event.
  • the filter device is designed to filter the sensor data on the basis of the interpretation data.
  • It is possible, for example, for the interpretation unit to detect and/or interpret an event in the monitoring area based on the sensor data, the entities and/or the attributes, wherein the filter device is designed to filter the sensor data such that the output data only and/or largely include information about the event in the surveillance area.
  • the interpretation unit is designed to detect a fire in a contiguous area of the monitoring area, wherein the filter device is designed to filter the sensor data in such a way that the output data include all important information about a fire in this monitoring area and / or section.
  • The degree of interpretation by the interpretation module is adjustable and/or settable by a user.
  • the level of interpretation is adjustable from a low level interpretation to a high level interpretation and / or vice versa.
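The adjustable degree of interpretation can be illustrated with the MAC-address example from above; the level names, the owner register and the wording of the results are assumptions for illustration:

```python
# Degrees of interpretation for one and the same observation: the
# detected MAC address is rendered differently depending on the level
# selected by the user (owner register and wording are assumptions).
def interpret(mac, level):
    if level == "low":                 # raw sensor signal only
        return f"WiFi sensor detected MAC {mac}"
    if level == "mid":                 # simple assignment
        return f"MAC {mac} belongs to a smartphone"
    # High level: uses context and attributes, e.g. a register of
    # known device owners (hypothetical additional information).
    owners = {"aa:bb:cc:dd:ee:ff": "person-X"}
    return f"{owners.get(mac, 'an unknown person')} is present"
```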
  • The display device comprises a safety device, wherein the safety device stores and/or backs up the input data and/or the sensor data.
  • the security device is designed to secure the interpretation data and / or the output data.
  • The security device is designed as an internal and/or external storage unit, such as a hard disk. This embodiment is based on the consideration that the interpretation of the sensor data by the interpretation unit may be wrong, so that by securing the sensor data, the interpretation data and/or the other data, an evaluation in retrospect is possible and the input data, sensor data, output data and/or interpretation data are not lost.
  • the display device comprises a fallback device.
  • The fallback device is designed to display an unfiltered view of the sensor data.
  • Alternatively, interpretation data may be displayed with a lower and/or higher level of interpretation. For example, the interpretation "person A is in room X" may be wrong if person B used the admission card; by increasing the degree of interpretation, for example by identifying the face in the video data of the room, the interpretation becomes "person B is in room X". This embodiment is based on the consideration that a possible misinterpretation by the display device and/or a possible false filtering based on the entities can be excluded and/or reduced.
  • the display device comprises an event attachment module.
  • the event attachment module is configured to evaluate the scene data, sensor data and / or input data for the presence of an event in the scene.
  • The event in the surveillance area and/or scene is a security-related event, such as a burglary, a detected fire and/or a robbery.
  • The event attachment module is further configured to provide event data in the case of an event in the scene.
  • the event data includes the type of event, the location of the event, and / or the time of the event.
  • the filter device is designed to filter the sensor data based on the event data in output data.
  • the output data includes, in particular, the sensor data that includes information about the event and / or was recorded at the time and / or location of the event.
  • the display device comprises an aggregation module.
  • the aggregation module is preferably designed as a backend system.
  • the aggregation module is designed to check two and / or several events for a connection and / or to combine two and / or several events with a connection to a combined event.
  • the summarized event comprises summarized event data that characterize and / or define the event more precisely.
  • the filter device is preferably designed to filter the sensor data into output data on the basis of the summarized event data.
  • the aggregation module is designed to combine spatial and / or temporally related events, which were detected by different sensors, into a common event.
  • Such a summarized event is, for example, the detection of fire and/or smoke in the surveillance area, where a plurality of sensors in different portions of the surveillance area detect fire and/or smoke; the aggregation module is designed to combine the sections and/or areas of the surveillance area with detected fire or smoke into one summarized event.
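A sketch of how the aggregation module might combine spatially and temporally related detections into one summarized event; the adjacency map, time window and event layout are illustrative assumptions:

```python
# Sections considered adjacent in the (hypothetical) spatial model.
ADJACENT = {("floor1-roomA", "floor1-roomB"),
            ("floor1-roomB", "floor1-roomA")}

def aggregate(events, max_dt=60):
    """Merge fire/smoke events from adjacent sections that occur
    within max_dt seconds of each other into summarized events."""
    merged = []
    for ev in sorted(events, key=lambda e: e["t"]):
        for group in merged:
            last = group[-1]
            if (ev["t"] - last["t"] <= max_dt and
                    (last["section"], ev["section"]) in ADJACENT):
                group.append(ev)
                break
        else:
            merged.append([ev])     # start a new summarized event
    return merged

events = [
    {"type": "fire", "section": "floor1-roomA", "t": 0},
    {"type": "smoke", "section": "floor1-roomB", "t": 30},
]
summarized = aggregate(events)
```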
  • the monitoring system in particular comprises a plurality of sensors, wherein the sensors are arranged in the monitoring area. It is further possible that the monitoring system includes, for example, a plurality of actuators, such as door openers and / or speakers.
  • A further subject of the invention is a method for monitoring a surveillance area with a surveillance system.
  • the method provides for generating and / or providing sensor data by means of a plurality of sensors, which are arranged in the monitoring area.
  • The sensors in the monitoring area, which monitor a section of the surveillance area as a scene with objects, provide the sensor data.
  • the method provides that entities are assigned to the objects of the scene based on the sensor data.
  • the method provides that attributes are assigned to at least one of the entities and / or attributes for the entity are deposited.
  • the sensor data is then filtered into output data based on one of the entities and / or the associated attributes, with the output data displayed on a display.
  • This embodiment of the invention is based on the consideration of providing a method which only displays relevant information and / or sensor data on a display to the monitoring personnel and / or the user of a monitoring system.
  • a further subject of the invention is a computer program with program code means for carrying out all the steps of the method when the program is executed on a computer and / or the display device.
  • FIG. 1 is a schematic representation of a monitoring system
  • Figure 2 is a schematic representation of the degree of interpretation
  • Figure 3 is an example of an interpretation of an event
  • FIG. 1 shows a schematic representation of a monitoring system 1 with a display device 2.
  • The monitoring system 1 is designed to monitor a monitoring area 3 by sensor technology, video technology and/or security technology.
  • In the monitoring area 3, a plurality of sensors 4 are arranged.
  • The sensors 4 are in particular video cameras, fire detectors, card readers and/or controls for lighting, ventilation and lifts and/or disarm switches of burglar alarms.
  • the monitoring area 3 further comprises a plurality of sections 5, wherein the sections 5 are, for example, different rooms and / or areas in the monitoring area 3.
  • In each section 5, a plurality of sensors 4 may be arranged; for example, a room may include both fire detectors and video cameras.
  • In the monitoring area 3, actuators 6 are furthermore arranged; the actuators 6 are, for example, sprinkler systems, door openers or locking systems.
  • the actuators 6 are preferably controllable and / or operable by means of the display device 2.
  • The sensors 4 are designed to monitor the monitoring area 3 and/or the sections 5 by means of sensors and thereby to provide sensor data 7.
  • The display device 2 comprises an input interface 8, wherein the input interface 8 is designed as a wireless interface, in particular a radio interface. At least one mating interface 9 is arranged in the monitoring area 3, wherein the mating interface 9 can be data-coupled with the input interface 8, so that the sensor data 7 can be transmitted from the sensors 4 in the monitoring area 3 to the input interface 8 by means of the mating interface 9.
  • the sensor data 7 are thus provided to the display device 2.
  • the sensor data 7 are provided to an entity device 10.
  • the entity device 10 is in particular part of the display device 2.
  • the entity device 10 is a backend system.
  • the entity device 10 is designed to assign entities to a scene and / or the objects in the surveillance area 3 based on the sensor data 7.
  • The entities are in particular a representation of real objects in the surveillance area within the display device 2. Entities are, for example, persons in the surveillance area 3, objects in the surveillance area 3 as well as the sensors 4 and/or actuators 6 in the surveillance area 3. In particular, entities are to be understood as a data-technical representation of real objects of the surveillance area 3 in the display device 2.
  • the display device 2 comprises an attribution device 11.
  • The attribution device 11 is designed to assign information and/or additional information to the entities based on the sensor data.
  • The additional information associated with the entities constitutes attributes.
  • the attribution device 11 is designed to store attributes for one and / or multiple entities.
  • the attribution device 11 is in particular designed to associate the entities with relationships between two and / or more entities.
  • the relationships between two entities are in particular logical relationships.
  • the attribution device 11 is designed to assign temporal relationships 12 and / or spatial relationships 13 to the entities, the attribution device 11 for this purpose being provided with a time unit and / or provided with a spatial scale.
  • the display device 2 comprises a filter device 14, the filter device 14 being provided with the sensor data 7, the entities, objects and attributes.
  • the filter device 14 is designed to filter the sensor data into output data 15 based on the entities, the sensor data and the attributes.
  • the filter device 14 is designed to filter the sensor data in such a way that the output data 15 comprises all important information about an event and / or an entity in the monitoring area.
  • the filter device 14 is designed to filter the sensor data 7 in such a way that the output data 15 includes all information about a burglary, a fire or another security-relevant event in the surveillance area.
  • the display device 2 comprises a display unit 16.
  • the display unit 16 is designed, for example, as a monitor, a smartphone and / or as a tablet.
  • The display unit 16 is configured to display the output data 15. Further, it is possible that the display unit 16 is designed to allow interaction of a user of the display device 2 with the display device, so that, for example, the user can control the actuators in the surveillance area by entering data and/or interacting on the display unit 16.
  • FIG. 2 shows a schematic representation of the degrees of interpretation of the sensor data 7 and / or the interpretation data.
  • the interpretation is based on temporal relationships between entities.
  • The basis here is a time unit which assigns each entity and/or event a timestamp.
  • Further, the interpretation is based on a spatial model 17 of the monitoring area 3.
  • the spatial model 17 includes hierarchical structures of the surveillance area 3, in particular places, coordinates, room numbers, room areas, etc.
  • The spatial model 17 comprises structural navigation elements, such as doors, elevators, escalators, staircases, etc. This makes it possible, for example, to track a person and/or an object moving through the monitoring area 3, and, if necessary, to summarize events in adjacent areas and/or sections of the monitoring area 3.
  • A further basis for interpretation are classifications of areas and/or sections of the surveillance area 3, such as high-security areas, public areas and/or restricted areas. Further, additional data may be used as a basis for interpretation, such as information on persons, security guards, cardholders and/or other properties of objects and/or persons in the surveillance area 3.
  • a largely uninterpreted view of the sensor data 7 is the simple recording of sensor data in a low-level interpretation 19.
  • The low-level interpretation 19 captures only the sensor signals, the video images and/or audio data. These are preferably stored and/or backed up by a safety device. With increasing interpretation and/or increasing degree of interpretation, one reaches the detection interpretation 20 and the identification interpretation 21.
  • The detection interpretation uses the sensor data 7 to detect objects in the monitoring area 3, for example to detect a person in the surveillance area 3 based on the video data and/or to detect that a door has been opened.
  • In the identification interpretation 21, the interpretation relies more strongly on the additional information and/or attributes, so that, based on the detection and the attributes, a detected person, for example, can be assigned the name of the person. Furthermore, it is possible, for example when a door is opened by means of an access card, to identify which person has opened the door.
  • In a high-level interpretation 22 of the sensor data, for example, all characterizing information and/or sensor data are used to interpret a situation in the monitoring area. For example, individual events in the surveillance area are aggregated, such as when several fire detectors on the same floor of the surveillance area detect a fire.
  • the display device 2 is designed to combine the individual detected fires in a high-level interpretation to a major fire.
  • The display device 2 is further configured to display to a user of the display device 2 all the information necessary for the assessment and/or evaluation of the fire.
  • the display device 2 is designed to offer the user options to control actuators 6 in the monitoring area 3, which may need to be controlled for the event of the fire. Information which is not necessary for this event, for example the fire, is filtered from the output data 15.
  • The user can be shown possible escape routes, with the user being able to control and/or open actuators 6, for example door openers along the route.
  • Events which are based on the respective main event are preferably suppressed if they are not of superordinate relevance. For example, for a detected fire, events such as broken windows and/or temperature increases are hidden.
  • the user can also contradict the interpretation of the sensor data 7, so that the event is optionally resolved into individual events and / or the raw sensor data are displayed.
  • Another example of a high-level interpretation 22 is that if a card is used on a door opener and access is denied, this information does not get into the output data 15. However, if the card is used unsuccessfully on different door openers several times within a time window, this event is considered an attempted break-in.
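The break-in heuristic just described can be sketched as follows; the time window and the threshold for distinct door openers are illustrative assumptions:

```python
# Break-in heuristic: one denied card use is filtered out of the
# output data, but several failures at different door openers within
# a time window are escalated (window and threshold are assumptions).
def classify(attempts, window=300, min_doors=3):
    """attempts: list of (timestamp_seconds, door_id) of denied uses."""
    attempts = sorted(attempts)
    for i, (t0, _) in enumerate(attempts):
        doors = {d for t, d in attempts[i:] if t - t0 <= window}
        if len(doors) >= min_doors:
            return "attempted break-in"    # included in output data
    return "ignore"                        # single denial: filtered out
```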
  • FIG. 3 shows schematically an example of the interpretation of an event that Ms. Smith entered room 4711.
  • For example, a card reader as the sensor 4 and/or as the first entity detects that an admission card with the card ID 1234 has been used on the card reader XYZ. Based on the attribute that the card reader XYZ is assigned the position information floor, house and room, this event is assigned to the entity of room 4711.
  • the display device 2 is adapted to provide, as interpretation and/or interpretation data, that Ms. Smith used her card on the card reader of the door to room 4711. Further, based on the information that the door to room 4711 was open for more than 3 seconds after Ms. Smith used her card, the display device 2 is adapted to interpret that Ms. Smith has entered room 4711.
  • the interpretation data in this high-level interpretation 22 include the information that Ms. Smith entered room 4711.
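The two-step interpretation above (card use, then door open for more than 3 seconds) can be sketched as follows. The card ID, reader ID, room number and the 3-second threshold follow the example in the text; the data structures and function name are assumptions.

```python
# Entity attributes of card readers (assumed structure): reader -> position info.
READERS = {"XYZ": {"room": "4711", "floor": 1, "house": "A"}}
# Registered card holders (assumed structure): card ID -> holder name.
CARDS = {"1234": "Ms. Smith"}

def interpret_entry(card_id, reader_id, door_open_seconds):
    """Combine a card-use event and the subsequent door state into a
    high-level interpretation string, or None if attributes are missing."""
    holder = CARDS.get(card_id)
    room = READERS.get(reader_id, {}).get("room")
    if holder is None or room is None:
        return None
    if door_open_seconds <= 3:
        # card use alone: holder used the card at the door of the room
        return f"{holder} used her card at the door to room {room}"
    # door open for more than 3 seconds after card use: interpreted as entry
    return f"{holder} entered room {room}"
```

The returned string corresponds to the interpretation data of the high-level interpretation 22.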
  • the suspect is assigned context, attributes and/or additional information; for example, an attribute that the suspect used a firearm is based on the fact that a sensor 4 has detected a shot with a firearm in a corresponding area.
  • the information that the shot occurred in a corresponding area is based on the spatial model 17.
  • the video camera is designed to capture the suspicious person on video.
  • the video data, in particular the images, are thereby interpreted and evaluated, so that the suspicious person can be assigned the attribute that it is a person 23, and possibly even its identity 24 and further details of the person are assignable.
  • the video camera has detected a car 26 in the corresponding area, wherein the video camera has also detected the license plate of the car 26.
  • a WiFi sensor is arranged in the monitoring area as the sensor 4.
  • a mobile telephone 27 used in the area of the location of the suspicious person can be located by means of the WiFi sensor, this mobile telephone 27 being assignable, for example, to personal data such as a telephone number.
  • if access cards 28 were used in the area of the suspicious person, the access card 28 used can be assigned, for example, as context and/or as an attribute.
  • the actual owner 29 can also be assigned to the admission card 28 used, so that a possible connection between the suspect and the actual cardholder can optionally be established. Further, the spatial model 17 can be used to send the nearest watchman 30 and/or police officer to the location of the suspect.
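The accumulation of attributes and context on the suspect described in the bullets above can be sketched as a simple entity record that several sensor sources write into. All class, field and example values below are assumptions for illustration; the patent itself does not prescribe any data structure.

```python
class Entity:
    """Minimal sketch of an entity carrying attributes and linked context."""

    def __init__(self, entity_id):
        self.id = entity_id
        self.attributes = {}   # e.g. armed, identity (attribute assignments)
        self.context = []      # linked observations: car, phone, access card

    def assign(self, key, value):
        """Assign an attribute to the entity, e.g. from a sensor event."""
        self.attributes[key] = value

    def link(self, kind, detail):
        """Attach context information from another sensor source."""
        self.context.append({"kind": kind, "detail": detail})

suspect = Entity("suspect-1")
suspect.assign("armed", True)          # gunshot detected in the same area
suspect.assign("is_person", True)      # derived from video analysis
suspect.link("car", {"license_plate": "unknown"})            # plate from video
suspect.link("phone", {"located_by": "wifi_sensor"})         # WiFi localization
suspect.link("access_card", {"card_id": "1234",
                             "owner": "Ms. Smith"})          # card plus owner
```

The linked card owner makes the possible connection between the suspect and the actual cardholder explicit, and the entity's last known position could then be matched against the spatial model 17 to dispatch the nearest watchman.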

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Alarm Systems (AREA)

Abstract

Security areas in buildings and/or open spaces are frequently monitored by sensors. The monitoring material is often evaluated in control centers. The invention relates to a display device (2) for a monitoring system (1) of a monitoring area (3), the monitoring system (1) comprising a plurality of sensors (4), the sensors (4) being designed to monitor at least part of the monitoring area (3) as a scene comprising objects and to represent it as scene data, comprising at least one input interface (8) for receiving the sensor data (7); an entity means (10), the entity means (10) being designed to assign entities to the objects of the scene on the basis of the sensor data (7); an assignment means (11), the assignment means (11) comprising the entities and the sensor data (7) as well as attributes, the assignment means (11) being designed to assign attributes to at least one of the entities and/or to store attributes for the entity; a filter means (14), the filter means (14) being designed to filter the sensor data (7), on the basis of at least one of the entities and the assigned attributes, into output data (15); and a display means (2) for displaying the output data. The invention further relates to actuator control by means of the display device.
PCT/EP2016/078231 2016-11-21 2016-11-21 Dispositif d'affichage pour système de surveillance d'une zone de surveillance, système de surveillance équipé du dispositif d'affichage, procédé de surveillance d'une zone de surveillance avec un système de surveillance et programme informatique pour mettre en œuvre le procédé WO2018091110A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/EP2016/078231 WO2018091110A1 (fr) 2016-11-21 2016-11-21 Dispositif d'affichage pour système de surveillance d'une zone de surveillance, système de surveillance équipé du dispositif d'affichage, procédé de surveillance d'une zone de surveillance avec un système de surveillance et programme informatique pour mettre en œuvre le procédé
DE112016007457.3T DE112016007457A5 (de) 2016-11-21 2016-11-21 Anzeigevorrichtung für eine überwachungsanlage eines überwachungsbereiches, überwachungsanlage mit der anzeigevorrichtung, verfahren zur überwachung eines überwachungsbereiches mit einer überwachungsanlage und computerprogramm zur durchführung des verfahrens

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/078231 WO2018091110A1 (fr) 2016-11-21 2016-11-21 Dispositif d'affichage pour système de surveillance d'une zone de surveillance, système de surveillance équipé du dispositif d'affichage, procédé de surveillance d'une zone de surveillance avec un système de surveillance et programme informatique pour mettre en œuvre le procédé

Publications (1)

Publication Number Publication Date
WO2018091110A1 true WO2018091110A1 (fr) 2018-05-24

Family

ID=57471809

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/078231 WO2018091110A1 (fr) 2016-11-21 2016-11-21 Dispositif d'affichage pour système de surveillance d'une zone de surveillance, système de surveillance équipé du dispositif d'affichage, procédé de surveillance d'une zone de surveillance avec un système de surveillance et programme informatique pour mettre en œuvre le procédé

Country Status (2)

Country Link
DE (1) DE112016007457A5 (fr)
WO (1) WO2018091110A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210327232A1 (en) * 2018-09-05 2021-10-21 Nec Corporation Apparatus and a method for adaptively managing event-related data in a control room
DE102022200832A1 (de) 2022-01-26 2023-07-27 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Überwachung mit mindestens einer Überwachungseinheit, Computerprogramm und Überwachungseinheit

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060227997A1 (en) * 2005-03-31 2006-10-12 Honeywell International Inc. Methods for defining, detecting, analyzing, indexing and retrieving events using video image processing
US20070132767A1 (en) * 2005-11-30 2007-06-14 William Wright System and method for generating stories in time and space and for analysis of story patterns in an integrated visual representation on a user interface
DE102007058959A1 (de) * 2007-12-07 2009-06-10 Robert Bosch Gmbh Konfigurationsmodul für ein Überwachungssystem, Überwachungssystem, Verfahren zur Konfiguration des Überwachungssystems sowie Computerprogramm
US20160259975A1 (en) * 2008-03-03 2016-09-08 Avigilon Patent Holding 2 Corporation Method of searching data to identify images of an object captured by a camera system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060227997A1 (en) * 2005-03-31 2006-10-12 Honeywell International Inc. Methods for defining, detecting, analyzing, indexing and retrieving events using video image processing
US20070132767A1 (en) * 2005-11-30 2007-06-14 William Wright System and method for generating stories in time and space and for analysis of story patterns in an integrated visual representation on a user interface
DE102007058959A1 (de) * 2007-12-07 2009-06-10 Robert Bosch Gmbh Konfigurationsmodul für ein Überwachungssystem, Überwachungssystem, Verfahren zur Konfiguration des Überwachungssystems sowie Computerprogramm
US20160259975A1 (en) * 2008-03-03 2016-09-08 Avigilon Patent Holding 2 Corporation Method of searching data to identify images of an object captured by a camera system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210327232A1 (en) * 2018-09-05 2021-10-21 Nec Corporation Apparatus and a method for adaptively managing event-related data in a control room
DE102022200832A1 (de) 2022-01-26 2023-07-27 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Überwachung mit mindestens einer Überwachungseinheit, Computerprogramm und Überwachungseinheit

Also Published As

Publication number Publication date
DE112016007457A5 (de) 2019-08-14

Similar Documents

Publication Publication Date Title
Norris From personal to digital: CCTV, the panopticon, and the technological mediation of suspicion and social control
US7595815B2 (en) Apparatus, methods, and systems for intelligent security and safety
AT513101B1 (de) Überwachungssystem, Freiflächenüberwachung sowie Verfahren zur Überwachung eines Überwachungsbereichs
CN106937086B (zh) 具有可选择的操作场景的视频监控系统和对于改进的势态感知的系统培训
EP2235948B1 (fr) Module de surveillance pour un système de vidéosurveillance, procédé de surveillance d'état d'une zone de surveillance ainsi que programme informatique
JP6837650B1 (ja) 危険度判別プログラム及びシステム
EP2290628A1 (fr) Procédé de vidéosurveillance de pièces
WO2018091110A1 (fr) Dispositif d'affichage pour système de surveillance d'une zone de surveillance, système de surveillance équipé du dispositif d'affichage, procédé de surveillance d'une zone de surveillance avec un système de surveillance et programme informatique pour mettre en œuvre le procédé
CN111083449A (zh) 一种校园安防系统
EP1680769A1 (fr) Procede et dispositif de controle de passage et / ou de separation de personnes
EP1376502A1 (fr) Système de surveillance
DE102016222134A1 (de) Videoanalyseeinrichtung für eine Überwachungsvorrichtung und Verfahren zur Erzeugung eines Einzelbildes
US20160378268A1 (en) System and method of smart incident analysis in control system using floor maps
JP6739119B1 (ja) 危険度判別プログラム及びシステム
US9990821B2 (en) Method of restoring camera position for playing video scenario
EP3542351A1 (fr) Dispositif de surveillance pour surveiller une zone de surveillance et système de surveillance équipé du dispositif de surveillance
WO2018091111A1 (fr) Dispositif d'affichage pour une installation de surveillance d'une zone de surveillance et installation de surveillance équipée du dispositif d'affichage
EP3542528B1 (fr) Dispositif d'affichage pour une installation de surveillance d'une zone de surveillance
Okorodudu Moving towards motion activated security camera system with live feed and call routing
DEPARTMENT OF HOMELAND SECURITY WASHINGTON DC Crowd Count and Analysis (CCA)
Bostrom Artificial Intelligence, AI Camera: Market & About
DE102019214707A1 (de) Verfahren, Vorrichtung und Schienenfahrzeug
Slotnick Integrated Physical Security Systems
WO2024002637A1 (fr) Système de contrôle d'accès avec accès sans barrière
CN113345192A (zh) 一种校园安防视频的分布式处理方法及系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16805330

Country of ref document: EP

Kind code of ref document: A1

REG Reference to national code

Ref country code: DE

Ref legal event code: R225

Ref document number: 112016007457

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16805330

Country of ref document: EP

Kind code of ref document: A1