WO2018091108A1 - Monitoring device for monitoring a surveillance area and monitoring system equipped with the monitoring device - Google Patents

Monitoring device for monitoring a surveillance area and monitoring system equipped with the monitoring device

Info

Publication number
WO2018091108A1
WO2018091108A1 (PCT/EP2016/078227)
Authority
WO
WIPO (PCT)
Prior art keywords
rule
monitoring
attributes
monitoring device
data
Prior art date
Application number
PCT/EP2016/078227
Other languages
German (de)
English (en)
Inventor
Marcus NADENAU
Tobias STANGL
Original Assignee
Robert Bosch Gmbh
Priority date
Filing date
Publication date
Application filed by Robert Bosch Gmbh filed Critical Robert Bosch Gmbh
Priority to PCT/EP2016/078227 priority Critical patent/WO2018091108A1/fr
Priority to EP16805023.5A priority patent/EP3542351A1/fr
Publication of WO2018091108A1 publication Critical patent/WO2018091108A1/fr

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms

Definitions

  • Monitoring device for monitoring a surveillance area, with at least one input interface for receiving input data and at least one output interface for outputting output data, and with a user interface for selecting and/or providing a rule, the rule comprising rule components.
  • a translation device for translating the rule into an algorithm executable on a computer unit, and an attribution device, wherein the attribution device has objects, attributes, input data and/or output data, and wherein the attribution device is adapted to assign attributes to an object and/or to store attributes for the object.
  • Known methods and / or devices for detecting and analyzing events in a surveillance area are based on the analysis of image and audio files.
  • Document DE 10 2005 006 989 A1, which is probably the closest prior art, discloses a method for monitoring a surveillance area with sensors of a surveillance system, wherein elements of the surveillance area and attributes associated with the elements are extracted from the signals of the sensors. From the extracted attributes, text attributes are additionally derived and assigned to the elements. The elements and their attributes are stored in a memory device.
  • a monitoring device for monitoring a surveillance area with the features of claim 1 is proposed. Furthermore, a monitoring arrangement with such a monitoring device is proposed.
  • a monitoring device for monitoring a surveillance area is proposed.
  • the monitoring device is designed, for example, as a monitoring device for a security control center for monitoring a surveillance area.
  • the surveillance area is preferably a public and / or a non-public area.
  • the surveillance area is an outdoor area and / or an interior area, in particular an interior area of a house, a hall and / or a factory.
  • the surveillance area is an airport, a train station, and / or an area with public and / or passenger traffic.
  • the monitoring device comprises at least one input interface for accepting input data and at least one output interface for outputting output data.
  • the input interface and / or the output interface may be cable interfaces, for example
  • the input interface is a radio interface, such as for the acquisition and output of data by means of a radio link.
  • the input data and the output data are in particular sensor data, image data, video data and / or audio data.
  • the input data and / or the output data may include digital data and / or analog data.
  • the input interface is in particular couplable with input devices, wherein the output interface is preferably coupled to output devices.
  • the input devices and / or the output devices are arranged in the monitoring area.
  • at least one input interface and at least one output interface can form a common interface, wherein, for example, an output device also forms an input device.
  • the input device may be a fire alarm, a burglary alarm, a wiring system, a video camera, an access control system, a personal tracking system, a perimeter surveillance system, a flight information system, a gate management system, a passenger information system or a vehicle locating system.
  • the output device is, for example, a door lock, a door opener, a sprinkler system, a siren, a loudspeaker, a public address system or a lighting unit.
  • the monitoring device includes a user interface for selecting and / or providing a rule.
  • the user interface is implemented in particular in hardware and includes a display, such as a screen and/or a touchscreen, so that a user can interact with the monitoring device by means of the user interface.
  • the user interface comprises a microphone, an optical, a haptic and / or an acoustic input and / or output device, for example a computer keyboard and / or a computer mouse.
  • the rule is duplicable and/or reusable, i.e. applicable to similar and/or identical problems.
  • the rule comprises rule components, the rule in particular having an if-then structure.
  • the rule is constructed such that an if-condition is followed by a then-action.
  • the rule can be optically, linguistically and / or acoustically selected and / or provided.
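The if-then structure described above can be illustrated with a minimal sketch; the `Rule` class and the fire-detector example below are invented for illustration and are not part of the disclosed implementation.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A rule in which an if-condition is followed by a then-action."""
    condition: Callable[[dict], bool]  # checked against current input data
    action: Callable[[dict], None]     # executed when the condition holds

    def evaluate(self, input_data: dict) -> bool:
        # if-then structure: the then-action follows the if-condition
        if self.condition(input_data):
            self.action(input_data)
            return True
        return False

# Invented example: "if a fire detector reports fire, then sound the siren"
triggered = []
rule = Rule(
    condition=lambda data: data.get("fire_detector") == "fire",
    action=lambda data: triggered.append("siren_on"),
)
rule.evaluate({"fire_detector": "fire"})   # condition holds, action runs
rule.evaluate({"fire_detector": "idle"})   # condition fails, nothing happens
```

The same condition/action pairing recurs throughout the description; only the parameterization changes per device.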
  • the monitoring device comprises a translation device for translating the rule into an algorithm executable on a computer unit.
  • the computer unit is a CPU, a processor, a personal computer and / or a microchip.
  • the algorithm, i.e. the translated rule, is preferably program code in a programming language.
  • the translation device can be implemented in software and/or hardware.
  • the monitoring device comprises an attribution device, which can likewise be implemented in hardware and/or software.
  • the attribution device is, for example, a microcontroller, a microprocessor or a computer unit with a storage medium.
  • the attribution device has objects, attributes, input data and output data.
  • the attribution device is designed as a database.
  • the objects can be physical, temporally constant, dynamic and / or generic objects.
  • the attribution device is designed to assign attributes to an object and / or to deposit attributes for the object.
  • objects form a representation of the input devices.
  • objects are, for example, persons, places and/or areas in the surveillance area.
  • Other examples of objects include personalized maps, luggage, cars, smartphones, rooms, and / or areas.
  • objects include configurations of data, and / or dynamically varying objects.
  • Attributes are understood in particular as properties and/or details of objects.
  • the attributes are stored in the monitoring device and / or are dynamically changeable by the input data and / or by the output data.
  • the size, position and velocity form attributes of an object in a surveillance area.
  • attributes may include location and / or location details, such as whether it is a public or secured area in the surveillance area.
  • Other attributes include the person density and/or the number of persons in the surveillance area.
  • attributes can be static; alternatively and/or additionally, attributes are dynamic in time. Examples of dynamic attributes are the density of persons and/or the presence of a security-relevant event. Static attributes are, for example, the position of a permanently installed sensor or whether an area is public or secured.
  • the monitoring device comprises an allocation device, which is implemented in hardware and/or software.
  • the allocation device is in particular designed to allocate required attributes, input data, output data and / or at least one object to the control components.
  • the required attributes, input data, output data and at least one object are the parts of the control components that are required to execute the rule.
  • the allocation device is designed to duplicate the rule, wherein the duplicated rule is applied to all meaningful possible assignments of attributes, input data, output data and/or objects to the rule components.
  • the rule for a particular type of input device is applied by the allocation device, as a duplicated rule, to all input devices of the same type in the surveillance area.
  • the attribution device is designed to provide a plurality of attributes as a context for at least one object, the context comprising further objects and relations to those objects.
  • the allocation device is designed to allocate the required attributes, input data, output data and / or objects on the basis of the context to the control components.
  • the context is information that further characterizes a situation and / or an object.
  • the context is a summary of attributes that further characterize and / or define an object, area, or situation.
  • the context and/or the contexts are based on relations between at least two objects. For example, a relation describes how at least two objects are related to each other physically, spatially and/or temporally.
  • temporally constant relations such as the distance between two sensors in the surveillance area and / or a room.
  • time-variable relations are, for example, that a person carries another object with him, for example a suitcase, which may be set down at a later time.
  • the allocation device is in particular designed such that it uses the context to parameterize the control components with the correct and / or required attributes, input data, output data and / or objects.
  • a rule may be that when a sensor gives a signal, the nearest actuator is driven; the context here is the spatial distance between the sensor and the actuator. The allocation device then duplicates this rule for each sensor and inserts into it the respective sensor and its nearest actuator.
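The duplication behaviour just described, in which one generic "nearest actuator" rule is instantiated once per sensor, can be sketched as follows; the positions and device names are invented, and in the described device the spatial context would come from the model of the surveillance area.

```python
import math

# Invented example layout: sensor and actuator positions in the surveillance area
sensors = {"s1": (0.0, 0.0), "s2": (10.0, 0.0)}
actuators = {"a1": (1.0, 0.0), "a2": (9.0, 0.0)}

def nearest_actuator(sensor_pos):
    """Spatial context: pick the actuator closest to the given sensor."""
    return min(actuators, key=lambda name: math.dist(actuators[name], sensor_pos))

# Duplicate the generic rule: one (sensor -> nearest actuator) pairing per sensor
duplicated = {sensor: nearest_actuator(pos) for sensor, pos in sensors.items()}
# duplicated pairs each sensor with its nearest actuator, e.g. s1 -> a1
```

The user states the rule once; the pairing table is derived automatically for every sensor of the matching type.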
  • the rule is in a natural language and / or a semantics of a natural language.
  • the rule can be provided and / or selected by means of a speech dialogue system.
  • the rule comprises a whole sentence, the sentence comprising, for example, a subject, a predicate and an object.
  • the rule is, for example, in German, English, French, Chinese and / or Spanish.
  • the translation device is adapted to extract from the rule, which is in natural language, a plurality of information, such as subject, predicate, object, temporal and / or spatial relationships.
  • the user interface and/or the translation device is designed as a device with computational linguistics, the computational linguistics comprising the method steps of speech recognition, tokenization, morphological analysis, syntactic analysis, semantic analysis and/or dialogue and discourse analysis.
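As an illustration of the extraction step only, far simpler than the computational-linguistics pipeline listed above, a toy grammar of the form "If <condition> then <action>" could be split like this; the function name and grammar are assumptions, not the disclosed method.

```python
def parse_rule(sentence: str) -> dict:
    """Split a toy rule sentence 'If <condition> then <action>.'
    into its condition and action word lists."""
    words = sentence.lower().rstrip(".").split()
    then_index = words.index("then")
    return {
        "condition": words[1:then_index],   # drop the leading "if"
        "action": words[then_index + 1:],
    }

parsed = parse_rule("If detector reports fire then camera shows area.")
# condition -> subject/predicate/object of the if-clause
# action    -> subject/predicate/object of the then-clause
```

A real implementation would resolve morphology and syntax rather than relying on word order alone.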
  • the rule comprises sentence components, wherein the association device is designed to associate with the sentence components corresponding and / or required attributes, input data, output data and / or the at least one object.
  • the attributes, input data and/or output data are translated by means of the translation device and/or a further translation device, and the translated attributes, input data and/or output data are used and/or assigned as sentence components in the rule.
  • it is possible, for example, for the rule to be translated into the algorithm by means of the translation device together with the sentence components, the sentence components being parameterized correspondingly with the translated attributes, input data and/or output data.
  • the algorithm comprises algorithm components, wherein the association device is designed to assign the algorithm components required attributes, input data, output data and / or the at least one object.
  • the algorithm components correspond to the sentence components translated by the translation device.
  • the attribution device comprises a model, in particular a virtual model.
  • the model is a model of the surveillance area.
  • the model is a three-dimensional or two-dimensional model, the model preferably including distances and geometries of the surveillance area.
  • the attribution device is designed to assign to at least one object in the monitoring area a position, an orientation and/or a time stamp as an attribute.
  • the position and/or the orientation of an object can be represented in at least three spatial coordinates.
  • the position and/or the orientation of the object includes the time at which it was determined.
  • the model also includes hierarchical structures, such as location, building, floor, room, etc.
  • an area can be defined in more detail in the model; for example, within a floor, individual rooms can be selected.
  • the model may include navigation elements and / or movement elements such as elevators, staircases and / or doors.
  • the model includes the location and / or the positions of the input and / or output devices.
  • the model may also classify regions more closely, such as assigning whether an area of a surveillance area is a public and / or a non-public area.
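The hierarchical model sketched above (location, building, floor, room, with areas classified as public or non-public) might be represented as nested dictionaries; all identifiers below are invented for illustration.

```python
# Invented hierarchical model: site -> building -> floor -> room
model = {
    "site": {
        "building_A": {
            "floor_1": {
                "R1": {"public": True,  "devices": ["fire_detector_1", "camera_1"]},
                "R2": {"public": False, "devices": ["fire_detector_2"]},
            }
        }
    }
}

def rooms_matching(model, predicate):
    """Walk the hierarchy and yield room names whose entry satisfies predicate."""
    for building in model["site"].values():
        for floor in building.values():
            for room, info in floor.items():
                if predicate(info):
                    yield room

# Classification query: which rooms are public areas?
public_rooms = list(rooms_matching(model, lambda info: info["public"]))
```

A rule such as "open emergency doors in public areas" would be parameterized from exactly this kind of classification.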
  • the attribution device comprises as at least one attribute a local relation between at least two objects.
  • at least one attribute is a local relation between two input devices and / or between input device and an output device.
  • a local relation is "an object is at a point"; for example, a sensor is mounted in a particular room.
  • another possible local relation is "an object is close to a point", such as the input device and/or sensor closest to a given actuator.
  • the local relation may include the relation "one object looks at another object", such as a camera focusing on a given position, and the local relation may be the relation "one object is connected to another object", for example one area is connected to another area by means of a door.
  • This embodiment is based on the consideration that such a rule is parameterized on the basis of spatial relations.
  • the spatial relations are used as context, for example such that, on detection of a fire by a fire alarm, the nearest camera films the scene and/or the nearest sprinkler of a sprinkler system is triggered.
  • This rule is applicable by the monitoring device to all fire alarms in the monitoring area, whereby the respective nearest actuators (sprinklers) are controlled by the rule.
  • the attribution device comprises as at least one attribute a temporal relation between two objects.
  • the temporal relation comprises relations such as "something happens at the same time", for example two sensors detect coincidentally within a certain time range; another temporal relation is "something happens before or after something".
  • This embodiment is based on the consideration that the attributes and / or the context include temporal information, so that, for example, a rule can be parameterized by the allocation device on the basis of a temporal context.
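The two temporal relations named above could be expressed as simple predicates over time stamps; the 2-second coincidence window is an assumed parameter for illustration, not a value from the text.

```python
# Assumed coincidence window in seconds (illustrative parameter)
COINCIDENCE_WINDOW = 2.0

def coincident(t1: float, t2: float, window: float = COINCIDENCE_WINDOW) -> bool:
    """Temporal relation 'something happens at the same time':
    two events coincide if their time stamps differ by at most `window`."""
    return abs(t1 - t2) <= window

def before(t1: float, t2: float) -> bool:
    """Temporal relation 'something happens before something else'."""
    return t1 < t2
```

A rule conditioned on two sensors firing together would then be parameterized with the time stamps the attribution device attaches to the input data.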
  • the attribution device comprises as at least one attribute a physical relation between two objects.
  • a physical relation is, for example, that a sensor detects and / or reports something. Other possible physical relations are "something is blocked", for example an area is blocked or "a fire detector detects fire".
  • a physical relation can be a failure relation, for example that no current is available in a region and therefore, for example, no optical monitoring can take place.
  • the attribution device comprises as at least one attribute a logical relation between two objects.
  • logical relations are "an object has something", "an object uses something", "an object is related to something", "an object does something in a particular place" and/or "an object is involved in something".
  • An example of "an object has something” is that a person owns a suitcase, a gun, or a car.
  • An example of the relation "an object uses something" is that a first object and/or a first person transports a second object, such as a suitcase.
  • a particularly important example of such a relation is "A person has an access card.”
  • Belonging together is a logical relation; for example, a group of persons belongs together as a family.
  • the monitoring device is designed such that the allocation device applies a rule to identical and/or similar objects.
  • the two objects are the same and / or similar type of sensors and / or input devices, such as two fire detectors.
  • the two objects are characterized by a similar number of attributes.
  • the allocation device is designed, for example, to apply a rule correspondingly to all fire detectors, so that a rule only has to be set up once and is then applied automatically on the basis of context and attributes.
  • a rule is, for example, "if object 1 supplies action 1 and/or signal 1, then drive the nearest object 2 with parameters 2".
  • the allocation device takes object 1, signal 1 and action 1 as given and independently selects the nearest objects 2 and inserts them into the rule.
  • the user interface comprises a selection unit for selecting stored rules and/or rule components.
  • the user interface comprises a selection menu in which the rules and / or control components can be selected.
  • the rules and/or the rule components are available, for example, in textual form as sentence components, in optical form as symbols, or in acoustic form.
  • the user interface can be designed such that, after a rule component has been selected, only logically consistent further rule components can be selected, so that rules can be configured without errors.
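The error-free configuration just described, offering only logically valid follow-up components after each selection, can be sketched with a lookup table; the component vocabulary below is invented.

```python
# Invented component vocabulary: which rule components may follow which
NEXT_COMPONENTS = {
    "if": ["fire_detector", "door", "camera"],
    "fire_detector": ["reports_fire"],
    "reports_fire": ["then"],
    "then": ["show_video", "open_doors"],
}

def allowed_next(selected: str) -> list:
    """Return the rule components the menu may offer after `selected`.
    An unknown component offers nothing, so no invalid rule can be built."""
    return NEXT_COMPONENTS.get(selected, [])
```

The selection menu would repopulate from this table after every choice, which is what makes an ill-formed rule unselectable in the first place.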
  • a monitoring arrangement comprising a monitoring device according to one of the preceding claims and / or as described above.
  • the monitoring arrangement comprises at least one input device and at least an output device.
  • the input device is preferably a sensor and the output device is preferably an actuator, the input device being connected to the input interface for data transfer. Further, the input device provides the monitoring device with at least a portion of the input data.
  • the output device is connected to the output interface, whereby the output device takes over the output data via the output interface.
  • Figure 1 is a block diagram of a monitoring device
  • Figure 2 is a schematic view of a monitoring device and a model of the surveillance area
  • Figure 3 shows schematically the structure of a rule.
  • the surveillance area 2 is in particular a building, for example an airport, in which persons 3 move and interact with each other and / or with objects.
  • input devices 4 are arranged, wherein the input devices 4 are formed for example as a video camera, as a flight information system, as a card reading and / or identification device.
  • the input devices 4 are connected by data technology to an input interface 5 of the monitoring device 1.
  • the data connection can be wired and / or wireless, for example via a radio link.
  • the input interface 5 is designed, for example, as a cable plug connection and / or as a radio interface.
  • the data provided by the input devices 4 to the input interface 5 form the input data, wherein the input data may be digital and / or analog data, such as sensor signals and / or digitized image information.
  • the data and / or information acquired by the input devices 4 are routed via the input interface 5 to an attribution device 6, the attribution device 6 having and / or storing the input data, objects, attributes and / or output data.
  • the attribution device 6 is designed to attribute the input data and/or the output data.
  • the attribution device 6 assigns attributes to objects, attributes to input data, attributes to output data, maps attributes to objects and/or combines several attributes as a context.
  • the attribution device 6 has a model 7, wherein the model 7 is stored, for example, as a model of the monitoring region 2 in the attribution device 6.
  • the attribution device 6 comprises a time measuring unit which assigns attributes, objects, input data and / or output data a time stamp, for example the time of day and the date.
  • the attribution device 6 is also designed to assign to an object, input data, input devices, output data and/or attributes a position and/or an orientation within the monitoring area 2 and/or within the model 7, on the basis of the model 7.
  • a central aspect of the model 7 is that it allows spatial relations to be established, in particular between input devices 4, input data and objects.
  • objects, input devices, input data and/or attributes can be assigned spatial features, such as that the devices are arranged in a specific room and/or area, that areas are connected by means of doors and/or elevators, and/or which object is closest to which object. Further, by means of the model 7, an area and/or an object may carry information such as whether it is a public or a secured area.
  • the monitoring device comprises a time module 8, which assigns a time stamp to a recording and/or to input data.
  • the monitoring device 1 comprises a user interface 9.
  • the user interface 9 is arranged, for example, in the monitoring center, for example also command center and / or security center.
  • the user interface 9 includes, for example, a computer, a display unit such as a screen, an input unit such as a keyboard, and/or a touchscreen unit.
  • the user interface comprises a configuration interface and an operation interface.
  • the configuration interface is designed to display value ranges and data, wherein the monitoring device can be set by means of the configuration interface and the rule can be selected and / or configured.
  • the operation interface is designed, for example, to display and / or reproduce the input data, the sensor data, the video data and / or audio data of the monitoring area 2.
  • A user can control, operate and/or interact with the monitoring device 1 by means of the user interface 9.
  • the user interface 9 comprises a rule 10, the rule 10 comprising control components 10a, 10b, 10c, 10d and 10e.
  • the rule 10 is designed in particular in a natural language and is represented in this language by the user interface 9.
  • the rule in particular in the natural language, can be selected, entered and / or provided by a user via the user interface 9.
  • the user interface 9 can include a selection menu in which the user can select the rule components 10a-10e, wherein, for example, after selecting a rule component 10a, 10b, 10c, 10d or 10e, only a limited, in particular meaningful, set of follow-up rule components 10a-10e can be selected.
  • the rule 10 is formed as a rule 10 with a condition and a consequence, in particular as an if-then rule.
  • rule 10 is constructed as "If object 1 performs action/condition 1, then object 2 performs action 2".
  • rule 10 is designed such that rule 10 is analogously applicable to all objects with corresponding conditions and/or attributes in monitoring area 2.
  • Particularly preferred for a rule in natural language is that the sentence of the condition has a subject-predicate-object structure and the sentence of the action also has a subject-predicate-object structure.
  • the monitoring device 1 has a translation device 11, wherein the translation device 11 is preferably implemented as software executable on a computing unit.
  • the translation device 11 is designed to convert the rule 10 into an algorithm, in particular in a program code, which can be executed on a computer unit and / or on the monitoring device 1.
  • the translation device 11 translates, for example, the rule components 10a, 10b, 10c, 10d and 10e into algorithm components, wherein the algorithm is designed to execute the condition and the action of the rule 10 correspondingly when run on a computer unit.
  • the translation device 11 provides the algorithm and/or the algorithm components to an allocation device 12. Furthermore, the attribution device 6 makes the attributes, the objects, the input data, the model 7 and/or the time stamps available to the allocation device 12.
  • the allocation device 12 is designed to allocate the correspondingly required data to the algorithm components and / or the control components 10a-10e on the basis of the attributes, the objects, the input data, the model 7, the time stamp and / or in particular the context.
  • the allocation device 12 is adapted, for a rule 10 applicable to a plurality of objects, input devices, input data and/or attributes in the surveillance area, to duplicate the rule 10 for all respective input devices, input data, attributes and/or objects and to fill and/or assign the rule for all of them.
  • the allocation device 12 is designed to assign output data to the control components and / or to the algorithm components, the output data in particular corresponding to the action following the condition.
  • the output data is provided by data technology to an output interface 13, wherein the output interface 13 may be a hardware interface and / or a radio interface.
  • the output interface 13 is in particular coupled and / or can be coupled to output devices 14, wherein the output devices 14 may be actuators, sensors and / or in particular also the input devices 4.
  • the output devices 14 may be arranged, for example, in the user interface 9, such as display units of video data, alternatively and / or additionally, the output devices 14 are arranged in the monitoring area 2, such as surveillance camera, sprinkler systems and / or door openers.
  • the monitoring device 1 is in particular designed such that the rule 10 provided via the user interface 9 uses the context and / or attributes for the assignment of input data, objects and attributes.
  • rule 10 is constructed so that rule 10 corresponds to a natural language and can be expressed in a natural language.
  • rule 10 and / or the construction of rules 10 is based on the syntax and / or grammar of a natural language.
  • An example of a rule 10 that can be executed on the monitoring device 1 is the rule 10 "If a fire detector outputs the signal 'fire detected', then show on the user interface 9, in particular on the operation interface, the video images of the nearest camera of the area".
  • the allocation device 12 is designed to apply this rule to all fire detectors and video cameras, using the model 7 and/or the attributes, for example the knowledge of which camera is closest to which fire detector. A further possible rule of this kind is "If a fire detector in a public area detects fire, then open all emergency exit doors in this area".
  • the allocation device 12 here assigns the emergency exit doors of the respective area on the basis of the knowledge of the model 7, which is stored as attributes in the attribution device. This rule and the one above can thus be applied to all fire detectors in an area without a user having to parameterize and attribute them specifically for each fire detector; in particular, the user does not need to know which camera and/or which output device 14 is closest to which input device, for example which fire detector.
  • The configuration effort for sensors and actuators can thus be dramatically reduced in a monitoring device 1, since the monitoring device 1, by means of the allocation device 12, parameterizes the rule and/or its corresponding components itself, this assignment being based on context and/or attributes that the user does not have to enter and/or know himself.
  • FIG. 2 shows a schematic view of a monitoring device 1 and a model 7 of the surveillance area 2.
  • the model of the surveillance area 2 is depicted here as a floor plan drawing of the surveillance area 2.
  • the monitoring area 2 here comprises six spaces R1, R2, R3, R4, R5 and R6.
  • in each of the rooms, an input device 4 and an output device 14 are arranged.
  • the input devices 4 are each designed as fire detectors and the output devices 14 are designed as video cameras.
  • the rooms R1-R6 are connected via doors 15, the doors 15 having card readers which unlock a door 15 and/or identify the person passing through.
  • the input devices 4 and the output devices 14 have a radio interface, wherein the input devices 4 are connected to the input interfaces 5 of the monitoring device 1 in terms of radio technology and data technology.
  • the output devices 14 also have a radio interface, wherein the output devices 14 are connected by data technology via the radio link with the output interfaces of the monitoring device 1.
  • the input devices 4 provide the input data of the monitoring device 1 via the radio link, the monitoring device 1 providing the output data to the output devices 14 via the radio link.
  • the monitoring device 1 has been provided with rules 10 which are continuously evaluated by the monitoring device 1 in a time loop.
  • One rule is that if an input device 4, fire detector, detects fire, then display on the user interface 9 the video images of the nearest camera in this room.
  • the allocation device 12 of the monitoring device 1 is designed to apply this rule to each fire detector and/or to each input device 4 in monitoring area 2, using the attributes comprising the model 7 and/or the context to assign the correct output device 14.
  • Thus the fire detector in room R1 is assigned the video camera in room R1, the fire detector in room R2 the video camera in room R2, and correspondingly for the remaining rooms of R1-R6.
  • a rule 10 is deposited by means of the user interface 9 of the monitoring device 1: when a fire detector detects a fire in an area, then open all doors 15 which lead from this room to the emergency exit 16.
  • the allocation device 12 is designed to independently select, on the basis of the context, the attributes and / or the model 7, the doors 15 to be used as a route to the emergency exit 16 from the respective room.
  • the allocation device 12 is designed to make this selection independently for each room and/or fire detector.
  • Another possible rule 10 is that if a person requests access to a security area at a door 15, which is detected via the identification device on the door 15, then the image of the camera covering this area is shown on the user interface 9.
  • the allocation device 12 is designed to independently select, for each door 15 and/or for each door opener and identification device, the appropriate camera in the rule 10, this selection taking place on the basis of the context, the attributes and/or the input data.
  • a rule would be "if a door 15 is violently opened, then show the image of the camera covering the corresponding area and close all further doors in that area". This is an example of a rule in which a condition is followed by two actions, namely the action 18 of showing the video image as the first action and the closing of the doors as the second action.
  • a rule comprises a plurality of conditions 17 and / or comprises a plurality of actions. For example, an action is only executed here if one, two and / or several conditions 17 are fulfilled.
  • FIG. 3 schematically shows the structure of a rule 10.
  • the rule 10 comprises conditions 17 and actions 18 and furthermore control components 10a, 10b, 10c, 10d.
  • the conditions 17 include, for example, the rule components 10a "check the relation between two objects and/or between two input devices 4", 10b "check whether an object and/or an input device 4 has a particular attribute" or 10c "use logical operators to test relations between objects and/or input signals". If these conditions are met and/or not met, an action 18 is executed, wherein the action 18 comprises, as rule components 10e, 10f, 10k and 10l, for example, the driving of an actuator and/or an output device 14, the checking of further attributes and/or input data, or the logical combination of attributes.
  • for the application of the rule 10, parameters 19 are required, the parameters 19 including in particular the input data, the attributes and/or the model 7.
  • the rule 10, in particular the action 18 which follows from the rule 10, can be regarded as an output parameter 20, the output parameters 20 preferably corresponding to the output data.
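The interplay of control components 10a-10c, parameters 19 and output parameters 20 can be illustrated as follows. This is a rough Python sketch under assumed names (the dictionaries, the `covered_by` relation and the output format are invented for illustration):

```python
def has_attribute(obj: dict, name: str, value) -> bool:
    # Control component 10b: does the object carry a particular attribute?
    return obj.get(name) == value

def are_related(a: dict, b: dict, relation: str) -> bool:
    # Control component 10a: is object b stored under a's relation attribute?
    return b["id"] in a.get(relation, [])

# The parameters 19: attributes of the objects involved.
detector = {"id": "fd-1", "state": "fire", "covered_by": ["cam-2"]}
camera = {"id": "cam-2", "type": "camera"}

# Control component 10c: a logical operator combining the two checks.
condition_met = (has_attribute(detector, "state", "fire")
                 and are_related(detector, camera, "covered_by"))

# The action 18 that follows corresponds to the output parameters 20.
output = {"action": "show_live_video", "camera": camera["id"]} if condition_met else None
```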
  • the rule 10 "When a fire detector reports 'fire', then show live video from a camera monitoring that area" is considered in the following.
  • the monitoring device 1 is for monitoring a surveillance area 2.
  • the surveillance area 2 is, for example, an airport.
  • the monitoring device 1 is, for example, the security control center of the airport with a computer workstation. Fire detectors, which form the input devices 4, are mounted in the monitoring area 2.
  • the monitoring device 1 has at least one input interface 5 for receiving input data and at least one output interface 13 for outputting output data.
  • the input data includes whether a fire detector signals "fire.”
  • the output data includes signals and / or control data for the video camera, in particular the
  • the monitoring device 1 comprises a user interface 9 for selecting and/or providing a rule 10, for example "When a fire detector reports 'fire', then show live video from a camera monitoring that area".
  • the rule 10 includes control components, in this example "fire detector", "fire", "reports", "show live video" and "a camera that monitors this area".
  • the monitoring device 1 comprises a translation device 11 for translating the rule 10 into an algorithm executable on a computer unit, for example into a computer program that can be executed on the monitoring device.
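As a rough illustration of the translation device 11, a textual rule can be compiled into a callable that a computer unit executes. The tiny "if ... reports ... then ..." template syntax below is an assumption for illustration, not the patent's actual rule format:

```python
import re

def translate(rule_text: str):
    """Compile 'if <device> reports <state> then <action>' into a function."""
    m = re.fullmatch(r"if (\S+) reports (\S+) then (.+)", rule_text)
    if m is None:
        raise ValueError(f"unsupported rule: {rule_text!r}")
    device, state, action = m.groups()

    def algorithm(event: dict):
        # Executable form of the rule: return the action when the
        # named device reports the named state, otherwise nothing.
        if event.get("device") == device and event.get("state") == state:
            return action
        return None

    return algorithm

show_video = translate(
    "if fire_detector_1.2.3.4 reports fire then show live video of camera_2.5.3"
)
```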
  • An attribution device 6 of the monitoring device 1 comprises objects, attributes, input data and/or output data.
  • the attribution device 6 stores as attributes in which sections of the monitoring area 2 the fire detectors and video cameras are located.
  • the attribution device 6 holds the spatial relationships between input devices 4 and/or output devices 14 as attributes.
  • the attribution device 6 assigns to the input data and/or output data a time stamp and a source and/or target position in the surveillance area 2.
  • the attribution device 6 is designed to assign attributes to an object and/or to store attributes for the object, for example a detection state for an input device 4; in this embodiment the fire detector reports "fire" or "no fire".
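A minimal sketch of such an attribute store, with all class, method and attribute names (`AttributionDevice`, `assign`, `section`, `state`) assumed for illustration:

```python
from collections import defaultdict

class AttributionDevice:
    """Illustrative attribute store: assigns attributes to objects
    such as input devices 4 and output devices 14."""

    def __init__(self):
        # One attribute dictionary per object identifier.
        self._attributes = defaultdict(dict)

    def assign(self, obj_id: str, **attrs):
        # Assign or update attributes for the object.
        self._attributes[obj_id].update(attrs)

    def get(self, obj_id: str, name: str, default=None):
        return self._attributes[obj_id].get(name, default)

attribution = AttributionDevice()
# Sections of the monitoring area and detection states as attributes.
attribution.assign("fire_detector_1.2.3.4", section="hall 2", state="no fire")
attribution.assign("camera_2.5.3", section="hall 2")
attribution.assign("fire_detector_1.2.3.4", state="fire")  # detection state update
```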
  • the monitoring device 1 comprises an allocation device 12, the allocation device 12 being designed to assign the required attributes, input data, output data and/or at least one object to the control components 10a-10l.
  • the allocation device 12 is designed to assign the correct parameters to the control components for the application of the rule 10.
  • the allocation device 12 typically associates with each fire detector the video camera that covers the area in which the fire detector is located.
  • the allocation device 12 applies the rule 10 to all fire detectors in the surveillance area 2.
  • the rule 10, linked with attributes, input data and/or output data by means of the allocation device 12, is then applied, for example, to a specific fire detector: "If fire detector 1.2.3.4 reports fire, then show live video of camera 2.5.3."
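The allocation step can be sketched as follows: for every fire detector, the camera whose section attribute matches is selected, and the generic rule is instantiated once per detector. The identifiers and the section mapping below are invented for illustration:

```python
# Hypothetical attribute data: detector/camera id -> section it belongs to.
detectors = {"fd-1.2.3.4": "hall 2", "fd-7.7.0.1": "gate A"}
cameras = {"cam-2.5.3": "hall 2", "cam-9.0.1": "gate A"}

def allocate(detectors: dict, cameras: dict) -> dict:
    """Return, per detector, the camera covering the same section."""
    by_section = {section: cam for cam, section in cameras.items()}
    return {det: by_section[section] for det, section in detectors.items()}

def instantiate_rules(allocation: dict) -> list:
    # Apply the generic rule to every detector in the surveillance area.
    return [f"if {det} reports fire then show live video of {cam}"
            for det, cam in allocation.items()]

allocation = allocate(detectors, cameras)
rules = instantiate_rules(allocation)
```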

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Alarm Systems (AREA)

Abstract

Buildings and open spaces are often monitored by sensors. A monitoring device 1 for monitoring a monitoring area 2 is proposed, comprising at least one input interface 5 for receiving input data and at least one output interface 13 for outputting output data, a user interface 9 for selecting and/or providing a rule 10, the rule 10 comprising rule components 10a-10, a translation device 11 for translating the rule 10 into an algorithm executable on a computer unit, and an attribution means 6, the attribution means 6 comprising objects, attributes, input data and/or output data, the attribution means 6 being designed to assign attributes to an object and/or to store attributes for the object.
PCT/EP2016/078227 2016-11-21 2016-11-21 Monitoring device for monitoring a monitoring area, and monitoring system with the monitoring device WO2018091108A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/EP2016/078227 WO2018091108A1 (fr) 2016-11-21 2016-11-21 Monitoring device for monitoring a monitoring area, and monitoring system with the monitoring device
EP16805023.5A EP3542351A1 (fr) 2016-11-21 2016-11-21 Monitoring device for monitoring a monitoring area, and monitoring system with the monitoring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/078227 WO2018091108A1 (fr) 2016-11-21 2016-11-21 Monitoring device for monitoring a monitoring area, and monitoring system with the monitoring device

Publications (1)

Publication Number Publication Date
WO2018091108A1 true WO2018091108A1 (fr) 2018-05-24

Family

ID=57460472

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/078227 WO2018091108A1 (fr) 2016-11-21 2016-11-21 Monitoring device for monitoring a monitoring area, and monitoring system with the monitoring device

Country Status (2)

Country Link
EP (1) EP3542351A1 (fr)
WO (1) WO2018091108A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3748597A1 * 2019-06-04 2020-12-09 Honeywell International Inc. Fire system rule generation
US10928509B2 (en) * 2007-07-27 2021-02-23 Lucomm Technologies, Inc. Systems and methods for semantic sensing

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050258943A1 * 2004-05-21 2005-11-24 Mian Zahid F System and method for monitoring an area
DE102005006989A1 2005-02-15 2006-08-24 Robert Bosch Gmbh Method for monitoring a monitoring area
US20080201277A1 * 2007-02-16 2008-08-21 Matsushita Electric Industrial Co., Ltd. System architecture and process for automating intelligent surveillance center operation
EP2046040A2 * 2007-10-04 2009-04-08 KD Secure, LLC Alert system and method for safety, security and business productivity
US20130038737A1 * 2011-08-10 2013-02-14 Raanan Yonatan Yehezkel System and method for semantic video content analysis
US20160165187A1 * 2014-12-05 2016-06-09 Avigilon Fortress Corporation Systems and methods for automated visual surveillance

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10928509B2 (en) * 2007-07-27 2021-02-23 Lucomm Technologies, Inc. Systems and methods for semantic sensing
EP3748597A1 * 2019-06-04 2020-12-09 Honeywell International Inc. Fire system rule generation
US11338161B2 (en) 2019-06-04 2022-05-24 Honeywell International Inc. Fire system rule generation

Also Published As

Publication number Publication date
EP3542351A1 (fr) 2019-09-25

Similar Documents

Publication Publication Date Title
EP3279700B1 Centralized security inspection management system
DE102018118423A1 Systems and methods for tracking moving objects in video surveillance
DE102017212533A1 Device and method for providing status information of an automatic valet parking system
DE102021211867A1 Systems and methods for detecting disease symptoms of users
DE102015007493A1 Method for training a decision algorithm used in a motor vehicle, and motor vehicle
EP3254172B1 Determination of a position of a vehicle-external object in a vehicle
EP3977430A1 Method and device for detecting smoke
WO2018091108A1 Monitoring device for monitoring a monitoring area, and monitoring system with the monitoring device
DE102021114612A1 Contamination detection and notification systems
EP2691330B1 Access monitoring device with at least one video unit
DE102017222898A1 Automated detection of dangerous situations
EP3485621B1 Method for providing a means of access to a personal data source
DE102012211298A1 Display device for a video surveillance system, and video surveillance system with the display device
DE102022201786A1 System and method for multimodal neurosymbolic scene understanding
CN108629274A System and method for creating a storyboard using forensic video analysis of a video repository
EP3710870B1 Detection system
DE102016222134A1 Video analysis device for a monitoring device and method for generating a single image
DE19600958A1 Interactive surveillance system
DE102009027253A1 Arrangement and method for operating a media device
WO2018091110A1 Display device for a monitoring system of a monitoring area, monitoring system with the display device, method for monitoring a monitoring area with a monitoring system, and computer program for carrying out the method
DE102018121110A1 Method and device for documenting a status of an autonomous robot
DE102012200504A1 Analysis device for evaluating a surveillance scene, method for analyzing surveillance scenes, and computer program
DE102021201978A1 Collecting sensor data from vehicles
DE19641000A1 Method and arrangement for automatically detecting the number of persons in a security gate
DE102019204849A1 Detection of a potential danger posed by persons

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16805023

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016805023

Country of ref document: EP