EP3329432A1 - System for observing and influencing objects of interest and processes executed by them, and corresponding method - Google Patents
- Publication number
- EP3329432A1 (application EP15766739.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- influencing
- information
- objects
- interest
- obtaining information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B9/00—Safety arrangements
- G05B9/02—Safety arrangements electric
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/26—Government or public services
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/18—Status alarms
Definitions
- the present invention relates to a system and method for monitoring and influencing objects of interest and processes executed by them, and more particularly to a system and method for planning, deploying and/or supporting security forces.
- DE 100 29 784 A1 discloses a device for monitoring and influencing objects, areas, processes, etc., which consists of several modules and, by means of a telecommunications network, distributes spatially and temporally dispersed human work to a large number of employees, thereby improving the quality of the services, reducing costs and making the services attractive to a larger group of people.
- a wide variety of information is collected and processed. Task forces are selected and guided by means of the information obtained and processed.
- the object of the present invention is to further develop a device known from DE 100 29 784 A1 and to provide a corresponding method.
- a system for monitoring and influencing at least one object of interest and/or processes executed by it, comprising: at least one object for obtaining information, which is designed to obtain information about the object of interest,
- at least one influencing object designed to exert influence on the object of interest, and
- a central control unit which is coupled to the object for obtaining information and to the influencing object, controls an exchange of information with or between the object for obtaining information and the influencing object, requests information from the object for obtaining information upon an alarm being triggered and, taking into account the information from the object for obtaining information, causes the influencing object to take measures to influence the object of interest and/or processes carried out by it.
- an object for obtaining information, for example a video camera, is directed at an object of interest, for example a road intersection, in order to capture a video stream of it. The video camera is connected via a communication network, such as a landline network, to a central control unit which is capable, via an image analysis device, of recognizing structures in the video stream acquired by the video camera that require external intervention.
- if the central control unit determines, on the basis of the image analysis of the video stream from the video camera, that an exceptional event, such as an accident involving personal injury, is taking place or has just taken place at the intersection and that external intervention is required, an alarm is triggered and an emergency vehicle is directed as quickly as possible to the intersection, i.e. to the action location.
- forces which are brought by the emergency vehicle to the action location can take the necessary measures there, such as securing the action location and caring for injured persons. This effectively initiates a response to an emergency situation.
- objects of interest here are i) real estate, in particular buildings, roads, intersections, squares or areas, ii) movable objects, in particular vehicles, aircraft, ships or portable devices, or iii) living beings, in particular humans, animals or plants, or parts thereof.
- the object for information acquisition is preferably a sensor, a camera or a microphone, which is positioned in a stationary or mobile manner.
- a plurality of pieces of information or data of different kinds about the object of interest can thus be obtained, so that processes at the object of interest can be better analyzed.
- the influencing object is preferably an emergency vehicle, a deployment device or a robot if active intervention is required at the action location; the deployment device or the robot can also be brought to the action location by the emergency vehicle. Further such influencing objects are a monitor or a loudspeaker.
- the interaction of the object for obtaining information and / or the influencing object proves to be effective if the position of the respective objects is known.
- georeferencing of the respective objects takes place by assigning coordinates, preferably by means of a list, by GPS tracking, WLAN- or WiFi-based positioning or by image analysis.
- operation of the system is initiated by an alarm being triggered, preferably by an external message, or by signals from the object for obtaining information or their evaluation.
- for example, the alarm is triggered by a passer-by who recognizes a specific hazard at the action location and then activates an alarm button, whereupon, as described above, the central control unit analyzes the incident (the term "incident" is hereinafter sometimes also referred to as "operation") and determines a response to it.
- the communication between the object for obtaining information and an influencing object takes place via a communication network, which is preferably formed as a mobile network, landline, Internet, local area network (LAN), wide area network (WAN) or as a group of at least two such networks. This takes into account the stationary or mobile arrangement of the respective objects as well as their distance to a distribution or node point (hotspot).
- the object for obtaining information and/or the influencing object each preferably has a local control system which controls the associated object and/or its communication; the central control unit is relieved by the local control system, thereby increasing the performance of the system.
- the central control unit is preferably designed to determine, in communication with the object for obtaining information and/or the influencing object, the cause of the alarm triggering, the object of interest affected and its position, and to determine a reaction which is to take place in response to the alarm triggering. This is achieved by storing a plurality of processes that occurred in the same or a similar form in the past, together with reactions successfully applied to them. The reactions are assigned to the incidents so that, upon analysis or recognition of an incident, the corresponding reaction is selected and thus defined.
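The stored-precedent lookup described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; all incident labels and reactions are invented for the example.

```python
# Illustrative sketch: stored past incidents are mapped to reactions that
# were successfully applied to them. All names and data are hypothetical.

STORED_REACTIONS = {
    "traffic accident with injuries": ["dispatch ambulance", "secure intersection"],
    "assault": ["dispatch police unit", "record witness statements"],
    "fire": ["dispatch fire brigade", "evacuate area"],
}

def select_reaction(recognized_incident: str) -> list[str]:
    """Return the reaction stored for incidents of the recognized kind."""
    try:
        return STORED_REACTIONS[recognized_incident]
    except KeyError:
        # No stored precedent: escalate to a human operator.
        return ["forward to operations-center staff for manual assessment"]

print(select_reaction("assault"))
```

A real system would match the recognized incident against stored incidents by similarity rather than by exact label, but the assignment principle is the same.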
- a plurality of objects for obtaining information and/or a plurality of influencing objects are present, whereby it is possible to obtain from the action location as much information as is required for a meaningful analysis of an incident, and to direct several influencing objects, or the most suitable of several influencing objects, to the action location.
- the central control unit is designed to select, depending on the determined reaction, the availability of influencing objects, their qualification for bringing about the determined reaction and their estimated time of arrival at the action location at which the determined reaction is to take place, the influencing object(s) which are to bring about the determined reaction, and to direct them to the action location.
- preferably those influencing objects reach the action location which have the highest qualification with regard to the determined reaction and which are positioned relatively close to the action location at the time the alarm is triggered. This increases the effectiveness and efficiency of the system.
- the central control unit is designed such that it creates schedules for the selected influencing objects to bring about the determined reaction.
- the schedules have been stored on the basis of previous processes and reactions, are retrieved when needed, and can be adapted to the circumstances by means of logic, in particular taking into account the selected influencing objects and their properties or capabilities.
- the selected influencing objects then carry out the schedules created for them, so that the determined and thus the desired reaction is achieved effectively and efficiently.
- the central control unit is preferably designed to cause a selected influencing object at the action location to prepare action logs, from which in turn a final report is created with computer support. Overall, this further increases the effectiveness and efficiency of the system.
- the above-mentioned features can be present individually or combined with each other.
- information from the object for obtaining information causes the influencing object to carry out measures for influencing the object of interest and/or processes executed by it, and
- objects of interest include i) real estate, in particular buildings, streets, intersections, squares or areas, ii) movable property, in particular vehicles, aircraft, ships or portable devices, or iii) living beings, in particular humans, animals or plants, or parts thereof.
- as the object for obtaining information, preferably a sensor, a camera or a microphone is used, which is positioned in a stationary or mobile manner.
- as the influencing object, preferably an emergency vehicle, a deployment device, a robot, a monitor or loudspeakers are used.
- the georeferencing of the object for obtaining information and/or the influencing object takes place by assigning coordinates, preferably by means of a list, by GPS tracking, WLAN- or WiFi-based positioning or by image analysis.
- the alarm is triggered preferably by external message, by signals from the object for obtaining information or their evaluation.
- the communication network is preferably a mobile radio network, landline, Internet, local area network (LAN), wide area network (WAN) or a group of at least two such networks.
- the object for information acquisition and/or the influencing object are preferably each locally controlled, whereby the associated object and/or its communication is controlled internally or externally.
- the central control is preferably carried out in such a way that, in communication with the object for obtaining information and/or the influencing object, the reason for the alarm triggering, the affected object of interest and its position are determined, and a reaction is determined which is to take place in response to the alarm triggering.
- a plurality of objects for information acquisition and / or a plurality of influencing objects is provided.
- the central control is preferably carried out in such a way that, depending on the specified reaction, the availability of influencing objects, their qualification for achieving the specified reaction and their estimated time of arrival at the action location, the influencing object(s) which are to bring about the specified reaction are selected and directed to the action location.
- the central control is preferably performed such that schedules are created for the selected influencing object (s) to effect the specified response.
- the central control is preferably carried out in such a way that a selected influencing object at the action location is caused to prepare witness protocols, from which a final report is then created with computer support.
- the above method (as well as its individual method steps) can be carried out by a computer system into which a data carrier is inserted, on which the method is stored in computer-readable form.
- Fig. 1 shows an operations center;
- Fig. 2 shows the data known to the system upon the triggering of an alarm;
- Fig. 3 shows the screen display of a map with the scene of an incident as well as (video) cameras located directly at the action location;
- Fig. 4 shows the screen display of the map of Fig. 3 and, in a window thereof, the image of a selected best cam;
- Fig. 5 shows the screen display of the map of Fig. 3 and emergency forces located in the vicinity of the action location, as well as the calculation and display of routes from the current positions of the emergency forces to the action location;
- Fig. 6 shows the screen display of the map of Fig. 3, showing the members of a task force team;
- Fig. 7 shows the screen display of the map of Fig. 3 and the recording of the incident at the action location;
- Fig. 8 shows the screen display of the map of Fig. 3 and, in a window thereof, the presence of data records created at the action location as well as their transmission state;
- a system for observing and influencing objects, and processes executed by objects, contains a plurality of objects of different categories,
- a central control unit which controls an exchange of information with the objects and between the objects and which, upon an alarm being triggered, initiates measures for influencing one or more objects, and a communication network.
- objects are i) real estate, in particular buildings, roads, intersections, squares or areas, ii) movable objects, in particular vehicles, airplanes, ships or portable devices, or iii) living beings, in particular humans, animals or plants, or parts thereof.
- an example of an object of interest is a place where, depending on the date and time of day, a certain number of people are present, the aim being to discover a scenario that is atypical for this place and carries an increased risk potential.
- at suitable positions of the place, objects for information acquisition such as (video) cameras, microphones and/or other sensors such as temperature, pressure or humidity sensors are positioned in a stationary or mobile manner, so that a plurality of objects for information acquisition is present.
- the objects for obtaining information are connected for data transmission to a communication network, which allows a uni- or bidirectional data flow.
- the objects for obtaining information may have a local control unit via which the incoming or outgoing data flow takes place; examples include devices for processing this data flow.
- both the local control unit of the objects for obtaining information and the central control unit are designed such that the incoming data flow is subjected to an analysis, for example an image analysis, which permits an assessment of the nature of a process or incident (such as the presence of an accident or an assault) in the environment of the object of interest.
- for the assessment, the structures obtained from the incoming data flow by the analysis are compared with stored structures to which a particular fact or situation is assigned. The degree of agreement between the structures obtained by the analysis and the stored structures provides a measure of the accuracy of the assessment. For example, if the degree of agreement is 75% or above, the assessment is considered sufficiently accurate, and the same fact or situation as that of the stored structure is assigned to the process or occurrence currently taking place around the object of interest. If the degree of agreement is below 75%, the object for information acquisition is requested to provide further data from another point of view, such as an enlarged view or a view from a different angle (for example from another camera surrounding the object of interest).
- the value at which the assessment is considered sufficiently accurate is set at 75% in this embodiment; in a further embodiment, the value can be chosen arbitrarily.
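A minimal sketch of the threshold logic just described, assuming a simple feature-set comparison as the "structure" match (the patent does not fix a concrete similarity measure); the stored structures and feature names are hypothetical.

```python
# Hypothetical sketch: structures extracted from the data stream are compared
# with stored structures; below the (freely choosable) threshold, a further
# view is requested instead of committing to an assessment.

def degree_of_agreement(observed: set[str], stored: set[str]) -> float:
    """Share of the stored structure's features found in the observed one."""
    if not stored:
        return 0.0
    return len(observed & stored) / len(stored)

def assess(observed: set[str], stored_structures: dict[str, set[str]],
           threshold: float = 0.75):
    """Return the best-matching situation if sufficiently accurate,
    otherwise a request for further data (e.g. another camera angle)."""
    label, score = max(
        ((name, degree_of_agreement(observed, feats))
         for name, feats in stored_structures.items()),
        key=lambda item: item[1],
    )
    if score >= threshold:
        return ("assessment", label, score)
    return ("request_further_data", label, score)

stored = {"accident": {"stopped_cars", "debris", "person_on_ground", "smoke"}}
print(assess({"stopped_cars", "debris", "person_on_ground"}, stored))
```

With three of four stored features observed, the agreement is exactly 0.75 and the assessment is accepted; removing one observed feature drops it below the threshold and triggers a request for further data.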
- the data supplied to the respective local control units of the objects for obtaining information and the central control unit as well as the data processed and analyzed thereby are preferably available to all objects.
- the local control units of the objects for obtaining information and also the central control unit are formed as hardware by concrete electrical/electronic circuits or by software, wherein a computer program is executed in a computer and thereby carries out the method described above or at least method steps thereof.
- in the case of a camera, the incoming data may include information about a desired viewing direction of the camera, wherein the local control unit of the camera determines the desired viewing direction from the incoming data flow and causes the orientation of the camera via suitable means (joints, motors).
- criteria are, in particular, the resolution achieved for a detected object, the extent to which an object is completely captured in the image, or which wavelength range best suits the current environmental conditions (for example, infrared at night).
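These selection criteria could be combined into a simple per-camera score, for example as follows; the weighting, the camera attributes and the night-time penalty factor are assumptions for illustration only (the camera names echo the "1701-x" numbering used later in the figures).

```python
# Hypothetical scoring of candidate cameras by the criteria named above:
# resolution at the object, completeness of capture, and suitability of the
# wavelength range for the current conditions.

from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    resolution_at_object: float   # e.g. pixels per metre at the scene
    coverage: float               # fraction of the object inside the image
    has_infrared: bool

def camera_score(cam: Camera, night: bool) -> float:
    score = cam.resolution_at_object * cam.coverage
    if night and not cam.has_infrared:
        score *= 0.2              # visible-light-only camera is poor at night
    return score

def best_cam(cameras: list[Camera], night: bool) -> Camera:
    return max(cameras, key=lambda c: camera_score(c, night))

cams = [
    Camera("1701-1", resolution_at_object=40.0, coverage=0.9, has_infrared=False),
    Camera("1701-2", resolution_at_object=30.0, coverage=1.0, has_infrared=True),
]
print(best_cam(cams, night=True).name)
```

At night the infrared-capable camera wins despite its lower resolution; by day the higher-resolution camera would be selected instead.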
- Data originating from the object for obtaining information can be analyzed by its local control unit and / or suitably formatted for feeding into the communication network.
- the degree of analysis can be determined arbitrarily.
- the objects for obtaining information may be positioned stationarily in a location or be mobile, so that they can be moved back and forth between different locations.
- the system also has influencing objects that are designed to influence objects of interest. Influencing objects are, for example, emergency vehicles, deployment devices, robots, monitors or loudspeakers.
- the influencing objects for data transmission are connected to the communication network, which enables a uni- or bidirectional data flow.
- like the objects for information acquisition, the influencing objects can have a local control unit via which the incoming or outgoing data flow takes place, wherein the data flow is subject to a certain processing depending on the function of the influencing object.
- in the case of a deployment device or a robot, the incoming data may include a deployment plan, or at least data for creating a deployment plan, with the local control unit of the deployment device or robot analyzing the incoming data, requesting further information from the objects for obtaining information, from other influencing objects or from the central control unit, and formatting outgoing data for feeding into the communication network. On the basis of the submitted or created deployment plan, the deployment device or robot performs operations that influence one or more objects of interest.
- the system has a central control unit, which controls an exchange of information with the objects and between the objects and, in response to an alarm being triggered, initiates measures for influencing one or more objects.
- the communication between the objects can take place directly or with the interposition of the central control unit. In any case, however, the central control unit has access to the data flow between the objects.
- the communication network via which the information exchange between the objects and/or between the objects and the central control unit takes place is formed as a mobile network, landline, Internet, local area network (LAN), wide area network (WAN) or as a group of at least two such networks. This ensures secure communication between the objects themselves, or between the objects and the central control unit, regardless of spatial conditions.
- the objects, and thus also the events recorded by the objects and the information transmitted by them, are georeferenced by assigning appropriate coordinates to them.
- for a camera, the field of view (FoV) spans a three-dimensional space in the form of a pyramid, with its apex at the camera.
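Georeferencing a camera's FoV makes it possible to test whether an incident location lies within view. The following is a simplified two-dimensional sketch (view direction, half-angle, range) of the pyramidal FoV described above; the geometry model and all coordinates are assumptions for illustration.

```python
# Sketch (assumed geometry, not from the patent): testing whether a
# georeferenced point lies inside a camera's field of view, modelled here
# by a position, a view direction, a half-angle and a maximum range.

import math

def in_fov(cam_pos, view_dir_deg, half_angle_deg, max_range, point):
    """2D simplification: is the point inside the camera's viewing cone?"""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > max_range:
        return dist == 0
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - view_dir_deg + 180) % 360 - 180  # signed angle difference
    return abs(diff) <= half_angle_deg

# Camera at the origin looking east (+x), 30-degree half-angle, 100 m range.
print(in_fov((0.0, 0.0), 0.0, 30.0, 100.0, (50.0, 10.0)))   # point ahead
print(in_fov((0.0, 0.0), 0.0, 30.0, 100.0, (-50.0, 0.0)))   # point behind
```

The full three-dimensional pyramid test would add a vertical half-angle, but the selection principle (angle and range check against the camera's geodata) is the same.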
- the alarm is triggered by an external message, by signals from one or more of the objects for obtaining information or their evaluation.
- the central control unit, in communication with one or more of the objects, determines the cause of the alarm, the objects of interest affected and their location, and determines the reaction on the basis of a comparison with stored previous events and the corresponding reactions to them.
- the central control unit selects the influencing objects which are to bring about the defined reaction and directs them to the action location, i.e. to the location of the object of interest.
- on the basis of map material, the optimal route is calculated; additional obstacles as well as route planning for aircraft (e.g. drones) can be taken into account.
- the success of the influencing or of the determined reaction depends to a large extent on the context, such as the accessibility of the action location (depending on the available means of transport), environmental conditions and the cultural environment (language) at the action location.
- the skills of the staff and the accompanying resources are provided via databases. Thus, for the staff, language skills, completed courses and other qualifications can be stored and queried.
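Such a database query might look as follows; the table layout, column names and unit data are invented for illustration (the patent does not specify a schema).

```python
# Hypothetical qualification query: stored staff skills (languages, completed
# courses, other qualifications) are filtered against the requirements of the
# determined reaction. Table and data are invented for the example.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (name TEXT, language TEXT, course TEXT)")
conn.executemany(
    "INSERT INTO staff VALUES (?, ?, ?)",
    [("Unit 12", "German", "first aid"),
     ("Unit 27", "French", "first aid"),
     ("Unit 31", "French", "crowd control")],
)

# The scene language is French and first-aid training is required.
rows = conn.execute(
    "SELECT name FROM staff WHERE language = ? AND course = ?",
    ("French", "first aid"),
).fetchall()
print([name for (name,) in rows])
```

The central control unit could run such a query as one step of selecting the most suitable influencing objects for the determined reaction.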
- drones can be launched and automatically navigate to the scene to transmit images or video streams from there.
- the deployment command can be issued manually or automatically. In the process, the data record existing up to that time is uploaded to mobile computing devices. This may also include the proposed route, instructions for implementation, and further information on the crime scene and suspects.
- the on-site deployment takes place according to the situation and the specification.
- the security forces are supported by the system in that they can use a special app to document the incident in a form admissible in court.
- a sketch of the crime scene can be created; testimonies are recorded via voice recording, and photos and videos of the crime scene and the proceedings can be created. The data is combined into a common incident or process and in this form is transmitted to the control center (central control unit) or to the central evidence storage.
- the creation of the sketch of the crime scene and of images or videos is carried out automatically, in particular under the control of the local control units of the objects for obtaining information and / or the central control unit.
- the data is transmitted using conventional methods; from and to mobile devices, the data transfer takes place via WLAN or 3G/4G.
- the channel is encrypted and secured by a certificate.
- An encrypted transmission is also used in the local network or WWW.
- the components call each other via web or REST services; upload and download thus proceed in the same manner.
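An upload via such a REST-style service could be sketched as follows; the endpoint URL and payload fields are hypothetical, and TLS certificate verification (performed by `urllib` for `https://` URLs by default) secures the channel as described above.

```python
# Sketch of an encrypted upload of an incident record to the operations
# center via a REST-style service. URL and payload fields are invented.

import json
import urllib.request

def build_request(record: dict, url: str) -> urllib.request.Request:
    """Serialize an incident record as JSON for a POST to the service."""
    data = json.dumps(record).encode("utf-8")
    return urllib.request.Request(
        url, data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def upload_record(record: dict, url: str) -> int:
    """Send the record; urllib verifies the server's TLS certificate for
    https:// URLs, so the channel is encrypted and certificate-secured."""
    with urllib.request.urlopen(build_request(record, url), timeout=10) as resp:
        return resp.status

req = build_request({"incident": 4711, "kind": "photo"},
                    "https://ops-center.example/api/records")
print(req.get_method(), req.full_url)
```

In production one would additionally pin or validate the specific certificate mentioned in the text and authenticate the client, which this sketch omits.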
- Fig. 1 shows the operations center of a security company (for example, the police), with three employees in the foreground at their workstations and, in the background, a monitor or screen showing a situation image of a locality.
- the emergency call was made here by telephone, whereby on the basis of the telephone number the data "Who" (left column): carrier ID, name (of the caller), carrier (telephone company) and address (of the caller), as well as "When" (right column, top): date (here: today), time (here: 23:13:25) and time zone (here: CET, Central European Time) of the call are collected directly.
- furthermore, the geodata or geo-coordinates (latitude/longitude) of the place where the incident occurred or is occurring are determined. For example, the geodata can be obtained indirectly via the telephone number and the GSM cell, or else by the caller naming the locality, the street name and the house number, with automatic retrieval of the assigned geodata from a register. Note that the above data are collected automatically according to Fig. 2.
- Fig. 3 shows the next step in the method sequence (best-cam function), in which the system uses the previously obtained geodata to determine and highlight, from the plurality of (video) cameras shown grayed out on the map, those cameras (highlighted in blue) which can best represent the action location/crime scene.
- the action location itself is shown here by an arrow underlaid in blue.
- the window in the upper left corner of the picture contains, in its upper area, the data of the process; the type of incident (car accident) and the associated address (Park Street 127) are indicated.
- the emergency forces closest to the action location (green-shaded vehicles), the respective routes to the action location and the estimated arrival times at the action location are determined and displayed in the field at the lower left of the screen;
- this function is called the Nearest Unit function.
- the blue lines symbolize that a route calculation and an estimate of the expected arrival time are being carried out for the respective emergency services/emergency vehicles.
- in the corresponding field, the number of the respective emergency vehicle (car number), the availability of the corresponding emergency forces (green: team available, red: emergency forces not available) and the estimated time of arrival of the emergency forces at the action location are indicated.
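The Nearest Unit function's route calculation and arrival-time estimate can be illustrated with a shortest-path computation over a road graph; the graph, unit positions and travel times below are invented for the example (the patent does not prescribe a routing algorithm).

```python
# Illustrative "nearest unit" computation: shortest travel time from each
# available emergency vehicle to the action location over a small road graph
# (Dijkstra's algorithm), then selection of the earliest-arriving unit.

import heapq

def travel_time(graph, start, goal):
    """Dijkstra over edge weights in minutes; None if goal is unreachable."""
    dist = {start: 0.0}
    seen = set()
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in seen:
            continue
        seen.add(node)
        if node == goal:
            return d
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return None

graph = {
    "A": [("B", 4.0), ("C", 2.0)],
    "B": [("X", 5.0)],
    "C": [("B", 1.0), ("X", 8.0)],
}
units = {"car 12": "A", "car 27": "B"}          # current unit positions
etas = {u: travel_time(graph, pos, "X") for u, pos in units.items()}
print(min(etas, key=etas.get))   # unit with the earliest arrival at scene X
```

A real implementation would also filter out unavailable (red) units and convert the travel time into an absolute estimated time of arrival.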
- it can be advantageous if the emergency forces are proficient in the language usually spoken at the action location.
- in such a case, the system would recognize that it would be beneficial for the task force to speak the language normally spoken at the scene, and it would independently select such a force team (if available) for deployment and guide it to the scene.
- Fig. 7 illustrates the situation where a task force team was sent by the system to the scene.
- a member of the response team is in the process of capturing images of the scene, and the incident is also captured and displayed by the Bestcam feature via Camera 1701-2.
- the first data records were already sent to the operations center via telephone/3G; the upload and transmission process for the fourth record is not yet completed, and the last two records are merely available, their upload and transmission not yet having been initiated.
- the existing data records are thus automatically uploaded sequentially (upload process), automatically transmitted to the operations center and assigned to the process.
- the data sets are available to field operations staff for viewing or selection and can be analyzed as desired.
- the selection or analysis of the process can take place semi-automatically, i.e. under the control of the staff of the operations center, or fully automatically, i.e. by the system alone without intervention by the staff of the operations center.
- in this exemplary embodiment, the object of interest corresponds to the road intersection marked on the map with an arrow, the object for obtaining information to one of the plurality of blue- or gray-shaded (video) cameras, the influencing object to one of the plurality of green- or red-underlaid emergency vehicles and the means contained therein for recording and/or logging the accident, and the central control unit to the computer present in the operations center.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Human Resources & Organizations (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Operations Research (AREA)
- Game Theory and Decision Science (AREA)
- Educational Administration (AREA)
- Development Economics (AREA)
- Marketing (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- General Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Alarm Systems (AREA)
- Traffic Control Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/DE2015/000389 WO2017020879A1 (de) | 2015-07-31 | 2015-07-31 | System zur beobachtung und beeinflussung von objekten von interesse sowie davon ausgeführten prozessen und entsprechendes verfahren |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3329432A1 true EP3329432A1 (de) | 2018-06-06 |
Family
ID=54150199
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15766739.5A Ceased EP3329432A1 (de) | 2015-07-31 | 2015-07-31 | System zur beobachtung und beeinflussung von objekten von interesse sowie davon ausgeführten prozessen und entsprechendes verfahren |
Country Status (8)
Country | Link |
---|---|
US (1) | US11188034B2 (de) |
EP (1) | EP3329432A1 (de) |
JP (1) | JP6828010B2 (de) |
CN (1) | CN107533679A (de) |
AU (3) | AU2015404349A1 (de) |
RU (1) | RU2693926C1 (de) |
SG (1) | SG11201709794QA (de) |
WO (1) | WO2017020879A1 (de) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102018114867A1 (de) * | 2018-06-20 | 2019-12-24 | B. Strautmann & Söhne GmbH u. Co. KG | Verfahren zum Verbinden von Bauteilen |
RU2018133712A (ru) * | 2018-09-25 | 2020-03-25 | Алексей Викторович Шторм | Способы подтверждения транзакций в распределенной сети наружной рекламы |
US11481421B2 (en) * | 2019-12-18 | 2022-10-25 | Motorola Solutions, Inc. | Methods and apparatus for automated review of public safety incident reports |
US20230053823A1 (en) * | 2020-02-13 | 2023-02-23 | Motorola Solutions, Inc. | Device, system and method for controlling environmental devices at a smart building to assist a first responder |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140055621A1 (en) * | 2012-04-02 | 2014-02-27 | Mcmaster University | Optimal camera selection in array of monitoring cameras |
Family Cites Families (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3714650A (en) * | 1970-07-30 | 1973-01-30 | Raytheon Co | Vehicle command and control system |
US7271704B2 (en) * | 1996-01-23 | 2007-09-18 | Mija Industries, Inc. | Transmission of data to emergency response personnel |
US6359647B1 (en) | 1998-08-07 | 2002-03-19 | Philips Electronics North America Corporation | Automated camera handoff system for figure tracking in a multiple camera system |
JP3045713B1 (ja) * | 1998-12-09 | 2000-05-29 | 富士通株式会社 | 車載型車両誘導装置及び通信サーバシステム並びに代替車両誘導システム |
US6690374B2 (en) | 1999-05-12 | 2004-02-10 | Imove, Inc. | Security camera system for tracking moving objects in both forward and reverse directions |
JP2001076031A (ja) * | 1999-09-02 | 2001-03-23 | Fujitsu Ltd | 車両選択方法及び車両選択装置並びにシステム |
JP3645460B2 (ja) * | 1999-12-28 | 2005-05-11 | 株式会社東芝 | 事故対応ロボットシステム |
JP2001202577A (ja) * | 2000-01-20 | 2001-07-27 | Mitsubishi Electric Corp | 事故車両監視カメラシステム |
DE10029784A1 (de) * | 2000-05-04 | 2001-11-15 | Alexander John | Vorrichtung zur Beobachtung und Beeinflussung von Objekten und Prozessen |
JP2002034026A (ja) * | 2000-07-13 | 2002-01-31 | Nec Corp | 遠隔監視システム |
US7389204B2 (en) | 2001-03-01 | 2008-06-17 | Fisher-Rosemount Systems, Inc. | Data presentation system for abnormal situation prevention in a process plant |
US6798344B2 (en) | 2002-07-08 | 2004-09-28 | James Otis Faulkner | Security alarm system and method with realtime streaming video |
JP2005064784A (ja) * | 2003-08-11 | 2005-03-10 | Nec Commun Syst Ltd | 緊急通報受信センタと該センタを備えた緊急通報発生場所映像取得システム |
US7395151B2 (en) * | 2004-02-24 | 2008-07-01 | O'neill Dennis M | System and method for knowledge-based emergency response |
US7983835B2 (en) * | 2004-11-03 | 2011-07-19 | Lagassey Paul J | Modular intelligent transportation system |
KR20060014765A (ko) * | 2004-08-12 | 2006-02-16 | 주식회사 현대오토넷 | 텔레매틱스 시스템을 이용한 긴급 구난 서비스 시스템 및방법 |
ATE500580T1 (de) | 2005-03-25 | 2011-03-15 | Sensormatic Electronics Llc | Intelligente kameraauswahl und objektverfolgung |
EP1909243A1 (de) * | 2006-10-05 | 2008-04-09 | ESU Sicherheits- & Dienstleistungsmanagement GmbH | Einsatz-Leitsystem für mobile Sicherheitsdienste |
US7515065B1 (en) * | 2008-04-17 | 2009-04-07 | International Business Machines Corporation | Early warning system for approaching emergency vehicles |
CA2997878A1 (en) | 2008-10-27 | 2010-05-06 | Mueller International, Llc | Infrastructure monitoring system and method |
US10108912B1 (en) * | 2011-04-25 | 2018-10-23 | Joseph E. Conroy | Incident resource management |
JP6094132B2 (ja) * | 2012-10-09 | 2017-03-15 | 日本電気株式会社 | 災害情報管理装置、災害情報システム、災害情報の管理方法、および災害情報を管理するプログラム、ならびに、携帯端末、携帯端末の制御方法、および携帯端末の動作を制御する制御プログラム |
JP6077909B2 (ja) * | 2013-03-29 | 2017-02-08 | 綜合警備保障株式会社 | 侵入検知システム及び侵入検知方法 |
US20140368643A1 (en) * | 2013-06-12 | 2014-12-18 | Prevvio IP Holding LLC | Systems and methods for monitoring and tracking emergency events within a defined area |
US9847016B2 (en) * | 2014-07-07 | 2017-12-19 | Honeywell International Inc. | System and method of communicating data from an alarm system to emergency services personnel |
US10634507B2 (en) * | 2016-03-28 | 2020-04-28 | Avaya Inc. | Interfacing emergency events with map/routing software to re-route non-emergency traffic to create paths for emergency vehicles |
AU2017313074B2 (en) * | 2016-08-17 | 2020-07-16 | Scott Technologies, Inc. | Smart commissioning for first responders in incident command system |
US10600326B2 (en) * | 2016-09-15 | 2020-03-24 | International Business Machines Corporation | Method for guiding an emergency vehicle using an unmanned aerial vehicle |
US10511951B2 (en) * | 2017-01-17 | 2019-12-17 | 3AM Innovations LLC | Tracking and accountability device and system |
EP3631503A4 (de) * | 2017-05-23 | 2021-03-17 | D.R Roads A.I. Ltd. | Systeme und verfahren zur verkehrsüberwachung und -verwaltung |
US20180367968A1 (en) * | 2017-06-19 | 2018-12-20 | Honeywell International Inc. | Fire chief mobile command control center |
SG10201705480UA (en) * | 2017-07-03 | 2019-02-27 | Nec Asia Pacific Pte Ltd | System and method for determining event |
AU2019340424A1 (en) * | 2018-09-14 | 2021-04-15 | Avive Solutions, Inc. | Responder network |
US11327503B2 (en) * | 2019-08-18 | 2022-05-10 | Cobalt Robotics Inc. | Surveillance prevention by mobile robot |
-
2015
- 2015-07-31 US US15/578,450 patent/US11188034B2/en active Active
- 2015-07-31 EP EP15766739.5A patent/EP3329432A1/de not_active Ceased
- 2015-07-31 SG SG11201709794QA patent/SG11201709794QA/en unknown
- 2015-07-31 JP JP2018504853A patent/JP6828010B2/ja active Active
- 2015-07-31 RU RU2018106880A patent/RU2693926C1/ru active
- 2015-07-31 CN CN201580079347.2A patent/CN107533679A/zh active Pending
- 2015-07-31 AU AU2015404349A patent/AU2015404349A1/en not_active Abandoned
- 2015-07-31 WO PCT/DE2015/000389 patent/WO2017020879A1/de active Application Filing
-
2019
- 2019-11-25 AU AU2019271883A patent/AU2019271883A1/en not_active Abandoned
-
2021
- 2021-12-06 AU AU2021282389A patent/AU2021282389A1/en not_active Abandoned
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140055621A1 (en) * | 2012-04-02 | 2014-02-27 | Mcmaster University | Optimal camera selection in array of monitoring cameras |
Also Published As
Publication number | Publication date |
---|---|
JP2018524743A (ja) | 2018-08-30 |
SG11201709794QA (en) | 2017-12-28 |
AU2019271883A1 (en) | 2019-12-12 |
AU2021282389A1 (en) | 2021-12-23 |
AU2015404349A1 (en) | 2017-12-14 |
WO2017020879A1 (de) | 2017-02-09 |
CN107533679A (zh) | 2018-01-02 |
RU2693926C1 (ru) | 2019-07-05 |
US20180150034A1 (en) | 2018-05-31 |
US11188034B2 (en) | 2021-11-30 |
JP6828010B2 (ja) | 2021-02-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE112013004591T5 (de) | Erhebung und Nutzung von erfassten Fahrzeugdaten | |
DE102018114609A1 (de) | Fahrzeugsauberkeitserkennungssyteme und -verfahren | |
DE60213526T2 (de) | Verfahren und System zur Verbesserung des Situationsbewusstseins von Kommando-Kontrolleinheiten | |
DE102014213553A1 (de) | Tracking-Unterstützungsvorrichtung, Tracking-Unterstützungssystem und Tracking-Unterstützungsverfahren | |
DE112016006300T5 (de) | Vorrichtung zur Überwachung unbeaufsichtigter Objekte, mit derselben ausgestattetes System zur Überwachung unbeaufsichtigter Objekte und Verfahren zur Überwachung unbeaufsichtigter Objekte | |
WO2017020879A1 (de) | System zur beobachtung und beeinflussung von objekten von interesse sowie davon ausgeführten prozessen und entsprechendes verfahren | |
DE112013005195T5 (de) | Verfahren und Vorrichtung zur Auswahl eines Videoanalyse-Algorithmus, basierend auf historischen Ereignisdaten | |
DE102017113752A1 (de) | Fahrzeug mit der ereignisaufzeichnung | |
DE102020108972A1 (de) | System und verfahren zur verfolgung der sich bewegenden objekte | |
DE102013217223A1 (de) | Überwachungsanlage sowie Verfahren zur Darstellung eines Überwachungsbereichs | |
DE102012222661A1 (de) | Überwachungsanlage für einen Überwachungsbereich, Verfahren sowie Computerprogramm | |
DE102016123906B4 (de) | System und Verfahren zur Abwicklung eines Katastrophenfalls | |
DE102021201774A1 (de) | Augmented-Reality-Erkennung zum Lokalisieren von autonomen Fahrzeugen | |
DE10049366A1 (de) | Verfahren zum Überwachen eines Sicherheitsbereichs und entsprechendes System | |
DE102017219292A1 (de) | Verfahren und vorrichtung zum erfassen von ereignisbezogenen daten bezüglich eines fahrzeugs | |
EP3611711B1 (de) | Verfahren zum klassifizieren von daten betreffend eine parklücke für ein kraftfahrzeug | |
DE102016214860A1 (de) | Verfahren zur Überwachung zumindest eines Fahrzeugs mit mindestens einer Überwachungskamera, Überwachungskamera sowie Fahrzeug | |
DE102015007145A1 (de) | Verfahren zur automatischen Fahrroutenbewertung | |
DE102019129612A1 (de) | Fahrzeugverwaltungsystem, fahrzeug, verwaltungsvorrichtung, steuerverfahren und computerprogramm | |
EP3376152A1 (de) | Informationsverarbeitungssystem und informationsverarbeitungsverfahren | |
EP3627370B1 (de) | Verfahren und system zum verarbeiten von personenbezogenen optischen und/oder akustischen signalen | |
WO2002030053A1 (de) | Verfahren und system zum übertragen von informationen zwischen einem server und einem mobilen client | |
DE102019219452A1 (de) | Verfahren und Vorrichtung zur Bestimmung einer Position eines mobilen Objekts in einem Aufenthaltsbereich | |
LU102252B1 (de) | Computerimplementiertes Verfahren beim Bestimmen eines Beschattungszustands eines Objekts | |
DE102007000532A1 (de) | Verfahren und Vorrichtung zur Erkennung und Identifikation einer Überwachungseinrichtung |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20171121 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20190805 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20210318 |