CN114040003B - Emergency disposal system and method for emergency events in personnel dense area - Google Patents

Emergency disposal system and method for emergency events in personnel dense area

Info

Publication number
CN114040003B
CN114040003B (application CN202210021169.XA)
Authority
CN
China
Prior art keywords
emergency
target
data
plan
early warning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210021169.XA
Other languages
Chinese (zh)
Other versions
CN114040003A (en)
Inventor
杨博
王琼
李健
单耀
窦圆圆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
North China Institute of Science and Technology
Original Assignee
North China Institute of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by North China Institute of Science and Technology filed Critical North China Institute of Science and Technology
Priority to CN202210021169.XA priority Critical patent/CN114040003B/en
Publication of CN114040003A publication Critical patent/CN114040003A/en
Application granted granted Critical
Publication of CN114040003B publication Critical patent/CN114040003B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y20/00 Information sensed or collected by the things
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y20/00 Information sensed or collected by the things
    • G16Y20/10 Information sensed or collected by the things relating to the environment, e.g. temperature; relating to location
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00 IoT characterised by the purpose of the information processing
    • G16Y40/50 Safety; Security of things, users, data or systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00 IoT characterised by the purpose of the information processing
    • G16Y40/60 Positioning; Navigation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Toxicology (AREA)
  • Alarm Systems (AREA)

Abstract

The application provides an emergency disposal system and method for emergencies in personnel-dense areas. The system comprises an Internet of Things perception acquisition module, an intelligent emergency detection module, an emergency plan matching module and a central data platform. The Internet of Things perception acquisition module acquires various kinds of sensing data and the original video of the personnel-dense area. The intelligent emergency detection module processes the original video to obtain target video data, performs fusion analysis on the target video data and the sensing data, determines a target emergency, and generates early warning information describing that emergency. The emergency plan matching module builds an emergency plan library for the personnel-dense area and, according to the early warning information, screens the library for the emergency plan matching the target emergency as the target plan. The central data platform includes a visual remote command module that provides the early warning information and the target plan to commanders, so that commanders and on-site rescuers can carry out coordinated rescue.

Description

Emergency disposal system and method for emergency events in personnel dense area
Technical Field
The application relates to the technical field of emergency disposal in personnel-dense areas, and in particular to an emergency disposal system and method for emergencies in such areas.
Background
Modernizing governance systems and governance capacity is the development trend of megacities, and emergency management, as an important part of urban governance, must be continuously explored and put into practice. Large venues such as major commercial districts, high-density residential communities, transportation hubs, schools and hospitals are essential to the life and work of urban residents, and the large, dense crowds they gather create significant safety hazards that require close monitoring. Researchers working on early detection and early warning in the emergency management of personnel-dense areas generally hold that the effective way to avoid accidents is to grasp real-time passenger flow volume and density and to issue timely warnings and evacuation guidance against crowd risks. To this end, the technical approaches studied for monitoring abnormal crowds mainly include those based on mobile-phone positioning, travel origin-destination (OD) data, surveillance video data, laser monitoring equipment, and the like.
However, existing emergency management in personnel-dense areas focuses only on abnormal crowd gathering and does not effectively monitor sudden public incidents. In other words, existing emergency monitoring in these areas relies on a single means and lacks fused data analysis. Moreover, the prior art does not give on-site personnel a way to handle emergencies quickly based on the monitoring results, interaction between on-site personnel and commanders is poor, and the timeliness of emergency disposal needs to be improved.
Disclosure of Invention
The present application provides an emergency disposal system and method for emergencies in personnel-dense areas, intended to solve, at least in part, one or more of the problems described above and other shortcomings of the related art.
The application provides an emergency disposal system for emergencies in personnel-dense areas, which may include an Internet of Things perception acquisition module, an intelligent emergency detection module, an emergency plan matching module and a central data platform. The Internet of Things perception acquisition module acquires various kinds of sensing data and the original video of the personnel-dense area. The intelligent emergency detection module processes the original video to obtain target video data, performs fusion analysis on the target video data and the sensing data, determines a target emergency, and generates early warning information describing that emergency. The emergency plan matching module builds an emergency plan library for the personnel-dense area and, according to the early warning information, screens the library for the emergency plan matching the target emergency as the target plan. The central data platform includes a visual remote command module that provides the early warning information and the target plan to commanders, so that commanders and on-site rescuers can carry out coordinated rescue.
In some embodiments, the system further comprises a communication positioning module, which acquires the position information of the target emergency and sends the sensing data, the target video data and the early warning information to the central data platform.
In some embodiments, the intelligent emergency detection module performs steps including: dividing the current frame image of the original video into grids, and identifying the category and determining the position of the target object in each grid to obtain category data and initial position data for each target object; using a built-in tracker to predict the position of each target object in the frame following the current frame to obtain a predicted value, and matching and correcting the predicted value against the observed value of that next frame to obtain dynamic position data of the target object, where the target objects include dynamic targets and static targets, and the dynamic position data are a sequence of consecutive positions of the target object; integrating the category data, initial position data and dynamic position data of each target object into structured target video data; screening the target video data for abnormal video data and screening all sensing data for abnormal sensing data that exceed early warning thresholds; and determining the target emergency from the abnormal video data and abnormal sensing data and generating early warning information describing it, where the early warning information covers personnel risks, vehicle risks, weather risks, disaster risks, on-site conditions and traffic conditions.
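The following is a minimal Python sketch, not the patented implementation, of how per-frame detection and tracking results could be folded into the structured target video data described above; the class name TargetRecord, the field names and the detection dictionary format are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class TargetRecord:
    target_id: int
    category: str                           # e.g. "person", "vehicle", "bearing_body"
    initial_position: Tuple[float, float]   # position in the first frame the target appears
    dynamic_positions: List[Tuple[float, float]] = field(default_factory=list)

def integrate_frame(records: Dict[int, TargetRecord], detections: List[dict]) -> None:
    """Fold one frame of tracked detections {"id", "category", "position"} into the store."""
    for det in detections:
        tid = det["id"]
        if tid not in records:
            records[tid] = TargetRecord(tid, det["category"], det["position"])
        records[tid].dynamic_positions.append(det["position"])

# Two frames of detections produce one structured record per target.
store: Dict[int, TargetRecord] = {}
integrate_frame(store, [{"id": 1, "category": "person", "position": (10.0, 20.0)}])
integrate_frame(store, [{"id": 1, "category": "person", "position": (12.0, 21.0)}])
```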
In some embodiments, the emergency plan matching module performs steps including: using data processing and text extraction techniques to collect, from the Internet, existing emergencies in personnel-dense areas and their corresponding emergency plans; extracting and integrating the video data and sensing data of each existing emergency to generate the emergency plan library; using the early warning information of the target emergency as keywords to screen the library for an existing emergency matching the target emergency, retrieving the emergency plan corresponding to that existing emergency and taking it as an alternative plan; and adjusting the alternative plan, based on the abnormal video data and abnormal sensing data of the target emergency, to finally obtain the target plan suited to the target emergency.
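A hedged sketch of the keyword-based screening step is shown below: warning keywords are intersected with the keywords stored for each existing emergency, and the best-overlapping entry is returned as the alternative plan. The library format and the overlap score are assumptions for illustration only, not the patent's exact matching rule.

```python
from typing import List, Optional

def match_plan(plan_library: List[dict], warning_keywords: set) -> Optional[dict]:
    """Each library entry looks like {"event": str, "keywords": set, "plan": str}."""
    best, best_score = None, 0
    for entry in plan_library:
        score = len(warning_keywords & entry["keywords"])   # simple keyword overlap
        if score > best_score:
            best, best_score = entry, score
    return best  # the caller then adjusts this alternative plan to the target emergency

library = [
    {"event": "flood", "keywords": {"water_level", "heavy_rain"}, "plan": "Flood emergency plan ..."},
    {"event": "fire",  "keywords": {"smoke", "temperature"},      "plan": "Fire emergency plan ..."},
]
alternative = match_plan(library, {"smoke", "personnel_density_overload"})
```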
In some embodiments, the system further comprises a visual interaction module equipped with an audio device, which transmits to the central data platform the audio information of the area where the target emergency occurs, as collected by the audio device, together with the original video collected by the Internet of Things perception acquisition module.
The application also provides an emergency disposal method for emergencies in personnel-dense areas, which comprises: collecting various kinds of sensing data and the original video of the personnel-dense area; processing the original video to obtain target video data, performing fusion analysis on the target video data and the sensing data to determine a target emergency, and generating early warning information describing that emergency; building an emergency plan library for the personnel-dense area and, according to the early warning information, screening the library for the emergency plan matching the target emergency as the target plan; and transmitting the early warning information and the target plan to a central data platform, so that commanders at the central data platform and on-site rescuers can carry out coordinated rescue.
In some embodiments, the method further comprises collecting the position information of the target emergency.
Processing the original video to obtain target video data, performing fusion analysis on the target video data and the sensing data to determine the target emergency and generating early warning information describing it includes: dividing the current frame image of the original video into grids, and identifying the category and determining the position of the target object in each grid to obtain category data and initial position data for each target object; using a built-in tracker to predict the position of each target object in the frame following the current frame to obtain a predicted value, and matching and correcting the predicted value against the observed value of that next frame to obtain dynamic position data of the target object, where the target objects include dynamic targets and static targets, and the dynamic position data are a sequence of consecutive positions of the target object; integrating the category data, initial position data and dynamic position data of each target object into structured target video data; screening the target video data for abnormal video data and screening all sensing data for abnormal sensing data that exceed early warning thresholds; and determining the target emergency from the abnormal video data and abnormal sensing data and generating early warning information describing it, where the early warning information covers personnel risks, vehicle risks, weather risks, disaster risks, on-site conditions and traffic conditions.
In some embodiments, building the emergency plan library of the personnel-dense area and screening it, according to the early warning information, for the emergency plan matching the target emergency as the target plan includes: using data processing and text extraction techniques to collect, from the Internet, existing emergencies in personnel-dense areas and their corresponding emergency plans; extracting and integrating the video data and sensing data of each existing emergency to generate the emergency plan library; using the early warning information of the target emergency as keywords to screen the library for an existing emergency matching the target emergency, retrieving the emergency plan corresponding to that existing emergency and taking it as an alternative plan; and adjusting the alternative plan, based on the abnormal video data and abnormal sensing data of the target emergency, to finally obtain the target plan suited to the target emergency.
In some embodiments, the method further comprises collecting audio information of the area where the target emergency occurs and transmitting the audio information and the original video to the central data platform.
The technical solutions of the embodiments can provide at least one of the following advantages.
The emergency disposal system and method for emergencies in personnel-dense areas combine target video data with sensing data, overcoming the drawbacks of single-means monitoring and the lack of fused data analysis; an emergency equipment storage and management module gives on-site rescuers a way to handle the target emergency quickly; and, with the visual interaction module, commanders can direct rapid disposal of the target emergency, improving the timeliness of emergency response and effectively addressing the difficulties of real-time monitoring, early warning and rapid disposal in personnel-dense areas.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings in which:
FIG. 1 is a block diagram of an emergency treatment system for a dense area emergency event according to an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of a structured data fusion analysis according to an exemplary embodiment of the present application;
FIG. 3 is a schematic workflow diagram of an incident detection module according to an exemplary embodiment of the present application;
FIG. 4 is a schematic diagram of identification tracking of various objects in an original video according to an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of matching a target emergency with a target plan according to an exemplary embodiment of the present application;
FIG. 6 is an information interaction diagram of a visualization interaction module according to an exemplary embodiment of the present application; and
fig. 7 is a flowchart of an emergency handling method for a dense personnel area emergency event according to an exemplary embodiment of the present application.
Detailed Description
For a better understanding of the present application, various aspects of the present application will be described in more detail with reference to the accompanying drawings. It should be understood that the detailed description is merely illustrative of exemplary embodiments of the present application and does not limit the scope of the present application in any way. Like reference numerals refer to like elements throughout the specification. The expression "and/or" includes any and all combinations of one or more of the associated listed items.
In the drawings, the sizes, dimensions, and shapes of elements have been slightly adjusted for convenience of explanation. The figures are purely schematic and not drawn to scale. As used herein, terms such as "approximately" and "about" are terms of approximation rather than of degree, and are intended to account for the inherent deviations in measured or calculated values that would be recognized by one of ordinary skill in the art. In addition, unless explicitly defined otherwise or inferable from the context, the order in which the steps are described does not necessarily indicate the order in which they occur in actual operation.
It will be further understood that terms such as "comprising", "including", "having" and/or "containing", when used in this specification, are open-ended rather than closed-ended, and specify the presence of the stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. Furthermore, when an expression such as "at least one of" follows a list of features, it modifies the entire list rather than individual elements of the list. When describing embodiments of the present application, "may" means "one or more embodiments of the present application". The term "exemplary" refers to an example or illustration.
Unless otherwise defined, all terms (including engineering and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
In addition, the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
The application provides an emergency disposal system for emergencies in personnel-dense areas, comprising an Internet of Things perception acquisition module, an intelligent emergency detection module, an emergency plan matching module and a central data platform. The Internet of Things perception acquisition module acquires various kinds of sensing data and the original video of the personnel-dense area. The intelligent emergency detection module processes the original video to obtain target video data, performs fusion analysis on the target video data and the sensing data, determines a target emergency, and generates early warning information describing that emergency. The emergency plan matching module builds an emergency plan library for the personnel-dense area and, according to the early warning information, screens the library for the emergency plan matching the target emergency as the target plan. The central data platform includes a visual remote command module that provides the early warning information and the target plan to commanders, so that commanders and on-site rescuers can carry out coordinated rescue.
Fig. 1 is a block diagram of an emergency disposal system for emergencies in a personnel-dense area according to an exemplary embodiment of the present application. As shown in fig. 1, the emergency disposal system of the present application includes an intelligent monitoring terminal 100 and a central data platform 200. Specifically, the intelligent monitoring terminal 100 includes a communication positioning module 110, an Internet of Things perception acquisition module 120, an intelligent emergency detection module 130, an emergency plan matching module 140, a visual interaction module 150, and an emergency equipment storage management module 160. The central data platform 200 includes a visual remote command module 210 and a visual real-time monitoring module 220. More specifically, the communication positioning module 110 includes a BeiDou positioning unit 111, a BeiDou short-message communication unit 112 and a 5G network communication unit 113; the Internet of Things perception acquisition module 120 comprises a plurality of sensing units 121 and a camera device 122; the visual interaction module 150 includes an audio device 151; and the emergency equipment storage management module 160 includes earthquake disaster equipment 161, flood disaster equipment 162, and fire equipment 163.
In some embodiments, the communication positioning module 110 determines the location of the intelligent monitoring terminal 100, that is, the location of the target emergency, through the BeiDou positioning unit 111. The BeiDou short-message communication unit 112 and the 5G network communication unit 113 of the communication positioning module 110 handle information exchange with the central data platform 200 and guarantee data interaction between the intelligent monitoring terminal 100 and the central data platform 200. Specifically, when no emergency has occurred in the personnel-dense area, the communication positioning module 110 uses the 5G network communication unit 113, with the 5G telecommunication network or the Internet as the main communication channel, to ensure that the original video and sensing data collected by the Internet of Things perception acquisition module 120 and the analysis results of the intelligent emergency detection module 130 reach the central data platform 200 accurately and in real time. When an emergency does occur in the personnel-dense area, the communication positioning module 110 uses both the BeiDou short-message communication unit 112 and the 5G network communication unit 113, with the 5G telecommunication network or the Internet as the main channel, to ensure that the early warning information of the intelligent emergency detection module 130 reaches the central data platform 200 accurately and in real time.
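The channel-selection logic described above can be summarized with a small sketch; the function name and channel labels are illustrative, and the actual module presumably negotiates links at a lower level.

```python
def select_channels(emergency_detected: bool) -> list:
    channels = ["5G"]                     # primary link for original video and sensing data
    if emergency_detected:
        channels.append("BeiDou-SMS")     # redundant short-message link so warnings still arrive
    return channels
```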
FIG. 2 is a schematic diagram of a structured data fusion analysis according to an exemplary embodiment of the present application. Fig. 3 is a schematic workflow diagram of an incident detection module according to an exemplary embodiment of the present application.
As shown in fig. 2 and 3, in some embodiments, the sensing units 121 of the Internet of Things perception acquisition module 120 may include a locator, a water-level (flooding) sensor, a harmful-gas sensor, a smoke sensor, a temperature and humidity sensor, and the like. The locator collects the location of the monitored area, the water-level sensor collects its water level, the harmful-gas sensor collects harmful-gas concentrations, the smoke sensor collects smoke concentrations, and the temperature and humidity sensor collects temperature and humidity data. The camera device 122 captures the original video of the monitored area.
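As an illustration only, the readings from these sensing units might be gathered into a single dictionary for the detection module to fuse with the video analysis; the sensor objects and their read() methods are assumptions, not an API defined by the patent.

```python
def poll_sensors(locator, water_sensor, gas_sensor, smoke_sensor, temp_humidity_sensor) -> dict:
    """Collect one reading from each sensing unit of the monitored area."""
    return {
        "position": locator.read(),                      # monitored-area coordinates
        "water_level": water_sensor.read(),              # flooding level
        "harmful_gas": gas_sensor.read(),                # harmful-gas concentration
        "smoke": smoke_sensor.read(),                    # smoke concentration
        "temperature_humidity": temp_humidity_sensor.read(),
    }
```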
Fig. 4 is a schematic diagram of identification and tracking of various objects in an original video according to an exemplary embodiment of the present application.
In some embodiments, the intelligent emergency detection module 130 integrates an edge processing device with relatively high computing power as the core computing component of the intelligent monitoring terminal 100. After receiving the sensing data and original video from the Internet of Things perception acquisition module 120, as shown in fig. 4, the module 130 first tracks, identifies and continuously monitors each target object in the original video using the YOLO (You Only Look Once) detection, tracking and positioning algorithm. Multi-target monitoring and tracking essentially builds a matching problem between successive image frames based on features such as target position, speed, shape, texture and color. Because the monitored scenes of this application are complex and personnel-dense areas contain many targets, the main reason for adopting YOLO in the detection module 130 is its high running speed: it can be used in a real-time system and can analyze the original video in real time for emergency monitoring and early warning. YOLO designs its network around regression and a divide-and-conquer idea: the current frame of the original video is divided into S×S grids, each grid is responsible for predicting the target objects whose center points fall inside it, and after one forward pass the network directly outputs the position and category of each target object, achieving multi-object detection. The initial position and category of each identified target object are then used as input to the tracker; for tracking, a lightweight convolutional neural network is designed to extract image features, which are incorporated into a correlation-filter tracking framework. The target positions and categories output by YOLO are stored in an observation sequence; the tracker in the detection module 130 predicts the position of each target in the next frame, searches for observations near the predicted value, selects the matching observation by combining it with the correlation-filter template, and corrects the prediction. Tracking quality is then evaluated. When the tracking quality is poor, or no observation has matched for N consecutive frames, unmatched observations are kept in a temporary linked list built with a spatial-position constraint; a target whose tracking failed because of occlusion can then be re-associated by matching spatial position, motion direction and historical feature similarity, and if no match is found the observation is treated as a new target. In this way a sequence of consecutive positions, namely the dynamic position data, is obtained for each target, and the fusion strategy of tracker and detector achieves automatic tracking of multiple types of target objects in every frame of the original video.
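A simplified predict-and-match loop for one frame is sketched below. It assumes a constant-velocity predictor per track and nearest-neighbour matching within a distance gate, which stands in for the correlation-filter matching described above; the data layout, gate value and max_missed limit are illustrative assumptions.

```python
import math

def step_tracks(tracks: dict, observations: list, gate: float = 50.0, max_missed: int = 5) -> dict:
    """tracks: {track_id: {"pos": (x, y), "vel": (vx, vy), "missed": int}};
    observations: [{"position": (x, y), ...}] from the detector for the next frame."""
    unmatched = list(observations)
    for trk in tracks.values():
        pred = (trk["pos"][0] + trk["vel"][0], trk["pos"][1] + trk["vel"][1])  # predicted position
        best, best_d = None, gate
        for obs in unmatched:
            d = math.dist(pred, obs["position"])
            if d < best_d:
                best, best_d = obs, d
        if best is not None:               # correct the prediction with the matched observation
            trk["vel"] = (best["position"][0] - trk["pos"][0], best["position"][1] - trk["pos"][1])
            trk["pos"], trk["missed"] = best["position"], 0
            unmatched.remove(best)
        else:                              # keep coasting through a possible occlusion
            trk["pos"], trk["missed"] = pred, trk["missed"] + 1
    alive = {tid: t for tid, t in tracks.items() if t["missed"] <= max_missed}
    for i, obs in enumerate(unmatched):    # observations left unmatched start new targets
        alive[f"new_{i}"] = {"pos": obs["position"], "vel": (0.0, 0.0), "missed": 0}
    return alive
```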
The target objects include dynamic targets and static targets: a dynamic target may be a vehicle or a person, while a static target may be a disaster-bearing body or the like. The category data, initial position data and dynamic position data of each target object are then integrated into structured target video data to simplify analysis and processing in the subsequent steps. Next, abnormal video data are screened out of the target video data, and abnormal sensing data exceeding the early warning thresholds are screened out of the sensing data. Finally, a structured analysis is performed on all sensor data: the number of people, abnormal events, vehicle information, displacement of bearing bodies and disaster-risk information are extracted from the original video, the numerical changes of the readings from the other sensors are superimposed, and the two sources are fused so that they support each other in decision-making. For example, when the number of people detected in the video exceeds the maximum capacity of the monitored area, a personnel-density overload warning is issued; combined with the information from the other sensors, the personnel, vehicle, weather and disaster risks in the monitored scene are jointly assessed to judge how likely the overload is, and early warning information describing the target emergency is generated, covering personnel risks, vehicle risks, weather risks, disaster risks, on-site conditions and traffic conditions. By using the many sensors of the intelligent monitoring terminal (positioning, water level, harmful gas, smoke concentration, temperature and humidity) together with real-time video, and by applying machine-vision target identification and tracking to extract people, vehicles, bearing bodies and weather from the video and compare and track them across frames, the video stream is structured; fusing it with the other structured sensing data for analysis effectively improves the accuracy and robustness of abnormal-event detection and allows emergencies in the scene to be monitored in real time.
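A minimal sketch of the fusion step follows: the person count extracted from the video is compared with the area's capacity, and each sensor reading is compared with its early warning threshold, the exceedances becoming the abnormal sensing data that shape the warning. The capacity figure and thresholds are placeholders, not values from the patent.

```python
def fuse_and_warn(person_count: int, capacity: int, readings: dict, thresholds: dict) -> dict:
    warning = {"risks": [], "details": {}}
    if person_count > capacity:                          # video-derived density check
        warning["risks"].append("personnel_density_overload")
        warning["details"]["person_count"] = person_count
    for name, value in readings.items():                 # sensor-derived threshold checks
        if value > thresholds.get(name, float("inf")):
            warning["risks"].append(f"{name}_exceeded")
            warning["details"][name] = value
    return warning

print(fuse_and_warn(520, 500, {"smoke": 0.9, "water_level": 0.1},
                    {"smoke": 0.5, "water_level": 1.0}))
```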
In some embodiments, the emergency detection module 130 transmits the warning information to the central data platform 200 and the emergency plan matching module 140 through the communication positioning module 110.
In some embodiments, the emergency plan matching module 140 uses data processing and text extraction techniques to collect, from the Internet, common emergencies in personnel-dense areas and their corresponding emergency plans. The video data and sensing data of each existing emergency are extracted and integrated to generate the emergency plan library; in other words, the library contains a number of existing emergencies and their corresponding digitized plans. A digitized plan uses computer and network information technology to turn a comprehensive, specific and highly targeted emergency plan into a visual and efficient one, so that compiling and executing the plan reach a standardized, visualized level, the compilation period is shortened, and the plan's operability is improved. Specifically, based on structured data such as the abnormal video data and abnormal sensing data of the target emergency, the library is screened for existing emergencies that match them: the structured data of each existing emergency are compared with those of the target emergency, existing emergencies whose fields match and whose attribute values are similar are selected, tasks are classified, and data analysis and storage techniques are applied. The emergency plan corresponding to the matched existing emergency is then retrieved as the alternative plan, which allows plans to be called up quickly and helps commanders and on-site rescuers handle emergencies in personnel-dense areas rapidly. Finally, the alternative plan is adjusted, based on the abnormal video data and abnormal sensing data of the target emergency, to obtain the target plan suited to it. For example, when the video data show that a bearing body has been displaced, the corresponding disaster emergency plan is retrieved in time; when a harmful-substance concentration exceeds its limit, the hazardous-leakage emergency plan is retrieved; and when the water level exceeds its threshold, the flood situation is assessed by fusing it with the weather information extracted from the video, after which the flood-disaster emergency plan is retrieved.
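The retrieval examples above can be expressed as a small rule table; the signal names and plan titles are assumptions used only to illustrate the mapping from abnormal data to the plan that is called up.

```python
PLAN_RULES = {
    "bearing_body_displacement": "geological/structural disaster emergency plan",
    "harmful_gas_exceeded":      "hazardous leakage emergency plan",
    "water_level_exceeded":      "flood disaster emergency plan",
}

def retrieve_plans(abnormal_signals: list) -> list:
    """Map each abnormal signal to the emergency plan to be called up."""
    return [PLAN_RULES[s] for s in abnormal_signals if s in PLAN_RULES]
```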
Fig. 5 is a schematic diagram of matching a target emergency with a target plan according to an exemplary embodiment of the present application.
As shown in fig. 5, when the early warning information indicates that the target emergency is a personnel risk, the target plan may cover organizing and commanding rescue forces, communication alarms and information release, optimized organization and allocation of forces, preparation and allocation of rescue supplies, preparation and allocation of medical rescue forces, and safe evacuation of people in the affected area. When it indicates a vehicle risk, the target plan may cover communication alarms, information release, and preparation and allocation of rescue supplies. When it indicates a disaster risk, the target plan may cover organizing and commanding rescue forces, optimized organization and allocation of forces, preparation and allocation of rescue supplies, preparation and allocation of medical rescue forces, and disaster control and elimination measures. When it indicates an on-site condition or a traffic condition, the target plan may cover the safe evacuation of people in the monitored area.
FIG. 6 is an information interaction diagram of a visualization interaction module according to an exemplary embodiment of the present application.
As shown in fig. 6, the visual interaction module 150 serves as the information exchange module between the intelligent monitoring terminal 100 and the central data platform 200. Its audio device provides audio broadcasting and on-site sound pickup, helping to establish a communication bridge between on-site rescuers and commanders; at the same time, with the camera device 122 of the Internet of Things perception acquisition module 120, commanders at the central data platform 200 can follow how the emergency in the personnel-dense area develops and changes, and guide on-site rescuers in rapid emergency response and disposal.
In some embodiments, the emergency equipment storage and management module 160 stores and manages rapid-response emergency equipment and rescue supplies, including earthquake disaster equipment 161, flood disaster equipment 162 and fire equipment 163, so that when a target emergency occurs, on-site rescuers can immediately be provided with the disposal equipment and rescue supplies required by the target emergency and the target plan.
In some embodiments, the visual remote command module 210 communicates with the visual interaction module 150 to receive the audio of the on-site rescuers and to transmit the commanders' audio back to them. In addition, the visual real-time monitoring module 220 obtains the sensing data and original video of the Internet of Things perception acquisition module 120 through the communication positioning module 110, and receives the target video data and early warning information of the intelligent emergency detection module 130, so that commanders can combine the real-time situation of the target emergency with the early warning information to assist and direct on-site rescuers at the first opportunity. The visual real-time monitoring module 220 can also display the coordinates of the intelligent monitoring terminal 100 on a map and show a graphic symbol for the emergency at that position for monitoring.
According to the emergency disposal system for emergencies in personnel-dense areas of the present application, target video data and sensing data are combined, overcoming the drawbacks of single-means monitoring and the lack of fused data analysis; the emergency equipment storage and management module gives on-site rescuers a way to handle the target emergency quickly; and, with the visual interaction module, commanders can direct rapid disposal of the target emergency, improving the timeliness of emergency response and effectively addressing the difficulties of real-time monitoring, early warning and rapid disposal in personnel-dense areas.
Fig. 7 is a flowchart of an emergency handling method for a dense personnel area emergency event according to an exemplary embodiment of the present application.
As shown in fig. 7, the present application also provides an emergency disposal method for emergencies in personnel-dense areas, which may include: step S1, collecting various kinds of sensing data and the original video of the personnel-dense area; step S2, processing the original video to obtain target video data, performing fusion analysis on the target video data and the sensing data to determine a target emergency, and generating early warning information describing it; step S3, building an emergency plan library of the personnel-dense area and, according to the early warning information, screening it for the emergency plan matching the target emergency as the target plan; and step S4, transmitting the early warning information and the target plan to a central data platform so that commanders at the central data platform and on-site rescuers can carry out coordinated rescue.
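An end-to-end sketch of steps S1 to S4 is given below, assuming module objects with the indicated methods; these names are illustrative and only show how the steps hand data to one another.

```python
def handle_emergency(sensing_module, detection_module, plan_module, central_platform) -> None:
    sensing_data, original_video = sensing_module.collect()            # S1: acquire data and video
    warning = detection_module.analyze(original_video, sensing_data)   # S2: fuse and detect the event
    if warning["risks"]:
        target_plan = plan_module.match(warning)                       # S3: screen the plan library
        central_platform.push(warning, target_plan)                    # S4: hand off for joint rescue
```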
In some embodiments, the method further comprises collecting the position information of the target emergency.
Processing the original video to obtain target video data, performing fusion analysis on the target video data and the sensing data to determine the target emergency and generating early warning information describing it includes: dividing the current frame image of the original video into grids, and identifying the category and determining the position of the target object in each grid to obtain category data and initial position data for each target object; using a built-in tracker to predict the position of each target object in the frame following the current frame to obtain a predicted value, and matching and correcting the predicted value against the observed value of that next frame to obtain dynamic position data of the target object, where the target objects include dynamic targets and static targets, and the dynamic position data are a sequence of consecutive positions of the target object; integrating the category data, initial position data and dynamic position data of each target object into structured target video data; screening the target video data for abnormal video data and screening all sensing data for abnormal sensing data that exceed early warning thresholds; and determining the target emergency from the abnormal video data and abnormal sensing data and generating early warning information describing it, where the early warning information covers personnel risks, vehicle risks, weather risks, disaster risks, on-site conditions and traffic conditions.
In some embodiments, building the emergency plan library of the personnel-dense area and screening it, according to the early warning information, for the emergency plan matching the target emergency as the target plan includes: using data processing and text extraction techniques to collect, from the Internet, existing emergencies in personnel-dense areas and their corresponding emergency plans; extracting and integrating the video data and sensing data of each existing emergency to generate the emergency plan library; using the early warning information of the target emergency as keywords to screen the library for an existing emergency matching the target emergency, retrieving the emergency plan corresponding to that existing emergency and taking it as an alternative plan; and adjusting the alternative plan, based on the abnormal video data and abnormal sensing data of the target emergency, to finally obtain the target plan suited to the target emergency.
In some embodiments, the method further comprises collecting audio information of the area where the target emergency occurs and transmitting the audio information and the original video to the central data platform.
In some embodiments, the method further comprises managing the emergency disposal equipment and rescue supplies stored in the emergency equipment storage and management module.
According to the emergency disposal method for emergencies in personnel-dense areas of the present application, target video data and sensing data are combined, overcoming the drawbacks of single-means monitoring and the lack of fused data analysis; the emergency equipment storage and management module gives on-site rescuers a way to handle the target emergency quickly; and, with the visual interaction module, commanders can direct rapid disposal of the target emergency, improving the timeliness of emergency response and effectively addressing the difficulties of real-time monitoring, early warning and rapid disposal in personnel-dense areas.

Claims (6)

1. An emergency treatment system for emergency events in a dense area, comprising:
the system comprises an Internet of things perception acquisition module, a video acquisition module and a video processing module, wherein the Internet of things perception acquisition module is used for acquiring various sensing data and original videos of a person-dense area;
the intelligent emergency detection module is used for processing the original video, obtaining target video data, performing fusion analysis on the target video data and the multiple kinds of sensing data, determining a target emergency, and generating early warning information containing the target emergency, wherein the intelligent emergency detection module executes the following steps: performing grid division on a current frame image in the original video, and performing category identification and position determination on a target object in each grid to obtain category data and initial position data of each target object; the built-in tracker is used for predicting the position of each target object in the next frame image of the current frame image to obtain a predicted value, and matching and correcting the predicted value and the observed value of the next frame image to obtain dynamic position data of the target object, wherein the target object comprises a dynamic target and a static target, and the dynamic position data are a plurality of continuous position data of the target object; integrating the category data, the initial position data and the dynamic position data of each target object to obtain structured target video data; screening abnormal video data in the target video data, and screening abnormal sensing data exceeding an early warning threshold value from all the sensing data; determining the target emergency according to the abnormal video data and the abnormal sensing data, and generating early warning information containing the target emergency, wherein the early warning information comprises personnel risks, vehicle risks, weather risks, disaster risks, field conditions and traffic conditions;
the emergency plan matching module is used for establishing an emergency plan library of a dense personnel area, and screening an emergency plan matched with the target emergency event in the emergency plan library as a target plan according to the early warning information, and the emergency plan matching module executes the steps of: acquiring the existing emergency and the corresponding emergency plan of the personnel-intensive area in the Internet by utilizing a data processing and character extraction technology; extracting and integrating video data and sensing data of each existing emergency to generate an emergency plan library; screening the existing emergency matched with the target emergency in the emergency plan library by taking the early warning information of the target emergency as a keyword, calling an emergency plan corresponding to the existing emergency, and taking the emergency plan as an alternative plan; based on the abnormal video data and the abnormal sensing data of the target emergency, the alternative plan is adjusted in a targeted manner, and finally the target plan suitable for the target emergency is obtained; and
a central data platform comprising a visual remote command module configured to provide the early warning information and the target plan to commanders, so as to realize linkage rescue between the commanders and on-site rescue staff.
2. An emergency treatment system for emergency events in dense personnel areas according to claim 1, further comprising:
a communication positioning module, configured to acquire position information of the target emergency and to send the various sensing data, the target video data and the early warning information to the central data platform.
3. An emergency treatment system for emergency events in dense personnel areas according to claim 1, further comprising:
a visual interaction module provided with an audio device and configured to transmit, to the central data platform, audio information of the area where the target emergency occurs, as collected by the audio device, together with the original video collected by the Internet of Things perception acquisition module.
4. An emergency handling method for emergency events in a dense personnel area, comprising:
collecting various sensing data and original videos of a person-dense area;
processing the original video to obtain target video data, performing fusion analysis on the target video data and the multiple kinds of sensing data to determine a target emergency and generate early warning information containing the target emergency, wherein the method specifically comprises the following steps: performing grid division on a current frame image in the original video, and performing category identification and position determination on a target object in each grid to obtain category data and initial position data of each target object; the built-in tracker is used for predicting the position of each target object in the next frame image of the current frame image to obtain a predicted value, and matching and correcting the predicted value and the observed value of the next frame image to obtain dynamic position data of the target object, wherein the target object comprises a dynamic target and a static target, and the dynamic position data are a plurality of continuous position data of the target object; integrating the category data, the initial position data and the dynamic position data of each target object to obtain structured target video data; screening abnormal video data in the target video data, and screening abnormal sensing data exceeding an early warning threshold value from all the sensing data; determining the target emergency according to the abnormal video data and the abnormal sensing data, and generating early warning information containing the target emergency, wherein the early warning information comprises personnel risks, vehicle risks, weather risks, disaster risks, field conditions and traffic conditions;
establishing an emergency plan library of a dense personnel area, and screening an emergency plan matched with the target emergency in the emergency plan library as a target plan according to the early warning information, wherein the method specifically comprises the following steps: acquiring the existing emergency and the corresponding emergency plan of the personnel-intensive area in the Internet by utilizing a data processing and character extraction technology; extracting and integrating video data and sensing data of each existing emergency to generate an emergency plan library; screening the existing emergency matched with the target emergency in the emergency plan library by taking the early warning information of the target emergency as a keyword, calling an emergency plan corresponding to the existing emergency, and taking the emergency plan as an alternative plan; based on the abnormal video data and the abnormal sensing data of the target emergency, the alternative plan is adjusted in a targeted manner, and finally the target plan suitable for the target emergency is obtained; and
and transmitting the early warning information and the target plan to a central data platform so as to realize linkage rescue of commanders and on-site rescue workers on the central data platform.
5. The emergency treatment method for emergency events in dense personnel areas of claim 4, further comprising:
and collecting the position information of the target emergency.
6. The emergency treatment method for emergency events in dense personnel areas of claim 4, further comprising:
and collecting audio information of the area where the target emergency is located, and transmitting the audio information and the original video to the central data platform.
CN202210021169.XA 2022-01-10 2022-01-10 Emergency disposal system and method for emergency events in personnel dense area Active CN114040003B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210021169.XA CN114040003B (en) 2022-01-10 2022-01-10 Emergency disposal system and method for emergency events in personnel dense area

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210021169.XA CN114040003B (en) 2022-01-10 2022-01-10 Emergency disposal system and method for emergency events in personnel dense area

Publications (2)

Publication Number Publication Date
CN114040003A CN114040003A (en) 2022-02-11
CN114040003B true CN114040003B (en) 2022-04-01

Family

ID=80147401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210021169.XA Active CN114040003B (en) 2022-01-10 2022-01-10 Emergency disposal system and method for emergency events in personnel dense area

Country Status (1)

Country Link
CN (1) CN114040003B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114723133A (en) * 2022-04-07 2022-07-08 华北科技学院(中国煤矿安全技术培训中心) City emergency early warning and evacuation command method and system under emergency
CN115186881B (en) * 2022-06-27 2023-08-01 红豆电信有限公司 Urban safety prediction management method and system based on big data
CN115204752B (en) * 2022-09-13 2022-12-13 深圳市城市公共安全技术研究院有限公司 Emergency handling scheme generation method, system, device and storage medium
CN115549831B (en) * 2022-09-19 2023-06-20 广州市新三雅电子技术有限公司 Emergency broadcasting system with digital intercom encryption function
CN116109154B (en) * 2023-04-13 2023-07-21 广东省电信规划设计院有限公司 Emergency event handling scheme generation method and device
CN116703088A (en) * 2023-06-07 2023-09-05 京彩未来智能科技股份有限公司 Intelligent emergency resource mobilization system based on Internet of things
CN116739870B (en) * 2023-07-14 2024-05-31 大庆恒驰电气有限公司 Emergency system management system and method
CN117746571A (en) * 2023-11-02 2024-03-22 南京鼐云科技股份有限公司 Safety emergency management command dispatching system
CN117273401A (en) * 2023-11-21 2023-12-22 航天科工广信智能技术有限公司 Emergency command method, system and storage medium based on case reasoning and simulation
CN118175270B (en) * 2024-04-10 2024-10-01 江苏苏桦技术股份有限公司 Intelligent command system and method based on visual large screen

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102314639A (en) * 2011-07-05 2012-01-11 万达信息股份有限公司 Visualized dynamic intelligent emergency disposal scheme generation method
CN103456136A (en) * 2013-09-18 2013-12-18 戴会超 Internet of Things framework type system for monitoring and early warning of major accident potential safety hazards of water conservancy and hydropower project
CN107194861A (en) * 2017-07-02 2017-09-22 四川藏区高速公路有限责任公司 A kind of road network operation comprehensive monitoring management platform and method based on 3DGIS+BIM
CN109640032A (en) * 2018-04-13 2019-04-16 河北德冠隆电子科技有限公司 Based on the more five dimension early warning systems of element overall view monitoring detection of artificial intelligence
CN110210697A (en) * 2018-11-21 2019-09-06 北京域天科技有限公司 A kind of emergency communication intelligent emergent DSS
CN110472496A (en) * 2019-07-08 2019-11-19 长安大学 A kind of traffic video intelligent analysis method based on object detecting and tracking
CN111401161A (en) * 2020-03-04 2020-07-10 青岛海信网络科技股份有限公司 Intelligent building management and control system for realizing behavior recognition based on intelligent video analysis algorithm
CN113284024A (en) * 2021-04-28 2021-08-20 四川万信数字科技有限公司 Intelligent emergency management system for intensive personnel place

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050219044A1 (en) * 2004-03-16 2005-10-06 Science Traveller International Inc Emergency, contingency and incident management system and method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102314639A (en) * 2011-07-05 2012-01-11 万达信息股份有限公司 Visualized dynamic intelligent emergency disposal scheme generation method
CN103456136A (en) * 2013-09-18 2013-12-18 戴会超 Internet of Things framework type system for monitoring and early warning of major accident potential safety hazards of water conservancy and hydropower project
CN107194861A (en) * 2017-07-02 2017-09-22 四川藏区高速公路有限责任公司 A kind of road network operation comprehensive monitoring management platform and method based on 3DGIS+BIM
CN109640032A (en) * 2018-04-13 2019-04-16 河北德冠隆电子科技有限公司 Based on the more five dimension early warning systems of element overall view monitoring detection of artificial intelligence
CN110210697A (en) * 2018-11-21 2019-09-06 北京域天科技有限公司 A kind of emergency communication intelligent emergent DSS
CN110472496A (en) * 2019-07-08 2019-11-19 长安大学 A kind of traffic video intelligent analysis method based on object detecting and tracking
CN111401161A (en) * 2020-03-04 2020-07-10 青岛海信网络科技股份有限公司 Intelligent building management and control system for realizing behavior recognition based on intelligent video analysis algorithm
CN113284024A (en) * 2021-04-28 2021-08-20 四川万信数字科技有限公司 Intelligent emergency management system for intensive personnel place

Also Published As

Publication number Publication date
CN114040003A (en) 2022-02-11

Similar Documents

Publication Publication Date Title
CN114040003B (en) Emergency disposal system and method for emergency events in personnel dense area
KR102005188B1 (en) Industrial site safety management system based on artificial intelligence using real-time location tracking and Geographic Information System, and method thereof
CN108039016B (en) A kind of monitoring of subway underground operation space safety and early warning system
CN113706355A (en) Method for building intelligent emergency system of chemical industry park
CN109066971A (en) Intelligent substation fortune inspection managing and control system and method based on whole station business datum
WO2018232846A1 (en) Large-scale peripheral security monitoring method and system
CN106060175A (en) Poisonous or flammable gas monitoring management system based on multilayer combination network
CN109146331A (en) A kind of contaminated site repair intelligence control platform based on technology of Internet of things
CN111178828A (en) Method and system for building fire safety early warning
CN112184773A (en) Helmet wearing detection method and system based on deep learning
CN115393566A (en) Fault identification and early warning method and device for power equipment, storage medium and equipment
CN113793234A (en) Wisdom garden platform based on digit twin technique
CN113762171A (en) Method and device for monitoring safety of railway construction site
CN114118847A (en) Chemical industry park danger source on-line monitoring platform
CN104574729B (en) Alarm method, device and system
CN117392591A (en) Site security AI detection method and device
CN117394346A (en) Operation control method of virtual power plant
CN117440014A (en) Railway perimeter precaution intelligent service platform based on sky and ground integration and working method
CN108650124B (en) WebGIS-based power grid communication early warning system
CN116523492A (en) Hydropower station supervision method and system, electronic equipment and storage medium
CN113535697B (en) Climbing frame data cleaning method, climbing frame control device and storage medium
CN103065194A (en) Method of hazardous waste emergency early warning and response through 3S technologies and system
KR101098043B1 (en) The intelligent surveillance system configuration plan in urban railroad environment
CN116739357B (en) Multi-mode fusion perception city existing building wide area monitoring and early warning method and device
KR102541868B1 (en) Method for early recovery of damage to environmental basic facilitiy in the event of disaster and apparatus thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant