CN117314704A - Emergency event management method, electronic device and storage medium - Google Patents
Emergency event management method, electronic device and storage medium
- Publication number
- Publication number: CN117314704A (application number CN202311274862.9A)
- Authority
- CN
- China
- Prior art keywords
- state
- target
- drainage
- emergency
- preset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06Q50/26—Government or public services
- G06F18/213—Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0442—Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
- G06N3/045—Combinations of networks
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/08—Learning methods
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
- G06V10/40—Extraction of image or video features
- G06V10/764—Image or video recognition or understanding using pattern recognition or machine learning, using classification, e.g. of video objects
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06F2218/08—Feature extraction (aspects of pattern recognition specially adapted for signal processing)
Abstract
The application discloses an emergency event management method, an electronic device and a storage medium. The emergency event management method comprises the following steps: obtaining a forecast result for an emergency event in a target area; when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability, acquiring first image information of a target emergency facility corresponding to the emergency event in the target area; determining an availability status of the target emergency facility based on the first image information; and when the target emergency facility is in an unavailable state, generating corresponding alarm information based on identification information of the target emergency facility. When the probability of the emergency event is high, the usage state of the target emergency facility can be identified in advance by collecting images of the facility, and an alarm can be raised promptly when the facility is unavailable, so that the facility can be handled in a targeted manner and the emergency event can be responded to timely, accurately and quickly before or while it occurs, without manual inspection.
Description
Technical Field
The application relates to the technical field of emergency management, in particular to an emergency event management method, electronic equipment and a storage medium.
Background
A smart campus uses various information technologies and innovative concepts to open up and integrate the systems and services of the campus, improving its operation and maintenance efficiency, optimizing campus management and services, and raising people's quality of life.

Emergency events such as fires, waterlogging, ecological damage and traffic accidents do occur, and in a smart campus attention is usually focused on monitoring and predicting such events; building an efficient emergency management capability is therefore a technical problem to be solved in this field.

Accurate prediction of emergency events provides a strong guarantee for public safety on the campus. Once a prediction has been made, the relevant departments and personnel still have to respond to the event. For example, if campus waterlogging is predicted, one emergency response of the relevant department is to check whether the street drainage outlets of the campus are blocked; if a fire is predicted in a certain area, or is predicted to spread to a certain area, the relevant departments need to ensure that fire passageways and fire-fighting facilities can operate normally.

At present, checking the availability of emergency facilities may require manual inspection. However, manual inspection can hardly check the availability of emergency facilities in a timely and effective way, and it is usually untargeted: every facility has to be inspected one by one. As a result, emergency events cannot be handled accurately, promptly and efficiently, and an event may occur while the facilities have not yet been inspected, so the event cannot be dealt with at the first moment.

Therefore, how to respond to an emergency event in a more targeted way before or while it occurs has become a technical problem to be solved.
Disclosure of Invention
The application provides an emergency event management method, an electronic device and a storage medium, which are used to at least solve the technical problem in the related art of how to respond to an emergency event in a more targeted way before or while it occurs.
According to a first aspect of the present application, there is provided an emergency event management method, comprising: obtaining a forecast result for an emergency event in a target area; when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability, acquiring first image information of a target emergency facility corresponding to the emergency event in the target area; determining an availability status of the target emergency facility based on the first image information; and when the target emergency facility is in an unavailable state, generating corresponding alarm information based on identification information of the target emergency facility.
Optionally, determining the availability status of the target emergency facility based on the first image information includes: inputting the first image information into a pre-trained first target detection model to obtain an available state detection result of the target emergency facility and an environmental characteristic detection result, associated with the target emergency facility, within a preset range of the target emergency facility; and performing weighted fusion on the available state detection result and the environmental characteristic detection result to determine the available state of the target emergency facility.
Optionally, determining the available state of the target emergency facility by weighted fusion of the available state detection result and the environmental characteristic detection result includes: predicting a corresponding predicted usage state of the emergency facility based on the forecast result; determining predicted environmental characteristics affecting the predicted usage state based on the predicted usage state and the available state detection result; comparing the predicted environmental characteristics with the environmental characteristic detection result; determining fusion weights of the available state detection result and the environmental characteristic detection result based on the comparison result; and performing weighted fusion on the available state detection result and the environmental characteristic detection result based on the fusion weights to determine the available state of the target emergency facility.
Optionally, the emergency event includes regional waterlogging, the emergency facility includes a drainage outlet, and the management method further includes: acquiring real-time rainfall information; determining a preset drainage state corresponding to each target drainage outlet based on the real-time rainfall information; and comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information.
Optionally, the preset drainage state includes a preset water accumulation state at each target drainage outlet corresponding to the real-time rainfall information; and comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information includes the following steps: acquiring second image information of the real-time water accumulation state at the target drainage outlet; determining, based on the second image information, the real-time water accumulation state at the target drainage outlet and a water accumulation state prediction result for a future preset duration; and comparing the preset water accumulation state with the real-time water accumulation state and with the water accumulation state prediction result respectively, to determine whether to output blockage alarm information.
Optionally, the preset water accumulation state includes a preset water accumulation depth sequence divided in time order; and comparing the preset water accumulation state with the real-time water accumulation state and with the water accumulation state prediction result respectively to determine whether to output blockage alarm information includes the following steps: comparing the real-time water accumulation state with the preset water accumulation depth sequence to determine the first preset water accumulation depth corresponding to the real-time water accumulation state, and obtaining a first comparison result; determining, based on the real-time water accumulation state, a corresponding water accumulation state prediction result and a second preset water accumulation depth corresponding to the water accumulation state prediction result; comparing the water accumulation state prediction result with the second preset water accumulation depth to obtain a second comparison result; and determining whether to output blockage alarm information based on the first comparison result and/or the second comparison result.
Optionally, the preset drainage state includes a preset drainage amount at each target drainage outlet corresponding to the real-time rainfall information; and comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information includes the following steps: acquiring a drainage vibration signal collected by a vibration monitoring device arranged at the roadside; inputting the drainage vibration signal into a pre-trained drainage identification model to obtain a real-time drainage amount; and comparing the real-time drainage amount with the preset drainage amount to determine whether to output blockage alarm information.
Optionally, the drainage identification model includes a twin feature extraction network and an identification network; and inputting the drainage vibration signal into the pre-trained drainage identification model to obtain the real-time drainage amount includes: inputting the drainage vibration signal into the twin feature extraction network to obtain drainage vibration features, wherein the twin feature extraction network is trained on sample pairs constructed from drainage vibration signals and non-drainage vibration signals; and inputting the drainage vibration features into a pre-trained identification network to obtain a drainage identification result, wherein the identification network is trained on drainage vibration feature samples and corresponding drainage amount labels.
According to a second aspect of the present application, an embodiment of the present application further provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the emergency event management method of any of the first aspects above.
According to a third aspect of the present application, embodiments of the present application further provide a computer readable storage medium storing a computer program which when executed by a processor implements the emergency event management method according to any one of the first aspects described above.
According to the emergency event management method of the application, a forecast result for an emergency event in a target area is obtained; when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability, first image information of a target emergency facility corresponding to the emergency event in the target area is acquired; the availability status of the target emergency facility is determined based on the first image information; and when the target emergency facility is in an unavailable state, corresponding alarm information is generated based on the identification information of the target emergency facility. The probability of occurrence of the emergency event is confirmed from the forecast result; when that probability is high, the usage state of the target emergency facility can be identified in advance by collecting images of the facility, and an alarm is raised promptly when the facility is unavailable. The facility can therefore be handled in a targeted manner without manual inspection, and the corresponding emergency event can be responded to in a more targeted, timely, accurate and rapid way before or while it occurs.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a schematic flow chart of an emergency event management method provided in the present application;
FIG. 2 is a schematic diagram of an emergency event management apparatus provided herein;
fig. 3 is a schematic diagram of an electronic device provided in the present application.
Detailed Description
In order to more clearly illustrate the general concepts of the present application, a detailed description is provided below by way of example in connection with the accompanying drawings.
In order to enable those skilled in the art to better understand the solution of the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of protection of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The application proposes an emergency event management method. Referring to fig. 1, the method may include:
s10, obtaining a forecasting result of the emergency event of the target area. As an exemplary embodiment, the target area may include a campus, or an area in a campus. The emergency event can be fire, waterlogging, ecological environment damage, traffic accident and the like, and the prediction result of the emergency event can be obtained from the prediction results sent by related departments, and by way of example, the weather prediction result can be obtained from a weather department, the earthquake prediction result can be obtained from an earthquake department, the fire prediction situation can be obtained from a fire department, or the fire alarm situation can be used as the prediction result.
S20, when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability, acquiring first image information of a target emergency facility corresponding to the emergency event in the target area. In this embodiment, the forecast result usually carries a probability of occurrence of the emergency event. For example, a weather forecast may contain the predicted rainfall for the current target area and the probability of campus waterlogging; the fire department or the meteorological department may forecast the probability of a fire in a certain area. A campus fire can also be predicted from historical data, and the fire risk of the campus is further influenced by other spatio-temporal data. In the temporal dimension, weather data at different times affect the whole campus over a wide range; for example, the probability of a fire across the campus rises sharply at high temperatures. In the spatial dimension, the function of each area also affects the probability of fire, for example in areas with more dining and commercial services. In addition, population size and electricity consumption can be combined into the fire prediction. After the forecast result is obtained, if the probability of occurrence of the emergency event is high, the first image information of the target emergency facility corresponding to the emergency event is acquired. In this embodiment, the first image information of the corresponding target emergency facility may be collected by roadside cameras. The target emergency facility for a campus waterlogging event may be a drainage outlet, and the target emergency facility for a campus fire event may be a fire passageway or fire-extinguishing equipment.
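For illustration only, the sketch below shows one way such spatio-temporal factors could be combined into an occurrence probability and checked against a preset threshold. The patent does not specify a model; the feature names, weights and threshold here are assumptions.

```python
def fire_probability(temperature_c, is_dining_or_commercial, population_density,
                     electricity_load_ratio):
    """Toy fire-risk score in [0, 1] from spatio-temporal campus features.

    Weights are illustrative assumptions, not values from the application.
    """
    score = 0.0
    score += 0.4 * max(0.0, (temperature_c - 30.0) / 15.0)    # hot weather raises risk sharply
    score += 0.3 * (1.0 if is_dining_or_commercial else 0.0)  # area function matters
    score += 0.2 * min(1.0, population_density / 5000.0)      # people per km^2, capped
    score += 0.1 * min(1.0, electricity_load_ratio)           # fraction of rated electrical load
    return min(1.0, score)

PRESET_PROBABILITY = 0.6  # assumed threshold
forecast = fire_probability(36.0, True, 4200, 0.9)
if forecast > PRESET_PROBABILITY:
    print(f"forecast={forecast:.2f} > {PRESET_PROBABILITY}: acquire facility images")
```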
S30, determining the available state of the target emergency facility based on the first image information. As an exemplary embodiment, the first image information may include image information of the target emergency facility. In this embodiment, the state of the target emergency facility may be identified by an artificial intelligence model; as an exemplary embodiment, it may be identified by a target detection model.
The target detection model may be an R-CNN model, for example a Faster R-CNN model, or a Mask R-CNN model, which adds a segmentation branch on top of the Faster R-CNN image features and thereby further improves detection accuracy.
In an alternative embodiment, an SSD model may also be adopted. SSD detects targets by applying convolution over feature maps of different scales and improves detection accuracy by fusing multi-layer feature maps, so it achieves both high detection speed and high accuracy.
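To make the detection step concrete, here is a minimal sketch that runs a pretrained torchvision Faster R-CNN over the first image and keeps high-confidence boxes. It is a stand-in only: the patent's "first target detection model" would be trained on facility and environment classes of its own, and the image path and score threshold below are assumptions.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Pretrained COCO detector used here only as a placeholder for the facility detector.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect(image_path, score_threshold=0.5):
    """Return boxes, labels and scores above a confidence threshold."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]
    keep = output["scores"] > score_threshold
    return output["boxes"][keep], output["labels"][keep], output["scores"][keep]

# Example (hypothetical file name): detections near a drainage outlet would feed
# the availability / environment analysis described above.
# boxes, labels, scores = detect("drain_camera_frame.jpg")
```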
S40, when the target emergency facility is in an unavailable state, generating corresponding alarm information based on the identification information of the target emergency facility. In this embodiment, when it is determined that an emergency event may occur, if the corresponding target emergency facility is detected to be unavailable, or its available state is insufficient to cope with the current emergency event, alarm information is generated so that the facility can be dealt with in time, preventing the impact of the emergency event from being amplified because the facility is unavailable when the event occurs.
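Putting steps S10 through S40 together, a high-level control flow might look like the sketch below. The helper functions stand for the forecast source, roadside camera, detection model and alarm channel described above; their names, the data layout and the threshold are assumptions, not definitions from the application.

```python
PRESET_PROBABILITY = 0.6  # assumed value for "greater than a preset probability"

def manage_emergency_event(target_area, get_forecast, get_facility_image,
                           estimate_availability, send_alarm):
    """Sketch of S10-S40: forecast -> facility image -> availability -> alarm."""
    forecast = get_forecast(target_area)                      # S10
    if forecast["probability"] <= PRESET_PROBABILITY:
        return
    for facility in forecast["target_facilities"]:            # S20
        image = get_facility_image(facility["camera_id"])
        if not estimate_availability(image, facility):        # S30
            send_alarm(facility["id"],                        # S40
                       f"facility {facility['id']} unavailable for {forecast['event_type']}")
```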
In this application, a forecast result for an emergency event in the target area is obtained; when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability, first image information of a target emergency facility corresponding to the emergency event in the target area is acquired; the availability status of the target emergency facility is determined based on the first image information; and when the target emergency facility is in an unavailable state, corresponding alarm information is generated based on its identification information. The probability of occurrence is confirmed from the forecast result; when it is high, the usage state of the target emergency facility can be identified in advance by collecting images of the facility, and an alarm is raised promptly when the facility is unavailable. The facility can thus be handled in a targeted manner without manual inspection, and the corresponding emergency event can be responded to in a more targeted, timely, accurate and rapid way before or while it occurs.
As an exemplary embodiment, whether the target emergency facility is available depends not only on the state of the facility itself but also on whether, when responding to an emergency event, surrounding environmental factors would turn it from an available state into an unavailable state during use. For example, a drainage outlet may be identified as unblocked in the first image, yet there may be debris or rubbish around it; during drainage, that debris may be washed towards the outlet and block it, leaving it unavailable in use. As another example, the fire safety door of a fire passageway may be open, but the passageway may still be unavailable if it is occupied by other items. Therefore, in this embodiment, whether the target emergency facility is in an available state is further determined from the characteristics of its surrounding environment.
As an exemplary embodiment, the available state of the target emergency facility may be determined as follows: the first image information is input into a pre-trained first target detection model to obtain an available state detection result of the target emergency facility and an environmental characteristic detection result, associated with the facility, within a preset range of the target emergency facility; the available state detection result and the environmental characteristic detection result are then weighted and fused to determine the available state of the target emergency facility.
In addition to detecting the available state of the target emergency facility itself, the state of the environmental characteristics associated with its use within the preset range of the facility is also detected. In this embodiment, the environmental characteristic detection result may include static environmental features, and may further include the influence of those static features on the facility while it is in use. The environmental characteristic detection result may be assigned a weight based on that influence: the greater the influence, the higher the weight. The available state detection result and the environmental characteristic detection result are then weighted and fused to determine the available state of the target emergency facility.
As an exemplary embodiment, different emergency events, and different severities of the same event, place different demands on the target emergency facility. For example, when rainfall is light, a partially blocked drainage outlet may still be acceptable, whereas during a short-term downpour or heavy rainfall the outlet must be completely unblocked, with no surrounding debris or litter, so that it can deliver its maximum drainage capacity. Similarly for a fire: if a fire engine is needed to extinguish the fire and the fire safety door of the fire passageway is open but the passageway is full of debris or parked vehicles, the passageway is effectively unavailable. If no fire engine is needed and the fire can be extinguished with a nearby hydrant, debris or parked vehicles in the passageway have little effect on the response and little effect on evacuation, so the passageway can still be regarded as available. Therefore, the available state of the target emergency facility needs to be confirmed comprehensively from the severity and type of the emergency event and the influence of surrounding environmental factors on the facility's current use, so that manpower and material resources can be saved as much as possible while the availability of the facility is ensured.
In this embodiment, a predicted usage state of the corresponding emergency facility is predicted based on the forecast result; predicted environmental characteristics affecting that usage state are determined based on the predicted usage state and the available state detection result; the predicted environmental characteristics are compared with the environmental characteristic detection result; fusion weights of the available state detection result and the environmental characteristic detection result are determined based on the comparison result; and the available state detection result and the environmental characteristic detection result are weighted and fused based on the fusion weights to determine the available state of the target emergency facility.
For example, when the forecast result indicates that an emergency event such as campus waterlogging or a campus fire may occur, the severity of the event can be obtained from the forecast result, the usage state of the target emergency facility can be predicted based on that severity, and the predicted environmental characteristics of the environmental factors that would affect the use of the facility, such as their type, position and number, can be simulated from the predicted usage state. After the predicted environmental characteristics are obtained, they are compared with the environmental characteristic detection result. In this embodiment, the similarity between the predicted environmental characteristics and the environmental characteristic detection result can be computed; when the similarity is greater than a preset similarity, it is confirmed that the detected environment has a large influence on how the target emergency facility handles the current emergency event. Therefore, the fusion weight of the environmental characteristic detection result may be determined from this similarity, the fusion weight of the available state detection result may be determined from the probability of that detection result, and the two weights are then allocated proportionally, so that the available state detection result and the environmental characteristic detection result are weighted and fused to determine the available state of the target emergency facility.
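As one possible reading of this weighting scheme only, the sketch below computes a cosine similarity between a predicted environmental feature vector and the detected one, turns it into a fusion weight, and combines it with the detector's availability probability. The feature encoding, the weighting rule and the decision threshold are assumptions.

```python
import numpy as np

def fuse_availability(avail_prob, predicted_env_feat, detected_env_feat,
                      similarity_threshold=0.7):
    """Weighted fusion of the available state detection and environment detection.

    avail_prob: detector confidence that the facility itself is usable (0..1).
    The environment term lowers the fused score when the detected surroundings
    closely match environmental features predicted to hinder use.
    """
    p = np.asarray(predicted_env_feat, dtype=float)
    d = np.asarray(detected_env_feat, dtype=float)
    similarity = float(p @ d / (np.linalg.norm(p) * np.linalg.norm(d) + 1e-9))

    # Assumed rule: the more the detected environment resembles the predicted
    # hindrance, the more weight the environment result receives.
    w_env = similarity if similarity > similarity_threshold else 0.2
    w_avail = 1.0 - w_env
    env_usability = 1.0 - similarity            # hindering environment -> low usability
    fused = w_avail * avail_prob + w_env * env_usability
    return fused, fused >= 0.5                  # True -> treat the facility as available

score, available = fuse_availability(0.9, [1.0, 0.2, 0.7], [0.9, 0.1, 0.8])
```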
As an alternative embodiment, the emergency event includes regional waterlogging and the emergency facility includes a drainage outlet. Before the emergency event occurs, the target emergency facility may be checked according to the forecast result; while the event is ongoing, the available state of the facility may change. For example, in a campus waterlogging scenario the drainage outlet may be available before the rain starts, but as the amount of water increases, debris carried by the flow may block the outlet and render it unavailable, so the usage state of the drainage outlet needs to be monitored in real time.
Therefore, in this embodiment the drainage outlet is taken as an example: real-time rainfall information is acquired while the drainage outlet is in use, and a preset drainage state corresponding to each target drainage outlet is determined based on the real-time rainfall information. The drainage state of an outlet corresponding to each rainfall range, for example the corresponding water accumulation height and water accumulation rate within a certain margin, may be preset.
The real-time drainage state of the target drainage outlet is then acquired and compared with the preset drainage state to determine whether to output blockage alarm information. In this embodiment, the real-time drainage state may include the real-time water accumulation height at the outlet, the real-time rate at which the water level rises, and the like. If, in the comparison with the preset drainage state, the real-time water accumulation height exceeds the preset height and/or the real-time rising rate exceeds the preset rising rate, the drainage outlet may be blocked; an alarm signal then needs to be generated and the relevant staff notified to deal with the target drainage outlet in time, so that the accumulated water can be drained promptly and waterlogging prevented.
As an alternative embodiment, the preset drainage state includes a preset water accumulation state at each target drainage outlet corresponding to the real-time rainfall information. Comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information includes the following steps:
Acquiring second image information of the real-time water accumulation state at the target drainage outlet. As an exemplary embodiment, the second image information may include several consecutive frames of image or video. In this embodiment, after the emergency event occurs, image information of the usage state of the target emergency facility may be continuously acquired in real time.
Determining, based on the second image information, the real-time water accumulation state at the target drainage outlet and a water accumulation state prediction result for a future preset duration. As an exemplary embodiment, the water accumulation state over the future preset duration may be predicted by a machine learning model. In this embodiment, continuous time-series features of the current real-time water accumulation state are extracted, and a recurrent neural network predicts the water accumulation state over the future preset duration from those features. For example, an LSTM or GRU model may be adopted: a GRU can extract the cumulative time-series features of how the water accumulation state has changed over a past preset duration, and the prediction for the future preset duration is then made from the extracted features, for example by mapping them directly onto the future preset duration and predicting the future water accumulation state from the current real-time state and the mapped features.
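A minimal PyTorch sketch of this idea follows: a GRU consumes the recent water-depth sequence extracted from the frames and regresses depths for the next few time steps. Layer sizes, horizon and training procedure are illustrative assumptions.

```python
import torch
import torch.nn as nn

class PondingForecaster(nn.Module):
    """GRU that maps a past water-depth sequence to depths over a future horizon."""

    def __init__(self, hidden_size=32, horizon=6):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)   # one depth per future step

    def forward(self, depth_seq):                      # depth_seq: (batch, time, 1)
        _, h_n = self.gru(depth_seq)                   # h_n: (1, batch, hidden)
        return self.head(h_n.squeeze(0))               # (batch, horizon)

model = PondingForecaster()
past_depths = torch.rand(1, 20, 1)                     # 20 recent depth measurements (metres)
future_depths = model(past_depths)                     # untrained, illustration only
```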
Comparing the preset water accumulation state with the real-time water accumulation state and with the water accumulation state prediction result respectively, to determine whether to output blockage alarm information. After the real-time water accumulation state and the prediction result are obtained, each can be compared with the preset water accumulation state for the corresponding time step, so that a blockage of the drainage outlet can be predicted in advance.
The preset water accumulation state includes a preset water accumulation depth sequence divided in time order.
Comparing the preset water accumulation state with the real-time water accumulation state and with the water accumulation state prediction result respectively, and determining whether to output blockage alarm information, includes the following steps:
Comparing the real-time water accumulation state with the preset water accumulation depth sequence to determine the first preset water accumulation depth corresponding to the real-time water accumulation state, and obtaining a first comparison result.
Determining, based on the real-time water accumulation state, a corresponding water accumulation state prediction result and a second preset water accumulation depth corresponding to that prediction result. In this embodiment, a GRU model may be used to extract the cumulative time-series features of how the water accumulation state has changed over a past preset duration, and the water accumulation state over a future preset duration is then predicted from the extracted features, for example by mapping them directly onto the future preset duration and predicting from the current real-time state and the mapped features. The water accumulation state prediction result is aligned in time with the preset water accumulation depth sequence to obtain the second preset water accumulation depth corresponding to the prediction result.
Comparing the water accumulation state prediction result with the second preset water accumulation depth to obtain a second comparison result; that is, the predicted water accumulation state for a given time step is compared with the second preset water accumulation depth for the same time step.
Determining whether to output blockage alarm information based on the first comparison result and/or the second comparison result. In this embodiment, an alarm is raised when, in at least one of the first and second comparison results, the corresponding water accumulation depth exceeds the preset water accumulation depth. The relevant staff are notified to deal with the target drainage outlet in time so that the accumulated water can be drained promptly and waterlogging prevented.
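The decision rule described above can be read as the small function below, which compares the current depth and the predicted depths against the time-aligned preset sequence and raises an alarm if either exceeds it. The alignment and the example values are assumptions.

```python
def blockage_alarm(real_time_depth, preset_depth_now,
                   predicted_depths, preset_future_depths):
    """Return True when a blockage alarm should be output.

    First comparison: current depth vs. the preset depth for the current time slot.
    Second comparison: predicted depths vs. the preset depths for the same future slots.
    """
    first_exceeds = real_time_depth > preset_depth_now
    second_exceeds = any(pred > preset for pred, preset
                         in zip(predicted_depths, preset_future_depths))
    return first_exceeds or second_exceeds

# Example: the current depth is fine, but the forecast overtakes the preset curve.
alarm = blockage_alarm(0.08, 0.10, [0.12, 0.18, 0.25], [0.12, 0.15, 0.20])
```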
When the rainfall is heavy or conditions are poor, the second image may be unclear, which affects the accuracy with which the real-time and predicted drainage states are judged. Therefore, to further ensure an accurate judgement of how the target emergency facility is being used while the emergency event is occurring, in this embodiment the data collected by data-collection terminals in the campus can also be used to confirm the current usage of the target emergency facility.
For example, many passive sensing devices, such as vibration cables, vibration optical fibres and ground-wave sensors, are deployed underground, along underground pipelines or at the roadsides of a campus, to monitor underground facilities or for perimeter security, and many environmental monitoring sensors, such as noise detection sensors, are deployed on the ground. The applicant has found that when these sensors collect data, they usually capture all the data within their detection range, so the data collected by the deployed sensors can be used to monitor the usage state of the target emergency facility. This is unaffected by visibility conditions and makes maximum use of the multi-source data in the smart campus.
In this embodiment, the preset drainage state includes a preset drainage amount at each target drainage outlet corresponding to the real-time rainfall information;

and comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information includes the following steps:
Acquiring a drainage vibration signal collected by a vibration monitoring device arranged at the roadside. In this embodiment, because the intensity of the vibration signal differs for different amounts of water, vibration features can be extracted by filtering the vibration signal. The positions of the drainage outlets are usually fixed, so each outlet may be numbered, assigned an ID, and bound to its position information.
When a drainage outlet discharges water, a corresponding vibration signal is generated. Therefore, in this embodiment, a preliminary screening of the signals can be performed using the propagation direction of the vibration signal and the relative position and distance between the drainage outlet and the vibration monitoring device. Furthermore, since the characteristics of a drainage vibration signal from an outlet are usually fairly fixed, whether a signal is the drainage vibration signal of an outlet can be judged from its propagation direction and from the characteristics of the signal itself.
For drainage feature extraction, the signal propagation direction and the relative positions of the drainage outlet and the vibration monitoring device can be used for the preliminary screening, and the screened signal set is determined. Wavelet analysis is then applied to the original signal to obtain wavelet coefficients at multiple scales, and time-domain and frequency-domain information is extracted simultaneously from the variation of the wavelet coefficients.
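For the wavelet step, a minimal sketch with PyWavelets is shown below: a multi-level decomposition of a pre-screened vibration trace, with per-level energies used as simple time-frequency features. The wavelet family, level count and feature choice are assumptions, and the trace here is synthetic.

```python
import numpy as np
import pywt

def drainage_vibration_features(signal, wavelet="db4", level=4):
    """Multi-scale wavelet coefficients and per-scale energies of a vibration trace."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)      # [cA_n, cD_n, ..., cD_1]
    energies = np.array([np.sum(c ** 2) for c in coeffs])    # coarse time-frequency summary
    return coeffs, energies / (energies.sum() + 1e-12)       # normalised energy per scale

# Synthetic stand-in for a pre-screened roadside vibration recording.
t = np.linspace(0, 1, 2048)
trace = np.sin(2 * np.pi * 40 * t) + 0.3 * np.random.randn(t.size)
_, energy_profile = drainage_vibration_features(trace)
```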
Inputting the drainage vibration signal into a pre-trained drainage identification model to obtain the real-time drainage amount. The drainage vibration signal is input into the twin feature extraction network to obtain drainage vibration features, where the twin feature extraction network is trained on sample pairs constructed from drainage vibration signals and non-drainage vibration signals. Vibration features are extracted by the two parameter-sharing branches of the twin neural network, so that the model can better distinguish drainage vibration features from non-drainage vibration features and the drainage vibration signal can be identified accurately.
Inputting the drainage vibration features into a pre-trained identification network to obtain a drainage identification result, where the identification network is trained on drainage vibration feature samples and corresponding drainage amount labels.
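Read together, the two paragraphs above suggest an architecture along the following lines: a shared-weight (twin) encoder that embeds vibration feature vectors, trainable with a contrastive objective on drainage / non-drainage pairs, followed by a small recognition head. The dimensions, loss and head output are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwinEncoder(nn.Module):
    """Shared-weight branch that embeds a vibration feature vector."""
    def __init__(self, in_dim=64, embed_dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, embed_dim))

    def forward(self, x):
        return self.net(x)

def contrastive_loss(z1, z2, same_label, margin=1.0):
    """Pull drainage/drainage pairs together, push drainage/non-drainage pairs apart."""
    dist = F.pairwise_distance(z1, z2)
    return torch.mean(same_label * dist ** 2 +
                      (1 - same_label) * torch.clamp(margin - dist, min=0) ** 2)

class DrainageRecognizer(nn.Module):
    """Recognition head on top of the twin embedding, e.g. regressing drainage amount."""
    def __init__(self, embed_dim=32):
        super().__init__()
        self.encoder = TwinEncoder(embed_dim=embed_dim)
        self.head = nn.Linear(embed_dim, 1)

    def forward(self, features):
        return self.head(self.encoder(features))

encoder = TwinEncoder()
z_a, z_b = encoder(torch.rand(8, 64)), encoder(torch.rand(8, 64))
loss = contrastive_loss(z_a, z_b, same_label=torch.ones(8))
```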
As another identification approach, the real-time drainage amount can be estimated from the amplitude of the drainage vibration features: the larger the amplitude, the larger the drainage amount. The real-time drainage amount obtained in this way is compared with the preset drainage amount to determine whether to output blockage alarm information. When the real-time drainage amount is smaller than the preset drainage amount, an alarm signal can be output to notify the relevant staff to deal with the target drainage outlet in time, so that the accumulated water can be drained promptly and waterlogging prevented.
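This amplitude-based reading could be as simple as the sketch below: an assumed calibration curve maps vibration amplitude to an estimated discharge, which is then checked against the preset drainage amount. All calibration values and thresholds are illustrative.

```python
import numpy as np

# Assumed calibration pairs: (RMS vibration amplitude, drainage in litres per second).
CALIBRATION_AMPLITUDE = np.array([0.02, 0.05, 0.10, 0.20])
CALIBRATION_DRAINAGE = np.array([2.0, 8.0, 20.0, 45.0])

def amplitude_to_drainage(rms_amplitude):
    """Interpolate drainage from amplitude: larger amplitude -> larger discharge."""
    return float(np.interp(rms_amplitude, CALIBRATION_AMPLITUDE, CALIBRATION_DRAINAGE))

preset_drainage = 15.0                       # expected discharge for the current rainfall (assumed)
real_time_drainage = amplitude_to_drainage(0.04)
if real_time_drainage < preset_drainage:
    print("drainage below preset amount: output blockage alarm")
```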
The embodiment of the application also provides an emergency event management device, as shown in fig. 2, including:
a first obtaining module 21, configured to obtain a prediction result of an emergency event in the target area;
the second obtaining module 22 is configured to obtain first image information of a target emergency facility corresponding to the emergency event in the target area when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability;
a prediction module 23 for determining an availability status of the target emergency facility based on the first image information;
and the alarm module 24 is used for generating corresponding alarm information based on the identification information of the target emergency facility when the target emergency facility is in a non-available state.
It should be noted that the first obtaining module 21 in this embodiment may be used to perform step S10 above, the second obtaining module 22 may be used to perform step S20, the prediction module 23 may be used to perform step S30, and the alarm module 24 may be used to perform step S40.
It should be noted that the above modules correspond to the same examples and application scenarios as the corresponding steps, but are not limited to what is disclosed in the above embodiments. It should also be noted that the above modules may be implemented in software or in hardware as part of the apparatus shown in fig. 3, where the hardware environment includes a network environment.
Thus, according to yet another aspect of the embodiments of the present application, an electronic device for implementing the above emergency event management method is also provided, and it may be a server, a terminal, or a combination thereof.
Fig. 3 is a block diagram of an optional electronic device according to an embodiment of the present application. As shown in fig. 3, the device includes a processor 301, a communication interface 302, a memory 303 and a communication bus 304, where the processor 301, the communication interface 302 and the memory 303 communicate with one another via the communication bus 304.

The memory 303 is configured to store a computer program.

The processor 301 is configured to execute the computer program stored in the memory 303 and implement the following steps:
obtaining a forecast result for an emergency event in a target area;

when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability, acquiring first image information of a target emergency facility corresponding to the emergency event in the target area;

determining an availability status of the target emergency facility based on the first image information;

and when the target emergency facility is in an unavailable state, generating corresponding alarm information based on identification information of the target emergency facility.
Optionally, in this embodiment, the above communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 3, but this does not mean there is only one bus or one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include RAM or non-volatile memory, such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including but not limited to a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments, and this embodiment is not described herein.
It will be appreciated by those skilled in the art that the structure shown in fig. 3 is only illustrative. The device implementing the emergency event management method may be a terminal device, such as a smartphone (for example an Android or iOS phone), a tablet computer, a palmtop computer, a mobile internet device (Mobile Internet Devices, MID), a PAD, and so on. Fig. 3 does not limit the structure of the electronic device; for example, the terminal device may include more or fewer components (such as a network interface or a display device) than shown in fig. 3, or have a configuration different from that shown in fig. 3.
Those of ordinary skill in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing hardware associated with a terminal device. The program may be stored in a computer-readable storage medium, and the storage medium may include a flash disk, a ROM, a RAM, a magnetic disk, an optical disk, and the like.
According to yet another aspect of the embodiments of the present application, a storage medium is also provided. Optionally, in this embodiment, the above storage medium may be used to store program code for performing the emergency event management method.
Alternatively, in this embodiment, the storage medium may be located on at least one network device of the plurality of network devices in the network shown in the above embodiment.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of:
obtaining a forecast result for an emergency event in a target area;

when the probability of occurrence of the emergency event represented by the forecast result is greater than a preset probability, acquiring first image information of a target emergency facility corresponding to the emergency event in the target area;

determining an availability status of the target emergency facility based on the first image information;

and when the target emergency facility is in an unavailable state, generating corresponding alarm information based on identification information of the target emergency facility.
Alternatively, specific examples in the present embodiment may refer to examples described in the above embodiments, which are not described in detail in the present embodiment.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: various media capable of storing program codes, such as a U disk, ROM, RAM, a mobile hard disk, a magnetic disk or an optical disk.
The foregoing embodiment numbers of the present application are for description only and do not represent the superiority or inferiority of the embodiments.
The integrated units in the above embodiments, if implemented in the form of software functional units and sold or used as independent products, may be stored in the above computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.
In the foregoing embodiments of the present application, each embodiment has its own emphasis; for portions not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely exemplary. For example, the division of units is merely a logical function division; in actual implementation there may be another division manner, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
Aspects not described in this application may be implemented by adopting or referring to the prior art.
In this specification, the embodiments are described in a progressive manner; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations may be made to the present application by those skilled in the art. Any modification, equivalent substitution, improvement, or the like made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.
Claims (10)
1. A method of emergency event management, comprising:
obtaining a forecasting result of an emergency event of a target area;
when the forecasting result indicates that the probability of occurrence of the emergency event is greater than a preset probability, acquiring first image information of a target emergency facility corresponding to the emergency event in the target area;
determining an availability status of the target emergency facility based on the first image information;
and when the target emergency facility is in an unavailable state, generating corresponding alarm information based on identification information of the target emergency facility.
2. The emergency event management method of claim 1, wherein the determining the availability status of the target emergency facility based on the first image information comprises:
inputting the first image information into a pre-trained first target detection model to obtain an available state detection result of the target emergency facility and an environmental feature detection result within a preset range of the target emergency facility;
and performing weighted fusion on the available state detection result and the environmental feature detection result to determine the available state of the target emergency facility.
3. The emergency event management method of claim 2, wherein the performing weighted fusion on the available state detection result and the environmental feature detection result to determine the available state of the target emergency facility comprises:
predicting a corresponding predicted usage state of the emergency facility based on the forecasting result;
determining a predicted environmental feature affecting the predicted usage state based on the predicted usage state and the available state detection result;
comparing the predicted environmental feature with the environmental feature detection result;
determining fusion weights of the available state detection result and the environmental feature detection result based on the comparison result;
and performing weighted fusion on the available state detection result and the environmental feature detection result based on the fusion weights to determine the available state of the target emergency facility.
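A minimal numerical sketch of this weighted fusion follows, assuming both detection results are scores in [0, 1] and using cosine similarity as the comparison result; the specific weighting rule is an illustrative assumption, not fixed by the claims.

```python
# Illustrative sketch of the weighted fusion in claims 2-3; scores,
# similarity measure, and weighting rule are assumptions for explanation.
import numpy as np


def fuse_availability(state_score: float,
                      env_score: float,
                      predicted_env: np.ndarray,
                      detected_env: np.ndarray) -> float:
    """Return a fused availability score in [0, 1].

    state_score  -- detection-model score that the facility is usable
    env_score    -- score derived from environmental features around it
    predicted_env / detected_env -- feature vectors being compared
    """
    # Comparison result: cosine similarity between the predicted and detected
    # environmental features, mapped to [0, 1].
    sim = float(np.dot(predicted_env, detected_env) /
                (np.linalg.norm(predicted_env) * np.linalg.norm(detected_env) + 1e-8))
    sim = (sim + 1.0) / 2.0
    # Fusion weights based on the comparison result: the better the detected
    # environment matches expectation, the more the environmental branch counts.
    w_env = 0.2 + 0.6 * sim          # assumed range [0.2, 0.8]
    w_state = 1.0 - w_env
    return w_state * state_score + w_env * env_score


# Example: a confident facility detection with a contradictory environment
# yields a fused score that can then be checked against a preset threshold.
fused = fuse_availability(0.9, 0.4,
                          predicted_env=np.array([1.0, 0.0, 0.5]),
                          detected_env=np.array([0.9, 0.1, 0.6]))
print(f"fused availability score: {fused:.2f}")
```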
4. The emergency event management method of claim 1, wherein the emergency event comprises regional waterlogging and the emergency facility comprises a drainage outlet, the management method further comprising:
acquiring real-time rainfall information;
determining a preset drainage state corresponding to each target drainage outlet based on the real-time rainfall information;
and comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information.
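The sketch below illustrates the idea of claim 4 under assumed rainfall bands and drainage rates; the lookup values and the 0.7 tolerance are invented for explanation only.

```python
# Hypothetical sketch of claim 4: map real-time rainfall to a preset drainage
# state per outlet and compare it with the observed state.
from typing import Dict

# Assumed lookup: rainfall intensity (mm/h) band -> expected drainage rate (L/s).
PRESET_DRAINAGE_BY_RAINFALL = [
    (10.0, 5.0),           # light rain: at least 5 L/s expected per outlet
    (30.0, 20.0),          # moderate rain
    (float("inf"), 45.0),  # heavy rain
]


def preset_drainage_state(rainfall_mm_per_h: float) -> float:
    """Return the preset drainage rate expected for the current rainfall."""
    for upper_bound, expected in PRESET_DRAINAGE_BY_RAINFALL:
        if rainfall_mm_per_h <= upper_bound:
            return expected
    return PRESET_DRAINAGE_BY_RAINFALL[-1][1]


def blockage_alarms(rainfall_mm_per_h: float,
                    realtime_drainage: Dict[str, float],
                    tolerance: float = 0.7) -> Dict[str, bool]:
    """Flag outlets whose real-time drainage falls below the preset state."""
    expected = preset_drainage_state(rainfall_mm_per_h)
    return {outlet: measured < tolerance * expected
            for outlet, measured in realtime_drainage.items()}


# Example: outlet D2 drains far less than expected for heavy rain -> alarm.
print(blockage_alarms(40.0, {"D1": 50.0, "D2": 12.0}))
```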
5. The emergency event management method of claim 4, wherein the preset drainage state comprises a preset water accumulation state at each target drainage outlet corresponding to the real-time rainfall information;
the comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information comprises:
acquiring second image information of a real-time water accumulation state at the target drainage outlet;
determining, based on the second image information, the real-time water accumulation state at the target drainage outlet and a water accumulation state prediction result for a future preset time length;
and comparing the preset water accumulation state with the real-time water accumulation state and with the water accumulation state prediction result respectively, to determine whether to output blockage alarm information.
6. The emergency event management method of claim 5, wherein the preset water accumulation state comprises a preset water accumulation depth sequence divided according to a time sequence;
the comparing the preset water accumulation state with the real-time water accumulation state and with the water accumulation state prediction result respectively to determine whether to output blockage alarm information comprises:
comparing the real-time water accumulation state with the preset water accumulation depth sequence, determining a first preset water accumulation depth corresponding to the real-time water accumulation state, and obtaining a first comparison result;
determining, based on the real-time water accumulation state, a corresponding water accumulation state prediction result and a second preset water accumulation depth corresponding to the water accumulation state prediction result;
comparing the water accumulation state prediction result with the second preset water accumulation depth to obtain a second comparison result;
and determining whether to output blockage alarm information based on the first comparison result and/or the second comparison result.
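One way claims 5 and 6 could be read, as a sketch: the observed depth is checked against the preset depth sequence, a simple linear extrapolation stands in for the image-based prediction of the water accumulation state, and either comparison result can trigger the alarm. All numbers are assumptions.

```python
# Illustrative sketch of claims 5-6: compare observed and predicted water
# accumulation depths against a time-indexed preset depth sequence.
from typing import List, Tuple

# Preset water accumulation depth sequence (cm), divided by elapsed time (min).
PRESET_DEPTH_SEQUENCE: List[Tuple[int, float]] = [
    (10, 2.0), (20, 4.0), (30, 6.0), (60, 9.0),
]


def preset_depth(minutes_elapsed: int) -> float:
    """Return the preset depth from the sequence entry matching elapsed time."""
    for t, depth in PRESET_DEPTH_SEQUENCE:
        if minutes_elapsed <= t:
            return depth
    return PRESET_DEPTH_SEQUENCE[-1][1]


def should_alarm(minutes_elapsed: int,
                 realtime_depth_cm: float,
                 rise_rate_cm_per_min: float,
                 horizon_min: int = 15,
                 margin: float = 1.2) -> bool:
    """Blockage alarm if either comparison result exceeds its preset depth."""
    # First comparison result: real-time depth vs. first preset depth.
    first_exceeded = realtime_depth_cm > margin * preset_depth(minutes_elapsed)
    # Prediction for the future preset time length (linear extrapolation stands
    # in for the image-based water accumulation prediction model).
    predicted_depth = realtime_depth_cm + rise_rate_cm_per_min * horizon_min
    # Second comparison result: predicted depth vs. second preset depth.
    second_exceeded = predicted_depth > margin * preset_depth(minutes_elapsed + horizon_min)
    return first_exceeded or second_exceeded


# Example: shallow now, but rising fast enough to exceed the preset sequence soon.
print(should_alarm(minutes_elapsed=20, realtime_depth_cm=3.5, rise_rate_cm_per_min=0.6))
```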
7. The emergency event management method of claim 4, wherein the preset drainage state comprises a preset drainage amount at each target drainage outlet corresponding to the real-time rainfall information;
the comparing the real-time drainage state of the target drainage outlet with the preset drainage state to determine whether to output blockage alarm information comprises:
acquiring a drainage vibration signal collected by a vibration monitoring device arranged at the roadside;
inputting the drainage vibration signal into a pre-trained drainage identification model to obtain a real-time drainage amount;
and comparing the real-time drainage amount with the preset drainage amount to determine whether to output blockage alarm information.
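As a rough illustration of claim 7, the sketch below replaces the trained drainage identification model with an RMS-energy heuristic so the comparison logic is visible end to end; the gain and tolerance values are assumptions.

```python
# Hypothetical sketch of claim 7: estimate real-time drainage from a roadside
# vibration signal and compare it with the preset drainage amount.
import numpy as np


def estimate_drainage(vibration_signal: np.ndarray,
                      gain: float = 120.0) -> float:
    """Stand-in for the drainage identification model: RMS energy -> L/s."""
    rms = float(np.sqrt(np.mean(np.square(vibration_signal))))
    return gain * rms


def blockage_alarm(vibration_signal: np.ndarray,
                   preset_drainage: float,
                   tolerance: float = 0.7) -> bool:
    """Alarm when the estimated drainage falls well below the preset amount."""
    realtime_drainage = estimate_drainage(vibration_signal)
    return realtime_drainage < tolerance * preset_drainage


# Example with a weak synthetic vibration signal and a heavy-rain preset value.
signal = 0.05 * np.sin(np.linspace(0, 20 * np.pi, 2000))
print(blockage_alarm(signal, preset_drainage=45.0))
```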
8. The emergency event management method of claim 7, wherein the drainage identification model comprises a twin feature extraction network and a recognition network;
the inputting the drainage vibration signal into a pre-trained drainage identification model to obtain the real-time drainage amount comprises:
inputting the drainage vibration signal into the twin feature extraction network to obtain drainage vibration features, wherein the twin feature extraction network is trained on sample pairs constructed from drainage vibration signals and non-drainage vibration signals;
and inputting the drainage vibration features into the pre-trained recognition network to obtain a drainage recognition result, wherein the recognition network is trained on drainage vibration feature samples and corresponding drainage amount labels.
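The claim does not specify network architectures, so the following PyTorch sketch only illustrates the structure it names: a shared-weight (twin) feature extractor over the vibration waveform followed by a recognition head. Layer sizes and the regression output are assumptions.

```python
# Structural sketch of the twin feature extraction network and recognition
# network of claim 8; dimensions and the regression head are assumptions.
import torch
import torch.nn as nn


class TwinFeatureExtractor(nn.Module):
    """One branch of a Siamese (twin) extractor; weights are shared by reusing
    the same module on both signals of a training pair."""
    def __init__(self, feature_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8), nn.Flatten(),
            nn.Linear(32 * 8, feature_dim),
        )

    def forward(self, signal: torch.Tensor) -> torch.Tensor:
        # signal: (batch, 1, samples) drainage vibration waveform
        return self.encoder(signal)


class DrainageRecognizer(nn.Module):
    """Recognition network mapping vibration features to a drainage amount."""
    def __init__(self, feature_dim: int = 64):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(feature_dim, 32), nn.ReLU(),
                                  nn.Linear(32, 1))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.head(features)


# Inference path of claim 8: signal -> twin extractor -> recognition network.
extractor, recognizer = TwinFeatureExtractor(), DrainageRecognizer()
vibration = torch.randn(1, 1, 2048)            # placeholder roadside recording
drainage_amount = recognizer(extractor(vibration))
print(drainage_amount.shape)                   # torch.Size([1, 1])
```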
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the emergency event management method of any of claims 1 to 8.
10. A computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the emergency event management method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311274862.9A CN117314704B (en) | 2023-09-28 | 2023-09-28 | Emergency event management method, electronic device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117314704A true CN117314704A (en) | 2023-12-29 |
CN117314704B CN117314704B (en) | 2024-04-19 |
Family
ID=89280719
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311274862.9A Active CN117314704B (en) | 2023-09-28 | 2023-09-28 | Emergency event management method, electronic device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117314704B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105311783A (en) * | 2014-07-29 | 2016-02-10 | 北京市劳动保护科学研究所 | Fire disaster early warning method and system |
KR102138340B1 (en) * | 2019-11-25 | 2020-07-27 | 주식회사 이엘 | Autonomous Inspection and Failure Notification System for IoT-based Repair Facilities Using Intelligent Remote Terminal Device |
KR20200116560A (en) * | 2019-04-01 | 2020-10-13 | 주식회사 온품 | System and method for response disaster situations in mountain area using UAS |
KR20210058549A (en) * | 2019-11-14 | 2021-05-24 | 나동호 | rain water storage system using underground parking lot |
CN113076893A (en) * | 2021-04-09 | 2021-07-06 | 太原理工大学 | Highway drain pipe blocking situation sensing method based on deep learning |
KR20210155217A (en) * | 2020-06-15 | 2021-12-22 | 한국전력공사 | System for predicting and preventing disaster for electric power facilities and operation method thereof |
CN113865644A (en) * | 2021-09-17 | 2021-12-31 | 温州市数据管理发展集团有限公司 | Drainage facility operation monitoring system in place |
CN114444976A (en) * | 2022-03-16 | 2022-05-06 | 湖南乾惕建设工程有限公司 | Municipal drainage management method and system for sponge city |
KR102418897B1 (en) * | 2021-11-30 | 2022-07-08 | 주식회사 이엘 | A access control system for flooding expectation area |
CN115638850A (en) * | 2022-09-21 | 2023-01-24 | 瑞芯微电子股份有限公司 | Method for flood reduction, manhole cover device, electronic device and storage medium |
CN115660160A (en) * | 2022-10-18 | 2023-01-31 | 江苏鸿利智能科技股份有限公司 | Intelligent optimization system and method for sewage pipe network drainage |
WO2023050637A1 (en) * | 2021-09-30 | 2023-04-06 | 上海仙途智能科技有限公司 | Garbage detection |
Also Published As
Publication number | Publication date |
---|---|
CN117314704B (en) | 2024-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101095528B1 (en) | An outomatic sensing system for traffic accident and method thereof | |
CN105868845A (en) | Risk pre-warning method and apparatus | |
JP7471470B2 (en) | Anomaly detection based on statistical image processing to prevent cable cuts | |
CN112907901B (en) | Tunnel monitoring entity risk early warning assessment model | |
KR100993205B1 (en) | System and method for detecting disaster occurrence | |
KR102583447B1 (en) | Flood prevention program for enhancing effectiveness of flood prevention, and flood prevention management system for parking/underground road therewith | |
CN118225179B (en) | Intelligent well lid monitoring method and system based on urban drainage | |
CN115146878A (en) | Commanding and scheduling method, system, vehicle-mounted equipment and computer readable storage medium | |
CN211630273U (en) | Intelligent image recognition device for railway environment | |
CN117314704B (en) | Emergency event management method, electronic device and storage medium | |
US20220228356A1 (en) | Actionable stormwater services platform | |
CN112907900B (en) | Slope monitoring entity risk early warning assessment model | |
CN114299739A (en) | Target road section passing method, system, storage medium and electronic device | |
KR102614856B1 (en) | System and method for predicting risk of crowd turbulence | |
KR102031087B1 (en) | Big Data-Based City Control System For Clean City Implementation | |
Wang et al. | A deep reinforcement learning evolution of emergency state during traffic network | |
CN114495028A (en) | Vehicle fake plate identification method and device, electronic equipment and storage medium | |
CN110020223B (en) | Behavior data analysis method and device | |
Zifeng | Macro and micro freeway automatic incident detection (AID) methods based on image processing | |
KR101885264B1 (en) | System for Service Support Control of Tax Nonpayment Vehicles Using CCTV Image of Smart City Control Center and Method for Management thereof | |
KR102641428B1 (en) | Inundation induced disaster prediction and alarming method using CCTV images | |
CN117314391B (en) | Operation and maintenance job management method and device, electronic equipment and storage medium | |
CN117312828B (en) | Public facility monitoring method and system | |
Dia et al. | Development of artificial neural network models for automated detection of freeway incidents | |
KR100706599B1 (en) | Parking/stopping vehicles detection system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||