CN114444386A - Fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning


Info

Publication number
CN114444386A
Authority
CN
China
Prior art keywords
fire
bim
floor
information
early warning
Prior art date
Legal status
Pending
Application number
CN202210058118.4A
Other languages
Chinese (zh)
Inventor
李仲元
孔宪扬
施由宁
潘震
范仁宽
张正学
张炜
Current Assignee
China Energy Engineering Group Anhui Electric Power Design Institute Co Ltd
Original Assignee
China Energy Engineering Group Anhui Electric Power Design Institute Co Ltd
Priority date
Application filed by China Energy Engineering Group Anhui Electric Power Design Institute Co Ltd filed Critical China Energy Engineering Group Anhui Electric Power Design Institute Co Ltd
Priority to CN202210058118.4A
Publication of CN114444386A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/20 Design optimisation, verification or simulation
    • G06F30/27 Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/26 Government or public services
    • G06Q50/265 Personal security, identity or safety
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2119/00 Details relating to the type or aim of the analysis or the optimisation
    • G06F2119/02 Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]


Abstract

The invention provides a fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning. Image information at different positions in a monitoring area is collected and preprocessed to form a first data set; the image information of the first data set is input into a fire recognition neural network model; whether a fire has occurred is judged based on a first confidence parameter; if a fire has occurred, a fire alarm is raised; the flame information corresponding to the fire is identified; and the flame information is input into a BIM-based floor slab fire-resistance damage model to predict the degree of damage to the floor slab bearing capacity. According to the invention, fire monitoring is carried out in real time based on image information and a deep learning algorithm, and early warning is given for potential threats such as combustibles; meanwhile, the fire early warning information is merged into the BIM building information model, so that the fire position and the associated floor damage threat are predicted and warned, and precise fire-fighting measures are taken based on BIM, further safeguarding life and property.

Description

Fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning
Technical Field
The invention relates to the technical field of fire early warning and fire fighting, in particular to a fire early warning method and system based on BIM and deep learning.
Background
Fire poses a huge potential threat to the safety of people's lives and property, and fire early warning and fire-fighting systems are an important component of a building security system. A traditional fire monitoring system judges the occurrence of a fire from data collected by temperature sensors, smoke sensors and the like; its response is slow, its recognition efficiency for distant flames, especially smokeless flames, is low, and the accuracy and timeliness of fire monitoring cannot be guaranteed. Meanwhile, when a fire occurs, a traditional fire-fighting system opens the sprinkler devices of the whole building or the whole floor: the fire cannot be extinguished precisely, unnecessary property loss may be caused to other areas when the fire is small, and no escape path can be planned to guide on-site personnel to evacuate. The fire early warning and fire-fighting system based on BIM and deep learning provided herein performs fire monitoring in real time from image information and a deep learning algorithm, gives early warning for potential threats such as combustibles, and merges the fire early warning information into the BIM building information model; the three-dimensional visualization and collaborative characteristics of the BIM model are fully used to plan escape routes that help personnel evacuate and to extinguish the fire precisely, safeguarding the security of lives and property.
Disclosure of Invention
In view of the above, the invention provides a fire early warning method and system based on BIM and deep learning, which perform fire monitoring in real time from image information and a deep learning algorithm, give early warning for potential threats such as combustibles, and merge the fire early warning information into the BIM building information model, thereby solving the technical problems of imprecise fire extinguishing and incomplete identification of potential floor slab safety hazards.
The technical scheme of the invention is as follows:
a fire early warning and post-disaster floor damage prediction method based on BIM and deep learning comprises the following steps:
acquiring image information of different positions in a monitoring area, and preprocessing the image information to form a first data set;
inputting the image information of the first data set into a fire recognition neural network model to obtain a first confidence coefficient parameter;
judging whether a fire disaster occurs or not based on the first confidence coefficient parameter; if a fire disaster happens, a fire alarm is carried out;
identifying flame information corresponding to the fire; and inputting the flame information into a BIM-based floor plate fire-resistant damage model, and predicting the damage degree of the floor plate bearing capacity.
Preferably, the inputting of the image information into a fire recognition neural network model to obtain a first confidence parameter includes:
compressing and cropping the original images in the first data set to 256×256×3, then performing convolution through two convolution layers C1 and C2, each with 3×3 kernels, 100 channels, stride 1 and padding 1;
passing the output through pooling layer P1, which takes the maximum value within each window, with 2×2 pooling kernels and stride 2; then performing convolution through three convolution layers C3, C4 and C5, each with 3×3 kernels, 128 channels, stride 1 and padding 1;
passing the output through pooling layer P2 (2×2 pooling kernels, stride 2); then performing convolution through three convolution layers C6, C7 and C8, each with 3×3 kernels, 200 channels, stride 1 and padding 1;
passing the output through pooling layer P3 (2×2 pooling kernels, stride 2); then performing convolution through three convolution layers C9, C10 and C11, each with 3×3 kernels, 400 channels, stride 1 and padding 1;
passing the output through pooling layer P4 (2×2 pooling kernels, stride 2); then performing convolution through four convolution layers C12, C13, C14 and C15, each with 3×3 kernels, 400 channels, stride 1 and padding 1;
passing the output through pooling layer P5 (2×2 pooling kernels, stride 2); then through fully connected layer F1 with 4000 channels, fully connected layer F2 with 1000 channels, fully connected layer F3 with 100 channels and fully connected layer F4 with 5 channels; finally normalizing the output through a softmax exponential layer to output the confidences Rn (n = 1, 2, ..., 5).
Preferably, the first confidence parameter is the output matrix Rn;
the judging whether a fire has occurred based on the first confidence parameter includes: if the maximum value of the output matrix Rn is max(Rn) = R1, max(Rn) = R2 or max(Rn) = R3, it is determined that a fire has occurred; if the maximum value is max(Rn) = R4 or max(Rn) = R5, it is determined that no fire has occurred.
Preferably, the fire recognition neural network model comprises a training process;
the training process comprises a weight adjustment stage, whose task is to adjust the parameters according to the deviation value of a loss function until the model converges and the target prediction accuracy is reached; the loss function is:
Rule 1: L = L_CE(y_i, y_i*) + α·L_MSE(y_i, y_i*)
Rule 2: L_CE = -(1/N) Σ_{i=1}^{N} Σ_{c=1}^{C} y_{ic} · log(y_{ic}*)
Rule 3: L_MSE = (1/N) Σ_{i=1}^{N} (y_i - y_i*)²
where L is the loss function and L ≥ 0, y is the true distribution, i.e. the true label of the sample, y* is the predicted distribution, L_CE is the cross-entropy loss function, L_MSE is the mean-square-error loss function, α is a weight that adjusts the proportion of the mean-square-error term within the whole loss function, N is the number of training batches, and C is the number of categories.
Preferably, the preprocessing the image information includes preprocessing the image information by using a variable bounding box.
Preferably, the flame information includes: the real length a, real width b and real height c of the flame, and the distance x between the flame and the floor slab; the floor slab characteristic parameters and the flame characteristic parameters are combined to obtain a merged parameter X and, using feature scaling, the parameters are scaled so that their values lie within [-1, 1].
Preferably, the inputting of the flame information into the BIM-based floor slab fire-resistance damage model to predict the degree of damage to the floor slab bearing capacity uses a linear regression model:
Rule 1: f(X) = w^T·X + b
Rule 2: D = f(X)/Q
where f(X) is the predicted floor slab bearing capacity, w^T is the weight parameter, X is the merged parameter, b is the offset, Q is the initial bearing strength of the floor slab before the fire, and D is the characteristic value of the degree of damage to the bearing capacity.
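As a minimal sketch of Rules 1 and 2, the linear prediction and the damage ratio can be written out directly; the weight vector, offset, merged-parameter values and Q below are illustrative placeholders, not trained or measured values:

```python
import numpy as np

def predict_damage(X, w, b, Q):
    """Rule 1: f(X) = w^T X + b (predicted bearing capacity);
    Rule 2: D = f(X) / Q (damage-degree characteristic value)."""
    f = float(np.dot(w, X) + b)
    return f, f / Q

# Hypothetical merged parameter vector X, already feature-scaled to [-1, 1]
X = np.array([0.2, -0.5, 0.8, 0.1])
w = np.array([10.0, 5.0, -8.0, 2.0])  # illustrative weights
f, D = predict_damage(X, w, b=50.0, Q=100.0)
```

With these placeholder values the model yields a bearing capacity of 43.3 and a damage characteristic value D of 0.433 relative to the initial strength Q.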
Preferably, the method further includes: triggering a fire alarm; triggering the solenoid valves in the fire sprinkler system and opening the sprinkler heads at the fire location to extinguish the fire; and turning on the escape indicator boards along the escape path.
In addition, a fire early warning and fire-fighting system executing the above fire early warning and post-disaster floor damage prediction method based on BIM and deep learning is also provided, characterized in that the system includes: a fire monitoring module based on cameras, infrared thermal imagers, smoke detectors, a video information memory and a smoke information memory; a fire information processing system; an intelligent computer terminal; a BIM model memory; a visualization terminal; a cloud server; a fire-fighting system control terminal; and a fire rescue module based on the fire sprinkler system, fire alarms, broadcasting and escape signs.
According to the above scheme, in the fire early warning and post-disaster floor damage prediction method based on BIM and deep learning, image information of different positions in a monitoring area is collected and preprocessed to form a first data set; the image information of the first data set is input into a fire recognition neural network model to obtain a first confidence parameter; whether a fire has occurred is judged based on the first confidence parameter; if a fire has occurred, a fire alarm is raised; the flame information corresponding to the fire is identified; and the flame information is input into a BIM-based floor slab fire-resistance damage model to predict the degree of damage to the floor slab bearing capacity, so that corresponding early warning can be given. According to the invention, fire monitoring is carried out in real time based on image information and a deep learning algorithm, and early warning is given for potential threats such as combustibles; meanwhile, the fire early warning information is merged into the BIM building information model, so that the flame position and the associated floor damage threat are further predicted and warned, and precise fire-fighting measures are taken based on BIM, further safeguarding life and property.
Drawings
FIG. 1 is a flow chart of a method for fire early warning and post-disaster floor damage prediction based on BIM and deep learning in an embodiment of the invention;
fig. 2 is a structural diagram of a fire early warning and fire fighting system based on BIM and deep learning in the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the method for fire early warning and post-disaster floor damage prediction based on BIM and deep learning according to the present invention includes:
acquiring image information of different positions in a monitoring area, and preprocessing the image information to form a first data set;
inputting the image information of the first data set into a fire recognition neural network model to obtain a first confidence parameter;
judging whether a fire has occurred based on the first confidence parameter; if a fire has occurred, raising a fire alarm;
identifying the flame information corresponding to the fire; and inputting the flame information into a BIM-based floor slab fire-resistance damage model to predict the degree of damage to the floor slab bearing capacity.
Preferably, the inputting of the image information into a fire recognition neural network model to obtain a first confidence parameter includes:
compressing and cropping the original images in the first data set to 256×256×3, then performing convolution through two convolution layers C1 and C2, each with 3×3 kernels, 100 channels, stride 1 and padding 1;
passing the output through pooling layer P1, which takes the maximum value within each window, with 2×2 pooling kernels and stride 2; then performing convolution through three convolution layers C3, C4 and C5, each with 3×3 kernels, 128 channels, stride 1 and padding 1;
passing the output through pooling layer P2 (2×2 pooling kernels, stride 2); then performing convolution through three convolution layers C6, C7 and C8, each with 3×3 kernels, 200 channels, stride 1 and padding 1;
passing the output through pooling layer P3 (2×2 pooling kernels, stride 2); then performing convolution through three convolution layers C9, C10 and C11, each with 3×3 kernels, 400 channels, stride 1 and padding 1;
passing the output through pooling layer P4 (2×2 pooling kernels, stride 2); then performing convolution through four convolution layers C12, C13, C14 and C15, each with 3×3 kernels, 400 channels, stride 1 and padding 1;
passing the output through pooling layer P5 (2×2 pooling kernels, stride 2); then through fully connected layer F1 with 4000 channels, fully connected layer F2 with 1000 channels, fully connected layer F3 with 100 channels and fully connected layer F4 with 5 channels; finally normalizing the output through a softmax exponential layer to output the confidences Rn (n = 1, 2, ..., 5).
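The stage layout described above can be checked with a short sketch that tracks the feature-map size through the network. The grouping of C1-C15 into five stages and the 256×256 input are taken from the description; the channel count for C3-C5 is stated inconsistently in the source and is assumed to be 128 here:

```python
def conv_out(size, kernel=3, stride=1, pad=1):
    # Output size of a convolution layer for one spatial dimension
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    # Output size of a max-pooling layer for one spatial dimension
    return (size - kernel) // stride + 1

# (number of convolution layers, channels) for each of the five stages
stages = [(2, 100), (3, 128), (3, 200), (3, 400), (4, 400)]
size = 256  # input compressed/cropped to 256x256x3
for n_convs, channels in stages:
    for _ in range(n_convs):
        size = conv_out(size)  # 3x3 conv, stride 1, padding 1 keeps the size
    size = pool_out(size)      # 2x2 max pooling, stride 2 halves the size

fc_channels = [4000, 1000, 100, 5]  # F1..F4, then softmax over 5 classes
```

The five pooling stages halve the spatial size each time, 256 → 128 → 64 → 32 → 16 → 8, before the fully connected layers F1-F4.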
Specifically, in the present embodiment, a fire recognition neural network model (FireR-Net) based on deep learning is created and trained on flame image data sets of different combustibles prepared in advance; the model has the capability of determining whether a fire has occurred and of obtaining combustible information when one has, and is stored in the fire information processing system.
Further, flame videos and images of different combustibles are obtained through network downloading, flame photography and recorded flame tests, and an original training data set S is obtained by labelling them according to the combustible involved. The labels fall into 5 classes: smokeless flame, smoky flame, smoke, light source, and background. Image coding compression, image enhancement, image padding, random erasing and image transformation are applied to expand the original training data set S. The training data set S is shuffled and divided into K parts by a cross-validation method; K-1 parts are selected in turn as the training set and the remaining part is used as the validation set SV. Flame videos of different combustibles are obtained by recording flame tests and labelled to obtain a test data set ST. The data set S contains 30000 labelled pictures, and the test data set ST contains 3000 labelled pictures.
Further, when flame videos of different combustibles are obtained by recording flame tests, the videos must be recorded in different background environments, including indoor rooms with different functions, to guarantee the diversity of the images. Image information is extracted from the videos by capturing frames at an interval of d frames:
Rule 1: d > 12n, where n is a positive integer and n ∈ [5, 10]
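As an example, with the smallest allowed n, a capture gap just satisfying Rule 1 selects frames as follows (a minimal sketch; the exact frame-capture logic is not specified in the source):

```python
def frame_indices(total_frames, n=5):
    """Indices of frames to capture with a gap d > 12n (Rule 1), n in [5, 10]."""
    if not (isinstance(n, int) and 5 <= n <= 10):
        raise ValueError("n must be an integer in [5, 10]")
    d = 12 * n + 1  # smallest integer gap strictly greater than 12n
    return list(range(0, total_frames, d))

idx = frame_indices(200, n=5)  # d = 61 -> frames 0, 61, 122, 183
```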
Further, the fire information processing system is trained with the training data set S and the deep learning model, its flame recognition ability is checked with the test data set ST, and if the judgment accuracy is below 95%, the deep learning model is optimized again.
Further, the following are set:
Rule 1: h(n) = f ∗ g, the two-dimensional convolution of the image matrix f with the convolution kernel g
Rule 2: h'(n) = mR(h(n))
Rule 3: mR(x) = max(ax, x), where a < 0.001
Rule 4: P = maxpool_{2×2, stride 2}(h'(n))
Rule 5: F = W·P + b
Rule 6: R_n = e^{F_n} / Σ_{j=1}^{5} e^{F_j}
where f denotes the image information matrix of the original image after compression and cropping, g denotes the convolution layer matrix, h(n) denotes the result of convolving f with g, mR(x) denotes the mRelu activation function, which removes overfitting components from the convolution result and, unlike the traditional Relu function, preserves the local fine features of the image, h'(n) denotes the convolution result after the activation function, and the matrix P denotes the pooling layer result; the pooling layer does not change the number of convolution channels, and its effects are to enlarge the receptive field, prevent overfitting, reduce the computation of the model and increase its speed. The pooling operates on the matrix h'(n) with a step size of 2 and 2×2 pooling kernels. The matrix F denotes the result of the fully connected layers, where W denotes the weight matrix and b the bias matrix, and R_n denotes the confidence obtained by the softmax normalized exponential operation, a common multi-classifier model.
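The activation and pooling of Rules 3 and 4 can be sketched in NumPy. The slope a is taken here as a small positive 0.001, since the bound on a is garbled in the source:

```python
import numpy as np

def mrelu(x, a=0.001):
    # Rule 3: mR(x) = max(a*x, x). With a small positive slope, attenuated
    # negative responses (local fine features) are kept instead of zeroed.
    return np.maximum(a * x, x)

def max_pool_2x2(h):
    # Rule 4: 2x2 max pooling with stride 2 over an (H, W) feature map
    H, W = h.shape
    return h.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

h = np.array([[1.0, -2.0, 3.0, 0.0],
              [4.0, 0.5, -1.0, 2.0],
              [0.0, 1.0, 5.0, -3.0],
              [2.0, -1.0, 0.0, 1.0]])
p = max_pool_2x2(mrelu(h))  # each 2x2 block collapses to its maximum
```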
The detailed structure of the deep-learning-based fire recognition neural network model (FireR-Net) is shown in the following table:
[The layer-by-layer table was rendered as images in the source; it follows the structure described above: convolution layers C1-C15 with 3×3 kernels, pooling layers P1-P5 with 2×2 kernels and stride 2, fully connected layers F1-F4, and a softmax output over 5 classes.]
preferably, the fire recognition neural network model comprises a training process;
the training process comprises a weight adjustment stage, whose task is to adjust the parameters according to the deviation value of a loss function until the model converges and the target prediction accuracy is reached; the loss function is:
Rule 1: L = L_CE(y_i, y_i*) + α·L_MSE(y_i, y_i*)
Rule 2: L_CE = -(1/N) Σ_{i=1}^{N} Σ_{c=1}^{C} y_{ic} · log(y_{ic}*)
Rule 3: L_MSE = (1/N) Σ_{i=1}^{N} (y_i - y_i*)²
where L is the loss function and L ≥ 0, y is the true distribution, i.e. the true label of the sample, y* is the predicted distribution, L_CE is the cross-entropy loss function, L_MSE is the mean-square-error loss function, α is a weight that adjusts the proportion of the mean-square-error term within the whole loss function, N is the number of training batches, and C is the number of categories.
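A minimal NumPy sketch of the combined loss; Rules 2 and 3 are taken in their standard cross-entropy and mean-square-error forms, since the source rendered them as images, and the label/prediction values are illustrative:

```python
import numpy as np

def combined_loss(y_true, y_pred, alpha=0.5, eps=1e-12):
    """Rule 1: L = L_CE + alpha * L_MSE over a batch of one-hot labels.
    y_true, y_pred: arrays of shape (N, C); alpha weights the MSE term."""
    N = y_true.shape[0]
    l_ce = -np.sum(y_true * np.log(y_pred + eps)) / N  # Rule 2 (standard CE)
    l_mse = np.sum((y_true - y_pred) ** 2) / N         # Rule 3 (standard MSE)
    return l_ce + alpha * l_mse

y_true = np.array([[1.0, 0.0], [0.0, 1.0]])  # one-hot true labels
y_pred = np.array([[0.8, 0.2], [0.4, 0.6]])  # softmax outputs
loss = combined_loss(y_true, y_pred, alpha=0.5)
```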
Further, the degree of gradient descent is judged by whether the value of L keeps tending toward 0; the deep learning model propagates forward, and the weight matrix W is adjusted with an RMSprop optimizer.
Rule 1: θ_i ← β·θ_i + (1 − β)·(∂L/∂w_i)²
Rule 2: w_i ← w_i − (γ / √(θ_i + ε))·(∂L/∂w_i)
where w_i is an element of the weight matrix, θ_i is the accumulated quantity used to change w_i, β is a hyper-parameter, γ is the learning rate, and ε is a small constant preventing division by zero. The RMSprop optimizer differs from the traditional SGD (stochastic gradient descent) optimizer in that the effective learning rate adapts continuously, making it easier to find the minimum of the loss function.
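One update step of the optimizer can be sketched as follows, using the standard RMSprop form (assumed, since Rules 1 and 2 were rendered as images in the source); the values of β, γ and the gradient are illustrative:

```python
import numpy as np

def rmsprop_step(w, grad, theta, beta=0.9, gamma=0.01, eps=1e-8):
    """One RMSprop update:
    Rule 1: theta <- beta*theta + (1 - beta) * grad**2
    Rule 2: w     <- w - gamma * grad / sqrt(theta + eps)"""
    theta = beta * theta + (1.0 - beta) * grad ** 2
    w = w - gamma * grad / np.sqrt(theta + eps)
    return w, theta

w = np.array([1.0, -2.0])       # two illustrative weights
theta = np.zeros(2)             # accumulator starts at zero
grad = np.array([0.5, -0.5])    # illustrative gradient
w, theta = rmsprop_step(w, grad, theta)
```

Because the accumulator θ normalizes the step by the recent gradient magnitude, the first step has nearly the same size for both weights despite their different signs.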
Preferably, the first confidence parameter is the output matrix Rn;
the judging whether a fire has occurred based on the first confidence parameter includes: if the maximum value of the output matrix Rn is max(Rn) = R1, max(Rn) = R2 or max(Rn) = R3, it is determined that a fire has occurred; if the maximum value is max(Rn) = R4 or max(Rn) = R5, it is determined that no fire has occurred.
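The decision rule reduces to checking whether the arg-max of the confidence vector falls on one of the three fire-related classes; a minimal sketch with illustrative confidence values:

```python
FIRE_CLASSES = ("smokeless flame", "smoky flame", "smoke")

def is_fire(R):
    """R = [R1, ..., R5]: confidences for smokeless flame, smoky flame,
    smoke, light source, background. Fire iff max(Rn) is R1, R2 or R3."""
    return R.index(max(R)) < len(FIRE_CLASSES)

fire_a = is_fire([0.1, 0.6, 0.1, 0.1, 0.1])      # smoky flame dominates
fire_b = is_fire([0.05, 0.05, 0.1, 0.7, 0.1])    # light source dominates
```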
Preferably, the preprocessing the image information includes preprocessing the image information by using a variable bounding box.
Further, the variable bounding box has three characteristics: base point, size ratio, and width and height. The base point of the variable bounding box slides over the pixels of the image for dense sampling. Assuming the width and height of the input image are W and H, bounding boxes of different sizes are generated with each pixel of the image as a base point; the number of bounding boxes is Num.
Rule 1: X = W·ψ·√s
Rule 2: Y = H·ψ/√s
Rule 3: Num = W·H·(n + m − 1)
where X is the width of the variable bounding box, Y is its height, ψ is one of n size ratios, and s is one of m aspect ratios. By controlling the position of the base point, the value of the size ratio and the value of the aspect ratio, Num bounding boxes of different sizes are obtained.
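The construction above can be sketched as follows. Rules 1 and 2 were rendered as images in the source, so the size formulas are assumed here to follow the common anchor-box scheme, which pairs every size ratio with the first aspect ratio and the first size ratio with every remaining aspect ratio, giving n + m − 1 boxes per base point consistently with Rule 3; the ψ and s values are illustrative:

```python
import math

def anchor_sizes(W, H, psis, ratios):
    """Widths and heights of the variable bounding boxes at one base point:
    X = W*psi*sqrt(s), Y = H*psi/sqrt(s) (assumed forms of Rules 1-2)."""
    combos = ([(p, ratios[0]) for p in psis] +
              [(psis[0], r) for r in ratios[1:]])  # n + m - 1 combinations
    return [(W * p * math.sqrt(s), H * p / math.sqrt(s)) for p, s in combos]

boxes = anchor_sizes(256, 256, psis=[0.75, 0.5, 0.25], ratios=[1.0, 2.0, 0.5])
num_per_pixel = len(boxes)            # n + m - 1 = 5
total = 256 * 256 * num_per_pixel     # Rule 3: Num = W*H*(n + m - 1)
```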
Furthermore, the images framed by the Num bounding boxes are input into the deep-learning-based fire recognition neural network model (FireR-Net) for prediction. For each picture an output matrix Rn is obtained, containing the confidence of each predicted class; if the maximum value of the output matrix is max(Rn) = R4 or max(Rn) = R5, the bounding box is discarded. A threshold λ is also set, and if max(Rn) < λ, the bounding box is discarded.
Furthermore, the similarity between variable bounding boxes is measured with the Jaccard coefficient, i.e. the size of their intersection divided by the size of their union.
Rule 1: J(μ, ν) = |μ ∩ ν| / |μ ∪ ν|
Rule 2: J(μ, ν) ∈ (0, 1)
where J is the intersection-over-union ratio.
Further, local maxima are sought among the images framed by the remaining bounding boxes using NMS (non-maximum suppression), to recognize and localize the flame. The images framed by the remaining bounding boxes are sorted in descending order of confidence to obtain a descending matrix K, and an empty preferred-bounding-box matrix L is created to store the preferred bounding boxes.
Rule 1: compare the intersection-over-union value J of K1 and K2 in the descending matrix K; if J(μ, ν) > κ, discard K2; if J(μ, ν) ≤ κ, keep K2.
Rule 2: compare the intersection-over-union value J of K1 and K3 in the descending matrix K and repeat the operation of Rule 1 until the whole matrix K has been traversed, then move K1 into the preferred-bounding-box matrix L.
Rules 1 and 2 are repeated until all bounding box values in the descending matrix K have been transferred to the preferred-bounding-box matrix L; the preferred bounding boxes in L, together with their base-point coordinates and sizes, constitute the recognized fire information and its location information.
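The two rules amount to greedy non-maximum suppression; a self-contained sketch with illustrative boxes and scores:

```python
def jaccard(box_a, box_b):
    # IoU of axis-aligned boxes (x1, y1, x2, y2)
    iw = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    ih = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    inter = iw * ih
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, kappa=0.5):
    """Greedy NMS per Rules 1-2: sort by confidence (descending matrix K),
    move the current top box into the preferred matrix L, drop the remaining
    boxes whose IoU with it exceeds kappa, and repeat until K is empty."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        top = order.pop(0)
        keep.append(top)  # K1 moves into L
        order = [i for i in order if jaccard(boxes[top], boxes[i]) <= kappa]
    return keep

kept = nms([(0, 0, 2, 2), (0.1, 0, 2.1, 2), (5, 5, 7, 7)],
           scores=[0.9, 0.8, 0.7], kappa=0.5)
```

Here the second box overlaps the first with IoU ≈ 0.9 and is suppressed, while the distant third box survives.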
Preferably, the flame information includes: the real length a, real width b and real height c of the flame, and the distance x between the flame and the floor slab; the floor slab characteristic parameters and the flame characteristic parameters are combined to obtain a merged parameter X and, using feature scaling, the parameters are scaled so that their values lie within [-1, 1].
Further, according to the BIM model information, the size information of the room where the fire occurs is obtained, and the length of the room is A, the width of the room is B, and the height of the room is C.
Further, the real length a, real width b and real height c of the flame are obtained in turn from the size of the preferred bounding box in L by setting a reduction coefficient that maps the bounding-box dimensions to real dimensions.
[The reduction coefficient and Rules 1-3, which define a, b and c from it, were rendered as images in the source and could not be recovered.]
further, the real length, real width and real height of the flame are input into the BIM model, which calculates the distance x from the flame to the floor.
Further, a BIM-based floor fire damage model is established and trained. In a fire, the flame transfers heat to the floor slab by heat conduction, heat radiation and heat convection. Floor slabs are important structural members that directly support the loads in a building room, and they are often severely damaged by the large amount of heat that accumulates at the top of the room in a fire. The damage condition of the floor slab therefore needs to be known promptly after a fire so that reinforcement measures can be taken in time.
Further, temperature sensors and strain sensors are arranged on the floor slab; the slab-bottom temperature value is obtained from the temperature sensors, and the concrete strain, the plate-surface reinforcement strain and the slab-bottom reinforcement strain are obtained from the strain sensors.
Furthermore, in the BIM-based floor fire damage model, the BIM floor model contains the characteristic parameters of the floor slab: the protective layer thickness, the concrete strength grade, the reinforcement strength, the slab span ratio, the concrete strain, the plate-surface reinforcement strain, the slab-bottom reinforcement strain, the plate-surface reinforcement ratio, the slab deflection and the slab-bottom temperature. These serve as parameters of the floor fire damage model, to which the flame characteristic parameters are then added: the real length a, the real width b, the real height c, and the distance x between the flame and the floor slab. The floor slab characteristic parameters and the flame characteristic parameters are merged to obtain a merged parameter X, and feature scaling is applied so that each parameter value lies in [-1, 1].
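The feature scaling described here is standard min-max scaling to [-1, 1]. A minimal sketch, assuming the per-feature bounds `lo` and `hi` are taken from the training data (the patent does not say where the bounds come from):

```python
def feature_scale(X, lo, hi):
    """Scale each component of the merged parameter vector X into [-1, 1]
    using per-feature minimum/maximum bounds lo and hi."""
    return [2.0 * (x - l) / (h - l) - 1.0 for x, l, h in zip(X, lo, hi)]
```

A value at the lower bound maps to -1, at the upper bound to +1, and at the midpoint to 0.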
Further, a data set Z is obtained from the literature and from tests. Using cross-validation, the training data set Z is divided into K parts; K-1 parts are selected in turn as the training set, with the remaining part used as the validation set ZV. Flame videos of different combustibles are recorded in flame tests and labeled to obtain a test data set ZT.
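The K-fold split can be sketched as follows (pure Python, no shuffling; the patent does not specify how samples are assigned to folds):

```python
def k_fold_splits(Z, k):
    """Split data set Z into k folds; yield (train, ZV) pairs, with each
    fold serving once as the validation set ZV and the other k-1 folds
    concatenated as the training set."""
    n = len(Z)
    folds = [Z[i * n // k:(i + 1) * n // k] for i in range(k)]
    for i in range(k):
        zv = folds[i]
        train = [x for j, f in enumerate(folds) if j != i for x in f]
        yield train, zv
```

Every sample appears in exactly one validation set across the k rounds, so all of Z is used for both training and validation.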
Preferably, the flame information is input into the BIM-based floor fire damage model, and the damage degree of the floor bearing capacity is predicted using a linear regression model:
rule 1: f (x) wTX+b
Rule 2: d ═ f (x)/Q
Wherein f (x) is a predicted buildingPlate bearing capacity, wTThe weight parameter is X, the merging parameter is X, the offset is b, the initial bearing strength of the floor before the fire disaster occurs is Q, and the characteristic value of the damage degree of the bearing capacity is D.
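Rules 1 and 2 amount to a dot product plus an offset, followed by normalization by the initial bearing strength. A direct sketch:

```python
def predict_damage(w, X, b, Q):
    """Rule 1: f(X) = w^T X + b  (predicted floor bearing capacity).
    Rule 2: D = f(X) / Q         (bearing-capacity damage-degree value)."""
    f = sum(wi * xi for wi, xi in zip(w, X)) + b
    return f, f / Q
```

With trained weights, X is the scaled merged parameter vector of floor slab and flame features, and D expresses the predicted bearing capacity as a fraction of the pre-fire strength Q.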
Further, the training process of the BIM-based floor fire damage model includes a weight adjustment stage, whose task is to adjust the parameters according to the deviation value of a loss function. A Huber loss function is adopted:
Rule 1: L(y, f(x)) = (1/2)·(y − f(x))²,  for |y − f(x)| ≤ δ

Rule 2: L(y, f(x)) = δ·|y − f(x)| − (1/2)·δ²,  for |y − f(x)| > δ
where L is the loss function, y represents the true value, f (x) represents the predicted value, and δ is the hyperparameter.
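The Huber loss of Rules 1 and 2 in scalar form (`delta` is the hyperparameter δ):

```python
def huber_loss(y, f, delta=1.0):
    """Huber loss: quadratic for residuals |y - f| <= delta (Rule 1),
    linear beyond that (Rule 2), which dampens the effect of outliers."""
    r = abs(y - f)
    if r <= delta:
        return 0.5 * r * r
    return delta * r - 0.5 * delta * delta
```

The two branches meet at |y − f| = δ with matching value and slope, so the loss is smooth there.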
Further, the weight of the model is adjusted by adopting a gradient descent method until convergence.
Rule 1: w ← w − (α/N)·Σ_{i=1}^{N} ∂L(y_i, f(x_i))/∂w
where w is the weight parameter, N is the number of samples in a training batch, and α is the learning rate.
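One batch update consistent with the description, averaging the Huber-loss gradient over the N samples in the batch (the explicit gradient formula is an image in the original, so this follows the standard derivation for the Huber loss):

```python
def gradient_step(w, b, batch, alpha, delta=1.0):
    """One gradient-descent update of (w, b) on a batch of (X, y) pairs
    for the linear model f(X) = w^T X + b under the Huber loss."""
    N = len(batch)
    gw = [0.0] * len(w)
    gb = 0.0
    for X, y in batch:
        f = sum(wi * xi for wi, xi in zip(w, X)) + b
        r = f - y
        # dL/df for the Huber loss: the residual inside delta,
        # a clipped +/- delta outside it
        g = r if abs(r) <= delta else delta * (1.0 if r > 0 else -1.0)
        for i, xi in enumerate(X):
            gw[i] += g * xi
        gb += g
    w = [wi - alpha * gi / N for wi, gi in zip(w, gw)]
    b = b - alpha * gb / N
    return w, b
```

Repeating this step over batches drives the loss down until the weights converge.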
Furthermore, after training of the BIM-based floor fire-resistant damage model is completed, the damage degree of the floor bearing capacity can be predicted.
Further, the identified flame information parameters comprise the real length a, the real width b and the real height c of the flame and the distance x between the flame and the floor slab; the BIM model parameters of the floor slab comprise the protective layer thickness, the concrete strength grade, the reinforcement strength, the slab span ratio λ, the concrete strain ε1, the plate-surface reinforcement strain ε2, the slab-bottom reinforcement strain ε3, the plate reinforcement ratio ρ and the slab deflection γ. These are input into the BIM-based floor fire damage model to obtain the floor bearing capacity damage degree characteristic value D.
Preferably, the fire alarm step further includes: triggering a fire alarm to sound; triggering the electromagnetic valves in the fire-fighting spray system and opening the fire sprinkler heads at the fire location to extinguish the fire; and turning on the escape indicator boards along the escape path.
In addition, as shown in fig. 2, the present invention further provides a fire early warning and fire fighting system implementing the above fire early warning and post-disaster floor damage prediction method based on BIM and deep learning, the system comprising: a fire monitoring module based on cameras, infrared thermal imagers, smoke detectors, a video information memory and a smoke information memory; a fire information processing system; a computer intelligent terminal; a BIM model memory; a visual terminal; a cloud server; a fire-fighting system control terminal; and a fire rescue module based on a fire-fighting spray system, fire alarms, broadcasts and escape signs.
Specifically, the computer intelligent terminal reads the building BIM from the BIM memory, and writes the fire information, the combustion species information and the camera number information obtained by the fire information processing system into the BIM for fusion.
Further, the intelligent terminal of the computer reads the smoke information from the smoke information memory and writes the smoke information into the BIM for fusion.
Furthermore, the computer intelligent terminal carries out fire-fighting system scheduling and escape path planning according to the fused BIM information to form a fire-fighting instruction and transmits the fire-fighting instruction to the fire-fighting system control terminal.
Furthermore, on receiving the instruction from the computer intelligent terminal, the fire-fighting system control terminal opens, via the electromagnetic valves in the fire-fighting spray system, the fire sprinkler heads at the fire location to extinguish the fire, simultaneously opens the sprinkler heads at the smoke-spreading positions, and controls the spray flow rate through the electromagnetic valves.
Furthermore, the fire-fighting system control terminal controls the opening of the escape indication board on the escape path to guide the evacuation of people.
Furthermore, the fire-fighting system control terminal controls the broadcast to be started, and evacuation of people is guided.
Further, the computer intelligent terminal transmits the fused BIM to the visual terminal for displaying, and information contained in the fused BIM comprises a building BIM, fire information, combustion species information, camera number information, smoke detector number information, fire protection system scheduling information and escape path planning information.
Further, the visual terminal transmits the visual information to the cloud, and fire rescue personnel and escaping occupants can check the fire situation and the escape route through mobile devices such as mobile phones.
Further, escaping occupants can connect to the cloud through mobile devices such as mobile phones, mark their own position in the BIM model, and send distress messages; the BIM model will display the position information of the escaping occupants to facilitate rescue by firefighters.
As shown in fig. 2, the fire early warning and fire fighting system based on BIM and deep learning according to the present invention comprises a fire information processing system, a BIM model information memory, a computer intelligent terminal, a fire-fighting system control terminal, a fire-fighting spray system, a video information memory, a smoke information memory, a visual terminal, a cloud server, mobile devices, cameras, smoke detectors, exhaust fans, broadcasts, escape signs and fire smoke exhaust windows. The cameras and smoke detectors acquire image and smoke information in the building in real time. The fire information processing system obtains image data from the video information memory and smoke data from the smoke information memory, and determines, with its built-in deep learning algorithm, whether a fire has occurred, the combustion species information and the fire location. The computer intelligent terminal obtains the building BIM three-dimensional model from the BIM model memory, the fire information and position information from the fire information processing system, and the smoke information from the smoke information memory; it then performs fire-fighting system scheduling and escape path planning, and fuses the fire information, smoke information, fire-fighting system scheduling and escape path planning information into the building BIM three-dimensional model. By reading the scheduling and escape path planning instructions of the computer intelligent terminal, the fire-fighting system control terminal controls the fire alarm and the fire-fighting spray system at the fire location for precise fire extinguishing, opens the fire sprinkler heads at smoke-spreading positions, controls the spray flow rate through the electromagnetic valves, controls the escape indicators and broadcasts to help occupants escape, and opens the fire smoke exhaust windows and exhaust fans. The computer intelligent terminal inputs the fused BIM three-dimensional model into the visual terminal for display. Fire rescue personnel and escaping occupants can check the fire situation and escape route through mobile devices such as mobile phones; escaping occupants can connect to the cloud through such devices, mark their own position in the BIM model and send distress messages, and the BIM model will display their position information to facilitate rescue by firefighters.

The invention realizes fire early warning and fire rescue through image and smoke information. With images and the deep learning neural network model, it can quickly and accurately determine whether a fire has occurred and obtain the fire information and position information. By fusing the building BIM three-dimensional model with the fire, position and smoke information, the fire-fighting system in the building can be accurately scheduled: fire extinguishing is carried out precisely where the fire breaks out; the exhaust fans and fire smoke exhaust windows are controlled to discharge harmful smoke in time and reduce casualties; and escape path planning is performed, with the escape indicators and broadcasts directing the escape route. At the same time, the fire scheduling information and escape path planning information are fused into the BIM model for visual display, so that firefighters can quickly and efficiently grasp the fire situation and deploy a rescue plan, and escaping occupants can find an escape route. The invention can effectively improve the efficiency of extinguishing and rescue in building fires and reduce the property and life losses caused by fire.
At least one camera needs to be installed in each room of the building, with a shooting range covering every corner of the room, and several smoke detectors need to be arranged at the same time. In the public areas of the building, cameras and smoke detectors need to be arranged at intervals so that the shooting range covers every corner. The cameras collect image information of the different areas and store it in the video information memory; the smoke detectors collect smoke information and store it in the smoke information memory.
A building BIM information model is established using Autodesk Revit as the modeling software; the model includes components such as the walls, columns, beams, floor slabs, doors, windows, cameras and fire-fighting spray system of the building, and is stored in the BIM model memory.
Flame image data sets of different combustibles are established, and a deep-learning-based fire recognition neural network model (FireR-Net) is built and trained so that it can judge whether a fire has occurred and obtain the combustible information when it has; the model is stored in the fire information processing system.
Flame videos and images of different combustibles are acquired by network download and by recording flame tests, and labeled by combustible type to obtain an original training data set S. The original training data set S is expanded by image processing, namely image coding compression, image enhancement and restoration, and image transformation, to obtain four sub-training data sets S1, S2, S3 and S4. Flame videos of different combustibles are recorded in flame tests and labeled by combustible type to obtain a test data set T. The fire information processing system is trained on the training data sets S, S1, S2, S3 and S4 with the deep learning model, and its flame recognition capability is checked on the test data set T; if the judgment accuracy is below 95%, the deep learning model is optimized and the process repeated.
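The expansion of S into S1–S4 can be illustrated with simple stand-in transforms on grayscale images stored as nested lists; the patent's actual coding-compression and enhancement operations are not specified, so these four transforms are placeholders:

```python
def augment(dataset):
    """Expand training set S into sub-sets S1..S4 with four simple image
    transforms: horizontal flip, vertical flip, brightness shift, and
    90-degree rotation. Images are nested lists of grayscale pixels."""
    s1 = [[row[::-1] for row in img] for img in dataset]               # h-flip
    s2 = [img[::-1] for img in dataset]                                # v-flip
    s3 = [[[min(255, p + 10) for p in row] for row in img]
          for img in dataset]                                          # brighten
    s4 = [[list(col) for col in zip(*img[::-1])] for img in dataset]   # rotate 90
    return s1, s2, s3, s4
```

Each sub-set has the same labels as S, since none of these transforms changes what the image depicts.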
After deep model training, the fire information processing system can judge whether a fire has occurred and obtain the combustion species information, and transmits these together with the camera number information to the computer intelligent terminal. The computer intelligent terminal reads the smoke information from the smoke information memory and writes it into the BIM model for fusion. The computer intelligent terminal reads the building BIM model from the BIM model memory, and writes the fire information, combustion species information and camera number information obtained by the fire information processing system into the BIM model for fusion.
The computer intelligent terminal performs fire-fighting system scheduling and escape path planning according to the fused BIM information, forms fire-fighting instructions and transmits them to the fire-fighting system control terminal. On receiving the instructions, the fire-fighting system control terminal opens, via the electromagnetic valves in the fire-fighting spray system, the sprinkler heads at the fire location to extinguish the fire, simultaneously opens the sprinkler heads at smoke-spreading positions, and controls the spray flow rate through the electromagnetic valves. The fire-fighting system control terminal turns on the escape indicator boards along the escape path and starts the broadcasts to guide the evacuation of people.
The computer intelligent terminal transmits the fused BIM model to the visual terminal for display; the information contained in the fused BIM model at this point comprises the building BIM model, fire information, combustion species information, camera number information, smoke detector number information, fire-fighting system scheduling information and escape path planning information. The visual terminal transmits the visual information to the cloud, and fire rescue personnel and escaping occupants can check the fire situation and escape route through mobile devices such as mobile phones. Escaping occupants can connect to the cloud through such devices, mark their own position in the BIM model and send distress messages; the BIM model will display their position information to facilitate rescue by firefighters.
The above-mentioned embodiments merely illustrate the preferred embodiments of the present invention and do not limit its scope; various modifications and improvements made by those skilled in the art to the technical solution of the present invention without departing from its spirit shall fall within the protection scope defined by the claims of the present invention.

Claims (9)

1. A fire early warning and post-disaster floor damage prediction method based on BIM and deep learning is characterized by comprising the following steps:
acquiring image information of different positions in a monitoring area, and preprocessing the image information to form a first data set;
inputting the image information of the first data set into a fire recognition neural network model to obtain a first confidence coefficient parameter;
judging whether a fire disaster occurs or not based on the first confidence coefficient parameter; if a fire disaster happens, a fire alarm is carried out;
identifying flame information corresponding to the fire; and inputting the flame information into a BIM-based floor plate fire-resistant damage model, and predicting the damage degree of the floor plate bearing capacity.
2. The method for fire early warning and post-disaster floor damage prediction based on BIM and deep learning according to claim 1, wherein the inputting the image information into a fire recognition neural network model to obtain a first confidence coefficient parameter comprises:
compressing and cropping the original images in the first data set to a size of 256 × 256 × 3, and performing convolution calculation through two visual depth convolution layers C1 and C2, with a convolution kernel size of 3 × 3, 100 channels, a stride of 1 and a padding of 1;
taking the in-window maximum of the output through a pooling layer P1, with a pooling kernel size of 2 × 2 and a stride of 2; performing convolution calculation on the output through three visual depth convolution layers C3, C4 and C5, with a convolution kernel size of 3 × 3, 128 convolution kernels, 100 channels, a stride of 1 and a padding of 1;
taking the in-window maximum of the output through a pooling layer P2, with a pooling kernel size of 2 × 2 and a stride of 2; performing convolution calculation on the output through three visual depth convolution layers C6, C7 and C8, with a convolution kernel size of 3 × 3, 200 channels, a stride of 1 and a padding of 1;
taking the in-window maximum of the output through a pooling layer P3, with a pooling kernel size of 2 × 2 and a stride of 2; performing convolution calculation on the output through three visual depth convolution layers C9, C10 and C11, with a convolution kernel size of 3 × 3, 400 channels, a stride of 1 and a padding of 1;
taking the in-window maximum of the output through a pooling layer P4, with a pooling kernel size of 2 × 2 and a stride of 2; performing convolution calculation on the output through four visual depth convolution layers C12, C13, C14 and C15, with a convolution kernel size of 3 × 3, 400 channels, a stride of 1 and a padding of 1;
taking the in-window maximum of the output through a pooling layer P5, with a pooling kernel size of 2 × 2 and a stride of 2; calculating the output through a fully connected layer F1 with 4000 channels; calculating the output through a fully connected layer F2 with 1000 channels;
calculating the output through a fully connected layer F3 with 100 channels; calculating the output through a fully connected layer F4 with 5 channels; normalizing the output through a softmax exponential layer, and outputting the confidence Rn (n = 1, 2, …, 5).
3. The method for fire early warning and post-disaster floor damage prediction based on BIM and deep learning according to claim 1 or 2, wherein the first confidence coefficient parameter is an output matrix Rn;
the determining whether a fire is occurring based on the first confidence coefficient parameter includes: if the maximum value max (Rn) ═ R1 or max (Rn) ═ R2 or max (Rn) ═ R3 in the output matrix Rn, it is determined that a fire has occurred; and if the maximum value of the fire-fighting equipment is max (Rn) -R4 or max (Rn) -R5.
4. The BIM and deep learning based fire early warning and post-disaster floor damage prediction method according to claim 1 or 2, wherein the fire recognition neural network model comprises a training process;
the training process comprises a weight adjusting stage, the task of the weight adjusting stage is to adjust parameters according to the deviation value of a loss function until a model converges and a target prediction accuracy is achieved, and the loss function is as follows:
rule 1: l ═ LCE(yi,yi *)+αLMSE(yi,yi *)
Rule 2:
Figure FDA0003477162420000031
rule 3:
Figure FDA0003477162420000032
wherein L is the loss function and L ≥ 0, y is the true distribution, i.e. the true label of the sample, y* is the predicted distribution, L_CE is the cross-entropy loss function, L_MSE is the mean-square-error loss function, α is a weight value used to adjust the proportion of the mean-square-error loss in the overall loss function, N is the number of samples in a training batch, and C is the number of classes.
5. The BIM and deep learning based fire early warning and post-disaster floor damage prediction method as claimed in claim 1, wherein the pre-processing of the image information comprises pre-processing the image information with a variable bounding box.
6. The method for fire early warning and post-disaster floor damage prediction based on BIM and deep learning according to claim 1, wherein the flame information comprises: the real length a, the real width b and the real height c of the flame, and the distance x between the flame and the floor slab; the floor slab characteristic parameters and the flame characteristic parameters are merged to obtain a merged parameter X, and feature scaling is applied so that each parameter value lies in [-1, 1].
7. The BIM and deep learning based fire early warning and post-disaster floor damage prediction method according to claim 1, wherein the flame information is input to a BIM based floor fire damage resistance model to predict the damage degree of the floor bearing capacity, and the method comprises the following steps of predicting the damage degree of the floor bearing capacity by using a linear regression model:
Rule 1: f(X) = w^T·X + b

Rule 2: D = f(X)/Q

wherein f(X) is the predicted floor bearing capacity, w^T is the weight parameter, X is the merged parameter, b is the offset, Q is the initial bearing strength of the floor before the fire occurs, and D is the characteristic value of the damage degree of the bearing capacity.
8. The method for fire early warning and post-disaster floor damage prediction based on BIM and deep learning of claim 1, wherein if a fire occurs, a fire alarm is given, further comprising: triggering a fire alarm to alarm; triggering an electromagnetic valve in the fire-fighting spraying system, and opening a fire-fighting spraying head at the position where the fire disaster occurs to extinguish the fire; and opening the escape indication board on the escape path.
9. A fire early warning and fire fighting system for performing the fire early warning and post-disaster floor damage prediction method based on BIM and deep learning according to any one of claims 1 to 8, the system comprising: the fire monitoring module is based on a camera, an infrared thermal imager, a smoke detector, a video information memory and a smoke information memory; a fire information processing system; a computer intelligent terminal; a BIM model memory; a visual terminal; a cloud server; a fire-fighting system control terminal and a fire-fighting rescue module based on a fire-fighting spray system, a fire-fighting alarm, a broadcast and an escape sign.
CN202210058118.4A 2022-01-19 2022-01-19 Fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning Pending CN114444386A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210058118.4A CN114444386A (en) 2022-01-19 2022-01-19 Fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210058118.4A CN114444386A (en) 2022-01-19 2022-01-19 Fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning

Publications (1)

Publication Number Publication Date
CN114444386A true CN114444386A (en) 2022-05-06

Family

ID=81367848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210058118.4A Pending CN114444386A (en) 2022-01-19 2022-01-19 Fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning

Country Status (1)

Country Link
CN (1) CN114444386A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115178400A (en) * 2022-08-01 2022-10-14 重庆大学 Intelligent guniting robot, guniting system and method
CN117150298A (en) * 2023-09-01 2023-12-01 中国电建集团江西省水电工程局有限公司 Deep learning-based subway FAS fire alarm system debugging method
CN117150298B (en) * 2023-09-01 2024-06-07 中国电建集团江西省水电工程局有限公司 Deep learning-based subway FAS fire alarm system debugging method
CN117635643A (en) * 2023-12-05 2024-03-01 中南大学 Improved indoor fire disaster and combustible automatic identification method of dual-attention network

Similar Documents

Publication Publication Date Title
CN114444386A (en) Fire early warning and post-disaster floor damage prediction method and system based on BIM and deep learning
Zhang et al. Discovering worst fire scenarios in subway stations: A simulation approach
KR101478691B1 (en) System and method for delaying spread of fire in intelligent building
US20110112660A1 (en) Fire protection device, method for protecting against fire, and computer program
CN111369071A (en) Intelligent evacuation system and method based on evacuation time prediction and fire detection model
KR102011342B1 (en) Fire Safety Inspecting Method and System
US11729597B2 (en) Digital twin disaster management system customized for underground public areas
JP2007521455A (en) CT-Analyst: A software system for emergency assessment of chemical, biological and radiological (CBR) threats from the air with zero delay and high fidelity
CN111639825A (en) Method and system for indicating escape path of forest fire based on A-Star algorithm
KR20220071880A (en) Digital twin disaster management system customized for underground public areas
KR20220061312A (en) Deep learning based building management system with rail robot device
CN115482507A (en) Crowd gathering fire-fighting early warning method and system based on artificial intelligence
CN116704694A (en) Automatic fire prevention and control method for expressway tunnel and emergency disposal management system
CN116363825B (en) Method and device for displaying fire spreading trend, electronic equipment and medium
KR102449203B1 (en) Method for managing safty condition information
KR20150089313A (en) A manufacturing method and system of disaster information map for dangerous articles safety
CN116704693A (en) Method and system for confirming fire point by multi-camera linkage based on Internet of things
Restas Disaster management with resource optimization supported by drone applications
CN115331383A (en) Construction site safety risk identification method and system
CN114067065A (en) Escape route visualization system and escape method
Li Intelligent Science Empowers: Building Fire Protection Technology Development
KR20210073102A (en) Disaster management system using two-dimensional floor plans
CN111160780A (en) Dispatching robot and dispatching method
CN116882742B (en) Mobile intelligent fire-fighting linkage analysis management system based on digital visualization
Fujita et al. Collapsed Building Detection Using Multiple Object Tracking from Aerial Videos and Analysis of Effective Filming Techniques of Drones

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination