CN112347874A - Fire detection method, device, equipment and storage medium - Google Patents

Fire detection method, device, equipment and storage medium

Info

Publication number
CN112347874A
CN112347874A (application number CN202011155200.6A)
Authority
CN
China
Prior art keywords
fire, detected, target, area, temperature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011155200.6A
Other languages
Chinese (zh)
Inventor
李庆民
卢存盟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chuangze Intelligent Robot Group Co ltd
Original Assignee
Chuangze Intelligent Robot Group Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chuangze Intelligent Robot Group Co., Ltd.
Priority to CN202011155200.6A
Publication of CN112347874A
Legal status: Pending

Classifications

    • G06V 20/52 Scenes; surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G01J 5/0014 Radiation pyrometry, e.g. infrared or optical thermometry, for sensing the radiation from gases or flames
    • G01J 5/0018 Radiation pyrometry for flames, plasma or welding
    • G01J 2005/0077 Radiation pyrometry; imaging
    • G06F 18/214 Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F 18/24 Pattern recognition; classification techniques
    • G06N 20/00 Machine learning
    • G06N 3/045 Neural networks; combinations of networks
    • G06V 2201/07 Indexing scheme relating to image or video recognition or understanding; target detection

Abstract

The application discloses a fire detection method, device, equipment and storage medium. The method includes: acquiring picture data of an area to be detected; detecting the picture data with a detection model based on a machine learning algorithm to generate a fire picture credibility value corresponding to the picture data, thereby obtaining a target reliability; acquiring the ambient temperature of the area to be detected to obtain a target ambient temperature; and judging whether the target reliability is greater than a preset reliability threshold and whether the target ambient temperature is greater than a preset temperature threshold, and if both are, determining that a fire has occurred in the area to be detected. By combining artificial-intelligence-based visible-light image analysis with ambient temperature measurement, the method improves the accuracy of fire detection to a certain extent.

Description

Fire detection method, device, equipment and storage medium
Technical Field
The present invention relates to the field of fire prevention technologies, and in particular, to a fire detection method, device, equipment, and storage medium.
Background
Among the various kinds of disasters, fire is one of the most frequent and widespread threats to public safety and social development, and the harm and loss it causes can be greatly reduced by monitoring, preventing and controlling possible fires in advance. Traditional methods judge a fire by setting threshold conditions on RGB (R for red, G for green, B for blue) or HSI (H for hue, S for saturation, I for intensity or brightness) values, or rely solely on infrared equipment. Although a machine learning model trained on a large amount of data can achieve high accuracy and robustness for fire detection, false detections still occur when only images are used; likewise, judging a fire purely by infrared thermal imaging also carries a risk of misjudgment.
Disclosure of Invention
In view of the above, the present invention provides a fire detection method, device, equipment and storage medium that perform a dual check on both the image and the temperature to determine whether a fire has occurred, thereby effectively improving the accuracy of fire detection. The specific scheme is as follows:
a first aspect of the present application provides a fire detection method, including:
acquiring picture data of an area to be detected;
detecting the picture data by using a detection model based on a machine learning algorithm to generate a fire picture credibility value corresponding to the picture data, so as to obtain a target reliability;
acquiring the ambient temperature of the area to be detected to obtain a target ambient temperature;
and judging whether the target reliability is greater than a preset reliability threshold and whether the target ambient temperature is greater than a preset temperature threshold, and if so, determining that a fire has occurred in the area to be detected.
Optionally, the picture data includes a video stream of the area to be detected and/or image data of the area to be detected.
Optionally, the detecting the picture data by using a detection model based on a machine learning algorithm to generate a fire picture reliability value corresponding to the picture data, so as to obtain a target reliability, includes:
and detecting the picture data by using a YOLO detection model to generate a fire picture credibility value corresponding to the picture data to obtain a target credibility.
Optionally, before the detecting the picture data by using the YOLO detection model, the method further includes:
acquiring a fire picture, and labeling the fire picture to obtain a fire training data set;
and training a blank model constructed based on a YOLO algorithm by using the fire training data set to obtain the YOLO detection model.
Optionally, the obtaining the ambient temperature of the area to be detected to obtain a target ambient temperature includes:
and acquiring the ambient temperature of the area to be detected by using a temperature sensor to obtain the target ambient temperature.
Optionally, the obtaining, by using a temperature sensor, the ambient temperature of the area to be detected to obtain a target ambient temperature includes:
acquiring an infrared thermal imaging image of the area to be detected by using an infrared temperature sensor;
and reading the environmental temperatures of different areas in the infrared thermal imaging graph, and outputting the maximum value of the environmental temperatures of the different areas to obtain the target environmental temperature.
Optionally, after judging whether the target reliability is greater than a preset reliability threshold and whether the target ambient temperature is greater than a preset temperature threshold, and if so, determining that a fire has occurred in the area to be detected, the method further includes:
and saving the picture data of the area to be detected.
A second aspect of the present application provides a fire detection apparatus comprising:
the image data acquisition module is used for acquiring image data of the area to be detected;
the credibility acquisition module is used for detecting the picture data by using a neural network detection model so as to generate a fire picture credibility probability value corresponding to the picture data and obtain target credibility;
the temperature acquisition module is used for acquiring the environmental temperature of the area to be detected so as to obtain a target environmental temperature;
and the judging module is used for judging whether the target reliability is greater than a preset reliability threshold and whether the target environmental temperature is greater than a preset temperature threshold, and if so, determining that a fire has occurred in the area to be detected.
A third aspect of the application provides an electronic device comprising a processor and a memory; wherein the memory is adapted to store a computer program that is loaded and executed by the processor to implement the aforementioned fire detection method.
A fourth aspect of the present application provides a storage medium having stored therein computer-executable instructions that, when loaded and executed by a processor, implement the aforementioned fire detection method.
In the present application, a detection model based on a machine learning algorithm is first used to detect the picture data of the area to be detected, so as to generate a fire picture credibility value corresponding to the picture data and obtain a target reliability; the ambient temperature of the area to be detected is then acquired to obtain a target ambient temperature; finally, it is judged whether the target reliability is greater than a preset reliability threshold and whether the target ambient temperature is greater than a preset temperature threshold, and if so, it is determined that a fire has occurred in the area to be detected. By combining artificial-intelligence-based visible-light image analysis with ambient temperature measurement, the accuracy of fire detection is improved to a certain extent.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart of a fire detection method provided herein;
FIG. 2 is a flow chart of a specific fire detection method provided herein;
FIG. 3 is a flow chart of another specific fire detection method provided herein;
FIG. 4 is a schematic structural diagram of a fire detection device according to the present application;
Fig. 5 is a block diagram of an electronic device for fire detection according to the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
At present, in order to monitor, prevent and control possible fires in advance and thereby greatly reduce the harm and loss they cause, a fire may be judged according to threshold conditions set on RGB and HSI values, judged only by infrared equipment, or judged by an image-based fire detection technology built on a video monitoring platform. However, the detection results of these methods are strongly affected by external conditions, and misjudgment is possible. To overcome this technical problem, the present application provides a fire detection scheme that performs a dual check on the image and the temperature simultaneously before deciding whether a fire has broken out, which effectively improves the accuracy of fire detection.
Fig. 1 is a flowchart of a fire detection method according to an embodiment of the present disclosure. Referring to fig. 1, the fire detection method includes:
s11: and acquiring picture data of the area to be detected.
In this embodiment, a video monitoring platform may be deployed to monitor fires in real time, and the picture data of the area to be detected is acquired through this platform, where the picture data includes a video stream of the area to be detected and/or image data of the area to be detected. The video monitoring platform may include a video/image acquisition system, an integrated system and the like. The video/image acquisition system collects and stores video images of a monitoring point through a camera device; compared with a common camera that outputs AV signals, an RGB camera can provide video streams and image data with richer colour, clearer image quality and better recoverability. The integrated system may be used to run the detection model based on a machine learning algorithm and read the target reliability, and at the same time to comprehensively analyse the ambient temperature measured by the temperature sensor and read the target ambient temperature. It can be understood that the video monitoring platform may further include a database system for storing the video streams and image data collected by the video/image acquisition system; it may also store videos or pictures of fire scenes, which can serve as model training samples when constructing the detection model based on a machine learning algorithm.
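By way of illustration, picture data could be pulled from such a platform's RGB camera with a short OpenCV routine like the sketch below; the stream URL and the sampling interval are assumptions made for illustration and are not specified by the present embodiment.

    import cv2

    def grab_frames(stream_url: str, every_n: int = 25):
        """Yield sampled BGR frames from a camera or RTSP video stream.

        Only one frame out of every `every_n` is yielded, which keeps the load
        on the downstream detection model manageable (the sampling rate is an
        assumption)."""
        cap = cv2.VideoCapture(stream_url)
        if not cap.isOpened():
            raise RuntimeError(f"cannot open stream: {stream_url}")
        index = 0
        try:
            while True:
                ok, frame = cap.read()
                if not ok:  # end of stream or camera dropout
                    break
                if index % every_n == 0:
                    yield frame
                index += 1
        finally:
            cap.release()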
S12: and detecting the picture data by using a detection model based on a machine learning algorithm to generate a fire picture credibility value corresponding to the picture data to obtain a target credibility.
In this embodiment, the detection of the picture data by the detection model based on a machine learning algorithm may take many different forms and implementations depending on the algorithm. For example, a traditional machine learning detection model may be obtained with an SVM (Support Vector Machine) algorithm, a GBDT (Gradient Boosting Decision Tree) algorithm or an RF (Random Forest) algorithm, or a deep learning detection model may be obtained with a CNN (Convolutional Neural Network) algorithm, an RNN (Recurrent Neural Network) algorithm or an LSTM (Long Short-Term Memory network) algorithm. It is not difficult to understand that a deep convolutional neural network can, through its stacked convolutional layers, automatically extract and learn more intrinsic features from the training data; applying a deep convolutional neural network to image-based fire detection therefore clearly strengthens the classification effect and further improves detection accuracy.
Furthermore, after the detection model based on a machine learning algorithm detects the picture data, it directly generates a fire picture credibility value corresponding to the picture data, from which the target reliability is obtained. It should be noted that the fire picture credibility value is essentially a probability value in the range 0 to 1, representing the probability that the picture data (that is, the picture content of the video stream or the image of the area to be detected) is a fire picture. The target reliability is expressed directly as this fire picture credibility value and reflects, to some extent, whether the picture corresponding to the picture data belongs to the fire category; it can be understood that the higher the fire picture credibility value, the higher the probability that the picture corresponding to the picture data shows a fire.
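As a minimal sketch of this reduction step, the routine below collapses a detector's per-box output into the single fire picture credibility value described above; the (label, confidence) output format is an assumption, and any of the detector families listed above could feed it.

    from typing import Iterable, Tuple

    def fire_picture_credibility(detections: Iterable[Tuple[str, float]]) -> float:
        """Return the target reliability: the highest confidence among boxes
        classified as 'fire', or 0.0 if no fire box was predicted.
        The (label, confidence) pair format is an assumption."""
        fire_scores = [conf for label, conf in detections if label == "fire"]
        return max(fire_scores, default=0.0)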
S13: and acquiring the ambient temperature of the area to be detected to obtain the target ambient temperature.
In this embodiment, after the picture data of the area to be detected has been detected, the temperature of the area to be detected also needs to be acquired. There are many ways to measure the temperature of a specific area, for example radiation-based thermometry, spectroscopic thermometry, laser-interference thermometry, or acoustic and microwave methods. Further, the ambient temperature of the area to be detected can also be obtained by infrared thermal imaging. It should be noted that, because of their different influencing factors, the results obtained by different temperature measurement methods may deviate slightly from one another, and this deviation has a negligible influence on the detection result.
S14: and judging whether the target reliability is greater than a preset reliability threshold value or not and whether the target environment temperature is greater than a preset temperature threshold value or not, and if so, judging that the fire disaster happens to the area to be detected.
In this embodiment, whether a fire has occurred in the area to be detected is judged on the basis of the target reliability and the target ambient temperature. Specifically, it is judged whether the target reliability is greater than a preset reliability threshold and whether the target ambient temperature is greater than a preset temperature threshold; if both are, it is determined that a fire has occurred in the area to be detected, and otherwise it is determined that no fire has occurred. It is easy to understand that when the target reliability is not greater than the preset reliability threshold and/or the target ambient temperature is not greater than the preset temperature threshold, it is determined that no fire has occurred; in other words, a fire in the area to be detected can only be confirmed when the picture corresponding to the picture data is detected as a fire-scene picture and the temperature of the area is within a certain range. It should be noted that the preset reliability threshold and the preset temperature threshold are both empirical values determined according to the actual situation. In practical applications the preset reliability threshold typically lies in the range 0.7 to 0.8, and the preset temperature threshold is generally the average of the actual temperatures measured by an infrared thermal imaging device within a certain range when common objects burn. Of course, these empirical values only give an approximate range or direction; the specific settings depend on the specific scene, and this embodiment does not limit them.
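The decision rule of step S14 reduces to a conjunction of the two threshold tests, as in the sketch below; the default of 0.75 sits inside the 0.7 to 0.8 range mentioned above, while the 60 °C temperature default is purely an illustrative assumption, since the threshold is left scene-dependent.

    def fire_occurred(target_reliability: float,
                      target_ambient_temp_c: float,
                      reliability_threshold: float = 0.75,
                      temperature_threshold_c: float = 60.0) -> bool:
        """Step S14: report a fire only when BOTH the picture reliability and
        the ambient temperature exceed their preset thresholds (the default
        values are illustrative assumptions)."""
        return (target_reliability > reliability_threshold
                and target_ambient_temp_c > temperature_threshold_c)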
It can therefore be seen that, in the embodiment of the present application, the picture data is detected with a detection model based on a machine learning algorithm to obtain the target reliability, the ambient temperature of the area to be detected is obtained with a temperature sensor or the like to obtain the target ambient temperature, and whether a fire has occurred in the area to be detected is judged on the basis of both. The embodiment thus performs a dual check on the image and the temperature simultaneously before deciding whether a fire has broken out, which effectively improves the accuracy of fire detection.
Fig. 2 is a flowchart of a specific fire detection method according to an embodiment of the present disclosure. Referring to fig. 2, the fire detection method includes:
s21: and acquiring picture data of the area to be detected.
It should be noted that, for the specific process of acquiring the picture data of the area to be detected in step S21, reference may be made to the corresponding content disclosed in the foregoing embodiment, and details are not repeated here.
S22: and acquiring a fire picture, and labeling the fire picture to obtain a fire training data set.
In this embodiment, before model training, a training set, that is, a set of labelled and annotated sample data for training the detection model, needs to be obtained first. The sample data used to train a blank model constructed on the basis of the YOLO (You Only Look Once) algorithm consists of fire pictures of different scenes, types, scales, and combustion sizes and states. The fire pictures may be collected from an online search, or several fire videos or simulated-fire videos may be collected and sorted and fire frames captured manually or with a video editing tool. After a certain number of fire pictures have been obtained, they are annotated with a picture annotation tool such as PxCook or MarkMan, or annotated manually. Since this embodiment detects fire picture data with a YOLO detection model, the pictures can be annotated manually with YOLO_MARK, the annotation tool dedicated to the YOLO series: the flames in each fire picture are framed with a rectangular box that contains the complete flame outline while staying as close to its edge as possible, and the category is marked as fire. After annotation, each fire picture has a corresponding txt file containing the target class number, the coordinates of the centre point of the annotated rectangular box, and the width and height of the box. A certain number of annotated fire pictures are selected as the fire training data set for training the detection model, and the remaining annotated pictures can be used as test data to verify the performance of the model. It should be noted that this embodiment does not limit the source of the fire pictures or the specific annotation method.
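For illustration, the sketch below converts a pixel-space flame box into one line of the YOLO-style txt annotation just described (class number, box centre, box width and height); normalising the values to the picture size follows the usual YOLO label convention and is an assumption here, since the embodiment does not state the exact numeric format.

    def yolo_label_line(class_id: int, box_xyxy, img_w: int, img_h: int) -> str:
        """One annotation line: class id, then the box centre and size,
        normalised to the picture dimensions (normalisation is an assumption)."""
        x1, y1, x2, y2 = box_xyxy
        cx = (x1 + x2) / 2.0 / img_w   # centre x of the flame box
        cy = (y1 + y2) / 2.0 / img_h   # centre y of the flame box
        w = (x2 - x1) / img_w          # box width
        h = (y2 - y1) / img_h          # box height
        return f"{class_id} {cx:.6f} {cy:.6f} {w:.6f} {h:.6f}"

    # Example: a flame occupying pixels (120, 80) to (360, 420) in a 640x480
    # picture, with 'fire' as class 0:
    #   yolo_label_line(0, (120, 80, 360, 420), 640, 480)
    #   -> "0 0.375000 0.520833 0.375000 0.708333"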
S23: and training a blank model constructed based on a YOLO algorithm by using the fire training data set to obtain a YOLO detection model.
S24: and detecting the picture data by using the YOLO detection model to generate a fire picture credibility value corresponding to the picture data to obtain a target credibility.
In this embodiment, the YOLO detection model is used to detect the picture data of the area to be detected. With the YOLO algorithm, the input image is passed through a convolutional neural network (CNN) to obtain feature maps of different scales, and classification and regression are then performed to obtain the category and bounding box of each predicted object. The YOLO algorithm is a single, end-to-end deep learning neural network for target detection; its essence is to treat target detection as a regression problem. Compared with traditional machine learning algorithms such as SVM, or with methods that judge a fire by threshold conditions set on RGB and HSI values, detecting fires with a detection model based on the YOLO algorithm achieves higher accuracy while maintaining detection speed.
The principle of the YOLO algorithm is to divide the input image into an S×S grid. Each grid cell predicts bounding boxes, each consisting of (x, y, w, h) and a confidence C(Object). The coordinates (x, y) give the centre position of the detection bounding box relative to the grid cell, and (w, h) are the width and height of the bounding box. If the centre of an object falls within a grid cell, that grid cell is responsible for detecting the object. The confidence covers two aspects: the probability that the bounding box contains a target, and the accuracy of the bounding box. Its calculation formula is:
C(Object) = Pr(Object) * IOU(Pred, Truth)
where Pr(Object) denotes the probability that the bounding box contains a target: when the bounding box covers only background (i.e. contains no target), Pr(Object) is 0, and when the bounding box contains a target, Pr(Object) is 1. The accuracy of the bounding box is represented by the IOU (intersection over union) of the predicted box and the ground-truth box, denoted IOU(Pred, Truth).
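The two factors of the confidence formula can be made concrete with the short sketch below; the (x1, y1, x2, y2) box representation is an assumption made for illustration.

    def iou(pred, truth) -> float:
        """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
        ix1, iy1 = max(pred[0], truth[0]), max(pred[1], truth[1])
        ix2, iy2 = min(pred[2], truth[2]), min(pred[3], truth[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

        def area(box):
            return (box[2] - box[0]) * (box[3] - box[1])

        union = area(pred) + area(truth) - inter
        return inter / union if union > 0 else 0.0

    def box_confidence(pr_object: float, pred, truth) -> float:
        """C(Object) = Pr(Object) * IOU(Pred, Truth), as in the formula above."""
        return pr_object * iou(pred, truth)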
The network framework of the YOLO algorithm includes convolutional layers for extracting image features and fully connected layers for classification and localisation. In this embodiment, the convolutional layers extract the image features of the fire training data set; the fully connected layers divide the extracted features according to the grid, perform classification and localisation for each grid cell, determine the position and confidence of the bounding box of the target portion of the image, and recognise and detect the image on the basis of that position and confidence.
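As a hedged illustration of how such a YOLO detection model might be trained and queried in practice, the sketch below uses the third-party ultralytics package; the dataset YAML, the pretrained weights and the hyper-parameters are assumptions, and the embodiment itself does not prescribe any particular YOLO implementation.

    # Assumes `pip install ultralytics`; fire.yaml, the weights file and the
    # hyper-parameters are illustrative assumptions, not taken from the patent.
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")                            # start from pretrained weights
    model.train(data="fire.yaml", epochs=100, imgsz=640)  # fire.yaml lists the labelled fire pictures

    results = model("frame.jpg")[0]                       # detect on one picture of the area
    fire_confidences = results.boxes.conf.tolist()        # per-box confidence values
    target_reliability = max(fire_confidences, default=0.0)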
S25: and acquiring the ambient temperature of the area to be detected to obtain the target ambient temperature.
S26: and judging whether the target reliability is greater than a preset reliability threshold value or not and whether the target environment temperature is greater than a preset temperature threshold value or not, and if so, judging that the fire disaster happens to the area to be detected.
It should be noted that, for the specific processes of step S25 and step S26, reference may be made to the corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
It can therefore be seen that, in the embodiment of the present application, a fire training data set is constructed from a large number of annotated and labelled fire pictures, and the YOLO detection model is trained with this data set, so that the picture data of the area to be detected can be input into the YOLO detection model for detection and the fire picture credibility value corresponding to the picture data can be generated.
Fig. 3 is a flowchart of another specific fire detection method according to an embodiment of the present disclosure. Referring to fig. 3, the fire detection method includes:
s31: and acquiring picture data of the area to be detected.
S32: and detecting the picture data by using a detection model based on a machine learning algorithm to generate a fire picture credibility value corresponding to the picture data to obtain a target credibility.
It should be noted that, for the processes in step S31 and step S32, reference may be made to the corresponding contents disclosed in the foregoing embodiments, and details are not repeated here.
S33: and acquiring an infrared thermal imaging image of the area to be detected by using an infrared temperature sensor.
S34: and reading the environmental temperatures of different areas in the infrared thermal imaging graph, and outputting the maximum value of the environmental temperatures of the different areas to obtain the target environmental temperature.
In this embodiment, the infrared thermal imaging image of the area to be detected may be acquired with an infrared thermal imaging device, which may include a display, a temperature sensor and the like. The display may be integrated into the integrated system of the video monitoring platform, so that the picture data and the temperature data of the area to be detected can be assessed at the same time to determine whether a fire has occurred. It should be noted that a thermal imaging device uses an infrared detector and an optical imaging objective to receive the infrared radiation energy of the measured object and project its energy distribution onto the photosensitive elements of the detector, thereby obtaining an infrared thermal image. The thermal image corresponds to the heat distribution field on the surface of the object and is displayed mainly through colours, with different colours representing different temperatures of the measured object. Some mature thermal imaging devices can display temperatures directly; when a device does not support direct temperature display, the colour data in the thermal image must be read and converted into temperature data. In this embodiment, the maximum value of the ambient temperature of the area to be detected is output. It is not difficult to understand that a single thermal image contains different colours and therefore different temperatures, that is, different sub-regions of the area to be detected have different ambient temperatures under the influence of the environment, materials and other factors. Using the maximum ambient temperature to represent the temperature of the area increases the sensitivity and accuracy of detection and reduces the unnecessary loss caused by misjudging a fire.
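Once the thermal imaging device (or a conversion step) yields a per-pixel temperature map, step S34 reduces it to a single number, as in the minimal sketch below, which assumes a 2-D NumPy array of calibrated temperatures in °C; many devices instead return raw radiometric counts that must be converted first.

    import numpy as np

    def target_ambient_temperature(thermal_frame: np.ndarray) -> float:
        """Step S34: return the maximum ambient temperature found anywhere in
        the infrared thermal image of the area to be detected."""
        return float(thermal_frame.max())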
S35: and judging whether the target reliability is greater than a preset reliability threshold value or not and whether the target environment temperature is greater than a preset temperature threshold value or not, and if so, judging that the fire disaster happens to the area to be detected.
It should be noted that, for the specific process of the step S35, reference may be made to the corresponding contents disclosed in the foregoing embodiments, and details are not repeated herein.
S36: and saving the picture data of the area to be detected.
In this embodiment, when it is determined that a fire has occurred in the area to be detected, the picture data may be saved, where the picture data includes a video stream of the area to be detected and/or image data of the area to be detected. The saved picture data is, of course, fire-scene data and can be used to train the detection model. It should be noted that the saved picture data may be stored in the database system of the video monitoring platform, which may also be a data hard disk mounted on the platform.
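A minimal sketch of this saving step follows, assuming the confirmed fire frame is held as an OpenCV image; the directory layout and the file naming are assumptions.

    import os
    import time

    import cv2

    def save_fire_frame(frame, out_dir: str = "fire_records") -> str:
        """Step S36: persist the picture data of the area to be detected once a
        fire has been confirmed, so it can later be reused as training data."""
        os.makedirs(out_dir, exist_ok=True)
        path = os.path.join(out_dir, time.strftime("fire_%Y%m%d_%H%M%S.jpg"))
        cv2.imwrite(path, frame)
        return path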
It can therefore be seen that this method introduces infrared thermal imaging, obtains the temperature data of the area to be detected by reading the temperature data in the infrared thermal image and outputting the maximum ambient temperature of the area, and then judges comprehensively whether a fire has occurred in the area to be detected by combining this temperature with the fire picture credibility value corresponding to the picture data of the area.
Referring to fig. 4, the embodiment of the present application also discloses a fire detection device, which includes:
the image data acquisition module 11 is used for acquiring image data of the area to be detected;
the credibility obtaining module 12 is configured to detect the picture data by using a neural network detection model to generate a fire picture credibility probability value corresponding to the picture data, so as to obtain a target credibility;
the temperature acquisition module 13 is configured to acquire an ambient temperature of the to-be-detected region to obtain a target ambient temperature;
and the judging module 14 is configured to judge whether the target reliability is greater than a preset reliability threshold and whether the target environment temperature is greater than a preset temperature threshold, and if so, judge that a fire disaster occurs in the area to be detected.
It can therefore be seen that, in the embodiment of the present application, the picture data is detected with a detection model based on a machine learning algorithm to obtain the target reliability, the ambient temperature of the area to be detected is obtained with a temperature sensor or the like to obtain the target ambient temperature, and whether a fire has occurred in the area to be detected is judged on the basis of both. The embodiment thus performs a dual check on the image and the temperature simultaneously before deciding whether a fire has broken out, which effectively improves the accuracy of fire detection.
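The four modules can be pictured as a small composition of the sketches given earlier; the callable interfaces below are assumptions made purely for illustration and do not reflect a prescribed structure of the device.

    class FireDetectionDevice:
        """Sketch of the device: picture acquisition, credibility acquisition,
        temperature acquisition and judging modules wired together."""

        def __init__(self, get_frame, get_reliability, get_max_temperature,
                     reliability_threshold: float = 0.75,
                     temperature_threshold_c: float = 60.0):
            self.get_frame = get_frame                      # picture data acquisition module
            self.get_reliability = get_reliability          # credibility acquisition module
            self.get_max_temperature = get_max_temperature  # temperature acquisition module
            self.reliability_threshold = reliability_threshold
            self.temperature_threshold_c = temperature_threshold_c

        def check_once(self) -> bool:
            """Judging module: return True only when both thresholds are exceeded."""
            frame = self.get_frame()
            reliability = self.get_reliability(frame)
            ambient = self.get_max_temperature()
            return (reliability > self.reliability_threshold
                    and ambient > self.temperature_threshold_c)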
In some specific embodiments, the image data obtaining module 11 is specifically configured to obtain a video stream of the area to be detected and/or image data of the area to be detected.
In some embodiments, the reliability obtaining module 12 is specifically configured to detect the image data by using a YOLO detection model to generate a fire image reliability value corresponding to the image data, so as to obtain a target reliability.
In some embodiments, the fire detection apparatus further comprises:
the training data set building module is used for acquiring a fire picture and marking the fire picture to obtain a fire training data set;
and the detection model training module is used for training a blank model constructed based on a YOLO algorithm by using the fire training data set so as to obtain the YOLO detection model.
In some specific embodiments, the temperature obtaining module 13 is specifically configured to obtain the ambient temperature of the area to be detected by using a temperature sensor, so as to obtain the target ambient temperature.
In some specific embodiments, the temperature obtaining module 13 specifically includes:
the thermal imaging image acquisition unit is used for acquiring an infrared thermal imaging image of the area to be detected by using an infrared temperature sensor;
and the target ambient temperature output unit is used for reading the ambient temperatures of different areas in the infrared thermal imaging graph and outputting the maximum value of the ambient temperatures of the different areas to obtain the target ambient temperature.
In some embodiments, the fire detection apparatus further includes a data storage module, specifically configured to store the picture data of the area to be detected.
Further, the embodiment of the application also provides electronic equipment. FIG. 5 is a block diagram illustrating an electronic device 20 according to an exemplary embodiment, and the contents of the diagram should not be construed as limiting the scope of use of the present application in any way.
Fig. 5 is a schematic structural diagram of an electronic device 20 according to an embodiment of the present disclosure. The electronic device 20 may specifically include: at least one processor 21, at least one memory 22, a power supply 23, a communication interface 24, an input output interface 25, and a communication bus 26. Wherein the memory 22 is used for storing a computer program, which is loaded and executed by the processor 21 to implement the relevant steps in the fire detection method disclosed in any of the foregoing embodiments. In addition, the electronic device 20 in the present embodiment may be specifically a controller.
In this embodiment, the power supply 23 is configured to provide a working voltage for each hardware device on the electronic device 20; the communication interface 24 can create a data transmission channel between the electronic device 20 and an external device, and a communication protocol followed by the communication interface is any communication protocol applicable to the technical solution of the present application, and is not specifically limited herein; the input/output interface 25 is configured to obtain external input data or output data to the outside, and a specific interface type thereof may be selected according to specific application requirements, which is not specifically limited herein.
In addition, the memory 22, as a carrier for resource storage, may be a read-only memory, a random access memory, a magnetic disk or an optical disk, and the resources stored on it may include an operating system 221, a computer program 222, picture data 223 and the like; the storage may be transient or permanent.
The operating system 221 is configured to manage and control the hardware devices and the computer program 222 on the electronic device 20, so that the processor 21 can operate on and process the large volume of picture data 223 of the area to be detected held in the memory 22; it may be Windows Server, Netware, Unix, Linux or the like. In addition to the computer program that enables the electronic device 20 to perform the fire detection method disclosed in any of the foregoing embodiments, the computer program 222 may further include computer programs for other specific tasks. The data 223 may include various picture data collected by the electronic device 20.
Further, an embodiment of the present application further discloses a storage medium, in which a computer program is stored, and when the computer program is loaded and executed by a processor, the steps of the fire detection method disclosed in any of the foregoing embodiments are implemented.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Finally, it should also be noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.
The fire detection method, the fire detection device, the fire detection equipment and the fire detection storage medium provided by the invention are described in detail, specific examples are applied in the description to explain the principle and the implementation mode of the invention, and the description of the examples is only used for helping to understand the method and the core idea of the invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (10)

1. A method of fire detection, comprising:
acquiring picture data of a to-be-detected area;
detecting the picture data by using a detection model based on a machine learning algorithm to generate a fire picture credibility value corresponding to the picture data to obtain a target reliability;
acquiring the environmental temperature of the area to be detected to obtain a target environmental temperature;
and judging whether the target reliability is greater than a preset reliability threshold and whether the target environmental temperature is greater than a preset temperature threshold, and if so, determining that a fire has occurred in the area to be detected.
2. A fire detection method according to claim 1, wherein the picture data comprises a video stream of the area to be detected and/or image data of the area to be detected.
3. The fire detection method according to claim 1, wherein the detecting the picture data by using a detection model based on a machine learning algorithm to generate a fire picture reliability value corresponding to the picture data to obtain a target reliability includes:
and detecting the picture data by using a YOLO detection model to generate a fire picture credibility value corresponding to the picture data to obtain a target credibility.
4. The fire detection method according to claim 3, wherein before the detecting the picture data by using the YOLO detection model, the method further comprises:
acquiring a fire picture, and labeling the fire picture to obtain a fire training data set;
and training a blank model constructed based on a YOLO algorithm by using the fire training data set to obtain the YOLO detection model.
5. The fire detection method according to claim 1, wherein the obtaining the ambient temperature of the area to be detected to obtain the target ambient temperature comprises:
and acquiring the ambient temperature of the area to be detected by using a temperature sensor to obtain the target ambient temperature.
6. The fire detection method according to claim 5, wherein the acquiring the ambient temperature of the area to be detected by using the temperature sensor to obtain the target ambient temperature comprises:
acquiring an infrared thermal imaging image of the area to be detected by using an infrared temperature sensor;
and reading the environmental temperatures of different areas in the infrared thermal imaging graph, and outputting the maximum value of the environmental temperatures of the different areas to obtain the target environmental temperature.
7. The fire detection method according to any one of claims 1 to 6, wherein after the judging whether the target reliability is greater than a preset reliability threshold and whether the target environmental temperature is greater than a preset temperature threshold, and if so, determining that a fire has occurred in the area to be detected, the method further comprises:
and saving the picture data of the area to be detected.
8. A fire detection device, comprising:
the image data acquisition module is used for acquiring image data of the area to be detected;
the credibility acquisition module is used for detecting the picture data by using a neural network detection model so as to generate a fire picture credibility probability value corresponding to the picture data and obtain target credibility;
the temperature acquisition module is used for acquiring the environmental temperature of the area to be detected so as to obtain a target environmental temperature;
and the judging module is used for judging whether the target reliability is greater than a preset reliability threshold and whether the target environmental temperature is greater than a preset temperature threshold, and if so, determining that a fire has occurred in the area to be detected.
9. An electronic device, comprising a processor and a memory; wherein the memory is for storing a computer program that is loaded and executed by the processor to implement a fire detection method as claimed in any one of claims 1 to 7.
10. A storage medium having stored thereon computer-executable instructions which, when loaded and executed by a processor, carry out a fire detection method according to any one of claims 1 to 7.
CN202011155200.6A 2020-10-26 2020-10-26 Fire detection method, device, equipment and storage medium Pending CN112347874A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011155200.6A CN112347874A (en) 2020-10-26 2020-10-26 Fire detection method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112347874A true CN112347874A (en) 2021-02-09

Family

ID=74358448

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011155200.6A Pending CN112347874A (en) 2020-10-26 2020-10-26 Fire detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112347874A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110135266A (en) * 2019-04-17 2019-08-16 浙江理工大学 A kind of dual camera electrical fire preventing control method and system based on deep learning
CN110598655A (en) * 2019-09-18 2019-12-20 东莞德福得精密五金制品有限公司 Artificial intelligence cloud computing multispectral smoke high-temperature spark fire monitoring method
CN111027541A (en) * 2019-11-15 2020-04-17 国网安徽省电力有限公司检修分公司 Flame detection method and system based on visible light and thermal imaging and storage medium
CN111339997A (en) * 2020-03-20 2020-06-26 浙江大华技术股份有限公司 Method and apparatus for determining ignition region, storage medium, and electronic apparatus
CN111739250A (en) * 2020-07-01 2020-10-02 广东工业大学 Fire detection method and system combining image processing technology and infrared sensor

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113723300A (en) * 2021-08-31 2021-11-30 平安国际智慧城市科技股份有限公司 Artificial intelligence-based fire monitoring method and device and storage medium
CN117093032A (en) * 2023-10-19 2023-11-21 万华化学集团股份有限公司 Reactor temperature control method, system, electronic equipment and storage medium
CN117093032B (en) * 2023-10-19 2024-02-02 万华化学集团股份有限公司 Reactor temperature control method, system, electronic equipment and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20210209)