CN213424156U - Fire prediction device - Google Patents


Info

Publication number
CN213424156U
Authority
CN
China
Prior art keywords
sensor
data
coordinator
environment
fire
Prior art date
Legal status
Active
Application number
CN202022360296.1U
Other languages
Chinese (zh)
Inventor
朱姗姗
王佳
Current Assignee
Guangzhou Roushi Intelligent Technology Co ltd
Original Assignee
Guangzhou Roushi Intelligent Technology Co ltd
Priority date: 2020-10-21
Filing date: 2020-10-21
Publication date: 2021-06-11
Application filed by Guangzhou Roushi Intelligent Technology Co., Ltd.
Priority to CN202022360296.1U
Application granted
Publication of CN213424156U

Landscapes

  • Fire Alarms (AREA)
  • Alarm Systems (AREA)

Abstract

The utility model provides a fire prediction device, comprising: at least one environment sensor, at least one image sensor, at least one terminal node, a coordinator, and an upper computer. The environment sensors and image sensors are electrically connected to the terminal nodes in one-to-one correspondence, the terminal nodes are communicatively connected to the coordinator, the coordinator is communicatively connected to the upper computer, and the output of the upper computer is connected to an alarm. By providing both environment sensors and image sensors, the device combines image data with the physical parameters of the fire scene measured by conventional environment sensors, so that the sensor data complement one another and the characteristics of the scene are reconstructed from multiple dimensions, enabling accurate and efficient monitoring and prediction of environmental fires.

Description

Fire prediction device
Technical Field
The utility model relates to the field of fire prediction, and in particular to a fire prediction device.
Background
Fire is a very common disaster. As society develops and social wealth grows, a fire often causes enormous property loss, seriously threatens people's lives, and disrupts stable and prosperous social development. The earlier and more accurately a fire is detected, the smaller the losses it causes.
Most existing fire prediction systems rely on the judgment of a single sensor and lack a reliable basis in the data for that judgment, so their prediction accuracy cannot meet practical requirements. Although fire prediction devices based on multiple sensors exist, they merely splice multiple sensors together and cannot monitor and predict environmental fires accurately and efficiently.
SUMMARY OF THE UTILITY MODEL
To overcome the defects of the prior art, the technical problem the utility model aims to solve is to provide a fire prediction device. By providing both environment sensors and image sensors, image data is combined with the physical parameters of the fire scene measured by conventional environment sensors, so that the sensor data complement one another and the characteristics of the scene are reconstructed from multiple dimensions, enabling accurate and efficient monitoring and prediction of environmental fires.
To achieve this purpose, the utility model adopts the following technical solution:
The utility model provides a fire prediction device, comprising: at least one environment sensor, at least one image sensor, at least one terminal node, a coordinator, and an upper computer. The environment sensor is used for collecting environment data, the image sensor is used for collecting image data, and both are transmitted to the terminal node; the terminal node is used for receiving the environment data and the image data and sending them to the coordinator; the coordinator packs the environment data and the image data received from the terminal nodes and sends the packed data to the upper computer; the upper computer is used for monitoring and receiving the packed data from the coordinator, generating a decision signal from the packed data, and sending the decision signal to the alarm. The environment sensors and the image sensors are electrically connected to the terminal nodes in one-to-one correspondence, the terminal nodes are communicatively connected to the coordinator, the coordinator is communicatively connected to the upper computer, and the output of the upper computer is connected to the alarm.
Preferably, the device further comprises a filter circuit; the environment sensor is electrically connected to the input of the filter circuit, and the output of the filter circuit is electrically connected to the input of the terminal node.
Preferably, the upper computer comprises a convolutional neural network trainer and a monitor. The convolutional neural network trainer is used for extracting features from the environment data and the image data in the packed data, splicing them to generate a fire probability, and transmitting the fire probability to the monitor; the monitor is used for receiving and displaying the fire probability. The convolutional neural network trainer is communicatively connected to the coordinator and is connected to the monitor.
Preferably, the environment sensor comprises a temperature sensor, a humidity sensor, a smoke sensor, a flame sensor and a CO concentration sensor, and the temperature sensor, the humidity sensor, the smoke sensor, the flame sensor and the CO concentration sensor are all electrically connected with the terminal node.
Preferably, the terminal node and the coordinator are a CC2530 terminal node and a CC2530 coordinator.
Preferably, the temperature sensor is a DS18B20 sensor.
Preferably, the humidity sensor is an SHT20 sensor.
Preferably, the smoke sensor is an MQ-2 smoke gas sensor.
Preferably, the flame sensor is an infrared light receiving sensor.
Preferably, the CO concentration sensor is an MS2200 carbon monoxide sensor.
Preferably, the image sensor is an AI-SU500C color CCD camera.
The utility model has the advantages that:
The utility model provides a fire prediction device that collects data through two kinds of sensors, environment sensors and image sensors. The environment sensors mainly collect environment data such as ambient temperature, humidity, smoke, flame, and CO concentration, while the image sensors mainly collect image data of the environment. The environment data and image data are transmitted to the upper computer, whose convolutional neural network trainer extracts features, splices the data, and produces a fire probability. Multiple terminal nodes are provided, each connected to a temperature sensor, a humidity sensor, a smoke sensor, a flame sensor, a CO concentration sensor, and an image sensor. By combining the advantages of multiple terminal nodes, environment data and image data can be acquired from multiple angles and then processed by the upper computer through feature extraction and data splicing to produce a fire probability, which greatly improves the accuracy of fire identification and achieves a real-time, reliable alarm for early fires.
Drawings
In order to more clearly illustrate the embodiments of the present utility model or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present utility model, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram provided in an embodiment of the present utility model.
Reference numerals in the drawings: 1, environment sensor; 2, image sensor; 3, terminal node; 4, coordinator; 5, upper computer; 6, alarm; 7, filter circuit.
Detailed Description
The technical solutions in the embodiments of the present utility model will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present utility model. All other embodiments obtained by a person skilled in the art based on these embodiments without creative effort fall within the protection scope of the present utility model.
The technical solution of the present utility model is further explained by the following embodiments with reference to the accompanying drawings.
As shown in fig. 1, the fire prediction device provided in this embodiment includes: at least one environment sensor 1, at least one image sensor 2, at least one terminal node 3, a coordinator 4, and an upper computer 5. The environment sensor 1 collects environment data and the image sensor 2 collects image data, and both are transmitted to the terminal node 3. The terminal node 3 receives the environment data and the image data and sends them to the coordinator 4. The coordinator 4 packs the environment data and image data received from the terminal nodes 3 and sends the packed data to the upper computer 5. The upper computer 5 monitors and receives the packed data from the coordinator 4, generates a decision signal from the packed data, and sends the decision signal to the alarm 6. The environment sensors 1 and the image sensors 2 are electrically connected to the terminal nodes 3 in one-to-one correspondence, the terminal nodes 3 are communicatively connected to the coordinator 4, the coordinator 4 is communicatively connected to the upper computer 5, and the output of the upper computer 5 is connected to the alarm 6. Specifically, the environment sensor 1 and the image sensor 2 are electrically connected to a terminal node 3; the terminal node 3 and the coordinator 4 may communicate over either an electrical connection or a wireless network, as may the coordinator 4 and the upper computer 5; and the upper computer 5 is electrically connected to the alarm 6. In this embodiment, two terminal nodes 3 are provided, which can be placed at different positions. Each terminal node 3 is connected to an environment sensor 1 and an image sensor 2 and collects environment data and image data, which are transmitted in a unified manner to the coordinator 4 for data exchange with the upper computer 5. After receiving the data, the upper computer 5 analyzes and processes it to reach a decision and controls the alarm 6 to raise an alarm.
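Purely by way of illustration, and not as part of the claimed device, the following Python sketch shows one way the coordinator 4 might bundle per-node environment readings and image frames into packed data before forwarding them to the upper computer 5. The structure and field names (NodeReport, PackedFrame, node_id, env, image) are assumptions introduced here for clarity.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class NodeReport:
    node_id: int               # which terminal node 3 produced the data
    env: Dict[str, float]      # e.g. {"temp": ..., "humidity": ..., "smoke": ..., "flame": ..., "co": ...}
    image: bytes               # one frame from the image sensor 2

@dataclass
class PackedFrame:
    reports: List[NodeReport] = field(default_factory=list)

def coordinator_pack(reports: List[NodeReport]) -> PackedFrame:
    """Bundle the latest report from every terminal node into one packed frame."""
    return PackedFrame(reports=list(reports))

# Example with the two terminal nodes of this embodiment
frame = coordinator_pack([
    NodeReport(1, {"temp": 25.1, "humidity": 41.2, "smoke": 0.02, "flame": 0.0, "co": 3.5}, b""),
    NodeReport(2, {"temp": 26.0, "humidity": 39.8, "smoke": 0.03, "flame": 0.0, "co": 4.1}, b""),
])
# The packed frame would then be serialized and sent to the upper computer 5 over the serial/USB link.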
Preferably, the device further comprises a filter circuit 7; the environment sensor 1 is electrically connected to the input of the filter circuit 7, and the output of the filter circuit 7 is electrically connected to the input of the terminal node 3.
Preferably, the environment sensor 1 includes a temperature sensor, a humidity sensor, a smoke sensor, a flame sensor, and a CO concentration sensor, all of which are electrically connected to the terminal node 3. Specifically, each terminal node 3 is connected to a temperature sensor, a humidity sensor, a smoke sensor, a flame sensor, a CO concentration sensor, and an image sensor 2. The terminal node 3 board collects the data of the corresponding sensors through its on-board ADC module and acquires the relevant sensor data over the IIC and USART interfaces. The analog voltage signals are hardware-filtered by the filter circuit 7, and the data collected at the terminal node 3 is then median-filtered to obtain real-time environment data. The terminal node 3 transmits the data wirelessly to the coordinator 4, which converts TTL to the USB protocol and passes the data on to the upper computer 5, where it is integrated.
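As an illustrative sketch only, the median filtering mentioned above could take the form of a sliding-window median over recent samples; the window length and function name are assumptions, not features of the utility model.

from collections import deque
from statistics import median

def median_filter_stream(readings, window=5):
    """Yield the median of the last `window` readings for each new sample."""
    buf = deque(maxlen=window)
    for value in readings:
        buf.append(value)
        yield median(buf)

# Example: smooth a noisy temperature trace containing one spurious spike
raw = [25.0, 25.1, 78.0, 25.2, 25.1, 25.3]   # 78.0 is a spurious spike
print(list(median_filter_stream(raw)))        # the spike is suppressed in the filtered output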
Preferably, the upper computer 5 comprises a convolutional neural network trainer and a monitor. The convolutional neural network trainer extracts features from the environment data and the image data in the packed data, splices them, and generates a fire probability, which it transmits to the monitor; the monitor receives and displays the fire probability. The convolutional neural network trainer is communicatively connected to the coordinator 4 and is connected to the monitor. Specifically, the upper computer 5 integrates the data received from the coordinator 4 as follows. First, the convolutional neural network trainer of the upper computer 5 splices the data collected by the environment sensors 1 into a one-dimensional feature vector S1. The color image collected by the image sensor 2 is denoised and filtered to obtain the final image data; deep-learning feature extraction is applied to this image data to obtain an image convolution feature vector, which is mapped through a fully connected layer to a one-dimensional feature vector S2. The environment-sensor feature vector S1 and the feature vector S2 extracted by the convolutional neural network are then spliced to obtain a one-dimensional feature vector S3; in particular, to ensure that each kind of sensing data has a comparable influence on the network, S1 and S2 can each be normalized before splicing. A full connection is then established between the spliced feature vector S3 and the output; to obtain the fire probability, the output is passed through an activation function, generally a Sigmoid function. Finally, the fire probability value is obtained and sent to the monitor, which displays it; when the probability exceeds a preset value, a decision signal is sent to the alarm 6 to raise an alarm.
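The following PyTorch-style sketch is purely illustrative and is not the patented implementation. It shows one way to realize the described fusion: an image branch produces S2, the normalized environment vector serves as S1, the two are concatenated into S3, and a fully connected layer with a Sigmoid yields the fire probability. All layer sizes and the class name FireFusionNet are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FireFusionNet(nn.Module):
    def __init__(self, n_env=5, img_feat=32):
        super().__init__()
        # Image branch: a small CNN producing the image convolution features
        self.conv = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.img_fc = nn.Linear(16, img_feat)      # maps conv features to the vector S2
        # Output head: fully connected layer over the spliced vector S3
        self.out = nn.Linear(n_env + img_feat, 1)

    def forward(self, env, image):
        s1 = F.normalize(env, dim=1)                                        # normalized environment vector S1
        s2 = F.normalize(self.img_fc(self.conv(image).flatten(1)), dim=1)   # normalized image feature vector S2
        s3 = torch.cat([s1, s2], dim=1)                                     # spliced feature vector S3
        return torch.sigmoid(self.out(s3))                                  # fire probability in [0, 1]

# Example: one sample with 5 environment readings and a 64x64 RGB image
model = FireFusionNet()
prob = model(torch.rand(1, 5), torch.rand(1, 3, 64, 64))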
The above process is the prediction process for a single scene, also called the forward propagation of the network. The process for multiple scenes is similar and is not described again. The fire prediction device disclosed in this application also involves a training stage, i.e. the back propagation of the network. Sufficient sample data must be collected before training. The sample collection procedure is as follows: first, the various sensor data of a single scene are collected and the fire condition is manually labeled as one of two cases, fire or no fire. The scene environment is then varied, for example by adding walking pedestrians, changing physical parameters such as ambient light, temperature, and humidity, or deliberately igniting an object under safe conditions to create a fire-like situation, and sample data for these various conditions is collected repeatedly to obtain a sample set. The extracted data is split into a training set and a test set in a 7:3 ratio. The loss function used in training is the cross-entropy loss between the prediction and the label, and the weights of each layer are updated backwards with the Adam optimization algorithm at a set learning rate.
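For illustration only, a minimal training loop consistent with that description (7:3 split, cross-entropy loss for the binary fire/no-fire label, Adam updates) might look like the sketch below. It reuses the hypothetical FireFusionNet from the previous sketch, and the dataset tensors are placeholders rather than real samples.

import torch
from torch.utils.data import TensorDataset, DataLoader, random_split

# Placeholder dataset: environment readings, images, and manually labeled fire / no-fire targets
env = torch.rand(100, 5)
imgs = torch.rand(100, 3, 64, 64)
labels = torch.randint(0, 2, (100, 1)).float()
dataset = TensorDataset(env, imgs, labels)
train_set, test_set = random_split(dataset, [70, 30])            # 7:3 split into training and test sets
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

model = FireFusionNet()                                           # hypothetical model from the previous sketch
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)         # Adam with a set learning rate
loss_fn = torch.nn.BCELoss()                                      # cross-entropy loss for the two-class label

for epoch in range(10):
    for env_b, img_b, y_b in train_loader:
        prob = model(env_b, img_b)                                # forward propagation
        loss = loss_fn(prob, y_b)                                 # cross-entropy of prediction vs. label
        optimizer.zero_grad()
        loss.backward()                                           # back propagation
        optimizer.step()                                          # update the weights of each layer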
Preferably, the terminal node 3 and the coordinator 4 are a CC2530 terminal node and a CC2530 coordinator.
Preferably, the temperature sensor is a DS18B20 sensor.
Preferably, the humidity sensor is an SHT20 sensor.
Preferably, the smoke sensor is an MQ-2 smoke gas sensor.
Preferably, the flame sensor is an infrared light receiving sensor.
Preferably, the CO concentration sensor is an MS2200 carbon monoxide sensor.
Preferably, the image sensor 2 is an AI-SU500C color CCD camera.
The above sensors may also be replaced by other sensors of the same type that achieve a similar acquisition effect.
The above description is only a preferred embodiment of the present utility model and should not be taken as limiting it; any modifications, equivalent replacements, improvements, and the like made within the spirit and principle of the present utility model shall fall within its protection scope.

Claims (10)

1. A fire prediction device, characterized by comprising: at least one environment sensor (1), at least one image sensor (2), at least one terminal node (3), a coordinator (4) and an upper computer (5);
the environment sensor (1) is used for collecting environment data, the image sensor (2) is used for collecting image data, and both the environment data and the image data are transmitted to the terminal node (3);
the terminal node (3) is used for receiving the environment data and the image data and sending them to the coordinator (4);
the coordinator (4) is used for packing the environment data and the image data received from the terminal node (3) and sending the packed data to the upper computer (5);
the upper computer (5) is used for monitoring and receiving the packed data from the coordinator (4), generating a decision signal according to the packed data and sending the decision signal to the alarm (6);
the environment sensor (1) and the image sensor (2) are electrically connected with the terminal node (3) in one-to-one correspondence, the terminal node (3) is in communication connection with the coordinator (4), the coordinator (4) is in communication connection with the upper computer (5), and the output end of the upper computer (5) is connected with the alarm (6).
2. A fire prediction device as defined in claim 1, wherein: the device further comprises a filter circuit (7), the environment sensor (1) is electrically connected with the input end of the filter circuit (7), and the output end of the filter circuit (7) is electrically connected with the input end of the terminal node (3).
3. A fire prediction device as defined in claim 1, wherein: the upper computer (5) comprises a convolutional neural network trainer and a monitor;
the convolutional neural network trainer is used for generating fire probability after feature extraction data splicing is carried out on the environmental data and the image data in the packed data, and transmitting the fire probability to the monitor;
the monitor is used for receiving the fire probability and displaying the fire probability;
the convolutional neural network trainer is in communication connection with the coordinator (4), and is connected with the monitor.
4. A fire prediction device as defined in claim 1, wherein: the environment sensor (1) comprises a temperature sensor, a humidity sensor, a smoke sensor, a flame sensor and a CO concentration sensor, and the temperature sensor, the humidity sensor, the smoke sensor, the flame sensor and the CO concentration sensor are all electrically connected with the terminal node (3).
5. A fire prediction device as defined in claim 4, wherein: the temperature sensor is a DS18B20 sensor.
6. A fire prediction device as defined in claim 4, wherein: the terminal node (3) and the coordinator (4) are a CC2530 terminal node (3) and a CC2530 coordinator (4).
7. A fire prediction device as defined in claim 4, wherein: the smoke sensor is an MQ-2 smoke gas-sensitive sensor.
8. A fire prediction device as defined in claim 4, wherein: the flame sensor is an infrared light receiving sensor.
9. A fire prediction device as defined in claim 4, wherein: the CO concentration sensor is an MS2200 carbon monoxide sensor.
10. A fire prediction device as defined in claim 4, wherein: the image sensor (2) is an AI-SU500C color CCD camera.
CN202022360296.1U 2020-10-21 2020-10-21 Fire prediction device Active CN213424156U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202022360296.1U CN213424156U (en) 2020-10-21 2020-10-21 Fire prediction device

Publications (1)

Publication Number Publication Date
CN213424156U true CN213424156U (en) 2021-06-11

Family

ID=76268378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202022360296.1U Active CN213424156U (en) 2020-10-21 2020-10-21 Fire prediction device

Country Status (1)

Country Link
CN (1) CN213424156U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115273385A (en) * 2022-07-11 2022-11-01 杭州海康威视数字技术股份有限公司 Camera for flame detection
CN115273385B (en) * 2022-07-11 2024-03-26 杭州海康威视数字技术股份有限公司 A camera for flame detection

Similar Documents

Publication Publication Date Title
CN104966375A (en) Security monitoring system and monitoring method
CN106652285A (en) Distributed multi-defence-area vibration optical fiber perimeter alarm system and perimeter monitoring method
AU2021100365A4 (en) A multi-sensor-based intelligent monitoring and early warning system and method for dam safety
CN102208018A (en) Method for recognizing fire disaster of power transmission line based on video variance analysis
CN213424156U (en) Fire prediction device
CN117319451B (en) Urban fire-fighting Internet of things supervision system based on multi-mode big data and method thereof
CN113192283B (en) Wireless fire early warning system with multi-sensor information fusion
CN113160513A (en) Flame detection device based on multisensor
CN104240418B (en) A kind of signal processing method and warning device
CN201091014Y (en) Fire detecting device
CN108010254A (en) One kind is based on four wave band infrared flame detectors and its flame identification algorithm
CN114460080A (en) Rice disease and pest intelligent monitoring system
CN112785803A (en) Monitoring system based on Internet of things
CN110930632B (en) Early warning system based on artificial intelligence
CN106548592A (en) A kind of Household security system based on Internet of Things
CN111986436A (en) Comprehensive flame detection method based on ultraviolet and deep neural networks
CN210271155U (en) Composite smoke-sensing detection labyrinth
CN114255562A (en) Wisdom fire control early warning system based on thing networking
CN112947147A (en) Fire-fighting robot based on multi-sensor and cloud platform algorithm
CN116863629A (en) Alarm device, alarm method and electronic equipment based on multiple sensors
CN115931141A (en) Temperature identification method of infrared temperature measurement map based on improved ANN algorithm
CN212569977U (en) Fire identification alarm device based on grey and RBF double-layer neural networks
CN113553996A (en) Dangerous person discovering system based on deep learning neural network emotion recognition
CN205264012U (en) High accuracy natural gas leakage detector based on thing networking
CN210222823U (en) Community population flow analysis system based on probe

Legal Events

Date Code Title Description
GR01 Patent grant