CN110427022B - Fire-fighting hidden danger detection robot based on deep learning and detection method


Info

Publication number
CN110427022B
Authority
CN
China
Prior art keywords: fire, rgb, camera, hidden danger, fighting
Prior art date
Legal status
Active
Application number
CN201910609252.7A
Other languages
Chinese (zh)
Other versions
CN110427022A (en)
Inventor
赵熙桐
邝佳
程磊
刘通
李峻
刘江莹
姚栋
Current Assignee
Wuhan University of Science and Engineering WUSE
Original Assignee
Wuhan University of Science and Engineering WUSE
Priority date
Filing date
Publication date
Application filed by Wuhan University of Science and Engineering WUSE
Priority to CN201910609252.7A
Publication of CN110427022A
Application granted
Publication of CN110427022B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Fire-Detection Mechanisms (AREA)
  • Fire Alarms (AREA)

Abstract

The invention discloses a fire-fighting hidden danger detection robot based on deep learning and a detection method. The detection robot comprises a mobile robot body, a control system, a thermal infrared imager, an RGB-D camera, a laser radar and a gas sensor; the control system, the thermal infrared imager, the RGB-D camera, the laser radar and the gas sensor are all fixedly arranged on the mobile robot body, and the control system is respectively connected with and communicates with the mobile robot body, the thermal infrared imager, the RGB-D camera, the laser radar and the gas sensor. The control system controls the mobile robot body to patrol automatically in the target area while the RGB-D camera acquires RGB images and depth information of the target area in real time; positioning and map construction are completed through the RGB-D camera and the laser radar; fire-fighting hidden danger detection and smoldering fire detection are then carried out in real time; and finally, the fire-fighting hidden dangers are classified and early-warned. The invention forms a visual interface that facilitates real-time monitoring by personnel.

Description

Fire-fighting hidden danger detection robot based on deep learning and detection method
Technical Field
The invention belongs to the technical field of image recognition, relates to a fire-fighting hidden danger detection robot and a detection method, and particularly relates to a fire-fighting hidden danger detection robot system and a fire-fighting hidden danger detection method based on deep learning.
Background
With the rapid growth of electronic commerce, the safety requirements for e-commerce warehouse logistics are becoming ever higher. The freight transfer warehouses of logistics companies are characterized by a wide variety of goods, large inventories and the accumulation of flammable packaging cartons. Once a fire breaks out in a logistics center, the losses are heavy and the lessons are profound. At present, fire detection products on the market generally use traditional image processing methods to detect flames that have already emerged, and cannot detect smoldering conditions or other existing fire-fighting hidden dangers in the environment in time. Meanwhile, the Fire Department of the Ministry of Public Security has issued guiding opinions on comprehensively promoting the construction of smart fire protection. According to Article 26 in Chapter 4 of the fire-fighting rules, walkways, stairs, exits and similar places in buildings must be kept clear at all times and stacking articles there is forbidden, and evacuation signs and indicator lamps must be kept complete and in good working order. The Fire Protection Law of the People's Republic of China likewise establishes, for fire-protection work, the guideline of "prevention first, combining prevention with fire fighting". At present, fire-fighting hidden dangers are inspected manually, which has the problems of high cost, low efficiency and failure to discover hidden dangers in time.
Disclosure of Invention
In order to detect fire-fighting hidden dangers more efficiently, the invention provides a fire-fighting hidden danger detection robot and a detection method based on deep learning. Detected information about fire-fighting hidden dangers, such as position, temperature, gas concentration and images, is published to an upper computer through the ROS distributed system, forming a visual interface that facilitates real-time monitoring by personnel.
The technical scheme adopted by the detection robot is as follows: a fire-fighting hidden danger detection robot based on deep learning, characterized in that it comprises a mobile robot body, a control system, a thermal infrared imager, an RGB-D camera, a laser radar and a gas sensor;
the control system, the thermal infrared imager, the RGB-D camera, the laser radar and the gas sensor are all fixedly arranged on the mobile robot body; and the control system is respectively connected and communicated with the mobile robot body, the thermal infrared imager, the RGB-D camera, the laser radar and the gas sensor.
The detection method adopts the technical scheme that: a fire-fighting hidden danger detection method based on deep learning is characterized by comprising the following steps:
step 1: the control system controls the mobile robot body to automatically patrol in the target area, and the RGB-D camera acquires RGB images of the target area and depth information of the target area in real time; completing positioning and map construction (SLAM) by an RGB-D camera and a laser radar;
step 2: carrying out fire-fighting hidden danger detection in real time in the patrol process of the mobile robot body;
step 3: detecting smoldering fire in real time in the inspection process of the mobile robot body;
step 4: carrying out classified early warning on fire-fighting hidden dangers.
According to the method, visual-laser SLAM (Simultaneous Localization and Mapping) is used to understand the warehouse environment and construct a navigation map; compared with mapping from the laser alone, this provides richer environment information and presents three-dimensional environment information on a two-dimensional plane. Fire-fighting potential safety hazards in the warehouse are detected and identified, and the coordinates of potential danger areas and of smoldering points are determined by matching thermal images with RGB-D camera images; whereas the prior art only raises an alarm once a fire already exists, the invention gives early warning of, and locates, fire-fighting hidden dangers and smoldering conditions. Compared with traditional alarm systems, the invention performs graded early warning according to the degree of fire-fighting risk.
Drawings
FIG. 1 is a schematic structural diagram of a detection robot according to an embodiment of the present invention;
FIG. 2 is a method schematic of an embodiment of the invention;
FIG. 3 is a flow chart of location and mapping SLAM according to an embodiment of the present invention;
FIG. 4 is a SLAM effect diagram of an embodiment of the present invention, in which (a) the laboratory scene and (b) the laser mapping effect;
FIG. 5 is a schematic diagram of the improved YOLOV3 basic network structure according to an embodiment of the present invention;
fig. 6 is a diagram illustrating the detection effect of fire hazard in different scenarios according to the embodiment of the present invention, where (a) a normal fire extinguisher cabinet, (b) an abnormal fire extinguisher cabinet, (c) a normal fire indicator lamp, and (d) an abnormal fire indicator lamp;
FIG. 7 is a flow chart of smoldering fire detection in accordance with an embodiment of the present invention;
FIG. 8 is a diagram of positioning and detecting smoldering fires, wherein (a) an RGB smoldering point diagram, (b) an infrared smoldering point diagram, and (c) a smoldering trace tracking diagram, according to an embodiment of the present invention;
FIG. 9 is a graph showing the change in concentration of gas in air according to an embodiment of the present invention;
FIG. 10 is a block diagram of a fuzzy inference decision model according to an embodiment of the present invention.
Detailed Description
In order to facilitate the understanding and implementation of the present invention for those of ordinary skill in the art, the present invention is further described in detail with reference to the accompanying drawings and examples, it is to be understood that the embodiments described herein are merely illustrative and explanatory of the present invention and are not restrictive thereof.
Referring to fig. 1, the fire-fighting hidden danger detection robot based on deep learning provided by the invention comprises a mobile robot body 1, a control system 2, a thermal infrared imager 3, an RGB-D camera 4, a laser radar 5 and a gas sensor 6; the control system 2, the thermal infrared imager 3, the RGB-D camera 4, the laser radar 5 and the gas sensor 6 are all fixedly arranged on the mobile robot body 1; the control system 2 is respectively connected and communicated with the mobile robot body 1, the thermal infrared imager 3, the RGB-D camera 4, the laser radar 5 and the gas sensor 6.
The gas sensor 6 of the present embodiment includes a CO gas sensor, a CO2 gas sensor, and a combustible gas sensor.
The thermal infrared imager 3 and the RGB-D camera 4 of the present embodiment are located on the same vertical plane.
Referring to fig. 2, the method for detecting fire hazard based on deep learning provided by the invention comprises the following steps:
step 1: the control system 2 controls the mobile robot body 1 to automatically patrol in the target area, and the RGB-D camera 4 acquires RGB images of the target area and depth information of the target area in real time; simultaneous localization and mapping (SLAM) is completed through the RGB-D camera 4 and the laser radar 5;
the control system 2 of the present embodiment is provided with an ROS system and a tensrflow; the method is executed under an ROS operating system and is divided into a laser SLAM inspection system, a fire-fighting hidden danger detection system and a fire-fighting hidden danger grading early warning system which are integrated with vision according to different functions;
referring to fig. 3, the positioning and mapping in this embodiment is specifically implemented by the following sub-steps:
step 1.1: constructing a laser local grid map by using the Hector_SLAM algorithm;
step 1.2: in the map updating stage, a Bayes estimation method is adopted, point cloud information acquired by the RGB-D camera 4 is fused, a two-dimensional discrete obstacle map is formed by projection, and a two-dimensional local grid map is constructed;
step 1.3: and fusing the local grid maps to construct a global two-dimensional grid map.
Please refer to fig. 4, which is a SLAM effect diagram.
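The Bayes map update of step 1.2 is commonly carried out in log-odds form; the sketch below shows how laser hits and projected RGB-D obstacle points could be fused into one two-dimensional grid. The grid size, cell indices and sensor-model probabilities are assumptions for illustration, not values from the patent.

```python
import numpy as np

# Log-odds occupancy grid fusion: a minimal sketch, assuming a fixed-size local
# grid and simple inverse sensor models for the laser and the projected RGB-D cloud.
L_OCC_LASER = np.log(0.7 / 0.3)   # assumed hit probability for a laser return
L_OCC_RGBD  = np.log(0.6 / 0.4)   # assumed hit probability for a projected depth point

def update_cells(log_odds, cells, l_update):
    """Bayes update in log-odds form for a list of (row, col) grid cells."""
    for r, c in cells:
        if 0 <= r < log_odds.shape[0] and 0 <= c < log_odds.shape[1]:
            log_odds[r, c] += l_update
    return log_odds

def occupancy_probability(log_odds):
    """Convert log-odds back to an occupancy probability in [0, 1]."""
    return 1.0 - 1.0 / (1.0 + np.exp(log_odds))

# Usage sketch: fuse one laser hit and one projected RGB-D obstacle point.
grid = np.zeros((200, 200))                    # 200 x 200 local grid, prior log-odds 0
grid = update_cells(grid, [(100, 120)], L_OCC_LASER)
grid = update_cells(grid, [(100, 120), (101, 120)], L_OCC_RGBD)
print(occupancy_probability(grid)[100, 120])   # cell supported by both sensors
```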
Step 2: during the patrol process of the mobile robot body 1, fire-fighting hidden danger detection is carried out in real time;
the specific implementation of the embodiment includes the following sub-steps:
step 2.1: manufacturing a fire-fighting hidden danger detection data set;
acquiring images of a power transformation room, a fire extinguisher cabinet, an escape door and an evacuation indicator light in different indoor scenes, and enhancing the data by rotation through a certain angle, translation, enlargement, reduction and the like so as to improve generalization, with the training set accounting for 60%, the test set for 20% and the validation set for 20%; please refer to Table 1, which is the data set of the present embodiment (a sketch of this preprocessing is given after the table);
TABLE 1
Label Equipment Train Validate Test Total
nor0 Normal transformer room 786 262 262 1310
ab1 Abnormal transformer room 817 275 275 1367
nor2 Normal fire extinguisher box 767 255 255 1277
ab3 Abnormal fire extinguisher box 889 295 295 1483
nor4 Normal escape door 859 287 287 1433
ab5 Abnormal escape door 862 287 287 1436
nor6 Normal evacuation indicator lamp 782 260 260 1302
ab7 Abnormal evacuation indicator lamp 877 293 293 1463
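The data enhancement and 60%/20%/20% split of step 2.1 could be reproduced along the following lines; the OpenCV transforms and parameter ranges are assumptions for illustration, since the patent only names rotation through a certain angle, translation, enlargement and reduction.

```python
import random
import cv2

def augment(image):
    """Apply one random rotation / translation / scaling, as named in step 2.1."""
    h, w = image.shape[:2]
    angle = random.uniform(-15, 15)          # rotation through a small angle (assumed range)
    scale = random.uniform(0.8, 1.2)         # enlargement / reduction (assumed range)
    tx, ty = random.randint(-20, 20), random.randint(-20, 20)  # translation in pixels
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
    M[0, 2] += tx
    M[1, 2] += ty
    return cv2.warpAffine(image, M, (w, h))

def split_dataset(samples, train=0.6, validate=0.2):
    """Shuffle and split into 60% train, 20% validate, 20% test as in Table 1."""
    random.shuffle(samples)
    n = len(samples)
    n_train, n_val = int(n * train), int(n * validate)
    return samples[:n_train], samples[n_train:n_train + n_val], samples[n_train + n_val:]
```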
Step 2.2: changing the basic network in the target detection network YOLOV3 into a vgg16 form, and adding a BN layer behind each convolution layer for batch normalization, which prevents the gradient from vanishing or exploding and accelerates training; changing the non-overlapping maximum pooling layers into overlapping pooling layers to avoid overfitting; removing the three fully connected layers and adding a Bottleneck layer, so that the amount of computation is reduced.
Please refer to fig. 5 for the improved YOLOV3 network structure; Table 2 gives the test accuracy of the improved YOLOV3 network and the number of frames detected per second (Table 2 appears as an image in the original publication, so its values are not reproduced here);
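A minimal TensorFlow/Keras sketch of the backbone changes described in step 2.2 is given below: VGG16-style 3x3 convolutions with a BN layer after every convolution, overlapping max pooling (kernel 3, stride 2) in place of non-overlapping pooling, and a 1x1 Bottleneck layer in place of the fully connected layers. The channel counts, depth and input size are assumptions, not the patent's exact configuration.

```python
from tensorflow.keras import layers, models

def conv_bn(x, filters):
    """3x3 convolution followed by batch normalization and ReLU (BN after each conv)."""
    x = layers.Conv2D(filters, 3, padding='same', use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    return layers.ReLU()(x)

def vgg_style_backbone(input_shape=(416, 416, 3)):
    inputs = layers.Input(shape=input_shape)
    x = conv_bn(inputs, 64)
    x = conv_bn(x, 64)
    x = layers.MaxPool2D(pool_size=3, strides=2, padding='same')(x)  # overlapping pooling
    x = conv_bn(x, 128)
    x = conv_bn(x, 128)
    x = layers.MaxPool2D(pool_size=3, strides=2, padding='same')(x)
    x = conv_bn(x, 256)
    x = conv_bn(x, 256)
    x = conv_bn(x, 256)
    x = layers.MaxPool2D(pool_size=3, strides=2, padding='same')(x)
    # 1x1 Bottleneck layer replacing the three fully connected layers.
    x = layers.Conv2D(128, 1, activation='relu')(x)
    return models.Model(inputs, x)

model = vgg_style_backbone()
model.summary()
```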
Step 2.3: inputting the pixel RGB image into an improved YOLOV3 network for training, and outputting a trained RGB image;
step 2.4: carrying out fire-fighting hidden danger detection in different scenes, and outputting the detected RGB images and the coordinates where fire-fighting hidden dangers may exist.
Step 3: in the inspection process of the mobile robot body 1, smoldering fire detection is carried out in real time;
referring to fig. 7, in the present embodiment, the specific implementation of step 3 includes the following sub-steps:
step 3.1: acquiring a thermal infrared image and the temperature of the target area in real time by using the thermal infrared imager 3; the gas sensor 6 acquires the CO concentration, the CO2 concentration and the combustible gas concentration in the target area in real time;
step 3.2: the RGB-D camera and the thermal infrared imager are combined and regarded as a binocular camera; using a self-made thermal infrared chessboard, the corner pixel coordinates in the RGB image are matched with those in the thermal infrared image to obtain the correspondence between each thermal-image pixel coordinate and the pixel coordinates in the RGB image, and thereby the depth information of each thermal-image pixel coordinate;
in this embodiment, the thermal infrared chessboard is composed of a 28cm by 28cm heating plate and a heat insulating material, the heat insulating material is cut into 4cm by 4cm squares, the heating plate is covered with a heat insulating square every 4cm, and the heat insulating square and the uncovered area form a 7 by 7 chessboard.
Step 3.3: the smoldering trend judgment is carried out by combining the data of the gas sensor and the temperature data;
step 3.4: detecting smoldering, and outputting the temperature, the CO, CO2 and combustible gas data, and the smoldering position coordinates.
Please refer to fig. 8, which shows the positioning and detection of smoldering fire, wherein (a) is the RGB smoldering point diagram, (b) the infrared smoldering point diagram and (c) the smoldering trace tracking diagram, and to fig. 9, which shows the change of gas concentration in the air.
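The trend judgment of step 3.3 could, for example, flag a smoldering tendency when both the temperature and the CO concentration rise steadily over a short window; the thresholds, window length and readings in the sketch below are illustrative assumptions, as the patent does not publish concrete values.

```python
import numpy as np

def rising_slope(samples):
    """Least-squares slope of a series of evenly spaced sensor samples."""
    t = np.arange(len(samples))
    return np.polyfit(t, np.asarray(samples, dtype=float), 1)[0]

def smoldering_suspected(temps_c, co_ppm,
                         temp_slope_min=0.5,   # degC per sample (assumed threshold)
                         co_slope_min=2.0,     # ppm per sample (assumed threshold)
                         temp_abs_min=45.0):   # degC (assumed threshold)
    """Flag a smoldering tendency when temperature and CO both rise steadily."""
    return (rising_slope(temps_c) > temp_slope_min and
            rising_slope(co_ppm) > co_slope_min and
            max(temps_c) > temp_abs_min)

# Usage sketch with made-up readings over a 10-sample window.
print(smoldering_suspected([40, 42, 44, 47, 49, 52, 55, 57, 60, 63],
                           [5, 8, 12, 18, 25, 33, 40, 48, 57, 66]))
```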
Step 4: carrying out graded early warning on fire-fighting hidden dangers;
in this embodiment, the specific implementation of step 4 includes the following sub-steps:
step 4.1: using a fuzzy decision method, dividing the fire-fighting hidden dangers into three fuzzy sets: smoldering fire (PL), fire-fighting hidden danger present (PM) and no abnormal condition (PS);
please refer to fig. 10, which is a fuzzy inference decision model diagram of the present embodiment, specifically including the following steps:
step 4.1.1: the information layer is mainly responsible for collecting physical-quantity and chemical-quantity features from the sensors; the collected signals undergo a series of preprocessing steps, and the processed signals are sent to the characteristic layer through the A/D converter;
step 4.1.2: adopting a BP neural network as a processing algorithm of a characteristic layer;
step 4.1.3: in the decision layer, the potential fire-fighting hidden dangers and the fire recognition results are fused to obtain the final decision output.
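As an illustration of the decision layer of step 4.1, the sketch below maps a fused risk score (which in the embodiment would come from the BP neural network of the characteristic layer) onto the three fuzzy sets PL, PM and PS. The triangular membership functions and the [0, 1] score range are assumptions for illustration, not the patent's exact model.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_risk(score):
    """Assign a fused risk score in [0, 1] to the fuzzy sets PS, PM, PL (assumed shapes)."""
    memberships = {
        'PS_no_abnormality':   tri(score, -0.01, 0.0, 0.4),
        'PM_hidden_danger':    tri(score, 0.2, 0.5, 0.8),
        'PL_smoldering_fire':  tri(score, 0.6, 1.0, 1.01),
    }
    label = max(memberships, key=memberships.get)
    return label, memberships

# Usage sketch: a high fused score leans towards the smoldering-fire set PL.
print(classify_risk(0.72))
```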
Step 4.2: outputting a smoldering condition as a primary alert; outputting the abnormal conditions that the fire extinguisher cabinet, the escape door, the indicator light and the transformer substation chamber do not conform to the fire-fighting rules and the fire-fighting regulations of the people's republic of China, such as the damage of the fire extinguisher cabinet, the blockage of the escape door, the damage of the indicator light, the shielding of the door of the transformer substation and the like, as a secondary alarm;
step 4.3: the position coordinates, the temperature, the CO and CO2 concentrations, the combustible gas concentration and the detection picture information of places with fire-fighting hidden dangers are visually displayed on the display of the control system 2.
Different from traditional fire detection systems, the invention combines fire-fighting inspection with the detection of various fire-fighting hidden dangers and of smoldering conditions, and can give early warning in time before an open fire occurs, so that the incidence of fire in the many places prone to fire-fighting hidden dangers, such as logistics warehouses and superstores, can be greatly reduced.
It should be understood that parts of the specification not set forth in detail are well within the prior art.
It should be understood that the above description of the preferred embodiments is given for clarity and not for any purpose of limitation, and that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (4)

1. A fire-fighting hidden danger detection method based on deep learning is characterized in that a fire-fighting hidden danger detection robot based on deep learning is adopted;
the method is characterized in that: the fire-fighting hidden danger detection robot comprises a mobile robot body (1), a control system (2), a thermal infrared imager (3), an RGB-D camera (4), a laser radar (5) and a gas sensor (6);
the control system (2), the thermal infrared imager (3), the RGB-D camera (4), the laser radar (5) and the gas sensor (6) are all fixedly arranged on the mobile robot body (1); the control system (2) is respectively connected and communicated with the mobile robot body (1), the thermal infrared imager (3), the RGB-D camera (4), the laser radar (5) and the gas sensor (6);
the method comprises the following steps:
step 1: the control system (2) controls the mobile robot body (1) to automatically patrol in the target area, and the RGB-D camera (4) acquires RGB images of the target area and depth information of the target area in real time; positioning and map construction are completed through an RGB-D camera (4) and a laser radar (5);
the positioning and mapping are specifically realized by the following substeps:
step 1.1: constructing a laser local grid map by using a Hector _ SLAM algorithm;
step 1.2: in the map updating stage, a Bayes estimation method is adopted, point cloud information acquired by an RGB-D camera (4) is fused, a two-dimensional discrete obstacle map is formed by projection, and a two-dimensional local grid map is constructed;
step 1.3: fusing local grid maps and constructing a global two-dimensional grid map;
step 2: during the patrol process of the mobile robot body (1), fire-fighting hidden danger detection is carried out in real time;
the specific implementation of the step 2 comprises the following substeps:
step 2.1: manufacturing a fire-fighting hidden danger detection data set;
acquiring images of a power transformation room, a fire extinguisher cabinet, an escape door and an evacuation indicator light in different indoor scenes, and dividing a data set into a training set, a testing set and a verification set after data enhancement;
step 2.2: changing a basic network in a target detection network YOLOV3 into a vgg16 form, and simultaneously adding a BN layer behind each convolution layer for batch normalization; changing the non-overlapping maximum pooling layer into an overlapping pooling layer; removing three full connection layers, and adding a Bottleneck layer;
step 2.3: inputting the pixel RGB image into an improved YOLOV3 network for training, and outputting a trained RGB image;
step 2.4: carrying out fire-fighting hidden danger detection in different scenes, and outputting detected RGB images and coordinates which may have fire-fighting hidden dangers;
step 3: in the patrol process of the mobile robot body (1), smoldering fire detection is carried out in real time;
the specific implementation of the step 3 comprises the following substeps:
step 3.1: acquiring a thermal infrared image and the temperature of the target area in real time by using the thermal infrared imager (3); the gas sensor (6) acquires the CO concentration, the CO2 concentration and the combustible gas concentration in the target area in real time;
step 3.2: the combined RGB-D camera and thermal infrared imager are regarded as a binocular camera, the RGB image is matched with corner pixel coordinates in the thermal infrared image by using a thermal infrared chessboard, the matching relation between the pixel coordinates of each thermal image and the pixel coordinates in the RGB image is obtained, and further the depth information of each thermal image pixel coordinate is obtained;
step 3.3: the smoldering trend judgment is carried out by combining the data of the gas sensor and the temperature data;
step 3.4: detecting smoldering, and outputting the temperature, the CO, CO2 and combustible gas data, and the smoldering position coordinates;
step 4: carrying out graded early warning on fire-fighting hidden dangers;
the specific implementation of the step 4 comprises the following substeps:
step 4.1: the data processing of the fire-fighting early warning system adopts a three-layer structure of information layer, characteristic layer and decision layer; using a fuzzy decision method, the fire-fighting hidden dangers are divided into three fuzzy sets: smoldering fire (PL), fire-fighting hidden danger present (PM) and no abnormal condition (PS);
step 4.1.1: the information layer is responsible for collecting physical-quantity and chemical-quantity features from the sensors; the collected signals undergo a series of preprocessing steps, and the processed signals are sent to the characteristic layer through the A/D converter;
step 4.1.2: adopting a BP neural network as a processing algorithm of a characteristic layer;
step 4.1.3: in a decision layer, fusing the potential fire-fighting hidden danger and the fire recognition result to obtain final decision output;
step 4.2: outputting a smoldering condition as a primary alarm; outputting a secondary alarm when the fire extinguisher cabinet, the escape door, the indicator lamp or the power transformation room does not conform to the fire-fighting rules or the Fire Protection Law of the People's Republic of China;
step 4.3: visually displaying the position coordinates, the temperature, the CO and CO2 concentrations, the combustible gas concentration and the detection picture information of places with fire-fighting hidden dangers on the display of the control system (2).
2. The method of claim 1, wherein: the gas sensor (6) comprises a CO gas sensor, a CO2 gas sensor and a combustible gas sensor.
3. The method of claim 1, wherein: the thermal infrared imager (3) and the RGB-D camera (4) are positioned on the same vertical plane.
4. The method of claim 1, wherein: in step 3.2, the thermal infrared chessboard consists of a 28cm by 28cm heating plate and thermal insulation materials, the thermal insulation materials are cut into 4cm by 4cm squares, the heating plate is covered with a thermal insulation square every 4cm, and the thermal insulation square and the uncovered area form a 7 by 7 chessboard.
CN201910609252.7A 2019-07-08 2019-07-08 Fire-fighting hidden danger detection robot based on deep learning and detection method Active CN110427022B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910609252.7A CN110427022B (en) 2019-07-08 2019-07-08 Fire-fighting hidden danger detection robot based on deep learning and detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910609252.7A CN110427022B (en) 2019-07-08 2019-07-08 Fire-fighting hidden danger detection robot based on deep learning and detection method

Publications (2)

Publication Number Publication Date
CN110427022A CN110427022A (en) 2019-11-08
CN110427022B (en) 2022-03-15

Family

ID=68410345

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910609252.7A Active CN110427022B (en) 2019-07-08 2019-07-08 Fire-fighting hidden danger detection robot based on deep learning and detection method

Country Status (1)

Country Link
CN (1) CN110427022B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110889364A (en) * 2019-11-21 2020-03-17 大连理工大学 Method for constructing grid map by using infrared sensor and visible light sensor
CN110910498B (en) * 2019-11-21 2021-07-02 大连理工大学 Method for constructing grid map by using laser radar and binocular camera
CN110942401B (en) * 2019-11-21 2023-12-19 黑龙江电力调度实业有限公司 Intelligent communication method for electric power Internet of things
CN111015687A (en) * 2019-12-31 2020-04-17 江苏顺飞信息科技有限公司 Industrial combustible gas leakage source detection robot and working method
CN111652261A (en) * 2020-02-26 2020-09-11 南开大学 Multi-modal perception fusion system
CN111223265B (en) * 2020-04-16 2020-07-28 上海翼捷工业安全设备股份有限公司 Fire detection method, device, equipment and storage medium based on neural network
CN112509269A (en) * 2020-10-30 2021-03-16 重庆电子工程职业学院 Wireless fire alarm system
CN113384844B (en) * 2021-06-17 2022-01-28 郑州万特电气股份有限公司 Fire extinguishing action detection method based on binocular vision and fire extinguisher safety practical training system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107065874A (en) * 2017-04-13 2017-08-18 常州大学怀德学院 A kind of fire patrol intelligent vehicle based on laser SLAM technologies
CN106843240A (en) * 2017-04-18 2017-06-13 哈尔滨理工大学 System is coordinated by fire prevention robot and multimachine based on intelligent video
CN107193277B (en) * 2017-05-05 2020-05-08 宁波华狮智能科技有限公司 Autonomous mobile fire-fighting robot capable of automatically detecting and extinguishing fire and control method
CN108538007A (en) * 2018-04-09 2018-09-14 安徽大学 A kind of inside fire early warning system and method based on radar avoidance trolley platform
CN109276833A (en) * 2018-08-01 2019-01-29 吉林大学珠海学院 A kind of robot patrol fire-fighting system and its control method based on ROS
CN109356652B (en) * 2018-10-12 2020-06-09 深圳市翌日科技有限公司 Underground self-adaptive fire classification early warning method and system

Also Published As

Publication number Publication date
CN110427022A (en) 2019-11-08

Similar Documents

Publication Publication Date Title
CN110427022B (en) Fire-fighting hidden danger detection robot based on deep learning and detection method
Yuan et al. Fire detection using infrared images for UAV-based forest fire surveillance
CN109646853A (en) A kind of autonomous fire fighting robot device and monitoring system
Yuan et al. Vision-based forest fire detection in aerial images for firefighting using UAVs
GB2547416A (en) A fire detection system
CN111639825B (en) Forest fire indication escape path method and system based on A-Star algorithm
Zhu et al. Intelligent fire monitor for fire robot based on infrared image feedback control
CN109472411A (en) The adaptive emergency evacuation navigation system of large scale business synthesis
CN111686392A (en) Artificial intelligence fire extinguishing system is surveyed to full scene of vision condition
CN114272548B (en) Intelligent fire extinguishing equipment for buildings and fire extinguishing method thereof
CN112101181A (en) Automatic hidden danger scene recognition method and system based on deep learning
CN105975991B (en) A kind of improved extreme learning machine fire kind recognition methods
CN114186735B (en) Fire emergency lighting lamp layout optimization method based on artificial intelligence
Zhang et al. Mobile sentry robot for laboratory safety inspection based on machine vision and infrared thermal imaging detection
CN112488423A (en) Method for planning escape path of trapped personnel in fire scene
CN115578684A (en) Special robot cooperative auxiliary rescue control method for building fire fighting
CN107894285A (en) A kind of infrared temperature inspection device and method based on augmented reality
CN116764147A (en) Carry on extinguishing device's explosion-proof unmanned car of patrolling and examining
CN113569801A (en) Distribution construction site live equipment and live area identification method and device thereof
Ko et al. Intelligent wireless sensor network for wildfire detection
CN113657238B (en) Fire early warning method based on neural network, storage medium and terminal equipment
Aspragathos et al. THEASIS System for Early Detection of Wildfires in Greece: Preliminary Results from its Laboratory and Small Scale Tests
CN116739870B (en) Emergency system management system and method
Fujita et al. Collapsed Building Detection Using Multiple Object Tracking from Aerial Videos and Analysis of Effective Filming Techniques of Drones
CN114442606B (en) Alert condition early warning robot and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant