CN116152667A - Fire detection method and device, electronic equipment and storage medium

Fire detection method and device, electronic equipment and storage medium

Info

Publication number
CN116152667A
CN116152667A
Authority
CN
China
Prior art keywords
image
target object
target
area
alarm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310395650.XA
Other languages
Chinese (zh)
Other versions
CN116152667B (en)
Inventor
王隐之
李�诚
周晓
祝克帅
吴罕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Intelingda Information Technology Co ltd
Intelingda Information Technology Shenzhen Co ltd
Original Assignee
Hefei Intelingda Information Technology Co ltd
Intelingda Information Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Intelingda Information Technology Co ltd and Intelingda Information Technology Shenzhen Co ltd
Priority to CN202310395650.XA
Publication of CN116152667A
Application granted
Publication of CN116152667B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/28 Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/28 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming

Abstract

Embodiments of the present application provide a fire detection method and device, an electronic device, and a storage medium, relating to the technical field of security. The fire detection method includes: acquiring a first image; detecting the first image using a target detection network to obtain a target object detected as smoke or flame; when the target object is a foreground in the first image, determining the total frame number of alarm images among a preset number of consecutive frame images before the first image, where the alarm images are the images among the preset number of frame images that include the target object; and if the total frame number is greater than or equal to a preset frame number threshold, performing structural similarity and dynamic characteristic determinations on the target object to obtain a fire detection result. By applying the technical solution provided by the embodiments of the present application, the application range of fire detection can be expanded and the false alarm rate of fires can be reduced.

Description

Fire detection method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of security technologies, and in particular, to a fire detection method, a fire detection device, an electronic device, and a storage medium.
Background
Fires pose a serious threat in production and living scenes such as construction sites, coal mines, petrochemical plants, water conservancy and hydropower facilities, forest fire prevention, warehouse logistics, and straw burning. Current fire detection methods fall into two types: the first detects fire based on physical characteristics of smoke or flame, using means such as thermal imaging, infrared imaging, polarization lenses, or smoke concentration; the second detects fire based on the characteristics of smoke or flame in a still image. The first type is limited in the distance at which it can recognize smoke or flame, is only suitable for special scenes, cannot detect smoke and flame at the same time, and therefore cannot be widely applied; the second type tends to misreport background information such as lights and clouds as fire.
Disclosure of Invention
An object of the embodiments of the present application is to provide a fire detection method, apparatus, electronic device, and storage medium, so as to expand the application range of fire detection and reduce the false alarm rate of fire. The specific technical scheme is as follows:
in a first aspect, embodiments of the present application provide a fire detection method, including:
acquiring a first image;
detecting the first image by adopting a target detection network to obtain a target object detected as smoke or flame;
when the target object is a foreground in the first image, determining the total frame number of alarm images in a continuous preset number of frame images before the first image, wherein the alarm images are images including the target object in the preset number of frame images;
and if the total frame number is greater than or equal to a preset frame number threshold, judging the structural similarity and the dynamic characteristics of the target object to obtain a fire detection result.
In some embodiments, the method further comprises:
determining foreground pixel points and background pixel points in the first image by using a Gaussian mixture model to obtain a foreground image;
counting the number of foreground pixel points included in a corresponding target area of the target object on the foreground image;
if the ratio of the counted number to the total number of the pixel points included in the target area is greater than a preset proportion threshold, determining that the target object is the foreground in the first image.
In some embodiments, the area occupied by the target object in the first image is a first area;
and judging the structural similarity of the target object, wherein the method comprises the following steps:
acquiring a second image, wherein an area with the same position as the first area in the second image is a second area;
calculating the structural similarity of the first area and the second area;
and if the structural similarity is within a preset similarity interval, determining that the target object is suspected smoke.
In some embodiments, the step of calculating the structural similarity of the first region and the second region comprises:
and determining the structural similarity of the first area and the second area according to the brightness average value, the brightness standard deviation and the brightness covariance of the first area and the brightness average value, the brightness standard deviation and the brightness covariance of the second area.
In some embodiments, performing dynamic characteristic determination on the target object includes:
intercepting the areas occupied by the target object in the plurality of alarm images to obtain a plurality of alarm target small images;
and adopting a dynamic classification network to judge the dynamic characteristics of the plurality of alarm target small images to obtain a classification result, wherein the classification result indicates whether the target object is suspected smoke or not.
In some embodiments, the step of intercepting the areas occupied by the target objects in the plurality of alert images to obtain a plurality of alert target small images includes:
selecting an alarm image from the plurality of alarm images in an equally spaced manner;
and intercepting the area occupied by the target object from the selected alarm image to obtain an alarm target small image.
In some embodiments, after obtaining the fire detection result, the method further comprises:
when the fire detection result indicates that the target object is flame, a thermal imaging technology or an infrared imaging technology is adopted to obtain a third image, wherein the area occupied by the target object in the third image is a third area;
and if the average brightness value of the third area is larger than a preset brightness threshold value, outputting alarm information indicating that the target object is flame.
In a second aspect, embodiments of the present application provide a fire detection apparatus, the apparatus comprising:
A first acquisition module: for acquiring a first image;
and a detection module: for detecting the first image using a target detection network to obtain a target object detected as smoke or flame;
a first determination module: for determining, when the target object is a foreground in the first image, the total frame number of alarm images among a preset number of consecutive frame images before the first image, wherein the alarm images are the images among the preset number of frame images that include the target object;
and a judging module: for performing structural similarity and dynamic characteristic determination on the target object to obtain a fire detection result if the total frame number is greater than or equal to a preset frame number threshold.
In some embodiments, the apparatus further comprises:
a second determination module: for determining foreground pixel points and background pixel points in the first image using a Gaussian mixture model to obtain a foreground image;
and a statistics module: for counting the number of foreground pixel points included in the corresponding target area of the target object on the foreground image;
and a third determination module: for determining that the target object is the foreground in the first image if the ratio of the counted number to the total number of pixel points included in the target area is greater than a preset proportion threshold.
In some embodiments, the area occupied by the target object in the first image is a first area;
the judging module comprises:
an acquisition unit: for acquiring a second image, wherein the region in the second image at the same position as the first region is the second region;
a calculation unit: for calculating the structural similarity of the first region and the second region;
a determination unit: for determining that the target object is suspected smoke if the structural similarity is within a preset similarity interval.
In some embodiments, the computing unit is specifically configured to:
and determining the structural similarity of the first area and the second area according to the brightness average value, the brightness standard deviation and the brightness covariance of the first area and the brightness average value, the brightness standard deviation and the brightness covariance of the second area.
In some embodiments, the determining module includes:
and an intercepting unit: for intercepting the areas occupied by the target object in a plurality of alarm images to obtain a plurality of alarm target small images;
a judging unit: for performing dynamic characteristic determination on the plurality of alarm target small images using a dynamic classification network to obtain a classification result, the classification result indicating whether the target object is suspected smoke.
In some embodiments, the intercepting unit is specifically configured to:
selecting an alarm image from the plurality of alarm images in an equally spaced manner;
and intercepting the area occupied by the target object from the selected alarm image to obtain an alarm target small image.
In some embodiments, the apparatus further comprises:
and a second acquisition module: for obtaining, after the fire detection result is obtained and when the fire detection result indicates that the target object is flame, a third image using thermal imaging or infrared imaging, wherein the area occupied by the target object in the third image is a third area;
and an output module: for outputting alarm information indicating that the target object is flame if the average brightness value of the third area is greater than a preset brightness threshold.
In a third aspect, an embodiment of the present application further provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
and the processor is used for realizing any fire detection method step when executing the program stored in the memory.
In a fourth aspect, embodiments of the present application further provide a computer readable storage medium having a computer program stored therein, which when executed by a processor, implements any of the above-described fire detection method steps.
In a fifth aspect, embodiments of the present invention also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the above-described fire detection method steps.
The beneficial effects of the embodiment of the application are that:
in the technical solution provided by the embodiments of the present application, the target detection network, combined with image characteristics of smoke and flame such as color and shape, screens candidate target objects that may be smoke or flame out of the image, achieving a high detection rate for smoke and flame and expanding the application range of the fire detection method. After the target objects are obtained, they are filtered in multiple stages using the foreground determination, the total frame number of collected alarm images, structural similarity, dynamic characteristics, and the like; target objects that are neither smoke nor flame are removed, so the target objects that are smoke or flame can be accurately determined, improving the accuracy of the fire detection result and reducing the false alarm rate of fires.
Of course, not all of the above-described advantages need be achieved simultaneously in practicing any one of the products or methods of the present application.
Drawings
To describe the technical solutions of the embodiments of the present application or of the prior art more clearly, the following briefly introduces the drawings required in the embodiments or in the description of the prior art. Obviously, the drawings described below are merely some embodiments of the present application, and those skilled in the art may obtain other drawings from them.
Fig. 1 is a schematic flow chart of a fire detection method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an object detection network according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an image frame sequence according to an embodiment of the present application;
FIG. 4 is a schematic diagram of recording tracking information according to an embodiment of the present application;
fig. 5 is a schematic flow chart of a foreground detection method according to an embodiment of the present application;
fig. 6 is a schematic flow chart of structural similarity determination provided in the embodiment of the present application;
FIG. 7 is a schematic flow chart of dynamic characteristic determination according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a dynamic classification network according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a second flow chart of a fire detection method according to an embodiment of the present disclosure;
FIG. 10 is a schematic diagram of a third flow chart of a fire detection method according to an embodiment of the present disclosure;
FIG. 11 is a schematic illustration of an acquired image frame provided in an embodiment of the present application;
fig. 12 is a schematic diagram of a target network detection result provided in an embodiment of the present application;
FIG. 13 is a schematic diagram of a foreground detection result provided in an embodiment of the present application;
FIG. 14 is a schematic diagram of tracking and buffering results provided in an embodiment of the present application;
FIG. 15 is a schematic view of a fire detection device according to an embodiment of the present disclosure;
fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein fall within the scope of this disclosure.
Fires pose a serious threat in production and living scenes such as construction sites, coal mines, petrochemical plants, water conservancy and hydropower facilities, forest fire prevention, warehouse logistics, and straw burning. Current fire detection methods fall into two types: the first detects fire based on physical characteristics of smoke or flame, using means such as thermal imaging, infrared imaging, polarization lenses, or smoke concentration; the second detects fire based on the characteristics of smoke or flame in a still image. The first type is limited in the distance at which it can recognize smoke or flame, is only suitable for special scenes, cannot detect smoke and flame at the same time, and therefore cannot be widely applied; the second type tends to misreport background information such as lights and clouds as fire.
In order to solve the above technical problems, the embodiments of the present application provide a fire detection method, which may be applied to an electronic device, where the electronic device may be a mobile phone, a tablet, a computer, a server, a cluster, or other devices, and this is not limited. For ease of understanding, the following description uses the electronic device as an execution body, and is not intended to be limiting.
In the fire detection method provided by the embodiments of the present application, the target detection network is a deep-learning neural network, such as a yolov5s network. Based on the characteristics of smoke and flame, the electronic device can use deep learning to fit the features of smoke and flame in the image and give a pre-screening result, i.e., target objects that may be smoke or flame. Here, a low threshold may be set so that smoke targets and flame targets have an extremely high detection rate. In addition, the target detection network can be quantized and the quantized network used for target detection, which improves inference speed and allows the target detection network to reach real-time inference speed on a 1T edge-side platform.
After obtaining the pre-screening result, the electronic device combines the moving foreground regions in multiple frames, dynamic characteristics such as shape and color changes of the same tracked target object, and physical characteristics of flame such as brightness information in the infrared or thermal imaging domain, and applies multi-stage filtering to the pre-screening result. This approach of detecting first with an extremely high detection rate and then filtering in multiple stages allows the fire detection method provided by the embodiments of the present application to complete detection and alarm of smoke and flame within a few seconds, greatly shortening the fire alarm time, reducing the intensity of manual monitoring work, improving the efficiency and quality of fire safety supervision, and reducing hidden fire hazards.
By combining a neural network with the dynamic characteristics of smoke and flame and the physical characteristics of flame, the fire detection method provided by the embodiments of the present application is applicable to various scenes such as construction sites, coal mines, petrochemical plants, water conservancy and hydropower facilities, forest fire prevention, warehouse logistics, and straw burning, greatly expanding the applicable scenes, improving the detection rate of smoke and flame, and reducing the false alarm rate of smoke and fire.
In addition, for targets that are easily misreported as smoke or flame, such as lights, objects of similar color, clouds, and mountain fog, the fire detection method provided by the embodiments of the present application ensures high detection of smoke and fire targets through an extremely low threshold, and then filters out non-smoke and non-flame targets through multiple filtering means. Because only a few individual thresholds need to be set during this multi-stage filtering, the method is simple and convenient to operate and has low operation and maintenance costs.
The fire detection method provided in the embodiment of the present application will be described in detail by way of specific examples.
Referring to fig. 1, fig. 1 is a schematic flow chart of a fire detection method according to an embodiment of the present application, where the method includes the following steps:
step S11: a first image is acquired.
Step S12: and detecting the first image by adopting a target detection network to obtain a target object detected as smoke or flame.
Step S13: when the target object is a foreground in the first image, determining the total frame number of alarm images among a preset number of consecutive frame images before the first image, wherein the alarm images are the images among the preset number of frame images that include the target object.
Step S14: if the total frame number is greater than or equal to the preset frame number threshold, judging the structural similarity and the dynamic characteristics of the target object to obtain a fire detection result.
In the technical solution provided by the embodiments of the present application, the target detection network, combined with image characteristics of smoke and flame such as color and shape, screens candidate target objects that may be smoke or flame out of the image, achieving a high detection rate for smoke and flame and expanding the application range of the fire detection method. After the target objects are obtained, they are filtered in multiple stages using the foreground determination, the total frame number of collected alarm images, structural similarity, dynamic characteristics, and the like; target objects that are neither smoke nor flame are removed, so the target objects that are smoke or flame can be accurately determined, improving the accuracy of the fire detection result and reducing the false alarm rate of fires.
In the step S11, the first image may be one frame image currently acquired by the camera, or may be any frame image in a video acquired in advance by the camera, which is not limited thereto. Taking the first image as an example of a frame of image currently acquired by the camera. After the camera acquires a frame of image, the frame of image is transmitted to the electronic equipment. The electronic device receives the frame image (i.e., the first image) sent by the camera.
In step S12, the target detection network may be any neural network, such as the yolov5s network. In one example, the structure of the target detection network may be as shown in fig. 2, where the network layers are as follows: CBL layer, csp1_x layer, csp2_x layer, SPP layer, transposed convolution layer (ConvTranspose2d layer), feature fusion layer (Concat layer), convolution layer (Conv layer), and output layer.
CBL layer: consists of a convolution layer (Conv layer), a batch normalization layer (BN layer), and a LeakyReLU activation function (a minimal sketch appears after these layer descriptions).
Csp1_x layer: following the CSPNet (Cross Stage Partial Network) structure, this network layer consists of CBL layers, x Resunit layers (residual units), a Conv layer, a Concat layer, a BN layer, and a LeakyReLU activation function. The character x in the csp1_x layer indicates the number of Resunit layers it includes. As shown in fig. 2, x = 1 or 3: when x = 1, the csp1_x layer is the csp1_1 layer, which includes 1 Resunit layer; when x = 3, the csp1_x layer is the csp1_3 layer, which includes 3 Resunit layers.
Resunit layer: borrowing the residual structure of the Resnet network, it consists of CBL layers and a feature fusion layer (ADD layer), allowing the target detection network to be built deeper.
ADD layer: combines two feature maps of the same size by element-wise addition.
Csp2_x layer: following the CSPNet structure, this network layer consists of x CBL layers, a Conv layer, a Concat layer, a BN layer, and a LeakyReLU activation function. The character x in the csp2_x layer indicates the number of CBL layers it includes in addition to the first CBL layer. As shown in fig. 2, when x = 1, the csp2_x layer is the csp2_1 layer, which includes 1 CBL layer in addition to the first CBL layer.
SPP layer: consists of a plurality of max pooling layers (Maxpool layers) and Concat layers.
Concat layer: two feature maps of the same size fuse features in a channel merge fashion.
Output layer: a first output layer of the network (output 1), a second output layer of the network (output 2), and a third output layer of the network (output 3).
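For readers who prefer code, the CBL block described above maps directly onto a few standard PyTorch layers. The following is a minimal sketch under stated assumptions, not the patent's implementation; the kernel size, stride, and LeakyReLU slope are assumed values:

```python
import torch.nn as nn

class CBL(nn.Module):
    """Conv + BatchNorm + LeakyReLU, the basic block of the network above."""
    def __init__(self, c_in, c_out, k=3, s=1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.LeakyReLU(0.1, inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))
```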
In the embodiment of the present application, the structure of the target detection network may also be implemented in other forms, which is not limited. In order to improve the reasoning speed, the target detection network can be a quantized neural network, and the quantized target detection network is adopted, so that the physical storage space occupied by a model generated by training is smaller, and the storage pressure of the electronic equipment is reduced.
After the electronic device acquires the first image, it inputs the first image into the target detection network; after processing the first image, the network outputs the target objects detected in it. A target object may be smoke or flame, and there may be one or more target objects. When the target detection network outputs a target object, it can also output the corresponding target frame information, which represents the target object. The target frame information may include the coordinates of the target frame, its width and height, and the confidence that the target object it represents is smoke or flame. Here, a low confidence threshold may be set, so that the electronic device treats most target objects whose confidence is above the threshold as candidate smoke or flame, ensuring a high detection rate of smoke and flame.
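As an illustration of this pre-screening step, the sketch below keeps every smoke or flame candidate above a deliberately low confidence bar. The detection-output format and the 0.1 threshold are assumptions for the example, not values given by the patent:

```python
CONF_THRESHOLD = 0.1  # deliberately low so that detection rate stays very high

def prescreen(detections):
    """Keep every smoke/flame candidate above the low confidence bar.

    `detections` is assumed to be a list of dicts like
    {"box": (x, y, w, h), "conf": 0.37, "cls": "smoke"}.
    """
    return [d for d in detections
            if d["conf"] >= CONF_THRESHOLD and d["cls"] in ("smoke", "flame")]
```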
In this embodiment, let (t_x, t_y) be the offset coordinates of the target frame predicted by the target detection network relative to the anchor point, (t_w, t_h) the predicted width and height, (p_w, p_h) the width and height of the anchor frame, and (c_x, c_y) the coordinates of the anchor frame. The electronic device can calculate the coordinates (b_x, b_y) of the target frame by formulas (1) and (2):

b_x = σ(t_x) + c_x (1)

b_y = σ(t_y) + c_y (2)

where σ(·) is the activation function (sigmoid) of the neural network.

The electronic device can calculate the width and height (b_w, b_h) of the target frame by formulas (3) and (4):

b_w = p_w * e^(t_w) (3)

b_h = p_h * e^(t_h) (4)

where (p_w, p_h) are the width and height of the anchor frame and (t_w, t_h) are the predicted width and height.
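A small sketch of this decoding, assuming the standard YOLO-style form reconstructed above (sigmoid for the center offsets, exponential scaling for the width and height):

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def decode_box(tx, ty, tw, th, cx, cy, pw, ph):
    """Decode one predicted box per formulas (1)-(4)."""
    bx = sigmoid(tx) + cx        # formula (1)
    by = sigmoid(ty) + cy        # formula (2)
    bw = pw * math.exp(tw)       # formula (3)
    bh = ph * math.exp(th)       # formula (4)
    return bx, by, bw, bh
```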
In step S13, an alarm image is an image that includes the target object. The electronic device detects whether the target object is a foreground in the first image; when it is, the electronic device searches for alarm images among a preset number of consecutive frame images before the first image and counts the total frame number of those alarm images. The preset number can be set according to actual requirements, for example 20, 50, or 100. In this embodiment, the first image is the most recently acquired frame of alarm image.
In this embodiment, if the total number of frames before the first image has not reached the preset number, the electronic device determines the total frame number of alarm images among all the images before the first image. For example, in the image frame sequence shown in fig. 3, the current first image is image 0, and there are 20 frames before image 0, namely image -20 to image -1. If the preset number is 15, there are more than 15 frames before image 0, so the electronic device determines the total frame number of alarm images among the 15 frames up to image 0 (namely image -14 to image 0); if the preset number is 30, there are fewer than 30 frames before image 0 (20 < 30), so the electronic device determines the total frame number of alarm images among all the images up to image 0 (namely image -20 to image 0).
In this embodiment, to facilitate counting the total frame number of alarm images and improve fire detection efficiency, the electronic device may assign a tracking ID (identity) to the target object after determining that it is a foreground, and assign the same tracking ID when other images including the target object are subsequently acquired. Whether the target objects in two images are the same target object can be determined through the IOU (Intersection over Union).
For example, suppose target frame A and target frame B are detected in two consecutive frames, where (x_a1, y_a1) and (x_a2, y_a2) are the upper-left and lower-right corner coordinates of target frame A, and (x_b1, y_b1) and (x_b2, y_b2) are the upper-left and lower-right corner coordinates of target frame B. The upper-left corner coordinates (x_1, y_1) of the intersection of target frame A and target frame B are obtained by formulas (5) and (6), and the lower-right corner coordinates (x_2, y_2) by formulas (7) and (8):

x_1 = max(x_a1, x_b1) (5)

y_1 = max(y_a1, y_b1) (6)

x_2 = min(x_a2, x_b2) (7)

y_2 = min(y_a2, y_b2) (8)

where the max() function returns the larger of its arguments and the min() function returns the smaller. In image coordinates the y axis points downward, so the intersection's upper edge is the larger of the two upper-edge ordinates and its lower edge is the smaller of the two lower-edge ordinates.

The area of the intersection of target frame A and target frame B is then obtained by formula (9):

area_in = max(x_2 - x_1, 0) * max(y_2 - y_1, 0) (9)

The area of target frame A is obtained by formula (10):

area_a = (x_a2 - x_a1) * (y_a2 - y_a1) (10)

The area of target frame B is obtained by formula (11):

area_b = (x_b2 - x_b1) * (y_b2 - y_b1) (11)

The intersection-over-union of target frame A and target frame B is obtained by formula (12):

iou_a,b = area_in / (area_a + area_b - area_in + 0.00001) (12)

where the small constant 0.00001 guards against division by zero.

When iou_a,b is greater than a preset intersection-over-union threshold thr_iou, target frame A and target frame B are determined to represent the same target object and are assigned the same tracking ID; otherwise, they are determined to represent different target objects and are assigned different tracking IDs.

In this embodiment, the upper-left and lower-right corner coordinates of target frame A and target frame B can be calculated from the coordinates and the width and height of each target frame output by the target detection network.
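Formulas (5) to (12) amount to the usual intersection-over-union computation; a compact sketch, assuming boxes already converted to corner coordinates:

```python
def iou(box_a, box_b):
    """IOU of two boxes given as (x_left, y_top, x_right, y_bottom)."""
    x1 = max(box_a[0], box_b[0])                  # formula (5)
    y1 = max(box_a[1], box_b[1])                  # formula (6)
    x2 = min(box_a[2], box_b[2])                  # formula (7)
    y2 = min(box_a[3], box_b[3])                  # formula (8)
    area_in = max(x2 - x1, 0) * max(y2 - y1, 0)   # formula (9)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])  # formula (10)
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])  # formula (11)
    return area_in / (area_a + area_b - area_in + 0.00001)  # formula (12)
```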
For a target object, the electronic device continuously records its tracking information in the form shown in fig. 4. In fig. 4, Flg0 to Flgn are alarm flags and Info0 to Infon are tracking information entries; the tracking information may include the tracking ID, a screenshot of the area occupied by the target object (i.e., of the area occupied by the target frame), and so on. The alarm flag indicates whether the target object is included in the image: if a frame includes the target object, its alarm flag is marked 1, otherwise 0. Thus, for a target object, the electronic device can count the images whose alarm flag is 1 under its tracking ID, thereby obtaining the total frame number of alarm images.
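A sketch of such a per-target record, with assumed field names mirroring fig. 4 (Flg0-Flgn as a rolling flag window, Info0-Infon as cached crops); the window length of 100 is an arbitrary example:

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class TrackRecord:
    track_id: int
    flags: deque = field(default_factory=lambda: deque(maxlen=100))  # Flg_i
    crops: deque = field(default_factory=lambda: deque(maxlen=100))  # Info_i screenshots

    def update(self, seen: bool, crop=None):
        """Append this frame's alarm flag; cache the crop when the target is seen."""
        self.flags.append(1 if seen else 0)
        if seen and crop is not None:
            self.crops.append(crop)

    def alarm_frames(self) -> int:
        """Total frame number of alarm images for this tracking ID."""
        return sum(self.flags)
```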
In the step S14, the preset frame number threshold may be set according to the actual requirement. For example, the preset frame number threshold may be a fixed value, such as 10, 20, or 30. The preset frame number threshold value can also be determined according to the preset ratio, the preset ratio can be 70%, 80%, 90%, 95% and the like, and the mode of determining the preset frame number threshold value can be suitable for different scenes without frequently adjusting the preset frame number threshold value by a user, so that maintenance cost is reduced.
After the electronic equipment counts the total frame number of the alarm image, the electronic equipment judges whether the total frame number of the alarm image is larger than a preset frame number threshold value or not; if the total frame number of the alarm image is smaller than the preset frame number threshold value, the target object is an interference target, not smoke or flame, the target object is filtered, and the fire detection operation of the target object is finished; if the total frame number of the alarm image is greater than or equal to the preset frame number threshold, the target object is likely to be smoke or flame, and the electronic equipment judges the structural similarity and the dynamic characteristics of the target object so as to further filter the target object, so that whether the target object is smoke or flame or not is accurately determined, and a fire detection result is obtained. In this embodiment of the present application, the fire detection result may include a type of the target object, information of a target frame corresponding to the target object, a confidence level of the target frame corresponding to the target object, and the like.
For example, suppose the preset number is n and the preset ratio is thr_alarm, so the preset frame number threshold is n * thr_alarm. For a target object, the electronic device counts the total frame number of alarm images and then calculates the alarm ratio according to formula (13):

S_alarm = (Σ_{i=1}^{n} Flg_i) / n (13)

where S_alarm is the alarm ratio, the numerator is the total frame number of alarm images, Σ is the summation symbol, n is the preset number, and Flg_i is the alarm flag of image i: when the flag is 1, image i includes the target object and adds 1 to the total; when it is 0, image i does not include the target object and the total is unchanged.

When S_alarm ≥ thr_alarm, the electronic device determines that the total frame number of alarm images corresponding to the target object is greater than or equal to the preset frame number threshold, and performs the structural similarity and dynamic characteristic determinations to filter the target object further. When S_alarm < thr_alarm, the electronic device determines that the total frame number is smaller than the preset frame number threshold, and the fire detection operation for this target object ends.
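As a worked example of formula (13), the ratio can be computed directly over the flag window (the `flags` deque from the earlier TrackRecord sketch could be passed in); `thr_alarm = 0.8` is an assumed value, not one fixed by the patent:

```python
def alarm_ratio(flags):
    """Formula (13): S_alarm = (sum of Flg_i) / n over the last n flags."""
    return sum(flags) / len(flags) if flags else 0.0

def passes_frame_filter(flags, thr_alarm=0.8):
    """Proceed to the similarity/dynamic checks only when S_alarm >= thr_alarm."""
    return alarm_ratio(flags) >= thr_alarm
```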
When the fire detection result obtained in step S14 indicates smoke or flame, the electronic device outputs alarm information indicating that the target object is smoke or flame, where the alarm information may include the coordinates of the target frame corresponding to the target object, the type of the target object (such as smoke or flame), the confidence of the target object output by the target detection network, and so on.
Corresponding to the fire detection method described above, the embodiment of the present application further provides a foreground detection method, see fig. 5, which may include the following steps.
Step S51: and determining a foreground pixel point and a background pixel point in the first image by using the Gaussian mixture model to obtain a foreground image.
In this embodiment, the electronic device may perform background modeling on images using a Gaussian mixture model, characterizing the features of each pixel point with the several Gaussian components included in the model. After obtaining a new frame of image (such as the first image), each pixel point in the first image is matched against the Gaussian mixture model: if the match succeeds, the pixel point is determined to be a background pixel point; otherwise, it is determined to be a foreground pixel point. The pixel values of the determined foreground pixel points are set to a first pixel value and those of the background pixel points to a second pixel value, yielding a foreground image in which the foreground and background of the first image are distinguished. The first pixel value and the second pixel value are set to different values, e.g., 1 and 0, or 0 and 1.
In the embodiment of the application, the method for updating the mixed Gaussian model is as follows:
At a given moment t, for unmatched pixel points, i.e., those judged to be foreground pixel points, the mean and covariance of the Gaussian model are left unchanged; for matched pixel points, i.e., those judged to be background pixel points, the mean and covariance matrix of the Gaussian model are updated according to formulas (14), (15), and (16):

ρ = α / w_i,t (14)

μ_i,t = (1 - ρ) * μ_i,t-1 + ρ * X_t (15)

Σ_i,t = (1 - ρ) * Σ_i,t-1 + ρ * diag[(X_t - μ_i,t)^T (X_t - μ_i,t)] (16)

where w_i,t is the learned weight of Gaussian model i at moment t, α is the preset learning rate, ρ is the ratio of the learning rate to the learned weight, μ_i,t and μ_i,t-1 are the means of Gaussian model i at moments t and t-1, X_t is the value of the matched pixel point at moment t, Σ_i,t and Σ_i,t-1 are the covariance matrices of Gaussian model i at moments t and t-1, and diag[] constructs a diagonal matrix.
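A single update step per formulas (14)-(16), for one matched Gaussian of one pixel; since diag[(X_t - μ_i,t)^T (X_t - μ_i,t)] keeps only the per-channel squared deviations, the sketch tracks a diagonal covariance. The learning rate value here is an assumption:

```python
import numpy as np

def update_matched_gaussian(mean, cov_diag, x, w, alpha=0.005):
    """One update of a matched (background) Gaussian.

    mean, cov_diag: current parameters of Gaussian i (per-channel arrays);
    x: pixel value at moment t; w: learned weight w_i,t; alpha: learning rate.
    """
    rho = alpha / w                                      # formula (14)
    new_mean = (1 - rho) * mean + rho * x                # formula (15)
    diff = x - new_mean
    new_cov = (1 - rho) * cov_diag + rho * diff * diff   # diagonal of formula (16)
    return new_mean, new_cov
```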
Step S52: and counting the number of foreground pixel points included in a corresponding target area of the target object on the foreground image.
In the embodiment of the application, the electronic device determines the area occupied by the target object on the foreground image, namely the target area, and counts the number of foreground pixel points included in the target area. For example, in the embodiment shown in the above step S51, if the value of the foreground pixel is set to 1 and the value of the background pixel is set to 0, the number of pixels with the pixel value of 1 in the target area may be counted to determine the number of foreground pixels in the target area. The number of foreground pixels in the target area can be understood as the area occupied by the foreground pixels in the target area.
Step S53: if the ratio of the counted number to the total number of the pixel points included in the target area is greater than a preset proportion threshold value, determining that the target object is a foreground in the first image.
In this embodiment, after counting the number of foreground pixel points in the target area, the electronic device compares that number with the total number of pixel points included in the target area. If the ratio of the number of foreground pixel points to the total number of pixel points in the target area is greater than the preset proportion threshold, the electronic device determines that the target object is a foreground in the first image, and then executes step S13, step S14, and so on. If the ratio is less than or equal to the preset proportion threshold, the electronic device determines that the target object is not a foreground in the first image, and the subsequent steps are not performed for this target object. The preset proportion threshold, also called the foreground threshold and denoted thr_fg, may be any value configured in advance.
In this embodiment, the electronic device may calculate the ratio of the number of foreground pixel points in the target area to the total number of pixel points in the target area by formula (17):

a = (Σ_i X_i) / area (17)

where a is the foreground proportion, i.e., the ratio of the number of foreground pixel points in the target area to the total number of pixel points it includes; area is the area occupied by the target area, i.e., its total number of pixel points; and X_i is the value of the i-th pixel point in the foreground image (1 for a foreground pixel point, 0 for a background pixel point).
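In practice, steps S51-S53 can be prototyped with OpenCV's built-in mixture-of-Gaussians background subtractor instead of a hand-rolled model; the following sketch applies formula (17) to the target's box, with thr_fg = 0.3 as an assumed foreground threshold:

```python
import cv2
import numpy as np

subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

def foreground_mask(frame):
    """Call once per frame; returns a mask where 255 marks foreground pixels."""
    return subtractor.apply(frame)

def is_foreground_target(mask, box, thr_fg=0.3):
    """box = (x, y, w, h) of the target area in pixel coordinates."""
    x, y, w, h = box
    region = mask[y:y + h, x:x + w]
    ratio = np.count_nonzero(region) / float(region.size)  # formula (17)
    return ratio > thr_fg
```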
In the technical scheme provided by the embodiment of the application, the electronic equipment uses the Gaussian mixture model to extract the moving target according to the information of the multi-frame images, performs foreground calculation on the target object, and filters the target object which does not belong to the foreground, namely filters the target object which does not belong to smoke or flame, so that the false alarm rate of fire is reduced.
In this embodiment of the present application, the electronic device may further detect a moving object (i.e. a foreground) in the image by adopting other manners, so as to filter a background object, which is not limited.
In some embodiments, the electronic device may use the structural similarity determining procedure shown in fig. 6 to determine structural similarity of the target object, and may specifically include the following steps.
Step S61: and acquiring a second image, wherein the region in the second image, which is the same as the first region in position, is the second region.
The area occupied by the target object in the first image is the first area. When the electronic device detects that the total frame number of the alarm images is greater than or equal to the preset frame number threshold, it acquires the next frame image after the current first image, namely the second image. In the second image, the region at the same position as the first region serves as the second region, i.e., the region occupied by the target object in the second image.
Step S62: and calculating the structural similarity of the first area and the second area.
In this embodiment of the present application, the electronic device may directly extract feature information in the first area and the second area from the first image and the second image, so as to calculate structural similarity of the first area and the second area.
The electronic device may also crop the first region and the second region to obtain two alarm target small images, such as the screenshots of the region occupied by the target object in Info1-Infon in fig. 4. The electronic device extracts feature information from the two alarm target small images and then calculates their structural similarity, which serves as the structural similarity of the first region and the second region. This facilitates the extraction of feature information and improves the accuracy of the structural similarity calculation.
In some embodiments, the electronic device may determine the structural similarity of the first region and the second region by the luminance average, the luminance standard deviation, and the luminance covariance of the first region, and the luminance average, the luminance standard deviation, and the luminance covariance of the second region.
For example, denote the first region as x and the second region as y, with the luminance value of the i-th pixel point denoted x_i in the first region and y_i in the second region. The first region and the second region have the same size, and each includes N pixel points.
The electronic device may calculate the luminance average of the first region and of the second region by formulas (18) and (19):

μ_x = (1/N) Σ_{i=1}^{N} x_i (18)

μ_y = (1/N) Σ_{i=1}^{N} y_i (19)

where μ_x is the luminance average of the first region, μ_y is the luminance average of the second region, N is the total number of pixel points in the first (or second) region, x_i is the luminance value of the i-th pixel point in the first region, y_i is the luminance value of the i-th pixel point in the second region, and Σ is the summation symbol.

The electronic device may calculate the luminance standard deviation of the first region and of the second region by formulas (20) and (21):

σ_x = sqrt( (1/N) Σ_{i=1}^{N} (x_i - μ_x)^2 ) (20)

σ_y = sqrt( (1/N) Σ_{i=1}^{N} (y_i - μ_y)^2 ) (21)

where σ_x and σ_y are the luminance standard deviations of the first region and the second region.

The electronic device may calculate the luminance covariance of the first region and the second region by formula (22):

σ_xy = (1/N) Σ_{i=1}^{N} (x_i - μ_x) * (y_i - μ_y) (22)

where σ_xy is the luminance covariance of the first region and the second region.

The structural similarity of the first region and the second region can then be calculated by formula (23):

SSIM_x,y = ((2 * μ_x * μ_y + C1) * (2 * σ_xy + C2)) / ((μ_x^2 + μ_y^2 + C1) * (σ_x^2 + σ_y^2 + C2)) (23)

where SSIM_x,y is the structural similarity of the first region and the second region, and C1 and C2 are constants set to 5 and 59 respectively based on statistical analysis.
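A direct transcription of formulas (18)-(23) for two equally sized grayscale crops, using the C1 = 5 and C2 = 59 constants from the text:

```python
import numpy as np

def ssim(x, y, c1=5.0, c2=59.0):
    """Structural similarity of two equally sized grayscale regions."""
    x = x.astype(np.float64).ravel()
    y = y.astype(np.float64).ravel()
    mu_x, mu_y = x.mean(), y.mean()                  # formulas (18), (19)
    sigma_x, sigma_y = x.std(), y.std()              # formulas (20), (21)
    sigma_xy = ((x - mu_x) * (y - mu_y)).mean()      # formula (22)
    return ((2 * mu_x * mu_y + c1) * (2 * sigma_xy + c2) /
            ((mu_x**2 + mu_y**2 + c1) * (sigma_x**2 + sigma_y**2 + c2)))  # (23)
```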
Step S63: if the structural similarity is within the preset similarity interval, determining that the target object is suspected smoke.
In this embodiment of the present application, the preset similarity interval may be a similarity interval formed by an upper threshold and a lower threshold of a structural similarity preset by the electronic device, for example, the upper threshold is set asthr ssim_u The lower threshold isthr ssim_d The structural similarity interval can be expressed as [ thr ] ssim_d thr ssim_u ]. When the structural similarity of the first area and the second area is greater than or equal to the lower threshold and less than or equal to the upper threshold, the coincidence structural similarity is positioned at the preset similarity In the range, the electronic device can determine that the target object is suspected smoke, i.e. the target object is suspected smoke or suspected flame; and for the target object with the structural similarity smaller than the lower threshold or the structural similarity larger than the upper threshold, judging that the target object is non-pyrotechnic, and exiting detection.
In order to improve the accuracy of fire detection, the electronic device may set different similarity intervals for different types of target objects. For example, when the target detection network outputs that the target object is flame, the upper and lower thresholds may be one set of values, and the target object may be determined to be suspected flame when the structural similarity of its first and second regions falls within the corresponding preset similarity interval; when the target detection network outputs that the target object is smoke, the upper and lower thresholds may be another set of values, and the target object may be determined to be suspected smoke when the structural similarity of its first and second regions falls within the corresponding preset similarity interval. This allows flame and smoke targets to be filtered more accurately, expands the application range of the fire detection method, and reduces the false alarm rate of fire.
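As an illustration of this per-class interval check, the sketch below assumes hypothetical interval values; the patent leaves the actual thresholds thr_ssim_d and thr_ssim_u to configuration.

```python
# Hypothetical per-class similarity intervals [thr_ssim_d, thr_ssim_u];
# the actual values are configured per deployment.
SSIM_INTERVALS = {"flame": (0.2, 0.8), "smoke": (0.3, 0.9)}

def is_suspected_pyrotechnic(ssim_value, target_class):
    """Keep a target whose structural similarity falls inside the preset
    interval for its class; otherwise it is judged non-pyrotechnic."""
    thr_d, thr_u = SSIM_INTERVALS[target_class]
    return thr_d <= ssim_value <= thr_u
```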
In the technical scheme provided by the embodiment of the application, after the electronic device calculates the structural similarity of the first region and the second region, non-pyrotechnic targets are filtered out through the preset similarity interval, reducing the false alarm rate of fire detection.
In some embodiments, the electronic device may use the dynamic characteristic determining flow shown in fig. 7 to determine the dynamic characteristic of the target object, and may specifically include the following steps.
Step S71: and intercepting the areas occupied by the target objects in the alarm images to obtain a plurality of alarm target small images.
In the embodiment of the application, the electronic device may intercept and record in advance the area occupied by the target object in each alarm image, such as the areas included in Info1-Info in fig. 4, which saves storage space on the electronic device. To facilitate image analysis and simplify device operation, the electronic device may instead store the complete alarm images and intercept the area occupied by the target object only when dynamic characteristic determination is required; this is not limited here.
In this embodiment of the present application, the electronic device may obtain a plurality of alarm target small images in any one of the following manners:
Mode one: and randomly selecting a plurality of alarm images from the alarm images, and intercepting the area occupied by the target object from the randomly selected alarm images to obtain a plurality of alarm target small images.
Mode two: and selecting alarm images from the plurality of alarm images in an equidistant mode, and then intercepting the area occupied by the target object from the selected alarm images to obtain a plurality of alarm target small images.
Mode three: taking the screenshots of the first area and the second area as alarm target small images for subsequent dynamic characteristic judgment.
In the embodiment of the application, take the case of intercepting and recording alarm target small images from a plurality of alarm images in advance as an example. The electronic device collects the alarm target small images corresponding to all alarm images in the buffer queue and buffers them into a list, list_alarm. Let the index set of the list be A and the number of alarm target small images in index set A be num_A. The electronic device may obtain a plurality of alarm target small images for subsequent dynamic characteristic judgment according to the following two rules:
rule 1: selection listlist alarm An alarm target small image corresponding to the alarm image acquired earliest and an alarm target small image corresponding to the alarm image acquired latest;
Rule 2: when the number k of alarm target small images for subsequent dynamic characteristic judgment satisfies k > 2, in addition to the two alarm target small images selected according to rule 1, index values are calculated according to the following formula (24):
idx_i = i · (num_A − 2) / (k − 1),  i ∈ {1, 2, …, k−2}    (24)
wherein idx_i is an index value; k is the number of alarm images to be selected; num_A is the number of alarm target small images in index set A; and i denotes the i-th of the k−2 index values to be calculated.
The k−2 index values obtained from the calculation form index set B. Using the index values in index set B, the corresponding k−2 alarm target small images are looked up in list_alarm and, together with the two alarm target small images selected according to rule 1, yield k alarm target small images for subsequent dynamic characteristic judgment. These k alarm target small images can form the input set of the dynamic classification network, list_cls.
The number of selected alarm images (i.e., the number of alarm target small images) may be a preconfigured number of alarm images to be selected.
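The following is a minimal sketch of rules 1 and 2, assuming list_alarm is ordered from earliest to latest acquisition and that fractional index values from formula (24) are truncated to integers (the patent does not state the rounding):

```python
def select_alarm_thumbnails(list_alarm, k):
    """Select k alarm target small images per rule 1 (earliest + latest)
    and rule 2 (formula (24) indices for the remaining k-2)."""
    num_a = len(list_alarm)
    if k <= 2 or num_a <= 2:
        # rule 1 only: earliest and latest alarm target small images
        return [list_alarm[0], list_alarm[-1]][:k]
    # rule 2: idx_i = i * (num_A - 2) / (k - 1), i in {1, ..., k-2}
    idxs = [i * (num_a - 2) // (k - 1) for i in range(1, k - 1)]
    middle = [list_alarm[idx] for idx in idxs]
    return [list_alarm[0]] + middle + [list_alarm[-1]]
```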
Step S72: and adopting a dynamic classification network to judge dynamic characteristics of the plurality of alarm target small images to obtain a classification result, wherein the classification result indicates whether the target object is suspected smoke or not.
In the embodiment of the present application, the dynamic classification network may be any neural network, such as the yolov5s network described above. In one example, the structure of the dynamic classification network can be seen in fig. 8; the network layers in fig. 8 are as follows: input layer, CBL layer, CBL_3 layer, convolutional layer (Conv layer), RES layer, feature fusion layer (Concat layer), Pad layer, average pooling layer (Avgpool layer), dimension transformation layer (Reshape layer), linear layer, i.e., fully connected layer (Linear), and output layer. A sketch of the main building blocks is given after the layer descriptions below.
CBL layer: consists of a convolution layer (Conv layer), a batch normalization layer (batch normalization, BN), and a LeakyReLU activation function. The LeakyReLU activation function is similar to ReLU, except that its negative slope is not 0.

CBL_3 layer: consists of 3 stacked CBL layers.

RES layer: consists of a CBL layer, a convolution layer (Conv layer), a batch normalization layer (batch normalization, BN), and a feature fusion layer (ADD layer). It borrows the residual structure of the ResNet network, allowing a deeper network to be constructed.

Pad layer: consists of a convolution layer (Conv layer), a batch normalization layer (batch normalization, BN), and a ReLU activation function.

ADD layer: a feature fusion layer that fuses two feature maps of the same size by element-wise addition at corresponding positions.
Input layer: the first input layer of the network (input 1) and the second input layer of the network (input 2).
Output layer: the first output layer (output 1) of the network.
Concat layer: a feature fusion layer that fuses two feature maps of the same size by channel concatenation.
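To make the layer descriptions concrete, here is a minimal PyTorch sketch of the CBL, Pad, and RES building blocks described above; kernel sizes, channel widths, and the LeakyReLU negative slope are assumptions, since the text does not specify them.

```python
import torch
import torch.nn as nn

class CBL(nn.Module):
    """Conv + BN + LeakyReLU, per the CBL layer description."""
    def __init__(self, c_in, c_out, k=3, s=1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.LeakyReLU(0.1)  # negative slope is not 0

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

class Pad(nn.Module):
    """Conv + BN + ReLU, per the Pad layer description."""
    def __init__(self, c_in, c_out, k=3, s=1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

class RES(nn.Module):
    """CBL + Conv + BN, fused with the input by element-wise addition
    (the ADD-layer fusion, borrowing the ResNet residual structure)."""
    def __init__(self, c):
        super().__init__()
        self.cbl = CBL(c, c)
        self.conv = nn.Conv2d(c, c, 3, 1, 1, bias=False)
        self.bn = nn.BatchNorm2d(c)

    def forward(self, x):
        return x + self.bn(self.conv(self.cbl(x)))  # ADD fusion

# quick shape check
if __name__ == "__main__":
    x = torch.randn(1, 16, 64, 64)
    print(RES(16)(x).shape)  # torch.Size([1, 16, 64, 64])
```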
The dynamic characteristics of smoke and flame include morphological changes, brightness changes, color changes, and the like. After the electronic device obtains a plurality of alarm target small images in step S71, it inputs them into the dynamic classification network, which processes them to extract dynamic characteristics of the target object such as morphological change, brightness change, and color change. If the target detection network output the target object as flame and these dynamic characteristics match those of flame, the dynamic classification network outputs the target object as flame; if the target detection network output the target object as smoke and these dynamic characteristics match those of smoke, the dynamic classification network outputs the target object as smoke.
To save cost, the smoke target and the flame target may share the same dynamic classification network; to improve the accuracy of fire detection, different dynamic classification networks may instead be used for smoke targets and flame targets. This is not limited here.
In this embodiment of the present application, the electronic device may also perform dynamic characteristic determination on the target object in other manners, which is not limited here.
In the technical scheme provided by the embodiment of the application, the electronic device intercepts the areas occupied by the target objects in the alarm images to obtain alarm target small images, performs dynamic characteristic judgment on them to obtain a classification result, and filters out target objects whose classification result is non-pyrotechnic, reducing the false alarm rate of fire.
In some embodiments, the electronic device is associated with an infrared thermal imaging device. In order to further improve the accuracy of flame detection, the embodiment of the present application further provides a fire detection method, as shown in fig. 9, which may include steps S91 to S96. Step S91 to step S94 are the same as step S11 to step S14 described above, and will not be described here again.
In step S95, when the fire detection result indicates that the target object is flame, a thermal imaging technology or an infrared imaging technology is adopted to obtain a third image, and the area occupied by the target object in the third image is the third area.
In this embodiment of the present application, when the fire detection result indicates that the target object is flame, the electronic device may acquire an infrared thermal imaging image acquired by the infrared thermal imaging device using a thermal imaging technology or an infrared imaging technology, that is, a third image. The third region in the third image may be detected and determined by a related object detection method, or a region in the infrared thermal imaging image at the same position as the first region may be used as the third region.
In the embodiment of the application, the infrared thermal imaging device can be in an on state; when the fire detection result indicates that the target object is flame, the electronic device can directly acquire the current infrared thermal imaging image collected by the infrared thermal imaging device, which improves fire detection efficiency. The infrared thermal imaging device can also be in an off state; when the fire detection result indicates that the target object is flame, the electronic device can first start the infrared thermal imaging device and then acquire the infrared thermal imaging image it collects, which saves energy on the electronic device and reduces the cost of fire detection.
In step S96, if the average brightness value of the third area is greater than the preset brightness threshold, alarm information indicating that the target object is flame is output.
The preset brightness threshold can be obtained by collecting infrared thermal imaging images of ignition points under different illumination conditions and counting the distribution of their brightness values. The preset brightness threshold may be a lower limit of the brightness values, such as 220, which improves the fire detection rate while reducing the false alarm rate; it may also be set toward the upper limit of the brightness values, which favors fire detection accuracy.
The electronic equipment obtains the brightness average value of the third area, and compares the brightness average value of the third area with a preset brightness threshold value. If the average brightness value of the third area is greater than the preset brightness threshold, the electronic device can output alarm information indicating that the target object is flame, and if the average brightness value of the third area is less than or equal to the preset brightness threshold, the target object is not flame and alarm information is not output.
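A minimal sketch of this brightness check, assuming the third area is given as a bounding box in the infrared image and using the illustrative lower-bound threshold of 220 mentioned above:

```python
import numpy as np

# Hypothetical threshold; the text suggests a statistically derived
# lower bound such as 220 for the fire-point brightness distribution.
BRIGHTNESS_THRESHOLD = 220

def flame_confirmed(ir_image, region):
    """region = (x, y, w, h) of the third area in the infrared image;
    returns True when the mean brightness exceeds the threshold."""
    x, y, w, h = region
    patch = ir_image[y:y + h, x:x + w].astype(np.float64)
    return patch.mean() > BRIGHTNESS_THRESHOLD
```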
In the technical scheme provided by the embodiment of the application, when the electronic equipment is provided with the thermal imaging technology or the infrared imaging technology, the fire detection result can be filtered again through the thermal imaging technology or the infrared imaging technology, so that the fire false alarm rate is further reduced.
The fire detection method according to the embodiment of the present application will be described in detail with reference to a fire detection flow shown in fig. 10. The electronic equipment comprises a target detection network, a foreground extraction module, a filtering module, a tracking module, a similarity comparison module, a dynamic classification module, an infrared thermal imaging module and an alarm module.
Step S101: the target detection network detects suspicious pyrotechnic targets in the image, locates them, and outputs target frame information.
Step S102: the target detection network filters out suspicious targets whose confidence, included in the target frame information, is smaller than a preset confidence threshold. The remaining suspicious targets are the detection results of the target detection network.
Step S101 and step S102 can be referred to the description in step S12 described above.
Step S103: the foreground extraction module performs foreground extraction on the image to obtain a moving foreground region. See the description in step S51 above.
Step S104: the filtering module is used for matching the detection result of the target detection network with the foreground region extracted by the foreground extraction module and filtering the detection result which is not in the foreground region. See the description in step S52 and step S53 described above.
Step S105: the tracking module tracks the residual detection results and maintains tracking information of all the detection results.
Step S106: the tracking module analyzes the tracking information, and filters the tracking targets of which the length of the tracking sequence exceeds a length threshold and the proportion of the alarming result in the tracking sequence does not meet the duty ratio threshold.
The length of the tracking sequence is the number of image frames from the first appearance of the suspicious target to the current moment; the length threshold is the preset number; the proportion of alarm results in the tracking sequence is the ratio of the total number of alarm images to the preset number; and the duty-ratio threshold is the ratio of the preset frame number threshold to the preset number.
In step S106, when the length of the tracking sequence does not exceed the length threshold, a tracking target (i.e., a target object) whose proportion of alarm results in the tracking sequence satisfies the duty-ratio threshold may be retained, and step S107 is performed. If the length of the tracking sequence exceeds the length threshold but the proportion of alarm results in the tracking sequence does not satisfy the duty-ratio threshold, the tracking target is filtered out.
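A minimal sketch of this step S106 filter, with hypothetical values for the length threshold and duty-ratio threshold (the patent only says they are preset):

```python
# Hypothetical values; the patent only says these are preset.
LENGTH_THRESHOLD = 5   # length threshold (the preset number of frames)
RATIO_THRESHOLD = 0.6  # duty-ratio threshold

def keep_tracking_target(track_len, alarm_frames):
    """Drop a tracking target only when its sequence exceeds the length
    threshold AND its alarm-frame proportion falls short of the
    duty-ratio threshold; otherwise keep it."""
    ratio = alarm_frames / max(track_len, 1)
    return not (track_len > LENGTH_THRESHOLD and ratio < RATIO_THRESHOLD)
```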
Step S107: the similarity comparison module selects a tracking target with the length exceeding a length threshold value from the rest of tracking information, and collects an alarm small image of the tracking target. The alarm small image is an image of the area occupied by the tracking target.
Step S105 to step S107 can be referred to the descriptions in steps S13 and S14 above.
Step S108: the similarity comparison module compares the structural similarity of the latest alarm small image of the tracking target with the alarm small image in the subsequent frame image; if the structural similarity does not meet the condition, detection exits. See the description above in the embodiment shown in fig. 6. The condition is not met when the structural similarity is outside the preset similarity interval.
Step S109: the dynamic classification module screens the alarm small images of the tracking target in an equally spaced manner, sends the screened alarm small images into the dynamic classification network for dynamic judgment, and filters out tracking targets whose judgment results do not meet the classification threshold. See the description above in the embodiment shown in fig. 7.
In the embodiment of the present application, the execution order of step S108 and step S109 is not limited.
Step S1010: the infrared thermal imaging module judges whether the infrared imaging or thermal imaging switch is on, and whether the tracking target is of flame type. If the switch is on and the tracking target is of flame type, step S1011 is executed; otherwise, step S1012 is executed.
Step S1011: the infrared thermal imaging module acquires an infrared thermal imaging image, performs region matching on the last detection result of the tracking target, calculates a brightness value in the region, compares the calculated brightness value with a brightness threshold value, and filters the tracking target which does not meet the brightness threshold value.
Step S1012: the alarm module generates and outputs smoke and fire alarm information. The smoke and fire warning information comprises target frame information, suspected target types and the like.
In the following, a specific example will be described, where the buffer length is set to 5, i.e. the preset number is set to 5.
The electronic equipment continuously acquires 5 frames of images, such as 5 frames of images shown in (a) - (e) in fig. 11.
After the electronic device detects the 5 frames of images using the target detection network, the output detection results are shown in fig. 12 (a) - (e), and include a target frame (i.e., a white rectangular frame in fig. 12) and a confidence (or score), where the score of the target object detected in fig. 12 (a) is 0.660929, the score of the target object detected in fig. 12 (b) is 0.597544, the score of the target object detected in fig. 12 (c) is 0.599902, the score of the target object detected in fig. 12 (d) is 0.347256, and the score of the target object detected in fig. 12 (e) is 0.432127.
The electronic device performs foreground detection on the 5 frames of images to obtain a foreground proportion of each target frame, namely, a foreground proportion of the target object, wherein the foreground proportion is a proportion occupied by foreground pixel points in the target frame, as shown in fig. 13, the foreground proportion of the detected target object in fig. 13 (a) is 0.617188, the foreground proportion of the detected target object in fig. 13 (b) is 0.573913, the foreground proportion of the detected target object in fig. 13 (c) is 0.586667, the foreground proportion of the detected target object in fig. 13 (d) is 0.634615, and the foreground proportion of the detected target object in fig. 13 (e) is 0.603704.
Assuming that the foreground proportion of the target object in the 5 frames of images is greater than a preset proportion threshold, the electronic equipment tracks the 5 frames of images and caches tracking information. The tracking information comprises an alarm target small image, such as a white rectangular frame area in (a) - (e) in fig. 14, and tracking IDs of target objects, such as tracking IDs of 6 in (a) - (e) in fig. 14.
Based on the cached tracking information, the electronic device performs structural similarity judgment, dynamic classification (dynamic characteristic judgment), infrared imaging or thermal imaging judgment, and the like on the target object, so as to judge whether the target object is smoke or flame and, if so, issue a smoke and fire alarm.
Corresponding to the above fire detection method, the embodiment of the present application further provides a fire detection device, as shown in fig. 15, including:
the first acquisition module 151: configured to acquire a first image;

detection module 152: configured to detect the first image using a target detection network to obtain a target object detected as smoke or flame;

the first determination module 153: configured to, when the target object is a foreground in the first image, determine the total frame number of alarm images in a continuous preset number of frame images before the first image, wherein the alarm images are images including the target object among the preset number of frame images;

judgment module 154: configured to, if the total frame number is greater than or equal to a preset frame number threshold, perform structural similarity judgment and dynamic characteristic judgment on the target object to obtain a fire detection result.
In the technical scheme provided by the embodiment of the application, the target detection network is utilized to combine image characteristics such as the color and shape of smoke and flame in the image, screening from the image a large number of target objects detected as smoke or flame; this realizes a high detection rate of smoke and flame and expands the application range of the fire detection method. After the target objects are obtained, multistage filtering is applied to them using the foreground, the total frame number of collected target objects, structural similarity, dynamic characteristics, and the like, removing non-smoke and non-flame target objects, so that target objects that are smoke or flame can be accurately determined; this improves the accuracy of the fire detection result and reduces the false alarm rate of fire.
In some embodiments, the apparatus further comprises:
a second determination module: configured to determine foreground pixel points and background pixel points in the first image by using a Gaussian mixture model to obtain a foreground image (a sketch of this foreground-ratio computation follows the list);

a statistics module: configured to count the number of foreground pixel points included in the target area corresponding to the target object on the foreground image;

a third determination module: configured to determine that the target object is the foreground in the first image if the ratio of the counted number to the total number of pixel points included in the target area is greater than a preset proportion threshold.
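As referenced in the module list above, here is a minimal sketch of the foreground-ratio computation, using OpenCV's MOG2 background subtractor as one Gaussian-mixture-model implementation; the patent does not name a specific library, and the box format is an assumption.

```python
import cv2
import numpy as np

# OpenCV's MOG2 background subtractor is one Gaussian-mixture-model
# implementation; the patent does not prescribe a specific one.
subtractor = cv2.createBackgroundSubtractorMOG2()

def foreground_ratio(frame, box):
    """Ratio of foreground pixels inside the target box (x, y, w, h);
    compare the result with the preset proportion threshold."""
    mask = subtractor.apply(frame)   # 255 marks foreground pixels
    x, y, w, h = box
    roi = mask[y:y + h, x:x + w]
    return float(np.count_nonzero(roi == 255)) / roi.size
```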
In some embodiments, the area occupied by the target object in the first image is a first area;
the judging module 154 includes:
an acquisition unit: configured to acquire a second image, wherein the region in the second image at the same position as the first region is a second region;

a calculation unit: configured to calculate the structural similarity of the first region and the second region;

a determination unit: configured to determine that the target object is suspected smoke if the structural similarity is within a preset similarity interval.
In some embodiments, the computing unit is specifically configured to:
and determining the structural similarity of the first area and the second area according to the brightness average value, the brightness standard deviation and the brightness covariance of the first area and the brightness average value, the brightness standard deviation and the brightness covariance of the second area.
In some embodiments, the determining module 154 includes:
an intercepting unit: configured to intercept the areas occupied by the target object in a plurality of alarm images to obtain a plurality of alarm target small images;

a judging unit: configured to perform dynamic characteristic judgment on the plurality of alarm target small images by using a dynamic classification network to obtain a classification result, wherein the classification result indicates whether the target object is suspected smoke.
In some embodiments, the intercepting unit is specifically configured to:
selecting an alarm image from the plurality of alarm images in an equally spaced manner;
and intercepting the area occupied by the target object from the selected alarm image to obtain an alarm target small image.
In some embodiments, the apparatus further comprises:
a second acquisition module: configured to, after the fire detection result is obtained and when the fire detection result indicates that the target object is flame, obtain a third image using a thermal imaging technology or an infrared imaging technology, wherein the area occupied by the target object in the third image is a third area;

an output module: configured to output alarm information indicating that the target object is flame if the average brightness value of the third area is greater than a preset brightness threshold.
In correspondence with the above fire detection method, the embodiment of the present application further provides an electronic device, as shown in fig. 16, including a processor 161, a communication interface 162, a memory 163, and a communication bus 164, wherein the processor 161, the communication interface 162, and the memory 163 perform communication with each other through the communication bus 164,
A memory 163 for storing a computer program;
the processor 161 is configured to execute the program stored in the memory 163 to implement the steps of the fire detection method.
The communication bus mentioned for the electronic device may be a peripheral component interconnect standard (Peripheral Component Interconnect, PCI) bus or an extended industry standard architecture (Extended Industry Standard Architecture, EISA) bus, etc. The communication bus may be classified as an address bus, a data bus, a control bus, or the like. For ease of illustration, only one bold line is shown in the figure, but this does not mean there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The Memory may include random access Memory (Random Access Memory, RAM) or may include Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
In yet another embodiment provided herein, there is also provided a computer readable storage medium having stored therein a computer program which when executed by a processor implements the steps of any of the fire detection methods described above.
In yet another embodiment provided herein, there is also provided a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the fire detection methods of the above embodiments.
In the above embodiments, the implementation may be in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk (SSD)), etc.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for apparatus, electronic devices, storage media and computer program product embodiments, the description is relatively simple, as it is substantially similar to method embodiments, with reference to the description of method embodiments in part.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modifications, equivalent substitutions, improvements, etc. that are within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (16)

1. A fire detection method, the method comprising:
acquiring a first image;
detecting the first image by adopting a target detection network to obtain a target object detected as smoke or flame;
when the target object is a foreground in the first image, determining the total frame number of alarm images in a continuous preset number of frame images before the first image, wherein the alarm images are images including the target object in the preset number of frame images;
and if the total frame number is greater than or equal to a preset frame number threshold, judging the structural similarity and the dynamic characteristics of the target object to obtain a fire detection result.
2. The method according to claim 1, wherein the method further comprises:
determining foreground pixel points and background pixel points in the first image by using a Gaussian mixture model to obtain a foreground image;
counting the number of foreground pixel points included in a corresponding target area of the target object on the foreground image;
If the ratio of the counted number to the total number of the pixel points included in the target area is greater than a preset proportion threshold, determining that the target object is the foreground in the first image.
3. The method of claim 1, wherein the area occupied by the target object in the first image is a first area;
and judging the structural similarity of the target object, wherein the method comprises the following steps:
acquiring a second image, wherein an area with the same position as the first area in the second image is a second area;
calculating the structural similarity of the first area and the second area;
and if the structural similarity is within a preset similarity interval, determining that the target object is suspected smoke.
4. A method according to claim 3, wherein the step of calculating the structural similarity of the first region and the second region comprises:
and determining the structural similarity of the first area and the second area according to the brightness average value, the brightness standard deviation and the brightness covariance of the first area and the brightness average value, the brightness standard deviation and the brightness covariance of the second area.
5. The method of claim 1, wherein performing dynamic property determination on the target object comprises:
Intercepting the areas occupied by the target objects in the alarm images to obtain a plurality of alarm target small images;
and adopting a dynamic classification network to judge the dynamic characteristics of the plurality of alarm target small images to obtain a classification result, wherein the classification result indicates whether the target object is suspected smoke or not.
6. The method of claim 5, wherein the step of capturing the areas occupied by the target objects in the plurality of alert images to obtain a plurality of alert target small images comprises:
selecting an alarm image from the plurality of alarm images in an equally spaced manner;
and intercepting the area occupied by the target object from the selected alarm image to obtain an alarm target small image.
7. The method of claim 1, wherein after obtaining the fire detection result, the method further comprises:
when the fire detection result indicates that the target object is flame, a thermal imaging technology or an infrared imaging technology is adopted to obtain a third image, wherein the area occupied by the target object in the third image is a third area;
and if the average brightness value of the third area is larger than a preset brightness threshold value, outputting alarm information indicating that the target object is flame.
8. A fire detection device, the device comprising:
a first acquisition module: for acquiring a first image;
a detection module: configured to detect the first image by adopting a target detection network to obtain a target object detected as smoke or flame;

a first determination module: configured to, when the target object is a foreground in the first image, determine the total frame number of alarm images in a continuous preset number of frame images before the first image, wherein the alarm images are images including the target object in the preset number of frame images;

a judging module: configured to, if the total frame number is greater than or equal to a preset frame number threshold, perform structural similarity judgment and dynamic characteristic judgment on the target object to obtain a fire detection result.
9. The apparatus of claim 8, wherein the apparatus further comprises:
a second determination module: configured to determine foreground pixel points and background pixel points in the first image by using a Gaussian mixture model to obtain a foreground image;

a statistics module: configured to count the number of foreground pixel points included in the target area corresponding to the target object on the foreground image;

a third determination module: configured to determine that the target object is the foreground in the first image if the ratio of the counted number to the total number of pixel points included in the target area is greater than a preset proportion threshold.
10. The apparatus of claim 8, wherein the area occupied by the target object in the first image is a first area;
the judging module comprises:
an acquisition unit: configured to acquire a second image, wherein the region in the second image at the same position as the first region is a second region;
a calculation unit: for calculating a structural similarity of the first region and the second region;
a determination unit: configured to determine that the target object is suspected smoke if the structural similarity is within a preset similarity interval.
11. The apparatus according to claim 10, characterized in that the computing unit is specifically configured to:
and determining the structural similarity of the first area and the second area according to the brightness average value, the brightness standard deviation and the brightness covariance of the first area and the brightness average value, the brightness standard deviation and the brightness covariance of the second area.
12. The apparatus of claim 8, wherein the determining module comprises:
an intercepting unit: configured to intercept the areas occupied by the target object in a plurality of alarm images to obtain a plurality of alarm target small images;

a judging unit: configured to perform dynamic characteristic judgment on the plurality of alarm target small images by adopting a dynamic classification network to obtain a classification result, wherein the classification result indicates whether the target object is suspected smoke.
13. The device according to claim 12, characterized in that said interception unit is specifically configured to:
selecting an alarm image from the plurality of alarm images in an equally spaced manner;
and intercepting the area occupied by the target object from the selected alarm image to obtain an alarm target small image.
14. The apparatus of claim 8, wherein the apparatus further comprises:
a second acquisition module: configured to, after the fire detection result is obtained and when the fire detection result indicates that the target object is flame, obtain a third image by adopting a thermal imaging technology or an infrared imaging technology, wherein the area occupied by the target object in the third image is a third area;

an output module: configured to output alarm information indicating that the target object is flame if the average brightness value of the third area is greater than a preset brightness threshold.
15. An electronic device, characterized by comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method of any of claims 1-7 when executing a program stored on a memory.
16. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the method of any of claims 1-7.
CN202310395650.XA 2023-04-14 2023-04-14 Fire detection method and device, electronic equipment and storage medium Active CN116152667B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310395650.XA CN116152667B (en) 2023-04-14 2023-04-14 Fire detection method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116152667A true CN116152667A (en) 2023-05-23
CN116152667B CN116152667B (en) 2023-06-30

Family

ID=86354603


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101770644A (en) * 2010-01-19 2010-07-07 浙江林学院 Forest-fire remote video monitoring firework identification method
CN102236947A (en) * 2010-04-29 2011-11-09 中国建筑科学研究院 Flame monitoring method and system based on video camera
CN103473788A (en) * 2013-07-31 2013-12-25 中国电子科技集团公司第三十八研究所 Indoor fire and flame detection method based on high-definition video images
CN107633212A (en) * 2017-08-30 2018-01-26 清华大学苏州汽车研究院(吴江) A kind of firework detecting method and device based on video image
CN108830161A (en) * 2018-05-18 2018-11-16 武汉倍特威视系统有限公司 Smog recognition methods based on video stream data
CN109086647A (en) * 2018-05-24 2018-12-25 北京飞搜科技有限公司 Smog detection method and equipment
CN110276284A (en) * 2019-06-11 2019-09-24 暨南大学 Flame identification method, device, equipment and storage medium based on video quality assessment
US20200012859A1 (en) * 2017-03-28 2020-01-09 Zhejiang Dahua Technology Co., Ltd. Methods and systems for fire detection
CN110807377A (en) * 2019-10-17 2020-02-18 浙江大华技术股份有限公司 Target tracking and intrusion detection method, device and storage medium
CN111310662A (en) * 2020-02-17 2020-06-19 淮阴工学院 Flame detection and identification method and system based on integrated deep network
CN113096103A (en) * 2021-04-15 2021-07-09 北京工业大学 Intelligent smoke image sensing method for emptying torch
CN113657233A (en) * 2021-08-10 2021-11-16 东华大学 Unmanned aerial vehicle forest fire smoke detection method based on computer vision
CN113657250A (en) * 2021-08-16 2021-11-16 南京图菱视频科技有限公司 Flame detection method and system based on monitoring video
CN115063718A (en) * 2022-06-10 2022-09-16 嘉洋智慧安全生产科技发展(北京)有限公司 Fire detection method, fire detection device, fire detection apparatus, fire detection program, storage medium, and storage medium
CN115410134A (en) * 2022-09-30 2022-11-29 西安工程大学 Video fire smoke detection method based on improved YOLOv5s


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG HONG et al.: "Networked Video Smoke and Fire Monitoring System Based on DM642 and i.MX27", International Journal of Smart Home, vol. 11, no. 2, pages 25-38 *
GUO Yipeng: "Smoke recognition method for large-space indoor fires based on video surveillance", China Master's Theses Full-text Database, Information Science and Technology, no. 03, pages 136-959 *

Also Published As

Publication number Publication date
CN116152667B (en) 2023-06-30

Similar Documents

Publication Publication Date Title
CN108062349B (en) Video monitoring method and system based on video structured data and deep learning
CN108053427B (en) Improved multi-target tracking method, system and device based on KCF and Kalman
CN108052859B (en) Abnormal behavior detection method, system and device based on clustering optical flow characteristics
CN112560657B (en) Method, device, computer device and storage medium for identifying smoke and fire
Çetin et al. Video fire detection–review
Mahmoud et al. Forest fire detection and identification using image processing and SVM
KR101237089B1 (en) Forest smoke detection method using random forest classifier method
US7868772B2 (en) Flame detecting method and device
JP5218906B2 (en) Smoke detection device and smoke detection method
US9613432B2 (en) Fire detection system and method employing digital images processing
CN110390229B (en) Face picture screening method and device, electronic equipment and storage medium
Pundir et al. Deep belief network for smoke detection
CN109815787B (en) Target identification method and device, storage medium and electronic equipment
KR20130032856A (en) A method for monitoring a video and an apparatus using it
CN110619277A (en) Multi-community intelligent deployment and control method and system
Lim et al. Gun detection in surveillance videos using deep neural networks
CN111797726A (en) Flame detection method and device, electronic equipment and storage medium
KR101454644B1 (en) Loitering Detection Using a Pedestrian Tracker
CN111047624A (en) Image dim target detection method, device, equipment and storage medium
CN112651398A (en) Vehicle snapshot control method and device and computer readable storage medium
CN110544271B (en) Parabolic motion detection method and related device
CN113920585A (en) Behavior recognition method and device, equipment and storage medium
KR101394270B1 (en) System and method for image monitoring
CN113657250A (en) Flame detection method and system based on monitoring video
JP5758165B2 (en) Article detection device and stationary person detection device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant