CN115691026A - Intelligent early warning monitoring management method for forest fire prevention - Google Patents

Intelligent early warning monitoring management method for forest fire prevention Download PDF

Info

Publication number
CN115691026A
CN115691026A CN202211704016.1A CN202211704016A CN115691026A CN 115691026 A CN115691026 A CN 115691026A CN 202211704016 A CN202211704016 A CN 202211704016A CN 115691026 A CN115691026 A CN 115691026A
Authority
CN
China
Prior art keywords
pixel point
edge
edge pixel
temperature
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211704016.1A
Other languages
Chinese (zh)
Other versions
CN115691026B (en
Inventor
袁传武
张维
王怡
孙拥康
吴文丰
周明玥
刘卫华
刘驰
周胜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Optical Valley Technology Co ltd
HUBEI ACADEMY OF FORESTRY
Original Assignee
Optical Valley Technology Co ltd
HUBEI ACADEMY OF FORESTRY
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Optical Valley Technology Co ltd, HUBEI ACADEMY OF FORESTRY filed Critical Optical Valley Technology Co ltd
Priority to CN202211704016.1A priority Critical patent/CN115691026B/en
Publication of CN115691026A publication Critical patent/CN115691026A/en
Application granted granted Critical
Publication of CN115691026B publication Critical patent/CN115691026B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/28Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture specially adapted for farming

Landscapes

  • Fire-Detection Mechanisms (AREA)

Abstract

The invention relates to the technical field of forest fire prevention monitoring, in particular to an intelligent early warning monitoring management method for forest fire prevention. The method comprises the following steps: clustering pixels in a temperature image to be analyzed to obtain an initial abnormal area, judging whether a fire disaster happens according to the temperature of the pixels in the initial abnormal area, and if the fire disaster happens, calculating the edge probability corresponding to each characteristic pixel so as to obtain candidate pixels; judging the category of each candidate pixel point according to the included angle between the straight line corresponding to each candidate pixel point and the vector vertical to the wind speed and the included angle between the vector of each candidate pixel point pointing to the clustering center of the initial abnormal area and the wind speed vector, and further calculating the temperature change index; and obtaining the smoke spreading speed and the flame spreading speed based on the gray value, the height, the motion direction angle, the gray value corresponding to the standard red and the coordinates of the matched pixel points of the edge pixel points in the gray image to be analyzed, and further judging the early warning grade. The invention improves the early warning reliability of forest fire prevention.

Description

Intelligent early warning monitoring management method for forest fire prevention
Technical Field
The invention relates to the technical field of forest fire prevention monitoring, in particular to an intelligent early warning monitoring management method for forest fire prevention.
Background
The unique natural environment of the forest is often used as a fire place, and the great destructive power of the forest fire brings irreparable great economic loss, heavy casualties and serious damage to the ecological environment to human beings. Therefore, in order to reduce the loss caused by the forest fire, it is necessary to detect the occurrence of the fire as much as possible in the early stage of the occurrence of the fire.
An air-space-ground-human four-in-one forest fire sensing system for forest fire prevention integrates various monitoring devices such as a remote sensing satellite, an unmanned aerial vehicle, a high-point cloud deck, ground patrol and the like, a perfect air-space-ground-human integrated forest fire monitoring network is constructed, and a skyrome type comprehensive and efficient monitoring and early warning is carried out on forest fires. The traditional fire identification method comprises the steps of collecting images of forest monitoring areas, obtaining abnormal areas by using a K-means clustering algorithm, comparing the abnormal areas with standard images when no fire occurs, and carrying out intelligent early warning when the difference reaches a set threshold value; when the abnormal area in the temperature image is obtained by using the K-means clustering algorithm, because the distance measurement mode of the traditional K-means clustering algorithm is obtained through the temperature difference between the pixel point and the clustering center, under the influence of fire, the central temperature of the fire area is attenuated towards the fire edge, the fire continuously spreads, the temperature of the normal area without the fire is possibly higher due to the heat conduction effect at the edge of the fire area, and the temperature of the central point of the fire area is possibly more different from the temperature of the fire edge, so that when the traditional clustering algorithm based on the temperature difference is used for carrying out clustering analysis, the extraction result of the abnormal area is not accurate enough, and the early warning reliability of the fire-proof forest is lower.
Disclosure of Invention
In order to solve the problem of low reliability of the existing method in intelligent early warning of forest fire prevention, the invention aims to provide an intelligent early warning monitoring management method for forest fire prevention, and the adopted technical scheme is as follows:
the invention provides an intelligent early warning monitoring management method for forest fire prevention, which comprises the following steps:
acquiring two adjacent frames of temperature images and corresponding gray level images of a forest monitoring area, and respectively recording the temperature images as a reference temperature image, a temperature image to be analyzed, a reference gray level image and a gray level image to be analyzed;
clustering pixels in a temperature image to be analyzed to obtain an initial abnormal area, judging whether a fire disaster occurs or not according to the temperature of the pixels in the initial abnormal area, if so, marking each edge pixel of the initial abnormal area and pixels outside the initial abnormal area in a preset neighborhood as feature pixels, and obtaining a straight line corresponding to each feature pixel based on each feature pixel and a clustering center of the initial abnormal area; obtaining edge probability corresponding to each characteristic pixel point according to the gray value of the pixel point on the straight line, and obtaining candidate pixel points according to the edge probability; judging the category of each candidate pixel point according to the minimum included angle between a straight line corresponding to each candidate pixel point and a vector vertical to the wind speed, the included angle between the vector pointing to the clustering center of the initial abnormal area by each candidate pixel point and the wind speed vector and the marginal probability; obtaining a temperature change index based on the category, the coordinate and the wind direction angle of each candidate pixel point and the temperature of the pixel point in the reference temperature image;
acquiring matched pixel points of edge pixel points in a gray level image to be analyzed in a reference gray level image; obtaining a possibility index of each edge pixel point based on the gray value, the corresponding height, the corresponding motion direction angle, the gray value corresponding to the standard red and the wind direction angle of each edge pixel point in the gray image to be analyzed; obtaining smoke spreading speed and flame spreading speed according to the possibility index and the coordinates of the matched pixel points; and judging the early warning level based on the temperature change index, the smoke spreading speed and the flame spreading speed.
Preferably, the determining the category of each candidate pixel point according to the minimum included angle between the straight line corresponding to each candidate pixel point and the vector perpendicular to the wind speed, the included angle between the vector pointing to the clustering center of the initial abnormal region by each candidate pixel point and the wind speed vector, and the edge probability includes:
for any candidate pixel point: judging whether the minimum included angle between a straight line corresponding to the candidate pixel point and a vector perpendicular to the wind speed is smaller than or equal to a first angle threshold value or not, and if the minimum included angle is smaller than or equal to the first angle threshold value, judging that the candidate pixel point is a fire wing pixel point; if the wind speed vector is larger than the first angle threshold, judging whether an included angle between the vector of the candidate pixel point pointing to the clustering center of the initial abnormal area and the wind speed vector is smaller than or equal to a second angle threshold, if the included angle is smaller than or equal to the second angle threshold, judging that the candidate pixel point is a fire head pixel point, and if the included angle is larger than the first angle threshold, judging that the candidate pixel point is a fire tail pixel point.
Preferably, the obtaining of the temperature change index based on the category, the coordinate, the wind direction angle of each candidate pixel point and the temperature of the pixel point in the reference temperature image includes:
for any candidate pixel: acquiring an included angle between a straight line formed by the candidate pixel point and the pixel point with the maximum temperature difference in the preset neighborhood and the horizontal positive direction of the candidate pixel point, and recording the included angle as a first included angle; obtaining the confidence corresponding to the candidate pixel point according to the temperature, the coordinate, the first included angle and the wind direction angle of the actual edge pixel point obtained in the category corresponding to the candidate pixel point;
obtaining edge pixel points of the fire area based on the confidence coefficient and the edge probability corresponding to each candidate pixel point; obtaining a fire area based on edge pixel points of the fire area; and calculating a temperature change index according to the Euclidean distance between the pixel point in the fire area and the clustering center of the initial abnormal area, the temperature of the pixel point in the fire area and the temperature of the corresponding pixel point in the reference temperature image.
Preferably, obtaining the confidence corresponding to each candidate pixel point includes:
for the v-th candidate pixel point:
if the candidate pixel point is a fire point, calculating the confidence corresponding to the candidate pixel point by adopting the following formula:
Figure 398004DEST_PATH_IMAGE001
wherein the content of the first and second substances,
Figure 280028DEST_PATH_IMAGE002
for the confidence of the corresponding nth candidate pixel point,
Figure 65582DEST_PATH_IMAGE003
is the included angle between the straight line and the horizontal positive direction formed by the pixel point with the maximum temperature difference between the v-th candidate pixel point and the preset neighborhood and the v-th candidate pixel point,
Figure 662916DEST_PATH_IMAGE004
is the wind direction angle, and the wind direction angle,
Figure 559328DEST_PATH_IMAGE005
to have judgedThe number of actual edge pixel points of the broken fire head region,
Figure 292929DEST_PATH_IMAGE006
the temperature of the v-th candidate pixel point,
Figure 452646DEST_PATH_IMAGE007
is a first
Figure 268768DEST_PATH_IMAGE008
The temperature of the pixel points at the actual edge of each fire head,
Figure 703291DEST_PATH_IMAGE009
for the v-th candidate pixel point and the judged fire head region
Figure 556978DEST_PATH_IMAGE008
The Euclidean distance of each actual edge pixel point,
Figure 418754DEST_PATH_IMAGE010
the maximum Euclidean distance between the v-th candidate pixel point and the actual edge pixel point of the judged fire head region,
Figure 990681DEST_PATH_IMAGE011
in order to take the absolute value of the value,
Figure 963316DEST_PATH_IMAGE012
is a function of taking the maximum value.
Preferably, the temperature change index is calculated using the following formula:
Figure 928300DEST_PATH_IMAGE013
wherein, the first and the second end of the pipe are connected with each other,
Figure 226557DEST_PATH_IMAGE014
as an index of the temperature change,
Figure 551359DEST_PATH_IMAGE015
for the number of pixel points in the fire area in the temperature image to be analyzed,
Figure 593264DEST_PATH_IMAGE016
the Euclidean distance between the ith pixel point in the fire area and the clustering center of the initial abnormal area in the temperature image to be analyzed,
Figure 93647DEST_PATH_IMAGE017
the maximum value of the Euclidean distances between all pixel points in the fire area and the clustering center of the initial abnormal area in the temperature image to be analyzed,
Figure 562806DEST_PATH_IMAGE018
for the temperature of the ith pixel point in the fire area in the temperature image to be analyzed,
Figure 840816DEST_PATH_IMAGE019
the temperature of a pixel point in the reference temperature image, which is the same as the ith pixel point in the fire area in the temperature image to be analyzed,
Figure 686412DEST_PATH_IMAGE011
the absolute value is taken.
Preferably, the obtaining the edge probability corresponding to each feature pixel point according to the gray value of the pixel point on the straight line includes:
for the a-th characteristic pixel point:
acquiring a preset number of sampling points closest to the characteristic pixel point on two sides of the characteristic pixel point on a straight line corresponding to the characteristic pixel point respectively; based on the gray value of each sampling point and the gray value of the characteristic pixel point, calculating the edge probability corresponding to the characteristic pixel point by adopting the following formula:
Figure 572459DEST_PATH_IMAGE020
wherein, the first and the second end of the pipe are connected with each other,
Figure 478099DEST_PATH_IMAGE021
the edge probability corresponding to the a-th feature pixel point,
Figure 511914DEST_PATH_IMAGE022
the number of the sampling points in the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point,
Figure 426780DEST_PATH_IMAGE023
the first abnormal region in the straight line corresponding to the a-th characteristic pixel point
Figure 229651DEST_PATH_IMAGE024
The temperature of the individual sampling points is,
Figure 574700DEST_PATH_IMAGE025
the first abnormal region in the straight line corresponding to the a-th characteristic pixel point
Figure 361391DEST_PATH_IMAGE026
The temperature of the individual sampling points is,
Figure 814369DEST_PATH_IMAGE027
is the first outside the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point
Figure 737325DEST_PATH_IMAGE024
The temperature of the individual sampling points is,
Figure 250346DEST_PATH_IMAGE028
is the first outside the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point
Figure 258754DEST_PATH_IMAGE026
The temperature of the individual sampling points is,
Figure 249843DEST_PATH_IMAGE029
is the temperature of the a-th characteristic pixel point,
Figure 292886DEST_PATH_IMAGE030
the average temperature of the a-th characteristic pixel point and all sampling points on the corresponding straight line is calculated,
Figure 976808DEST_PATH_IMAGE031
is an exponential function with a natural constant as a base number,
Figure 472511DEST_PATH_IMAGE011
in order to take the absolute value of the value,
Figure 264362DEST_PATH_IMAGE032
to adjust the parameters.
Preferably, the obtaining of the probability index of each edge pixel point based on the gray value, the corresponding height, the corresponding motion direction angle, the gray value corresponding to the standard red color and the wind direction angle of each edge pixel point in the gray image to be analyzed includes:
for any edge pixel:
respectively carrying out normalization processing on the gray value of the edge pixel point and the corresponding height, calculating the difference value between a constant 1 and the normalized gray value, and recording as a first color difference corresponding to the edge pixel point; taking the product of the first color difference and the normalized height as the initial smoke possibility corresponding to the edge pixel point;
calculating the difference between the constant 1 and the normalized height, and recording as the height difference; calculating the difference between the gray value of the edge pixel point and the gray value corresponding to the standard red, and recording as a second color difference corresponding to the edge pixel point; taking a natural constant as a base number, and taking a value of an exponential function taking the negative second color difference as an index as a color difference index; taking the product of the height difference and the color difference index as the initial possibility of flame corresponding to the edge pixel point;
and obtaining the probability index of the edge pixel point according to the initial probability of the smoke, the initial probability of the flame, the corresponding motion direction angle, the coordinate of the edge pixel point in the preset neighborhood and the wind direction angle corresponding to the edge pixel point.
Preferably, the probability index of the edge pixel point is obtained according to the initial probability of smoke corresponding to the edge pixel point, the initial probability of flame corresponding to the edge pixel point, the corresponding motion direction angle, the coordinate of the edge pixel point in the preset neighborhood and the wind direction angle, and the probability index comprises:
calculating the average value of the angles of straight lines formed by the edge pixel points and the edge pixel points in the preset neighborhood, and recording the average value as a first angle average value; calculating the absolute value of the difference value between the first angle average value and the wind direction angle, recording the absolute value as a first direction difference, calculating the sum of the first direction difference and an adjusting parameter, recording the sum as a direction index, and taking the ratio of a constant 1 to the direction index as a first optimization coefficient corresponding to the w-th edge pixel point;
calculating the difference between the motion direction angle and the wind direction angle of the edge pixel point, and recording as a second direction difference; normalizing the second direction difference, and taking the difference value of the constant 1 and the normalized second direction difference as a second optimization coefficient corresponding to the edge pixel point;
obtaining a smoke possibility index of the edge pixel point according to the first optimization coefficient, the second optimization coefficient and the smoke initial possibility corresponding to the edge pixel point; obtaining a flame possibility index of the edge pixel point according to the first optimization coefficient, the second optimization coefficient and the flame initial possibility corresponding to the edge pixel point;
the likelihood indicators include a smoke likelihood indicator and a flame likelihood indicator.
Preferably, obtaining the smoke spreading speed and the flame spreading speed according to the probability index and the coordinates of the matched pixel points, includes:
for any edge pixel: if the smoke possibility index and the flame possibility index are both larger than the judgment threshold, judging whether the smoke possibility index is larger than the flame possibility index, if so, judging that the edge pixel point is a smoke edge pixel point, and if not, judging that the edge pixel point is a flame edge pixel point; if the smoke possibility index is larger than the judgment threshold and the flame possibility index is smaller than or equal to the judgment threshold, judging that the edge pixel point is a smoke edge pixel point; if the flame possibility index is larger than the judgment threshold and the smoke possibility index is smaller than or equal to the judgment threshold, judging the edge pixel point as a flame edge pixel point;
obtaining a smoke region and a flame region based on the smoke edge pixel points and the flame edge pixel points; and obtaining the smoke spreading speed and the flame spreading speed based on the Euclidean distance between each edge pixel point of the smoke area and the flame area and the corresponding matching pixel point and the acquisition time interval of the adjacent frame gray level images.
Preferably, the obtaining of matching pixel points of each edge pixel point in the gray scale image to be analyzed in the reference gray scale image includes:
for the w-th edge pixel point in the gray image to be analyzed:
obtaining the pixel point with the same position as the edge pixel point in the reference gray level image and marking as the first
Figure 161911DEST_PATH_IMAGE033
Each pixel point, calculating the w-th edge pixel point and the w-th edge pixel point
Figure 16735DEST_PATH_IMAGE033
The gray level difference of each pixel point is determined according to the second criterion if the gray level difference is not 0
Figure 999734DEST_PATH_IMAGE033
The gray value of each pixel point in the preset neighborhood of each pixel point, the first
Figure 598206DEST_PATH_IMAGE033
Gray values of pixel points in the preset neighborhood of all pixel points in the preset neighborhood of the pixel points, gray values of the w-th edge pixel points and gray values of pixel points in the preset neighborhood of the w-th edge pixel points are calculated, and the gray values of the pixel points in the preset neighborhood of the w-th edge pixel points are calculated
Figure 350261DEST_PATH_IMAGE033
The corresponding optimal value of each pixel point in the preset neighborhood of each pixel point; will be first
Figure 641565DEST_PATH_IMAGE033
Per pixel point predictionSetting a pixel point with the maximum preferred value in the neighborhood as a matching pixel point of the w-th edge pixel point in the reference gray image;
first, the
Figure 111861DEST_PATH_IMAGE033
The acquisition process of the corresponding preferred value of any pixel point in the preset neighborhood of each pixel point is as follows: calculating the gray difference between the pixel point and the w-th edge pixel point, and recording the gray difference as a first characteristic difference; calculating the gray difference of the pixel points at the corresponding positions in the preset neighborhood of each pixel point and the w-th edge pixel point in the preset neighborhood of the pixel point, and recording the gray difference as a second characteristic difference; obtaining a corresponding preference value of the pixel point based on the first characteristic difference and the second characteristic difference; the first feature difference and the second feature difference are both inversely related to preferred values.
The invention has at least the following beneficial effects:
1. the method analyzes the forest according to the temperature change condition, the smoke information and the flame information of the forest monitoring area, judges whether the forest monitoring area is in fire in real time, and once the forest is in fire, the faster the temperature change and the faster the spreading speed of the fire area are, the higher the danger degree is, so that when the forest monitoring area is in fire, the temperature change index and the fire spreading speed of the fire area need to be judged in time; considering that the fire spreading speed can be determined by the smoke spreading speed and the flame spreading speed, the smoke spreading speed and the flame spreading speed are calculated according to the probability indexes of the edge pixel points and the coordinates of the matched pixel points, and when the smoke spreading speed and the flame spreading speed are high, the faster the fire spreading is, the higher the danger degree is; the invention analyzes the forest condition from a plurality of aspects, improves the fire identification precision, judges the early warning grade according to the temperature change index, the smoke spreading speed and the flame spreading speed, completes the accurate identification and early warning of the fire, improves the early warning reliability and reduces the hazard of the fire.
2. When the temperature change index is obtained, the pixels in the temperature image to be analyzed are clustered by adopting a traditional clustering algorithm to obtain an initial abnormal area, then edge pixels in the initial abnormal area and pixels around the edge pixels are judged according to the internal temperature change condition of the fire area, and accurate edge pixels in the fire area are obtained, so that the calculation precision of the temperature change index is improved, and the early warning reliability of forest fire prevention can be effectively improved.
3. When the smoke spreading speed and the flame spreading speed are obtained, the fact that the color of flame is close to red and is close to the ground, the color of the smoke is dark and is far from the ground is considered, therefore, the gray value of the flame edge pixel point is close to the gray value corresponding to the standard red and is low in height, the gray value of the smoke edge pixel point is low and is high, and in addition, the smoke direction and the flame direction can be influenced by natural wind; based on the characteristics, the probability indexes of each edge pixel point in the gray level image to be analyzed are calculated, and then the smoke spreading speed and the flame spreading speed are obtained according to the probability indexes and the coordinates of the matched pixel points.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions and advantages of the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of an intelligent early warning monitoring management method for forest fire prevention according to the present invention.
Detailed Description
To further illustrate the technical means and effects of the present invention adopted to achieve the predetermined invention purpose, the following detailed description will be given for an intelligent early warning monitoring management method for forest fire prevention according to the present invention with reference to the accompanying drawings and preferred embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The specific scheme of the intelligent early warning monitoring management method for forest fire prevention provided by the invention is specifically described below with reference to the accompanying drawings.
An embodiment of an intelligent early warning monitoring management method for forest fire prevention comprises the following steps:
the embodiment provides an intelligent early warning monitoring management method for forest fire prevention, and as shown in fig. 1, the intelligent early warning monitoring management method for forest fire prevention of the embodiment includes the following steps:
the method comprises the following steps of S1, acquiring two adjacent frames of temperature images and corresponding gray level images of a forest monitoring area, and respectively recording the temperature images as a reference temperature image, a temperature image to be analyzed, a reference gray level image and a gray level image to be analyzed.
The specific scenario addressed by this embodiment is: the method comprises the steps of collecting monitoring images and temperature images of a forest monitoring area in real time through front-end monitoring equipment in the air-space-ground-human integrated early warning monitoring system, analyzing gray level images and temperature images of adjacent frames respectively, judging whether a fire disaster occurs or not, and if the fire disaster occurs, obtaining a temperature change index, a flame spreading speed and a smoke spreading speed so as to determine an early warning grade.
Firstly, a front-end monitoring device in an air-ground-human integrated early warning and monitoring system is used for collecting a monitoring image of a forest monitoring area in real time, the collected monitoring image is an RGB image, and the collected monitoring image is subjected to gray processing by a weighted gray processing method to obtain a gray image of the forest monitoring area; meanwhile, a temperature image of a forest monitoring area is collected in real time through a satellite, the temperature image is an infrared thermal image, and each pixel point in the temperature image has a corresponding temperature value. Graying is prior art and will not be described in detail herein.
In this embodiment, a temperature image and a corresponding grayscale image of a forest monitoring area are obtained, the forest monitoring area is analyzed by combining adjacent frame images, next, two adjacent frame temperature images and two corresponding adjacent frame grayscale images are taken as an example for explanation, a temperature image of a frame in the two adjacent frame temperature images at the previous acquisition time is taken as a reference temperature image, and a temperature image of a frame in the two adjacent frame temperature images at the later acquisition time is taken as a temperature image to be analyzed; recording a gray level image of the forest monitoring area acquired at the same time as the temperature image to be analyzed as a gray level image to be analyzed, and recording a gray level image of the forest monitoring area acquired at the same time as the reference temperature image as a reference gray level image; the present embodiment will be described below by taking a reference temperature image, a temperature image to be analyzed, a reference grayscale image, and a grayscale image to be analyzed as examples.
S2, clustering pixels in a temperature image to be analyzed to obtain an initial abnormal area, judging whether a fire disaster occurs or not according to the temperature of the pixels in the initial abnormal area, if so, marking each edge pixel of the initial abnormal area and pixels outside the initial abnormal area in a preset neighborhood as feature pixels, and obtaining a straight line corresponding to each feature pixel based on each feature pixel and a clustering center of the initial abnormal area; obtaining edge probability corresponding to each characteristic pixel point according to the gray value of the pixel point on the straight line, and obtaining candidate pixel points according to the edge probability; judging the category of each candidate pixel point according to the minimum included angle between a straight line corresponding to each candidate pixel point and a vector vertical to the wind speed, the included angle between the vector of each candidate pixel point pointing to the clustering center of the initial abnormal area and the wind speed vector and the marginal probability; and obtaining a temperature change index based on the category, the coordinate and the wind direction angle of each candidate pixel point and the temperature of the pixel point in the reference temperature image.
The method comprises the steps of firstly analyzing a reference temperature image and a temperature image to be analyzed to obtain a suspected fire area in the temperature image to be analyzed, setting the number of clustering categories to be 2 by using a K-means clustering algorithm for the temperature image to be analyzed, setting the clustering centers to be a pixel point with the highest temperature and a pixel point with the lowest temperature in the temperature image to be analyzed respectively, and setting the distance measurement mode to be temperature difference. Obtaining a region with higher temperature and a region with lower temperature in the temperature image to be analyzed through clustering, wherein the average gray value of the pixel points in the region with higher temperature is greater than the average gray value of the pixel points in the region with lower temperature; the temperature of the area where the fire occurs is higher than that of the normal area, so that the area with a larger average gray value of pixel points in the clustered temperature images to be analyzed is marked as an initial abnormal area, the average value of the temperatures of all the pixel points in the initial abnormal area is calculated, the average value of the temperatures of all the pixel points in the temperature images to be analyzed is calculated, the absolute value of the difference value between the average value of the temperatures of all the pixel points in the initial abnormal area and the average value of the temperatures of all the pixel points in the temperature images to be analyzed is calculated, the greater the absolute value is, the greater the possibility that the fire occurs in the forest is indicated, the temperature difference threshold is set, whether the absolute value is greater than the temperature difference threshold is judged, if the absolute value is greater, the fire occurs in the forest monitoring area is judged, and the early warning grade needs to be further judged; if the number of the fire disasters is less than or equal to the number of the fire disasters in the forest monitoring area, judging that the fire disasters do not occur in the forest monitoring area; in the embodiment, the temperature difference threshold is set to be 20 ℃, and in specific application, an implementer can set the temperature difference threshold according to the outside air temperature; since the fire area is obtained only according to the clustering result, it is likely that part of abnormal areas will be misjudged as normal areas, and therefore the initial abnormal area is not obtained accurately enough, the embodiment further analyzes the edge pixel points of the initial abnormal area and the neighborhood pixel points of the edge pixel points, and screens and obtains the real edge pixel points by obtaining the corresponding edge probability, so that the finally obtained fire area edge is more real and accurate, and the fire area can be further analyzed accurately.
And recording each edge pixel point of the initial abnormal area and pixel points outside the initial abnormal area in a preset neighborhood as characteristic pixel points, so as to obtain a plurality of characteristic pixel points, and then analyzing the characteristic pixel points.
For the a-th characteristic pixel point:
taking the characteristic pixel point and the clustering center of the initial abnormal area as two pixel points on a straight line corresponding to the characteristic pixel point, taking the characteristic pixel point and the clustering center of the initial abnormal area as a straight line corresponding to the characteristic pixel point, dividing the pixel point on the straight line into two parts based on the clustering center of the initial abnormal area, wherein one part is the pixel point positioned in the initial abnormal area, the other part is the pixel point positioned outside the initial abnormal area, respectively obtaining a preset number of sampling points which are closer to the a-th characteristic pixel point in the two parts, namely taking the a-th characteristic pixel point as a central point, respectively obtaining a preset number of sampling points which are closest to the a-th characteristic pixel point on two sides of the a-th characteristic pixel point on the straight line corresponding to the a-th characteristic pixel point, wherein the sampling points are all positioned on the straight line, the sampling points are divided into two types, one type is the sampling point in the initial abnormal area, and the other type is the sampling point outside the initial abnormal area; the preset number is set to 5 in the embodiment, and in a specific application, an implementer can set the preset number according to a specific situation. Considering that the temperature inside the fire area is continuously attenuated and accords with a certain distribution rule, a certain temperature change rule exists in the fire area, and the temperature change condition outside the fire area does not accord with the distribution rule, so that a certain difference exists; based on this, this embodiment judges the a-th feature pixel point through the difference between the temperature change of the sampling point in the initial abnormal region on the straight line corresponding to the a-th feature pixel point and the temperature change outside the initial abnormal region on the straight line corresponding to the a-th feature pixel point, obtains the marginal probability corresponding to the a-th feature pixel point, the marginal probability means the probability that the corresponding feature pixel point is the real marginal pixel point, and the specific expression of the marginal probability corresponding to the a-th feature pixel point is:
Figure 248444DEST_PATH_IMAGE020
wherein, the first and the second end of the pipe are connected with each other,
Figure 869654DEST_PATH_IMAGE021
the edge probability corresponding to the a-th feature pixel point,
Figure 331860DEST_PATH_IMAGE022
is the number of sampling points in the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point,
Figure 289451DEST_PATH_IMAGE023
is the first in the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point
Figure 229726DEST_PATH_IMAGE024
The temperature of the individual sampling points is,
Figure 956373DEST_PATH_IMAGE025
the first abnormal region in the straight line corresponding to the a-th characteristic pixel point
Figure 589480DEST_PATH_IMAGE026
The temperature of the individual sampling points is,
Figure 768788DEST_PATH_IMAGE027
is the first outside the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point
Figure 512753DEST_PATH_IMAGE024
The temperature of the individual sampling points is,
Figure 359487DEST_PATH_IMAGE028
is the first outside the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point
Figure 897915DEST_PATH_IMAGE026
The temperature of the individual sampling points is,
Figure 830099DEST_PATH_IMAGE029
is the temperature of the a-th characteristic pixel point,
Figure 843667DEST_PATH_IMAGE030
the average temperature of the a-th characteristic pixel point and all sampling points on the corresponding straight line is calculated,
Figure 544907DEST_PATH_IMAGE031
is an exponential function with a natural constant as a base number,
Figure 254237DEST_PATH_IMAGE011
in order to take the absolute value of the value,
Figure 673717DEST_PATH_IMAGE032
to adjust the parameters.
The adjustment parameter is introduced to prevent the denominator from being 0, and the present embodiment sets the adjustment parameter
Figure 493906DEST_PATH_IMAGE032
The value of (A) is 0.01, and in a specific application, an implementer can set the value according to specific situations;
Figure 315231DEST_PATH_IMAGE034
representing the difference between the temperature changes of the pixel points at the two sides of the a-th characteristic pixel point, wherein the smaller the difference is, the more the temperature change condition of the sampling point corresponding to the a-th characteristic pixel point accords with the temperature change rule in the fire area, and the more the a-th characteristic pixel point cannot be a real edge pixel point;
Figure 461042DEST_PATH_IMAGE035
representing the temperature difference between the a-th characteristic pixel point and the corresponding sampling point, wherein the larger the difference is, the more unlikely the a-th characteristic pixel point is to be a real edge pixel point; when the difference between the temperature changes of the pixel points at the two sides of the a-th characteristic pixel point is larger and the temperature difference between the a-th characteristic pixel point and the corresponding sampling point is smaller, the temperature change condition of the sampling point corresponding to the a-th characteristic pixel point is less consistent with the temperature change rule in the fire area, and the a-th characteristic pixel point is more likely to be a real edge pixel point, namely the edge probability corresponding to the a-th characteristic pixel point is larger.
By adopting the method, the edge probability corresponding to each characteristic pixel point can be obtained. The whole fire area can be divided into sub-areas of different categories, the categories comprise a fire head area, a fire tail area and a fire wing area, the edge characteristics of the sub-areas of the different categories are different, the edge direction of the fire head area is closer to the wind direction, the edge direction of the fire wing area is closer to the vertical direction of the wind direction, and the edge direction of the fire tail area is opposite to the wind direction. Since the feature pixels exist in the preset neighborhood of each edge pixel in the initial abnormal region, the embodiment will screen these feature pixels, obtain the feature pixels that are more likely to be edge pixels, and record them as candidate pixels, where the candidate pixels are feature pixels whose edge probability is greater than the first probability threshold, that is, multiple candidate pixels are obtained, and the candidate pixels will be subsequently analyzed. The first probability threshold is set to 0.6 by the embodiment, and in a specific application, the implementer can set the probability threshold according to specific situations.
The fire area can be divided into three parts, namely a fire head, a fire wing and a fire tail, wherein the fire head is positioned at the front end of the fire area and is the part with the fastest flame forward extension and the strongest fire, and the direction of the fire head is consistent with the wind direction; the fire tail is positioned at the rear end of the fire, and the fire tail spreads against the wind, so that the speed is slowest and the strength is minimum; the fire wings are positioned at two sides of a fire area and vertically extend with the wind direction, the speed is between the fire head and the fire tail, the part which is closer to the fire head is faster in extension and higher in strength, and the part which is closer to the fire tail is slower in extension and lower in strength. Because edge pixel points of different parts of a fire area have different characteristics, candidate pixel points are firstly classified in the embodiment, and a wind speed vector of a forest monitoring area is acquired in real time through a wind speed sensor pre-installed in an air-space-ground-person system, and two vectors perpendicular to the wind speed vector are acquired at the same time.
This embodiment has already obtained the straight lines corresponding to all the feature pixel points, because the candidate pixel points are a part of the feature pixel pointsTherefore, a straight line corresponding to each candidate pixel point is also obtained, and the description will be given by taking the r-th candidate pixel point as an example. First, a first angle threshold is set to
Figure 836659DEST_PATH_IMAGE036
The second angle threshold is
Figure 460539DEST_PATH_IMAGE037
In a specific application, the implementer can set the conditions according to specific situations; for the r-th candidate pixel point: obtaining the minimum included angle between the straight line corresponding to the candidate pixel point and the vector vertical to the wind speed, and recording the minimum included angle as
Figure 136371DEST_PATH_IMAGE038
When is coming into contact with
Figure 190433DEST_PATH_IMAGE039
If so, judging the candidate pixel point as a fire wing pixel point, otherwise, indicating the candidate pixel point as a fire head pixel point or a fire tail pixel point, further analyzing the candidate pixel point, acquiring the included angle between the vector of the candidate pixel point to the clustering center of the initial abnormal area and the wind speed vector, and recording the included angle as the included angle
Figure 318926DEST_PATH_IMAGE040
When is coming into contact with
Figure 480917DEST_PATH_IMAGE041
And if not, judging the candidate pixel point to be a fire tail pixel point. By adopting the method, all the candidate pixel points are judged, and are classified into three types, namely the fire wing pixel points, the fire head pixel points and the fire tail pixel points.
Next, the embodiment determines whether the candidate pixel belongs to the edge pixel of the corresponding category based on the category corresponding to the candidate pixel, so as to obtain an accurate fire area. For the v-th candidate pixel point, acquiring a straight line between the v-th candidate pixel point and the pixel point with the largest temperature difference in the preset neighborhood and the v-th candidate pixel point and an included angle in the horizontal positive direction, and recording the included angle as a first included angle, if the v-th candidate pixel point is a fire head pixel point, calculating the confidence coefficient corresponding to the candidate pixel point according to the difference between the first included angle and the wind direction and the temperature difference between the v-th candidate pixel point and the actual fire head edge point, wherein the specific calculation formula is as follows:
Figure 11255DEST_PATH_IMAGE001
wherein the content of the first and second substances,
Figure 498869DEST_PATH_IMAGE002
for the confidence of the v-th candidate pixel point correspondence,
Figure 114658DEST_PATH_IMAGE003
is the included angle between the straight line and the horizontal positive direction formed by the pixel point with the maximum temperature difference between the v-th candidate pixel point and the preset neighborhood and the v-th candidate pixel point,
Figure 80340DEST_PATH_IMAGE004
is the wind direction angle, and the wind direction angle,
Figure 465185DEST_PATH_IMAGE005
the number of the actual edge pixel points of the fire head area which is judged already,
Figure 123699DEST_PATH_IMAGE006
for the temperature of the v-th candidate pixel point,
Figure 161538DEST_PATH_IMAGE007
is a first
Figure 930911DEST_PATH_IMAGE008
The temperature of the pixel points at the actual edge of each fire head,
Figure 170262DEST_PATH_IMAGE009
for the v-th candidate pixel point and the judged fire head region
Figure 999678DEST_PATH_IMAGE008
The Euclidean distance of each actual edge pixel point,
Figure 324480DEST_PATH_IMAGE010
the maximum Euclidean distance between the v-th candidate pixel point and the actual edge pixel point of the judged fire head region,
Figure 897544DEST_PATH_IMAGE003
characterizing the first angle.
When the temperature is higher than the set temperature
Figure 256981DEST_PATH_IMAGE005
When the value of (a) is 0, the calculation formula of the confidence coefficient
Figure 991719DEST_PATH_IMAGE042
. In the embodiment, the confidence that the v-th candidate pixel point is the actual fire head edge pixel point is represented by the difference between the direction and the wind direction corresponding to the v-th candidate pixel point and the temperature difference between the v-th candidate pixel point and the actual fire head edge point, and the smaller the difference is, the more likely the v-th candidate pixel point is the actual fire head edge pixel point, and the greater the confidence corresponding to the v-th candidate pixel point is.
If the v-th candidate pixel point is a fire wing pixel point, the corresponding confidence coefficient is calculated by adopting the confidence coefficient calculation formula, and only the confidence coefficient in the formula is used
Figure 69396DEST_PATH_IMAGE005
Replacing the number of the actual edge pixel points of the fire wing area with the number of the actual edge pixel points of the fire wing area which are already judged, and the specific calculation process is not repeated; if the v-th candidate pixel point is a fire tail pixel point, the confidence coefficient is still calculated by adopting the confidence coefficient calculation formula, and only the confidence coefficient in the formula is calculated
Figure 446151DEST_PATH_IMAGE005
Replacing the number of the actual edge pixel points of the judged fire tail area, and not excessively describing the specific calculation process; so far, by adopting the method, the method can be obtainedConfidence of all candidate pixel point correspondences.
The embodiment has obtained the edge probability and confidence of each candidate pixel point; if the actual edge pixel points are screened only based on the edge probability, the influence of the wind speed on the edge of the fire area is ignored, the fire area obtained by directly screening accords with the temperature characteristic distribution but does not accord with the shape characteristic of the fire, so that the extraction of the fire area is not accurate enough, and if the actual edge pixel points are screened only based on the edge characteristic (namely confidence), the obtained fire area is close to the actual fire area, but the temperature distribution does not accord with the fire characteristic enough, so that the extraction of the fire area is not accurate enough; therefore, in this embodiment, edge pixel points are screened by combining the edge probability and the confidence level, the product of the edge probability and the confidence level of each candidate pixel point is calculated, normalization processing is performed on the product, the normalization result is used as the target probability of the corresponding candidate pixel point, the larger the target probability is, the more likely the corresponding candidate pixel point is to be an actual edge pixel point, a second probability threshold is set, and when the target probability is greater than the second probability threshold, the corresponding candidate pixel point is determined to be an actual edge pixel point. The target probability not only considers the difference of neighborhood temperature change values and the average temperature difference characteristic of the edge pixel points, but also considers the direction characteristic of edge pixel points in different areas of a fire disaster, thereby improving the screening precision of the actual edge points. Thereby acquiring an accurate fire area. In this embodiment, the second probability threshold is set to 0.8, and in a specific application, an implementer may set the second probability threshold according to a specific situation.
By adopting the steps, the actual edge pixel points of the fire area are obtained, the closed area formed by combining the actual edge pixel points is recorded as the fire area, the area of the fire area in the temperature image to be analyzed is obtained, and the larger the area of the fire area is, the higher the danger degree is.
In this embodiment, a fire area in a temperature image to be analyzed is already obtained, and if the temperature difference between a pixel point in the fire area and a previous frame of temperature image is large, it indicates that the temperature change of the fire area is more obvious; if the distance between the pixel point in the fire area in the temperature image to be analyzed and the pixel point with the highest temperature is shorter, the reference weight of the temperature change of the corresponding pixel point is larger; based on this, the temperature change index is calculated according to the Euclidean distance between each pixel point in the fire area in the temperature image to be analyzed and the corresponding clustering center, the temperature of each pixel point in the fire area in the temperature image to be analyzed, and the temperature of the pixel point in the reference temperature image, which is the same as the position of each pixel point in the fire area in the temperature image to be analyzed, namely:
Figure 660094DEST_PATH_IMAGE013
wherein the content of the first and second substances,
Figure 822524DEST_PATH_IMAGE014
as an index of the temperature change,
Figure 121918DEST_PATH_IMAGE015
for the number of pixel points in the fire area in the temperature image to be analyzed,
Figure 771205DEST_PATH_IMAGE016
is the Euclidean distance between the ith pixel point in the fire area in the temperature image to be analyzed and the clustering center of the initial abnormal area,
Figure 105235DEST_PATH_IMAGE017
the maximum value of the Euclidean distances between all pixel points in the fire area and the clustering center of the initial abnormal area in the temperature image to be analyzed,
Figure 447354DEST_PATH_IMAGE018
for the temperature of the ith pixel point in the fire area in the temperature image to be analyzed,
Figure 234045DEST_PATH_IMAGE019
the temperature of a pixel point in the reference temperature image, which is the same as the ith pixel point in the fire area in the temperature image to be analyzed,
Figure 687023DEST_PATH_IMAGE011
the absolute value is taken.
Figure 875559DEST_PATH_IMAGE043
Representing the reference weight corresponding to the ith pixel point in the fire area in the temperature image to be analyzed, wherein the closer the distance between the ith pixel point and the corresponding clustering center is, the larger the corresponding reference weight is;
Figure 388579DEST_PATH_IMAGE044
representing the temperature difference of the same position in the two adjacent frames of temperature images; when the distance between a pixel point in a fire area and a clustering center of an initial abnormal area in a temperature image to be analyzed is shorter and the temperature difference of the pixel point at the same position in the fire area in two frames of temperature images is larger, the more the temperature change of the fire area is, namely the temperature change index is larger; when the distance between a pixel point in a fire area and a corresponding clustering center in a temperature image to be analyzed is longer and the temperature difference between the pixel point at the same position in the fire area in the front and back two frames of temperature images is smaller, the smaller the temperature change of the fire area is, namely the smaller the temperature change index is.
Thus, the temperature change index is obtained by analyzing the temperature image to be analyzed and the reference temperature image.
S3, acquiring matched pixel points of each edge pixel point in the gray level image to be analyzed in the reference gray level image; obtaining a possibility index of each edge pixel point based on the gray value, the corresponding height, the corresponding motion direction angle, the gray value corresponding to the standard red and the wind direction angle of each edge pixel point in the gray image to be analyzed; obtaining smoke spreading speed and flame spreading speed according to the possibility index and the coordinates of the matched pixel points; and judging the early warning level based on the temperature change index, the smoke spreading speed and the flame spreading speed.
In step S2, each temperature image is analyzed, a fire area in the temperature image to be analyzed is obtained, and then the monitoring image is further analyzed to obtain information of the fire, such as information of a flame boundary, a smoke boundary, a spreading speed, and the like. Specifically, in this embodiment, a grayscale image corresponding to two temperature images analyzed in step S2 is taken as an example for explanation; firstly, detecting a gray image to be analyzed by using a canny operator to obtain edge pixel points in the gray image to be analyzed, regarding isolated edge pixel points as noise points, not analyzing, and dividing continuous edge pixel points in the gray image to be analyzed into three types, wherein the first type is interference edge pixel points, the second type is smoke edge pixel points, and the third type is flame edge pixel points. In this embodiment, the smoke region and the flame region need to be acquired, so that the continuous edge pixel points need to be further analyzed, and adaptive classification is performed to acquire the corresponding region. It should be noted that, in this embodiment, the edge pixels mentioned subsequently are all continuous edge pixels.
For the w-th edge pixel point in the gray image to be analyzed:
firstly, calculating initial possibility corresponding to the w-th edge pixel point according to the gray value of the w-th edge pixel point, wherein the initial possibility comprises smoke initial possibility and flame initial possibility, the smoke initial possibility represents the initial possibility that the corresponding pixel point is a smoke edge pixel point, and the flame initial possibility represents the initial possibility that the corresponding pixel point is a flame edge pixel point; considering that smoke generated during combustion in a forest area is heavy in color, gray values of smoke edge pixel points are small, smoke can float upwards above flame under the general condition, and therefore the edge pixel points above the smoke are more likely to be the smoke edge pixel points, and the edge pixel points below the smoke are more likely to be the flame edge pixel points; based on this, in this embodiment, the smoke initial probability corresponding to the edge pixel point and the flame initial probability corresponding to the edge pixel point are calculated according to the gray value of the edge pixel point, the height of the edge pixel point in the forest in the vertical direction, and the gray value corresponding to the standard red after graying, specifically, the gray value and the corresponding height of the edge pixel point are respectively normalized, and the difference between the constant 1 and the normalized gray value is calculated and is recorded as the first color difference corresponding to the edge pixel point; taking the product of the first color difference and the normalized height as the initial smoke possibility corresponding to the edge pixel point; calculating the difference value between the constant 1 and the normalized height, recording the difference value as a height difference, calculating the difference between the gray value of the edge pixel point and the gray value corresponding to the standard red, recording the difference value as a second color difference corresponding to the edge pixel point, and taking the natural constant as a base number and the value of an exponential function taking the negative second color difference as an index as a color difference index; taking the product of the height difference and the color difference index as the initial possibility of flame corresponding to the edge pixel point; the specific expressions of the initial probability of smoke corresponding to the edge pixel point and the initial probability of flame corresponding to the edge pixel point are as follows:
Figure 662566DEST_PATH_IMAGE045
Figure 184814DEST_PATH_IMAGE046
wherein the content of the first and second substances,
Figure 959348DEST_PATH_IMAGE047
the initial probability of smoke corresponding to the w-th edge pixel point,
Figure 908849DEST_PATH_IMAGE048
for the initial probability of flame corresponding to the w-th edge pixel point,
Figure 404553DEST_PATH_IMAGE049
is the gray value of the w-th edge pixel point,
Figure 464912DEST_PATH_IMAGE050
is the gray value corresponding to the standard red color,
Figure 362461DEST_PATH_IMAGE051
the height corresponding to the w-th edge pixel point,
Figure 482864DEST_PATH_IMAGE052
the maximum height of all edge pixel points in the gray level image to be analyzed in the vertical direction is obtained,
Figure 731443DEST_PATH_IMAGE053
is a natural constant and is a natural constant,
Figure 329914DEST_PATH_IMAGE011
in order to take the absolute value of the value,
Figure 81970DEST_PATH_IMAGE054
for the first color difference corresponding to the w-th edge pixel point,
Figure 638853DEST_PATH_IMAGE055
in order to be the difference in height,
Figure 109149DEST_PATH_IMAGE056
for the second color difference corresponding to the w-th edge pixel point,
Figure 248662DEST_PATH_IMAGE057
is an indicator of color difference.
Figure 120803DEST_PATH_IMAGE058
Indicating that the gray value of the w-th edge pixel point is normalized,
Figure 583008DEST_PATH_IMAGE059
expressing the height of the w-th edge pixel point in the vertical direction to be subjected to normalization processing; when the gray value of the w-th edge pixel point is smaller and the height of the w-th edge pixel point in the vertical direction is higher, it is indicated that the w-th edge pixel point is more likely to be a smoke edge pixel point, namely the smoke initial possibility corresponding to the w-th edge pixel point is higher; when the height of the w-th edge pixel point in the vertical direction is lower and the gray value of the w-th edge pixel point is closer to the gray value corresponding to the standard red, the w-th edge pixel point is more likely to be a flame edge pixel point, namely the w-th edge pixel pointThe greater the likelihood of a point correspondence of flame initiation.
By adopting the method, the smoke initial possibility corresponding to each edge pixel point and the flame initial possibility corresponding to each edge pixel point can be obtained.
Because when a forest is in fire, the generated smoke and flame can move along with the wind direction, so that the direction of the edge of the flame and the direction of the edge of the smoke have higher similarity with the wind direction, the wind direction angle is obtained in real time through the wind speed sensor, and then the initial possibility is optimized based on the wind direction angle, so that more accurate possibility indexes are obtained, and the reliability of early warning is improved. It should be noted that the wind direction angle obtained in this embodiment is an included angle between the direction of the natural wind and the horizontal positive direction.
For the w-th edge pixel point in the gray image to be analyzed:
acquiring pixel points in a preset neighborhood of the w-th edge pixel point, wherein the pixel points in the preset neighborhood of the w-th edge pixel point necessarily contain other edge pixel points because the edge pixel points are continuous, calculating the average value of the angles of the w-th edge pixel point and the edge pixel points in the preset neighborhood, and recording the average value as the angle of the straight line formed by the w-th edge pixel point and the edge pixel points in the preset neighborhood
Figure 540600DEST_PATH_IMAGE060
(ii) a In the present embodiment, the angle of the straight line is an angle formed by the straight line and the positive horizontal direction; the specific calculation formula is as follows:
Figure 480874DEST_PATH_IMAGE061
wherein, in the step (A),
Figure 473101DEST_PATH_IMAGE062
the number of the inner edge pixel points in the preset neighborhood of the w-th edge pixel point,
Figure 840628DEST_PATH_IMAGE063
is the coordinate information of the w-th edge pixel point,
Figure 285516DEST_PATH_IMAGE064
is the abscissa of the w-th edge pixel,
Figure 29481DEST_PATH_IMAGE065
is the ordinate of the w-th edge pixel point,
Figure 876214DEST_PATH_IMAGE066
for the w-th edge pixel point within the preset neighborhood
Figure 945801DEST_PATH_IMAGE067
The coordinate information of the pixel points of each edge,
Figure 612406DEST_PATH_IMAGE068
for the w-th edge pixel point within the preset neighborhood
Figure 157132DEST_PATH_IMAGE067
The abscissa of the pixel point of each edge,
Figure 858372DEST_PATH_IMAGE069
for the w-th edge pixel point within the preset neighborhood
Figure 833281DEST_PATH_IMAGE067
The vertical coordinate of the pixel points of each edge,
Figure 518341DEST_PATH_IMAGE070
is an arctangent function. In this embodiment, the size of the preset neighborhood is 5 × 5, and in a specific application, an implementer may set the size according to a specific situation.
When a forest fires, smoke and flame can move along with the wind direction, the direction of the edge of the flame and the direction of the edge of the smoke have high similarity with the wind direction, and therefore if the average value of the angles of straight lines formed by the edge pixel points and the edge pixel points in the preset adjacent domain of the edge pixel points is closer to the wind direction at the current moment, the more likely the corresponding edge pixel points are flame edge pixel points or smoke edge pixel points. Based on the method, a first optimization coefficient is obtained according to the angle average value, and the smoke initial possibility and the flame initial possibility corresponding to the w-th edge pixel point are optimized based on the optimization coefficient; specifically, the average value of the angles of straight lines formed by the w-th edge pixel point and edge pixel points in a preset neighborhood is recorded as a first angle average value, the absolute value of the difference value between the first angle average value and a wind direction angle is calculated and recorded as a first direction difference, the sum of the first direction difference and an adjusting parameter is calculated and recorded as a direction index, and the ratio of a constant 1 to the direction index is used as a first optimization coefficient corresponding to the w-th edge pixel point; the specific expression of the first optimization coefficient corresponding to the w-th edge pixel point is as follows:
Figure 604108DEST_PATH_IMAGE071
wherein the content of the first and second substances,
Figure 159855DEST_PATH_IMAGE072
a first optimization coefficient corresponding to the w-th edge pixel point,
Figure 305665DEST_PATH_IMAGE060
is the average value of the angle of the straight line formed by the w-th edge pixel point and the edge pixel point in the preset neighborhood,
Figure 212441DEST_PATH_IMAGE073
is the wind direction angle, and the wind direction angle,
Figure 101900DEST_PATH_IMAGE032
in order to adjust the parameters of the device,
Figure 777732DEST_PATH_IMAGE074
in order to be the difference in direction,
Figure 94444DEST_PATH_IMAGE075
is a direction indicator.
Figure 488516DEST_PATH_IMAGE060
The difference in the first direction is characterized, and the adjustment parameter is introduced to prevent the denominator from being 0, and this embodiment is designed toSetting of parameters
Figure 930735DEST_PATH_IMAGE032
The value of (b) is 0.01, which can be set by the practitioner in a particular application, depending on the particular situation;
Figure 726652DEST_PATH_IMAGE074
representing the difference between the average angle of straight lines formed by the w-th edge pixel point and the edge pixel points in the preset adjacent area and the wind direction angle at the current moment; when a fire disaster occurs in a forest, the generated smoke and flame can move along with the wind direction, so that when the difference between the average angle of a straight line formed by the w-th edge pixel and the edge pixel in the preset neighborhood and the wind direction angle at the current moment is smaller, the w-th edge pixel is more accordant with the wind direction, namely the probability that the w-th edge pixel is a flame edge pixel or a smoke edge pixel is higher, and the first optimization coefficient corresponding to the w-th edge pixel is higher; when the difference between the average angle of the straight line formed by the w-th edge pixel point and the edge pixel point in the preset neighborhood and the wind direction angle at the current moment is larger, the w-th edge pixel point is not in accordance with the wind direction, namely the probability that the w-th edge pixel point is an interference pixel point is larger, and the first optimization coefficient corresponding to the w-th edge pixel point is smaller.
By adopting the method, the first optimization coefficient corresponding to each edge pixel point can be obtained, if the initial possibility corresponding to each edge pixel point is optimized only based on the first optimization coefficient, namely the category of the edge pixel point is judged directly based on the optimized possibility, a phenomenon that a small part of the edge pixel points are misjudged still exists, the judgment precision is not high enough, in order to further improve the judgment precision of the edge pixel points, the edge pixel points are analyzed based on continuous multi-frame gray scale, the embodiment takes two adjacent frames of gray scale images as an example to explain, namely, the gray scale image to be analyzed and the reference gray scale image as an example to explain.
In this embodiment, the motion direction angle of the edge pixel point in the gray image is obtained by a frame difference method, and for the w-th edge pixel in the gray image to be analyzedPoint: obtaining the coordinate information of the edge pixel point in the gray level image to be analyzed
Figure 214265DEST_PATH_IMAGE063
Obtaining the pixel point with the same position as the edge pixel point in the reference gray level image and marking as the first
Figure 830055DEST_PATH_IMAGE033
Each pixel point, calculating the gray value of the w-th edge pixel point and the w-th edge pixel point
Figure 326895DEST_PATH_IMAGE033
The absolute value of the difference value of the gray values of the pixel points is used as the w-th edge pixel point and the w-th edge pixel point
Figure 711740DEST_PATH_IMAGE033
If the gray difference of each pixel point is 0, the pixel point at the position is a static pixel point, and if the gray difference is not 0, the first pixel point is obtained
Figure 635834DEST_PATH_IMAGE033
The pixels in the preset neighborhood of each pixel are then aligned
Figure 738919DEST_PATH_IMAGE033
Each pixel point in the preset neighborhood of each pixel point is analyzed from the first
Figure 773871DEST_PATH_IMAGE033
Selecting a matching pixel point of a w-th edge pixel point in a preset neighborhood of the w-th pixel point, wherein the matching pixel point of the w-th edge pixel point is a pixel point of which the change position of the w-th edge pixel point along with time is moved; because the acquisition interval between two adjacent frames of gray-scale images is short, if
Figure 13222DEST_PATH_IMAGE033
If a certain pixel point in the preset neighborhood of the pixel point is a matched pixel point of the w-th edge pixel point, the gray values of the pixel point and the w-th edge pixel pointThe gray scales of the pixel points at the corresponding positions in the preset neighborhood of the pixel point and the preset neighborhood of the w-th edge pixel point are similar; based on this, for
Figure 842638DEST_PATH_IMAGE033
Calculating the optimal value corresponding to the pixel point according to the gray value of the pixel point, the gray value of each pixel point in the preset neighborhood of the pixel point, the gray value of the w-th edge pixel point and the gray value of each pixel point in the preset neighborhood of the w-th edge pixel point, wherein the specific calculation formula is as follows:
Figure 433019DEST_PATH_IMAGE076
wherein, the first and the second end of the pipe are connected with each other,
Figure 6083DEST_PATH_IMAGE077
a preferred value corresponding to the c-th pixel point,
Figure 362591DEST_PATH_IMAGE049
is the gray value of the w-th edge pixel point,
Figure 628487DEST_PATH_IMAGE078
is the gray value of the c-th pixel point,
Figure 706164DEST_PATH_IMAGE079
to preset the number of pixels in the neighborhood,
Figure 817340DEST_PATH_IMAGE080
is the gray value of the jth pixel point in the preset neighborhood of the c pixel point,
Figure 296863DEST_PATH_IMAGE081
is the gray value of the jth pixel point in the preset neighborhood of the w-th edge pixel point,
Figure 468081DEST_PATH_IMAGE012
is a function of taking the maximum value.
Figure 33055DEST_PATH_IMAGE082
Representing the gray difference between the c-th pixel point and the w-th edge pixel point,
Figure 947921DEST_PATH_IMAGE083
the representation carries out normalization processing on the difference;
Figure 547530DEST_PATH_IMAGE084
representing the gray difference of the jth pixel point in the preset neighborhood of the w-th edge pixel point and the jth pixel point in the preset neighborhood of the c-th pixel point; when the gray difference between the c-th pixel point and the w-th edge pixel point is smaller, and the gray difference between the pixel point in the preset neighborhood of the c-th pixel point and the pixel point in the preset neighborhood of the w-th edge pixel point is smaller, the probability that the c-th pixel point is a matching pixel point of the w-th edge pixel point is higher, namely the optimal value corresponding to the c-th pixel point is higher; when the gray difference between the c-th pixel point and the w-th edge pixel point is larger, and the gray difference between the pixel point in the preset neighborhood of the c-th pixel point and the pixel point in the preset neighborhood of the w-th edge pixel point is larger, the probability that the c-th pixel point is a matching pixel point of the w-th edge pixel point is smaller, namely the optimal value corresponding to the c-th pixel point is smaller.
By the above method, the first
Figure 889649DEST_PATH_IMAGE033
The greater the preferred value is, the greater the possibility that the corresponding pixel point is the matching pixel point of the w-th edge pixel point is, and the second edge pixel point is obtained
Figure 676340DEST_PATH_IMAGE033
The pixel point with the maximum preferred value in the preset neighborhood of the pixel point is used as a matching pixel point of the w-th edge pixel point; obtaining the motion direction angle corresponding to the w-th edge pixel point
Figure 660476DEST_PATH_IMAGE085
The specific calculation formula of the motion direction angle is
Figure 114591DEST_PATH_IMAGE086
Wherein, in the step (A),
Figure 630542DEST_PATH_IMAGE064
is the abscissa of the w-th edge pixel,
Figure 904528DEST_PATH_IMAGE065
is the ordinate of the w-th edge pixel point,
Figure 161197DEST_PATH_IMAGE087
the abscissa of the matched pixel point of the w-th edge pixel point,
Figure 469819DEST_PATH_IMAGE088
the vertical coordinate of the matched pixel point of the w-th edge pixel point; the closer the motion direction angle and the wind direction angle corresponding to the w-th edge pixel point are, the higher the probability that the edge pixel point is a smoke edge pixel point or a flame edge pixel point is, so that the second optimization coefficient corresponding to the w-th edge pixel point is determined according to the motion direction angle and the wind direction angle of the w-th edge pixel point in the embodiment, and specifically, the difference between the motion direction angle and the wind direction angle of the w-th edge pixel point is calculated and recorded as the second direction difference; normalizing the difference in the second direction, and taking the difference value between the constant 1 and the normalized difference in the second direction as a second optimization coefficient corresponding to the w-th edge pixel point; the specific expression of the second optimization coefficient corresponding to the w-th edge pixel point is as follows:
Figure 153741DEST_PATH_IMAGE089
wherein, the first and the second end of the pipe are connected with each other,
Figure 180603DEST_PATH_IMAGE090
second optimization coefficient corresponding to w-th edge pixel point,
Figure 975384DEST_PATH_IMAGE085
For the motion direction angle corresponding to the w-th edge pixel point,
Figure 404091DEST_PATH_IMAGE004
is the wind direction angle, and the wind direction angle,
Figure 258915DEST_PATH_IMAGE091
a difference in the second direction is characterized,
Figure 241914DEST_PATH_IMAGE012
is a function of the maximum value.
Figure 105965DEST_PATH_IMAGE091
Representing the difference between the motion direction angle corresponding to the w-th edge pixel point and the wind direction angle at the current moment,
Figure 389179DEST_PATH_IMAGE092
the representation carries out normalization processing on the difference; when the difference between the motion direction angle corresponding to the w-th edge pixel point and the wind direction angle at the current moment is smaller, the motion trend of the w-th edge pixel point is more consistent with the motion trend of natural wind, the probability that the w-th edge pixel point is a flame edge pixel point or a smoke edge pixel point is higher, namely the second optimization coefficient corresponding to the w-th edge pixel point is higher; when the difference between the motion direction angle corresponding to the w-th edge pixel point and the wind direction angle at the current moment is larger, the motion trend of the w-th edge pixel point is explained to be less suitable for the motion trend of natural wind, the probability that the w-th edge pixel point is a flame edge pixel point or a smoke edge pixel point is smaller, and namely, the second optimization coefficient corresponding to the w-th edge pixel point is smaller.
Thus, a first optimization coefficient and a second optimization coefficient corresponding to the w-th edge pixel point, and the smoke initial possibility and the flame initial possibility corresponding to the w-th edge pixel point are obtained; calculating a first optimization coefficient corresponding to the w-th edge pixel point, and a w-th edge pixel point pairThe product of the second optimization coefficient and the smoke initial possibility corresponding to the w-th edge pixel point, and the normalization processing is carried out on the product, and the normalization result is obtained
Figure 414904DEST_PATH_IMAGE093
As the smoke possibility index of the w-th edge pixel point; calculating the product of a first optimization coefficient corresponding to the w-th edge pixel point, a second optimization coefficient corresponding to the w-th edge pixel point and the initial possibility of flame corresponding to the w-th edge pixel point, normalizing the product, and normalizing the result
Figure 882269DEST_PATH_IMAGE094
As the flame possibility index of the w-th edge pixel point.
Setting a judgment threshold value
Figure 550011DEST_PATH_IMAGE095
If the smoke possibility index of the w-th edge pixel point
Figure 687731DEST_PATH_IMAGE093
And flame possibility index of w-th edge pixel point
Figure 884358DEST_PATH_IMAGE094
Are all greater than
Figure 107529DEST_PATH_IMAGE095
Then judge
Figure 313382DEST_PATH_IMAGE093
Whether or not greater than
Figure 305609DEST_PATH_IMAGE094
If the number of the edge pixels is larger than the preset value, determining that the w-th edge pixel is a smoke edge pixel, and if the number of the edge pixels is smaller than or equal to the preset value, determining that the w-th edge pixel is a flame edge pixel; if the smoke possibility index of the w-th edge pixel point
Figure 673136DEST_PATH_IMAGE093
Is greater than
Figure 383603DEST_PATH_IMAGE095
And the flame possibility index of the w-th edge pixel point
Figure 127568DEST_PATH_IMAGE094
Is less than or equal to
Figure 974302DEST_PATH_IMAGE095
If yes, judging the w-th edge pixel point as a smoke edge pixel point; if the flame possibility index of the w-th edge pixel point
Figure 43889DEST_PATH_IMAGE094
Is greater than
Figure 710493DEST_PATH_IMAGE095
And the smoke possibility index of the w-th edge pixel point
Figure 514940DEST_PATH_IMAGE093
Is less than or equal to
Figure 950600DEST_PATH_IMAGE095
Judging that the w-th edge pixel point is a flame edge pixel point; if the smoke possibility index of the w-th edge pixel point
Figure 191089DEST_PATH_IMAGE093
And flame possibility index of w-th edge pixel point
Figure 344989DEST_PATH_IMAGE094
Are all less than or equal to
Figure 696336DEST_PATH_IMAGE095
And judging the w-th edge pixel point as an interference edge pixel point. This example arrangement
Figure 252083DEST_PATH_IMAGE095
A value of 0.8, which in a particular application may be implemented according toSetting specific conditions.
By adopting the method, each edge pixel point in the gray image to be analyzed is judged to obtain the smoke edge pixel point and the flame edge pixel point, and then the smoke area and the flame area are obtained based on the smoke edge pixel point and the flame edge pixel point, so that the area of the smoke area and the area of the flame area in the gray image to be analyzed are obtained.
When a forest is in fire, the faster the flame and smoke spread, the higher the danger degree is; therefore, if a forest fires, the flame spread speed and the smoke spread speed need to be calculated, and each flame area or each smoke area in the gray level image to be analyzed is marked as a target area; for any target area in the gray image to be analyzed: if the distance between the edge pixel point of the target region and the corresponding matching pixel point in the reference gray-scale image is farther, it indicates that the target region is faster to spread, based on this, the embodiment calculates the spreading speed of the target region according to the euclidean distance between each edge pixel point of the target region and the corresponding matching pixel point and the acquisition time interval of the adjacent frame gray-scale image, that is:
Figure 663472DEST_PATH_IMAGE096
wherein the content of the first and second substances,
Figure 304669DEST_PATH_IMAGE097
is the propagation speed of the target area and,
Figure 459707DEST_PATH_IMAGE098
the number of edge pixels in the target area,
Figure 869960DEST_PATH_IMAGE099
is the Euclidean distance between the mth edge pixel point of the target area and the corresponding matching pixel point,
Figure 186672DEST_PATH_IMAGE100
is adjacent toAnd acquiring time interval of frame gray level images. The euclidean distance calculation formula is a well-known technique and will not be described in detail herein.
When the Euclidean distance between the edge pixel point of the target area and the corresponding matching pixel point is longer, the faster the target area spreads is shown; when the Euclidean distance between the edge pixel point of the target area and the corresponding matching pixel point is shorter, the slower the target area spreads. By adopting the method, the spreading speed of each target area can be obtained, namely the spreading speed of the flame area and the spreading speed of the smoke area are obtained, and the higher the spreading speed of the flame area is, the higher the spreading speed of the smoke area is, the higher the danger degree of the current fire is; considering that a plurality of smoke areas and a plurality of flame areas may appear at the same time, each smoke area and each flame area have corresponding propagation speeds, calculating the mean value of the propagation speeds of all the smoke areas in the gray-scale image to be analyzed, taking the mean value as the smoke propagation speed, and calculating the mean value of the propagation speeds of all the flame areas in the gray-scale image to be analyzed, and taking the mean value as the flame propagation speed.
In this embodiment, the temperature change index, the area of the fire area, the flame spread rate, the smoke spread rate, the area of the flame area, and the area of the smoke area are normalized respectively, and the average value after normalization of the six values is calculated as the current fire early warning index.
The larger the current fire early warning index is, the more serious the current fire is; setting a first early warning threshold value, a second early warning threshold value, a third early warning threshold value and a fourth early warning threshold value, wherein the first early warning threshold value, the second early warning threshold value, the third early warning threshold value and the fourth early warning threshold value are all values between 0 and 1, the first early warning threshold value is larger than the second early warning threshold value, the second early warning threshold value is larger than the third early warning threshold value, and the third early warning threshold value is larger than the fourth early warning threshold value; if the current fire early warning index is larger than a first early warning threshold value, a red early warning is sent out; if the current fire early warning index is larger than the second early warning threshold value and smaller than or equal to the first early warning threshold value, an orange early warning is sent out; if the current fire early warning index is larger than a third early warning threshold value and smaller than or equal to a second early warning threshold value, a yellow early warning is sent out; if the current fire early warning index is larger than the fourth early warning threshold and smaller than or equal to the third early warning threshold, a blue early warning is sent out; and if the current fire early warning index is less than or equal to the fourth early warning threshold value, no early warning is sent out. In this embodiment, the first early warning threshold is set to be 0.8, the second early warning threshold is set to be 0.6, the third early warning threshold is set to be 0.4, and the fourth early warning threshold is set to be 0.1.
Therefore, the monitoring and intelligent early warning of forest fires are completed, and once the system sends out the early warning, fire fighters carry out corresponding processing according to the early warning level, so that the economic loss is reduced, and the damage degree to the ecological environment is reduced.
In the embodiment, a forest is analyzed according to the temperature change condition, smoke information and flame information of a forest monitoring area, whether a fire disaster occurs in the forest monitoring area is judged in real time, once the forest fires, the faster the temperature change and the faster the spreading speed of the fire disaster area are, and the higher the danger degree is, so that when the forest monitoring area fires, the temperature change index and the fire spreading speed of the fire disaster area need to be judged in time, and the temperature image to be analyzed is analyzed according to the influence condition of the wind direction on the edge of the fire disaster area, so that the temperature change index is obtained; considering that the fire spreading speed can be determined by the smoke spreading speed and the flame spreading speed, the embodiment calculates the smoke spreading speed and the flame spreading speed according to the probability indexes of the edge pixel points and the coordinates of the matched pixel points, and when the smoke spreading speed and the flame spreading speed are both higher, the faster the fire spreading is, the higher the danger degree is; the embodiment analyzes the forest condition from multiple aspects, improves the fire identification precision, judges the early warning level according to the temperature change index, the smoke spreading speed and the flame spreading speed, completes the accurate identification and early warning of the fire, improves the early warning reliability and reduces the fire hazard. When the temperature change index is obtained, the pixels in the temperature image to be analyzed are clustered by adopting a traditional clustering algorithm to obtain an initial abnormal region, then the edge pixels in the initial abnormal region and the pixels around the edge pixels are judged according to the internal temperature change condition of the fire region, the accurate edge pixels in the fire region are obtained, the calculation precision of the temperature change index is further improved, and the early warning reliability of forest fire prevention can be effectively improved. When the smoke spreading speed and the flame spreading speed are obtained, the gray value of the flame edge pixel point is closer to the gray value corresponding to the standard red color and lower in height, and the gray value of the smoke edge pixel point is smaller and higher in height, considering that the color of the flame is close to the red color and closer to the ground, and the color of the smoke is darker and farther from the ground, and in addition, the smoke direction and the flame direction can be influenced by natural wind; based on the characteristics, the probability indexes of all edge pixel points in the gray level image to be analyzed are calculated, then the smoke spreading speed and the flame spreading speed are obtained according to the probability indexes and the coordinates of the matched pixel points, the edge pixel points are analyzed from multiple angles, and the calculation accuracy of the smoke spreading speed and the flame spreading speed is improved.

Claims (10)

1. An intelligent early warning monitoring management method for forest fire prevention is characterized by comprising the following steps:
acquiring two adjacent frames of temperature images and corresponding gray level images of a forest monitoring area, and respectively recording the temperature images as a reference temperature image, a temperature image to be analyzed, a reference gray level image and a gray level image to be analyzed;
clustering pixels in a temperature image to be analyzed to obtain an initial abnormal area, judging whether a fire disaster occurs or not according to the temperature of the pixels in the initial abnormal area, if so, marking each edge pixel of the initial abnormal area and pixels outside the initial abnormal area in a preset neighborhood as feature pixels, and obtaining a straight line corresponding to each feature pixel based on each feature pixel and a clustering center of the initial abnormal area; obtaining edge probability corresponding to each characteristic pixel point according to the gray value of the pixel point on the straight line, and obtaining candidate pixel points according to the edge probability; judging the category of each candidate pixel point according to the minimum included angle between a straight line corresponding to each candidate pixel point and a vector vertical to the wind speed, the included angle between the vector of each candidate pixel point pointing to the clustering center of the initial abnormal area and the wind speed vector and the marginal probability; obtaining a temperature change index based on the category, the coordinate and the wind direction angle of each candidate pixel point and the temperature of the pixel point in the reference temperature image;
acquiring matched pixel points of edge pixel points in a gray level image to be analyzed in a reference gray level image; obtaining a possibility index of each edge pixel point based on the gray value, the corresponding height, the corresponding motion direction angle, the gray value corresponding to the standard red and the wind direction angle of each edge pixel point in the gray image to be analyzed; obtaining the smoke spreading speed and the flame spreading speed according to the possibility index and the coordinates of the matched pixel points; and judging the early warning level based on the temperature change index, the smoke spreading speed and the flame spreading speed.
2. The intelligent early warning and monitoring management method for forest fire prevention according to claim 1, wherein the judging of the category of each candidate pixel point according to the minimum included angle between a straight line corresponding to each candidate pixel point and a vector perpendicular to the wind speed, the included angle between a vector pointing to a clustering center of an initial abnormal area by each candidate pixel point and the wind speed vector and the edge probability comprises:
for any candidate pixel: judging whether the minimum included angle between a straight line corresponding to the candidate pixel point and a vector perpendicular to the wind speed is smaller than or equal to a first angle threshold value or not, and if the minimum included angle is smaller than or equal to the first angle threshold value, judging that the candidate pixel point is a fire wing pixel point; if the wind speed vector is larger than the first angle threshold, judging whether an included angle between the vector of the candidate pixel point pointing to the clustering center of the initial abnormal area and the wind speed vector is smaller than or equal to a second angle threshold, if the included angle is smaller than or equal to the second angle threshold, judging that the candidate pixel point is a fire head pixel point, and if the included angle is larger than the first angle threshold, judging that the candidate pixel point is a fire tail pixel point.
3. The intelligent early warning and monitoring management method for forest fire prevention according to claim 1, wherein the obtaining of the temperature change index based on the category, the coordinate, the wind direction angle of each candidate pixel and the temperature of the pixel in the reference temperature image comprises:
for any candidate pixel: acquiring an included angle between a straight line formed by the candidate pixel point and the candidate pixel point with the maximum temperature difference in the preset neighborhood and the horizontal positive direction, and recording the included angle as a first included angle; obtaining the confidence corresponding to the candidate pixel point according to the temperature, the coordinate, the first included angle and the wind direction angle of the actual edge pixel point obtained in the category corresponding to the candidate pixel point;
obtaining edge pixel points of the fire area based on the confidence and the edge probability corresponding to each candidate pixel point; obtaining a fire area based on edge pixel points of the fire area; and calculating a temperature change index according to the Euclidean distance between the pixel point in the fire area and the clustering center of the initial abnormal area, the temperature of the pixel point in the fire area and the temperature of the corresponding pixel point in the reference temperature image.
4. The intelligent early warning and monitoring management method for forest fire prevention according to claim 3, wherein the obtaining of the confidence corresponding to each candidate pixel point comprises:
for the v-th candidate pixel point:
if the candidate pixel point is a fire point, calculating the confidence corresponding to the candidate pixel point by adopting the following formula:
Figure 63285DEST_PATH_IMAGE001
wherein the content of the first and second substances,
Figure 277229DEST_PATH_IMAGE002
for the confidence of the v-th candidate pixel point correspondence,
Figure 448447DEST_PATH_IMAGE003
is the included angle between the straight line and the horizontal positive direction formed by the pixel point with the maximum temperature difference between the v-th candidate pixel point and the preset neighborhood and the v-th candidate pixel point,
Figure 482262DEST_PATH_IMAGE004
is the wind direction angle, and the wind direction angle,
Figure 397128DEST_PATH_IMAGE005
the number of the actual edge pixel points of the fire head area which is judged already,
Figure 465578DEST_PATH_IMAGE006
the temperature of the v-th candidate pixel point,
Figure 807698DEST_PATH_IMAGE007
is a first
Figure 328809DEST_PATH_IMAGE008
The temperature of the pixel points at the actual edge of each fire head,
Figure 47367DEST_PATH_IMAGE009
for the v-th candidate pixel point and the judged fire head region
Figure 973253DEST_PATH_IMAGE008
The Euclidean distance of each actual edge pixel point,
Figure 486274DEST_PATH_IMAGE010
the maximum Euclidean distance between the v-th candidate pixel point and the actual edge pixel point of the judged fire head region,
Figure 760260DEST_PATH_IMAGE011
in order to take the absolute value of the value,
Figure 16929DEST_PATH_IMAGE012
is a function of taking the maximum value.
5. The intelligent early warning and monitoring management method for forest fire prevention according to claim 3, wherein the temperature change index is calculated by adopting the following formula:
Figure 794393DEST_PATH_IMAGE013
wherein, the first and the second end of the pipe are connected with each other,
Figure 478315DEST_PATH_IMAGE014
as an index of the temperature change,
Figure 239597DEST_PATH_IMAGE015
for the number of pixel points in the fire area in the temperature image to be analyzed,
Figure 768799DEST_PATH_IMAGE016
the Euclidean distance between the ith pixel point in the fire area and the clustering center of the initial abnormal area in the temperature image to be analyzed,
Figure 666348DEST_PATH_IMAGE017
the maximum value of the Euclidean distances between all pixel points in the fire area and the clustering center of the initial abnormal area in the temperature image to be analyzed,
Figure 786750DEST_PATH_IMAGE018
for the temperature of the ith pixel point in the fire area in the temperature image to be analyzed,
Figure 766820DEST_PATH_IMAGE019
the temperature of a pixel point in the reference temperature image, which is the same as the position of the ith pixel point in the fire area in the temperature image to be analyzed,
Figure 99713DEST_PATH_IMAGE011
the absolute value is taken.
6. The intelligent early warning and monitoring management method for forest fire prevention according to claim 1, wherein the obtaining of the edge probability corresponding to each characteristic pixel point according to the gray value of the pixel point on the straight line comprises:
for the a-th characteristic pixel point:
acquiring a preset number of sampling points closest to the characteristic pixel point on two sides of the characteristic pixel point on a straight line corresponding to the characteristic pixel point respectively; based on the gray value of each sampling point and the gray value of the characteristic pixel point, calculating the edge probability corresponding to the characteristic pixel point by adopting the following formula:
Figure 117347DEST_PATH_IMAGE020
wherein, the first and the second end of the pipe are connected with each other,
Figure 143072DEST_PATH_IMAGE021
the edge probability corresponding to the a-th feature pixel point,
Figure 347788DEST_PATH_IMAGE022
the number of the sampling points in the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point,
Figure 749951DEST_PATH_IMAGE023
the first abnormal region in the straight line corresponding to the a-th characteristic pixel point
Figure 622092DEST_PATH_IMAGE024
The temperature of each of the sample points is measured,
Figure 818718DEST_PATH_IMAGE025
the first abnormal region in the straight line corresponding to the a-th characteristic pixel point
Figure 776310DEST_PATH_IMAGE026
The temperature of each of the sample points is measured,
Figure 716584DEST_PATH_IMAGE027
is the first outside the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point
Figure 434442DEST_PATH_IMAGE024
The temperature of the individual sampling points is,
Figure 801970DEST_PATH_IMAGE028
is the first outside the initial abnormal region on the straight line corresponding to the a-th characteristic pixel point
Figure 246858DEST_PATH_IMAGE026
The temperature of the individual sampling points is,
Figure 459664DEST_PATH_IMAGE029
the temperature of the a-th characteristic pixel point,
Figure 306398DEST_PATH_IMAGE030
the average temperature of the a-th characteristic pixel point and all sampling points on the corresponding straight line is calculated,
Figure 844826DEST_PATH_IMAGE031
is an exponential function with a natural constant as a base number,
Figure 777010DEST_PATH_IMAGE011
in order to take the absolute value of the value,
Figure 59087DEST_PATH_IMAGE032
to adjust the parameters.
7. The intelligent early warning and monitoring management method for forest fire prevention according to claim 1, wherein the obtaining of the probability index of each edge pixel point based on the gray value, the corresponding height, the corresponding motion direction angle, the gray value corresponding to standard red and the wind direction angle of each edge pixel point in the gray image to be analyzed comprises:
for any edge pixel:
respectively carrying out normalization processing on the gray value of the edge pixel point and the corresponding height, calculating the difference value between a constant 1 and the normalized gray value, and recording as a first color difference corresponding to the edge pixel point; taking the product of the first color difference and the normalized height as the initial smoke possibility corresponding to the edge pixel point;
calculating the difference between the constant 1 and the normalized height, and recording the difference as a height difference; calculating the difference between the gray value of the edge pixel point and the gray value corresponding to the standard red, and recording as a second color difference corresponding to the edge pixel point; taking a natural constant as a base number, and taking a value of an exponential function taking the negative second color difference as an index as a color difference index; taking the product of the height difference and the color difference index as the initial possibility of flame corresponding to the edge pixel point;
and obtaining the probability index of the edge pixel point according to the initial probability of the smoke, the initial probability of the flame, the corresponding motion direction angle, the coordinate of the edge pixel point in the preset neighborhood and the wind direction angle corresponding to the edge pixel point.
8. The intelligent early warning and monitoring management method for forest fire prevention according to claim 7, wherein the possibility index of the edge pixel point is obtained according to the initial possibility of smoke, the initial possibility of flame, the corresponding movement direction angle, the coordinate of the edge pixel point in a preset neighborhood and the wind direction angle corresponding to the edge pixel point, and the method comprises the following steps:
calculating the average value of the angles of straight lines formed by the edge pixel points and the edge pixel points in the preset neighborhood, and recording the average value as a first angle average value; calculating the absolute value of the difference value between the first angle average value and the wind direction angle, recording the absolute value as a first direction difference, calculating the sum of the first direction difference and an adjusting parameter, recording the sum as a direction index, and taking the ratio of a constant 1 to the direction index as a first optimization coefficient corresponding to the w-th edge pixel point;
calculating the difference between the motion direction angle and the wind direction angle of the edge pixel point, and recording as a second direction difference; normalizing the second direction difference, and taking the difference value of the constant 1 and the normalized second direction difference as a second optimization coefficient corresponding to the edge pixel point;
obtaining a smoke possibility index of the edge pixel point according to the first optimization coefficient, the second optimization coefficient and the smoke initial possibility corresponding to the edge pixel point; obtaining a flame possibility index of the edge pixel point according to the first optimization coefficient, the second optimization coefficient and the flame initial possibility corresponding to the edge pixel point;
the likelihood indicators include a smoke likelihood indicator and a flame likelihood indicator.
9. The intelligent early warning monitoring management method for forest fire prevention according to claim 8, wherein the obtaining of the smoke spreading speed and the flame spreading speed according to the possibility index and the coordinates of the matched pixel points comprises:
for any edge pixel: if the smoke possibility index and the flame possibility index are both larger than the judgment threshold, judging whether the smoke possibility index is larger than the flame possibility index, if so, judging that the edge pixel point is a smoke edge pixel point, and if not, judging that the edge pixel point is a flame edge pixel point; if the smoke possibility index is larger than the judgment threshold and the flame possibility index is smaller than or equal to the judgment threshold, judging that the edge pixel point is a smoke edge pixel point; if the flame possibility index is larger than the judgment threshold and the smoke possibility index is smaller than or equal to the judgment threshold, judging the edge pixel point as a flame edge pixel point;
obtaining a smoke region and a flame region based on the smoke edge pixel points and the flame edge pixel points; and obtaining the smoke spreading speed and the flame spreading speed based on the Euclidean distance between each edge pixel point and the corresponding matching pixel point in the smoke area and the flame area and the acquisition time interval of the adjacent frame gray level images.
10. The intelligent early warning and monitoring management method for forest fire prevention according to claim 1, wherein the obtaining of matching pixel points of each edge pixel point in the gray image to be analyzed in the reference gray image comprises:
for the w-th edge pixel point in the gray image to be analyzed:
obtaining the pixel point with the same position as the edge pixel point in the reference gray level image and marking as the first
Figure 760327DEST_PATH_IMAGE033
Each pixel point, calculating the w-th edge pixel point and the w-th edge pixel point
Figure 735236DEST_PATH_IMAGE033
The gray difference of each pixel point is determined according to the second criterion if the gray difference is not 0
Figure 886207DEST_PATH_IMAGE033
The gray value of each pixel point in the preset neighborhood of each pixel point, the first
Figure 971975DEST_PATH_IMAGE033
Gray values of pixel points in the preset neighborhood of all pixel points in the preset neighborhood of the pixel points, gray values of the w-th edge pixel points and gray values of pixel points in the preset neighborhood of the w-th edge pixel points are calculated, and the gray values of the pixel points in the preset neighborhood of the w-th edge pixel points are calculated
Figure 996563DEST_PATH_IMAGE033
The corresponding optimal value of each pixel point in the preset neighborhood of each pixel point; will be first
Figure 142373DEST_PATH_IMAGE033
The pixel point with the maximum preferred value in the preset neighborhood of the pixel point is used as a matching pixel point of the w-th edge pixel point in the reference gray level image;
first, the
Figure 783570DEST_PATH_IMAGE033
The acquisition process of the corresponding preferred value of any pixel point in the preset neighborhood of each pixel point is as follows: calculating the pixel pointRecording the gray difference with the w-th edge pixel point as a first characteristic difference; calculating the gray difference of each pixel point in the preset neighborhood of the pixel point and the pixel point at the corresponding position in the preset neighborhood of the w-th edge pixel point, and recording the gray difference as a second characteristic difference; obtaining a corresponding optimal value of the pixel point based on the first characteristic difference and the second characteristic difference; the first characteristic difference and the second characteristic difference are both inversely related to preferred values.
CN202211704016.1A 2022-12-29 2022-12-29 Intelligent early warning monitoring management method for forest fire prevention Active CN115691026B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211704016.1A CN115691026B (en) 2022-12-29 2022-12-29 Intelligent early warning monitoring management method for forest fire prevention

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211704016.1A CN115691026B (en) 2022-12-29 2022-12-29 Intelligent early warning monitoring management method for forest fire prevention

Publications (2)

Publication Number Publication Date
CN115691026A true CN115691026A (en) 2023-02-03
CN115691026B CN115691026B (en) 2023-05-05

Family

ID=85055430

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211704016.1A Active CN115691026B (en) 2022-12-29 2022-12-29 Intelligent early warning monitoring management method for forest fire prevention

Country Status (1)

Country Link
CN (1) CN115691026B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115880285A (en) * 2023-02-07 2023-03-31 南通南铭电子有限公司 Method for identifying abnormality of lead-out wire of aluminum electrolytic capacitor
CN116204690A (en) * 2023-04-28 2023-06-02 泰力基业股份有限公司 Block terminal data transmission system with automatic fire extinguishing function
CN116311079A (en) * 2023-05-12 2023-06-23 探长信息技术(苏州)有限公司 Civil security engineering monitoring method based on computer vision
CN116433035A (en) * 2023-06-13 2023-07-14 中科数创(临沂)数字科技有限公司 Building electrical fire risk assessment prediction method based on artificial intelligence
CN116665136A (en) * 2023-07-31 2023-08-29 济宁长兴塑料助剂有限公司 Chemical production safety risk real-time monitoring system
CN116863253A (en) * 2023-09-05 2023-10-10 光谷技术有限公司 Operation and maintenance risk early warning method based on big data analysis
CN116993632A (en) * 2023-09-28 2023-11-03 威海广泰空港设备股份有限公司 Production fire early warning method based on machine vision
CN117058625A (en) * 2023-10-11 2023-11-14 济宁港航梁山港有限公司 Campus fire control remote monitoring system based on thing networking
CN117253144A (en) * 2023-09-07 2023-12-19 建研防火科技有限公司 Fire risk grading management and control method
CN117635922A (en) * 2023-12-06 2024-03-01 北京薇笑美网络科技有限公司 Quality identification method based on router network cable interface
CN117711127A (en) * 2023-11-08 2024-03-15 金舟消防工程(北京)股份有限公司 Fire safety supervision method and system
CN117765051A (en) * 2024-01-10 2024-03-26 济宁市市政园林养护中心 Afforestation maintenance monitoring and early warning system and method
CN117711127B (en) * 2023-11-08 2024-07-02 金舟消防工程(北京)股份有限公司 Fire safety supervision method and system

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0984413A2 (en) * 1998-09-01 2000-03-08 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and system for automatic forest fire recognition
JP2004109105A (en) * 2002-07-23 2004-04-08 Jfe Steel Kk Flaw type classification boundary setting method in surface defect detection, and defect detection method
CN104573713A (en) * 2014-12-31 2015-04-29 天津弘源慧能科技有限公司 Mutual inductor infrared image recognition method based on image textual features
CN106920358A (en) * 2017-04-14 2017-07-04 国网福建省电力有限公司 Forest fire accident alarm method based on power transmission network
US20170251013A1 (en) * 2016-02-26 2017-08-31 Oracle International Corporation Techniques for discovering and managing security of applications
AU2020101011A4 (en) * 2019-06-26 2020-07-23 Zhejiang University Method for identifying concrete cracks based on yolov3 deep learning model
AU2020102091A4 (en) * 2019-10-17 2020-10-08 Wuhan University Of Science And Technology Intelligent steel slag detection method and system based on convolutional neural network
CN112348419A (en) * 2021-01-05 2021-02-09 光谷技术有限公司 Internet of things processing system and method
CN112435427A (en) * 2020-11-12 2021-03-02 光谷技术股份公司 Forest fire monitoring system and method
CN113375730A (en) * 2021-07-11 2021-09-10 昆山广翔昌智能信息科技有限公司 System for monitoring living water environment of crabs by long-time and short-time memory neural network
US20210312197A1 (en) * 2019-11-21 2021-10-07 Dalian University Of Technology Grid map obstacle detection method fusing probability and height information
US11201890B1 (en) * 2019-03-29 2021-12-14 Mandiant, Inc. System and method for adaptive graphical depiction and selective remediation of cybersecurity threats
CN113888353A (en) * 2021-09-29 2022-01-04 华能(浙江)能源开发有限公司清洁能源分公司 Energy efficiency diagnosis method, system and medium for distributed photovoltaic power generation equipment
CN114090352A (en) * 2021-10-27 2022-02-25 中国华能集团清洁能源技术研究院有限公司 Method and device for diagnosing abnormal unit energy efficiency and storage medium
US11403860B1 (en) * 2022-04-06 2022-08-02 Ecotron Corporation Multi-sensor object detection fusion system and method using point cloud projection

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0984413A2 (en) * 1998-09-01 2000-03-08 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and system for automatic forest fire recognition
JP2004109105A (en) * 2002-07-23 2004-04-08 Jfe Steel Kk Flaw type classification boundary setting method in surface defect detection, and defect detection method
CN104573713A (en) * 2014-12-31 2015-04-29 天津弘源慧能科技有限公司 Mutual inductor infrared image recognition method based on image textual features
US20170251013A1 (en) * 2016-02-26 2017-08-31 Oracle International Corporation Techniques for discovering and managing security of applications
CN106920358A (en) * 2017-04-14 2017-07-04 国网福建省电力有限公司 Forest fire accident alarm method based on power transmission network
US11201890B1 (en) * 2019-03-29 2021-12-14 Mandiant, Inc. System and method for adaptive graphical depiction and selective remediation of cybersecurity threats
AU2020101011A4 (en) * 2019-06-26 2020-07-23 Zhejiang University Method for identifying concrete cracks based on yolov3 deep learning model
AU2020102091A4 (en) * 2019-10-17 2020-10-08 Wuhan University Of Science And Technology Intelligent steel slag detection method and system based on convolutional neural network
US20210312197A1 (en) * 2019-11-21 2021-10-07 Dalian University Of Technology Grid map obstacle detection method fusing probability and height information
CN112435427A (en) * 2020-11-12 2021-03-02 光谷技术股份公司 Forest fire monitoring system and method
CN112348419A (en) * 2021-01-05 2021-02-09 光谷技术有限公司 Internet of things processing system and method
CN113375730A (en) * 2021-07-11 2021-09-10 昆山广翔昌智能信息科技有限公司 System for monitoring living water environment of crabs by long-time and short-time memory neural network
CN113888353A (en) * 2021-09-29 2022-01-04 华能(浙江)能源开发有限公司清洁能源分公司 Energy efficiency diagnosis method, system and medium for distributed photovoltaic power generation equipment
CN114090352A (en) * 2021-10-27 2022-02-25 中国华能集团清洁能源技术研究院有限公司 Method and device for diagnosing abnormal unit energy efficiency and storage medium
US11403860B1 (en) * 2022-04-06 2022-08-02 Ecotron Corporation Multi-sensor object detection fusion system and method using point cloud projection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘俊、张文风: "基于YOLOv3算法的高速公路火灾检测" *
袁传武等: "国有林场森林防火和资源监管"空天地人"四位一体监测系统建设" *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115880285A (en) * 2023-02-07 2023-03-31 南通南铭电子有限公司 Method for identifying abnormality of lead-out wire of aluminum electrolytic capacitor
CN116204690A (en) * 2023-04-28 2023-06-02 泰力基业股份有限公司 Block terminal data transmission system with automatic fire extinguishing function
CN116204690B (en) * 2023-04-28 2023-07-18 泰力基业股份有限公司 Block terminal data transmission system with automatic fire extinguishing function
CN116311079B (en) * 2023-05-12 2023-09-01 探长信息技术(苏州)有限公司 Civil security engineering monitoring method based on computer vision
CN116311079A (en) * 2023-05-12 2023-06-23 探长信息技术(苏州)有限公司 Civil security engineering monitoring method based on computer vision
CN116433035A (en) * 2023-06-13 2023-07-14 中科数创(临沂)数字科技有限公司 Building electrical fire risk assessment prediction method based on artificial intelligence
CN116433035B (en) * 2023-06-13 2023-09-15 中科数创(临沂)数字科技有限公司 Building electrical fire risk assessment prediction method based on artificial intelligence
CN116665136A (en) * 2023-07-31 2023-08-29 济宁长兴塑料助剂有限公司 Chemical production safety risk real-time monitoring system
CN116665136B (en) * 2023-07-31 2023-10-31 山东长兴塑料助剂股份有限公司 Chemical production safety risk real-time monitoring system
CN116863253A (en) * 2023-09-05 2023-10-10 光谷技术有限公司 Operation and maintenance risk early warning method based on big data analysis
CN116863253B (en) * 2023-09-05 2023-11-17 光谷技术有限公司 Operation and maintenance risk early warning method based on big data analysis
CN117253144A (en) * 2023-09-07 2023-12-19 建研防火科技有限公司 Fire risk grading management and control method
CN117253144B (en) * 2023-09-07 2024-04-12 建研防火科技有限公司 Fire risk grading management and control method
CN116993632A (en) * 2023-09-28 2023-11-03 威海广泰空港设备股份有限公司 Production fire early warning method based on machine vision
CN116993632B (en) * 2023-09-28 2023-12-19 威海广泰空港设备股份有限公司 Production fire early warning method based on machine vision
CN117058625B (en) * 2023-10-11 2024-01-16 济宁港航梁山港有限公司 Campus fire control remote monitoring system based on thing networking
CN117058625A (en) * 2023-10-11 2023-11-14 济宁港航梁山港有限公司 Campus fire control remote monitoring system based on thing networking
CN117711127A (en) * 2023-11-08 2024-03-15 金舟消防工程(北京)股份有限公司 Fire safety supervision method and system
CN117711127B (en) * 2023-11-08 2024-07-02 金舟消防工程(北京)股份有限公司 Fire safety supervision method and system
CN117635922A (en) * 2023-12-06 2024-03-01 北京薇笑美网络科技有限公司 Quality identification method based on router network cable interface
CN117765051A (en) * 2024-01-10 2024-03-26 济宁市市政园林养护中心 Afforestation maintenance monitoring and early warning system and method
CN117765051B (en) * 2024-01-10 2024-06-07 济宁市市政园林养护中心 Afforestation maintenance monitoring and early warning system and method

Also Published As

Publication number Publication date
CN115691026B (en) 2023-05-05

Similar Documents

Publication Publication Date Title
CN115691026A (en) Intelligent early warning monitoring management method for forest fire prevention
CN110516609B (en) Fire disaster video detection and early warning method based on image multi-feature fusion
CN107085714B (en) Forest fire detection method based on video
CN107944359A (en) Flame detecting method based on video
CN112069975A (en) Comprehensive flame detection method based on ultraviolet, infrared and vision
CN108765470A (en) One kind being directed to the improved KCF track algorithms of target occlusion
CN108960142B (en) Pedestrian re-identification method based on global feature loss function
CN111611907A (en) Image-enhanced infrared target detection method
CN115841488B (en) PCB hole inspection method based on computer vision
CN111667655A (en) Infrared image-based high-speed railway safety area intrusion alarm device and method
CN109034038B (en) Fire identification device based on multi-feature fusion
CN115294109A (en) Real wood board production defect identification system based on artificial intelligence, and electronic equipment
CN116503912B (en) Security check early warning method based on electronic graph bag
CN113076899B (en) High-voltage transmission line foreign matter detection method based on target tracking algorithm
CN115049955A (en) Fire detection analysis method and device based on video analysis technology
CN114120181A (en) Fire monitoring system and method based on video identification
CN117237747B (en) Hardware defect classification and identification method based on artificial intelligence
CN107704818A (en) A kind of fire detection system based on video image
CN110660187B (en) Forest fire alarm monitoring system based on edge calculation
CN116630332B (en) PVC plastic pipe orifice defect detection method based on image processing
CN116257651B (en) Intelligent monitoring system for abnormal sound of through channel cab apron
CN113221603A (en) Method and device for detecting shielding of monitoring equipment by foreign matters
CN108960181A (en) Black smoke vehicle detection method based on multiple dimensioned piecemeal LBP and Hidden Markov Model
CN114662594B (en) Target feature recognition analysis system
CN114283367B (en) Artificial intelligent open fire detection method and system for garden fire early warning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant