CN115356782B - Night ground fire automatic identification method, system, electronic equipment and medium - Google Patents
Night ground fire automatic identification method, system, electronic equipment and medium Download PDFInfo
- Publication number
- CN115356782B CN115356782B CN202211141249.5A CN202211141249A CN115356782B CN 115356782 B CN115356782 B CN 115356782B CN 202211141249 A CN202211141249 A CN 202211141249A CN 115356782 B CN115356782 B CN 115356782B
- Authority
- CN
- China
- Prior art keywords
- pixel
- fire
- data
- wave band
- condition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V8/00—Prospecting or detecting by optical means
- G01V8/10—Detecting, e.g. by using light barriers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
Abstract
The invention discloses a method, system, electronic device and medium for automatically identifying night-time ground fires, relating to the technical fields of image recognition and computer vision. The method comprises: acquiring low-light observation image data output by a satellite low-light imager, the low-light observation image data comprising low-light observation data for a plurality of pixels; performing night pixel identification, land pixel identification and cloud pixel removal on the low-light observation image data to obtain preliminary low-light image data; screening the preliminary low-light image data for fire pixels to obtain suspected fire pixel data; and determining absolute fire pixel data and relative fire pixel data from the suspected fire pixel data. The invention performs automatic fire identification on low-light observation image data acquired by a satellite low-light imager, with the advantages of high precision and high reliability.
Description
Technical Field
The invention relates to the technical fields of image recognition and computer vision, and in particular to a method, system, electronic device and medium for automatically identifying night-time ground fires.
Background
Existing smoke and fire detection mainly relies on smoke, temperature, light and similar cues. With the development of neural networks and machine learning, fire detection is now commonly performed with image-convolution-based learning methods. Because such methods rest on data priors and statistics, their algorithms are not robust. In addition, most existing deep-learning fire detection methods do not distinguish the region a pixel belongs to (land, ocean, etc.), so the detection results may be inaccurate and the false detection rate high.
A satellite low-light imager has good imaging capability under natural light sources; its minimum detectable radiance reaches 10⁻⁹ W·cm⁻²·sr⁻¹·μm⁻¹. On a low-light image, a night-time fire appears similar to city lights, and its radiant energy falls within the detection range of the low-light imager. However, no technical scheme currently exists for detecting fires with a satellite low-light imager.
Disclosure of Invention
The invention aims to provide a method, system, electronic device and medium for automatically identifying night-time ground fires, which perform automatic fire identification on low-light observation image data acquired by a satellite low-light imager, with high precision and high reliability.
In order to achieve the above object, the present invention provides the following solutions:
In a first aspect, the present invention provides a method for automatically identifying night-time ground fires, comprising:
acquiring low-light observation image data output by a satellite low-light imager; the low-light observation image data comprises low-light observation data for a plurality of pixels;
performing night pixel identification, land pixel identification and cloud pixel removal on the low-light observation image data to obtain preliminary low-light image data;
screening the preliminary low-light image data for fire pixels to obtain suspected fire pixel data;
determining absolute fire pixel data and relative fire pixel data from the suspected fire pixel data; the absolute fire pixel data and the relative fire pixel data together form the night-time ground fire data.
Optionally, the low-light observation data includes M13 band data and M16 band data;
performing night pixel identification, land pixel identification and cloud pixel removal on the low-light observation image data to obtain preliminary low-light image data specifically comprises:
for each pixel, judging whether the solar zenith angle corresponding to its M13 band data is larger than a first set threshold, to obtain a first result;
if the first result indicates no, removing the pixel;
if the first result indicates yes, marking the pixel as a night pixel; the night pixels form a night pixel set;
matching the night pixel set against a land/ocean static geographic database to obtain a night land pixel set; the night land pixel set comprises a plurality of night land pixels;
for each night land pixel, judging whether the brightness temperature corresponding to its M16 band data is smaller than a second set threshold, to obtain a second result;
if the second result indicates yes, removing the night land pixel;
if the second result indicates no, marking the night land pixel as a preliminary low-light pixel; the preliminary low-light pixels form the preliminary low-light image data.
Optionally, the low-light observation data includes DNB band data and M13 band data;
screening the preliminary low-light image data for fire pixels to obtain suspected fire pixel data specifically comprises:
performing first histogram processing on the DNB band data in the preliminary low-light image data to obtain a DNB band radiance threshold;
performing second histogram processing on the brightness temperatures corresponding to the M13 band data in the preliminary low-light image data to obtain an M13 band brightness temperature threshold;
extracting a primary suspected fire pixel set from the preliminary low-light image data according to the DNB band radiance threshold and the M13 band brightness temperature threshold; the primary suspected fire pixel set comprises a plurality of primary suspected fire pixels; a primary suspected fire pixel is a pixel of the preliminary low-light image data whose DNB band data is larger than the DNB band radiance threshold and whose brightness temperature corresponding to the M13 band data is larger than the M13 band brightness temperature threshold;
judging whether the brightness temperature difference of a primary suspected fire pixel is larger than a third set threshold, to obtain a third result; the brightness temperature difference of the primary suspected fire pixel is the difference between the brightness temperatures corresponding to its M13 band data and its M16 band data;
if the third result indicates yes, marking the primary suspected fire pixel as a secondary suspected fire pixel; the secondary suspected fire pixels form the suspected fire pixel data;
if the third result indicates no, removing the primary suspected fire pixel.
Optionally, the suspected fire pixel data comprises a plurality of secondary suspected fire pixels; the low-light observation data includes M13 band data and M16 band data;
determining absolute fire pixel data and relative fire pixel data from the suspected fire pixel data specifically comprises:
judging whether the M13 band data of a secondary suspected fire pixel is larger than a fourth set threshold, to obtain a fourth result;
if the fourth result indicates yes, marking the secondary suspected fire pixel as an absolute fire pixel; the absolute fire pixels form the absolute fire pixel data;
if the fourth result indicates no, determining relative fire pixel data from the M13 band data and M16 band data of the secondary suspected fire pixel.
Optionally, determining the relative fire pixel data from the M13 band data and M16 band data of the secondary suspected fire pixel specifically comprises:
establishing a background window centered on the secondary suspected fire pixel, and extracting the background pixels within the background window;
for each background pixel, judging whether a preset fire test condition group is satisfied, to obtain a fifth result; the preset fire test condition group comprises a first condition, a second condition and a third condition. The first condition is determined from the brightness temperature deviation of the secondary suspected fire pixel, the mean brightness temperature deviation of the background pixels, and the mean absolute deviation of the background brightness temperature deviations; the second condition is determined from the brightness temperature deviation of the secondary suspected fire pixel and the mean brightness temperature deviation of the background pixels; the third condition is determined from the brightness temperature corresponding to the M13 band data of the secondary suspected fire pixel, the mean brightness temperature corresponding to the M13 band data of the background pixels, and the mean absolute deviation of the brightness temperatures corresponding to the M13 band data of the background pixels. The brightness temperature deviation of the secondary suspected fire pixel is the difference between the brightness temperatures corresponding to its M13 band data and its M16 band data; the mean brightness temperature deviation of the background pixels is the mean, over the background pixels, of the deviation between the brightness temperatures corresponding to the M13 band data and the M16 band data; the mean absolute deviation of the background brightness temperature deviations is the mean of the absolute deviations of the background pixels' M13-minus-M16 brightness temperature differences from their mean;
if the fifth result indicates yes, marking the background pixel as a relative fire pixel; the relative fire pixels form the relative fire pixel data;
if the fifth result indicates no, discarding the background pixel.
Optionally, the first condition is ΔBT > ΔBT_B + 3.5·δ(ΔBT_B);
the second condition is ΔBT > ΔBT_B + 6 K;
the third condition is BT13 > BT13_B + 3·δ(BT13_B);
wherein ΔBT denotes the brightness temperature deviation of the secondary suspected fire pixel, ΔBT_B denotes the mean brightness temperature deviation of the background pixels, and δ(ΔBT_B) denotes the mean absolute deviation of the background brightness temperature deviations; BT13 denotes the brightness temperature corresponding to the M13 band data of the secondary suspected fire pixel, BT13_B denotes the mean brightness temperature corresponding to the M13 band data of the background pixels, and δ(BT13_B) denotes the mean absolute deviation of the brightness temperatures corresponding to the M13 band data of the background pixels.
Optionally, before performing night pixel identification, land pixel identification and cloud pixel removal on the low-light observation image data to obtain preliminary low-light image data, the method further comprises:
filling missing values in the low-light observation data by linear interpolation.
In a second aspect, the present invention provides an automatic night-time ground fire identification system, comprising:
a data acquisition module for acquiring the low-light observation image data output by a satellite low-light imager; the low-light observation image data comprises low-light observation data for a plurality of pixels;
a pixel preliminary processing module for performing night pixel identification, land pixel identification and cloud pixel removal on the low-light observation image data to obtain preliminary low-light image data;
a suspected fire determination module for screening the preliminary low-light image data for fire pixels to obtain suspected fire pixel data;
a night fire determination module for determining absolute fire pixel data and relative fire pixel data from the suspected fire pixel data; the absolute fire pixel data and the relative fire pixel data together form the night-time ground fire data.
In a third aspect, the present invention provides an electronic device comprising a memory and a processor;
the memory is used to store a computer program, and the processor is used to run the computer program to execute the night-time ground fire automatic identification method.
In a fourth aspect, the present invention provides a computer-readable storage medium storing a computer program;
when executed by a processor, the computer program implements the steps of the night-time ground fire automatic identification method.
According to the specific embodiments provided by the invention, the invention discloses the following technical effects:
The invention provides a method, system, electronic device and medium for automatically identifying night-time ground fires. Low-light observation image data output by a satellite low-light imager is acquired; night pixel identification, land pixel identification, cloud pixel removal and fire pixel screening are performed in sequence on the low-light observation image data to obtain suspected fire pixel data; finally, the suspected fire pixel data is further screened to determine absolute fire pixel data and relative fire pixel data. By exploiting the strong radiance of a fire source in the visible/near-infrared band at night, combined with the fact that the satellite low-light imager's sensitivity to thermal radiation near the 4 μm channel is significantly higher than near the 11 μm channel, night pixels and land pixels are extracted, cloud pixels are removed, and suspected fire pixels are further screened, so that accurate and reliable night-time ground fire pixels are obtained rapidly.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of the night-time ground fire automatic identification method of the invention;
FIG. 2 is a comparison of fire detection counts in an embodiment of the invention;
FIG. 3 is a schematic structural diagram of the night-time ground fire automatic identification system of the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention will be further described in detail with reference to the drawings and detailed description below in order to make the objects, features and advantages of the invention more comprehensible.
Example 1
As shown in FIG. 1, this embodiment provides a method for automatically identifying night-time ground fires, comprising:
Step 100: obtain low-light observation image data output by a satellite low-light imager; the low-light observation image data comprises low-light observation data for a plurality of pixels, including M13 band data, M16 band data and DNB band data. The M13, M16 and DNB band data are SDR (Sensor Data Records) level data; SDR refers to the data format or level of these bands, and SDR-level data of the three bands is used by default hereinafter.
Before step 200, the night-time ground fire automatic identification method further includes:
filling missing values in the low-light observation data by linear interpolation. In practice, a satellite low-light imager typically fills missing samples with 999.9 or -999.9; these values are not physical and must be handled to ensure data validity and reduce the complexity of subsequent calculations.
The missing values in the M13, M16 and DNB band data are mainly of the OBPT (Onboard Pixel Trim) type, caused by the bow-tie effect during data acquisition, so linear interpolation is used to fill them.
In a practical application, the data obtained after linear interpolation can then be spatially matched.
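As a minimal illustration of this gap-filling step, the row-wise linear interpolation described above might look like the following NumPy sketch (function and variable names are illustrative, not from the patent, and the actual VIIRS fill-value handling may differ):

```python
import numpy as np

# Fill codes mentioned in the text; real products may use other codes.
FILL_VALUES = (-999.9, 999.9)

def fill_missing_rows(band, fills=FILL_VALUES, tol=1e-3):
    """Fill bow-tie-deleted samples along each scan line by linear
    interpolation between the nearest valid samples in that line."""
    band = np.asarray(band, dtype=float).copy()
    bad = np.zeros(band.shape, dtype=bool)
    for f in fills:
        bad |= np.isclose(band, f, atol=tol)
    cols = np.arange(band.shape[1])
    for r in range(band.shape[0]):
        m = bad[r]
        if m.any() and (~m).any():  # need at least one valid sample
            band[r, m] = np.interp(cols[m], cols[~m], band[r, ~m])
    return band
```

Note that `np.interp` clamps at the line ends, so a leading or trailing gap simply takes the nearest valid value; the patent does not specify edge handling, so this is an assumption.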
Step 200: perform night pixel identification, land pixel identification and cloud pixel removal on the low-light observation image data to obtain preliminary low-light image data.
Step 200 specifically includes:
(1) For each pixel, judge whether the solar zenith angle corresponding to its M13 band data is larger than a first set threshold, to obtain a first result; specifically, the first set threshold is 100°.
If the first result indicates no, the pixel was observed in daytime or twilight and is removed to eliminate interference; if the first result indicates yes, the pixel is marked as a night pixel, and the night pixels form a night pixel set.
(2) Match the night pixel set against a land/ocean static geographic database to obtain the night land pixel set, ensuring that the fire detection background is the ground; the night land pixel set includes a plurality of night land pixels.
Specifically, the land/ocean static geographic database contains latitude and longitude information together with land and water-body identifiers, defining which locations on the Earth are land and which are water. The low-light observation image data also carries latitude and longitude information; by matching it against the database and comparing the land and water-body identifiers, land pixels and water pixels are distinguished. Removing the water pixels and keeping only the land pixels avoids interference from water bodies in the subsequent data processing.
(3) Cloud is an important interference factor in fire detection: over cloud-covered areas the surface spectral information received by the sensor is contaminated, which degrades fire detection. To effectively remove the interference of cloud on the extraction result, cloud identification is needed. The processing steps are as follows:
for each night land pixel, judge whether the brightness temperature corresponding to its M16 band data is smaller than a second set threshold, to obtain a second result; the second set threshold is 265 K.
If the second result indicates yes, the pixel is cloud-contaminated and its data unreliable, so the night land pixel is removed; if the second result indicates no, the night land pixel is marked as a preliminary low-light pixel, and the preliminary low-light pixels form the preliminary low-light image data.
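The night/land/cloud screening of step 200 reduces to three element-wise masks. A minimal NumPy sketch, assuming 2-D arrays of solar zenith angle and M16 brightness temperature plus a boolean land mask (array names are illustrative):

```python
import numpy as np

def preliminary_mask(sza, land, bt16,
                     night_sza_deg=100.0,  # first set threshold in the text
                     cloud_bt_k=265.0):    # second set threshold in the text
    """Boolean mask of night-time land pixels with cloud pixels removed.

    sza  : solar zenith angle in degrees, 2-D array
    land : boolean land/water mask from the static geographic database
    bt16 : M16 brightness temperature in kelvin, 2-D array
    """
    night = sza > night_sza_deg   # keep night pixels only
    clear = ~(bt16 < cloud_bt_k)  # drop cold (cloud-contaminated) pixels
    return night & land & clear
```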
Step 300: screen the preliminary low-light image data for fire pixels to obtain suspected fire pixel data.
Step 300 specifically includes:
(1) At night the DNB-band radiance of a fire is higher than that of the background pixels, so suspected fires are screened by combining a DNB band radiance threshold with infrared brightness temperature (and brightness temperature difference) thresholds.
Specifically, first histogram processing is performed on the DNB band data in the preliminary low-light image data to obtain the DNB band radiance threshold.
In practice, a fixed brightness temperature threshold (commonly BT13 > 305 K) cannot detect cooler fire sources, so second histogram processing is performed on the brightness temperatures corresponding to the M13 band data in the preliminary low-light image data to find the split point between the fire and background BT13 values, yielding the M13 band brightness temperature threshold.
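The patent does not spell out the histogram procedure. One plausible reading is valley detection between the background and fire modes, sketched below with NumPy; the smoothing width, peak-separation rule and percentile fallback are assumptions, not taken from the patent:

```python
import numpy as np

def histogram_split_threshold(values, bins=256, min_sep_frac=0.125):
    """Threshold at the valley between the two dominant histogram modes."""
    hist, edges = np.histogram(values, bins=bins)
    # light smoothing so single-bin noise does not create false peaks
    smooth = np.convolve(hist, np.ones(5) / 5.0, mode="same")
    peaks = [i for i in range(1, bins - 1)
             if smooth[i] >= smooth[i - 1] and smooth[i] >= smooth[i + 1]]
    order = sorted(peaks, key=lambda i: smooth[i], reverse=True)
    if not order:
        return float(np.percentile(values, 99))  # fallback: bright tail
    p1 = order[0]
    min_sep = int(bins * min_sep_frac)  # require well-separated modes
    p2 = next((i for i in order[1:] if abs(i - p1) > min_sep), None)
    if p2 is None:
        return float(np.percentile(values, 99))
    lo, hi = sorted((p1, p2))
    valley = lo + int(np.argmin(smooth[lo:hi + 1]))
    return float(0.5 * (edges[valley] + edges[valley + 1]))
```

The same routine would be applied once to the DNB radiances and once to the M13 brightness temperatures to obtain the two thresholds used below.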
A primary suspected fire pixel set is then extracted from the preliminary low-light image data according to the DNB band radiance threshold and the M13 band brightness temperature threshold. The primary suspected fire pixel set comprises a plurality of primary suspected fire pixels; a primary suspected fire pixel is a pixel of the preliminary low-light image data (i.e. a pixel marked as a preliminary low-light pixel in step 200) whose DNB band data exceeds the DNB band radiance threshold and whose brightness temperature corresponding to the M13 band data exceeds the M13 band brightness temperature threshold.
(2) A fire radiates strongly in the mid-wave infrared near the 4 μm band (M13) and comparatively weakly in the thermal infrared near the 12 μm band (M16), so suspected fires can be further screened with the brightness temperature difference condition ΔBT > 10 K between the M13 (4.05 μm) and M16 channels. Namely:
judge whether the brightness temperature difference of the primary suspected fire pixel is larger than a third set threshold, to obtain a third result; the brightness temperature difference of a primary suspected fire pixel is the difference between the brightness temperatures corresponding to its M13 band data and its M16 band data; the third set threshold is 10 K.
If the third result indicates yes, the primary suspected fire pixel is marked as a secondary suspected fire pixel, and the secondary suspected fire pixels form the suspected fire pixel data. If the third result indicates no, the primary suspected fire pixel is removed.
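Combining the two histogram thresholds with the ΔBT > 10 K test, the whole suspected-fire screen of step 300 can be sketched as follows (array names are illustrative; the two thresholds are passed in from the histogram step):

```python
import numpy as np

def suspected_fire_mask(dnb, bt13, bt16, dnb_thr, bt13_thr, dbt_thr_k=10.0):
    """Two-stage suspected-fire screen described in step 300.

    Primary screen: DNB radiance above its histogram threshold AND the
    M13 brightness temperature above its histogram threshold.
    Secondary screen: brightness temperature difference BT13 - BT16 > 10 K.
    """
    primary = (dnb > dnb_thr) & (bt13 > bt13_thr)
    return primary & ((bt13 - bt16) > dbt_thr_k)
```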
Step 400: determine absolute fire pixel data and relative fire pixel data from the suspected fire pixel data; the absolute fire pixel data and the relative fire pixel data together form the night-time ground fire data.
Step 400 specifically includes:
(1) On the basis of the suspected-fire screening, an absolute fire threshold test with a higher brightness temperature threshold is applied; it is triggered only by unambiguous fire pixels.
Preferably, the fourth set threshold is 320 K: judge whether the M13 band brightness temperature of the secondary suspected fire pixel exceeds it, i.e. BT13 > 320 K, to obtain a fourth result. If the fourth result indicates yes, mark the secondary suspected fire pixel as an absolute fire pixel; the absolute fire pixels form the absolute fire pixel data. If the fourth result indicates no, determine the relative fire pixel data from the M13 and M16 band data of the secondary suspected fire pixel.
(2) The absolute fire threshold test in (1) detects only the clearest fire pixels and may miss some potential fires. Relative fire pixels must therefore be further detected with a relative fire threshold test, as follows:
a) Establish a background window centered on the secondary suspected fire pixel and extract the background pixels within it. Specifically, an N×N pixel background window is used, with odd N ∈ [3, 21]; N takes the smallest value for which the number of valid background pixels is larger than 8 and reaches 25% of the total number of pixels in the background window.
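The window-growing rule in a) can be sketched as follows, assuming a boolean map of valid background pixels; excluding the centre pixel and clipping at the image border are assumptions not spelled out in the patent:

```python
import numpy as np

def choose_window(valid, row, col, n_min=3, n_max=21,
                  min_count=8, min_frac=0.25):
    """Smallest odd N in [3, 21] whose window holds more than `min_count`
    valid background pixels and at least `min_frac` of the N*N window."""
    for n in range(n_min, n_max + 1, 2):
        h = n // 2
        win = valid[max(0, row - h):row + h + 1,
                    max(0, col - h):col + h + 1]
        cnt = int(win.sum()) - int(valid[row, col])  # exclude the candidate
        if cnt > min_count and cnt >= min_frac * n * n:
            return n
    return None  # no admissible background window
```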
B) Judging whether the background pixels meet a preset fire test condition set or not according to each background pixel to obtain a fifth result; the preset fire test condition group comprises a first condition, a second condition and a third condition.
The first condition is determined according to the brightness Wen Piancha delta BT of the secondary suspected fire pixel, the brightness temperature average deviation delta BT B of the background pixel and the average absolute deviation delta (delta BT B) of the brightness temperature of the background pixel, namely delta BT > delta BT B+3.5δ(ΔBTB); the second condition is determined according to the brightness Wen Piancha delta BT of the secondary suspected fire pixel and the brightness temperature average deviation delta BT B of the background pixel, namely delta BT > delta BT B +6K; the third condition is determined according to a bright temperature value BT 13 corresponding to the M13 band data of the secondary suspected fire element, a bright temperature value BT 13B corresponding to the M13 band data of the background element, and an average value delta (BT 13B) of absolute differences of bright temperature values corresponding to the M13 band data of the background element, that is, BT 13>BT13B+3δ(BT13B).
The brightness Wen Piancha Δbt of the secondary suspected fire pixel represents the difference value between the brightness temperature value corresponding to the M13 band data and the brightness temperature value corresponding to the M16 band data of the secondary suspected fire pixel, namely BT 13-BT16; the average brightness temperature deviation delta BT B of the background pixel represents the average brightness Wen Piancha value corresponding to the M13 wave band data of the background pixel and the M16 wave band data, namely BT 13B-BT16B; the average absolute deviation delta (delta BT B) of the bright temperature of the background pixel represents the average value of the absolute difference value of the bright temperature value corresponding to the M13 wave band data and the bright temperature value corresponding to the M16 wave band data of the background pixel; and the average value delta (BT 13B) of the absolute difference value of the bright temperature value corresponding to the M13 wave band data of the background pixel represents the average value of the absolute difference value of the bright temperature value corresponding to the M13 wave band data of the background pixel and the bright temperature average value corresponding to the M13 wave band data.
Through the continuous discrimination of the three conditions, the influence of different types of underlying ground features and fire standard change on background temperature calculation can be eliminated, so that the fire extraction precision and the applicability of an algorithm in large-scale fire monitoring are ensured.
C) If the fifth result indicates yes, marking the background pixel as a relative fire pixel; the relative fire image elements form relative fire image element data; and if the fifth result indicates no, discarding the background pixel.
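As a concrete illustration, the three contextual tests above can be sketched as follows. This is a minimal sketch, not the patent's implementation: it assumes brightness temperatures in kelvin have already been derived from the M13 and M16 band data, and the function name and array-based interface are illustrative.

```python
import numpy as np

def passes_fire_tests(bt13, bt16, bt13_bg, bt16_bg):
    """Apply the three contextual tests to one candidate pixel.

    bt13, bt16       -- M13/M16 brightness temperatures (K) of the candidate
    bt13_bg, bt16_bg -- arrays of M13/M16 brightness temperatures (K) of the
                        valid background-window pixels (candidate excluded)
    """
    dbt = bt13 - bt16                           # ΔBT of the candidate
    dbt_bg = bt13_bg - bt16_bg                  # ΔBT of each background pixel
    dbt_b = dbt_bg.mean()                       # ΔBT_B: mean background deviation
    mad_dbt = np.abs(dbt_bg - dbt_b).mean()     # δ(ΔBT_B): mean absolute deviation
    bt13_b = bt13_bg.mean()                     # BT13B: background M13 mean
    mad_bt13 = np.abs(bt13_bg - bt13_b).mean()  # δ(BT13B)

    cond1 = dbt > dbt_b + 3.5 * mad_dbt         # first condition
    cond2 = dbt > dbt_b + 6.0                   # second condition (+6 K)
    cond3 = bt13 > bt13_b + 3.0 * mad_bt13      # third condition
    return cond1 and cond2 and cond3
```

A hot pixel whose M13 − M16 difference stands well above the background statistics passes all three tests, while a pixel at ambient temperature fails the first and second conditions.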
In a specific embodiment, taking a forest fire at a certain location as a sample, the fire detection results of the invention were compared with the officially released NPP/VIIRS fire products (the two match exactly in space and time). The absolute fire pixels and relative fire pixels detected by the invention essentially match the geographic positions of the fires in the official NPP/VIIRS products. From the comparison of fire detection counts shown in fig. 2, it is clear that the fire covered the widest area on 30 March 2020, when the number of fire pixels peaked at 21. Over the following two days the fire subsided, with the pixel counts falling to 19 and 7 respectively. The fire rebounded on 2 April, the count rising to 15, and was essentially brought under control on 3 April, when the count fell to 1. Moreover, the fire-pixel counts of the present algorithm in fig. 2 are essentially consistent with the line chart of the official NPP/VIIRS product, the difference in detection counts never exceeding 3 pixels, which demonstrates that the algorithm is feasible.
The invention exploits the strong radiance of fire sources in the night-time visible/near-infrared band, combined with the fact that a satellite's sensitivity to thermal radiation near the 4 µm channel is markedly higher than near the 11 µm channel, and uses a multichannel threshold method to distinguish ground fires from other surface features, thereby developing an automatic multichannel-threshold method for night-time ground fire recognition based on the low-light radiance characteristics of the target.
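Since both the dynamic thresholds and the contextual tests operate on brightness temperatures, band radiances must first be inverted through the Planck function. A minimal sketch follows; the band-centre wavelengths used in the test (roughly 4.05 µm for M13 and 12.0 µm for M16) are approximations assumed here, and the monochromatic inversion ignores the sensor's spectral response.

```python
import math

C1 = 1.191042e8    # 2hc^2 in W·µm^4·m^-2·sr^-1 (λ in µm, L in W·m^-2·sr^-1·µm^-1)
C2 = 1.4387752e4   # hc/k in µm·K

def planck_radiance(temp_k, wavelength_um):
    """Forward Planck function: temperature (K) -> spectral radiance."""
    return C1 / (wavelength_um ** 5 * (math.exp(C2 / (wavelength_um * temp_k)) - 1.0))

def brightness_temperature(radiance, wavelength_um):
    """Inverse Planck function: spectral radiance -> brightness temperature (K)."""
    return C2 / (wavelength_um * math.log(C1 / (wavelength_um ** 5 * radiance) + 1.0))
```

Because the Planck curve near 4 µm is far steeper in temperature than near 11-12 µm, a sub-pixel fire raises BT13 much more than BT16, which is exactly the contrast the ΔBT tests exploit.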
Example two
As shown in fig. 3, to execute the method of the above embodiment and achieve the corresponding functions and technical effects, this embodiment provides an automatic night-time ground fire recognition system, comprising:
The data acquisition module 101, used for acquiring the low-light observation image data output by the satellite low-light imager; the low-light observation image data comprises low-light observation data of a plurality of pixels; the low-light observation data comprises M13 band data, M16 band data and DNB band data.
The pixel preliminary processing module 201, used for performing night pixel identification processing, land pixel identification processing and cloud pixel removal processing on the low-light observation image data to obtain preliminary low-light image data.
The suspected fire determining module 301, configured to screen the preliminary low-light image data for fire pixels to obtain suspected fire pixel data.
The night fire determining module 401, configured to determine absolute fire pixel data and relative fire pixel data from the suspected fire pixel data; the absolute fire pixel data and the relative fire pixel data together form the night ground fire data.
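The four modules above form a linear pipeline. A minimal orchestration sketch is given below; the callables and their signatures are illustrative stand-ins for modules 101-401, not the patent's actual interfaces.

```python
class NightFirePipeline:
    """Chain the four processing stages of the recognition system."""

    def __init__(self, acquire, preprocess, screen, classify):
        self.acquire = acquire        # module 101: fetch low-light image data
        self.preprocess = preprocess  # module 201: night/land/cloud screening
        self.screen = screen          # module 301: suspected fire pixel screening
        self.classify = classify      # module 401: absolute/relative fire split

    def run(self, granule):
        raw = self.acquire(granule)
        prelim = self.preprocess(raw)
        suspected = self.screen(prelim)
        absolute, relative = self.classify(suspected)
        # absolute and relative fire pixels together form the night fire data
        return absolute + relative
```

With stub callables this simply threads data through the four stages, which is all the system-level claim describes.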
Example III
The embodiment provides an electronic device, which comprises a memory and a processor; the memory is used for storing a computer program, and the processor is used for running the computer program to execute the night ground fire automatic identification method of the first embodiment.
Optionally, the electronic device is a server.
In addition, the present embodiment also provides a computer-readable storage medium storing a computer program; the computer program when executed by the processor realizes the steps of the night ground fire automatic identification method of the first embodiment.
In conclusion, by introducing a dynamic night-time DNB-band radiance threshold and a dynamic M13-channel brightness temperature threshold, the invention improves on current fire identification and raises the accuracy of night-time ground fire recognition.
In this specification, the embodiments are described progressively, each focusing on how it differs from the others; for identical or similar parts, the embodiments may be consulted against one another.
The principles and embodiments of the present invention have been described herein with reference to specific examples; the description is intended only to aid understanding of the method of the invention and its core ideas. Those of ordinary skill in the art may make modifications in light of these teachings without departing from the scope of the invention. In view of the foregoing, this description should not be construed as limiting the invention.
Claims (7)
1. The automatic night ground fire identification method is characterized by comprising the following steps of:
Acquiring micro-light observation image data output by a satellite micro-light imager; the micro-light observation image data comprises micro-light observation data of a plurality of pixels;
Respectively carrying out night pixel identification processing, land pixel identification processing and cloud pixel removal processing on the low-light observed image data to obtain preliminary low-light image data;
Screening the preliminary low-light image data for fire pixels to obtain suspected fire pixel data; the suspected fire pixel data comprises a plurality of secondary suspected fire pixels; the low-light observation data comprises M13 band data and M16 band data;
Determining absolute fire condition pixel data and relative fire condition pixel data according to the suspected fire condition pixel data; the absolute fire condition pixel data and the relative fire condition pixel data form night ground fire condition data;
Determining absolute fire pixel data and relative fire pixel data according to the suspected fire pixel data specifically comprises: judging whether the M13 band data of a secondary suspected fire pixel is greater than a fourth set threshold, so as to obtain a fourth result; if the fourth result indicates yes, marking the secondary suspected fire pixel as an absolute fire pixel, the absolute fire pixels forming the absolute fire pixel data; if the fourth result indicates no, determining the relative fire pixel data according to the M13 band data and the M16 band data of the secondary suspected fire pixel;
Determining the relative fire pixel data according to the M13 band data and the M16 band data of the secondary suspected fire pixel specifically comprises: establishing a background window centred on the secondary suspected fire pixel and extracting the background pixels within the background window; judging, for each background pixel, whether the background pixel meets a preset group of fire test conditions, so as to obtain a fifth result; the preset group of fire test conditions comprises a first condition, a second condition and a third condition; the first condition is determined according to the brightness temperature deviation of the secondary suspected fire pixel, the mean brightness temperature deviation of the background pixels and the mean absolute deviation of the background-pixel brightness temperature deviations; the second condition is determined according to the brightness temperature deviation of the secondary suspected fire pixel and the mean brightness temperature deviation of the background pixels; the third condition is determined according to the brightness temperature corresponding to the M13 band data of the secondary suspected fire pixel, the brightness temperature corresponding to the M13 band data of the background pixels, and the mean of the absolute differences of the brightness temperatures corresponding to the M13 band data of the background pixels; the brightness temperature deviation of the secondary suspected fire pixel is the difference between the brightness temperature corresponding to its M13 band data and that corresponding to its M16 band data; the mean brightness temperature deviation of the background pixels is the mean of the differences between the brightness temperatures corresponding to the M13 band data and those corresponding to the M16 band data of the background pixels; the mean absolute deviation of the background-pixel brightness temperatures is the mean of the absolute deviations of those M13 − M16 brightness temperature differences of the background pixels;
if the fifth result indicates yes, marking the background pixel as a relative fire pixel; the relative fire image elements form relative fire image element data; discarding the background pixel if the fifth result indicates no;
the first condition is ΔBT > ΔBT_B + 3.5δ(ΔBT_B);
the second condition is ΔBT > ΔBT_B + 6 K;
the third condition is BT13 > BT13B + 3δ(BT13B);
wherein ΔBT represents the brightness temperature deviation of the secondary suspected fire pixel, ΔBT_B represents the mean brightness temperature deviation of the background pixels, and δ(ΔBT_B) represents the mean absolute deviation of the background-pixel brightness temperature deviations; BT13 represents the brightness temperature corresponding to the M13 band data of the secondary suspected fire pixel, BT13B represents the brightness temperature corresponding to the M13 band data of the background pixels, and δ(BT13B) represents the mean of the absolute differences of the brightness temperatures corresponding to the M13 band data of the background pixels.
2. The automatic night ground fire identification method of claim 1, wherein the micro-observation data comprises M13 band data and M16 band data;
Respectively performing night pixel identification processing, land pixel identification processing and cloud pixel removal processing on the low-light observed image data to obtain preliminary low-light image data, wherein the method specifically comprises the following steps of:
Judging whether the solar zenith angle corresponding to M13 wave band data of each pixel is larger than a first set threshold value or not according to each pixel to obtain a first result;
if the first result indicates no, the pixels are removed;
If the first result indicates yes, marking the pixel as a night pixel; a plurality of night pixels form a night pixel set;
Based on a land/ocean static geographic database, carrying out matching screening on the night pixel set to obtain a night land pixel set; the night land pixel set comprises a plurality of night land pixels;
judging whether a bright temperature value corresponding to M16 wave band data of each night land pixel is smaller than a second set threshold value or not according to each night land pixel so as to obtain a second result;
if the second result indicates yes, the night land pixels are removed;
If the second result indicates no, marking the night land pixels as preliminary low-light pixels; and forming preliminary low-light image data by a plurality of the preliminary low-light pixels.
3. The automatic night ground fire identification method of claim 1, wherein the micro-observation data comprises DNB band data and M13 band data;
screening the fire pixels of the preliminary low-light image data to obtain suspected fire pixel data, wherein the method specifically comprises the following steps of:
performing first histogram processing on DNB wave band data in the preliminary low-light-level image data to obtain DNB wave band radiation threshold values;
Performing second histogram processing on the bright temperature value corresponding to the M13 wave band data in the preliminary low-light image data to obtain an M13 wave band bright temperature threshold;
Extracting a primary suspected fire pixel set from the primary low-light image data according to the DNB wave band radiation threshold and the M13 wave band brightness temperature threshold; the primary suspected fire image element set comprises a plurality of primary suspected fire image elements; the primary suspected fire pixel is a pixel in the primary low-light-level image data, wherein DNB wave band data is larger than DNB wave band radiation threshold value, and bright temperature value corresponding to M13 wave band data is larger than M13 wave band bright temperature threshold value;
Judging whether the bright temperature difference value of the primary suspected fire pixel is larger than a third set threshold value or not so as to obtain a third result; the brightness temperature difference value of the primary suspected fire pixel is the difference value of the brightness temperature value corresponding to the M13 wave band data and the brightness temperature value corresponding to the M16 wave band data of the primary suspected fire pixel;
If the third result indicates yes, marking the primary suspected fire pixel as a secondary suspected fire pixel; the secondary suspected fire pixels form suspected fire pixel data;
and if the third result indicates no, eliminating the primary suspected fire pixels.
4. The automatic night ground fire identification method according to claim 1, further comprising, before performing the night pixel identification processing, land pixel identification processing and cloud pixel removal processing on the low-light observation image data to obtain the preliminary low-light image data:
filling missing values in the low-light observation data by linear interpolation.
5. An automatic night ground fire identification system, characterized in that the automatic night ground fire identification system comprises:
The data acquisition module is used for acquiring the micro-light observation image data output by the satellite micro-light imager; the micro-light observation image data comprises micro-light observation data of a plurality of pixels;
The pixel preliminary processing module is used for respectively carrying out night pixel identification processing, land pixel identification processing and cloud pixel removal processing on the low-light observed image data so as to obtain preliminary low-light image data;
The suspected fire determining module is used for screening the preliminary low-light image data for fire pixels to obtain suspected fire pixel data; the suspected fire pixel data comprises a plurality of secondary suspected fire pixels; the low-light observation data comprises M13 band data and M16 band data;
the night fire condition determining module is used for determining absolute fire condition pixel data and relative fire condition pixel data according to the suspected fire condition pixel data; the absolute fire condition pixel data and the relative fire condition pixel data form night ground fire condition data;
Determining absolute fire pixel data and relative fire pixel data according to the suspected fire pixel data specifically comprises: judging whether the M13 band data of a secondary suspected fire pixel is greater than a fourth set threshold, so as to obtain a fourth result; if the fourth result indicates yes, marking the secondary suspected fire pixel as an absolute fire pixel, the absolute fire pixels forming the absolute fire pixel data; if the fourth result indicates no, determining the relative fire pixel data according to the M13 band data and the M16 band data of the secondary suspected fire pixel;
Determining the relative fire pixel data according to the M13 band data and the M16 band data of the secondary suspected fire pixel specifically comprises: establishing a background window centred on the secondary suspected fire pixel and extracting the background pixels within the background window; judging, for each background pixel, whether the background pixel meets a preset group of fire test conditions, so as to obtain a fifth result; the preset group of fire test conditions comprises a first condition, a second condition and a third condition; the first condition is determined according to the brightness temperature deviation of the secondary suspected fire pixel, the mean brightness temperature deviation of the background pixels and the mean absolute deviation of the background-pixel brightness temperature deviations; the second condition is determined according to the brightness temperature deviation of the secondary suspected fire pixel and the mean brightness temperature deviation of the background pixels; the third condition is determined according to the brightness temperature corresponding to the M13 band data of the secondary suspected fire pixel, the brightness temperature corresponding to the M13 band data of the background pixels, and the mean of the absolute differences of the brightness temperatures corresponding to the M13 band data of the background pixels; the brightness temperature deviation of the secondary suspected fire pixel is the difference between the brightness temperature corresponding to its M13 band data and that corresponding to its M16 band data; the mean brightness temperature deviation of the background pixels is the mean of the differences between the brightness temperatures corresponding to the M13 band data and those corresponding to the M16 band data of the background pixels; the mean absolute deviation of the background-pixel brightness temperatures is the mean of the absolute deviations of those M13 − M16 brightness temperature differences of the background pixels;
if the fifth result indicates yes, marking the background pixel as a relative fire pixel; the relative fire image elements form relative fire image element data; discarding the background pixel if the fifth result indicates no;
the first condition is ΔBT > ΔBT_B + 3.5δ(ΔBT_B);
the second condition is ΔBT > ΔBT_B + 6 K;
the third condition is BT13 > BT13B + 3δ(BT13B);
wherein ΔBT represents the brightness temperature deviation of the secondary suspected fire pixel, ΔBT_B represents the mean brightness temperature deviation of the background pixels, and δ(ΔBT_B) represents the mean absolute deviation of the background-pixel brightness temperature deviations; BT13 represents the brightness temperature corresponding to the M13 band data of the secondary suspected fire pixel, BT13B represents the brightness temperature corresponding to the M13 band data of the background pixels, and δ(BT13B) represents the mean of the absolute differences of the brightness temperatures corresponding to the M13 band data of the background pixels.
6. An electronic device comprising a memory and a processor;
The memory is for storing a computer program, and the processor is for running the computer program to perform the automatic night ground fire identification method of any one of claims 1-4.
7. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program;
The computer program, when executed by a processor, implements the steps of the automatic night ground fire identification method of any one of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211141249.5A CN115356782B (en) | 2022-09-20 | 2022-09-20 | Night ground fire automatic identification method, system, electronic equipment and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115356782A CN115356782A (en) | 2022-11-18 |
CN115356782B true CN115356782B (en) | 2024-08-23 |
Family
ID=84006316
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211141249.5A Active CN115356782B (en) | 2022-09-20 | 2022-09-20 | Night ground fire automatic identification method, system, electronic equipment and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115356782B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102819926A (en) * | 2012-08-24 | 2012-12-12 | 华南农业大学 | Fire monitoring and warning method on basis of unmanned aerial vehicle |
CN112488091A (en) * | 2021-02-02 | 2021-03-12 | 中科星图股份有限公司 | Fire monitoring method and device based on geosynchronous orbit satellite images |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112562255B (en) * | 2020-12-03 | 2022-06-28 | 国家电网有限公司 | Intelligent image detection method for cable channel smoke and fire conditions in low-light-level environment |
CN112560657B (en) * | 2020-12-12 | 2023-05-30 | 南方电网调峰调频发电有限公司 | Method, device, computer device and storage medium for identifying smoke and fire |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109581372B (en) | Ecological environment remote sensing monitoring method | |
CN109214439B (en) | Infrared image frozen river detection method based on multi-feature fusion | |
CN110703244B (en) | Method and device for identifying urban water body based on remote sensing data | |
CN110889327B (en) | Intelligent detection method for sewage outlet around water area based on thermal infrared image | |
CN106023203B (en) | Fiery point detecting method based on Landsat-8 landsat images | |
Huang et al. | Landslide monitoring using change detection in multitemporal optical imagery | |
US12032659B2 (en) | Method for identifying dry salt flat based on sentinel-1 data | |
CN111832518A (en) | Space-time fusion-based TSA remote sensing image land utilization method | |
CN113887324A (en) | Fire point detection method based on satellite remote sensing data | |
CN114415173A (en) | Fog-penetrating target identification method for high-robustness laser-vision fusion | |
CN117557584B (en) | Water body extraction method and device, electronic equipment and storage medium | |
CN117456371B (en) | Group string hot spot detection method, device, equipment and medium | |
CN115356782B (en) | Night ground fire automatic identification method, system, electronic equipment and medium | |
CN118135412A (en) | Rural black and odorous water body remote sensing identification method | |
US20040234157A1 (en) | Image processor | |
CN111199557A (en) | Quantitative analysis method and system for decay of remote sensor | |
CN109033984A (en) | A kind of night mist fast automatic detecting method | |
CN106530326B (en) | Change detecting method based on image texture feature and DSM | |
CN113484924A (en) | Remote sensing monitoring and evaluating method for sea ice disaster | |
CN110887567B (en) | Method and system for filtering thermal image data | |
de Jong et al. | Preliminary results of the FATMOSE atmospheric propagation trials in the False Bay, South Africa, November 2009-July 2010 | |
Liu et al. | Shadow extraction and correction from quickbird images | |
CN113657275A (en) | Automatic detection method for forest and grass fire points | |
KR20210111578A (en) | Distortion Method of Total Cloude Cover in Night Time using Ground Based Whole Sky Image Data | |
Roupioz et al. | Quantifying the impact of cloud cover on ground radiation flux measurements using hemispherical images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||