CN114112065A - Method for judging and recognizing fire danger by satellite remote sensing


Info

Publication number
CN114112065A
CN114112065A
Authority
CN
China
Prior art keywords
pixel
fire
channel
area
temperature
Prior art date
Legal status
Pending
Application number
CN202111359168.8A
Other languages
Chinese (zh)
Inventor
黄欢
张明祥
吕乾勇
刘丽
吴建蓉
唐红祥
马晓红
田鹏举
毛先胤
李光一
杜昊
牛唯
刘君
黄军凯
许逵
徐舒蓉
Current Assignee
Guizhou Power Grid Co Ltd
Original Assignee
Guizhou Power Grid Co Ltd
Priority date
Filing date
Publication date
Application filed by Guizhou Power Grid Co Ltd
Priority to CN202111359168.8A
Publication of CN114112065A


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00: Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0014: Radiation pyrometry for sensing the radiation from gases, flames
    • G01J5/0018: Flames, plasma or welding
    • G01J5/0066: Radiation pyrometry for hot spots detection
    • G01J2005/0077: Imaging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for judging and identifying fire danger by satellite remote sensing, which comprises the following steps: step 1, data preprocessing: acquire the original satellite data and perform positioning, calibration, quality inspection and local-area map processing; step 2, fire point identification on the preprocessed satellite imagery; step 3, fire point intensity grading; step 4, fire-passing (burned) area identification and fire point spatio-temporal distribution statistics; step 5, generation of the satellite remote sensing fire monitoring result. The method addresses the technical problem that, during satellite remote sensing fire monitoring, the meteorological satellite detector is affected by solar radiation, so that the brightness temperature of certain special ground objects on the underlying surface (interference sources for short) in the remote sensing image exceeds that of the target heat source, producing false fire points and introducing large errors into the final fire discrimination result.

Description

Method for judging and recognizing fire danger by satellite remote sensing
Technical Field
The invention belongs to the technical field of fire monitoring, and in particular relates to a method for judging and identifying fire danger by satellite remote sensing.
Background
In winter and spring each year, rainfall is scarce and combustible material is dry, so forest and grassland fires occur easily, posing a serious threat to forestry, agriculture and animal husbandry and to people's lives and property. Moreover, forested mountainous areas are often sparsely populated and poorly accessible, so monitoring that relies on manual observation alone is severely limited. With the development of science and technology, satellite remote sensing has been applied more and more widely to fire monitoring of forests, grasslands and the like. Meteorological satellite remote sensing offers a wide field of view, frequent observation and high sensitivity to ground high-temperature heat sources, and has gradually become an important means of monitoring forest and grassland fires.
Remote sensing of forest and grassland fire danger mainly exploits the high sensitivity of satellite detectors, in the mid-infrared band, to the abnormal temperature rise caused by fire. During satellite remote sensing fire monitoring, the meteorological satellite detector is affected by solar radiation, and the brightness temperature of certain special ground objects on the underlying surface (interference sources for short) in the remote sensing image can exceed that of the target heat source, producing false fire points and introducing large errors into the final fire discrimination result.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: during satellite remote sensing fire monitoring, the meteorological satellite detector is affected by solar radiation, and the brightness temperature of certain special ground objects on the underlying surface (interference sources for short) in the remote sensing image can exceed that of the target heat source, producing false fire points and introducing large errors into the final satellite remote sensing fire discrimination result.
The technical scheme of the invention is as follows:
a method for judging fire danger by satellite remote sensing comprises the following steps:
step 1, data preprocessing: acquiring the original satellite data and performing positioning, calibration, quality inspection and local-area map processing;
step 2, fire point identification on the preprocessed satellite imagery;
step 3, fire point intensity grading;
step 4, fire-passing area identification and fire point spatio-temporal distribution statistics;
and step 5, generating the satellite remote sensing fire monitoring result.
The specific method for preprocessing the data comprises the following steps:
step 1.1, calibration and positioning: position and calibrate the original satellite data; after positioning, check the image against landmarks and, if an error exists, apply a positioning correction; the positioning error after correction must not exceed 1 pixel;
step 1.2, local-area map processing: project the preprocessed data to generate a local-area map of the monitoring area, the image size being set according to the extent of the monitoring area;
and step 1.3, multichannel image synthesis.
Step 1.3, the multichannel image synthesis, specifically includes:
step 1.3.1, multichannel synthesis of daytime images:
a) channel synthesis mode: assign red, green and blue to the mid-infrared, near-infrared and visible channels respectively and synthesize;
b) channel enhancement: apply image enhancement to each channel to highlight heat source points and surface features;
c) image color effect: fire points are bright red; vegetated areas are green; cloud or smoke is white or grey; water bodies are blue, black or dark purple;
step 1.3.2, multichannel synthesis of night images:
a) channel synthesis mode: assign red, green and blue to the mid-infrared, far-infrared and far-infrared split-window channels respectively and synthesize;
b) channel enhancement: apply image enhancement to each channel to highlight heat source points and surface features;
c) image color effect: fire points are bright red; fire-free areas are dark grey.
The fire point discrimination in step 2 comprises an automatic discrimination method and a human-computer interaction discrimination method;
the automatic discrimination method specifically comprises:
step 2.1, generating a pixel marking map, which specifically comprises the following steps:
(1) cloud-area pixel mark
If a pixel satisfies R_VIS > R_VIS_TCTH and T_FIR < T_FIR_TCTH, it is marked as a cloud-area pixel;
in the formula: R_VIS represents the visible-channel reflectance, expressed as a percentage;
R_VIS_TCTH represents the cloud-area discrimination threshold of the visible-channel reflectance, expressed as a percentage;
T_FIR represents the far-infrared channel brightness temperature, in Kelvin;
T_FIR_TCTH represents the cloud-area discrimination threshold of the far-infrared channel brightness temperature, in Kelvin;
(2) water-body pixel mark
If a pixel satisfies R_NIR < R_NIR_TWTH and (R_NIR − R_VIS) < 0, it is marked as a water-body pixel; in the formula: R_NIR represents the near-infrared channel reflectance, expressed as a percentage; R_NIR_TWTH represents the water-body discrimination threshold of the near-infrared channel reflectance, expressed as a percentage;
R_VIS represents the visible-channel reflectance, expressed as a percentage;
(3) desert-area pixel mark
If the land-use type of a pixel is desert, the pixel is marked as a desert-area pixel;
(4) sun-glint-area pixel mark
If a pixel satisfies S_glint ≤ 15°, it is marked as a sun-glint-area pixel; in the formula: S_glint represents the solar glint angle, in degrees;
(5) low-temperature pixel mark
If a pixel satisfies T_MIR < T_MIR_TLOWTH, it is marked as a low-temperature-area pixel; in the formula: T_MIR represents the mid-infrared channel brightness temperature, in Kelvin (K);
T_MIR_TLOWTH represents the discrimination threshold of the mid-infrared channel brightness temperature for low-temperature areas, in Kelvin;
(6) clear-sky vegetation pixel mark
If a pixel is none of cloud area, water body, desert area, sun-glint area or low-temperature pixel, it is marked as a clear-sky vegetation pixel;
step 2.2, calculating the background temperature:
(1) background-area pixel selection
1) Judging suspected high-temperature pixels
If a clear-sky vegetation pixel in the neighborhood around the detected pixel meets the following conditions, it is treated as a suspected high-temperature pixel:
T_MIR > (T_MIR_AVG + ΔT_MIR) and T_M-F > (T_M-F_AVG + 8K), or T_MIR > T_MIR_WM;
in the formula: T_MIR_AVG is the mean mid-infrared channel brightness temperature of the clear-sky pixels within the 7 × 7 pixel neighborhood of the detected pixel, in Kelvin;
T_M-F_AVG is the mean brightness-temperature difference between the mid-infrared and far-infrared channels of the clear-sky pixels within the 7 × 7 pixel neighborhood of the detected pixel, in Kelvin;
ΔT_MIR represents the mid-infrared channel brightness-temperature increment threshold for judging suspected high-temperature pixels, in Kelvin;
T_M-F represents the brightness-temperature difference between the mid-infrared and far-infrared channels, in Kelvin;
T_MIR_WM represents the mid-infrared channel brightness-temperature threshold for judging suspected high-temperature pixels, in Kelvin;
(2) background-area mean temperature and standard deviation calculation
1) Calculate T_MIRBG, T_FIRBG, T_M-FBG:
T_MIRBG = (1/N) × Σ T_MIR,i; T_FIRBG = (1/N) × Σ T_FIR,i; T_M-FBG = (1/N) × Σ (T_MIR,i − T_FIR,i), where the sums run over the N valid background pixels i;
in the formula: T_MIRBG represents the mean brightness temperature of the background area in the mid-infrared channel, in Kelvin; T_FIRBG represents the mean brightness temperature of the background area in the far-infrared channel, in Kelvin; T_M-FBG represents the mean brightness-temperature difference between the mid-infrared and far-infrared channels in the background area, in Kelvin; T_FIR,i represents the brightness temperature of the i-th pixel in the far-infrared channel, in Kelvin; T_MIR,i represents the brightness temperature of the i-th pixel in the mid-infrared channel, in Kelvin;
2) calculate δT_MIRBG and δT_FIRBG:
δT_MIRBG = sqrt((1/N) × Σ (T_MIR,i − T_MIRBG)²)
δT_FIRBG = sqrt((1/N) × Σ (T_FIR,i − T_FIRBG)²)
in the formula: δT_MIRBG represents the standard deviation of the background-area brightness temperature in the mid-infrared channel, in Kelvin (K);
δT_FIRBG represents the standard deviation of the background-area brightness temperature in the far-infrared channel, in Kelvin (K);
3) calculate δT_M-FBG:
δT_M-FBG = sqrt((1/N) × Σ ((T_MIR,i − T_FIR,i) − T_M-FBG)²)
in the formula, δT_M-FBG represents the standard deviation of the brightness-temperature difference between the mid-infrared and far-infrared channels in the background area, in Kelvin (K);
4) correction of the standard deviations
When δT_MIRBG is less than δT_bgmin, set δT_MIRBG to δT_bgmin; when δT_FIRBG is less than δT_bgmin, set δT_FIRBG to δT_bgmin; when δT_M-FBG is less than δT_bgmin, set δT_M-FBG to δT_bgmin; the reference value of δT_bgmin is 2K, and when the solar zenith angle is greater than 87°, the reference value of δT_bgmin is 1.5K;
when δT_MIRBG is greater than δT_bgmax, set δT_MIRBG to δT_bgmax; when δT_FIRBG is greater than δT_bgmax, set δT_FIRBG to δT_bgmax; when δT_M-FBG is greater than δT_bgmax, set δT_M-FBG to δT_bgmax; the reference value of δT_bgmax is 3K, and when the solar zenith angle is greater than 87°, the reference value of δT_bgmax is 2.5K;
in the formula: δT_bgmax represents the upper limit of the standard deviation of the background-area infrared-channel brightness temperature for fire point discrimination, in Kelvin (K);
δT_bgmin represents the lower limit of the standard deviation of the background-area infrared-channel brightness temperature for fire point discrimination, in Kelvin (K);
step 2.3, fire point pixel confirmation
If a pixel meets the following conditions, it is preliminarily determined as a fire point pixel:
a) polar-orbit satellite: T_MIR ≥ (T_MIRBG + 4δT_MIRBG) and T_M-F ≥ (T_M-FBG + 4δT_M-FBG);
b) geostationary satellite: T_MIR ≥ (T_MIRBG + 3δT_MIRBG) and T_M-F ≥ (T_M-FBG + 3δT_M-FBG);
if the preliminarily determined fire point pixel meets one of the following cloud-contamination conditions, it is excluded; otherwise it is confirmed as a fire point pixel:
a) R_VIS > (R_VISBG + 10%) and T_MIR < T_MIRTC;
in the formula: T_MIRTC represents the mid-infrared channel brightness-temperature threshold for fire point pixel cloud contamination, in Kelvin (K), with an initial value of 330K;
b) T_FIR < (T_FIRBG − ΔT_FIR_TCR);
in the formula: ΔT_FIR_TCR represents the far-infrared channel brightness-temperature threshold for fire point cloud contamination, in Kelvin (K), with an initial value of 5K;
c) R_VIS > R_VISBG and T_FIR < T_FIRBG and T_MIR < (T_MIRBG + 6δT_MIRBG) and T_M-F < (T_M-FBG + 6δT_M-FBG);
step 2.4, grading the credibility of fire point pixels
The credibility is divided into four grades as follows:
a) when (T_MIR − T_MIRBG) ≥ 15K and (T_M-F − T_M-FBG) ≥ 15K, the pixel is a first-grade fire point pixel, also called a confirmed fire point pixel;
b) when (T_MIR − T_MIRBG) < 15K or (T_M-F − T_M-FBG) < 15K, the pixel is a second-grade fire point pixel, also called a suspected fire point pixel;
c) if a) or b) is satisfied and cloud-area pixels exist within 2 pixels (inclusive) of the fire point pixel, the pixel is a third-grade fire point pixel, also called a cloud-edge fire point pixel;
d) when none of the 8 pixels around the fire point pixel is a fire point pixel and (T_MIR − T_MIRBG) > 20K, the pixel is a fourth-grade fire point pixel, also called a noise fire point pixel;
step 2.5, fire point pixel zoning
Adjacent fire point pixels are grouped into the same fire zone and numbered from north to south and from west to east.
The human-computer interaction discrimination method specifically comprises:
step 2.6, daytime-image fire point identification, which specifically comprises:
step 2.6.1, daytime fire monitoring multichannel image processing, including:
1) image enhancement
Apply exponential enhancement to the mid-infrared channel image to highlight heat-source information; apply linear enhancement to the near-infrared and visible channel images to highlight surface features;
2) channel synthesis
Perform RGB synthesis on the mid-infrared, near-infrared and visible channel images to generate the daytime fire monitoring multichannel composite image;
step 2.6.2, visual fire point identification on daytime images: in the daytime fire monitoring multichannel composite image, the color effect of the meteorological satellite image is as follows: fire points are bright red; fire-passing areas are dark red or black; fire-free vegetated areas are green; cloud or smoke is white or grey; water bodies are blue, black or dark purple;
step 2.7, night-image human-computer interaction fire point identification, which specifically comprises:
step 2.7.1, night fire monitoring multichannel image processing, including:
image enhancement: apply exponential enhancement to the mid-infrared channel image to highlight heat-source information; apply linear enhancement and exponential enhancement to the far-infrared channel and the far-infrared split-window channel respectively to highlight surface features;
channel synthesis: perform RGB synthesis on the mid-infrared channel, the far-infrared channel and the far-infrared split-window channel to generate the night fire monitoring multichannel composite image;
step 2.7.2, visual fire point identification on night images: in the night fire monitoring multichannel composite image, the color effect of the meteorological satellite image is as follows: fire points are bright red; fire-free areas are dark grey.
Step 3, the fire point intensity grading, comprises the following steps:
step 3.1, estimating the sub-pixel fire area ratio and fire temperature, specifically comprising:
step 3.1.1, channel data selection
If T_MIR ≥ T_MIRth, the mid-infrared channel brightness temperature is saturated, and far-infrared channel data are selected for the estimation; otherwise, mid-infrared and far-infrared channel data are selected and the estimation uses Newton's iteration method; if Newton's iteration does not converge, mid-infrared channel data are selected for the estimation;
step 3.1.2, dual-channel data estimation
Substitute N_MIR, N_MIRbg, N_FIR, N_FIRbg, P_0 and T_0 into the Newton iteration formulas to estimate P and T;
step 3.1.3, single-channel data estimation
The mid-infrared channel estimation is given by formula (5-9) and the far-infrared channel solution by formula (5-10), with T set to 750K in the formulas;
P = (N_MIR − N_MIRbg)/(N_MIRt − N_MIRbg) …… (5-9)
where N_MIRt = C_1 × V_MIR³/(exp(C_2 × V_MIR/T) − 1);
P = (N_FIR − N_FIRbg)/(N_FIRt − N_FIRbg) …… (5-10)
where N_FIRt = C_1 × V_FIR³/(exp(C_2 × V_FIR/T) − 1);
step 3.2, calculating the fire point intensity
The fire point intensity is obtained from formula (5-11):
FRP = S_f × σ × T⁴ …… (5-11)
in the formula, FRP is the Fire Radiative Power of the open fire within the pixel, i.e. the fire point intensity, in watts (W); σ is the Stefan-Boltzmann constant, σ = 5.6704 × 10⁻⁸ W·m⁻²·K⁻⁴; S_f is the sub-pixel fire area, in square meters (m²);
S_f = P × S …… (5-12)
where S is the area of the fire point pixel;
and step 3.3, grading the fire point intensity.
Step 4, the fire-passing area identification and fire point spatio-temporal distribution statistics, comprises:
step 4.1, single-time fire-passing area discrimination, specifically:
step 4.1.1, single-channel threshold method
If a pixel is a non-water-body pixel and meets the following condition, it is identified as a fire-passing area pixel:
R_Nir < R_Nir_th …… (5-13)
in the formula, R_Nir_th is the near-infrared channel reflectance threshold for fire-passing area identification, with a reference value of 10%;
step 4.1.2, NDVI threshold method
If a pixel is a non-water-body pixel and meets the following conditions, it is identified as a fire-passing area pixel:
NDVI = (R_Nir − R_Red)/(R_Nir + R_Red) …… (5-15)
in the formula, R_Red is the visible red-channel reflectance, expressed as a percentage (%);
NDVI < NDVI_th …… (5-16)
in the formula, NDVI_th is the NDVI threshold for fire-passing area pixel identification, with a reference value of 0;
step 4.2, fire-passing area discrimination from the two temporal-phase images before and after the fire: if a pixel is a non-water-body pixel and meets the following condition, it is identified as a fire-passing area pixel:
NDVI_Before − NDVI_After > NDVI_BA_th …… (5-17)
in the formula, NDVI_Before is the pre-fire NDVI value; NDVI_After is the post-fire NDVI value; NDVI_BA_th is the NDVI threshold for identifying fire-passing areas from the two temporal-phase images.
Step 4, the fire-passing area identification and fire point spatio-temporal distribution statistics, further comprises:
step 4.3, verifying and correcting the fire-passing area identification information by human-computer interaction, specifically:
step 4.3.1, manually checking the fire-passing area identification effect:
superimpose the fire-passing area identification information on the fire monitoring multichannel composite image and check the identification effect manually;
step 4.3.2, correcting the fire-passing area identification by human-computer interaction:
if the fire-passing area identification information contains misjudgments or omissions, correct the identification errors by adjusting the identification thresholds until the result matches the manual interpretation;
step 4.3.3, fire-passing area estimation, specifically:
calculating the fire-passing area from single-satellite data as the sum of the areas of the fire-passing pixels:
S = Σ S_i (i = 1, …, n) …… (5-18)
where S_i is the area of the i-th fire-passing pixel and n is the number of fire-passing pixels;
calculating the fire-passing pixel area by combining meteorological satellite and high-spatial-resolution satellite remote sensing: first calculate the vegetation coverage from the high-spatial-resolution satellite:
C_G = (NDVI_Mix − NDVI_S)/(NDVI_V − NDVI_S) …… (5-19)
in the formula, NDVI_Mix is the single-pixel NDVI value of the high-spatial-resolution satellite; NDVI_V is the single-pixel vegetation end-member value of the high-spatial-resolution satellite; NDVI_S is the single-pixel bare-soil end-member value of the high-spatial-resolution satellite;
then estimate the fire-passing area by integrating meteorological satellite and high-spatial-resolution satellite remote sensing:
S = Σ C_i × S_i (i = 1, …, n) …… (5-20)
in the formula, C_i is the average of C_G over the spatial extent of a single meteorological satellite pixel, namely:
C_i = (1/m) × Σ C_Gj (j = 1, …, m) …… (5-21)
in the formula, C_Gj is the vegetation coverage of the j-th high-spatial-resolution satellite pixel within the coverage of the meteorological satellite pixel;
and m is the number of high-spatial-resolution satellite pixels contained in a single meteorological satellite pixel.
The invention has the following beneficial effects:
The invention eliminates false fire points caused by interference from photovoltaic power stations and thermal power plants in existing meteorological satellite remote sensing fire monitoring products through fire point discrimination, fire point intensity discrimination, fire-passing area discrimination, fire point spatio-temporal distribution statistics and related means, improving the reliability of the monitoring products.
On the basis of the existing satellite remote sensing fire point identification method, the invention uses collected geographic information data of interference sources (photovoltaic power stations and thermal power plants) to perform secondary discrimination of satellite remote sensing fire points with GIS (geographic information system) techniques (including building an interference-source information data set, creating buffer zones, and so on), eliminating false fire points produced by interference sources during monitoring. The invention has been applied to the daily operation of satellite remote sensing fire monitoring and to the production of multi-level service materials, effectively improving the reliability of satellite remote sensing fire monitoring products.
The method thus solves the technical problem that, during satellite remote sensing fire monitoring, the meteorological satellite detector is affected by solar radiation and the brightness temperature of certain special ground objects on the underlying surface (interference sources for short) in the remote sensing image exceeds that of the target heat source, producing false fire points and introducing large errors into the final fire discrimination result.
Detailed Description
The method specifically comprises the following steps:
Step 1: data preprocessing. The original satellite data undergo preprocessing such as positioning, calibration, quality inspection and local-area map processing.
the specific method for preprocessing the data in the step 1 comprises the following steps:
step 1.1, scaling and positioning
The original data of the satellite is preprocessed by positioning, calibration and the like, and the visible light channel is corrected by the solar altitude angle.
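The patent does not give the correction formula; as an illustrative sketch, visible-channel reflectance is commonly normalized by the cosine of the solar zenith angle. The function below is a hypothetical implementation: the array names and the masking of very low sun elevations are assumptions, not part of the patent text.

```python
import numpy as np

def correct_solar_elevation(reflectance: np.ndarray,
                            solar_zenith_deg: np.ndarray,
                            max_zenith_deg: float = 87.0) -> np.ndarray:
    """Normalize visible-channel reflectance by solar illumination.

    reflectance      : apparent reflectance (%), one value per pixel
    solar_zenith_deg : per-pixel solar zenith angle in degrees
                       (zenith = 90 deg - elevation)
    Pixels with very low sun (zenith > max_zenith_deg) are masked,
    since division by a tiny cosine amplifies noise.
    """
    cos_sz = np.cos(np.deg2rad(solar_zenith_deg))
    return np.where(solar_zenith_deg <= max_zenith_deg,
                    reflectance / np.maximum(cos_sz, 1e-6),
                    np.nan)
```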
Step 1.2, local-area map generation
A local-area map of the monitoring area is generated by projecting the preprocessed data; the image size can be set according to the extent of the monitoring area.
Step 1.3, multichannel image synthesis
Step 1.3.1, daytime images
a) Channel synthesis mode: assign red, green and blue to the mid-infrared, near-infrared and visible channels respectively and synthesize;
b) channel enhancement: each channel needs image enhancement to highlight heat source points and surface features;
c) image color effect:
1) Fire point: bright red
2) Vegetated area: green
3) Cloud or fog: white or greyish
4) Water body: blue, black or dark purple
Step 1.3.2, night images
a) Channel synthesis mode: assign red, green and blue to the mid-infrared, far-infrared and far-infrared split-window channels respectively and synthesize;
b) channel enhancement: each channel needs image enhancement to highlight heat source points and surface features;
c) image color effect:
1) Fire point: bright red
2) Fire-free area: dark grey
Step 1.4, positioning accuracy
Image positioning should be checked against landmarks. If an error exists, a positioning correction is applied. The positioning error after correction must not exceed 1 pixel.
Step 2: fire monitoring, which mainly consists of fire point identification.
The fire point discrimination in step 2 includes automatic discrimination and human-computer interaction discrimination.
Automatic discrimination:
The information content of the discrimination result includes the satellite/sensor identifier, the spatial resolution of the mid-infrared channel (in meters), the observation time, the fire point pixel number, the fire zone number, longitude and latitude, the province/city/county names, the land cover type, the credibility, the result generation time, the processing personnel, and so on.
Step 2.1, generating the pixel marking map
(1) Cloud-area pixel mark
If a pixel satisfies R_VIS > R_VIS_TCTH and T_FIR < T_FIR_TCTH, it is marked as a cloud-area pixel.
In the formula: R_VIS is the visible-channel reflectance, expressed as a percentage (%).
R_VIS_TCTH is the cloud-area discrimination threshold of the visible-channel reflectance, expressed as a percentage (%), with a reference value of 20%.
T_FIR is the far-infrared channel brightness temperature, in Kelvin (K).
T_FIR_TCTH is the cloud-area discrimination threshold of the far-infrared channel brightness temperature, in Kelvin (K), with a reference value of 270K.
(2) Water-body pixel mark
If a pixel satisfies R_NIR < R_NIR_TWTH and (R_NIR − R_VIS) < 0, it is marked as a water-body pixel.
In the formula: R_NIR is the near-infrared channel reflectance, expressed as a percentage (%).
R_NIR_TWTH is the water-body discrimination threshold of the near-infrared channel reflectance, expressed as a percentage (%), with a reference value of 10%.
R_VIS is the visible-channel reflectance, expressed as a percentage (%).
(3) Desert-area pixel mark
If the land-use type of a pixel is desert, the pixel is marked as a desert-area pixel.
(4) Sun-glint-area pixel mark
If a pixel satisfies S_glint ≤ 15°, it is marked as a sun-glint-area pixel.
In the formula: S_glint is the solar glint angle, in degrees.
(5) Low-temperature pixel mark
If a pixel satisfies T_MIR < T_MIR_TLOWTH, it is marked as a low-temperature-area pixel.
In the formula: T_MIR is the mid-infrared channel brightness temperature, in Kelvin (K).
T_MIR_TLOWTH is the discrimination threshold of the mid-infrared channel brightness temperature for low-temperature areas, in Kelvin (K), with a reference value of 265K.
(6) Clear-sky vegetation pixel mark
If a pixel is none of cloud area, water body, desert area, sun-glint area or low-temperature pixel, it is marked as a clear-sky vegetation pixel.
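For illustration only, the pixel marking above can be expressed as boolean masks. The threshold reference values (20%, 270K, 10%, 265K, 15°) are those given in the text; the function and array names are assumptions, and this sketch is not part of the patent itself.

```python
import numpy as np

# Threshold reference values from the text above.
R_VIS_TCTH = 20.0      # visible reflectance cloud threshold (%)
T_FIR_TCTH = 270.0     # far-IR brightness-temperature cloud threshold (K)
R_NIR_TWTH = 10.0      # near-IR reflectance water threshold (%)
T_MIR_TLOWTH = 265.0   # mid-IR low-temperature threshold (K)
S_GLINT_MAX = 15.0     # sun-glint angle threshold (degrees)

def mark_pixels(r_vis, r_nir, t_mir, t_fir, s_glint, is_desert):
    """Return a dict of boolean masks, one per pixel class.

    r_vis, r_nir : reflectance arrays (%)
    t_mir, t_fir : brightness-temperature arrays (K)
    s_glint      : solar glint angle array (degrees)
    is_desert    : boolean land-use mask
    """
    cloud = (r_vis > R_VIS_TCTH) & (t_fir < T_FIR_TCTH)
    water = (r_nir < R_NIR_TWTH) & ((r_nir - r_vis) < 0)
    glint = s_glint <= S_GLINT_MAX
    low_temp = t_mir < T_MIR_TLOWTH
    # Clear-sky vegetation: everything not caught by the other marks.
    clear_sky = ~(cloud | water | is_desert | glint | low_temp)
    return {"cloud": cloud, "water": water, "desert": is_desert,
            "glint": glint, "low_temp": low_temp, "clear_sky": clear_sky}
```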
Step 2.2, calculation of the background temperature
(1) Background-area pixel selection
1) Judging suspected high-temperature pixels
If a clear-sky pixel in the neighborhood around the detected pixel meets the following conditions, it is treated as a suspected high-temperature pixel:
T_MIR > (T_MIR_AVG + ΔT_MIR) and T_M-F > (T_M-F_AVG + 8K), or T_MIR > T_MIR_WM.
In the formula: T_MIR_AVG is the mean mid-infrared channel brightness temperature of the clear-sky pixels within the 7 × 7 pixel neighborhood of the detected pixel, in Kelvin (K);
T_M-F_AVG is the mean brightness-temperature difference between the mid-infrared and far-infrared channels of the clear-sky pixels within the 7 × 7 pixel neighborhood of the detected pixel, in Kelvin (K).
ΔT_MIR is the mid-infrared channel brightness-temperature increment threshold for judging suspected high-temperature pixels, in Kelvin (K), with a reference value of 10K.
T_M-F is the brightness-temperature difference between the mid-infrared and far-infrared channels, in Kelvin (K).
T_MIR_WM is the mid-infrared channel brightness-temperature threshold for judging suspected high-temperature pixels, in Kelvin (K), with a reference value of 330K.
(2) Background-area mean temperature and standard deviation calculation
1) Calculate T_MIRBG, T_FIRBG, T_M-FBG:
T_MIRBG = (1/N) × Σ T_MIR,i; T_FIRBG = (1/N) × Σ T_FIR,i; T_M-FBG = (1/N) × Σ (T_MIR,i − T_FIR,i), where the sums run over the N valid background pixels i (clear-sky pixels that are not suspected high-temperature pixels).
In the formula: T_MIRBG is the mean brightness temperature of the background area in the mid-infrared channel, in Kelvin (K).
T_FIRBG is the mean brightness temperature of the background area in the far-infrared channel, in Kelvin (K).
T_M-FBG is the mean brightness-temperature difference between the mid-infrared and far-infrared channels in the background area, in Kelvin (K).
T_FIR,i is the brightness temperature of the i-th pixel in the far-infrared channel, in Kelvin (K).
T_MIR,i is the brightness temperature of the i-th pixel in the mid-infrared channel, in Kelvin (K).
2) Calculate δT_MIRBG, δT_FIRBG:
δT_MIRBG = sqrt((1/N) × Σ (T_MIR,i − T_MIRBG)²); δT_FIRBG = sqrt((1/N) × Σ (T_FIR,i − T_FIRBG)²)
In the formula: δT_MIRBG is the standard deviation of the background-area brightness temperature in the mid-infrared channel, in Kelvin (K).
δT_FIRBG is the standard deviation of the background-area brightness temperature in the far-infrared channel, in Kelvin (K).
3) Calculate δT_M-FBG:
δT_M-FBG = sqrt((1/N) × Σ ((T_MIR,i − T_FIR,i) − T_M-FBG)²)
In the formula: δT_M-FBG is the standard deviation of the brightness-temperature difference between the mid-infrared and far-infrared channels in the background area, in Kelvin (K).
4) Correction of the standard deviations
When δT_MIRBG is less than δT_bgmin, set δT_MIRBG to δT_bgmin; when δT_FIRBG is less than δT_bgmin, set δT_FIRBG to δT_bgmin; when δT_M-FBG is less than δT_bgmin, set δT_M-FBG to δT_bgmin. The reference value of δT_bgmin is 2K, and when the solar zenith angle is greater than 87°, the reference value of δT_bgmin is 1.5K.
When δT_MIRBG is greater than δT_bgmax, set δT_MIRBG to δT_bgmax; when δT_FIRBG is greater than δT_bgmax, set δT_FIRBG to δT_bgmax; when δT_M-FBG is greater than δT_bgmax, set δT_M-FBG to δT_bgmax. The reference value of δT_bgmax is 3K, and when the solar zenith angle is greater than 87°, the reference value of δT_bgmax is 2.5K.
In the formula: δT_bgmax is the upper limit of the standard deviation of the background-area infrared-channel brightness temperature for fire point discrimination, in Kelvin (K).
δT_bgmin is the lower limit of the standard deviation of the background-area infrared-channel brightness temperature for fire point discrimination, in Kelvin (K).
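The background statistics and the standard-deviation clamping above can be sketched as follows; the function signature and the window handling are assumptions, while the clamping limits (2K/3K, relaxed to 1.5K/2.5K beyond an 87° solar zenith angle) are taken from the text.

```python
import numpy as np

def background_stats(t_mir_nbhd, t_fir_nbhd, valid_mask,
                     sun_zenith_deg: float):
    """Mean and clamped standard deviation of the background window.

    t_mir_nbhd, t_fir_nbhd : brightness temperatures (K) in the
                             neighborhood window around the pixel
    valid_mask             : True for clear-sky pixels that are not
                             suspected high-temperature pixels
    """
    # Clamping limits from the text; relaxed at very low sun.
    if sun_zenith_deg > 87.0:
        bg_min, bg_max = 1.5, 2.5
    else:
        bg_min, bg_max = 2.0, 3.0

    t_mir = t_mir_nbhd[valid_mask]
    t_fir = t_fir_nbhd[valid_mask]
    diff = t_mir - t_fir

    means = (t_mir.mean(), t_fir.mean(), diff.mean())
    stds = tuple(float(np.clip(x.std(), bg_min, bg_max))
                 for x in (t_mir, t_fir, diff))
    return means, stds  # (T_MIRBG, T_FIRBG, T_M-FBG), (δT_MIRBG, δT_FIRBG, δT_M-FBG)
```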
Step 2.3, fire point pixel confirmation
If a pixel meets the following conditions, it can be preliminarily determined as a fire point pixel:
a) polar-orbit satellite: T_MIR ≥ (T_MIRBG + 4δT_MIRBG) and T_M-F ≥ (T_M-FBG + 4δT_M-FBG);
b) geostationary satellite: T_MIR ≥ (T_MIRBG + 3δT_MIRBG) and T_M-F ≥ (T_M-FBG + 3δT_M-FBG).
If the preliminarily determined fire point pixel meets one of the following cloud-contamination conditions, it is excluded; otherwise it is confirmed as a fire point pixel:
a) R_VIS > (R_VISBG + 10%) and T_MIR < T_MIRTC.
In the formula: T_MIRTC is the mid-infrared channel brightness-temperature threshold for fire point pixel cloud contamination, in Kelvin (K), with an initial value of 330K.
b) T_FIR < (T_FIRBG − ΔT_FIR_TCR).
In the formula: ΔT_FIR_TCR is the far-infrared channel brightness-temperature threshold for fire point cloud contamination, in Kelvin (K), with an initial value of 5K.
c) R_VIS > R_VISBG and T_FIR < T_FIRBG and T_MIR < (T_MIRBG + 6δT_MIRBG) and T_M-F < (T_M-FBG + 6δT_M-FBG).
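A minimal sketch of the confirmation logic in step 2.3, assuming scalar per-pixel inputs and the background statistics computed above; the names and boolean return convention are illustrative rather than the patent's notation.

```python
def confirm_fire_pixel(t_mir, t_fir, t_mf, r_vis,
                       t_mirbg, t_firbg, t_mfbg, r_visbg,
                       std_mirbg, std_mfbg,
                       polar_orbit: bool = True,
                       t_mirtc: float = 330.0,
                       dt_fir_tcr: float = 5.0) -> bool:
    """Preliminary fire test plus cloud-contamination rejection.

    t_mf = t_mir - t_fir (mid-IR minus far-IR brightness temperature).
    Background statistics (suffix 'bg') come from background_stats().
    """
    k = 4.0 if polar_orbit else 3.0  # stricter factor for polar-orbit satellites
    preliminary = (t_mir >= t_mirbg + k * std_mirbg and
                   t_mf >= t_mfbg + k * std_mfbg)
    if not preliminary:
        return False

    # Reject if any cloud-contamination condition a)-c) holds.
    cloudy = (
        (r_vis > r_visbg + 10.0 and t_mir < t_mirtc) or
        (t_fir < t_firbg - dt_fir_tcr) or
        (r_vis > r_visbg and t_fir < t_firbg and
         t_mir < t_mirbg + 6.0 * std_mirbg and
         t_mf < t_mfbg + 6.0 * std_mfbg)
    )
    return not cloudy
```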
Step 2.4, grading the credibility of fire point pixels
The credibility is divided into four grades as follows:
a) when (T_MIR − T_MIRBG) ≥ 15K and (T_M-F − T_M-FBG) ≥ 15K, the pixel is a first-grade fire point pixel, also called a confirmed fire point pixel;
b) when (T_MIR − T_MIRBG) < 15K or (T_M-F − T_M-FBG) < 15K, the pixel is a second-grade fire point pixel, also called a suspected fire point pixel;
c) if a) or b) is satisfied and cloud-area pixels exist within 2 pixels (inclusive) of the fire point pixel, the pixel is a third-grade fire point pixel, also called a cloud-edge fire point pixel;
d) when none of the 8 pixels around the fire point pixel is a fire point pixel and (T_MIR − T_MIRBG) > 20K, the pixel is a fourth-grade fire point pixel, also called a noise fire point pixel.
Step 2.5, fire point pixel zoning
Adjacent fire point pixels are grouped into the same fire zone and numbered from north to south and from west to east.
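The zoning of step 2.5 amounts to connected-component labeling followed by a north-to-south, west-to-east renumbering; below is a sketch using scipy's ndimage.label (an assumed implementation choice, with row 0 taken as the northernmost row).

```python
import numpy as np
from scipy import ndimage

def number_fire_zones(fire_mask: np.ndarray) -> np.ndarray:
    """Group adjacent fire pixels into zones, numbered N->S then W->E.

    fire_mask : boolean array, True where a fire pixel was confirmed.
    Returns an int array: 0 = no fire, 1..n = fire zone number.
    """
    # 8-connectivity: diagonal neighbors belong to the same zone.
    labeled, n = ndimage.label(fire_mask, structure=np.ones((3, 3)))
    # Sort zones by their northernmost row, then westernmost column.
    order = []
    for zone in range(1, n + 1):
        rows, cols = np.nonzero(labeled == zone)
        order.append((rows.min(), cols.min(), zone))
    renumbered = np.zeros_like(labeled)
    for new_id, (_, _, old_id) in enumerate(sorted(order), start=1):
        renumbered[labeled == old_id] = new_id
    return renumbered
```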
Human-computer interaction discrimination
Step 2.6, daytime-image fire point identification method
Step 2.6.1, daytime fire monitoring multichannel image processing
1) Image enhancement
Apply exponential enhancement to the mid-infrared channel image to highlight heat-source information; apply linear enhancement to the near-infrared and visible channel images to highlight surface features.
2) Channel synthesis
Perform RGB synthesis on the mid-infrared, near-infrared and visible channel images to generate the daytime fire monitoring multichannel composite image.
Step 2.6.2, visual fire point identification on daytime images
In general, in the daytime fire monitoring multichannel composite image, the color effect of the meteorological satellite image is as follows:
1) Fire point: bright red
2) Fire-passing area: dark red or black
3) Fire-free vegetated area: green
4) Cloud or smoke: white or bluish grey
5) Water body: blue, black or dark purple
Step 2.7, night-image human-computer interaction fire point identification method
Step 2.7.1, night fire monitoring multichannel image processing
1) Image enhancement
Apply exponential enhancement to the mid-infrared channel image to highlight heat-source information; apply linear enhancement and exponential enhancement to the far-infrared channel and the far-infrared split-window channel respectively to highlight surface features.
2) Channel synthesis
Perform RGB synthesis on the mid-infrared channel, the far-infrared channel and the far-infrared split-window channel to generate the night fire monitoring multichannel composite image.
Step 2.7.2, visual fire point identification on night images
In general, in the night fire monitoring multichannel composite image, the color effect of the meteorological satellite image is as follows:
1) Fire point: bright red.
2) Fire-free area: dark grey.
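As a rough sketch of the enhancement and RGB synthesis described above: the exact stretch curves are not specified in the patent, so the gamma-style "exponential" stretch and the min-max normalization below are assumptions.

```python
import numpy as np

def stretch_linear(band, lo=None, hi=None):
    """Linear stretch of one band to [0, 1]."""
    lo = np.nanmin(band) if lo is None else lo
    hi = np.nanmax(band) if hi is None else hi
    return np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0)

def stretch_exponential(band, gamma=0.4):
    """Nonlinear (gamma-style) stretch that exaggerates warm pixels."""
    return stretch_linear(band) ** gamma

def day_fire_composite(t_mir, r_nir, r_vis):
    """Daytime composite: R = enhanced mid-IR, G = near-IR, B = visible.

    Hot fire pixels saturate the red channel, so they appear bright red
    against green vegetation, consistent with the color effects above.
    """
    return np.dstack([stretch_exponential(t_mir),
                      stretch_linear(r_nir),
                      stretch_linear(r_vis)])
```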
Step 3, fire point intensity grading, which specifically comprises:
Step 3.1, sub-pixel fire area ratio and fire temperature estimation method
Step 3.1.1, channel data selection
If T_MIR ≥ T_MIRth, the mid-infrared channel brightness temperature is saturated, and far-infrared channel data are selected for the estimation. Otherwise, mid-infrared and far-infrared channel (i.e. dual-channel) data are selected and the estimation uses Newton's iteration method; if Newton's iteration does not converge, mid-infrared channel data are selected for the estimation. T_MIRth is given by formula (5-4):
T_MIRth = C_2 × V_MIR / ln(1 + C_1 × V_MIR³ / N_MIRCA) …… (5-4)
C_1 is a constant, with the value 1.1910659 × 10⁻⁵ mW/(m²·sr·cm⁻⁴).
C_2 is a constant, with the value 1.438833 K/cm⁻¹.
V_MIR is the center wavenumber of the mid-infrared channel.
N_MIRCA is the radiance corresponding to the intercept of the calibration coefficients of the mid-infrared channel, in mW/(m²·sr·cm⁻¹).
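For reference, the Planck radiance and its inverse, from which a saturation brightness temperature such as T_MIRth can be computed; this is standard radiometry rather than patent-specific code, and the helper names are assumptions.

```python
import math

C1 = 1.1910659e-5   # mW/(m^2·sr·cm^-4), first radiation constant (radiance form)
C2 = 1.438833       # K·cm, second radiation constant

def planck_radiance(wavenumber: float, temperature: float) -> float:
    """Radiance N in mW/(m^2·sr·cm^-1) at wavenumber V (cm^-1) and T (K)."""
    return C1 * wavenumber**3 / (math.exp(C2 * wavenumber / temperature) - 1.0)

def brightness_temperature(wavenumber: float, radiance: float) -> float:
    """Inverse Planck: T (K) for radiance N at wavenumber V (cm^-1)."""
    return C2 * wavenumber / math.log(1.0 + C1 * wavenumber**3 / radiance)
```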
Step 3.1.2, dual-channel data estimation method
Substitute N_MIR, N_MIRbg, N_FIR, N_FIRbg, P_0 and T_0 into the Newton iteration formulas to estimate P and T.
N_MIR, N_MIRbg, N_FIR and N_FIRbg are obtained from formulas (5-5) to (5-8), which convert the corresponding brightness temperatures to radiances through the Planck function; P_0 and T_0 are computed by the bisection method.
N_MIR = C_1 × V_MIR³/(exp(C_2 × V_MIR/T_MIR) − 1) …… (5-5)
N_MIRbg = C_1 × V_MIR³/(exp(C_2 × V_MIR/T_MIRBG) − 1) …… (5-6)
N_FIR = C_1 × V_FIR³/(exp(C_2 × V_FIR/T_FIR) − 1) …… (5-7)
N_FIRbg = C_1 × V_FIR³/(exp(C_2 × V_FIR/T_FIRBG) − 1) …… (5-8)
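A sketch of the dual-channel Newton iteration: the mixed-pixel model N = P·B(V, T) + (1 − P)·N_bg in each channel is the standard sub-pixel fire formulation and is assumed here, since the patent does not spell out the iterated equations.

```python
import numpy as np

C1 = 1.1910659e-5  # mW/(m^2·sr·cm^-4)
C2 = 1.438833      # K·cm

def planck(v, t):
    """Planck radiance at wavenumber v (cm^-1) and temperature t (K)."""
    return C1 * v**3 / (np.exp(C2 * v / t) - 1.0)

def dplanck_dt(v, t):
    """Derivative of the Planck radiance with respect to temperature."""
    e = np.exp(C2 * v / t)
    return C1 * v**3 * e * (C2 * v / t**2) / (e - 1.0)**2

def solve_subpixel_fire(n_mir, n_mirbg, n_fir, n_firbg,
                        v_mir, v_fir, p0=0.001, t0=600.0,
                        max_iter=50, tol=1e-6):
    """Newton iteration for the sub-pixel fire fraction P and temperature T.

    Assumed mixed-pixel model per channel:
        N_obs = P * Planck(V, T) + (1 - P) * N_background
    p0, t0 are the iteration start values (the patent obtains them by
    bisection); here they are passed in as plain arguments.
    """
    p, t = p0, t0
    for _ in range(max_iter):
        f = np.array([p * planck(v_mir, t) + (1 - p) * n_mirbg - n_mir,
                      p * planck(v_fir, t) + (1 - p) * n_firbg - n_fir])
        jac = np.array([[planck(v_mir, t) - n_mirbg, p * dplanck_dt(v_mir, t)],
                        [planck(v_fir, t) - n_firbg, p * dplanck_dt(v_fir, t)]])
        step = np.linalg.solve(jac, f)
        p, t = p - step[0], t - step[1]
        if np.max(np.abs(step)) < tol:
            return p, t
    raise RuntimeError("Newton iteration did not converge")
```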
Step 3.1.3, single-channel data estimation method
The mid-infrared channel estimation is given by formula (5-9) and the far-infrared channel solution by formula (5-10), with T set to 750K in the formulas.
P = (N_MIR − N_MIRbg)/(N_MIRt − N_MIRbg) …… (5-9)
where N_MIRt = C_1 × V_MIR³/(exp(C_2 × V_MIR/T) − 1);
P = (N_FIR − N_FIRbg)/(N_FIRt − N_FIRbg) …… (5-10)
where N_FIRt = C_1 × V_FIR³/(exp(C_2 × V_FIR/T) − 1).
step 3.2, calculating the fire point intensity
The fire point intensity is obtained by the formula (5-11):
FRP=Sf×σT4……………(5-11)
in the formula, FRP is the Radiation Power (Fire Radiation Power) of the open Fire in the pixel, namely the intensity of the Fire point, and the unit is watt (W); sigma-stipitisBoltzmann constant, σ 5.6704 × 10-8(W·m-2·K-4);SfSub-pixel fire area in square meters (m)2);NFIRRepresents far infrared channel radiance, unit: mW (milliwatt) and/or liver/kidney
(m2·sr·cm-1)。
NFIRBGRepresents the background radiance of the far infrared channel, unit: mW/(m)2·sr·cm-1)。
NFIRtAnd (3) expressing the sub-pixel fire point radiance of the far infrared channel, unit: mW/(m)2·sr·cm-1)。
NMIRRepresents the mid-infrared channel radiance, unit: mW/(m)2·sr·cm-1)。
NMIRBGRepresents the mid-infrared channel background radiance, unit: mW/(m)2·sr·cm-1)。
NMIRtAnd (3) representing the radiance of the fire point of the sub-pixel of the middle infrared channel, wherein the unit is as follows: mW/(m)2·sr·cm-1)。
P represents the area ratio of the sub-pixel fire points.
P0And (4) representing the iteration initial value of the variable P in the Newton iteration method formula.
T represents the sub-pixel fire temperature, unit: K.
T0and (4) representing the iteration initial value of the variable T in the Newton iteration method formula.
VFIRRepresenting the far infrared channel center wave number.
Sf=P×S……………(5-12)
In the formula, S is the area of the fire pixel.
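The FRP computation of formulas (5-11) and (5-12) in code form (function and argument names are assumed):

```python
SIGMA = 5.6704e-8  # Stefan-Boltzmann constant, W·m^-2·K^-4

def fire_radiative_power(p: float, t: float, pixel_area_m2: float) -> float:
    """FRP in watts, per formulas (5-11) and (5-12).

    p             : sub-pixel fire area ratio (0..1)
    t             : sub-pixel fire temperature (K)
    pixel_area_m2 : area S of the fire pixel in square meters
    """
    s_f = p * pixel_area_m2          # (5-12) sub-pixel fire area
    return s_f * SIGMA * t**4        # (5-11) fire radiative power
```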
Step 3.3, fire point intensity grading
The fire intensity grades are defined in Table 1.
Table 1: Fire intensity grade definition
(Table 1 is reproduced as an image in the original publication.)
Step 4, fire-passing area identification, fire point spatio-temporal distribution statistics and related processing are performed to generate the satellite remote sensing fire monitoring result.
Step 4, fire-passing area estimation
Step 4.1, single-time fire-passing area identification method
Step 4.1.1, single-channel threshold method
If a pixel is a non-water-body pixel and meets the following condition, it is identified as a fire-passing area pixel:
R_Nir < R_Nir_th …… (5-13)
In the formula, R_Nir_th is the near-infrared channel reflectance threshold for fire-passing area identification, with a reference value of 10%.
Step 4.1.2, NDVI threshold method
If a pixel is a non-water-body pixel and meets the following conditions, it is identified as a fire-passing area pixel:
NDVI = (R_Nir − R_Red)/(R_Nir + R_Red) …… (5-15)
In the formula, R_Red is the visible red-channel reflectance, expressed as a percentage (%).
NDVI < NDVI_th …… (5-16)
In the formula, NDVI_th is the NDVI threshold for fire-passing area pixel identification, with a reference value of 0.
Step 4.2, fire-passing area discrimination method based on the two temporal-phase images before and after the fire
If a pixel is a non-water-body pixel and meets the following condition, it is identified as a fire-passing area pixel:
NDVI_Before − NDVI_After > NDVI_BA_th …… (5-17)
In the formula, NDVI_Before is the pre-fire NDVI value; NDVI_After is the post-fire NDVI value; NDVI_BA_th is the NDVI threshold for identifying fire-passing areas from the two temporal-phase images, determined from the difference between the two-phase NDVI of the fire-passing area and that of nearby areas with the same land cover type.
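The three fire-passing (burned) area tests above can be sketched as masks; the value of ndvi_ba_th below is a hypothetical placeholder, since the text derives NDVI_BA_th from reference areas with the same land cover.

```python
import numpy as np

def fire_scar_masks(r_nir, r_red, ndvi_before=None,
                    r_nir_th=10.0, ndvi_th=0.0, ndvi_ba_th=0.3,
                    water_mask=None):
    """Fire-passing area masks per formulas (5-13) to (5-17).

    r_nir, r_red : near-IR and visible red reflectance (%), post-fire
    ndvi_before  : optional pre-fire NDVI for the two-phase test (5-17)
    ndvi_ba_th   : hypothetical placeholder threshold (see text)
    """
    not_water = ~water_mask if water_mask is not None else True

    scar_single = (r_nir < r_nir_th) & not_water               # (5-13)
    ndvi = (r_nir - r_red) / (r_nir + r_red + 1e-9)            # (5-15)
    scar_ndvi = (ndvi < ndvi_th) & not_water                   # (5-16)

    scar_two_phase = None
    if ndvi_before is not None:
        scar_two_phase = ((ndvi_before - ndvi) > ndvi_ba_th) & not_water  # (5-17)
    return scar_single, scar_ndvi, scar_two_phase
```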
Step 4.3, verifying and correcting the fire-passing area identification information by human-computer interaction
Step 4.3.1, manually checking the fire-passing area identification effect
Superimpose the fire-passing area identification information on the fire monitoring multichannel composite image and check the identification effect manually. The visual identification method for fire-passing areas follows the daytime-image visual fire point identification method in 7.1.2.1 (polar-orbit meteorological satellite) of QX/T 344.2.
Step 4.3.2, correcting the fire-passing area discrimination by human-computer interaction
If the fire-passing area discrimination information contains misjudgments or omissions, correct the discrimination errors by adjusting the discrimination thresholds until the result matches the manual interpretation.
Step 4.3.3, fire-passing area estimation method
a) Calculating the fire-passing area from single-satellite data as the sum of the areas of the fire-passing pixels:
S = Σ S_i (i = 1, …, n) …… (5-18)
where S_i is the area of the i-th fire-passing pixel and n is the number of fire-passing pixels.
b) Calculating the fire-passing pixel area by integrating meteorological satellite and high-spatial-resolution satellite remote sensing:
1) High-spatial-resolution satellite vegetation coverage calculation
C_G = (NDVI_Mix − NDVI_S)/(NDVI_V − NDVI_S) …… (5-19)
In the formula, NDVI_Mix is the single-pixel NDVI value of the high-spatial-resolution satellite; NDVI_V is the single-pixel vegetation end-member value of the high-spatial-resolution satellite; NDVI_S is the single-pixel bare-soil end-member value of the high-spatial-resolution satellite.
2) Integrated meteorological satellite and high-spatial-resolution satellite fire-passing area estimation
S = Σ C_i × S_i (i = 1, …, n) …… (5-20)
In the formula, C_i is the average of C_G over the spatial extent of a single meteorological satellite pixel, namely:
C_i = (1/m) × Σ C_Gj (j = 1, …, m) …… (5-21)
In the formula, C_Gj is the vegetation coverage of the j-th high-spatial-resolution satellite pixel within the coverage of the meteorological satellite pixel;
m is the number of high-spatial-resolution satellite pixels contained in a single meteorological satellite pixel.
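A sketch of the area estimation of formulas (5-18), (5-20) and (5-21), assuming a boolean fire-scar mask and per-pixel areas in square meters (names assumed):

```python
import numpy as np

def burned_area(fire_scar_mask, pixel_area_m2, coverage=None):
    """Fire-passing area per formulas (5-18) and (5-20).

    fire_scar_mask : boolean mask of fire-passing pixels
    pixel_area_m2  : per-pixel area array (or scalar), square meters
    coverage       : optional C_i array, the mean high-resolution
                     vegetation coverage within each meteorological
                     satellite pixel (formula 5-21); if given, each
                     pixel contributes only its vegetated fraction
    """
    areas = np.broadcast_to(pixel_area_m2, fire_scar_mask.shape)
    if coverage is None:
        return float(areas[fire_scar_mask].sum())             # (5-18)
    return float((coverage * areas)[fire_scar_mask].sum())    # (5-20)

def mean_coverage(c_g_highres: np.ndarray) -> float:
    """C_i: average vegetation coverage of the m high-resolution
    pixels falling inside one meteorological satellite pixel (5-21)."""
    return float(np.mean(c_g_highres))
```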
Step 5: perform secondary discrimination based on the fire monitoring result to obtain the satellite remote sensing fire monitoring result.
The secondary discrimination based on the fire monitoring product mainly includes:
Step 5.1, establishing the interference-source information database
An interference-source information database is established with ArcGIS software, using geographic information data of photovoltaic power stations, thermal power plants and the like.
Step 5.2, secondary discrimination of remote sensing fire points
A 5-kilometer buffer zone around each interference source (photovoltaic power station or thermal power plant) is created with GIS techniques, and the fire monitoring result is discriminated again. If a fire point falls inside a buffer zone, it is identified as a false fire point; if it falls outside the buffer zones, the remote sensing fire-danger service materials are produced according to the normal workflow.
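A sketch of the secondary discrimination with 5 km buffers: the patent uses ArcGIS, whereas this illustration uses the shapely library, and it assumes coordinates already projected to meters.

```python
from shapely.geometry import Point

def filter_false_fire_points(fire_points, interference_sources,
                             buffer_m: float = 5000.0):
    """Split fire points into real and false using 5 km buffers.

    fire_points          : list of (x, y) in a projected CRS (meters);
                           using raw lon/lat would make the 5 km buffer
                           meaningless, so reproject first
    interference_sources : list of (x, y) of photovoltaic stations and
                           thermal power plants, same CRS
    """
    buffers = [Point(xy).buffer(buffer_m) for xy in interference_sources]
    real, false = [], []
    for xy in fire_points:
        p = Point(xy)
        (false if any(b.contains(p) for b in buffers) else real).append(xy)
    return real, false
```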

Claims (8)

1. A method for judging fire danger by satellite remote sensing, comprising the following steps:
step 1, data preprocessing: acquiring the original satellite data and performing positioning, calibration, quality inspection and local-area map processing;
step 2, fire point identification on the preprocessed satellite imagery;
step 3, fire point intensity grading;
step 4, fire-passing area identification and fire point spatio-temporal distribution statistics;
and step 5, generating the satellite remote sensing fire monitoring result.
2. The method for judging satellite remote sensing fire danger according to claim 1, characterized in that the specific method for preprocessing the data comprises the following steps:
step 1.1, calibration and positioning: positioning and calibrating the original satellite data; after positioning, checking the image against landmarks and, if an error exists, applying a positioning correction; the positioning error after correction must not exceed 1 pixel;
step 1.2, local-area map processing: projecting the preprocessed data to generate a local-area map of the monitoring area, the image size being set according to the extent of the monitoring area;
and step 1.3, multichannel image synthesis.
3. The method for judging satellite remote sensing fire danger according to claim 2, characterized in that step 1.3, the multichannel image synthesis, specifically includes:
step 1.3.1, multichannel synthesis of daytime images:
a) channel synthesis mode: assigning red, green and blue to the mid-infrared, near-infrared and visible channels respectively and synthesizing;
b) channel enhancement: applying image enhancement to each channel to highlight heat source points and surface features;
c) image color effect: fire points are bright red; vegetated areas are green; cloud or smoke is white or grey; water bodies are blue, black or dark purple;
step 1.3.2, multichannel synthesis of night images:
a) channel synthesis mode: assigning red, green and blue to the mid-infrared, far-infrared and far-infrared split-window channels respectively and synthesizing;
b) channel enhancement: applying image enhancement to each channel to highlight heat source points and surface features;
c) image color effect: fire points are bright red; fire-free areas are dark grey.
4. The method for judging the satellite remote sensing fire risk according to claim 1, characterized in that: the fire point judging method in the step 2 comprises an automatic judging method and a human-computer interaction judging method;
the specific automatic identification method comprises the following steps:
step 2.1, generating a pixel marking map, which specifically comprises the following steps:
(1) cloud region pixel mark
If the pixel satisfies RVIS>RVIS_TCTHAnd TFIR<TFIR_TCTHMarked as cloud area pixels;
in the formula: rVISRepresents the visible channel reflectance, expressed as a percentage;
RVIS_TCTHrepresenting a cloud area judgment threshold value of the reflectivity of a visible light channel, and representing the cloud area judgment threshold value in percentage;
TFIRexpressing the brightness temperature of the far infrared channel, and the unit is Kelvin;
TFIR_TCTHthe unit of the cloud area identification threshold value of the brightness and the temperature of the far infrared channel is Kelvin;
(2) water body picture element mark
If the pixel satisfies RNIR<RNIR_TWTHAnd (R)NIR﹣RVIS) If less than 0, marking as a water body pixel; in the formula: rNIRRepresents the near infrared channel reflectance, expressed as a percentage; rNIR_TWTHRepresenting a water body identification threshold value of the reflectivity of the near infrared channel, and representing the water body identification threshold value by percentage;
RVISrepresents the visible channel reflectance, expressed as a percentage;
(3) desert region pixel mark
If the land utilization type of the pixel is a desert area, marking the pixel as the desert area pixel;
(4) blazed spot area pixel mark
If the pixel satisfies SglintMarking the image element as a flare area at an angle of less than or equal to 15 degrees; in the formula: sglintRepresenting solar flare angle in degrees.
(5) Low temperature picture element marking
If the pixel TMIR<TMIR_TLOWTHAnd is marked as a low-temperature area pixel; in the formula: t isMIRRepresents the brightness temperature of the middle infrared channel, and has the unit of Kelvin (K);
TMIR_TLOWTHthe judgment threshold value of the brightness temperature low-temperature region of the middle infrared channel is represented, and the unit is Kelvin;
(6) clear sky vegetation pixel mark
If the pixel is not a cloud area, a water body, a desert area, a blazing area or a low-temperature pixel, marking as a clear sky vegetation pixel;
step 2.2, calculating background temperature:
(1) background region pixel selection
Judging suspected high-temperature pixel
If the clear sky in the neighborhood around the detection pixel indicates that the pixel meets the following conditions, the pixel is taken as a suspected high-temperature pixel:
TMIR>(TMIR_AVG+△TMIR) And TM-F>(TM-F_AVG+8K), or TMIR>TMIR_WM
In the formula: t isMIR_AVGThe brightness and temperature average value of clear air in 7 x 7 pixels around an infrared channel in a detection pixel is expressed in Kelvin;
TM-F_AVGthe detection pixel is characterized in that clear sky in 7 × 7 pixels around the detection pixel refers to the average value of the brightness and temperature difference between an infrared channel and a far infrared channel in the detected pixel, and the unit is Kelvin;
in the formula: delta TMIRIndicating a brightness temperature increment threshold value of a middle infrared channel for judging the suspected high-temperature pixel, wherein the unit is Kelvin;
TM-Fexpressing the brightness temperature difference between the middle infrared channel and the far infrared channel, and the unit is Kelvin;
TMIR_WMindicating that the brightness and temperature threshold of the middle infrared channel of the suspected high-temperature pixel is judged, wherein the unit is Kelvin;
(2) background zone mean temperature and standard deviation calculation
1) Calculating TMIRBG,TFIRBG,TM-FBG
Figure FDA0003358366890000041
In the formula:TMIRBGrepresenting the average value of the brightness and the temperature of the background area of the intermediate infrared channel, and the unit is Kelvin; t isFIRBGThe average value of the brightness and the temperature of the background area of the far infrared channel is represented, and the unit is Kelvin; t isM-FBGThe average value of the brightness and temperature difference between the infrared channel and the far infrared channel in the background area is represented, and the unit is Kelvin; t isFIR,iThe unit of the temperature of the ith pixel of the far infrared channel is Kelvin; t isMIR,iThe temperature of the ith pixel brightness temperature of the middle infrared channel is represented in Kelvin;
2) calculating delta TMIRBGAnd δ TFIRBG
Figure FDA0003358366890000042
In the formula: delta TMIRBGExpressing the standard deviation of the brightness and the temperature of the background area of the intermediate infrared channel, and the unit is Kelvin (K);
δTFIRBGexpressing the standard deviation of the brightness and the temperature of the background area of the far infrared channel, and the unit is Kelvin (K);
3) calculating delta TM-FBG
Figure FDA0003358366890000051
In the formula, delta TM-FBGStandard deviation representing the mean value of the difference in brightness and temperature between the infrared channel and the far infrared channel in the background region, in kelvin (K);
4) Correction of the standard deviations
When δT_MIRBG is less than δT_bgmin, set δT_MIRBG to δT_bgmin; when δT_FIRBG is less than δT_bgmin, set δT_FIRBG to δT_bgmin; when δT_M-FBG is less than δT_bgmin, set δT_M-FBG to δT_bgmin. The reference value of δT_bgmin is 2 K; when the solar zenith angle is greater than 87°, the reference value of δT_bgmin is 1.5 K.
When δT_MIRBG is greater than δT_bgmax, set δT_MIRBG to δT_bgmax; when δT_FIRBG is greater than δT_bgmax, set δT_FIRBG to δT_bgmax; when δT_M-FBG is greater than δT_bgmax, set δT_M-FBG to δT_bgmax. The reference value of δT_bgmax is 3 K; when the solar zenith angle is greater than 87°, the reference value of δT_bgmax is 2.5 K.
In the formulas: δT_bgmax denotes the upper limit of the background-area infrared channel brightness-temperature standard deviation for fire point identification, in kelvin (K); δT_bgmin denotes the lower limit of the background-area infrared channel brightness-temperature standard deviation for fire point identification, in kelvin (K). A sketch covering steps 1) to 4) follows.
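A minimal sketch, assuming NumPy, of steps 1) to 4) over the valid background pixels; the clamp limits follow the reference values above.

```python
import numpy as np

def background_stats(t_mir_bg, t_fir_bg, sun_zenith_deg):
    """Background-area means and clamped standard deviations (steps 1-4).

    t_mir_bg, t_fir_bg : 1-D arrays of the valid background pixels'
    brightness temperatures (K) for the mid- and far-IR channels.
    """
    t_mf_bg = t_mir_bg - t_fir_bg
    means = {"MIR": t_mir_bg.mean(), "FIR": t_fir_bg.mean(), "M-F": t_mf_bg.mean()}
    stds = {"MIR": t_mir_bg.std(), "FIR": t_fir_bg.std(), "M-F": t_mf_bg.std()}
    # Step 4): clamp each standard deviation to [dT_bgmin, dT_bgmax];
    # reference values 2 K / 3 K, or 1.5 K / 2.5 K when the solar
    # zenith angle exceeds 87 degrees.
    lo, hi = (1.5, 2.5) if sun_zenith_deg > 87.0 else (2.0, 3.0)
    stds = {k: min(max(v, lo), hi) for k, v in stds.items()}
    return means, stds
```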
Step 2.3, fire point pixel confirmation
A pixel is preliminarily determined to be a fire point pixel if it meets the following conditions:
a) Polar-orbit satellite: T_MIR ≥ (T_MIRBG + 4δT_MIRBG) and T_M-F ≥ (T_M-FBG + 4δT_M-FBG);
b) Geostationary satellite: T_MIR ≥ (T_MIRBG + 3δT_MIRBG) and T_M-F ≥ (T_M-FBG + 3δT_M-FBG);
If the preliminarily determined fire point pixel meets any of the following cloud-contamination conditions, it is excluded; otherwise it is confirmed as a fire point pixel (a sketch of the full test follows these conditions):
a) R_VIS > (R_VISBG + 10%) and T_MIR < T_MIRTC;
In the formula: T_MIRTC denotes the mid-infrared channel brightness-temperature threshold for judging cloud contamination of a fire point pixel, in kelvin (K); its initial value is 330 K;
b) T_FIR < (T_FIRBG − ΔT_FIR_TCR);
In the formula: ΔT_FIR_TCR denotes the far-infrared channel brightness-temperature threshold for identifying cloud contamination of a fire point pixel, in kelvin (K); its initial value is 5 K;
c) R_VIS > R_VISBG and T_FIR < T_FIRBG and T_MIR < (T_MIRBG + 6δT_MIRBG) and T_M-F < (T_M-FBG + 6δT_M-FBG);
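A sketch of the confirmation test and cloud screening under the same assumptions; reflectances are treated as fractions, so the 10% offset becomes 0.10.

```python
def confirm_fire_pixel(t_mir, t_fir, r_vis, r_vis_bg, bg, stds,
                       polar_orbit=True, t_mir_tc=330.0, d_t_fir_tcr=5.0):
    """Preliminary fire-pixel confirmation plus cloud-contamination screening.

    bg   : dict of background means {"MIR", "FIR", "M-F"} in K
    stds : dict of clamped std devs {"MIR", "M-F"} in K
    """
    n = 4.0 if polar_orbit else 3.0   # polar-orbit vs geostationary multiplier
    t_mf = t_mir - t_fir
    preliminary = (t_mir >= bg["MIR"] + n * stds["MIR"] and
                   t_mf >= bg["M-F"] + n * stds["M-F"])
    if not preliminary:
        return False
    # Cloud-contamination exclusion conditions a)-c)
    cloud_a = r_vis > r_vis_bg + 0.10 and t_mir < t_mir_tc
    cloud_b = t_fir < bg["FIR"] - d_t_fir_tcr
    cloud_c = (r_vis > r_vis_bg and t_fir < bg["FIR"] and
               t_mir < bg["MIR"] + 6 * stds["MIR"] and
               t_mf < bg["M-F"] + 6 * stds["M-F"])
    return not (cloud_a or cloud_b or cloud_c)
```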
Step 2.4, fire point pixel credibility grading
The credibility is divided into four levels as follows (see the sketch after this list):
a) When (T_MIR − T_MIRBG) ≥ 15 K and (T_M-F − T_M-FBG) ≥ 15 K, the pixel is a level-1 fire point pixel, also called a confirmed fire point pixel;
b) When (T_MIR − T_MIRBG) < 15 K or (T_M-F − T_M-FBG) < 15 K, the pixel is a level-2 fire point pixel, also called a suspected fire point pixel;
c) If a) or b) is satisfied and cloud-area pixels exist within 2 pixels (inclusive) of the fire point pixel, the pixel is a level-3 fire point pixel, also called a cloud-area edge fire point pixel;
d) When none of the 8 pixels surrounding the fire point pixel is a fire point pixel and (T_MIR − T_MIRBG) > 20 K, the pixel is a level-4 fire point pixel, also called a noise fire point pixel;
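A sketch of the grading; the claim does not state a precedence among the four conditions, so checking noise first, then cloud-edge, then confirmed/suspected is our assumption.

```python
def fire_pixel_level(t_mir, t_fir, bg, cloud_within_2px, any_neighbor_fire):
    """Credibility level of a confirmed fire point pixel.

    bg : dict of background means {"MIR", "M-F"} in K.
    Returns 1 (confirmed), 2 (suspected), 3 (cloud-area edge), 4 (noise).
    """
    d_mir = t_mir - bg["MIR"]
    d_mf = (t_mir - t_fir) - bg["M-F"]
    if not any_neighbor_fire and d_mir > 20.0:
        return 4  # isolated hot pixel -> noise fire point pixel
    if cloud_within_2px:
        return 3  # cloud-area edge fire point pixel
    if d_mir >= 15.0 and d_mf >= 15.0:
        return 1  # confirmed fire point pixel
    return 2      # suspected fire point pixel
```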
Step 2.5, fire point pixel partitioning
Adjacent fire point pixels are grouped into the same fire area, and fire areas are numbered from north to south and from west to east (a connected-component sketch follows).
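A minimal sketch, assuming SciPy, of the partitioning; the north-to-south, west-to-east numbering is implemented by sorting on each area's northernmost row and then its westernmost column, which is one plausible reading of the claim.

```python
import numpy as np
from scipy import ndimage

def partition_fire_areas(fire_mask):
    """Group adjacent fire pixels into fire areas and renumber them
    north-to-south, then west-to-east.

    fire_mask : 2-D bool array, True where a pixel is a fire point pixel.
    Assumes row 0 is the northernmost row and column 0 the westernmost.
    """
    labels, n = ndimage.label(fire_mask)  # 4-connected components
    order = []
    for lab in range(1, n + 1):
        rows, cols = np.nonzero(labels == lab)
        order.append((rows.min(), cols.min(), lab))  # northernmost, then westernmost
    renumbered = np.zeros_like(labels)
    for new_id, (_, _, lab) in enumerate(sorted(order), start=1):
        renumbered[labels == lab] = new_id
    return renumbered
```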
5. The method for judging fire danger by satellite remote sensing according to claim 4, wherein the man-machine interaction fire point identification method specifically comprises the following steps:
step 2.6, daytime image man-machine interaction fire point identification, which specifically comprises the following steps:
step 2.6.1, daytime fire monitoring multichannel image processing, including:
1) image enhancement
Performing exponential enhancement processing on the mid-infrared channel image to highlight heat source point information; performing linear enhancement processing on the near infrared and visible light channel images to highlight the surface features;
2) channel synthesis
RGB synthesis is carried out on the intermediate infrared channel image, the near infrared channel image and the visible light channel image to generate a daytime fire monitoring multichannel synthetic image;
step 2.6.2, daytime image visual fire point identification: in the daytime fire monitoring multichannel composite image, the color rendering of the meteorological satellite image is as follows: fire points are bright red; fire passing areas are dark red or black; unburned vegetation areas are green; cloud or smoke is white or grey; water bodies are blue, black or dark purple;
step 2.7, night image man-machine interaction fire point identification, which specifically comprises the following steps:
step 2.7.1, night fire monitoring multichannel image processing, including:
image enhancement: perform exponential enhancement on the mid-infrared channel image to highlight heat-source information; perform linear enhancement on the far-infrared channel and exponential enhancement on the far-infrared split-window channel to highlight surface features;
channel synthesis: perform RGB synthesis on the mid-infrared, far-infrared and far-infrared split-window channels to generate the night fire monitoring multichannel composite image;
step 2.7.2, night image visual fire point identification: in the night fire monitoring multichannel composite image, the color rendering of the meteorological satellite image is as follows: fire points are bright red; non-fire areas are dark grey (a simplified composite sketch covering the daytime and night cases follows).
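A simplified sketch of the day/night composites; the exact enhancement curves are not specified in the claim, so the power-law stretch standing in for "exponential enhancement" is a placeholder, and the per-channel enhancement assignments are simplified.

```python
import numpy as np

def stretch(band, nonlinear=False):
    """Normalize a band to [0, 1]; the nonlinear curve is a placeholder
    for the claim's 'exponential enhancement' (exact curve unspecified)."""
    x = (band - band.min()) / (band.max() - band.min() + 1e-9)
    return x ** 0.5 if nonlinear else x

def fire_composite(r_band, g_band, b_band):
    """RGB composite with the mid-IR channel in red, enhanced to highlight
    heat sources. Daytime: g = near-IR, b = visible; night: g = far-IR,
    b = far-IR split window."""
    return np.dstack([stretch(r_band, nonlinear=True),
                      stretch(g_band), stretch(b_band)])
```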
6. The method for judging fire danger by satellite remote sensing according to claim 1, wherein the fire point intensity grade identification in step 3 comprises the following steps:
step 3.1, estimating the area ratio of the sub-pixel fire points and the fire point temperature, and specifically comprising the following steps:
step 3.1.1, channel data selection
If T_MIR ≥ T_MIRth (where T_MIRth is the mid-infrared channel saturation brightness-temperature threshold), the mid-infrared channel brightness temperature is saturated and the far-infrared channel data are selected for the estimation; otherwise, the mid-infrared and far-infrared channel data are selected and the estimation uses the Newton iteration method; if the Newton iteration does not converge, the mid-infrared channel data are selected for the estimation;
step 3.1.2, dual-channel data estimation
Substitute N_MIR, N_MIRbg, N_FIR, N_FIRbg, P_0 and T_0 (the initial values of the sub-pixel fire-area ratio P and the fire temperature T) into the Newton iteration formula to estimate P and T;
step 3.1.3, single-channel data estimation
The mid-infrared channel estimation is given by formula (5-9) and the far-infrared channel solution by formula (5-10); in the formulas, T is set to 750 K (a sketch of the single-channel estimation follows):
P = (N_MIR − N_MIRbg)/(N_MIRt − N_MIRbg)…………(5-9)
where N_MIRt is the mid-infrared channel radiance of a blackbody at the fire temperature T, obtained from the Planck function (the original formula image FDA0003358366890000081 is not recoverable verbatim);
P = (N_FIR − N_FIRbg)/(N_FIRt − N_FIRbg)…………(5-10)
where N_FIRt is the far-infrared channel radiance of a blackbody at the fire temperature T, obtained from the Planck function (the original formula image FDA0003358366890000082 is not recoverable verbatim);
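A sketch of the single-channel estimation under the assumption flagged above, namely that N_t is the Planck radiance of a blackbody at the assumed fire temperature; a real channel radiance would integrate over the sensor's spectral response, so the monochromatic value below is an approximation.

```python
import numpy as np

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of a blackbody (Planck's law), W m^-2 sr^-1 m^-1."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / np.expm1(H * C / (wavelength_m * KB * temp_k))

def single_channel_fire_fraction(n_obs, n_bg, wavelength_m, fire_temp_k=750.0):
    """Sub-pixel fire-area ratio P from one channel (formulas 5-9 / 5-10).
    e.g. wavelength_m = 3.9e-6 for a typical mid-IR channel."""
    n_t = planck_radiance(wavelength_m, fire_temp_k)
    return (n_obs - n_bg) / (n_t - n_bg)
```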
step 3.2, calculating the fire point intensity
The fire point intensity is obtained by the formula (5-11):
FRP = S_f × σT^4 ……………(5-11)
In the formula, FRP is the fire radiative power within the pixel, i.e. the fire point intensity, in watts (W); σ is the Stefan–Boltzmann constant, σ = 5.6704 × 10^-8 W·m^-2·K^-4; S_f is the sub-pixel fire area, in square meters (m²);
S_f = P × S ……………(5-12)
where S is the area of the fire point pixel;
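A direct transcription of formulas (5-11) and (5-12):

```python
SIGMA = 5.6704e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def fire_radiative_power(p, pixel_area_m2, fire_temp_k):
    """FRP = S_f * sigma * T^4 with S_f = P * S (formulas 5-11 and 5-12)."""
    s_f = p * pixel_area_m2                  # sub-pixel fire area (m^2)
    return s_f * SIGMA * fire_temp_k ** 4    # watts
```

For example, with P = 0.001, a 1 km² (10^6 m²) pixel and T = 750 K, this gives about 1.8 × 10^7 W.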
Step 3.3, fire point intensity grading.
7. The method for judging fire danger by satellite remote sensing according to claim 1, wherein the fire passing area identification and fire point space-time distribution statistics in step 4 comprise:
step 4.1, judging and identifying the fire passing area at a single time, which comprises the following specific steps:
step 4.1.1, single-channel identification threshold method
If the pixel is a non-water-body pixel and meets the following condition, it is identified as a fire passing area pixel:
R_Nir < R_Nir_th ………(5-13)
In the formula, R_Nir is the near-infrared channel reflectance and R_Nir_th is the reflectance threshold for near-infrared channel fire passing area identification; the reference value of R_Nir_th is 10%;
step 4.1.2 NDVI identification threshold method
If the pixel is a non-water body pixel and the following conditions are met, the pixel is identified as a fire passing area pixel;
NDVI = (R_Nir − R_Red)/(R_Nir + R_Red)……(5-15)
In the formula, R_Red is the reflectance of the visible red channel, in percent (%);
NDVI < NDVI_th ………(5-16)
In the formula, NDVI_th is the NDVI threshold for fire passing area pixel identification; the reference value of NDVI_th is 0;
Step 4.2, fire passing area judgment and identification from two time-phase images before and after the fire: if the pixel is a non-water-body pixel and meets the following condition, it is identified as a fire passing area pixel (a combined sketch of the three tests follows):
NDVI_Before − NDVI_After > NDVI_BA_th ………(5-17)
In the formula, NDVI_Before is the NDVI value before the fire; NDVI_After is the NDVI value after the fire; NDVI_BA_th is the NDVI threshold for identifying fire passing areas based on the two time-phase images.
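A combined sketch of tests (5-13), (5-15)/(5-16) and (5-17) for a non-water pixel; reflectances are treated as fractions, and ndvi_ba_th is a hypothetical placeholder because the claim leaves NDVI_BA_th unspecified.

```python
def is_burned_pixel(r_nir, r_red, ndvi_before=None, ndvi_after=None,
                    r_nir_th=0.10, ndvi_th=0.0, ndvi_ba_th=0.1):
    """Fire passing area pixel tests for a non-water-body pixel.

    Combines the single-channel reflectance test (5-13), the NDVI
    threshold test (5-16) and, when both time phases are available,
    the two-phase NDVI difference test (5-17).
    """
    ndvi = (r_nir - r_red) / (r_nir + r_red)
    if r_nir < r_nir_th or ndvi < ndvi_th:
        return True
    if ndvi_before is not None and ndvi_after is not None:
        return (ndvi_before - ndvi_after) > ndvi_ba_th
    return False
```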
8. The method for judging fire danger by satellite remote sensing according to claim 7, wherein the fire passing area identification and fire point space-time distribution statistics in step 4 further comprise:
step 4.3, verifying and correcting the man-machine interaction fire passing area identification information, which specifically comprises the following steps:
step 4.3.1, manually checking the fire passing area identification effect:
superposing fire passing area identification information on the fire monitoring multichannel synthetic graph, and manually checking the fire passing area identification effect;
step 4.3.2, correcting fire passing area identification through human-computer interaction:
if the fire passing area judgment information contains misjudgments or missed judgments, correct them by adjusting the judgment thresholds until the result matches the manual interpretation;
step 4.3.3, fire passing area estimation, which specifically comprises the following steps:
calculate the fire passing area from single-satellite data:
[Formula image FDA0003358366890000101: the fire passing area computed from the identified fire passing area pixels of a single satellite image, presumably formula (5-18); not recoverable verbatim]
calculating the pixel area of the fire passing area by remote sensing of the meteorological satellite and the high spatial resolution satellite:
calculating the vegetation coverage of the high spatial resolution satellite:
C_G = (NDVI_Mix − NDVI_S)/(NDVI_V − NDVI_S)……(5-19)
In the formula, NDVI_Mix is the NDVI value of a single high-spatial-resolution satellite pixel; NDVI_V is the vegetation end-member value of a single high-spatial-resolution satellite pixel; NDVI_S is the bare-soil end-member value of a single high-spatial-resolution satellite pixel;
estimate the fire passing area by integrating meteorological satellite and high-spatial-resolution satellite remote sensing (a sketch follows this step):
[Formula image FDA0003358366890000111: the integrated fire passing area estimate, presumably formula (5-20); not recoverable verbatim]
In the formula, C_i is the mean of C_G over the spatial extent of a single meteorological satellite pixel, namely:
C_i = (1/m) Σ_{j=1}^{m} C_Gj ……(5-21)
(reconstructed from the variable definitions; the original formula image FDA0003358366890000112 is not recoverable verbatim)
In the formula, C_Gj is the vegetation coverage of the j-th high-spatial-resolution satellite pixel within the coverage of the meteorological satellite pixel; m is the number of high-spatial-resolution satellite pixels contained in a single meteorological satellite pixel.
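A sketch of formulas (5-19) to (5-21); because the formula image for (5-20) is unrecoverable, the pixel-area weighting by C_i in burned_area_estimate is our reading of that step, labeled as an assumption in the code.

```python
import numpy as np

def vegetation_coverage(ndvi_mix, ndvi_v, ndvi_s):
    """Per-pixel vegetation coverage of the high-resolution satellite (5-19)."""
    return (ndvi_mix - ndvi_s) / (ndvi_v - ndvi_s)

def burned_area_estimate(pixel_areas_m2, coverage_lists):
    """Integrated fire passing area estimate (a sketch of 5-20 / 5-21).

    pixel_areas_m2 : area S_i of each burned meteorological-satellite pixel.
    coverage_lists : for each such pixel, the C_G values of the m
                     high-resolution pixels it contains.
    Assumption: the area is accumulated as sum(S_i * C_i), our reading
    of the unrecoverable formula image (5-20).
    """
    total = 0.0
    for s_i, cov_j in zip(pixel_areas_m2, coverage_lists):
        c_i = float(np.mean(cov_j))  # formula (5-21)
        total += s_i * c_i
    return total
```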
CN202111359168.8A 2021-11-17 2021-11-17 Method for judging and recognizing fire danger by satellite remote sensing Pending CN114112065A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111359168.8A CN114112065A (en) 2021-11-17 2021-11-17 Method for judging and recognizing fire danger by satellite remote sensing


Publications (1)

Publication Number Publication Date
CN114112065A true CN114112065A (en) 2022-03-01

Family

ID=80396946

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111359168.8A Pending CN114112065A (en) 2021-11-17 2021-11-17 Method for judging and recognizing fire danger by satellite remote sensing

Country Status (1)

Country Link
CN (1) CN114112065A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114780665A (en) * 2022-06-21 2022-07-22 环球数科集团有限公司 Digital image processing system of satellite remote sensing information in fire monitoring
CN115615559A (en) * 2022-12-19 2023-01-17 南京信大卫星应用研究院有限公司 Fire disaster state monitoring system based on image information acquisition


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108280812A (en) * 2018-01-23 2018-07-13 中国科学院遥感与数字地球研究所 A kind of excessive fire method for extracting region based on image enhancement
CN108564761A (en) * 2018-05-10 2018-09-21 中南林业科技大学 Forest fires recognition methods based on wind and cloud weather satellite data
CN108717526A (en) * 2018-05-11 2018-10-30 中南林业科技大学 Satellite monitoring forest fires hot spot recognition methods based on AVHRR data
CN112488091A (en) * 2021-02-02 2021-03-12 中科星图股份有限公司 Fire monitoring method and device based on geosynchronous orbit satellite images
CN113340432A (en) * 2021-06-09 2021-09-03 广东电网有限责任公司 Fire monitoring method and system based on stationary meteorological satellite

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
郑伟 (ZHENG WEI): "Method for estimating burned forest area based on multi-source satellite remote sensing data" (基于多源卫星遥感数据的森林过火区面积估算方法), Scientia Silvae Sinicae (《林业科学》), vol. 47, no. 8, pages 192-194 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination