CN111006771A - Method and device for judging and identifying fire point based on polar orbit meteorological satellite - Google Patents


Info

Publication number
CN111006771A
CN111006771A (application CN201911380389.6A)
Authority
CN
China
Prior art keywords
pixel
fire point
fire
candidate
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911380389.6A
Other languages
Chinese (zh)
Inventor
王彤
田翔
黄勇
周恩泽
魏瑞增
朱凌
范亚洲
饶章权
豆朋
刘淑琴
周永言
刘剑锋
向谆
潘君镇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electric Power Research Institute of Guangdong Power Grid Co Ltd
Original Assignee
Electric Power Research Institute of Guangdong Power Grid Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electric Power Research Institute of Guangdong Power Grid Co Ltd
Priority to CN201911380389.6A
Publication of CN111006771A
Legal status: Pending

Classifications

    • G PHYSICS — G01 MEASURING; TESTING — G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0014 Radiation pyrometry for sensing the radiation from gases, flames
    • G01J5/0018 Flames, plasma or welding
    • G01J5/007 Radiation pyrometry for earth observation
    • G01J5/0096 Radiation pyrometry for measuring wires, electrical contacts or electronic systems
    • G01J5/48 Thermography; techniques using wholly visual means
    • G01J5/485 Temperature profile
    • G01J5/80 Calibration
    • G01J5/804 Calibration using atmospheric correction
    • G01J5/806 Calibration by correcting for reflection of the emitter radiation
    • G01J2005/0077 Imaging
    • G01J2005/0092 Temperature by averaging, e.g. by scan

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Engineering & Computer Science (AREA)
  • Plasma & Fusion (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a method and a device for judging and identifying fire points based on a polar-orbit meteorological satellite, comprising the following steps: performing pixel-exclusion processing on the acquired satellite data to obtain pixels to be detected; comparing the brightness temperature of each pixel to be detected with a candidate fire point threshold to obtain candidate fire point pixels; calculating the background brightness temperature of the candidate fire point pixels; if the background brightness temperature is calculated successfully, judging the brightness temperature difference of the candidate fire point to determine the fire point pixels; and if the background calculation is unsuccessful, judging the absolute brightness temperature of the fire point to determine the fire point pixels. By acquiring image data of the detection area with a polar-orbit meteorological satellite and applying a combination of thresholds, the method can identify fire point pixels in a refined manner.

Description

Method and device for judging and identifying fire point based on polar orbit meteorological satellite
Technical Field
The application relates to the technical field of satellite monitoring, in particular to a method and a device for judging and identifying fire points based on polar orbit meteorological satellites.
Background
Energy resources and power loads in China are distributed in opposite directions, so erecting long-distance, large-capacity, high-voltage transmission lines has become a necessary means of guaranteeing balanced economic development. When a mountain fire occurs near a transmission line, factors such as the high temperature, smoke and fly ash it generates sharply reduce the insulation clearance of the line and induce tripping. The sustained high temperature of the fire keeps the dielectric strength of the gap below the line at a low level, making reclosing difficult to succeed. Mountain fires that break out intensively because of human activities such as ancestral sacrifice and field burning around the Spring Festival, the Qingming Festival and late autumn can cause simultaneous or sequential tripping of multiple transmission lines, leading to large-scale power outages. Therefore, transmission-line corridors need real-time mountain-fire monitoring to ensure reliable operation of the power grid.
At present, wide-area monitoring of transmission-line mountain fires relies mainly on satellite remote sensing. However, the resolution of geostationary satellites is generally lower than that of polar-orbit satellites, so small fire points are still missed, and existing mountain-fire monitoring algorithms do not comprehensively consider satellite monitoring characteristics, seasonal characteristics, vegetation indices and similar factors, which makes them poorly adapted to satellite-based mountain-fire monitoring. This scheme therefore proposes a method for monitoring transmission-line mountain fires with a polar-orbit meteorological satellite.
Disclosure of Invention
The embodiment of the application provides a method and a device for judging and identifying fire points based on polar orbit meteorological satellites, so that fire point pixels can be identified in a refined mode.
In view of the above, a first aspect of the present application provides a method for identifying a fire point based on polar orbiting meteorological satellites, the method comprising:
performing pixel elimination processing on the acquired satellite data to obtain a pixel to be detected;
comparing the brightness temperature of the pixel to be detected with a candidate fire point threshold value to obtain a candidate fire point pixel;
calculating the background brightness temperature of the candidate fire point pixels;
if the background brightness temperature is successfully calculated, judging the brightness temperature difference of the candidate fire point to determine the fire point pixel; and if the background calculation is unsuccessful, judging the absolute brightness temperature of the fire point to determine the fire point pixel.
Optionally, before the pixel exclusion processing is performed on the acquired satellite data to obtain the pixel to be detected, the method further includes: preprocessing the satellite data.
Optionally, the pixel exclusion processing is performed on the acquired satellite data, and the obtaining of the pixel to be detected specifically includes:
and removing the water body pixel, the flare area pixel and the cloud pixel in the satellite data to obtain the pixel to be detected.
Optionally, comparing the pixel to be detected with the candidate fire point threshold to obtain the candidate fire point pixel specifically comprises:
comparing the brightness temperature of the pixel to be detected with the candidate fire point threshold to obtain the candidate fire point pixel, wherein the candidate fire point threshold comprises a daytime brightness temperature threshold and a nighttime brightness temperature threshold.
Optionally, judging the brightness temperature difference of the candidate fire point to determine the fire point pixel specifically comprises:
obtaining a brightness temperature difference threshold table from historical statistical data, and comparing the brightness temperature of the candidate fire point with the thresholds in the table to determine the fire point pixel.
Optionally, calculating neighborhood pixels of the candidate fire points, and excluding cloud area pixels, water body pixels and suspected pixels in the neighborhood pixels;
and selecting at least 6 pixels in the neighborhood pixels to calculate the background brightness temperature.
Optionally, after judging the brightness temperature difference of the candidate fire point (when the background brightness temperature is successfully calculated) or the absolute brightness temperature of the fire point (when the background calculation is unsuccessful) to determine the fire point pixel, the method further comprises:
and determining the position of the fire point according to the longitude and latitude coordinates matched with the satellite data corresponding to the fire point pixel.
The second aspect of the present application provides a device for judging and identifying fire points based on polar orbit meteorological satellites, the device includes:
the pixel removing unit is used for performing pixel removing processing on the acquired satellite data to obtain a pixel to be detected;
the first pixel selection unit is used for comparing the brightness temperature of the pixel to be detected with a candidate fire point threshold value to obtain a candidate fire point pixel;
the background brightness temperature calculation unit is used for calculating the background brightness temperature of the candidate fire point pixels;
the fire point pixel determining unit is used for judging the brightness temperature difference of the candidate fire point if the background brightness temperature is successfully calculated, so as to determine the fire point pixel; and, if the background calculation is unsuccessful, judging the absolute brightness temperature of the fire point, so as to determine the fire point pixel.
Optionally, the system further comprises a preprocessing module, wherein the preprocessing module is used for preprocessing the satellite data.
Optionally, the fire monitoring system further comprises a position determining module, which determines the position of the fire point according to the longitude and latitude coordinates matched with the satellite data corresponding to the fire point pixel.
According to the technical scheme, the embodiment of the application has the following advantages:
the embodiment of the application provides a method for judging and identifying fire points based on polar orbit meteorological satellites, which comprises the steps of carrying out pixel elimination processing on acquired satellite data to obtain pixels to be detected; comparing the brightness temperature of the pixel to be detected with a candidate fire point threshold value to obtain a candidate fire point pixel; calculating the background brightness temperature of the candidate fire point pixels; if the background brightness temperature is successfully calculated, judging the brightness temperature difference of the candidate fire points, and determining fire point pixels; and if the background calculation is unsuccessful, judging the absolute lighting temperature of the fire point, thereby determining the fire point pixel.
According to the method, the polar orbit meteorological satellite is adopted to acquire the image data of the detection area, and a threshold value combination method is adopted, so that the fire point pixel can be identified finely.
Drawings
FIG. 1 is a flowchart of a method of identifying a fire based on polar orbiting meteorological satellites according to one embodiment of the present application;
FIG. 2 is a flowchart of a method of identifying a fire based on polar orbiting meteorological satellites according to another embodiment of the present application;
FIG. 3 is a block diagram of an embodiment of the apparatus for determining a fire based on polar orbiting meteorological satellites according to the present application;
FIG. 4 is a flow chart illustrating cloud detection in an embodiment of a method for identifying a fire based on polar orbiting meteorological satellites;
FIG. 5 is a schematic diagram of the seasonal variation of the highest and lowest values of the mid-infrared channel fire point brightness temperature and the background brightness temperature in the present application;
FIG. 6 is a schematic diagram of the seasonal variation of the highest and lowest values of the far-infrared channel fire point brightness temperature and the background brightness temperature.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
For easy understanding, please refer to fig. 1, which is a flowchart illustrating a method for judging and identifying fire points based on a polar-orbit meteorological satellite according to an embodiment of the present application. As shown in fig. 1, the method includes:
101. and carrying out pixel elimination processing on the acquired satellite data to obtain a pixel to be detected.
It should be noted that the pixel-exclusion processing on the acquired satellite data may comprise excluding water body pixels, sun-glint (flare) area pixels and cloud pixels; the pixels to be detected are obtained after these interfering pixels are removed.
102. And comparing the brightness temperature of the pixel to be detected with the candidate fire point threshold value to obtain the candidate fire point pixel.
It should be noted that the candidate fire point thresholds may be derived statistically from data differentiated by region, season and day/night, so that a corresponding candidate fire point threshold is obtained for each region, season and time of day. The pixel to be detected is then compared with the candidate fire point threshold; for example, if the brightness temperature of the pixel to be detected is greater than the candidate fire point threshold, that pixel is taken as a candidate fire point pixel, completing the rough selection of fire point pixels.
103. And calculating the background brightness temperature of the candidate fire point pixels.
It should be noted that, once a pixel is determined to be a candidate fire point, its background brightness temperature is calculated. During the calculation, cloud, water body and suspected fire point pixels must first be excluded from the neighborhood of the pixel, and at least 6 pixels must remain in the selected neighborhood for calculating the background temperature. If this condition is not met, the window is expanded to 9 × 9, 11 × 11, ..., 19 × 19; if the condition is still not met, the background brightness temperature calculation is deemed unsuccessful.
104. If the background brightness temperature is successfully calculated, judging the brightness temperature difference of the candidate fire points, and determining fire point pixels; and if the background calculation is unsuccessful, judging the absolute lighting temperature of the fire point, thereby determining the fire point pixel.
It should be noted that, if the background brightness temperature of the candidate fire point pixel is successfully obtained, the difference between the infrared-channel brightness temperature measured by the polar-orbit meteorological satellite and the background brightness temperature can be evaluated against the statistical data for the region, and whether the candidate pixel is a fire point pixel is judged from this difference. If the background brightness temperature of the candidate fire point pixel is not obtained, an absolute fire point threshold can be derived directly from statistics differentiated by region, season and day/night; when the infrared-channel brightness temperature of the polar-orbit meteorological satellite is greater than the absolute fire point brightness temperature threshold, the pixel is judged to be a fire point pixel, completing the fine selection.
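As a minimal sketch, the coarse-then-fine decision described above reduces to a two-branch test; the threshold values used here are illustrative placeholders, since the patent derives them from regional, seasonal and day/night statistics:

```python
def classify_fire_pixel(t_mid, t_bg=None,
                        diff_threshold=12.0, abs_threshold=335.0):
    """Decide whether a candidate fire pixel is a fire pixel.

    t_mid: mid-infrared channel brightness temperature of the candidate (K).
    t_bg:  background brightness temperature (K), or None when the
           background calculation failed.
    diff_threshold / abs_threshold are placeholder values; the patent
    obtains them from statistics by region, season and day/night.
    """
    if t_bg is not None:
        # Background available: brightness-temperature-difference judgment.
        return (t_mid - t_bg) > diff_threshold
    # Background unavailable: absolute brightness-temperature judgment.
    return t_mid > abs_threshold
```
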
According to the method for judging and identifying the fire point based on the polar orbit meteorological satellite, the polar orbit meteorological satellite is adopted to acquire image data of a detection area, and the method of combining threshold values is adopted to complete primary selection and fine selection on the pixel to be detected, so that the fire point pixel can be identified finely.
For detailed understanding of the present solution, the present application further provides a specific embodiment of a method for identifying a fire point based on polar orbiting meteorological satellites, which may specifically refer to fig. 2, and includes:
201. preprocessing the satellite data.
It should be noted that preprocessing the satellite data covers data parsing, radiometric correction, positioning correction, enhancement synthesis and projection conversion, which facilitates subsequent operations such as automatic identification of cloud pixels and fire point pixels, pixel fire area extraction and smoke plume identification.
Data parsing means obtaining the satellite data for a specified time and reading the data sets of the channels used for cloud detection and fire point identification; the satellite data are, respectively, 1 km resolution data from the FY-3 satellite VIRR (Visible and Infrared Radiometer) sensor and 1 km resolution data from the FY-4 satellite.
Data radiometric correction: radiometric correction of remote sensing satellite data comprises radiometric calibration and atmospheric correction. Radiometric calibration computes calibration coefficients from the geometric parameters of a calibration site and performs error analysis; atmospheric correction removes the radiometric error introduced by the atmosphere from the total radiance measured by the sensor.
Data positioning correction: the positioning correction method adopted in this application is GLT geometric correction. When a polar-orbit satellite detector scans the earth, the relatively large scanning zenith angle (±55.38°) causes local positioning errors in the data. In addition, local positioning errors caused by factors such as scan variation of the on-board sensor, small changes in the satellite attitude angle, and higher-order perturbation forces affecting the satellite orbit are random and transient in character. Geometric fine correction establishes, under a given error criterion, a geometric transformation between the reference image space and the image space to be corrected, based on ground control points (GCPs) or ground control blocks (GCBs). In this scheme, geometric fine correction is realized by selecting a sufficient number of high-precision GCBs from high-resolution reference images (1:1,000,000 digital maps and Landsat TM satellite images) and establishing, under a least-squares criterion, a geometric transformation between the reference high-resolution image space and the polar-orbit satellite image space using a selectable correction polynomial (such as a bivariate first-order or bivariate second-order polynomial). The number of GCBs required depends on the polynomial selected.
To ensure the positioning accuracy of the selected GCB feature-block pairs, an image block in a high-resolution 1:1,000,000 digital map or Landsat TM satellite image is used as the reference block of a GCB; an automatic matching method searches the satellite image for the block with maximum correlation to the reference block, and the center coordinates of the reference block and of its matched block are taken as the accurate position and the computed position of the GCB pair, respectively. GCBs selected in this way exploit both the high positioning accuracy of the high-resolution reference image and the area and structural information contained in the blocks; they have good noise and interference resistance, are clearly superior to GCPs selected manually from feature-point information, and allow the geometric fine correction to stably reach sub-pixel positioning accuracy.
To improve the matching precision of GCB feature-block pairs between two images of different resolutions, this scheme is further improved in two respects:
1) For the two high-resolution references, the 1:1,000,000 digital map and the Landsat image, grey-level image matching and edge-image matching were compared, and the more adaptable edge-image matching was selected.
2) The image is interpolated and refined to obtain a series of image blocks at improved resolution, which are then precisely registered with the high-resolution reference block at that improved nominal resolution.
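The least-squares polynomial fit behind the geometric fine correction can be sketched as follows for the bivariate second-order case (the function names and the exact choice of polynomial terms are illustrative, not taken from the patent):

```python
import numpy as np

def _design_matrix(pts):
    """Bivariate second-order polynomial terms for each (x, y) point."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_quadratic_mapping(src_pts, dst_pts):
    """Least-squares fit of the transformation from GCB positions in the
    satellite image (src) to positions in the reference image (dst)."""
    coeffs, *_ = np.linalg.lstsq(_design_matrix(src_pts), dst_pts, rcond=None)
    return coeffs  # shape (6, 2): one coefficient column per output coordinate

def apply_mapping(coeffs, pts):
    """Map points through the fitted polynomial transformation."""
    return _design_matrix(pts) @ coeffs
```

With at least six well-distributed control blocks the second-order model is fully determined; additional blocks over-determine it, and the least-squares solution then averages out the random positioning errors mentioned above.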
Data enhancement synthesis processing: in order to avoid the satellite remote sensing data from being too bright or too dark, histogram processing is carried out on the data, and the histogram is adjusted to be normal distribution for the satellite image of which the histogram is in non-normal distribution so as to improve the quality of the image.
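A rank-based way to pull an image histogram toward a normal distribution is sketched below; the target mean and standard deviation are illustrative placeholders, not values from the application:

```python
import numpy as np
from scipy.stats import norm

def match_to_gaussian(img, mean=0.5, std=0.15):
    """Histogram-match image grey levels to a normal distribution.

    Each pixel is replaced by the Gaussian quantile of its empirical
    CDF position, so the output histogram approximates N(mean, std^2).
    """
    flat = img.ravel()
    ranks = flat.argsort().argsort()       # rank of each pixel value
    u = (ranks + 0.5) / flat.size          # empirical CDF position in (0, 1)
    return (norm.ppf(u) * std + mean).reshape(img.shape)
```
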
Data projection conversion: in order to facilitate data matching and subsequent positioning processing in the fire point identification operation process, projection conversion needs to be performed on satellite data and auxiliary data (earth surface type data), and common projection is converted into equal longitude and latitude projection.
Because the geographic positions corresponding to each revisit of a polar-orbit satellite differ, the equal latitude-longitude projection of polar-orbit satellite data is computed with a dynamic Delaunay triangulation method; the interpolation principle of the Delaunay triangulation is linear interpolation of any point in the area based on the observed values at the vertices of each triangle in the network.
Let the three vertices of a triangle be A, B and C, indexed 1, 2 and 3, with longitudes m1, m2, m3 and latitudes n1, n2, n3, and let the observed values be f(A), f(B) and f(C). Any point x, with longitude mx and latitude nx, has a unique barycentric representation with respect to this triangle:
x = μA + θB + ωC
and the linear interpolation can be expressed as:
f(x) = μf(A) + θf(B) + ωf(C)
μ = [mx(n3 - n2) + m2(nx - n3) + m3(n2 - nx)] / g
θ = [m1(n3 - nx) + mx(n1 - n3) + m3(nx - n1)] / g
ω = [m1(nx - n2) + m2(n1 - nx) + mx(n2 - n1)] / g
g = m1(n3 - n2) + m2(n1 - n3) + m3(n2 - n1)
The value f(x) at any position x inside the triangle is thus obtained from the above formulas and the observed values at the three vertices.
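A minimal sketch of this single-triangle linear interpolation, with variable names following the text (m for longitude, n for latitude):

```python
def barycentric_interpolate(vertices, values, point):
    """Linearly interpolate inside one Delaunay triangle.

    vertices: [(m1, n1), (m2, n2), (m3, n3)] longitude/latitude of A, B, C.
    values:   (f(A), f(B), f(C)) observations at the vertices.
    point:    (mx, nx) longitude/latitude of the query point x.
    """
    (m1, n1), (m2, n2), (m3, n3) = vertices
    mx, nx = point
    g = m1 * (n3 - n2) + m2 * (n1 - n3) + m3 * (n2 - n1)
    # Barycentric weights of x with respect to A, B and C.
    mu = (mx * (n3 - n2) + m2 * (nx - n3) + m3 * (n2 - nx)) / g
    theta = (m1 * (n3 - nx) + mx * (n1 - n3) + m3 * (nx - n1)) / g
    omega = (m1 * (nx - n2) + m2 * (n1 - nx) + mx * (n2 - n1)) / g
    fa, fb, fc = values
    return mu * fa + theta * fb + omega * fc
```
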
The Delaunay triangulation interpolation method comprises the following specific steps:
Step 1: determine the target projection range, i.e. the projection latitude-longitude range, from the geographic area covered by the data.
Step 2: obtain the longitude and latitude corresponding to the satellite data point set.
Step 3: construct the Delaunay triangulation from those longitudes and latitudes.
Step 4: build the equal latitude-longitude grid, generating equally spaced grid points from the latitude-longitude range determined in step 1 and the target resolution.
Step 5: equal latitude-longitude grid interpolation: using the Delaunay triangulation generated in step 3, interpolate the stitched two-dimensional data point set onto the equal latitude-longitude grid.
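Steps 3 to 5 can be sketched with SciPy, whose LinearNDInterpolator performs exactly this Delaunay-based linear interpolation (the function and variable names are illustrative):

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

def resample_to_latlon_grid(lon, lat, values, grid_lon, grid_lat):
    """Interpolate scattered satellite pixels onto an equal lat/lon grid.

    lon, lat, values: 2-D arrays of per-pixel longitude, latitude and data.
    grid_lon, grid_lat: 1-D arrays defining the target grid axes.
    """
    pts = np.column_stack([lon.ravel(), lat.ravel()])
    interp = LinearNDInterpolator(pts, values.ravel())  # builds a Delaunay triangulation
    glon, glat = np.meshgrid(grid_lon, grid_lat)
    return interp(glon, glat)  # NaN outside the convex hull of the inputs
```
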
202. And carrying out pixel elimination on the acquired satellite data, wherein the eliminated pixels comprise a water body pixel, a flare area pixel and a cloud pixel, so as to obtain the pixel to be detected.
It should be noted that fire point identification relies on the high sensitivity of the mid-infrared channel to fire point thermal radiation. During the daytime, however, the mid-infrared channel receives not only the thermal radiation of the underlying surface but also its reflection; when a water body forms a certain angle with the sun, a glint (flare) occurs and the water body may be misidentified as a fire point, so the influence of water body pixels must be excluded during fire point identification. Water body pixels are generally excluded with the aid of surface-type data; they can also be identified from the visible and near-infrared bands, a pixel being judged as water when it simultaneously satisfies the following conditions:
[water-body discrimination inequalities not reproduced in the source]
wherein R1, R2 and R7 respectively represent the reflectivity values of channels of 0.62-0.67 μm, 0.841-0.876 μm and 2.105-2.155 μm.
The cloud detection method adopted comprises the following steps:
Step 1: generate a clear-sky composite map using an optimized ISCCP (International Satellite Cloud Climatology Project) method. On the basis of the acquired satellite data, judge the region and season to which each data set belongs, dynamically produce the corresponding clear-sky composite maps according to Table 1, remove snow and sea ice, and then perform cloud detection.
TABLE 1 Clear-sky surface composite maps
[Table 1 not reproduced in the source]
Step 2: multichannel threshold method. From its beginnings to the present, cloud detection, whether a single algorithm or a combination of several, has relied on the multichannel threshold method, which is also the algorithm commonly adopted by satellite-data processing departments at home and abroad; approaches differ only in the optimal channel combination and the chosen thresholds. For the 7 selected study areas, a multi-threshold algorithm suited to the specific conditions of China is provided according to the different satellite data and the requirements of the optimized channels. The specific judgment is: a daytime pixel is considered thick cloud when it satisfies one of the following conditions, and a nighttime pixel is considered cloud if its channel-32 brightness temperature satisfies T32 < 265 K.
R1 + R2 > 0.9
T32 < 265 K
[third daytime condition not reproduced in the source]
wherein R1, R2 and T32 denote, respectively, the reflectance of the 0.62-0.67 μm and 0.841-0.876 μm channels and the brightness temperature of the 11.77-12.27 μm channel.
And step 3: space texture monitoring based on threshold method of climate SST (sea surface water temperature) data
And respectively calculating the sea surface temperature of each satellite data by applying the existing mature sea temperature inversion method, and if the obtained sea surface temperature is lower than the climate SST data by a certain value, considering the pixel as cloud.
And (3) spatial texture detection: because the texture of the ocean surface is uniform, once small-scale cloud deposits or broken clouds pollute, the texture of the ocean surface is obviously changed, and the space texture detection has a good effect on the small-scale cloud deposits or broken clouds on the sea. For each pixel, the standard deviation of its surrounding 8 pixels, denoted by S, is calculated, for example as follows:
S(T11.0μm)>0.6K;
S(T11.0μm-T3.9μm)>0.1K (0.4K at night);
S(R0.7μm)>(0.8%+0.03R0.7th)。
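The statistic S over the 8 surrounding pixels can be computed as follows. A minimal Python sketch assuming a 2-D brightness-temperature (or reflectance) array; the loop-based form is for clarity and is not the patent's implementation:

```python
import numpy as np

def texture_std(field):
    """For each interior pixel, the standard deviation S of its 8
    neighbours (centre pixel excluded), as used by the spatial-texture
    cloud tests above. Edge pixels are left as NaN."""
    h, w = field.shape
    s = np.full((h, w), np.nan)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = field[i - 1:i + 2, j - 1:j + 2].ravel()
            neigh = np.delete(win, 4)   # drop the centre pixel
            s[i, j] = neigh.std()
    return s

# Example test from the text: flag sea pixels where
# S(T11.0um) > 0.6 K as cloud-contaminated.
```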
Step 4: solar flare area identification:
When the solar reflection angle is 0-36 degrees, solar flare area processing is needed. The flare area does not show up in the infrared channel data but appears as a large abrupt change in the visible channel data; combining the infrared and visible channel data allows the solar flare area to be distinguished well.
The comprehensive optimization flow of cloud detection is shown in fig. 4.
203. Comparing the brightness temperature of the pixel to be detected with the candidate fire point threshold value to obtain candidate fire point pixels.
204. Calculating the background brightness temperature of the candidate fire point pixels.
It should be noted that the background temperature calculation directly affects identification accuracy. For a densely vegetation-covered area with a single underlying surface, the average brightness temperature of the adjacent pixels is well representative of the pixel being identified:
Tibg = (1/n) Σ Ti (summed over the n adjacent pixels)
where Tibg is the background brightness temperature of the i μm channel, Ti is the i μm channel brightness temperature of a pixel adjacent to the pixel being judged, and n is the number of adjacent pixels.
When calculating the background brightness temperature, the influence of cloud areas, water bodies and suspected fire point pixels must be removed: before the average temperature is calculated, the cloud areas, water bodies and suspected fire point pixels in the neighbourhood are excluded, and only clear-sky pixels are used. If, after removing cloud, water and high-temperature pixels, too few pixels remain (e.g. fewer than 6), the neighbourhood window is enlarged step by step from 9 × 9 to 11 × 11, ..., up to 19 × 19; if the condition is still not met, judgment of that pixel is abandoned.
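The expanding-window background calculation described above can be sketched as follows, assuming a 2-D brightness-temperature array and a boolean exclusion mask covering cloud, water and suspected fire pixels; the function name and array handling are illustrative:

```python
import numpy as np

def background_bt(bt, mask, i, j):
    """Mean brightness temperature of valid neighbours of pixel (i, j),
    expanding the window from 9x9 up to 19x19 until at least 6 clear-sky
    pixels remain; returns None if the condition is never met.

    bt   -- 2-D brightness-temperature array (K)
    mask -- True for pixels to exclude (cloud, water, suspected fire)
    """
    h, w = bt.shape
    for size in range(9, 21, 2):              # 9x9, 11x11, ..., 19x19
        r = size // 2
        i0, i1 = max(0, i - r), min(h, i + r + 1)
        j0, j1 = max(0, j - r), min(w, j + r + 1)
        win = bt[i0:i1, j0:j1]
        ok = ~mask[i0:i1, j0:j1]
        ok[i - i0, j - j0] = False            # exclude the pixel itself
        vals = win[ok]
        if vals.size >= 6:                    # enough clear-sky neighbours
            return float(vals.mean())
    return None                               # give up on this pixel
```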
205. If the background brightness temperature is successfully calculated, judging the brightness temperature difference of the candidate fire point, thereby determining the fire point pixel.
It should be noted that if the background brightness temperature of a candidate fire point pixel is successfully obtained, the difference between the brightness temperature of the polar orbit meteorological satellite's infrared channel and the background brightness temperature can be evaluated, region by region, against the statistical data of that region, and whether the candidate fire point pixel is a fire point pixel is judged according to that difference.
206. If the background calculation is unsuccessful, judging the absolute fire point brightness temperature, thereby determining the fire point pixel.
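Steps 205 and 206 together form a simple two-branch decision. A hedged Python sketch, with the threshold values left as parameters because the actual values come from the regional statistical tables discussed below; names are illustrative:

```python
def classify_fire(t4, t4bg, dt4_threshold, abs_threshold):
    """Two-branch fire decision (steps 205-206):

    - background available: test the mid-infrared difference
      T4 - T4bg against the regional difference threshold;
    - background unavailable: fall back to the absolute
      fire-point brightness-temperature threshold.
    """
    if t4bg is not None:
        return t4 - t4bg > dt4_threshold   # brightness-temperature-difference test
    return t4 > abs_threshold              # absolute brightness-temperature test
```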
It should be noted that because surface vegetation types differ between regions, combustible loads are unevenly distributed and the terrain undulates, fire points cannot be identified with a single fixed threshold. The present application therefore collects typical fire point data of a given region as analysis samples and uses FY3B data to calculate the mid-infrared and far-infrared brightness temperatures and the background brightness temperature of each fire point, so as to derive fire point identification thresholds suitable for different seasons and different parts of the Guangdong region. Fire point identification depends on the fire point brightness temperature and the background temperature; the background temperature differs between seasons at the same geographical position and between positions in the same period, and the ground vegetation types and combustible loads of different underlying surfaces correspond to different fire point and mountain fire intensities. To improve the accuracy of local mountain fire identification, the relationship between fire point brightness temperature and region, climate and cloud coverage is therefore analysed, in combination with the climate and environmental characteristics of the region, to obtain a local mountain fire identification threshold.
In addition, classification analysis is carried out by time, region and underlying-surface information. Based on fire point data for 2016-2018, 49 representative fire points in total were extracted by grouping similar cases; the mid-infrared and far-infrared brightness temperatures of the FY3B satellite were matched, and the brightness temperatures (T4, T11) and background brightness temperatures (T4bg, T11bg) of the corresponding fire points were calculated, with values as in Table 2 below:
TABLE 2 Typical fire point data set information for the region, 2016-2018
[Table 2 appears as images in the original (BDA0002342066950000111, BDA0002342066950000121); not reproduced]
The value ranges of the fire point brightness temperature and the background brightness temperature in different seasons were counted, and statistical charts were obtained, as shown in fig. 5 and fig. 6, in which T4, T11, T4bg and T11bg respectively denote the mid-infrared brightness temperature, the far-infrared brightness temperature, the mid-infrared background brightness temperature and the far-infrared background brightness temperature.
The charts show that the mid-infrared channel brightness temperature (T4) falls in the interval 314-356K and the background brightness temperature (T4bg) in 301-323K, while in the far-infrared both fire point pixels and the surrounding background pixels have brightness temperatures between 283-300K, with no obvious change. The background temperature also differs greatly between seasons: from November to February of the following year the maximum mid-infrared background brightness temperature is 307K, whereas from March to May it reaches 323K; that is, the background brightness temperature is higher in spring than in winter and intermediate during the summer flood season, with an average of about 310K.
Because the maximum temperature of the mid-infrared channel is directly related to fire intensity, the mid-infrared parameter is the key parameter for fire point identification, while the far-infrared channel brightness temperature can be treated as a secondary variable, i.e. one considered when eliminating fire point misjudgments.
Based on the statistics of the mid-infrared brightness temperatures, fire point identification experiments were run between the minimum and maximum values, and a candidate fire point threshold table and an absolute fire point threshold table (Tables 3 and 4) were finally established from the candidate and absolute fire point identification conditions in different seasons. The candidate fire point threshold table gives the minimum mid-infrared brightness temperature at which a pixel can enter candidate fire point identification; the absolute fire point threshold table gives the minimum mid-infrared brightness temperature at which a fire point can be identified directly even when the other conditions are not met. The night-time satellite infrared brightness temperature is 4-7K lower than in the daytime, 5K lower on average.
TABLE 3 fire point candidate threshold table
[Table 3 appears as an image in the original (BDA0002342066950000131); not reproduced]
TABLE 4 Absolute ignition threshold Table
[Table 4 appears as an image in the original (BDA0002342066950000132); not reproduced]
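A threshold lookup with the ~5K night offset noted above might look like the following sketch. The seasonal values in the dictionary are placeholders, since Tables 3 and 4 are only reproduced as images in the original:

```python
# Hypothetical seasonal candidate fire-point thresholds (K); the real
# values come from Tables 3 and 4, which are not reproduced here.
CANDIDATE_T4 = {"winter": 312.0, "spring": 318.0,
                "summer": 320.0, "autumn": 315.0}
NIGHT_OFFSET = 5.0   # night brightness temperatures average ~5 K lower

def candidate_threshold(season, is_day):
    """Seasonal candidate fire-point threshold, lowered by the average
    5 K day/night offset for night-time passes."""
    t = CANDIDATE_T4[season]
    return t if is_day else t - NIGHT_OFFSET
```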
From the distributions of the highest and lowest mid-infrared and far-infrared channel brightness temperatures in each region, the largest and smallest brightness temperature differences can be obtained; the far-infrared differences are small in every region and do not directly affect fire point identification. A brightness temperature difference threshold table for the different regions can therefore be obtained:
TABLE 5 Bright temperature difference threshold table
[Table 5 appears as an image in the original (BDA0002342066950000141); not reproduced]
It should be further noted that current satellite remote-sensing fire point monitoring is mainly based on infrared remote sensing using a pixel spatial brightness temperature difference method, and differences in underlying-surface type attributes strongly affect judgment accuracy. Cloud is the most common and most variable underlying-surface element, and its influence on fire point identification cannot be neglected. The influence of cloud layers on fire point identification mainly comprises the following aspects:
When the cloud layer is thick: the energy of a ground fire point cannot penetrate the cloud layer, the signal is cut off, and satellite remote sensing cannot monitor fire points beneath the cloud.
When the cloud layer is relatively thin: the signal can penetrate the cloud, but the mid-infrared brightness temperature is reduced and the original threshold no longer applies. In this case the fire point threshold must be lowered, generally by 10-15K depending on how transmissive the thin cloud is.
At cloud edges: the influence lies mainly in misjudgment of surrounding non-cloud pixels and misjudgment caused by anomalous reflection from cloud pixels. Surrounding non-cloud pixels are misjudged mainly because the presence of cloud lowers pixel brightness temperatures: at the cloud edge, cloud and land form mixed pixels that are easily counted as background pixels, which lowers the background brightness temperature so that surrounding non-cloud pixels are mistaken for fire points. Cloud pixels are misjudged because the signal of the mid-infrared channel is not only emitted but also reflected: when a cloud pixel and the satellite form a specular reflection, a large amount of energy enters the satellite sensor and the cloud pixel's brightness temperature becomes anomalously high, causing misjudgment. The usual solution is to tighten the threshold, i.e. to increase the value of T4-T4bg; when a cloud edge is present, the original threshold can be increased by a factor of 1.2-1.8.
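The thin-cloud and cloud-edge adjustments can be sketched as one helper. The 12.5K reduction and 1.5x factor are midpoints of the 10-15K and 1.2-1.8x ranges given above, chosen for illustration only:

```python
def adjust_threshold(dt_threshold, cloud_state):
    """Adjust the T4 - T4bg fire threshold for cloud conditions:

    - thin cloud: lower the threshold (midpoint 12.5 K of the
      10-15 K range used here);
    - cloud edge: raise it (midpoint factor 1.5 of the 1.2-1.8
      range used here);
    - otherwise: unchanged. Thick cloud blocks the signal entirely,
      so no threshold helps there.
    """
    if cloud_state == "thin_cloud":
        return dt_threshold - 12.5
    if cloud_state == "cloud_edge":
        return dt_threshold * 1.5
    return dt_threshold
```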
207. Determining the position of the fire point according to the longitude and latitude coordinates matched with the satellite data corresponding to the fire point pixel.
208. Outputting the fire point position information in text form.
In this method, a polar orbit meteorological satellite is used to acquire image data of the detection area, and factors such as satellite monitoring characteristics, seasonal characteristics and vegetation indexes are integrated to judge fire points precisely. The technology is further improved in satellite data processing, coordinate conversion, the comprehensive cloud detection algorithm, and the combination of the fire point identification flow with thresholds, so that fire point pixels can be identified in a refined manner.
The above is an embodiment of the method of the present application, and the present application further provides an embodiment of a device for determining and identifying a fire point based on a polar orbiting meteorological satellite, as shown in fig. 3, including:
and the pixel removing unit 301 is configured to perform pixel removing processing on the acquired satellite data to obtain a pixel to be detected.
And the pixel selection unit 302 is configured to compare the brightness temperature of the pixel to be detected with a candidate fire point threshold value to obtain a candidate fire point pixel.
And a background brightness temperature calculation unit 303, configured to calculate a background brightness temperature of the candidate fire point pixel.
And a fire point pixel determining unit 304, configured to judge the brightness temperature difference of the candidate fire point if the background brightness temperature is successfully calculated, thereby determining the fire point pixel; and to judge the absolute fire point brightness temperature if the background calculation is unsuccessful, thereby determining the fire point pixel.
In one embodiment, the device further comprises:
and the preprocessing module is used for preprocessing the satellite data.
And the position determining module is used for determining the position of the fire point according to the longitude and latitude coordinates matched with the satellite data corresponding to the fire point pixel.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The terms "first," "second," "third," "fourth," and the like in the description of the application and the above-described figures, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" for describing an association relationship of associated objects, indicating that there may be three relationships, e.g., "a and/or B" may indicate: only A, only B and both A and B are present, wherein A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single item(s) or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (10)

1. A method for judging and identifying fire points based on polar orbit meteorological satellites is characterized by comprising the following steps:
performing pixel elimination processing on the acquired satellite data to obtain a pixel to be detected;
comparing the brightness temperature of the pixel to be detected with a candidate fire point threshold value to obtain a candidate fire point pixel;
calculating the background brightness temperature of the candidate fire point pixels;
if the background brightness temperature is successfully calculated, judging the brightness temperature difference of the candidate fire point, thereby determining a fire point pixel; and if the background calculation is unsuccessful, judging the absolute fire point brightness temperature, thereby determining the fire point pixel.
2. The method for judging and identifying fire points based on polar-orbit meteorological satellites according to claim 1, wherein before performing the pixel elimination processing on the acquired satellite data to obtain the pixel to be detected, the method further comprises:
preprocessing the satellite data.
3. The method for judging and identifying fire points based on polar-orbit meteorological satellites according to claim 1, wherein the pixel elimination processing is performed on the acquired satellite data to obtain the pixel to be detected specifically as follows:
and removing the water body pixel, the flare area pixel and the cloud pixel in the satellite data to obtain the pixel to be detected.
4. The method for judging and identifying fire points based on polar orbiting meteorological satellites according to claim 1, wherein comparing the brightness temperature of the pixel to be detected with a candidate fire point threshold value to obtain candidate fire point pixels specifically comprises:
comparing the brightness temperature of the pixel to be detected with the candidate fire point threshold value to obtain the candidate fire point pixel, wherein the candidate fire point threshold value comprises a daytime brightness temperature threshold value and a nighttime brightness temperature threshold value.
5. The method for judging and identifying fire points based on a polar orbiting meteorological satellite according to claim 1, wherein judging the brightness temperature difference of the candidate fire point, so as to determine the fire point pixel, is specifically:
obtaining a brightness temperature difference threshold value table according to historical statistical data, and comparing the brightness temperature difference of the candidate fire point with the corresponding threshold value in the brightness temperature difference threshold value table to determine the fire point pixel.
6. The method for judging and identifying fire points based on polar orbiting meteorological satellites according to claim 1, wherein the calculating of the background brightness temperature of the candidate fire point pixels specifically comprises:
calculating neighborhood pixels of the candidate fire point, and eliminating cloud area pixels, water body pixels and suspected fire point pixels from the neighborhood pixels;
and selecting at least 6 pixels in the neighborhood pixels to calculate the background brightness temperature.
7. The method for identifying a fire point based on a polar orbiting meteorological satellite according to claim 1, wherein after judging the brightness temperature difference of the candidate fire point if the background brightness temperature is successfully calculated, or judging the absolute fire point brightness temperature if the background calculation is unsuccessful, thereby determining the fire point pixel, the method further comprises:
and determining the position of the fire point according to the longitude and latitude coordinates matched with the satellite data corresponding to the fire point pixel.
8. A device for judging and identifying a fire point based on a polar orbit meteorological satellite, characterized by comprising:
the pixel removing unit is used for performing pixel removing processing on the acquired satellite data to obtain a pixel to be detected;
the first pixel selection unit is used for comparing the brightness temperature of the pixel to be detected with a candidate fire point threshold value to obtain a candidate fire point pixel;
the background brightness temperature calculation unit is used for calculating the background brightness temperature of the candidate fire point pixels;
the fire point pixel determining unit is used for judging the brightness temperature difference of the candidate fire point if the background brightness temperature is successfully calculated, thereby determining a fire point pixel; and for judging the absolute fire point brightness temperature if the background calculation is unsuccessful, thereby determining the fire point pixel.
9. The polar orbiting meteorological satellite based fire point identification apparatus according to claim 8, further comprising:
a pre-processing module for pre-processing the satellite data.
10. The polar orbiting meteorological satellite based fire point identification apparatus according to claim 8, further comprising:
and the position determining module is used for determining the position of the fire point according to the longitude and latitude coordinates matched with the satellite data corresponding to the fire point pixel.
CN201911380389.6A 2019-12-27 2019-12-27 Method and device for judging and identifying fire point based on polar orbit meteorological satellite Pending CN111006771A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911380389.6A CN111006771A (en) 2019-12-27 2019-12-27 Method and device for judging and identifying fire point based on polar orbit meteorological satellite


Publications (1)

Publication Number Publication Date
CN111006771A true CN111006771A (en) 2020-04-14

Family

ID=70119113


Country Status (1)

Country Link
CN (1) CN111006771A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193093A (en) * 2010-03-15 2011-09-21 北京师范大学 System and method for detecting small burning spots of forest or grassland fires by using environmental minisatellite HJ
CN104269012A (en) * 2014-09-28 2015-01-07 浙江大学 Method for monitoring mountain fire nearby electric transmission line based on MODIS data
CN106503480A (en) * 2016-12-14 2017-03-15 中国科学院遥感与数字地球研究所 A kind of fixed statellite fire remote-sensing monitoring method


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Zhou Yi et al., "Automatic extraction method of fire point information based on MODIS data", Journal of Natural Disasters *
Jiang Quanyuan et al., "Mountain fire monitoring method for power transmission lines based on probability statistics", High Voltage Engineering *
Wang Zhao et al., "Research and evaluation of a surface fire monitoring algorithm for new-generation polar-orbit meteorological satellite FY3A-VIRR data", Fire Safety Science *
Wei Yingce et al., "Identification of straw-burning fire points in Hebei Province using an improved MODIS fire detection algorithm", GNSS World of China *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111783560A (en) * 2020-06-12 2020-10-16 云南电网有限责任公司电力科学研究院 Multi-parameter fused power grid forest fire secondary discrimination method and device
CN111815664A (en) * 2020-07-08 2020-10-23 云南电网有限责任公司电力科学研究院 Fire point detection method and system
CN111815664B (en) * 2020-07-08 2023-10-17 云南电网有限责任公司电力科学研究院 Fire point detection method and system
CN112113913A (en) * 2020-08-30 2020-12-22 山东锋士信息技术有限公司 Himapari 8 land fire point detection algorithm based on background threshold
CN112113913B (en) * 2020-08-30 2021-07-09 山东锋士信息技术有限公司 Himapari 8 land fire point detection algorithm based on background threshold
CN112837489A (en) * 2021-01-07 2021-05-25 云南电网有限责任公司电力科学研究院 Floating threshold power transmission line forest fire monitoring method based on satellite and meteorological data
CN113361323A (en) * 2021-04-23 2021-09-07 云南电网有限责任公司输电分公司 Method and device for monitoring fire points near power grid of plateau area based on satellite technology
CN113340432A (en) * 2021-06-09 2021-09-03 广东电网有限责任公司 Fire monitoring method and system based on stationary meteorological satellite
CN113657275A (en) * 2021-08-16 2021-11-16 中国科学院空天信息创新研究院 Automatic detection method for forest and grass fire points
CN114034388A (en) * 2021-10-19 2022-02-11 吉林大学 Whole moon given place time brightness temperature drawing method
CN114034388B (en) * 2021-10-19 2024-01-26 吉林大学 Brightness and temperature drawing method for given place of full moon
CN114550006A (en) * 2022-02-23 2022-05-27 黑龙江省生态气象中心(东北卫星气象数据中心) Polar-orbit meteorological satellite fire point identification system, storage medium and equipment

Similar Documents

Publication Publication Date Title
CN111006771A (en) Method and device for judging and identifying fire point based on polar orbit meteorological satellite
Berberoglu et al. Assessing different remote sensing techniques to detect land use/cover changes in the eastern Mediterranean
CN109993237B (en) Water body rapid extraction method and system based on high-resolution satellite optical remote sensing data
KR101258668B1 (en) Korea local radar processing system
CN109509319B (en) Power transmission line forest fire monitoring and early warning method based on static satellite monitoring data
CN109974665B (en) Aerosol remote sensing inversion method and system for short-wave infrared data lack
CN102298150B (en) Global land cover broadband emissivity inversion method and system
Proud et al. Rapid response flood detection using the MSG geostationary satellite
CN107479065B (en) Forest gap three-dimensional structure measuring method based on laser radar
Tooke et al. Integrated irradiance modelling in the urban environment based on remotely sensed data
Nghiem et al. Observations of urban and suburban environments with global satellite scatterometer data
Moreno et al. Validation of daily global solar irradiation images from MSG over Spain
CN114022783A (en) Satellite image-based water and soil conservation ecological function remote sensing monitoring method and device
CN113553907A (en) Forest ecological environment condition evaluation method based on remote sensing technology
CN113340432B (en) Fire monitoring method and system based on stationary meteorological satellite
Feidas et al. Validation of an infrared-based satellite algorithm to estimate accumulated rainfall over the Mediterranean basin
CN115205709B (en) Forest fire point identification method based on satellite remote sensing
Zhong et al. Application of the Doppler weather radar in real-time quality control of hourly gauge precipitation in eastern China
Kuma et al. Ground-based lidar processing and simulator framework for comparing models and observations (ALCF 1.0)
Chaurasia et al. Detection of fog using temporally consistent algorithm with INSAT-3D imager data over India
Reuter et al. The CM-SAF and FUB cloud detection schemes for SEVIRI: Validation with synoptic data and initial comparison with MODIS and CALIPSO
CN115267941A (en) High-resolution true color visible light model generation and inversion method and system
Handwerger et al. Rapid landslide identification using synthetic aperture radar amplitude change detection on the Google Earth Engine
Mashi Monitoring Al-Hammar Marsh topography and climatic applied satellied MODIS Imagery
Cho et al. First Atmospheric Aerosol Monitoring Results from Geostationary Environment Monitoring Spectrometer (GEMS) over Asia

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200414