CN117689959A - Remote sensing classification method for fusing vegetation life cycle features - Google Patents
- Publication number: CN117689959A
- Application number: CN202410124198.8A
- Authority: CN (China)
- Prior art keywords: vegetation, period, time, pixel point, remote sensing
- Legal status: Granted
Classifications
- G06V10/764 — Image or video recognition or understanding using pattern recognition or machine learning, using classification, e.g. of video objects
- G06V10/774 — Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
- G06V10/806 — Fusion of extracted features (combining data from various sources at the sensor, preprocessing, feature extraction or classification level)
- G06V20/188 — Terrestrial scenes; vegetation
Abstract
The invention provides a remote sensing classification method fusing vegetation life cycle features, which comprises the following steps: generating a dense long time-series vegetation index data set for each pixel point and each vegetation type based on the remote sensing image of the region to be classified; forming a corresponding whole-course fitted curve for each pixel point and vegetation type; obtaining the derivative function curve of each pixel point and vegetation type; determining the time periods of the growth period, saturation period and aging period of each pixel point and vegetation type; acquiring a growth stage feature map for each pixel point; constructing a classification feature space for each pixel point and for the sample set of each vegetation type in the remote sensing image of the region to be classified; training a classifier; and classifying the vegetation types of all pixel points in the remote sensing image of the region to be classified with the trained classifier. The invention realizes classification of regional vegetation types driven by high-quality remote sensing data and vegetation life cycle knowledge.
Description
Technical Field
The invention belongs to the technical field of vegetation remote sensing rapid classification, and particularly relates to a remote sensing classification method for fusing vegetation life cycle features.
Background
Ground object and vegetation type extraction based on remote sensing data is of great significance for natural resource investigation and monitoring, ecological and environmental protection, municipal administration, highway engineering planning and other applications. Most land cover classification research focuses on classification methods for remote sensing images, and a series of machine learning models and methods based on spectral, texture, shape and other features have been proposed. Although research on remote sensing classification has made remarkable progress, research on the growth characteristics of the vegetation itself is insufficient.
Images and features acquired during different vegetation growth periods are of great significance for remote sensing classification, because the growth rate of vegetation determines how differently vegetation types appear in the imagery. According to growth rate, the vegetation growth cycle can be divided into three stages: the growth period, the saturation period and the aging period. In the growth period vegetation grows rapidly; in the saturation period the growth rate slows while development peaks; in the senescence period the growth rate approaches stagnation. Previous studies of vegetation growth have mainly involved two aspects: time-series reconstruction and life cycle node extraction. Time-series reconstruction mainly uses Savitzky-Golay filtering, double logistic functions, harmonic analysis and similar methods to reconstruct representative vegetation index data, while life cycle node parameter extraction mainly uses threshold, derivative and fitting methods. However, because these time series are fitted over the whole growth cycle at once, they cannot properly reflect the different growth rates of vegetation in different periods, which lowers classification accuracy; life cycle node parameter extraction methods also remain underdeveloped.
Disclosure of Invention
The invention aims to overcome the above defects in the background art, and provides a remote sensing classification method fusing vegetation life cycle features, which can classify regional vegetation types driven by high-quality remote sensing data and vegetation life cycle knowledge.
The technical scheme adopted by the invention is as follows: a remote sensing classification method fusing vegetation life cycle features, comprising the following steps:
forming a dense long-time sequence vegetation index data set of each pixel point based on vegetation indexes of each pixel point in the remote sensing image of the region to be classified at different times;
averaging vegetation indexes of each vegetation type in different time according to a sample set of each vegetation type in a remote sensing image of an area to be classified to form a dense long-time sequence vegetation index data set of each vegetation type;
performing curve fitting on the dense long-time sequence vegetation index data set of each pixel point and vegetation type to form a corresponding whole-course fitting curve of each pixel point and vegetation type;
respectively performing differential operation on each pixel point and the whole-course fitting curve of each vegetation type to obtain a corresponding derivative function curve;
determining the time periods of the growth period, saturation period and aging period of each pixel point and vegetation type according to the time nodes determined by the maximum value, the minimum value and the set threshold ratio of the derivative function curve of each pixel point and vegetation type;
acquiring a growth stage characteristic diagram of each pixel point according to the maximum value, the minimum value and the set ratio of the dense long-time sequence vegetation index data set of each pixel point and the vegetation type; the growth stage characteristic diagram is used for representing vegetation at the growth starting time, the saturation time and the aging time;
constructing a classification feature space of each pixel point and a sample set of each vegetation type in a remote sensing image of the region to be classified; the classification characteristic space comprises spectral characteristics, texture characteristics and polarization characteristics corresponding to the growing period, the saturation period and the aging period, and a growth stage characteristic diagram;
training a classifier by using sample sets of different vegetation types of the region to be classified and classification feature spaces of the sample sets;
and classifying vegetation types in the remote sensing image of the region to be classified by utilizing the classifier which is completed by training based on the classification feature space of all the pixel points of the remote sensing image of the region to be classified.
In this technical scheme, remote sensing images of the region to be classified are obtained from remote sensing satellite data of different sources; band transformation and vegetation index transformation are performed on each time point of the multi-source remote sensing images, and the mean vegetation index of each time point after transformation is calculated and used to form the dense long time-series vegetation index data set of each pixel point and vegetation type.
In the technical scheme, the sample set of each vegetation type is divided into a training set and a verification set; forming classification feature spaces of different vegetation types by adopting a training set so as to train a classifier; and verifying the trained classifier by using the verification set to generate vegetation type prediction precision.
In the above technical solution, the fitting process of a dense long time-series vegetation index data set includes: dividing any dense long time-series vegetation index data set into growth period and aging period subsets, fitting each subset independently to obtain growth period and aging period fitted curves, and connecting these curves to form the corresponding whole-course fitted curve of each pixel point and vegetation type.
In the technical scheme, aiming at the dense long-time sequence vegetation index data set of each pixel point and vegetation type, the maximum value and the minimum value of the dense long-time sequence vegetation index data set are found; the time period before the maximum value corresponds to the time point is the growth period, the time period after the maximum value corresponds to the time point is the aging period, the time period before the minimum value corresponds to the time point is the aging period, and the time period after the minimum value corresponds to the time point is the growth period.
In the technical scheme, after the remote sensing image of the area to be classified is subjected to atmosphere correction, terrain correction and data cloud removal, a dense long-time sequence vegetation index data set of each pixel point and vegetation type is regenerated.
In the technical scheme, the intensive long-time sequence vegetation index data set of each pixel point and vegetation type is subjected to smooth denoising and then curve fitting.
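The smoothing step can be sketched with a simple centered moving-average window; the patent also allows filters such as Savitzky-Golay, and the window size and the synthetic NDVI-like series below are illustrative assumptions:

```python
import numpy as np

def smooth_series(v, window=7):
    """Centered moving-average smoothing of a dense vegetation-index
    time series; edges are handled by reflecting the series."""
    if window % 2 == 0:
        raise ValueError("window must be odd for a centered average")
    pad = window // 2
    padded = np.pad(np.asarray(v, dtype=float), pad, mode="reflect")
    kernel = np.ones(window) / window
    return np.convolve(padded, kernel, mode="valid")

# synthetic single-season NDVI-like series with additive noise
t = np.linspace(0.0, 1.0, 73)                      # ~5-day composites
clean = 0.2 + 0.6 * np.exp(-((t - 0.5) ** 2) / 0.02)
noisy = clean + np.random.default_rng(0).normal(0.0, 0.05, t.size)
smoothed = smooth_series(noisy, window=7)
```

Any low-pass filter with a modest window plays the same role here: suppress observation noise before the curve fitting, without displacing the seasonal peak.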
In the above technical solution, the process of constructing the classification feature space of the sample set of each vegetation type includes: aiming at any sample point in the sample set of any vegetation type, based on the remote sensing image of the sample point in the time period of the vegetation type growing period, the saturation period and the aging period, obtaining the corresponding spectral characteristics, texture characteristics and polarization characteristics of the sample point growing period, the saturation period and the aging period; and using the growth stage characteristic diagram of the pixel point corresponding to the sample point as the growth stage characteristic diagram of the sample point.
In the above technical solution, the process of determining the time periods of the growth period, saturation period and senescence period of a single pixel point and of each vegetation type includes: obtaining the minimum value P_min and the maximum value P_max of the corresponding derivative function curve, calculating the amplitude D, and then calculating the thresholds T1 and T2 from the set threshold ratio R:
D = P_max - P_min;
T1 = P_max - R·D;
T2 = P_min + R·D;
the abscissa of the derivative function curve represents time, and the ordinate represents slope;
the time period between the time points where the horizontal line at threshold T1 intersects the derivative curve is taken as the growth period;
the time period between the time points where the horizontal line at threshold T2 intersects the derivative curve is taken as the aging period;
and the time period between the time points corresponding to the maximum value and the minimum value of the derivative function curve is the saturation period.
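The threshold scheme above (amplitude D, thresholds T1 and T2, and the three periods) can be sketched as follows, assuming a discretely sampled derivative curve; the ratio R = 0.2 and the single-season synthetic curve are illustrative assumptions:

```python
import numpy as np

def growth_periods(deriv, R=0.2):
    """Split a derivative curve into growth / saturation / aging periods.

    D = P_max - P_min, T1 = P_max - R*D, T2 = P_min + R*D.  The growth
    period is where the curve stays above the T1 horizontal line, the
    aging period where it stays below T2, and the saturation period lies
    between the slope maximum and the slope minimum.
    """
    deriv = np.asarray(deriv, dtype=float)
    p_min, p_max = deriv.min(), deriv.max()
    D = p_max - p_min
    T1, T2 = p_max - R * D, p_min + R * D
    growth_idx = np.where(deriv >= T1)[0]
    aging_idx = np.where(deriv <= T2)[0]
    k_max, k_min = int(deriv.argmax()), int(deriv.argmin())
    saturation = (min(k_max, k_min), max(k_max, k_min))
    return ((int(growth_idx[0]), int(growth_idx[-1])),
            saturation,
            (int(aging_idx[0]), int(aging_idx[-1])))

# derivative of a synthetic single-season curve: rise, then decline
t = np.linspace(0.0, 1.0, 365)
ndvi = 0.2 + 0.6 * np.exp(-((t - 0.5) ** 2) / 0.02)
deriv = np.gradient(ndvi, t)
growth, saturation, aging = growth_periods(deriv)
```

The indices returned are positions in the daily time axis, so each pair maps directly to a start/end time node of a period.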
In the above technical solution, the growth stage feature map U of each pixel point is calculated by the following formula:
U = (V_max - V_min)·P + V_min
where V_min and V_max are respectively the minimum and maximum values of the dense long time-series vegetation index data set corresponding to each pixel point, and P is the set amplitude proportion, so that U lies the fraction P of the way between the minimum and the maximum.
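A minimal sketch of the growth stage feature map computation. Note that as printed in the original text the formula reads U = (V_min - V_max)·P + V_min, which would fall below the series minimum for P > 0; the sketch assumes the intended level is the fraction P of the way from minimum to maximum, i.e. U = (V_max - V_min)·P + V_min, and the stack dimensions and P value are illustrative:

```python
import numpy as np

def stage_threshold(series, P, axis=0):
    """Vegetation-index level lying the fraction P of the way from the
    per-pixel series minimum to its maximum (assumed reading of the
    growth stage feature map formula)."""
    series = np.asarray(series, dtype=float)
    v_min = series.min(axis=axis)
    v_max = series.max(axis=axis)
    return (v_max - v_min) * P + v_min

# per-pixel feature map over a (time, height, width) index stack
stack = np.random.default_rng(0).uniform(0.0, 1.0, (24, 4, 4))
U = stage_threshold(stack, P=0.2)
```

Applied per pixel, U gives the index level whose crossing times mark the growth start, saturation and aging moments of that pixel.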
The invention also provides a remote sensing classification system for fusing the vegetation life cycle characteristics, which is used for realizing the remote sensing classification method for fusing the vegetation life cycle characteristics.
The invention also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the remote sensing classification method for fusing vegetation life cycle features according to the above technical scheme.
The remote sensing classification method, system and storage medium fusing vegetation life cycle features provided by the invention have the following beneficial effects. The invention constructs a knowledge-driven method for extracting life cycle features from remote sensing images, which gives full play to vegetation life cycle features in vegetation classification and is especially advantageous for extracting ground features in areas of flourishing vegetation. By averaging over the training set, dense long time-series vegetation index data sets of the various vegetation types are obtained; this captures the life cycle changes of different vegetation, effectively represents the different growth patterns across the vegetation life cycle, and saves calculation cost. By constructing a classification feature space for each pixel point and each vegetation type, the corresponding vegetation type is effectively identified from the life cycle features of each pixel point, and the identification accuracy is effectively improved.
Furthermore, the cross-sensor conversion model harmonizes the multi-source data, which effectively improves the quality of the remote sensing images, reduces the image gaps caused by temporal coverage holes, and improves vegetation type extraction accuracy. Based on the harmonized bands and vegetation indices, the dense long time-series vegetation index data set of each pixel point and of the training set are formed by averaging, which reduces the influence of data outliers on the whole data set.
Furthermore, the invention adopts a time window and filtering mode to process the time sequence curve, which is beneficial to reducing the influence of noise data on real data.
Furthermore, the invention adopts piecewise fitting: the dense long time-series vegetation index data set of each pixel point and of the training set is segmented, each subset is fitted separately, and the overall fitted curve is then obtained by spline interpolation. This approach better matches the staged character of the vegetation life cycle.
Further, the classification feature space of the sample set of each vegetation type constructed by the invention can fully reflect the characteristics of each vegetation type, effectively ensure the sample diversity of the training set and improve the training precision of the trainer.
Furthermore, the method combines a threshold algorithm to extract the start time, end time and time nodes of the different growth stages of different vegetation. This quickly locates the extreme (peak) time points of growth, saturation and aging, and the classification feature maps of each time period can then be rapidly acquired around these time points, improving calculation efficiency while maintaining recognition accuracy.
Furthermore, the invention acquires the recognition accuracy of the classifier by verifying the output result of the classifier after the sample set is input and trained, and provides background data support for the use of the subsequent classifier.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is schematic diagram a of the division of the growth period and the aging period in an exemplary embodiment;
FIG. 3 is schematic diagram b of the division of the growth period and the aging period in an exemplary embodiment;
FIG. 4 is a schematic diagram of multi-source data cooperation and time window denoising according to an embodiment;
FIG. 5 is a schematic diagram of extracting different life cycle characteristics according to an embodiment.
Detailed Description
The invention will now be described in further detail with reference to the drawings and specific examples, which are given for clarity of understanding and are not to be construed as limiting the invention.
The invention provides a remote sensing classification method fusing vegetation life cycle features, comprising the following steps:
forming a dense long-time sequence vegetation index data set of each pixel point based on vegetation indexes of each pixel point in the remote sensing image of the region to be classified at different times;
averaging vegetation indexes of each vegetation type in different time according to a sample set of each vegetation type in a remote sensing image of an area to be classified to form a dense long-time sequence vegetation index data set of each vegetation type;
performing curve fitting on the dense long-time sequence vegetation index data set of each pixel point and vegetation type to form a corresponding whole-course fitting curve of each pixel point and vegetation type;
respectively performing differential operation on each pixel point and the whole-course fitting curve of each vegetation type to obtain a corresponding derivative function curve;
determining the time periods of the growth period, saturation period and aging period of each pixel point and vegetation type according to the time nodes determined by the maximum value, the minimum value and the set threshold ratio of the derivative function curve of each pixel point and vegetation type;
acquiring a growth stage characteristic diagram of each pixel point according to the maximum value, the minimum value and the set ratio of the dense long-time sequence vegetation index data set of each pixel point and the vegetation type; the growth stage characteristic diagram is used for representing vegetation at the growth starting time, the saturation time and the aging time;
constructing a classification feature space of each pixel point and a sample set of each vegetation type in a remote sensing image of the region to be classified; the classification characteristic space comprises spectral characteristics, texture characteristics and polarization characteristics corresponding to the growing period, the saturation period and the aging period, and a growth stage characteristic diagram;
training a classifier by using sample sets of different vegetation types of the region to be classified and classification feature spaces of the sample sets;
and classifying vegetation types in the remote sensing image of the region to be classified by utilizing the classifier which is completed by training based on the classification feature space of all the pixel points of the remote sensing image of the region to be classified.
The principles of the present invention are further described below in connection with specific embodiments.
The embodiment provides a remote sensing classification method for fusing vegetation life cycle features, as shown in fig. 1, comprising the following steps:
firstly, preparing and processing remote sensing data:
For the region to be classified, the corresponding Sentinel-2 and Landsat-8 remote sensing satellite data of the year to be classified are obtained and preprocessed. Preprocessing mainly comprises radiometric calibration, atmospheric correction, terrain correction, cloud (and snow) removal and mosaicking/clipping; it also includes resampling between the different spatial resolutions.
Atmospheric and terrain corrections are used to eliminate atmospheric and terrain disturbances. The bitmask band 'QA60', which carries cloud mask information in the data source, is used for cloud removal; MSK_CLDPRB and MSK_SNWPRB are bands reflecting cloud probability and snow probability and can be used to remove cloud and snow.
Because S2 and L8 have different spatial resolutions, the S2 ground resolution is 10 m, and the spatial resolution of the wave band used by L8 is 30 m, and in order to ensure the consistency of the spatial resolution, the nearest neighbor interpolation method is adopted to resample the L8 original wave band to the spatial resolution of 10 m.
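For the exact 3:1 ratio between the 30 m and 10 m grids, nearest-neighbour resampling reduces to replicating each source pixel into a 3×3 block; a minimal numpy sketch (a real workflow would resample against the actual geotransforms of the two products):

```python
import numpy as np

def nearest_upsample(band, factor=3):
    """Nearest-neighbour resampling by an integer factor: each 30 m pixel
    is replicated into a factor x factor block of 10 m pixels."""
    return np.repeat(np.repeat(band, factor, axis=0), factor, axis=1)

band30 = np.arange(4, dtype=float).reshape(2, 2)   # toy 2x2 "30 m" band
band10 = nearest_upsample(band30)                  # 6x6 "10 m" band
```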
Some pixel points of the obtained remote sensing image of the region to be classified form the sample sets of the different vegetation types. The vegetation types of the sample set have been established by field investigation and similar means; the purpose of this embodiment is to predict the vegetation types over the whole remote sensing image of the region to be classified. The sample set is a set of position information of pixel points whose vegetation types are already known.
In this particular embodiment, the vegetation types of the areas to be classified include cultivated land, grassland, wetland, and woodland. Wherein the sample set covers the vegetation types described above.
The specific embodiment divides a sample set of each vegetation type into a training set and a verification set; forming classification feature spaces of different vegetation types by adopting a training set so as to train a classifier; and verifying the trained classifier by using the verification set to generate vegetation type prediction precision.
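A hedged sketch of the split-train-verify loop above. The patent does not name a specific classifier, so a nearest-centroid stand-in over a synthetic classification feature space is used here purely for illustration; the class count, feature count and sample sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic classification feature space: 4 vegetation types (cultivated
# land, grassland, wetland, woodland), 12 features per sample, 50 samples
# per type -- all values are illustrative, not real imagery statistics
n_types, n_feat, n_per = 4, 12, 50
X = np.vstack([rng.normal(loc=k, scale=0.8, size=(n_per, n_feat))
               for k in range(n_types)])
y = np.repeat(np.arange(n_types), n_per)

# split the sample set into a training set and a verification set
order = rng.permutation(len(y))
cut = int(0.7 * len(y))
tr, va = order[:cut], order[cut:]

# nearest-centroid stand-in for the (unspecified) classifier
centroids = np.stack([X[tr][y[tr] == k].mean(axis=0)
                      for k in range(n_types)])
dists = np.linalg.norm(X[va][:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = float((pred == y[va]).mean())
```

The verification-set accuracy computed at the end is the "vegetation type prediction precision" that the scheme reports.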
Secondly, the multisource data cooperatively generates a dense long-time sequence set:
and forming a dense long-time sequence vegetation index data set aiming at each pixel point in the remote sensing image.
The sample set in the remote sensing image is grouped by the vegetation type to which it belongs, and the vegetation indices of the training set of each vegetation type are averaged to form the dense long time-series vegetation index data set of that type. In this embodiment there are 4 vegetation training sets forming dense long time-series vegetation indices, corresponding to cultivated land, grassland, wetland and woodland.
On the basis of the preprocessed multi-source remote sensing data, a dense long time-series vegetation index data set (e.g. NDVI, EVI, SAVI) is generated for each pixel point and vegetation type by combining the two data sources. This data set is used to construct the growth stage feature map and to study the detailed change of the vegetation indices, which facilitates generating data better suited to life cycle parameter extraction. Combining the two data sources also compensates, to a certain extent, for gaps in a single data source and strengthens the time-series representation, as shown in the multi-source data cooperation part of fig. 1.
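The vegetation indices mentioned (NDVI, EVI, SAVI) follow their standard definitions over surface reflectance bands; a minimal sketch, in which the reflectance values and the SAVI soil adjustment factor L = 0.5 are conventional assumptions rather than values from the patent:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    """Enhanced Vegetation Index (standard coefficients)."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index with soil factor L."""
    return (1.0 + L) * (nir - red) / (nir + red + L)

# illustrative surface-reflectance values for a vegetated pixel
nir, red, blue = 0.5, 0.1, 0.05
indices = {"NDVI": ndvi(nir, red),
           "EVI": evi(nir, red, blue),
           "SAVI": savi(nir, red)}
```

The same functions apply unchanged to whole band arrays, yielding one index image per time point of the dense series.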
Based on a cross-sensor conversion model, ordinary least squares linear regression is used to solve the conversion parameters between the paired data sets, so as to harmonize the data; the calculation formula is as follows:
W’=S·W+I (1)
wherein W' refers to the band or vegetation index after transformation, W refers to the band or vegetation index before transformation, S refers to the calculated parameter slope, and I is the calculated parameter intercept. The wave band transformation and vegetation index transformation are calculated by adopting a formula (1).
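Formula (1) amounts to fitting S and I by ordinary least squares over paired observations of the same quantity from the two sensors; a sketch with synthetic paired data, in which the "true" slope and intercept are assumed purely for the demonstration:

```python
import numpy as np

# synthetic paired observations of the same index from the two sensors;
# the relation W' = 0.95*W + 0.02 is an assumed ground truth
rng = np.random.default_rng(1)
W = rng.uniform(0.0, 0.9, 500)                      # e.g. Landsat-8 index
W_true = 0.95 * W + 0.02
W_prime = W_true + rng.normal(0.0, 0.01, W.size)    # e.g. Sentinel-2 index

# ordinary least squares estimate of slope S and intercept I (formula (1))
S, I = np.polyfit(W, W_prime, deg=1)
W_harmonized = S * W + I
```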
Cross-sensor here refers to remote sensing image acquisition equipment from different sources. In order to sample all possible spectral values of the entire region to be classified, a simple random point sampling method is used to randomly generate a set of points based on the sample set in the region to be classified, and this point set covers the various vegetation coverage types.
This point set is further divided into training data, used to obtain the transformation parameters S and I between the sensors, and verification data, used to verify the accuracy of the transformation coefficients. The transformation can be evaluated qualitatively by examining scatter plots of the different sensor combinations, and quantitatively by statistics (coefficient of determination, R²; mean relative error, MRE; root mean square error, RMSE; etc.).
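The quantitative evaluation statistics named above (R², MRE, RMSE) can be written directly from their definitions; a minimal sketch:

```python
import numpy as np

def r2(obs, pred):
    """Coefficient of determination."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    ss_res = np.sum((obs - pred) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mre(obs, pred):
    """Mean relative error."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.mean(np.abs(pred - obs) / np.abs(obs)))

def rmse(obs, pred):
    """Root mean square error."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))
```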
To further improve the stability and reliability of the data, for each time point, the vegetation index calculated from the transformed bands and the vegetation index transformed after being calculated from the untransformed bands are averaged, and the mean is taken as the dense long time-series vegetation index data set of each pixel point and vegetation type. The specific expression is as follows:
M'_t1 = (M_t1 + N_t1)/2 (2)
where t is the number of data points in the dense long time-series vegetation index data set of any pixel point or vegetation type; M'_t1 is the t×1 vegetation index matrix obtained after the comprehensive calculation; M_t1 is the t×1 vegetation index matrix calculated after band transformation; and N_t1 is the t×1 vegetation index matrix obtained by first calculating the vegetation index from the untransformed bands and then transforming it.
The subsequent data processing steps are executed separately for each dense long-time-series vegetation index data set, using the same method in each case.
Third step, piecewise fitting of the dense long-time-series set:
On the basis of the dense long-time-series vegetation index data set obtained in the second step, and taking into account the periodicity of vegetation growth, the growth cycle is divided into a growth period and an aging period, which are fitted separately. As shown in fig. 2, taking the dense long-time-series vegetation index data set of single-season vegetation as an example, the data set is divided into two subsets at the time point of the maximum vegetation index: the data before that time point form the growth-period data set, and the data after it form the aging-period data set.
As shown in fig. 3, if there are multiple seasons of vegetation, it can be divided into more subsets.
M = {M_ik, M_kj} (3)
wherein M is the entire dense long-time-series set; i and j are the positions of the start time and end time of the set; k is the position of the maximum (or minimum) value of the data; M_ik and M_kj are subsets of M.
The period before the time point of a maximum, and the period after the time point of a minimum, belong to the growth period; the period after the time point of a maximum, and the period before the time point of a minimum, belong to the aging period.
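The splitting rule of formula (3) can be sketched as follows for a toy single-season series (values are illustrative, not from the patent):

```python
import numpy as np

# Toy single-season dense vegetation index series.
M = np.array([0.21, 0.30, 0.45, 0.62, 0.71, 0.66, 0.50, 0.34, 0.22])

k = int(np.argmax(M))     # position k of the maximum value
M_ik = M[:k + 1]          # growth-period subset (start i .. maximum k)
M_kj = M[k:]              # aging-period subset (maximum k .. end j)
```

For multi-season vegetation the same split would be repeated at each local maximum and minimum, producing more subsets.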
Dense data may contain noise. To reduce its influence on the extracted vegetation life cycle, the whole data set is denoised by time-window mean synthesis, as shown in the denoising part of fig. 4. Assuming the window size is w (usually odd), the smoothed data point y(i) at time point i is calculated as:
y(i) = (1/w) · Σ_{j=-(w-1)/2}^{(w-1)/2} x(i+j) (4)
wherein x(i+j) is a data point within the window; i and w are the time point position and the time window size, respectively.
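A sketch of the time-window mean synthesis under stated assumptions (odd window w; edge values are repeated at the boundaries, which is an assumption, since the patent does not specify boundary handling):

```python
import numpy as np

def window_mean(x, w):
    """Centred moving-average denoising (formula (4)); w must be odd."""
    assert w % 2 == 1
    half = w // 2
    padded = np.pad(x, half, mode="edge")   # repeat edge values (assumption)
    return np.convolve(padded, np.ones(w) / w, mode="valid")

x = np.array([0.2, 0.8, 0.3, 0.9, 0.4])
y = window_mean(x, 3)
```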
Smoothing and denoising yield high-quality data points, and the piecewise idea is then applied to the data sets to fit the growth period and the aging period separately:
F = {F_i, F_j} (5)
wherein F_i and F_j are the fitted curves of the growth period and the aging period, respectively.
For the fitting algorithm, common choices include S-G filtering, cubic convolution interpolation, and harmonic analysis. The S-G filtering algorithm is based on polynomial fitting: a polynomial is obtained by least-squares fitting of the data points within a given time window, and the value at the centre point is then estimated from that polynomial, smoothing and denoising the time series. Cubic convolution interpolation is a polynomial interpolation method that approximates a set of known data points with cubic polynomials to obtain a continuous curve. In harmonic analysis, the time series is decomposed into a linear combination of sine and cosine functions, each corresponding to a periodic variation at a different frequency; these harmonic components are then weighted to fit the original series.
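The S-G idea, fitting a polynomial by least squares inside a window and evaluating it at the centre point, can be sketched directly (window size, polynomial order and the data are illustrative assumptions):

```python
import numpy as np

def sg_point(x, i, w=5, order=2):
    """Smoothed value at index i: fit a polynomial of the given order to
    the w points centred on i, then evaluate it at i (S-G filtering idea)."""
    half = w // 2
    lo, hi = max(0, i - half), min(len(x), i + half + 1)
    t = np.arange(lo, hi)
    coeffs = np.polyfit(t, x[lo:hi], order)
    return np.polyval(coeffs, i)

x = np.array([0.2, 0.25, 0.6, 0.35, 0.4, 0.45, 0.5])
smoothed = np.array([sg_point(x, i) for i in range(len(x))])
```

In practice a library routine such as `scipy.signal.savgol_filter` implements the same idea with proper boundary handling.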
In this embodiment, spline interpolation is used to connect the fitted curves, generally at the transition from the growth period to the aging period; the connection combines the end points of the two fitted curves with the original data points (control points). The cubic spline function S_i(x) is calculated as:
S_i(x) = a_i + b_i(x − x_i) + c_i(x − x_i)² + d_i(x − x_i)³ (6)
wherein a_i, b_i, c_i and d_i are the parameters of the spline. The values of F_i and F_j in formula (5) at a certain distance from the corresponding end point and start point are taken as the two end points (a, b) of the cubic spline interpolation; the interval is divided into several subintervals, and the original data points are used as control points to adjust and optimise the curve until it fits the original data. Common evaluation indexes include visual evaluation (Visual evaluation), residual analysis (Residual Analysis), the coefficient of determination (R-squared), and the mean squared error (Mean Squared Error).
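One way to realise the connection can be sketched as a single cubic segment of the form of formula (6), whose coefficients are chosen to match the value and slope of the two fitted curves at their end points (a cubic Hermite segment; the end values and slopes below are illustrative assumptions):

```python
import numpy as np

def cubic_connect(xa, ya, da, xb, yb, db, n=20):
    """Cubic segment S(x) = a + b(x-xa) + c(x-xa)^2 + d(x-xa)^3 joining
    (xa, ya) with slope da to (xb, yb) with slope db."""
    h = xb - xa
    a, b = ya, da
    c = 3 * (yb - ya) / h**2 - (2 * da + db) / h
    d = (da + db) / h**2 - 2 * (yb - ya) / h**3
    x = np.linspace(xa, xb, n)
    t = x - xa
    return x, a + b * t + c * t**2 + d * t**3

# End point of the growth-period fit and start point of the aging-period
# fit (illustrative), with zero slopes for a smooth transition.
x, s = cubic_connect(0.0, 1.0, 0.0, 1.0, 2.0, 0.0)
```

Control points would then be used to refine this segment against the original data, as described above.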
Fourth, constructing a growth stage characteristic diagram by a threshold method:
The key to life cycle feature construction is the extraction of vegetation life cycle nodes. A dynamic threshold method is applied to the curves reconstructed from the dense long-time-series data sets to obtain the growth stage feature map of each pixel point and of the different vegetation types; a curve-derivative threshold method is used to extract the time periods of the growth period, the saturation period and the aging period of the different vegetation types at each pixel point, from which the vegetation difference features within these periods are obtained.
The dynamic threshold method is an algorithm that adjusts the threshold adaptively: the time at which the curve reaches a certain proportion of its amplitude is taken as a life cycle node. This removes, to a certain extent, differences due to soil background and vegetation type, and is suitable for extracting the growth start and end times of different vegetation.
The growth phase feature map U is calculated using the following formula:
U = (V_max − V_min)·P + V_min (7)
wherein V_min and V_max are respectively the minimum and maximum of the dense long-time-series vegetation index data set, and P is the set amplitude proportion.
In the above equation, U is a dynamic value. In this embodiment it depends on the annual amplitude of the time series: V_min and V_max are the minimum and maximum annual values of the dense long-time-series vegetation index, and P is a proportion of the amplitude. In general, P = 0.5 is used for the mid-greenup and mid-decline phases of the growing season; a threshold indicator estimated from the amplitude is less affected by deviations caused by time series discontinuities.
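Under these definitions, computing U and locating the crossing points can be sketched as follows (toy series; the amplitude term is written here as (V_max − V_min) so that the threshold U lies between the minimum and maximum):

```python
import numpy as np

vi = np.array([0.18, 0.25, 0.40, 0.62, 0.74, 0.70, 0.52, 0.33, 0.20])
v_min, v_max = vi.min(), vi.max()
P = 0.5                                  # amplitude proportion
U = (v_max - v_min) * P + v_min          # dynamic threshold (formula (7))

# Time points where the series crosses U mark growth start / end nodes.
above = vi >= U
```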
The curve-derivative threshold method is a fixed-threshold approach: the fitted curve from the third step is differentiated to obtain its derivative curve. Life cycle time nodes are extracted from the derivative curve, and the time division of the different vegetation life cycle stages is then extracted by maximum-value synthesis; the specific calculation is described with reference to fig. 5.
First, the fitted curve is differentiated; the derivative expresses the vegetation growth or aging rate at each moment:
F'(x) = lim_{Δx→0} [F(x + Δx) − F(x)] / Δx (8)
wherein Δx is a very small increment, representing the change in x; F'(x) is the slope, i.e. the derivative, of the fitted curve F(x) at point x.
Next, the minimum value P_min and maximum value P_max of the derivative curve are obtained and its amplitude is calculated as D; from the set threshold ratio R, the two thresholds T1 and T2 are calculated as:
D = P_max − P_min;
T1 = P_max − R·D;
T2 = P_min + R·D; (9)
the abscissa of the derivative function curve represents time, and the ordinate represents slope;
the time period formed by the time points at which the horizontal line drawn at threshold T1 intersects the derivative curve is taken as the growth period,
the time period formed by the time points at which the horizontal line drawn at threshold T2 intersects the derivative curve is taken as the aging period,
and the time period formed by the time points of the minimum and maximum of the derivative function curve is the saturation period.
The regions of the derivative curve above the T1 line and below the T2 line correspond to the periods of faster growth and faster aging, respectively. The life cycle time points are projected onto the fitted curve, as shown in fig. 5: the growth period corresponds to the section (S1-S2), the aging period to the section (S5-S6), and the saturation period to the middle section (S3-S4).
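The derivative-threshold computation of formula (9) can be sketched end to end (the stand-in fitted curve, its sampling and the ratio R = 0.2 are illustrative assumptions):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 200)
f = np.exp(-((t - 0.35) / 0.15) ** 2)   # stand-in for a fitted VI curve
deriv = np.gradient(f, t)               # slope of the fitted curve

p_min, p_max = deriv.min(), deriv.max()
D = p_max - p_min                       # amplitude of the derivative curve
R = 0.2                                 # threshold ratio (assumed)
T1 = p_max - R * D                      # fast-growth threshold
T2 = p_min + R * D                      # fast-aging threshold

growth = deriv >= T1                    # growth period (above the T1 line)
aging = deriv <= T2                     # aging period (below the T2 line)
```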
Fifth, vegetation classification based on life cycle features:
The life cycle feature periods obtained above comprise, for each pixel point, a growth period, a saturation period and an aging period; different vegetation types differ in their development state and remote sensing image expression across these three periods.
In addition to the vegetation growth stage map extracted by the dynamic threshold method in the fourth step, a multi-temporal classification feature set is calculated for the three periods; together they form the classification feature space.
Constructing classification feature spaces of different vegetation types and pixel points in a remote sensing image of a region to be classified; the classification feature space comprises spectral features, texture features and polarization features corresponding to the growing period, the saturation period and the aging period, and a growth stage feature map.
Specifically, the process of constructing the classification feature space of the sample set of vegetation types includes: and acquiring time periods of vegetation type growing period, saturation period and aging period of any sample point in the sample set of any vegetation type. According to remote sensing images of the sample points in each time period of the vegetation type growing period, the saturation period and the aging period, calculating to obtain corresponding spectral characteristics, texture characteristics and polarization characteristics of the sample points in the growing period, the saturation period and the aging period; and using the growth stage characteristic diagram of the pixel point corresponding to the sample point as the growth stage characteristic diagram of the sample point. A classification feature space for each sample set of vegetation types is formed based on the classification feature space for each sample point.
Specifically, the growth stage feature map corresponding to the sample set of each vegetation type directly adopts the growth stage feature map of the pixel point represented by each sample point, and repeated calculation is not needed. The dense long-time sequence vegetation index data set of each vegetation type in this embodiment is mainly used for distinguishing the time periods of the growing period, the saturation period and the aging period of each vegetation type, so as to calculate and obtain typical classification features (spectrum, texture and the like) corresponding to the images of each vegetation type in different life cycle time periods.
The process of constructing the classification feature space of the single pixel point comprises the following steps: and aiming at any pixel point, based on the corresponding remote sensing images of the growing period, the saturation period and the aging period of the pixel point, obtaining the corresponding spectral characteristics, texture characteristics and polarization characteristics of the growing period, the saturation period and the aging period of the pixel point, and generating a classification characteristic space of the pixel point by combining the growth stage characteristic diagram of the pixel point.
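A sketch of assembling one pixel's classification feature space; the feature names, values and dimensions below are assumptions for illustration, not the patent's actual feature definitions:

```python
import numpy as np

periods = ("growth", "saturation", "aging")

# Hypothetical per-period features of one pixel (illustrative values).
spectral = {"growth": [0.32, 0.51], "saturation": [0.45, 0.68],
            "aging": [0.28, 0.40]}
texture = {"growth": [0.10], "saturation": [0.14], "aging": [0.09]}
polarization = {"growth": [-7.2], "saturation": [-6.1], "aging": [-8.0]}
stage_feature = [0.46]   # growth stage feature map value of this pixel

# Concatenate the per-period features plus the stage feature into one vector.
feature_space = np.concatenate(
    [np.asarray(spectral[p] + texture[p] + polarization[p])
     for p in periods] + [np.asarray(stage_feature)])
```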
And training the classifier by using training sets and classification feature spaces of different vegetation of the region to be classified.
Specifically, the training samples input to the classifier include position information of sample points and corresponding classification feature spaces, and vegetation types of the sample points are used as training labels.
And predicting vegetation types of the remote sensing images of the areas to be classified by using the trained classifier to obtain vegetation types of cultivated lands, grasslands, wetlands and forest lands represented by all pixel points of the remote sensing images of the areas to be classified.
Specifically, the position information and the classification feature space of each pixel point are input into a classifier to obtain vegetation type classification of each pixel point, and the vegetation type classification is used as a vegetation classification thematic map of the remote sensing image of the region to be classified.
The classifier model may adopt, but is not limited to, machine learning models such as RF and SVM and deep learning frameworks such as DNN and CNN.
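A sketch of the training step with a random forest (one of the models named above), on synthetic, well-separated feature vectors; the data, class names and dimensions are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic classification feature spaces for four vegetation types.
classes = ["cropland", "grassland", "wetland", "forest"]
X, y = [], []
for c, name in enumerate(classes):
    X.append(rng.normal(loc=float(c), scale=0.3, size=(20, 10)))
    y += [name] * 20
X = np.vstack(X)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)                        # vegetation types as training labels
pred = clf.predict(X)                # per-pixel vegetation type prediction
acc = float(np.mean(pred == np.array(y)))
```

In the method itself, each row of X would be a pixel's or sample point's classification feature space, and prediction would run over all pixels of the image to produce the vegetation thematic map.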
Sixth step, based on the position information of the verification set pixel points in the remote sensing image of the region to be classified and the classifier output, the accuracy index of the classifier is generated as supplementary data for the classifier output, for the user's reference.
Specifically, the vegetation types of all sample points in the verification set are known; comparing them with the classifier output yields the classifier accuracy, which serves as the accuracy index.
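The accuracy index itself reduces to comparing the known verification labels with the predictions (toy labels, illustrative only):

```python
import numpy as np

truth = np.array(["cropland", "forest", "wetland", "grassland", "forest"])
pred = np.array(["cropland", "forest", "wetland", "forest", "forest"])

# Overall accuracy: fraction of verification points classified correctly.
accuracy = float(np.mean(truth == pred))
```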
The invention also provides a remote sensing classification system for fusing the vegetation life cycle characteristics, which is used for realizing the remote sensing classification method for fusing the vegetation life cycle characteristics.
The invention also provides a computer readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the remote sensing classification method for fusing vegetation life cycle features according to the above technical scheme.
What is not described in detail in this specification is prior art known to those skilled in the art.
Claims (10)
1. A remote sensing classification method for fusing vegetation life cycle features, characterized in that it comprises the following steps:
forming a dense long-time sequence vegetation index data set of each pixel point based on vegetation indexes of each pixel point in the remote sensing image of the region to be classified at different times;
averaging vegetation indexes of each vegetation type in different time according to a sample set of each vegetation type in a remote sensing image of an area to be classified to form a dense long-time sequence vegetation index data set of each vegetation type;
performing curve fitting on the dense long-time sequence vegetation index data set of each pixel point and vegetation type to form a corresponding whole-course fitting curve of each pixel point and vegetation type;
respectively performing differential operation on each pixel point and the whole-course fitting curve of each vegetation type to obtain a corresponding derivative function curve;
determining time periods of a growing period, a saturation period and an aging period of each pixel point and vegetation type according to the maximum value, the minimum value and the time node determined by the set threshold ratio of the derivative function curve of each pixel point and vegetation type;
acquiring a growth stage characteristic diagram of each pixel point according to the maximum value, the minimum value and the set ratio of the dense long-time sequence vegetation index data set of each pixel point; the growth stage feature map is used for representing growth start, saturation and aging nodes of vegetation;
constructing a classification feature space of each pixel point and a sample set of each vegetation type in a remote sensing image of the region to be classified; the classification characteristic space comprises spectral characteristics, texture characteristics and polarization characteristics corresponding to the growing period, the saturation period and the aging period, and a growth stage characteristic diagram;
training a classifier by using sample sets of different vegetation types of the region to be classified and classification feature spaces of the sample sets;
and classifying vegetation types in the remote sensing image of the region to be classified by utilizing the classifier which is completed by training based on the classification feature space of all the pixel points of the remote sensing image of the region to be classified.
2. A method according to claim 1, characterized in that: acquiring remote sensing images of the areas to be classified based on remote sensing satellite data of different sources; performing band transformation and vegetation index transformation on each time point of the remote sensing image of the multisource region to be classified, and calculating a vegetation index mean value of each time point after transformation, wherein the vegetation index mean value is used for forming a dense long-time sequence vegetation index data set of each pixel point and vegetation type.
3. A method according to claim 1, characterized in that: dividing a sample set of each vegetation type into a training set and a verification set; forming classification feature spaces of different vegetation types by adopting a training set so as to train a classifier; and verifying the trained classifier by using the verification set to generate vegetation type prediction precision.
4. A method according to claim 1, characterized in that: the fitting process of the dense long-time series vegetation index dataset comprises: dividing the vegetation index data set according to the growing period and the aging period according to any dense long-time sequence, independently fitting the divided data sets to obtain fitted curves of the growing period and the aging period, and connecting the fitted curves of the growing period and the aging period to form corresponding whole-course fitted curves of each pixel point and each vegetation type.
5. A method according to claim 4, characterized in that: aiming at the dense long-time sequence vegetation index data set of each pixel point and vegetation type, finding the maximum value and the minimum value of the dense long-time sequence vegetation index data set; the time period before the maximum value corresponds to the time point is the growth period, the time period after the maximum value corresponds to the time point is the aging period, the time period before the minimum value corresponds to the time point is the aging period, and the time period after the minimum value corresponds to the time point is the growth period.
6. A method according to claim 1, characterized in that: and after performing atmospheric correction, terrain correction and data cloud removal processing on the remote sensing image of the area to be classified, regenerating a dense long-time sequence vegetation index data set of each pixel point and vegetation type.
7. A method according to claim 1, characterized in that: and smoothing and denoising the dense long-time sequence vegetation index data set of each pixel point and vegetation type, and then performing curve fitting.
8. A method according to claim 1, characterized in that: the process of constructing the classification feature space of the sample set of each vegetation type comprises: aiming at any sample point in the sample set of any vegetation type, based on the remote sensing image of the sample point in the time period of the vegetation type growing period, the saturation period and the aging period, obtaining the corresponding spectral characteristics, texture characteristics and polarization characteristics of the sample point growing period, the saturation period and the aging period; and using the growth stage characteristic diagram of the pixel point corresponding to the sample point as the growth stage characteristic diagram of the sample point.
9. A method according to claim 1, characterized in that: the process of determining the time periods of the growth, saturation and senescence phases of a single pixel point and of each vegetation type includes: obtaining the minimum value P_min and maximum value P_max of the corresponding derivative function curve and calculating its amplitude D; according to the set threshold ratio R, the thresholds T1 and T2 are calculated by:
D = P_max − P_min;
T1 = P_max − R·D;
T2 = P_min + R·D;
the abscissa of the derivative function curve represents time, and the ordinate represents slope;
the time period formed by the time point corresponding to the intersection point of the horizontal line made by the threshold T1 and the derivative curve is taken as a growth period,
the time period formed by the time point corresponding to the crossing point of the horizontal line made by the threshold T2 and the derivative curve is taken as the aging period,
the time period formed by the time points corresponding to the minimum value and the maximum value of the derivative function curve is a saturation period.
10. A method according to claim 1, characterized in that: the growth stage feature map U of each pixel point is calculated by the following formula:
U = (V_max − V_min)·P + V_min
wherein V_min and V_max are respectively the minimum and maximum of the dense long-time-series vegetation index data set corresponding to each pixel point, and P is a set amplitude proportion.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410124198.8A CN117689959B (en) | 2024-01-30 | 2024-01-30 | Remote sensing classification method for fusing vegetation life cycle features |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117689959A true CN117689959A (en) | 2024-03-12 |
CN117689959B CN117689959B (en) | 2024-06-14 |
Family
ID=90133711
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410124198.8A Active CN117689959B (en) | 2024-01-30 | 2024-01-30 | Remote sensing classification method for fusing vegetation life cycle features |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117689959B (en) |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107122739A (en) * | 2017-01-23 | 2017-09-01 | 东北农业大学 | The agricultural output assessment model of VI time-serial positions is reconstructed based on Extreme mathematical modelings |
CN107273820A (en) * | 2017-05-26 | 2017-10-20 | 中国科学院遥感与数字地球研究所 | A kind of Land Cover Classification method and system |
US20180189564A1 (en) * | 2016-12-30 | 2018-07-05 | International Business Machines Corporation | Method and system for crop type identification using satellite observation and weather data |
CN110766299A (en) * | 2019-10-11 | 2020-02-07 | 清华大学 | Watershed vegetation change analysis method based on remote sensing data |
CN111832506A (en) * | 2020-07-20 | 2020-10-27 | 大同煤矿集团有限责任公司 | Remote sensing discrimination method for reconstructed vegetation based on long-time sequence vegetation index |
CN112750048A (en) * | 2020-12-22 | 2021-05-04 | 中国农业大学 | Method for dynamically analyzing cereal crop canopy senescence progress by utilizing post-flowering NDVI |
WO2021098471A1 (en) * | 2019-11-19 | 2021-05-27 | 浙江大学 | Wide-range crop phenology extraction method based on morphological modeling method |
CN113643409A (en) * | 2021-08-24 | 2021-11-12 | 中国农业大学 | Method and device for representing vegetation production rate and storage medium |
CN114266388A (en) * | 2021-12-08 | 2022-04-01 | 浙江大学 | Soybean yield prediction method based on historical vegetation index time series spectrum curve and yield mapping mode |
CN114782840A (en) * | 2022-04-20 | 2022-07-22 | 南京农业大学 | Real-time wheat phenological period classification method based on unmanned aerial vehicle RGB images |
CN114926743A (en) * | 2022-07-11 | 2022-08-19 | 北京艾尔思时代科技有限公司 | Crop classification method and system based on dynamic time window |
CN115331098A (en) * | 2022-07-15 | 2022-11-11 | 北京师范大学 | Early-stage classification method and system for overwintering crops based on enhanced phenological features |
CN115512123A (en) * | 2022-10-19 | 2022-12-23 | 上海第二工业大学 | Multi-period key growth characteristic extraction and time period classification method for hypsizygus marmoreus |
CN116503728A (en) * | 2023-03-16 | 2023-07-28 | 中国农业科学院作物科学研究所 | Method and device for predicting crop growth and development period based on big data |
CN116824384A (en) * | 2023-03-03 | 2023-09-29 | 杭州师范大学 | Soybean identification method based on standard curve |
CN117274798A (en) * | 2023-09-06 | 2023-12-22 | 中国农业科学院农业信息研究所 | Remote sensing rice identification method based on regularized time sequence variation model |
Non-Patent Citations (2)
Title |
---|
JINXI YAO et al.: "The Classification Method Study of Crops Remote Sensing with Deep Learning, Machine Learning, and Google Earth Engine", Remote Sensing, 8 June 2022 (2022-06-08), pages 3 - 4 *
ZHANG Mingwei: "Research on crop phenology monitoring and crop type identification based on MODIS data", China Doctoral Dissertations Full-text Database, Agricultural Science and Technology, 15 February 2007 (2007-02-15) *
Also Published As
Publication number | Publication date |
---|---|
CN117689959B (en) | 2024-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108921885B (en) | Method for jointly inverting forest aboveground biomass by integrating three types of data sources | |
Eerens et al. | Image time series processing for agriculture monitoring | |
CN104656098B (en) | A kind of method of remote sensing forest biomass inverting | |
CN108460739A (en) | A kind of thin cloud in remote sensing image minimizing technology based on generation confrontation network | |
CN112348812B (en) | Forest stand age information measurement method and device | |
CN111721714B (en) | Soil water content estimation method based on multi-source optical remote sensing data | |
CN103902802B (en) | A kind of vegetation index time series data method for reconstructing for taking spatial information into account | |
CN114120101B (en) | Multi-scale comprehensive sensing method for soil moisture | |
CN111639587A (en) | Hyperspectral image classification method based on multi-scale spectrum space convolution neural network | |
Huang et al. | A methodology to reconstruct LAI time series data based on generative adversarial network and improved Savitzky-Golay filter | |
Dong et al. | Area extraction and spatiotemporal characteristics of winter wheat–summer maize in Shandong Province using NDVI time series | |
CN105372198A (en) | Infrared spectrum wavelength selection method based on integrated L1 regularization | |
CN112464172A (en) | Growth parameter active and passive remote sensing inversion method and device | |
CN114707412B (en) | SWAT model optimization method based on vegetation canopy time-varying characteristics | |
CN118072178A (en) | Corn yield estimation method and system based on classified percentage data assimilation | |
He et al. | ICESat-2 data classification and estimation of terrain height and canopy height | |
CN108073865B (en) | Aircraft trail cloud identification method based on satellite data | |
Jing et al. | Cloud removal for optical remote sensing imagery using the SPA-CycleGAN network | |
Ayub et al. | Wheat Crop Field and Yield Prediction using Remote Sensing and Machine Learning | |
Yu et al. | China’s larch stock volume estimation using Sentinel-2 and LiDAR data | |
CN118097426A (en) | Night sea area monitoring multi-mode data identification method, medium and system | |
CN117690017A (en) | Single-season and double-season rice extraction method considering physical time sequence characteristics | |
CN117689959B (en) | Remote sensing classification method for fusing vegetation life cycle features | |
CN116720041A (en) | Fine NDVI data reconstruction method based on earth surface coverage type | |
CN114332546B (en) | Large-scale migration learning crop classification method and system based on phenological matching strategy |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||